Hardware manual

Version 1.1, 03/31/2015
GSS CCT Evaluation Technical Report Page 54 of 56 © 2015 Gossamer Security Solutions, Inc.
Document: AAR-BrocadeNetIron5.8 All rights reserved.
The test plan identifies the platforms to be tested, and for those platforms not included in the test plan but
included in the ST, the test plan provides a justification for not testing the platforms. This justification must
address the differences between the tested platforms and the untested platforms, and make an argument that the
differences do not affect the testing to be performed. It is not sufficient to merely assert that the differences have
no effect; rationale must be provided. If all platforms claimed in the ST are tested, then no rationale is necessary.
The test plan describes the composition of each platform to be tested, and any setup that is necessary beyond
what is contained in the AGD documentation. It should be noted that the evaluator is expected to follow the AGD
documentation for installation and setup of each platform either as part of a test or as a standard pre-test
condition. This may include special test drivers or tools. For each driver or tool, an argument (not just an assertion)
should be provided that the driver or tool will not adversely affect the performance of the functionality by the TOE
and its platform. This also includes the configuration of the cryptographic engine to be used. The cryptographic
algorithms implemented by this engine are those specified by this PP and used by the cryptographic protocols
being evaluated (IPsec, TLS/HTTPS, SSH).
The test plan identifies high-level test objectives as well as the test procedures to be followed to achieve those
objectives. These procedures include expected results. The test report (which could just be an annotated version
of the test plan) details the activities that took place when the test procedures were executed, and includes the
actual results of the tests. This shall be a cumulative account: if a test run resulted in a failure, a fix was
installed, and the test was then successfully re-run, the report would show both the 'fail' and the 'pass' result
(with the supporting details), not just the 'pass' result.
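The cumulative-account requirement above can be illustrated with a small sketch. This is purely illustrative, not part of any evaluation tooling: the `TestReport` class, its methods, and the test identifier shown are hypothetical. The key point is that every execution of a test case is appended to the record rather than overwriting an earlier run, so a failure followed by a passing re-run yields both entries.

```python
from dataclasses import dataclass, field

@dataclass
class TestRun:
    test_id: str
    result: str   # "pass" or "fail"
    details: str

@dataclass
class TestReport:
    runs: list = field(default_factory=list)

    def record(self, test_id: str, result: str, details: str) -> None:
        # Append every execution; never overwrite an earlier run,
        # so the report remains a cumulative account.
        self.runs.append(TestRun(test_id, result, details))

    def history(self, test_id: str) -> list:
        # All recorded runs for one test case, in execution order.
        return [(r.result, r.details) for r in self.runs if r.test_id == test_id]

# A failed run, a fix, and a successful re-run are all retained
# (test identifier and details are hypothetical):
report = TestReport()
report.record("FCS_SSH.1-T1", "fail", "initial run failed before configuration fix")
report.record("FCS_SSH.1-T1", "pass", "re-run after configuration fix")
print(report.history("FCS_SSH.1-T1"))
```

Under this sketch, the history for the test case contains both the 'fail' and the 'pass' entry, which is what the cumulative account demands.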
The evaluator created a Detailed Test Report: Evaluation Team Test Report for Brocade MLX® and NetIron®
Family Devices with Multi-Service IronWare R05.8.00, Version 1.1, 03/31/2015 (DTR) to address all aspects of
this requirement. The DTR discusses the test configuration, test cases, expected results, and test results.
The following is the test configuration used by the evaluation team: