InfiniBand:
• The HP BLc 4X DDR InfiniBand Mezzanine card, which requires the HP BLc 4X DDR IB Switch
Module (figure 15), is supported with Serviceguard
Figure 15: HP BLc 4X DDR InfiniBand Mezzanine card and HP BLc 4X DDR IB Switch Module
• Considerations for InfiniBand use:
– Few applications use the native InfiniBand protocol, so the IP over InfiniBand (IPoIB) protocol is
usually required (for example, Oracle RAC 10g and 11g currently support only IPoIB), which
dramatically increases CPU overhead
– If VERITAS CVM or CFS is used, InfiniBand must not be configured as the Serviceguard cluster
heartbeat (a configuration sketch follows this list)
– Using InfiniBand limits the ability to build high availability configurations for the Fibre Channel
and Ethernet mezzanine cards, because the IB interconnect module physically requires two
interconnect bay slots
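For illustration only, the excerpt below sketches one way to keep the cluster heartbeat on the Ethernet
interfaces in the Serviceguard cluster ASCII configuration file (as generated by cmquerycl -C), with the
IPoIB interface defined only as a stationary (data) LAN. The node name, interface names, and IP
addresses are placeholders and must be replaced with values from the actual environment.

   NODE_NAME blade1                      # hypothetical blade server
     NETWORK_INTERFACE lan1              # Ethernet mezzanine NIC
       HEARTBEAT_IP 192.168.10.11        # cluster heartbeat stays on Ethernet
     NETWORK_INTERFACE lan2              # second Ethernet NIC for a redundant heartbeat path
       HEARTBEAT_IP 192.168.11.11
     NETWORK_INTERFACE lan3              # IPoIB interface (interface name is illustrative)
       STATIONARY_IP 192.168.20.11       # InfiniBand carries application/data traffic only

Verify the resulting configuration with cmcheckconf before applying it with cmapplyconf.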
Serviceguard Solutions Portfolio:
• The Serviceguard Storage Management Suite, SGeRAC, SGeSAP, and Enterprise Cluster Master
Toolkit can be used with HP BladeSystem configurations without any constraints or special
considerations
– Follow published manuals, release notes and white papers for suggested best practice
configurations
• HP BladeSystems are supported with all Serviceguard Disaster Recovery solutions (i.e., Extended
Distance Serviceguard clusters, Metrocluster, Continentalclusters)
Other Areas to Improve HP BladeSystem Solution Availability:
• Consider adding high availability to the Central Management Server. See the white paper titled
“Using HP Insight Software from a Highly Available Central Management Server with Microsoft
Cluster Service” posted at http://h20195.www2.hp.com/V2/GetPDF.aspx/c01956953.pdf for
more information.
• Also consider making the quorum server itself highly available by running it in a Serviceguard
cluster, as described in the HP Quorum Server documentation posted at
http://docs.hp.com/en/ha.html#Quorum%20Server. A brief configuration sketch follows.
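As a sketch only, the parameters below show how client clusters reference a quorum server in their
cluster ASCII configuration file. The hostname and timing values are placeholders; supported values,
and the procedure for running the quorum server as a Serviceguard package, are described in the HP
Quorum Server documentation referenced above.

   QS_HOST qs-pkg.example.com            # relocatable hostname of the quorum server package (placeholder)
   QS_POLLING_INTERVAL 300000000         # interval between quorum server health checks, in microseconds (example value)
   QS_TIMEOUT_EXTENSION 2000000          # additional response time allowed, in microseconds (example value)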