HP-MPI Version 2.2.5.1 for Linux Release Note

Known Problems and Workarounds
HP-MPI complies with the MPI-1.2 standard, which defines bindings for Fortran 77 and
C, but not for Fortran 90 or C++. HP-MPI also complies with the C++ binding definitions
detailed in the MPI-2 standard. However, the C++ bindings provided are not thread safe
and should not be used with the HP-MPI threaded libraries (for example, libmtmpi).
HP-MPI does not provide bindings for Fortran 90, and some Fortran 90 features may
interact with MPI non-blocking semantics to produce unexpected results. Consult the
HP-MPI User's Guide for details.
When using the HP Caliper profiling tool with HP-MPI applications, it may be necessary
to set the following environment variable to avoid an application abort:
% setenv HPMPI_NOPROPAGATE_SUSP 1
or
$ export HPMPI_NOPROPAGATE_SUSP=1
Locating your instrumentation
When instrumentation for multi-host runs is enabled, HP-MPI writes the
instrumentation output file (prefix.instr) to the working directory on the host that
is running rank 0. This is true whether mpirun is invoked on a host where at least one
MPI process is running or on a host remote from all MPI processes.
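For example (a hypothetical run: "myrun" is an example prefix, and -i is the mpirun
instrumentation option described in the HP-MPI User's Guide):

```
% mpirun -i myrun -np 4 ./a.out
% ls myrun.instr
```

The second command must be issued on the host running rank 0, in that process's
working directory.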
In order to use the -tv option to mpirun, the TotalView binary must be in the user's PATH,
or the TOTALVIEW environment variable must be set to the full path of the TotalView
binary.
% setenv TOTALVIEW /usr/toolworks/totalview/bin/totalview
or
$ export TOTALVIEW=/usr/toolworks/totalview/bin/totalview
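Before launching with -tv, a short shell check (a sketch in sh/bash syntax; it assumes
only the totalview binary name and TOTALVIEW variable described above) can confirm that
the debugger is reachable:

```shell
# Report whether TotalView is reachable the way "mpirun -tv" requires:
# either a "totalview" binary on PATH, or TOTALVIEW pointing at an executable.
if command -v totalview >/dev/null 2>&1; then
    echo "totalview found on PATH"
elif [ -n "${TOTALVIEW:-}" ] && [ -x "$TOTALVIEW" ]; then
    echo "TOTALVIEW is set to $TOTALVIEW"
else
    echo "TotalView not found: add it to PATH or set TOTALVIEW"
fi
```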
Extended collectives with intercommunicators are not profiled by our lightweight
instrumentation mode.
High Availability (H/A) mode and the diagnostic library cannot be used at the same time.
The diagnostic library's strict mode is not compatible with some MPI-2 features.
Some versions of the Quadrics software have a memory leak. The resulting error looks like:
0 MAP_SDRAM(140008800): can't map SDRAM 2824000(2404000) -
4020000(3c00000) (25149440 bytes) : -1
ELAN_EXCEPTION @ 0: 5 (Memory exhausted)
newRxDesc: Elan memory exhausted: port 2b200
This error can occur in the following two cases:
1. The application calls MPI_Cancel repeatedly.
2. The application posts receives on MPI_ANY_SOURCE.