[buildd-tools-devel] running mpi in sbuild (petsc testing)

Drew Parsons dparsons at debian.org
Fri Oct 9 03:23:47 UTC 2015


The new packages for PETSc 3.6.2 are all but ready, 
http://anonscm.debian.org/cgit/debian-science/packages/petsc.git/.

The build runs some simple tests, and they pass when building at the
command line.  But for building packages I prefer to use sbuild.
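
For reference, the test step amounts to PETSc's own check target
(the exact invocation in debian/rules may differ slightly, so treat
this as a sketch):

    make PETSC_DIR=$(pwd) PETSC_ARCH=x86_64-linux-gnu-debug test

which compiles and runs src/snes/examples/tutorials/ex19 with 1 MPI
process, as in the log below.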

In the sbuild environment the tests fail.  I gather this means that
MPI (Open MPI) cannot run inside the sbuild chroot, or that I need to
change something to let it run.
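
My best guess at a workaround (untested, and plm_rsh_agent is a
generic Open MPI MCA parameter rather than anything sbuild-specific)
would be to point the rsh launcher at a dummy agent, since the chroot
presumably has no ssh/rsh for orte_plm_base_select to find:

    # single-node runs never actually invoke the agent,
    # so /bin/false just satisfies component selection
    export OMPI_MCA_plm_rsh_agent=/bin/false
    mpiexec -n 1 ./ex19

but I'd rather hear from someone who has actually made this work.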

Does anyone have experience running MPI tests in sbuild?

Drew.

P.S. An example of the test failure is:

mpicxx -o ex19.o -c -g -O2 -fstack-protector-strong -Wformat -Werror=format-security   -fPIC  -D_FORTIFY_SOURCE=2  -I/<<PKGBUILDDIR>>/include -I/<<PKGBUILDDIR>>/x86_64-linux-gnu-debug/include -I/usr/include/suitesparse -I/usr/include -I/usr/include/superlu -I/usr/include/scotch -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi  -D_FORTIFY_SOURCE=2  `pwd`/ex19.c
mpicxx -Wl,-z,relro -Wl,--no-as-needed -g -O2 -fstack-protector-strong -Wformat -Werror=format-security   -o ex19  ex19.o -L/<<PKGBUILDDIR>>/x86_64-linux-gnu-debug/lib -L/<<PKGBUILDDIR>>/x86_64-linux-gnu-debug/lib  -lpetsc_real -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lumfpack -lamd -lcholmod -lklu -L/usr/lib -lHYPRE_utilities -lHYPRE_struct_mv -lHYPRE_struct_ls -lHYPRE_sstruct_mv -lHYPRE_sstruct_ls -lHYPRE_IJ_mv -lHYPRE_parcsr_ls -lsuperlu -lscalapack-openmpi -llapack -lblas -lssl -lcrypto -lptesmumps -lptscotch -lptscotcherr -lfftw3 -lfftw3_mpi -lm -L/usr/lib/openmpi/lib -L/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -L/usr/lib/openmpi/lib -L/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl 
/bin/rm -f ex19.o
Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
See http://www.mcs.anl.gov/petsc/documentation/faq.html
[schumann:13424] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file ess_hnp_module.c at line 170
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_plm_base_select failed
  --> Returned value Not found (-13) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[schumann:13424] [[INVALID],INVALID] ORTE_ERROR_LOG: Not found in file runtime/orte_init.c at line 128
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_ess_set_name failed
  --> Returned value Not found (-13) instead of ORTE_SUCCESS
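
The orte_plm_base_select failure is what makes me suspect the chroot
is missing something Open MPI needs at startup (a remote launcher,
perhaps).  A quick check from inside the build chroot, assuming the
rsh PLM is the one it is trying to select, would be:

    # is any launcher available for the rsh PLM?
    which ssh rsh || echo "no rsh/ssh launcher in the chroot"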


