Installing GetFEM with MPI Support on Ubuntu 24.04.3 LTS

Hi everyone,

I would like to install GetFEM with parallel (MPI) support on Ubuntu 24.04.3 LTS.
I previously installed the standard version using:

sudo apt install python3-getfem

but now I’d like to compile or install the MPI-enabled version.

Could someone please guide me on the correct installation procedure for the parallel version and also how to properly remove the version currently installed via apt?

Thanks in advance for your help

Hi, although you do not strictly need to remove the packages installed with apt, I think it is a good idea to do so anyway. Just run

sudo apt purge libgetfem5t64 libgetfem-dev libgmm-dev python3-getfem

Then make sure you have installed one of the parallel versions of the MUMPS solver (I normally use the version with the PT-Scotch partitioner), plus some other dependencies:

sudo apt install libmumps-ptscotch-dev libmumps-ptscotch-5.6t64 libopenmpi-dev libopenblas-serial-dev libqhull-dev autoconf automake libtool python3-dev
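Before running configure, it can save a round trip to verify that the MPI compiler wrappers it will look for are actually on your PATH (a minimal sketch; the tool list is an assumption based on the packages above):

```shell
check_tool() {
    # Print where a tool lives, or a warning if it is missing.
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: $(command -v "$1")"
    else
        echo "$1: NOT FOUND"
    fi
}

# The MPI wrappers come with libopenmpi-dev.
for tool in mpicc mpic++ mpif90; do
    check_tool "$tool"
done
```

If any wrapper is reported as NOT FOUND, install the corresponding package before configuring.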

Then, download the most recent development version of getfem from the git repo

and run

./autogen.sh
./configure --prefix=/opt/gf20251104_libmumps_ptscotch --with-blas=openblas --with-optimization=-O3 --with-pic --enable-python --disable-matlab --disable-superlu LIBS="-lmumps_common_ptscotch -lpord_ptscotch" --enable-paralevel=2 --with-mumps="-lsmumps_ptscotch -ldmumps_ptscotch -lcmumps_ptscotch -lzmumps_ptscotch"
make -j4
make install

The --prefix option sets where the library will be installed; you can choose the path yourself, just remember your choice. The ./autogen.sh step may print something that looks like an error, but do not worry: if the configure command runs afterwards, everything is fine. If configure itself fails, it is probably because some prerequisite is missing on your system, so just report the error here.

After you completed the installation you should be able to run scripts with MPI using a command like

PYTHONPATH=/opt/gf20251104_libmumps_ptscotch/lib/python3.13/site-packages/ mpirun -np 4 python3 check_mumps_ctx.py

or

PYTHONPATH=/opt/gf20251104_libmumps_ptscotch/lib/python3.13/site-packages/ mpirun --map-by rankfile:file=rankfile -np 8 python3

The Python version on your system might be different from 3.13; adapt the path if necessary. Regarding the content of the rankfile, check the Open MPI documentation.
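For reference, an Open MPI rankfile is a plain-text file that pins each rank to a host and slot; a minimal illustrative example (the hostnames are placeholders, see the mpirun man page for the full syntax):

```
rank 0=nodeA slot=0
rank 1=nodeA slot=1
rank 2=nodeB slot=0
rank 3=nodeB slot=1
```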

Compiling GetFEM with MPI support is the easy part. Making Open MPI runs of any program actually scale reasonably is not that easy; if you have not already learned this, it is something you will need to study.
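One reason scaling is hard can be made concrete with Amdahl's law: if a fraction s of the run stays serial, the speedup on p processes is bounded by 1/(s + (1-s)/p). A small sketch (the 10% serial fraction is purely illustrative):

```python
def amdahl_speedup(serial_fraction: float, procs: int) -> float:
    """Upper bound on speedup when serial_fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / procs)

# Even with only 10% serial work, 8 processes give at most ~4.7x, not 8x.
for p in (2, 4, 8, 16):
    print(p, round(amdahl_speedup(0.1, p), 2))
```

In practice assembly, communication, and the sparse solver each have their own serial and communication costs, so measuring actual timings at several process counts is the only reliable guide.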

Hi, thanks for the procedure.
I cloned GetFEM using:

git clone https://git.savannah.nongnu.org/git/getfem.git

Then I entered the repository with cd getfem and ran the following commands:

./autogen.sh
./configure --prefix=/opt/gf20251104_libmumps_ptscotch --with-blas=openblas --with-optimization=-O3 --with-pic --enable-python --disable-matlab --disable-superlu LIBS="-lmumps_common_ptscotch -lpord_ptscotch" --enable-paralevel=2 --with-mumps="-lsmumps_ptscotch -ldmumps_ptscotch -lcmumps_ptscotch -lzmumps_ptscotch"
make -j4
make install

but I get the following error; it seems the Makefile is not created because METIS is missing.

./autogen.sh
./configure --prefix=/opt/gf20251104_libmumps_ptscotch --with-blas=openblas --with-optimization=-O3 --with-pic --enable-python --disable-matlab --disable-superlu LIBS="-lmumps_common_ptscotch -lpord_ptscotch" --enable-paralevel=2 --with-mumps="-lsmumps_ptscotch -ldmumps_ptscotch -lcmumps_ptscotch -lzmumps_ptscotch"
make -j4
make install
libtoolize: putting auxiliary files in '.'.
libtoolize: linking file './ltmain.sh'
libtoolize: putting macros in AC_CONFIG_MACRO_DIRS, 'm4'.
libtoolize: linking file 'm4/libtool.m4'
libtoolize: linking file 'm4/ltoptions.m4'
libtoolize: linking file 'm4/ltsugar.m4'
libtoolize: linking file 'm4/ltversion.m4'
libtoolize: linking file 'm4/lt~obsolete.m4'
configure.ac:91: error: required file './compile' not found
configure.ac:91:   'automake --add-missing' can install 'compile'
configure.ac:105: error: required file './config.guess' not found
configure.ac:105:   'automake --add-missing' can install 'config.guess'
configure.ac:105: error: required file './config.sub' not found
configure.ac:105:   'automake --add-missing' can install 'config.sub'
configure.ac:51: error: required file './install-sh' not found
configure.ac:51:   'automake --add-missing' can install 'install-sh'
configure.ac:51: error: required file './missing' not found
configure.ac:51:   'automake --add-missing' can install 'missing'
contrib/aposteriori/Makefile.am: error: required file './depcomp' not found
contrib/aposteriori/Makefile.am:   'automake --add-missing' can install 'depcomp'
parallel-tests: error: required file './test-driver' not found
parallel-tests:   'automake --add-missing' can install 'test-driver'
interface/src/python/Makefile.am:25: error: required file './py-compile' not found
interface/src/python/Makefile.am:25:   'automake --add-missing' can install 'py-compile'
autoreconf: error: automake failed with exit status: 1
configure.ac:91: installing './compile'
configure.ac:105: installing './config.guess'
configure.ac:105: installing './config.sub'
configure.ac:51: installing './install-sh'
configure.ac:51: installing './missing'
contrib/aposteriori/Makefile.am: installing './depcomp'
parallel-tests: installing './test-driver'
interface/src/python/Makefile.am:25: installing './py-compile'
autogen.sh is ok, you can run the ./configure script
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a race-free mkdir -p... /usr/bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether make supports the include directive... yes (GNU style)
checking whether to compile using MPI... yes
checking for mpic++... mpic++
checking whether the C++ compiler works... yes
checking for C++ compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether the compiler supports GNU C++... yes
checking whether mpic++ accepts -g... yes
checking for mpic++ option to enable C++11 features... none needed
checking dependency style of mpic++... gcc3
checking for function MPI_Init... yes
checking for mpi.h... yes
checking whether to compile using MPI... yes
checking for mpicc... mpicc
checking whether the compiler supports GNU C... yes
checking whether mpicc accepts -g... yes
checking for mpicc option to enable C11 features... none needed
checking whether mpicc understands -c and -o together... yes
checking dependency style of mpicc... gcc3
checking for function MPI_Init... yes
checking for mpi.h... yes
checking whether to compile using MPI... yes
checking for mpif95... no
checking for mpxlf95_r... no
checking for mpxlf95... no
checking for ftn... no
checking for mpif90... mpif90
checking whether the compiler supports GNU Fortran... yes
checking whether mpif90 accepts -g... yes
checking for function MPI_INIT... yes
checking for mpif.h... yes
checking for _init in -lmpi_cxx... yes
checking how to run the C++ preprocessor... mpic++ -E
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking how to get verbose linking output from mpif90... -v
checking for Fortran libraries of mpif90...  -L/usr/lib/x86_64-linux-gnu/openmpi/lib/fortran/gfortran -L/usr/lib/gcc/x86_64-linux-gnu/13 -L/usr/lib/gcc/x86_64-linux-gnu/13/../../../x86_64-linux-gnu -L/usr/lib/gcc/x86_64-linux-gnu/13/../../../../lib -L/lib/x86_64-linux-gnu -L/lib/../lib -L/usr/lib/x86_64-linux-gnu -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/13/../../.. -lmumps_common_ptscotch -lpord_ptscotch -lmpi_cxx -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lgfortran -lm -lz -lquadmath
checking whether the compiler recognizes the partial specialization syntax... yes
you are compiling GetFEM on a x86_64-pc-linux-gnu
Using a unknown compiler
checking how to print strings... printf
checking for a sed that does not truncate output... /usr/bin/sed
checking for grep that handles long lines and -e... /usr/bin/grep
checking for egrep... /usr/bin/grep -E
checking for fgrep... /usr/bin/grep -F
checking for ld used by mpicc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-pc-linux-gnu file names to x86_64-pc-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-pc-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for file... file
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for ar... ar
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from mpicc object... ok
checking for sysroot... no
checking for a working dd... /usr/bin/dd
checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1
checking for mt... mt
checking if mt is a manifest tool... no
checking for stdio.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for strings.h... yes
checking for sys/stat.h... yes
checking for sys/types.h... yes
checking for unistd.h... yes
checking for dlfcn.h... yes
checking for objdir... .libs
checking if mpicc supports -fno-rtti -fno-exceptions... no
checking for mpicc option to produce PIC... -fPIC -DPIC
checking if mpicc PIC flag -fPIC -DPIC works... yes
checking if mpicc static flag -static works... no
checking if mpicc supports -c -o file.o... yes
checking if mpicc supports -c -o file.o... (cached) yes
checking whether the mpicc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... no
checking whether to build static libraries... yes
checking how to run the C++ preprocessor... mpic++ -E
checking for ld used by mpic++... /usr/bin/ld -m elf_x86_64
checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
checking whether the mpic++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking for mpic++ option to produce PIC... -fPIC -DPIC
checking if mpic++ PIC flag -fPIC -DPIC works... yes
checking if mpic++ static flag -static works... no
checking if mpic++ supports -c -o file.o... yes
checking if mpic++ supports -c -o file.o... (cached) yes
checking whether the mpic++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking dynamic linker characteristics... (cached) GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... no
checking whether to build static libraries... yes
checking for mpif90 option to produce PIC... -fPIC
checking if mpif90 PIC flag -fPIC works... yes
checking if mpif90 static flag -static works... no
checking if mpif90 supports -c -o file.o... yes
checking if mpif90 supports -c -o file.o... (cached) yes
checking whether the mpif90 linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking dynamic linker characteristics... (cached) GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate

checking for a Python interpreter with version >= 3.6... python3
checking for python3... /usr/bin/python3
checking for python3 version... 3.12
checking for python3 platform... linux
checking for GNU default python3 prefix... ${prefix}
checking for GNU default python3 exec_prefix... ${exec_prefix}
checking for python3 script directory (pythondir)... ${PYTHON_PREFIX}/lib/python3.12/site-packages
checking for python3 extension module directory (pyexecdir)... ${PYTHON_EXEC_PREFIX}/lib/python3.12/site-packages
Building with python (/usr/bin/python3) support (use --enable-python=no to disable it) 
You will need the python-numpy and python-scipy packages.
checking for python3.12... (cached) /usr/bin/python3
checking for a version of Python >= '2.1.0'... yes
checking for the sysconfig Python package... yes
checking for Python include path... -I/usr/include/python3.12
checking for Python library path... -L/usr/lib/x86_64-linux-gnu -lpython3.12
checking for Python site-packages path... /opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages
checking for Python platform specific site-packages path... /opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages
checking python extra libraries... -ldl -lm
checking python extra linking flags... -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions
checking consistency of all components of python development environment... yes
PARALLEL PYTHON DISABLED: mpi4py not found. You need to install the python-mpi4py package.
checking for dummy main to link with Fortran libraries... none
checking for Fortran name-mangling scheme... lower case, underscore, no extra underscore
BLAS_LIBS=-lopenblas
checking for sgemm_ in -lopenblas... yes
OK, You have working BLAS libs ! Using -lopenblas
checking for dgetrf_... yes
checking for dlsym in -ldl... yes
Building with SuperLU explicitly disabled
checking for smumps_c.h... yes
checking for dmumps_c.h... yes
checking for cmumps_c.h... yes
checking for zmumps_c.h... yes
checking for library containing smumps_c... -lsmumps_ptscotch
checking for library containing dmumps_c... -ldmumps_ptscotch
checking for library containing cmumps_c... -lcmumps_ptscotch
checking for library containing zmumps_c... -lzmumps_ptscotch
Building with MUMPS (use --enable-mumps=no to disable it)
Configuration of MUMPS done
checking for qh_new_qhull in -lqhull_r... yes
checking for libqhull_r/qhull_ra.h... yes
Building with libqhull (use --enable-qhull=no to disable it)
Configuration of qhull done
checking for METIS_PartGraphRecursive in -lmetis... no
configure: error: METIS library required for parallel getfem was not found
make: *** No targets specified and no makefile found.  Stop.
make: *** No rule to make target 'install'.  Stop.

It seems I should install METIS using the following:

sudo apt install libmetis-dev

Thanks in advance
TH.

Sure, I forgot that one! In general it is very safe to install and uninstall packages with apt: installations are reversible and leave no leftovers after an uninstall/purge.

Hi @Konstantinos,
I installed GetFEM following the instructions; this is the log.

sudo make install
[sudo] password for standard: 
Making install in m4
make[1]: Entering directory '/home/standard/getfem/m4'
make[2]: Entering directory '/home/standard/getfem/m4'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/standard/getfem/m4'
make[1]: Leaving directory '/home/standard/getfem/m4'
Making install in cubature
make[1]: Entering directory '/home/standard/getfem/cubature'
make[2]: Entering directory '/home/standard/getfem/cubature'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/standard/getfem/cubature'
make[1]: Leaving directory '/home/standard/getfem/cubature'
Making install in src
make[1]: Entering directory '/home/standard/getfem/src'
make[2]: Entering directory '/home/standard/getfem/src'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/lib'
 /bin/bash ../libtool   --mode=install /usr/bin/install -c   libgetfem.la '/opt/gf20251104_libmumps_ptscotch/lib'
libtool: install: /usr/bin/install -c .libs/libgetfem.lai /opt/gf20251104_libmumps_ptscotch/lib/libgetfem.la
libtool: install: /usr/bin/install -c .libs/libgetfem.a /opt/gf20251104_libmumps_ptscotch/lib/libgetfem.a
libtool: install: chmod 644 /opt/gf20251104_libmumps_ptscotch/lib/libgetfem.a
libtool: install: ranlib /opt/gf20251104_libmumps_ptscotch/lib/libgetfem.a
libtool: finish: PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/snap/bin:/sbin" ldconfig -n /opt/gf20251104_libmumps_ptscotch/lib
----------------------------------------------------------------------
Libraries have been installed in:
   /opt/gf20251104_libmumps_ptscotch/lib

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the '-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the 'LD_RUN_PATH' environment variable
     during linking
   - use the '-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to '/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include/gmm'
 /usr/bin/install -c -m 644  gmm/gmm.h gmm/gmm_arch_config.h gmm/gmm_matrix.h gmm/gmm_iter_solvers.h gmm/gmm_iter.h gmm/gmm_inoutput.h gmm/gmm_vector.h gmm/gmm_transposed.h gmm/gmm_scaled.h gmm/gmm_conjugated.h gmm/gmm_real_part.h gmm/gmm_def.h gmm/gmm_sub_index.h gmm/gmm_vector_to_matrix.h gmm/gmm_sub_vector.h gmm/gmm_sub_matrix.h gmm/gmm_interface.h gmm/gmm_kernel.h gmm/gmm_interface_bgeot.h gmm/gmm_solver_cg.h gmm/gmm_solver_constrained_cg.h gmm/gmm_modified_gram_schmidt.h gmm/gmm_dense_Householder.h gmm/gmm_dense_lu.h gmm/gmm_dense_matrix_functions.h gmm/gmm_dense_qr.h gmm/gmm_dense_sylvester.h gmm/gmm_tri_solve.h gmm/gmm_solver_gmres.h gmm/gmm_solver_idgmres.h gmm/gmm_solver_qmr.h gmm/gmm_solver_bicgstab.h gmm/gmm_solver_Schwarz_additive.h gmm/gmm_solver_bfgs.h gmm/gmm_domain_decomp.h gmm/gmm_superlu_interface.h gmm/gmm_precond.h gmm/gmm_precond_ildlt.h gmm/gmm_precond_ildltt.h gmm/gmm_precond_mr_approx_inverse.h '/opt/gf20251104_libmumps_ptscotch/include/gmm'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/install -c -m 644  getfem/dal_config.h getfem/dal_singleton.h getfem/dal_basic.h getfem/dal_bit_vector.h getfem/dal_static_stored_objects.h getfem/dal_naming_system.h getfem/dal_backtrace.h getfem/dal_tas.h getfem/dal_tree_sorted.h getfem/bgeot_config.h getfem/bgeot_permutations.h getfem/bgeot_convex_structure.h getfem/bgeot_convex.h getfem/bgeot_convex_ref.h getfem/bgeot_poly.h getfem/bgeot_geometric_trans.h getfem/bgeot_geotrans_inv.h getfem/bgeot_kdtree.h getfem/bgeot_mesh_structure.h getfem/bgeot_mesh.h getfem/bgeot_poly_composite.h getfem/bgeot_rtree.h getfem/bgeot_node_tab.h getfem/bgeot_small_vector.h getfem/bgeot_sparse_tensors.h getfem/bgeot_tensor.h getfem/bgeot_comma_init.h getfem/bgeot_torus.h getfem/bgeot_ftool.h getfem/getfem_accumulated_distro.h getfem/getfem_arch_config.h getfem/getfem_copyable_ptr.h getfem/getfem_integration.h getfem/getfem_assembling.h getfem/getfem_assembling_tensors.h getfem/getfem_generic_assembly.h getfem/getfem_generic_assembly_tree.h getfem/getfem_generic_assembly_functions_and_operators.h getfem/getfem_generic_assembly_semantic.h getfem/getfem_generic_assembly_compile_and_exec.h '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/install -c -m 644  getfem/getfem_context.h getfem/getfem_config.h getfem/getfem_interpolation.h getfem/getfem_export.h getfem/getfem_import.h getfem/getfem_derivatives.h getfem/getfem_global_function.h getfem/getfem_fem.h getfem/getfem_interpolated_fem.h getfem/getfem_projected_fem.h getfem/getfem_fem_global_function.h getfem/getfem_mesh_fem_global_function.h getfem/getfem_mesh_fem_sum.h getfem/getfem_im_list.h getfem/getfem_mat_elem.h getfem/getfem_mat_elem_type.h getfem/getfem_mesh.h getfem/getfem_mesh_region.h getfem/getfem_mesh_fem.h getfem/getfem_mesh_im.h getfem/getfem_error_estimate.h getfem/getfem_level_set.h getfem/getfem_partial_mesh_fem.h getfem/getfem_torus.h getfem/getfem_mesh_level_set.h getfem/getfem_mesh_im_level_set.h getfem/getfem_crack_sif.h getfem/getfem_mesh_fem_level_set.h getfem/getfem_mesh_fem_product.h getfem/getfem_fem_level_set.h getfem/getfem_mesh_slicers.h getfem/getfem_mesh_slice.h getfem/getfem_regular_meshes.h getfem/getfem_models.h getfem/getfem_model_solvers.h getfem/getfem_linearized_plates.h getfem/getfem_HHO.h getfem/getfem_locale.h getfem/getfem_contact_and_friction_common.h getfem/getfem_contact_and_friction_large_sliding.h '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/install -c -m 644  getfem/getfem_contact_and_friction_nodal.h getfem/getfem_contact_and_friction_integral.h getfem/getfem_nonlinear_elasticity.h getfem/getfem_fourth_order.h getfem/getfem_Navier_Stokes.h getfem/getfem_plasticity.h getfem/getfem_omp.h getfem/getfem_continuation.h getfem/getfem_mesher.h getfem/getfem_convect.h getfem/getfem_deformable_mesh.h getfem/getfem_level_set_contact.h getfem/getfem_im_data.h '/opt/gf20251104_libmumps_ptscotch/include/getfem'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/include/gmm'
 /usr/bin/install -c -m 644  gmm/gmm_precond_diagonal.h gmm/gmm_precond_ilu.h gmm/gmm_precond_ilut.h gmm/gmm_precond_ilutp.h gmm/gmm_blas.h gmm/gmm_blas_interface.h gmm/gmm_lapack_interface.h gmm/gmm_condition_number.h gmm/gmm_least_squares_cg.h gmm/gmm_range_basis.h gmm/gmm_opt.h gmm/gmm_algobase.h gmm/gmm_ref.h gmm/gmm_std.h gmm/gmm_except.h gmm/gmm_feedback_management.h gmm/gmm_MUMPS_interface.h '/opt/gf20251104_libmumps_ptscotch/include/gmm'
make[2]: Leaving directory '/home/standard/getfem/src'
make[1]: Leaving directory '/home/standard/getfem/src'
Making install in tests
make[1]: Entering directory '/home/standard/getfem/tests'
make[2]: Entering directory '/home/standard/getfem/tests'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/standard/getfem/tests'
make[1]: Leaving directory '/home/standard/getfem/tests'
Making install in interface
make[1]: Entering directory '/home/standard/getfem/interface'
Making install in src
make[2]: Entering directory '/home/standard/getfem/interface/src'
Making install in .
make[3]: Entering directory '/home/standard/getfem/interface/src'
make[4]: Entering directory '/home/standard/getfem/interface/src'
make[4]: Nothing to be done for 'install-data-am'.
make[4]: Leaving directory '/home/standard/getfem/interface/src'
make[3]: Leaving directory '/home/standard/getfem/interface/src'
make[2]: Leaving directory '/home/standard/getfem/interface/src'
Making install in tests
make[2]: Entering directory '/home/standard/getfem/interface/tests'
Making install in meshes
make[3]: Entering directory '/home/standard/getfem/interface/tests/meshes'
make[4]: Entering directory '/home/standard/getfem/interface/tests/meshes'
make[4]: Nothing to be done for 'install-exec-am'.
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/getfem_toolbox/meshes'
 /usr/bin/install -c tank_quadratic_2500.GiD.msh holed_disc_with_quadratic_2D_triangles.msh tube_2D_spline.GiD.msh tripod.GiD.msh mixed_mesh.gmf '/opt/gf20251104_libmumps_ptscotch/getfem_toolbox/meshes'
make[4]: Leaving directory '/home/standard/getfem/interface/tests/meshes'
make[3]: Leaving directory '/home/standard/getfem/interface/tests/meshes'
make[3]: Entering directory '/home/standard/getfem/interface/tests'
make[4]: Entering directory '/home/standard/getfem/interface/tests'
make[4]: Nothing to be done for 'install-exec-am'.
make[4]: Nothing to be done for 'install-data-am'.
make[4]: Leaving directory '/home/standard/getfem/interface/tests'
make[3]: Leaving directory '/home/standard/getfem/interface/tests'
make[2]: Leaving directory '/home/standard/getfem/interface/tests'
make[2]: Entering directory '/home/standard/getfem/interface'
make[3]: Entering directory '/home/standard/getfem/interface'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/interface'
make[2]: Leaving directory '/home/standard/getfem/interface'
make[1]: Leaving directory '/home/standard/getfem/interface'
Making install in contrib
make[1]: Entering directory '/home/standard/getfem/contrib'
Making install in icare
make[2]: Entering directory '/home/standard/getfem/contrib/icare'
make[3]: Entering directory '/home/standard/getfem/contrib/icare'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/icare'
make[2]: Leaving directory '/home/standard/getfem/contrib/icare'
Making install in delaminated_crack
make[2]: Entering directory '/home/standard/getfem/contrib/delaminated_crack'
make[3]: Entering directory '/home/standard/getfem/contrib/delaminated_crack'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/delaminated_crack'
make[2]: Leaving directory '/home/standard/getfem/contrib/delaminated_crack'
Making install in aposteriori
make[2]: Entering directory '/home/standard/getfem/contrib/aposteriori'
make[3]: Entering directory '/home/standard/getfem/contrib/aposteriori'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/aposteriori'
make[2]: Leaving directory '/home/standard/getfem/contrib/aposteriori'
Making install in xfem_stab_unilat_contact
make[2]: Entering directory '/home/standard/getfem/contrib/xfem_stab_unilat_contact'
make[3]: Entering directory '/home/standard/getfem/contrib/xfem_stab_unilat_contact'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/xfem_stab_unilat_contact'
make[2]: Leaving directory '/home/standard/getfem/contrib/xfem_stab_unilat_contact'
Making install in bimaterial_crack_test
make[2]: Entering directory '/home/standard/getfem/contrib/bimaterial_crack_test'
make[3]: Entering directory '/home/standard/getfem/contrib/bimaterial_crack_test'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/bimaterial_crack_test'
make[2]: Leaving directory '/home/standard/getfem/contrib/bimaterial_crack_test'
Making install in mixed_elastostatic
make[2]: Entering directory '/home/standard/getfem/contrib/mixed_elastostatic'
make[3]: Entering directory '/home/standard/getfem/contrib/mixed_elastostatic'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/mixed_elastostatic'
make[2]: Leaving directory '/home/standard/getfem/contrib/mixed_elastostatic'
Making install in xfem_contact
make[2]: Entering directory '/home/standard/getfem/contrib/xfem_contact'
make[3]: Entering directory '/home/standard/getfem/contrib/xfem_contact'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/xfem_contact'
make[2]: Leaving directory '/home/standard/getfem/contrib/xfem_contact'
Making install in crack_plate
make[2]: Entering directory '/home/standard/getfem/contrib/crack_plate'
make[3]: Entering directory '/home/standard/getfem/contrib/crack_plate'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/crack_plate'
make[2]: Leaving directory '/home/standard/getfem/contrib/crack_plate'
Making install in static_contact_gears
make[2]: Entering directory '/home/standard/getfem/contrib/static_contact_gears'
make[3]: Entering directory '/home/standard/getfem/contrib/static_contact_gears'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/static_contact_gears'
make[2]: Leaving directory '/home/standard/getfem/contrib/static_contact_gears'
Making install in level_set_contact
make[2]: Entering directory '/home/standard/getfem/contrib/level_set_contact'
make[3]: Entering directory '/home/standard/getfem/contrib/level_set_contact'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/level_set_contact'
make[2]: Leaving directory '/home/standard/getfem/contrib/level_set_contact'
Making install in test_plasticity
make[2]: Entering directory '/home/standard/getfem/contrib/test_plasticity'
make[3]: Entering directory '/home/standard/getfem/contrib/test_plasticity'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/test_plasticity'
make[2]: Leaving directory '/home/standard/getfem/contrib/test_plasticity'
Making install in opt_assembly
make[2]: Entering directory '/home/standard/getfem/contrib/opt_assembly'
make[3]: Entering directory '/home/standard/getfem/contrib/opt_assembly'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/opt_assembly'
make[2]: Leaving directory '/home/standard/getfem/contrib/opt_assembly'
Making install in continuum_mechanics
make[2]: Entering directory '/home/standard/getfem/contrib/continuum_mechanics'
make[3]: Entering directory '/home/standard/getfem/contrib/continuum_mechanics'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib/continuum_mechanics'
make[2]: Leaving directory '/home/standard/getfem/contrib/continuum_mechanics'
make[2]: Entering directory '/home/standard/getfem/contrib'
make[3]: Entering directory '/home/standard/getfem/contrib'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/contrib'
make[2]: Leaving directory '/home/standard/getfem/contrib'
make[1]: Leaving directory '/home/standard/getfem/contrib'
Making install in bin
make[1]: Entering directory '/home/standard/getfem/bin'
make[2]: Entering directory '/home/standard/getfem/bin'
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/standard/getfem/bin'
make[1]: Leaving directory '/home/standard/getfem/bin'
Making install in doc
make[1]: Entering directory '/home/standard/getfem/doc'
Making install in sphinx
make[2]: Entering directory '/home/standard/getfem/doc/sphinx'
make[3]: Entering directory '/home/standard/getfem/doc/sphinx'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/doc/sphinx'
make[2]: Leaving directory '/home/standard/getfem/doc/sphinx'
make[2]: Entering directory '/home/standard/getfem/doc'
make[3]: Entering directory '/home/standard/getfem/doc'
make[3]: Nothing to be done for 'install-exec-am'.
make[3]: Nothing to be done for 'install-data-am'.
make[3]: Leaving directory '/home/standard/getfem/doc'
make[2]: Leaving directory '/home/standard/getfem/doc'
make[1]: Leaving directory '/home/standard/getfem/doc'
make[1]: Entering directory '/home/standard/getfem'
make[2]: Entering directory '/home/standard/getfem'
 /usr/bin/mkdir -p '/opt/gf20251104_libmumps_ptscotch/bin'
 /usr/bin/install -c getfem-config '/opt/gf20251104_libmumps_ptscotch/bin'
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/standard/getfem'
make[1]: Leaving directory '/home/standard/getfem'

After the installation, I tested it with the following command:

PYTHONPATH=/opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/ mpirun -np 4 python3 check_mumps_ctx.py

I get the following log:

Traceback (most recent call last):
  File "/home/standard/getfem/interface/tests/python/check_mumps_ctx.py", line 32, in <module>
    import getfem as gf
ModuleNotFoundError: No module named 'getfem'
Traceback (most recent call last):
  File "/home/standard/getfem/interface/tests/python/check_mumps_ctx.py", line 32, in <module>
    import getfem as gf
ModuleNotFoundError: No module named 'getfem'
Traceback (most recent call last):
  File "/home/standard/getfem/interface/tests/python/check_mumps_ctx.py", line 32, in <module>
    import getfem as gf
ModuleNotFoundError: No module named 'getfem'
Traceback (most recent call last):
  File "/home/standard/getfem/interface/tests/python/check_mumps_ctx.py", line 32, in <module>
    import getfem as gf
ModuleNotFoundError: No module named 'getfem'
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[2750,1],0]
  Exit code:    1


so it is saying that there is no module named getfem.
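A quick way to diagnose a ModuleNotFoundError like this is to ask the interpreter where (if anywhere) it would load the module from. A minimal sketch (the helper name is made up for illustration):

```python
import importlib.util
import sys

def module_location(name):
    """Return the file a module would be imported from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(sys.executable)              # which python3 mpirun actually launches
print(module_location("json"))     # a stdlib module, always found
print(module_location("getfem"))   # None unless PYTHONPATH points at the
                                   # site-packages dir created by `make install`
```

Running this with the same `PYTHONPATH=... mpirun ...` prefix as the failing command shows immediately whether the path you exported is the one the worker processes actually see.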

I also added the following line to the .bashrc

# <<< End GetFEM parallel installation >>>
export GETFEM_DIR=/opt/gf20251104_libmumps_ptscotch
export PATH="$GETFEM_DIR/bin:$PATH"
export LD_LIBRARY_PATH="$GETFEM_DIR/lib:$LD_LIBRARY_PATH"
export PYTHONPATH="$GETFEM_DIR/lib/python3.12/site-packages:$PYTHONPATH"

Could you please check and let me know if I did everything correctly?

Thank you for your help

Thierry

do not add stuff to your bashrc, it is not a good habit.

what is the output of

ls -al /opt/gf20251104_libmumps_ptscotch/lib

and

ls -al /opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/getfem/

?

If LD_… is required, the command should be something like

LD_LIBRARY_PATH=/opt/gf20251104_libmumps_ptscotch/lib PYTHONPATH=/opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/getfem python3 ....

Hi Konstantinos,
Here is what I have on the terminal for the first command

total 40860
drwxr-xr-x 2 root root       45 Nov  5 15:45 .
drwxr-xr-x 6 root root       65 Nov  5 15:45 ..
-rw-r--r-- 1 root root 41833632 Nov  5 15:45 libgetfem.a
-rwxr-xr-x 1 root root     1060 Nov  5 15:45 libgetfem.la

then for the second one I have the following

/opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/getfem/
ls: cannot access '/opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/getfem/': No such file or directory

I don’t know whether the installation succeeded or not.

no, it seems only the C++ library was compiled. Run the configure command again and show the outcome, or at least the last part of it. Maybe you lack the python scipy package or something similar.

Yes, I didn’t install scipy, which is indeed required.
This is the log:

Ready to build getfem
  building MATLAB interface: NO
  building OCTAVE interface: NO
  building PYTHON interface: YES (requires numpy, scipy and also mpi4py for the parallel version)
  building SCILAB interface: NO
  If you want to build the shared library of GetFEM, use --enable-shared
  (by default, only the static one will be built)

Thanks Konstantinos,
the installation succeeded.
I just ran my old code with the following command

PYTHONPATH=/opt/gf20251104_libmumps_ptscotch/lib/python3.12/site-packages/ mpirun -np 4 python3 distributed_tickness.py 

and it works

alpha =      1  iter  17 residual  1.87688e-05
alpha =      1  iter  17 residual  1.87688e-05
alpha =      1  iter  17 residual  1.87688e-05
alpha =      1  iter  17 residual  1.87688e-05
Assembly time 0.102119
UNSYMMETRIC MUMPS time 0.271421
Assembly time 0.0319855
alpha =      1  iter  18 residual  9.32491e-06
alpha =      1  iter  18 residual  9.32491e-06
alpha =      1  iter  18 residual  9.32491e-06
alpha =      1  iter  18 residual  9.32491e-06
Assembly time 0.101541
UNSYMMETRIC MUMPS time 0.271344
Assembly time 0.0317143
alpha =      1  iter  19 residual  4.63444e-06
alpha =      1  iter  19 residual  4.63444e-06
alpha =      1  iter  19 residual  4.63444e-06
alpha =      1  iter  19 residual  4.63444e-06
Assembly time 0.100186
UNSYMMETRIC MUMPS time 0.271002
Assembly time 0.0314643
alpha =      1  iter  20 residual  2.30414e-06
alpha =      1  iter  20 residual  2.30414e-06
alpha =      1  iter  20 residual  2.30414e-06
alpha =      1  iter  20 residual  2.30414e-06
Assembly time 0.0314349
 iter   0 residual       1.9119
 iter   0 residual       1.9119
 iter   0 residual       1.9119
 iter   0 residual       1.9119
Assembly time 0.10127
UNSYMMETRIC MUMPS time 0.273652
Assembly time 0.0344527
alpha =      1  iter   1 residual     0.109444
alpha =      1  iter   1 residual     0.109444
alpha =      1  iter   1 residual     0.109444
alpha =      1  iter   1 residual     0.109444
Assembly time 0.100606
UNSYMMETRIC MUMPS time 0.261013
Assembly time 0.0319752
alpha =      1  iter   2 residual    0.0169839
alpha =      1  iter   2 residual    0.0169839
alpha =      1  iter   2 residual    0.0169839
alpha =      1  iter   2 residual    0.0169839
Assembly time 0.100385
UNSYMMETRIC MUMPS time 0.261423
Assembly time 0.0315738
alpha =      1  iter   3 residual   0.00582688
alpha =      1  iter   3 residual   0.00582688
alpha =      1  iter   3 residual   0.00582688
alpha =      1  iter   3 residual   0.00582688
Assembly time 0.0984828
UNSYMMETRIC MUMPS time 0.271813
Assembly time 0.0316569
alpha =      1  iter   4 residual   0.00131978
alpha =      1  iter   4 residual   0.00131978
alpha =      1  iter   4 residual   0.00131978
alpha =      1  iter   4 residual   0.00131978
Assembly time 0.0986642
UNSYMMETRIC MUMPS time 0.273473
Assembly time 0.0316265
alpha =      1  iter   5 residual   0.00215794
alpha =      1  iter   5 residual   0.00215794
alpha =      1  iter   5 residual   0.00215794
alpha =      1  iter   5 residual   0.00215794
Assembly time 0.0994241
UNSYMMETRIC MUMPS time 0.271528
Assembly time 0.0316944
alpha =      1  iter   6 residual  0.000379096
alpha =      1  iter   6 residual  0.000379096
alpha =      1  iter   6 residual  0.000379096
alpha =      1  iter   6 residual  0.000379096
Assembly time 0.100187
UNSYMMETRIC MUMPS time 0.27726
Assembly time 0.0317471
alpha =      1  iter   7 residual  0.000265548
alpha =      1  iter   7 residual  0.000265548
alpha =      1  iter   7 residual  0.000265548
alpha =      1  iter   7 residual  0.000265548
Assembly time 0.100876
UNSYMMETRIC MUMPS time 0.270736
Assembly time 0.0359517
alpha =      1  iter   8 residual  0.000104877
alpha =      1  iter   8 residual  0.000104877
alpha =      1  iter   8 residual  0.000104877
alpha =      1  iter   8 residual  0.000104877
Assembly time 0.09877
UNSYMMETRIC MUMPS time 0.27919
Assembly time 0.036367
alpha =      1  iter   9 residual  0.000412521
alpha =      1  iter   9 residual  0.000412521
alpha =      1  iter   9 residual  0.000412521
alpha =      1  iter   9 residual  0.000412521
Assembly time 0.0995431
UNSYMMETRIC MUMPS time 0.268591
Assembly time 0.0323194
alpha =      1  iter  10 residual   0.00010977
alpha =      1  iter  10 residual   0.00010977
alpha =      1  iter  10 residual   0.00010977
alpha =      1  iter  10 residual   0.00010977
Assembly time 0.099383
UNSYMMETRIC MUMPS time 0.279637
Assembly time 0.0339313
alpha =      1  iter  11 residual   0.00045345
alpha =      1  iter  11 residual   0.00045345
alpha =      1  iter  11 residual   0.00045345
alpha =      1  iter  11 residual   0.00045345
Assembly time 0.103306
UNSYMMETRIC MUMPS time 0.278841
Assembly time 0.0312605
alpha =      1  iter  12 residual  6.10579e-05
alpha =      1  iter  12 residual  6.10579e-05
alpha =      1  iter  12 residual  6.10579e-05
alpha =      1  iter  12 residual  6.10579e-05
Assembly time 0.100815
UNSYMMETRIC MUMPS time 0.264416
Assembly time 0.0317974
alpha =      1  iter  13 residual  0.000140615
alpha =      1  iter  13 residual  0.000140615
alpha =      1  iter  13 residual  0.000140615
alpha =      1  iter  13 residual  0.000140615
Assembly time 0.0989135
UNSYMMETRIC MUMPS time 0.268727
Assembly time 0.0315189
alpha =      1  iter  14 residual   0.00010228
alpha =      1  iter  14 residual   0.00010228
alpha =      1  iter  14 residual   0.00010228
alpha =      1  iter  14 residual   0.00010228
Assembly time 0.100625
UNSYMMETRIC MUMPS time 0.261669
Assembly time 0.03635
alpha =      1  iter  15 residual  9.34604e-05
alpha =      1  iter  15 residual  9.34604e-05
alpha =      1  iter  15 residual  9.34604e-05
alpha =      1  iter  15 residual  9.34604e-05
Assembly time 0.0988323
UNSYMMETRIC MUMPS time 0.27452
Assembly time 0.0366321
alpha =      1  iter  16 residual  4.90821e-05
alpha =      1  iter  16 residual  4.90821e-05
alpha =      1  iter  16 residual  4.90821e-05
alpha =      1  iter  16 residual  4.90821e-05
Assembly time 0.101246
UNSYMMETRIC MUMPS time 0.272269
Assembly time 0.0378068
alpha =      1  iter  17 residual  2.48083e-05
alpha =      1  iter  17 residual  2.48083e-05
alpha =      1  iter  17 residual  2.48083e-05
alpha =      1  iter  17 residual  2.48083e-05
Assembly time 0.0990927
UNSYMMETRIC MUMPS time 0.268353
Assembly time 0.0315099
alpha =      1  iter  18 residual  1.25437e-05
alpha =      1  iter  18 residual  1.25437e-05
alpha =      1  iter  18 residual  1.25437e-05
alpha =      1  iter  18 residual  1.25437e-05
Assembly time 0.0989411
UNSYMMETRIC MUMPS time 0.276875
Assembly time 0.0311362
alpha =      1  iter  19 residual  6.34457e-06
alpha =      1  iter  19 residual  6.34457e-06
alpha =      1  iter  19 residual  6.34457e-06
alpha =      1  iter  19 residual  6.34457e-06
Assembly time 0.0995021
UNSYMMETRIC MUMPS time 0.278682
Assembly time 0.0318687
alpha =      1  iter  20 residual   3.2102e-06
alpha =      1  iter  20 residual   3.2102e-06
alpha =      1  iter  20 residual   3.2102e-06
alpha =      1  iter  20 residual   3.2102e-06

Thanks for your help.

I apologize for not getting back to you sooner.
After installing GetFEM, I tested the demo_parallel_laplacian.py code.

I would like to clarify whether a code written entirely in GWFL is already parallelized, or whether I need to rewrite it in another way.
Correct me if I am mistaken, but I noticed that the demo_parallel_laplacian.py example appears to be written entirely in GWFL.

Moreover, do you have any more advanced resources beyond the provided example that could help me better understand and implement this approach?

Thank you in advance for your assistance.
Kind regards,
TH

Normally, with paralevel=2, if you use GWFL for assembling and MUMPS for solving, everything should work out of the box without you even noticing.

What happens behind the scenes is that your mesh is partitioned with Metis into one region per process; you never see these regions, but they are used internally. Let’s say you run with -np 4: then GetFEM on process #1 will have an mpi_region which is 1/4th of the mesh, on process #2 mpi_region will be another 1/4th of the mesh, and so on. Let’s call these regions RG1, RG2, RG3, RG4, although they do not all exist in the same place; process #1 knows nothing about RG2, RG3, RG4, and so on.

Whenever GetFEM needs to assemble a matrix on some region RG, it will assemble it only on the intersection between RG and mpi_region. On process #1 this will be the overlap between RG and RG1, on process #2 the overlap between RG and RG2, and so on. This is called a distributed tangent matrix. If you added the submatrices from all processes, you would get the total matrix.

Whenever GetFEM needs to assemble a residual vector, it starts the same way as with matrices, but at the end all processes talk to each other and add their results together, so that every process holds the total residual vector.
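The matrix/vector distinction described above can be pictured with a toy numpy sketch (pure numpy, no MPI; the four arrays stand in for the four processes):

```python
import numpy as np

n = 6
total = np.arange(1.0, n + 1)            # the "true" global residual vector

# each process assembles only the entries living on its own mesh part
parts = [np.zeros(n) for _ in range(4)]
for i in range(n):
    parts[i % 4][i] = total[i]           # entry i owned by process i % 4

# matrices stay distributed: each process keeps just its own part.
# residual vectors are summed at the end (an MPI allreduce in practice),
# so that every process ends up holding the full vector:
summed = sum(parts)
print(np.allclose(summed, total))        # True
```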

That’s exactly the format MUMPS requires by default to perform the linear solves: distributed matrix and non-distributed right-hand side. The best way to understand how MUMPS works is to study the very simple examples found here:

https://cgit.git.savannah.gnu.org/cgit/getfem.git/tree/interface/tests/python/check_mumps_ctx.py

Otherwise as a GetFEM user you should not care about all this, it should just work.

In my MPI python scripts, the only MPI-related code is to limit printing and exporting to just 1 process, e.g.

from mpi4py import MPI
comm = MPI.COMM_WORLD
rank = comm.Get_rank()

....

if rank == 0: md.variable_list()

....

if rank == 0: print(f"{i}: solve for p={p:.4g} (dp={p-p_old:.4g})")

....

    if rank == 0:
      output = (mfu, md.variable("u"), "Displacements",
                mfx, md.variable("x"), "x",
                mfx, md.variable("x")-md.variable("x0"), "dx",
                ...)
      mfout.export_to_vtu(f"{resultspath}/somefilename_{i}.vtu", *output)
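The rank-gating pattern above can be condensed into a tiny self-contained sketch (the try/except fallback is only so it also runs serially when mpi4py is absent; the values are made up):

```python
try:
    from mpi4py import MPI
    rank = MPI.COMM_WORLD.Get_rank()
except ImportError:          # serial fallback when mpi4py is not installed
    rank = 0

i, p, p_old = 3, 0.1234, 0.1200
# only rank 0 prints, so the message appears once instead of np times
if rank == 0:
    print(f"{i}: solve for p={p:.4g} (dp={p - p_old:.4g})")
```

With `mpirun -np 4`, every process executes the script, but the `if rank == 0` guard keeps the console output (and file exports) from being repeated four times.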

Hello Konstantinos,

Thank you very much for your clear and detailed explanation. It really helps me better understand the internal workings of GetFEM with MUMPS and MPI parallelism.

I have a small question: is there a way to visualize the partitioning performed by Metis directly in ParaView? This would greatly help me to better grasp the mesh partition and give me clearer ideas about domain decomposition.

Thank you again for your valuable help.

Best regards

sure, just define a 0-order mesh_fem (i.e. element-wise constant), and perform some residual vector assembly depending on the rank of the process:

mf = gf.MeshFem(mesh, 1)
mf.set_classical_fem(0)   # order 0, i.e. element-wise constant

mim = gf.MeshIm(mesh, 1)  # low-order integration is enough for constants

# element "areas": assembly of the test function alone
VEC0 = gf.asm_generic(mim, 1, "Test_t", -1, "t", True, mf, np.zeros(mf.nbdof()))
# the same assembly, scaled by the MPI rank of the current process
VEC = gf.asm_generic(mim, 1, f"{rank}*Test_t", -1, "t", True, mf, np.zeros(mf.nbdof()))

# dividing out the areas leaves the rank as an element-wise constant field
mf.export_to_vtu("MPIregions.vtu", mf, VEC/VEC0, "rank")
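The arithmetic behind the VEC/VEC0 division can be checked with a pure-numpy stand-in (the areas and rank below are made-up values, not GetFEM output):

```python
import numpy as np

rank = 2                                # pretend this process has MPI rank 2
areas = np.array([0.5, 0.25, 0.25])     # per-element "areas" from the Test_t assembly
vec0 = areas                            # assembly of "Test_t"
vec = rank * areas                      # assembly of f"{rank}*Test_t"
field = vec / vec0                      # areas cancel, leaving the rank
print(field)                            # [2. 2. 2.]
```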

Hi Konstantinos,

Thank you for sharing this solution. It is really clever and elegant.

If I understand correctly, the idea is the following: you first compute the area or volume of each element with a standard residual assembly. Then you repeat the same assembly but multiplied by the MPI rank. By dividing VEC / VEC0, you recover a field that contains only the rank value per element. This is a very smart way to visualize the Metis partition directly in ParaView.
This technique is similar to what I do to assign different material properties.
Thank you again, it is extremely helpful.

If I remember correctly, the model.interpolation function should also work properly with MPI, so a simpler syntax would be just one line:

mf.export_to_vtu("MPIregions.vtu", mf, md.interpolation(f"{rank}", mf), "rank")

it requires you to have some model object instance md though.

Thank you very much for your previous solution. I had already tested it with md.interpolation, and I obtained the same result as with the previous version.

I do have another question regarding the solver behaviour. At the moment, I am using md.solve("noisy") without specifying any rank-related settings. To my surprise, the residual printed during the iterations is identical on all processors.

If I understand correctly, this is expected: the linear system assembled in parallel does not produce a local residual per processor. Instead, MUMPS performs the distributed assembly and factorisation, then solves the global system and broadcasts the solution to all ranks. Therefore, the reported value corresponds to the global Newton residual, which is printed by every processor.

Could you please confirm that this interpretation is correct?

Best regards,

Temps 0.40
Assembly time 0.00643555
 iter   0 residual      1.22799
 iter   0 residual      1.22799
 iter   0 residual      1.22799
 iter   0 residual      1.22799
Assembly time 0.0166266
UNSYMMETRIC MUMPS time 0.168325
Assembly time 0.00646912
step control [       0,       1,         1] iter   1 residual   0.00225306
step control [       0,       1,         1] iter   1 residual   0.00225306
step control [       0,       1,         1] iter   1 residual   0.00225306
step control [       0,       1,         1] iter   1 residual   0.00225306
Assembly time 0.0165932
UNSYMMETRIC MUMPS time 0.165075
Assembly time 0.00655757
step control [       0,       1,         1] iter   2 residual  1.58291e-07
step control [       0,       1,         1] iter   2 residual  1.58291e-07
step control [       0,       1,         1] iter   2 residual  1.58291e-07
step control [       0,       1,         1] iter   2 residual  1.58291e-07
Assembly time 0.0164527
UNSYMMETRIC MUMPS time 0.195681
Assembly time 0.00642749
step control [       0,       1,         1] iter   3 residual  2.96827e-15
Time for model solve Electrophysiology 0.8602870020000006
step control [       0,       1,         1] iter   3 residual  2.96827e-15
step control [       0,       1,         1] iter   3 residual  2.96827e-15
step control [       0,       1,         1] iter   3 residual  2.96827e-15
Temps 0.50

yes, as mentioned earlier, MUMPS only requires the matrix to be distributed; the rhs and solution are not distributed. MUMPS has an option for a distributed rhs, but we do not use it.

If you wonder why this is so simple compared to parallelization in other software, it is because MUMPS makes a very nice design decision: it is built to solve distributed matrices with overlapping entries.

Which means you can partition the matrix

\begin{bmatrix}10 &20 &0\\30 &40 &50\\0&60&70\end{bmatrix}

as

\begin{bmatrix}10 &20 &0\\30 &20 &0\\0&0&0\end{bmatrix}+ \begin{bmatrix}0 &0 &0\\0 &20 &50\\0&60&70\end{bmatrix}

or as

\begin{bmatrix}10 &20 &0\\30 &10 &0\\0&0&0\end{bmatrix}+ \begin{bmatrix}0 &0 &0\\0 &30 &50\\0&60&70\end{bmatrix}
and MUMPS will give you the correct result.
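Both partitions can be verified numerically; a quick numpy check (the matrices are copied from the post above):

```python
import numpy as np

A   = np.array([[10., 20., 0.], [30., 40., 50.], [0., 60., 70.]])

# first partition: the central entry 40 is split as 20 + 20
A1a = np.array([[10., 20., 0.], [30., 20., 0.], [0., 0., 0.]])
A1b = np.array([[0., 0., 0.], [0., 20., 50.], [0., 60., 70.]])

# second partition: the same entry split as 10 + 30
A2a = np.array([[10., 20., 0.], [30., 10., 0.], [0., 0., 0.]])
A2b = np.array([[0., 0., 0.], [0., 30., 50.], [0., 60., 70.]])

print(np.allclose(A1a + A1b, A))   # True
print(np.allclose(A2a + A2b, A))   # True
```

Any splitting whose entries sum to the original matrix is acceptable input, which is exactly why GetFEM's distributed tangent matrices can be handed to MUMPS without any reassembly.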

Thank you very much for your response. Indeed, I was wondering why it was so simple, and your explanation clarified it perfectly.
Best regards.