
CA 10.7 parallel Debian7 - make ptscotch

11 years 11 months ago - 11 years 11 months ago #6637 by MGolbs
Hello,

I am trying to install CA 10.7 parallel under Debian 7.

Source: Code-Aster 10.7 parallel installation
Version
OS: Ubuntu 12.04
Code_Aster: ver. 10.7


Preparation
Download files
Save location for files : ~/Install_Files
Install location : /opt and /opt/aster

Download the files below and place them in Install_Files

Site        File name
Code_Aster  aster-full-src-10.7.0-1.noarch.tar.gz
ACML        acml-5-1-0-gfortran-64bit.tgz
ScaLAPACK   scalapack_installer.tgz
ParMETIS    ParMetis-3.2.0.tar.gz
PETSc       petsc-2.3.3.tar.gz

Change owner of install location
Change the owner of the install location from root to the login user.
$ sudo chown username /opt/

Install libraries for Code_Aster
Install the libraries required by Code_Aster.
$ sudo apt-get install gfortran g++ python-dev python-qt4 python-numpy liblapack-dev libblas-dev tcl tk zlib1g-dev bison flex checkinstall openmpi-bin openmpi-dev
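Before building anything, it can save time to confirm that the toolchain from the apt line above is actually on the PATH. A small sketch (the helper name `check_tools` is invented here for illustration):

```shell
# Report whether each required build tool is found on the PATH.
check_tools() {
  for t in "$@"; do
    if command -v "$t" >/dev/null 2>&1; then
      echo "$t: ok"
    else
      echo "$t: MISSING"
    fi
  done
}
check_tools gfortran g++ mpicc mpif90 bison flex
```

If `mpicc` or `mpif90` shows up as MISSING, the Open MPI packages did not install the wrappers and the parallel builds further down will fail.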

Compile ACML
Compile ACML (math library for Code_Aster).

$ cd ~/Install_Files/

$ mkdir ./acml

$ cp acml-5-1-0-gfortran-64bit.tgz ./acml

$ cd ./acml

$ tar xfvz acml-5-1-0-gfortran-64bit.tgz

$ ./install-acml-5-1-0-gfortran-64bit.sh -accept -installdir=/opt/acml5.1.0

$ echo /opt/acml5.1.0/gfortran64/lib | sudo tee -a /etc/ld.so.conf.d/amd_math.conf

$ sudo ldconfig

Compile Code_Aster (sequential)
Compile Code_Aster (sequential) with ACML.

$ cd ~/Install_Files

$ tar xfvz aster-full-src-10.7.0-1.noarch.tar.gz

$ cd aster-full-src-10.7.0/

$ sed -i "s:PREFER_COMPILER\ =\ 'GNU':PREFER_COMPILER\ =\'GNU_without_MATH'\nMATHLIB=\ '-L/opt/acml5.1.0/gfortran64/lib -lacml':g" setup.cfg

$ python setup.py install
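The sed one-liner above is dense; here is what it does to `setup.cfg`, demonstrated on a throwaway copy (the spacing in the replacement is tidied slightly; `setup.cfg` is Python, so spacing is not significant, and the ACML path is the one used in this guide):

```shell
# Throwaway copy so the demo does not touch a real setup.cfg.
cat > /tmp/setup_demo.cfg <<'EOF'
PREFER_COMPILER = 'GNU'
EOF
# Switch the compiler profile and point MATHLIB at ACML, as the guide's sed does.
sed -i "s:PREFER_COMPILER\ =\ 'GNU':PREFER_COMPILER\ =\ 'GNU_without_MATH'\nMATHLIB\ =\ '-L/opt/acml5.1.0/gfortran64/lib -lacml':g" /tmp/setup_demo.cfg
cat /tmp/setup_demo.cfg
```

The result is two lines: the compiler selection switched to `GNU_without_MATH`, plus a new `MATHLIB` line linking against ACML.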


Create a host file for parallel calculation. Put your PC's hostname in place of 'ubuntu', and the number of processors to use for parallel calculation after 'cpu='.
$ echo ubuntu cpu=2 >> /opt/aster/etc/codeaster/mpi_hostfile
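Rather than hard-coding 'ubuntu cpu=2', the entry can be generated from the machine itself. A sketch (it prints to stdout here instead of writing the real hostfile):

```shell
# Build the hostfile line from the actual hostname and core count.
hostfile_line="$(uname -n) cpu=$(nproc)"
echo "$hostfile_line"
# To append it for real:
#   echo "$hostfile_line" >> /opt/aster/etc/codeaster/mpi_hostfile
```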

Compile ScaLAPACK

$ cd ~/Install_Files

$ tar xfvz scalapack_installer.tgz

$ cd scalapack_installer_1.0.2

$ ./setup.py --lapacklib=/opt/acml5.1.0/gfortran64/lib/libacml.a --mpicc=mpicc --mpif90=mpif90 --mpiincdir=/usr/lib/openmpi/include --prefix=/opt/scalapack


An error message "BLACS: error running BLACS test routines xCbtest" will show up after compilation.

The build has nevertheless succeeded if the file "/opt/scalapack/lib/libscalapack.a" exists.
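That success criterion can be checked mechanically. A sketch (the helper name `check_lib` is invented here):

```shell
# Report whether a static library produced by the installer exists.
check_lib() {
  if [ -f "$1" ]; then
    echo "found: $1"
  else
    echo "missing: $1"
  fi
}
check_lib /opt/scalapack/lib/libscalapack.a
```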



Compile ParMETIS

$ cp ~/Install_Files/ParMetis-3.2.0.tar.gz /opt

$ cd /opt

$ tar xfvz ParMetis-3.2.0.tar.gz

$ cd ParMetis-3.2.0


Change 'Makefile.in' as shown in the linked ParMETIS Makefile.in.

$ make

Compile PT-Scotch
Copy scotch-5.1.11_esmumps from Code_Aster's source files to /opt.
Compile with the MPI compiler.

$ cp ~/Install_Files/aster-full-src-10.7.0/SRC/scotch-5.1.11_esmumps-2.tar.gz /opt/

$ cd /opt

$ tar xfvz scotch-5.1.11_esmumps-2.tar.gz

$ mv scotch_5.1.11_esmumps scotch_5.1.11_esmumps_mpi

$ cd scotch_5.1.11_esmumps_mpi/src



Change 'Makefile.inc' as shown in the linked PT-Scotch Makefile.inc.

$ make scotch

$ make ptscotch

Compile MUMPS
Copy mumps-4.9.2 from Code_Aster's source files to /opt.
Compile with the MPI compiler.
$ cp ~/Install_Files/aster-full-src-10.7.0/SRC/mumps-4.9.2.tar.gz /opt/

$ cd /opt

$ tar xfvz mumps-4.9.2.tar.gz

$ mv mumps-4.9.2 mumps-4.9.2_mpi

$ cd mumps-4.9.2_mpi/


Change 'Makefile.inc' as shown in the linked MUMPS Makefile.inc.

$ make all


Compile PETSc

$ cp ~/Install_Files/petsc-2.3.3.tar.gz /opt

$ cd /opt

$ tar xfvz petsc-2.3.3.tar.gz

$ cd petsc-2.3.3-p16

$ ./config/configure.py --with-mpi-dir=/usr/lib/openmpi/lib --with-blas-lapack-lib=/opt/acml5.1.0/gfortran64/lib/libacml.a --with-debugging=0 COPTFLAGS=-O3 FOPTFLAGS=-O3 --with-shared=0 --configModules=PETSc.Configure --optionsModule=PETSc.compilerOptions --with-x=0

$ PETSC_ARCH=linux-gnu-c-opt; export PETSC_ARCH

$ PETSC_DIR=/opt/petsc-2.3.3-p16; export PETSC_DIR

$ make

Compile Code_Aster (parallel)
With a text editor, change the 'mpi_get_procid_cmd' entry of '/opt/aster/etc/codeaster/asrun' to the following.
mpi_get_procid_cmd : echo $OMPI_COMM_WORLD_RANK

Change the following numbers if the number of processors exceeds 32.
batch_mpi_nbpmax : 32
interactif_mpi_nbpmax : 32

Copy '/opt/aster/STA10.7' and make 'PAR10.7'.

$ cd /opt/aster

$ cp -ax STA10.7 PAR10.7

$ cd /opt/aster/PAR10.7

Change config.txt and profile.sh as shown in the linked PAR config.txt and PAR profile.sh, then re-compile Code_Aster.

$ ../bin/as_run --vers=PAR10.7 --make clean

$ ../bin/as_run --vers=PAR10.7 --make clean bibf90/mumps

$ ../bin/as_run --vers=PAR10.7 --make clean bibf90/petsc

$ ../bin/as_run --vers=PAR10.7 --make clean bibc/scotch

$ ../bin/as_run --vers=PAR10.7 --make


First I have to downgrade to gfortran_4.4.5-1_amd64.deb.

The "Fortran fno-tree-dse" problem...
..

Then I get:

golbs@debian7-cae:/opt/scalapack_installer_1.0.2$ ./setup.py --lapacklib=/opt/acml5.1.0/gfortran64/lib/libacml.a --mpicc=mpicc --mpif90=mpif90 --mpiincdir=/usr/lib/openmpi/include --prefix=/opt/scalapack
========================================
Setting up the framework
ScaLAPACK installer version (1, 0, 2)
mpicc is mpicc
mpif90 is mpif90
MPI include dir is /usr/lib/openmpi/include
Creating directory /opt/scalapack
Install directory is... /opt/scalapack
Creating directory /opt/scalapack_installer_1.0.2/build
Build directory is... /opt/scalapack_installer_1.0.2/build
BLAS library is...
LAPACK library is... /opt/acml5.1.0/gfortran64/lib/libacml.a
Checking if mpicc works... yes
Checking if mpirun works... yes
Checking if mpif90 works... yes
Setting Fortran mangling... -DAdd_
Setting download command...
Checking availability of urllib... available
Testing urllib... working
Setting ranlib command... /usr/bin/ranlib
Detecting Fortran compiler... unknown
Detecting C compiler... unknown
C flags are... -O2
Fortran flags are... -O2
Selected loader flags (C main):
Selected loader flags (f90 main):
Selected NOOPT flags: -O0
AR flags are... rc
Checking loader... works

========================================
BLAS installation/verification
========================================
Checking if the provided LAPACK (/opt/acml5.1.0/gfortran64/lib/libacml.a) contains BLAS
BLAS library is set to /opt/acml5.1.0/gfortran64/lib/libacml.a
Checking if provided BLAS works... yes
Using the BLAS library contained in LAPACK library /opt/acml5.1.0/gfortran64/lib/libacml.a

========================================
Lapack installation/verification
========================================
LAPACK library is /opt/acml5.1.0/gfortran64/lib/libacml.a
Checking if provided LAPACK works... yes
Getting LAPACK version number... 3.3.0
Checking if provided LAPACK contains functions for test works... no

========================================
ScaLAPACK installer is starting now. Buckle up!
========================================
Downloading ScaLAPACK... Creating directory /opt/scalapack_installer_1.0.2/build/download
done
Installing scalapack-2.0.2 ...
Writing SLmake.inc... done.
Compiling BLACS, PBLAS and ScaLAPACK... done
Getting ScaLAPACK version number... 2.0.1
Installation of ScaLAPACK successful.
(log is in /opt/scalapack_installer_1.0.2/build/log/scalog )
Compiling test routines... done
Running BLACS test routines...

BLACS: error running BLACS test routines xCbtest


BLACS: Command -np 4 ./xCbtest
stderr:
****************************************
/bin/sh: 1: -np: not found

****************************************
golbs@debian7-cae:/opt/scalapack_installer_1.0.2$

..
..

As Makefile.inc I take one of the files from the ./src/ folder, the Make.inc for Linux amd64. Is that correct? make scotch runs without problems.
Then I get:

golbs@debian7-cae:/opt/scotch_5.1.11_esmumps_mpi/src$ make ptscotch
(cd libscotch ; make VERSION=5 RELEASE=1 PATCHLEVEL=10 ptscotch && make ptinstall)
make[1]: Entering directory `/opt/scotch_5.1.11_esmumps_mpi/src/libscotch'
rm -f *~ *.o lib*.a parser_yy.c parser_ly.h parser_ll.c *scotch.h *scotchf.h y.output dummysizes
make CFLAGS="-O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_PTHREAD -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -DSCOTCH_PTHREAD -Drestrict=__restrict -DIDXSIZE64 -DSCOTCH_PTSCOTCH" CC="mpicc" \
scotch.h \
scotchf.h \
libptscotch.a \
libscotch.a \
libptscotcherr.a \
libptscotcherrexit.a
make[2]: Entering directory `/opt/scotch_5.1.11_esmumps_mpi/src/libscotch'
gcc -O3 -DCOMMON_FILE_COMPRESS_GZ -DCOMMON_PTHREAD -DCOMMON_RANDOM_FIXED_SEED -DSCOTCH_RENAME -DSCOTCH_RENAME_PARSER -DSCOTCH_PTHREAD -Drestrict=__restrict -DIDXSIZE64 -DSCOTCH_PTSCOTCH -DSCOTCH_VERSION=5 -DSCOTCH_RELEASE=1 -DSCOTCH_PATCHLEVEL=10 dummysizes.c -o dummysizes -lz -lm -lrt
In file included from dummysizes.c:80:0:
common.h:99:28: fatal error: mpi.h: No such file or directory
compilation terminated.
make[2]: *** [dummysizes] Error 1
make[2]: Leaving directory `/opt/scotch_5.1.11_esmumps_mpi/src/libscotch'
make[1]: *** [ptscotch] Error 2
make[1]: Leaving directory `/opt/scotch_5.1.11_esmumps_mpi/src/libscotch'
make: *** [ptscotch] Error 2
golbs@debian7-cae:/opt/scotch_5.1.11_esmumps_mpi/src$


What am I doing wrong here?

I am also stuck with MUMPS. The system is a 2x4-core Intel Xeon on amd64 Debian 7.

golbs@debian7-cae:/opt/mumps-4.9.2_mpi$ make all
make ARITH=s mumps_lib
make[1]: Entering directory `/opt/mumps-4.9.2_mpi'
(cd src ; make s)
make[2]: Entering directory `/opt/mumps-4.9.2_mpi/src'
make ARITH=s mumps_lib
make[3]: Entering directory `/opt/mumps-4.9.2_mpi/src'
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_orderings.c
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_size.c
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_io.c
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_io_basic.c
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_io_thread.c
cc -O -I. -I/usr/include -I../include -DAdd_ -I../PORD/include/ -Dpord -c mumps_io_err.c
f90 -O -I/usr/include -Dpord -I. -I../include -c mumps_static_mapping.F
make[3]: f90: command not found
make[3]: *** [mumps_static_mapping.o] Error 127
make[3]: Leaving directory `/opt/mumps-4.9.2_mpi/src'
make[2]: *** Error 2
make[2]: Leaving directory `/opt/mumps-4.9.2_mpi/src'
make[1]: *** [mumps_lib] Error 2
make[1]: Leaving directory `/opt/mumps-4.9.2_mpi'
make: *** Error 2
golbs@debian7-cae:/opt/mumps-4.9.2_mpi$ make all
make ARITH=s mumps_lib
make[1]: Entering directory `/opt/mumps-4.9.2_mpi'
(cd src ; make s)
make[2]: Entering directory `/opt/mumps-4.9.2_mpi/src'
make ARITH=s mumps_lib
make[3]: Entering directory `/opt/mumps-4.9.2_mpi/src'
gfortran -O -Dintel_ -DALLOW_NON_INIT -I/usr/local/mpich/include -Dpord -I. -I../include -c mumps_static_mapping.F
mumps_static_mapping.F:4407: Error: Can't open included file 'mpif.h'
make[3]: *** [mumps_static_mapping.o] Error 1
make[3]: Leaving directory `/opt/mumps-4.9.2_mpi/src'
make[2]: *** Error 2
make[2]: Leaving directory `/opt/mumps-4.9.2_mpi/src'
make[1]: *** [mumps_lib] Error 2
make[1]: Leaving directory `/opt/mumps-4.9.2_mpi'
make: *** Error 2
golbs@debian7-cae:/opt/mumps-4.9.2_mpi$


Can one have the ScaLAPACK, ParMETIS, PT-Scotch, MUMPS & Co. versions from the Debian package manager installed in parallel on the system?


I would appreciate any tips and information.

Regards, Markus

Chasing the superfluous means missing the essential.
Jules Saliège
Last edit: 11 years 11 months ago by MGolbs.
11 years 11 months ago #6638 by RichardS
Replied by RichardS on topic Re: CA 10.7 parallel Debian7 - make ptscotch
Hello Markus,
it looks as if the compiler cannot find the MPI header files.
I think either the include paths for MPI were not specified correctly, or the wrong compiler was used.
For the parallel versions, mpif90 must be used as the compiler instead of gfortran, and mpicc instead of gcc.
Best post your make files.
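For MUMPS, that advice maps to lines like these in Makefile.inc (a sketch only; the variable names follow the MUMPS 4.9.2 Makefile.inc template, and the include path is the one used elsewhere in this thread):

```makefile
# Use the MPI wrapper compilers instead of plain gcc/f90
CC     = mpicc
FC     = mpif90
FL     = mpif90
# Open MPI headers (path as used elsewhere in this thread)
INCPAR = -I/usr/lib/openmpi/include
```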
Regards,
Richard

SimScale - Engineering Simulation in your browser!
11 years 11 months ago #6639 by MGolbs
Hello Richard,

thanks for the tips. The computer is at home, so I will not be able to test again until tonight. How do I get mpif90 and mpicc, via the package manager?

Regards and thanks, Markus

11 years 11 months ago #6640 by RichardS
Replied by RichardS on topic Re: CA 10.7 parallel Debian7 - make ptscotch

MGolbs wrote: Hello Richard,
how do I get mpif90 and mpicc, via the package manager?
Regards and thanks, Markus


Judging by your output, all compilers are present and working:
golbs@debian7-cae:/opt/scalapack_installer_1.0.2$ ./setup.py --lapacklib=/opt/acml5.1.0/gfortran64/lib/libacml.a --mpicc=mpicc --mpif90=mpif90 --mpiincdir=/usr/lib/openmpi/include --prefix=/opt/scalapack
========================================
Setting up the framework
ScaLAPACK installer version (1, 0, 2)
mpicc is mpicc
mpif90 is mpif90
MPI include dir is /usr/lib/openmpi/include
Creating directory /opt/scalapack
Install directory is... /opt/scalapack
Creating directory /opt/scalapack_installer_1.0.2/build
Build directory is... /opt/scalapack_installer_1.0.2/build
BLAS library is...
LAPACK library is... /opt/acml5.1.0/gfortran64/lib/libacml.a
Checking if mpicc works... yes
Checking if mpirun works... yes
Checking if mpif90 works... yes

Was OpenMPI installed? Is the specified path correct?
--mpiincdir=/usr/lib/openmpi/include
There should be a file named mpif.h in this directory.

Regards,
Richard

11 years 11 months ago #6641 by MGolbs
Hello,

yes, OpenMPI is installed. The path is also correct as given. Here are the logs of the compilation.


File Attachment:

File Name: 2012-12-17...l.tar.gz
File Size: 7 KB


Regards and thanks, Markus

11 years 11 months ago - 11 years 11 months ago #6642 by RichardS
Replied by RichardS on topic Re: CA 10.7 parallel Debian7 - make ptscotch
Hello Markus,
a first hint.
Change the line beginning with CCD in your Makefile for ptscotch:
CCD		= gcc -I  /usr/lib/openmpi/include
With that, the compiler should find the MPI headers.
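For context, the three compiler variables in Scotch's src/Makefile.inc (a sketch; in scotch 5.1.x, CCS builds the sequential parts, CCP the parallel parts, and CCD the `dummysizes` helper, which is exactly where the mpi.h error above came from):

```makefile
CCS = gcc
CCP = mpicc
CCD = gcc -I/usr/lib/openmpi/include
```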

Regards,
Richard

Last edit: 11 years 11 months ago by RichardS.