**[[cluster:0|Back]]**
  
==== OpenHPC page 3 ====

** Tools **

<code>

yum -y groupinstall ohpc-autotools
yum -y install valgrind-ohpc
yum -y install EasyBuild-ohpc
yum -y install spack-ohpc
yum -y install R_base-ohpc

</code>

 +  * "Valgrind is an instrumentation framework for building dynamic analysis tools. There are Valgrind tools that can automatically detect many memory management and threading bugs" (memcheck) 
 +    * http://valgrind.org/ 
 +  * "Welcome to the documentation of EasyBuild, a software build and installation framework that allows you to manage (scientific) software on High Performance Computing (HPC) systems in an efficient way." 
 +    * https://easybuild.readthedocs.io/en/latest/ 
 +  * "Spack is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments." 
 +    * http://spack.readthedocs.io/en/latest/tutorial_sc16.html 
 +    * ''/opt/ohpc/admin/spack/0.8.17/bin/spack'' 
 +  * R_base contains ''R'' and ''Rscript'' 
 + 
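A quick memcheck sanity run, as a sketch; the ''gnu/5.4.0'' and ''valgrind'' module names are assumed from the packages above, and ''leak.c'' is a made-up test program:

<code>

# sketch: run memcheck against a deliberately leaky program
# (module names assumed from the valgrind-ohpc and gnu-compilers-ohpc packages)
module load gnu/5.4.0
module load valgrind

cat > leak.c <<'EOF'
#include <stdlib.h>
int main(void) { malloc(16); return 0; } /* deliberate leak */
EOF

gcc -g -O0 leak.c -o leak
valgrind --leak-check=full ./leak   # should report 16 bytes definitely lost

</code>
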
** Compilers **

<code>

yum install gnu-compilers-ohpc

/opt/ohpc/pub/compiler/gcc/5.4.0/bin/c++
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/cpp
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/g++
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcc
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcc-ar
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcc-nm
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcc-ranlib
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcov
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcov-tool
/opt/ohpc/pub/compiler/gcc/5.4.0/bin/gfortran

</code>

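A quick sanity check of the toolchain, as a sketch; the ''gnu/5.4.0'' module name is inferred from the install path above, and ''hello.c'' is made up:

<code>

# sketch: load the OpenHPC gcc and build a trivial program
module load gnu/5.4.0
which gcc        # should resolve to /opt/ohpc/pub/compiler/gcc/5.4.0/bin/gcc

cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello\n"); return 0; }
EOF

gcc -O2 hello.c -o hello && ./hello

</code>
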
**MPIs**

  * For both Ethernet and InfiniBand networks

<code>

yum -y install openmpi-gnu-ohpc mvapich2-gnu-ohpc mpich-gnu-ohpc

/opt/ohpc/pub/mpi/openmpi-gnu/1.10.4/bin/mpicc
/opt/ohpc/pub/mpi/openmpi-gnu/1.10.4/bin/mpirun
/opt/ohpc/pub/mpi/mvapich2-gnu/2.2/bin/mpicc
/opt/ohpc/pub/mpi/mvapich2-gnu/2.2/bin/mpirun
/opt/ohpc/pub/mpi/mpich-gnu-ohpc/3.2/bin/mpicc
/opt/ohpc/pub/mpi/mpich-gnu-ohpc/3.2/bin/mpirun

</code>

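The three stacks install side by side and are selected per session via modules; a sketch of swapping between them, with module names inferred from the paths above:

<code>

# sketch: swap MPI toolchains via Lmod (module names inferred from install paths)
module load gnu/5.4.0
module load openmpi/1.10.4
which mpicc               # openmpi wrapper

module swap openmpi mvapich2/2.2
which mpicc               # now the mvapich2 wrapper

module swap mvapich2 mpich/3.2
which mpicc               # now the mpich wrapper

</code>
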
** Perf Tools **

  * ''yum -y groupinstall ohpc-perf-tools-gnu'' (see the preview below)
  * See Appendix C of the install guide (linked below)

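To preview what the group pulls in before installing:

<code>

# list the packages in the perf tools group before committing to the install
yum groupinfo ohpc-perf-tools-gnu

</code>
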
**3rd Party Libraries & Tools**

  * OpenHPC provides package aliases for these 3rd party libraries and utilities that can be used to install available libraries for use with the GNU compiler family toolchain.

<code>

# Install libs for all available GNU compiler family toolchains
yum -y groupinstall ohpc-serial-libs-gnu
yum -y groupinstall ohpc-io-libs-gnu
yum -y groupinstall ohpc-python-libs-gnu
yum -y groupinstall ohpc-runtimes-gnu
# Install parallel libs for all available MPI toolchains
yum -y groupinstall ohpc-parallel-libs-gnu-mpich
yum -y groupinstall ohpc-parallel-libs-gnu-mvapich2
yum -y groupinstall ohpc-parallel-libs-gnu-openmpi

# things like
# netcdf, hdf5, numpy and scipy for python, fftw, scalapack

</code>

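After these installs the libraries show up as modules once a compiler and MPI pair is loaded; a sketch of checking one of them (the ''fftw'' module name and the ''FFTW_LIB''/''FFTW_INC'' variables are assumed from OpenHPC's module file conventions):

<code>

# sketch: the parallel libs only become visible once a compiler + MPI pair is loaded
module load gnu/5.4.0
module load openmpi/1.10.4
module avail              # netcdf, fftw, scalapack ... should now be listed

module load fftw
echo $FFTW_LIB $FFTW_INC  # link/include paths exported by the module (assumed names)

</code>
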
Finish by installing Intel's Parallel Studio XE (icc/ifort).

{{:cluster:install_guide-centos7.2-slurm-1.2-x86_64.pdf|install_guide-centos7.2-slurm-1.2-x86_64.pdf}}

As user ''test'':

<code>

module avail
module spider
which mpicc
module load gnu/5.4.0
module load openmpi/1.10.4
which gcc
which mpicc
mpicc -O3 /opt/ohpc/pub/examples/mpi/hello.c
cp /opt/ohpc/pub/examples/slurm/job.mpi .
which prun
find /opt/ohpc/pub -name prun
module spider prun
module load prun/1.1
which prun
sbatch job.mpi
squeue

</code>

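The copied ''job.mpi'' is roughly the following Slurm script; this is a sketch, so check the shipped file for the exact contents:

<code>

#!/bin/bash
#SBATCH -J test           # job name
#SBATCH -o job.%j.out     # stdout/stderr file (%j expands to the job id)
#SBATCH -N 2              # number of nodes
#SBATCH -n 16             # total number of MPI tasks
#SBATCH -t 00:10:00       # wall clock limit

# prun picks the right mpirun for whichever MPI toolchain is loaded
prun ./a.out

</code>
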
You do need to install the InfiniBand support section even to run Open MPI over Ethernet:

<code>

yum -y groupinstall "Infiniband Support"
yum -y install infinipath-psm
systemctl enable rdma
systemctl start rdma

# the recipe is missing this step for the openmpi flavor

yum -y --installroot=/data/ohpc/images/centos7.2 install libibverbs opensm-libs infinipath-psm

# then remake the vnfs image

</code>

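To verify that the rdma service came up and, on IB-equipped nodes, to inspect the fabric (''ibv_devices'' and ''ibstat'' should be available from the "Infiniband Support" group, an assumption worth checking):

<code>

# check that the rdma service is running
systemctl status rdma

# on nodes with HCAs, list the verbs devices and port state
ibv_devices
ibstat

</code>
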
The following shows up when running MPI over Ethernet:

<code>

[prun] Master compute host = n29
[prun] Resource manager = slurm
[prun] Launch cmd = mpirun ./a.out
--------------------------------------------------------------------------
[[49978,1],4]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: n31

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------

 Hello, world (8 procs total)
    --> Process #   2 of   8 is alive. -> n29.localdomain
    --> Process #   3 of   8 is alive. -> n29.localdomain
    --> Process #   0 of   8 is alive. -> n29.localdomain
    --> Process #   1 of   8 is alive. -> n29.localdomain
    --> Process #   6 of   8 is alive. -> n31.localdomain
    --> Process #   7 of   8 is alive. -> n31.localdomain
    --> Process #   4 of   8 is alive. -> n31.localdomain
    --> Process #   5 of   8 is alive. -> n31.localdomain

</code>
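
The warning is harmless on Ethernet-only nodes since Open MPI simply falls back to TCP; to silence it, force the TCP BTL explicitly (standard Open MPI 1.10 MCA options):

<code>

# force the TCP transport so Open MPI stops probing for OpenFabrics devices
export OMPI_MCA_btl="tcp,self,sm"

# or per run:
mpirun --mca btl tcp,self,sm ./a.out

</code>
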
See page 4 for the ib0 configuration, which is incomplete in the recipe.

[[cluster:154|OpenHPC page 1]] - [[cluster:155|OpenHPC page 2]] - page 3 - [[cluster:160|OpenHPC page 4]]
  
\\ \\
**[[cluster:0|Back]]**