OpenHPC Software for Rocky 8
Everything below was built for CentOS 6 and/or 7 and may or may not run under Rocky 8.
IMPORTANT NOTE: Since moving to the TrueNAS/ZFS appliance all references to /home/apps
should be replaced with /share/apps
which points to /zfshomes/apps
— Henk 2020/07/31 15:08
The listings below will be updated as software is installed. The location is /share/apps/ or /home/apps/ (the same thing).
Please note that what you find in /usr/local/bin is local to the head node, while /home/apps/bin (which is in your default PATH) is available cluster wide.
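A quick way to see the difference is to check where a binary resolves on the head node versus a compute node; the tool name below is hypothetical and n33 is just an example node mentioned on this page.

# head node only: /usr/local/bin is a local directory
ls /usr/local/bin/sometool            # hypothetical tool, present on the head node
ssh n33 'ls /usr/local/bin/sometool'  # typically missing on a compute node
# cluster wide: /share/apps (a.k.a. /home/apps) is NFS mounted everywhere
ssh n33 'ls /share/apps/bin | head'
which sometool                        # resolves via your default PATH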
Most of the compilations have been performed by Intel's icc/ifort compilers.
Old compilations, mostly located in /share/apps, were built on RHEL 5.5.
Newer compilations are done with CentOS 6.10 (since summer 2013)
on either cottontail2 or node n33 located at /share/apps/CENTOS6
Some compilations require 7.x and are in /share/apps/CENTOS7
source /share/apps/CENTOS7/miniconda3-py311/etc/profile.d/conda.sh
# or
export PATH=/share/apps/CENTOS7/miniconda3-py311/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3-py311/lib:$LD_LIBRARY_PATH

which mpirun python conda
conda list

export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
# pollacklab
[hmeij@greentail52 ~]$ source /share/apps/CENTOS7/miniconda3-py311/etc/profile.d/conda.sh
[hmeij@greentail52 ~]$ conda activate sage
(sage) [hmeij@greentail52 ~]$ sage --version
SageMath version 10.0, Release Date: 2023-05-20
(sage) [hmeij@greentail52 ~]$ sage
┌────────────────────────────────────────────────────────────────────┐
│ SageMath version 10.0, Release Date: 2023-05-20                    │
│ Using Python 3.11.4. Type "help()" for help.                       │
└────────────────────────────────────────────────────────────────────┘
sage: exit
(sage) [hmeij@greentail52 ~]$ conda deactivate
[hmeij@greentail52 ~]$
[hmeij@greentail52 ~]$ which python
/share/apps/CENTOS7/miniconda3-py311/bin/python
[hmeij@greentail52 ~]$ python
Python 3.11.4 (main, Jul  5 2023, 13:45:01) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mdtraj as mt
>>>
patsy          conda-forge/noarch::patsy-0.5.4-pyhd8ed1ab_0
seaborn        conda-forge/noarch::seaborn-0.13.0-hd8ed1ab_0
seaborn-base   conda-forge/noarch::seaborn-base-0.13.0-pyhd8ed1ab_0
statsmodels    conda-forge/linux-64::statsmodels-0.14.0-py311h1f0f07a_3
joblib conda-forge/noarch::joblib-1.4.2-pyhd8ed1ab_0
export PATH=/share/apps/CENTOS7/masurca/4.1.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/masurca/4.1.0/lib:$LD_LIBRARY_PATH

[hmeij@cottontail2 ~]$ masurca --version
version 4.1.0
/share/apps/CENTOS6/Trimmomatic-0.39/

[root@cottontail Trimmomatic-0.39]# ll
total 176
drwxr-xr-x 2 root root      8 Apr  8  2019 adapters
-rw-r--r-- 1 root root  35147 May 16  2018 LICENSE
-rw-r--r-- 1 root root 128502 Apr  8  2019 trimmomatic-0.39.jar
An open source research project exploring the role of machine learning as a tool in the creative process. Magenta is distributed as an open source Python library, powered by TensorFlow. This library includes utilities for manipulating source data (primarily music and images), using this data to train machine learning models, and finally generating new content from these models.
EasyBuild is a software build and installation framework that allows you to manage (scientific) software on High Performance Computing (HPC) systems in an efficient way.
Easybuild uses environment-modules
to control the environment. Load the module of interest and all dependencies will be taken care of. See below.
Installed modules and their dependencies
libcuda.so.1
set export path below
To use a module, discover the available modules and load the desired one. Do this on greentail52 (for debugging) or in submit scripts for the exx96 queue (the mwgpu and amber128 nodes do not work).
# cuda libs location
export LD_LIBRARY_PATH=\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/lib64:\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/lib:\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/targets/x86_64-linux/lib/stubs:\
$LD_LIBRARY_PATH

# discover all modules
module use /sanscratch/CENTOS7/easybuild/4.4.2/modules/all
# view availability
module avail
# load
module load module_name
# list modules loaded
module list

# check, start app, import package (by full module name)
# note that these modules use different python versions
which python
python
>>> help('modules')

# unload all modules (better to exit shell and start a new one,
# purge will also unload system modules...)
module purge
HOOMD-blue is a Python package that runs simulations of particle systems on CPUs and GPUs.
Set up env for miniconda and cuda
export CUDAHOME=/usr/local/cuda-10.2
export PATH=/usr/local/cuda-10.2/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda-10.2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda-10.2/lib64:$LD_LIBRARY_PATH

source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH

which mpirun python conda
which nvcc

# installation (not sure how this works, does the * not get expanded in shell?)
# seems to have worked forcing gpu package install on greentail52
conda install -c conda-forge hoomd=*=*gpu*

$ conda list | egrep -i "cuda|hoomd"
cudatoolkit   11.0.221   h6bb024c_0
hoomd         2.9.4      gpu_py38h02d30ca_1   conda-forge

# ahh, hamed found an error
conda install -c conda-forge cudatoolkit=10.2
The following packages will be DOWNGRADED:
  cudatoolkit   11.0.221-h6bb024c_0        --> 10.2.89-hfd86e86_1
  hoomd         2.9.4-gpu_py38h02d30ca_1   --> 2.9.4-gpu_py38h5bdc439_1
ca-certificates-2021.10.8  | ha878542_0           139 KB    conda-forge
certifi-2021.10.8          | py38h578d9bd_0       145 KB    conda-forge
conda-4.10.3               | py38h578d9bd_2       3.0 MB    conda-forge
cudatoolkit-11.0.221       | h6bb024c_0           622.9 MB
hoomd-2.9.4                | gpu_py38h02d30ca_1   46.2 MB   conda-forge
openssl-1.1.1h             | h516909a_0           2.1 MB    conda-forge
tbb-2020.2                 | hc9558a2_0           1.4 MB    conda-forge
conda install -c conda-forge gsd

The following NEW packages will be INSTALLED:
  _openmp_mutex  conda-forge/linux-64::_openmp_mutex-4.5-1_gnu
  gsd            conda-forge/linux-64::gsd-2.5.1-py38h6c62de6_0
  libgomp        conda-forge/linux-64::libgomp-11.2.0-h1d223b6_11

The following packages will be UPDATED:
  certifi    2021.10.8-py38h578d9bd_0 --> 2021.10.8-py38h578d9bd_1
  conda      4.10.3-py38h578d9bd_2 --> 4.10.3-py38h578d9bd_4
  libgcc-ng  pkgs/main::libgcc-ng-9.1.0-hdf63c60_0 --> conda-forge::libgcc-ng-11.2.0-h1d223b6_11
  openssl    1.1.1h-h516909a_0 --> 1.1.1l-h7f98852_0
Added to miniconda3, see section “Miniconda3”
# environment
source /share/apps/CENTOS7/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3/lib:$LD_LIBRARY_PATH

libgcc-7.2.0     | h69d50b8_2       304 KB    conda-forge
samtools-1.7     | 1                1.0 MB    bioconda
bowtie2-2.2.5    | py38hed8969a_7   11.8 MB   bioconda
perl-5.26.2      | h36c2ea0_1008    15.4 MB   conda-forge
bedtools-2.30.0  | hc088bd4_0       14.0 MB   bioconda

# cufflinks requires centos 6, weird, so also added to miniconda2,
# see section "Kallisto & Trinity & FastQC"
Suite of tools for high-accuracy basecalling, assembly, polishing, and alignment. Installed in a specific miniconda3 environment (see below).
# environment for all modules
source /share/apps/CENTOS7/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3/lib:$LD_LIBRARY_PATH

export PATH=/share/apps/CENTOS7/gcc/9.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/cmake/3.12.1/bin:$PATH

which conda python pip mpirun gcc cmake
python --version
/share/apps/CENTOS7/miniconda3/bin/conda
/share/apps/CENTOS7/miniconda3/bin/python
/share/apps/CENTOS7/miniconda3/bin/pip
/share/apps/CENTOS7/miniconda3/bin/mpirun
/share/apps/CENTOS7/gcc/9.2.0/bin/gcc
/share/apps/CENTOS7/cmake/3.12.1/bin/cmake
Guppy Server
export PATH=/share/apps/CENTOS7/guppy/5.0.11/ont-guppy-cpu/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/guppy/5.0.11/ont-guppy-cpu/lib:$LD_LIBRARY_PATH
ldd `which guppy_basecall_server`

# cpu; note as user start server on scratch server greentail52
guppy_basecall_server --config dna_r9.4.1_450bps_fast.cfg -p 5555 -l /tmp/guppy &

# check if running, use ''kill'' command to terminate
lsof -i:5555
COMMAND     PID      USER  FD  TYPE    DEVICE SIZE/OFF NODE NAME
guppy_bas 80002 bchernoff 12u  IPv4 959121280      0t0 TCP  *:personal-agent (LISTEN)

# nodes can reach this server at this IP or start their own on localhost:port
192.168.102.251 greentail52-eth0
10.10.102.251   greentail52-eth1 (preferred)
Guppy Client
pip install cython PyUnicode timer
pip install pyguppyclient
Successfully installed flatbuffers-1.11 ont-fast5-api-3.3.0 pyguppyclient-0.0.6 pyzmq-17.1.2

# note:
from pyguppyclient import GuppyBasecallerClient, yield_reads
# success, but do not know how to execute the example
Medaka
# To activate this environment, use
#     $ conda activate medaka
# To deactivate an active environment, use
#     $ conda deactivate

# requires
(medaka)$ conda list | egrep -i "samtools|minimap2|tabix|bgzip"
minimap2   2.21         h5bf99c6_0       bioconda
pbgzip     2016.08.04   h36cd882_2       bioconda
py-bgzip   0.4.0        py38ha8cb210_0   conda-forge
samtools   1.12         h9aed4be_1       bioconda
tabix      1.11         hdfd78af_0       bioconda
Flye
Somewhat annoying: there is no miniconda3 package (python 3.9, gcc 9.2). Built locally (without installation) on CentOS 7 greentail52.
/share/apps/CENTOS7/flye/2.9.2/bin/flye --version
2.9.2-b1794

/share/apps/CENTOS7/flye/2.9.1/bin/flye --version
2.9.1-b1780

export PATH=/share/apps/CENTOS7/flye/2.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/flye/2.9/lib:$LD_LIBRARY_PATH

which flye
/share/apps/CENTOS7/flye/2.9/bin/flye
flye --version
2.9-b1774
Filtlong
# note: fails with gcc 9.2, native gcc 4.8.5 is ok; UNSET the gcc92/miniconda3 env
export PATH=/share/apps/CENTOS7/Filtlong/0.2.1/bin:$PATH
ldd `which filtlong`
Bamtools
export PATH=/share/apps/CENTOS7/bamtools/2.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/bamtools/2.5.2/lib:$LD_LIBRARY_PATH
ldd `which bamtools`
BBmap
Pilon
[hmeij@greentail52 pilon]$ java -jar pilon-1.24.jar --help
Pilon version 1.24 Thu Jan 28 13:00:45 2021 -0500

Usage: pilon --genome genome.fasta [--frags frags.bam] [--jumps jumps.bam] [--unpaired unpaired.bam]
             [...other options...]
       pilon --help for option details
<snip>
haslr
The following NEW packages will be INSTALLED:
  _openmp_mutex     conda-forge/linux-64::_openmp_mutex-4.5-2_gnu
  fastutils         bioconda/linux-64::fastutils-0.3-hdcf5f25_4
  haslr             bioconda/linux-64::haslr-0.8a1-py39hd65a603_4
  k8                bioconda/linux-64::k8-0.2.5-hdcf5f25_4
  libgomp           conda-forge/linux-64::libgomp-13.2.0-h807b86a_2
  libzlib           conda-forge/linux-64::libzlib-1.2.13-hd590300_5
  minia             bioconda/linux-64::minia-3.2.6-hdcf5f25_3
  minimap2          bioconda/linux-64::minimap2-2.26-he4a0461_1
  pluggy            conda-forge/noarch::pluggy-1.3.0-pyhd8ed1ab_0
  ruamel.yaml       conda-forge/linux-64::ruamel.yaml-0.17.32-py39hd1e30aa_0
  ruamel.yaml.clib  conda-forge/linux-64::ruamel.yaml.clib-0.2.7-py39h72bdee0_1
  toolz             conda-forge/noarch::toolz-0.12.0-pyhd8ed1ab_0
Compiled from source; do not source miniconda3, we are using the standalone python3.
export PATH=/share/apps/CENTOS7/gcc/9.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH

[hmeij@greentail52 20231006]$ which python
/share/apps/CENTOS7/python/3.8.3/bin/python
[hmeij@greentail52 20231006]$ pwd
/share/apps/CENTOS7/haslr/20231006
[hmeij@greentail52 20231006]$ ll bin
total 11448
-rwxr-xr-x 1 hmeij its   123448 Oct  6 15:01 fastutils
-rwxr-xr-x 1 hmeij its  3823032 Oct  6 15:04 haslr_assemble
-rwxr-xr-x 1 hmeij its    19949 Oct  6 14:48 haslr.py
-rwxr-xr-x 1 hmeij its 20969840 Oct  6 14:59 minia
-rwxr-xr-x 1 hmeij its    23856 Oct  6 15:01 minia_nooverlap
-rwxr-xr-x 1 hmeij its   951922 Oct  6 14:59 minimap2
stLFR
cd /share/apps/CENTOS7/stLFR/20230929
[hmeij@greentail52 20230929]$ ./Release/stLFR_GapCloser -h
Version:   1.00
Contact:   dengli1@genomics.cn && guolidong@genomics.cn
Usage:     stLFR_GapCloser [options] ...

# and another request by Luis
# https://sourceforge.net/projects/soapdenovo2/files/SOAPdenovo2
[hmeij@greentail52 stLFR]$ pwd
/share/apps/CENTOS7/stLFR
[hmeij@greentail52 stLFR]$ ll GapCloser-bin-v1.12-r6/
total 175
-rwxr-xr-x 1 hmeij its 184713 Jan 11  2013 GapCloser
-rw-r--r-- 1 hmeij its  44533 Jan 11  2013 GapCloser_Manual.pdf
BUSCO
[hmeij@n79 ~]$ docker images
REPOSITORY       TAG          IMAGE ID       CREATED        SIZE
ezlabgva/busco   v5.5.0_cv1   30d3e831aba7   2 months ago   2.88GB

[hmeij@n79 busco_wd]$ docker run --rm -v $(pwd):/busco_wd -u $(id -u):$(id -g) \
  ezlabgva/busco:v5.5.0_cv1 \
  busco -i /busco_wd/assembly.fasta -l actinopterygii_odb10 -o Busco_Output -m geno -f
2023-10-28 20:30:00 INFO:  ***** Start a BUSCO v5.5.0 analysis, current time: 10/28/2023 20:30:00 *****
2023-10-28 20:30:00 INFO:  Configuring BUSCO with local environment
2023-10-28 20:30:00 INFO:  Mode is genome
2023-10-28 20:30:00 INFO:  'Force' option selected; overwriting previous results directory
2023-10-28 20:30:00 INFO:  Downloading information on latest versions of BUSCO data...

# that's a problem, trying to reach the internet ...
picard
# picard centos7, weird
> Failed to load native library 'libnative-platform.so' for Linux amd64.

# picard centos8
[hmeij@cottontail2 20231108]$ pwd
/share/apps/CENTOS8/picard/20231108
[hmeij@cottontail2 20231108]$ ls -l build/libs/
total 119872
-rw-r--r-- 1 hmeij its 62520451 Nov  8 11:46 picard-3.1.0-3-ga9194bd97-SNAPSHOT-all.jar
-rw-r--r-- 1 hmeij its  1856980 Nov  8 11:01 picard-3.1.0-3-ga9194bd97-SNAPSHOT.jar
-rw-r--r-- 1 hmeij its 62520451 Nov  8 11:46 picard.jar
[hmeij@cottontail2 20231108]$
GPU is nonfunctional
Conversion of python code to C binaries was successful
read the history.txt file
— Henk 2021/07/07 15:38
# sample scripts in /zfshomes/hmeij/amhughes
# read README, it's complicated
VSEARCH stands for vectorized search, as the tool takes advantage of parallelism in the form of SIMD vectorization … an alternative to USEARCH; offers search and clustering algorithms that are often orders of magnitude faster than BLAST.
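A minimal usage sketch, assuming vsearch is already on your PATH (no install location is listed on this page); file names and thresholds are placeholders:

vsearch --version
# cluster sequences at 97% identity, writing one centroid per cluster
vsearch --cluster_fast reads.fasta --id 0.97 \
        --centroids centroids.fasta --uc clusters.uc --threads 8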
DeepChem is a deep learning tool built for bio/chem research
# add to ~/.bashrc and log back in, then check: 'which python pip conda'

# >>> conda initialize >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$('/share/apps/CENTOS7/amber/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)"
if [ $? -eq 0 ]; then
    eval "$__conda_setup"
else
    if [ -f "/share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh" ]; then
        . "/share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh"
    else
        export PATH="/share/apps/CENTOS7/amber/miniconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda initialize <<<
Successfully installed absl-py-0.12.0 astunparse-1.6.3 cachetools-4.2.2 flatbuffers-1.12 gast-0.4.0 google-auth-1.30.2 google-auth-oauthlib-0.4.4 google-pasta-0.2.0 grpcio-1.34.1 h5py-3.1.0 keras-nightly-2.5.0.dev2021032900 keras-preprocessing-1.1.2 markdown-3.3.4 numpy-1.19.5 oauthlib-3.1.1 opt-einsum-3.3.0 protobuf-3.17.3 pyasn1-0.4.8 pyasn1-modules-0.2.8 requests-oauthlib-1.3.0 rsa-4.7.2 tensorboard-2.5.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.0 tensorflow-2.5.0 tensorflow-estimator-2.5.0 termcolor-1.1.0 typing-extensions-3.7.4.3 werkzeug-2.0.1 wrapt-1.12.1
Due to technical limitations, the conda package does not support GPUs according to the web site. You must build the GPU libraries manually; see the “galario CPU+GPU” section above.
source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH
which mpirun python conda
# https://github.com/mtazzari/galario (CPU only apparently)
conda install -c conda-forge galario

  fftw             conda-forge/linux-64::fftw-3.3.8-nompi_hfc0cae8_1114
  galario          conda-forge/linux-64::galario-1.2.2-py38h9d68f60_1002
  libblas          conda-forge/linux-64::libblas-3.9.0-8_openblas
  libcblas         conda-forge/linux-64::libcblas-3.9.0-8_openblas
  libgfortran-ng   conda-forge/linux-64::libgfortran-ng-7.5.0-h14aa051_19
  libgfortran4     conda-forge/linux-64::libgfortran4-7.5.0-h14aa051_19
  liblapack        conda-forge/linux-64::liblapack-3.9.0-8_openblas
  libopenblas      conda-forge/linux-64::libopenblas-0.3.12-pthreads_hb3c22a3_1
  numpy            conda-forge/linux-64::numpy-1.19.4-py38hf0fd68c_1
  python_abi       conda-forge/linux-64::python_abi-3.8-1_cp38
  scipy            conda-forge/linux-64::scipy-1.5.3-py38h828c644_0
  mpi              conda-forge/linux-64::mpi-1.0-openmpi
  mpi4py           conda-forge/linux-64::mpi4py-3.0.3-py38h2e4b629_2
  openmpi          pkgs/main/linux-64::openmpi-4.0.2-hb1b8bf9_1
  pandas           conda-forge/linux-64::pandas-1.1.4-py38h0ef3d22_0
  python-dateutil  conda-forge/noarch::python-dateutil-2.8.1-py_0
  pytz             conda-forge/noarch::pytz-2021.1-pyhd8ed1ab_0
  schwimmbad       conda-forge/linux-64::schwimmbad-0.3.2-py38h578d9bd_0
For astronomy group (amhughes and students) added ('conda list' to see all packages installed)
argparse     1.4.0    pypi_0               pypi
corner       2.2.1    pypi_0               pypi
emcee        3.0.2    pypi_0               pypi
galario      1.2.2    py38h9d68f60_1002    conda-forge
matplotlib   3.4.2    pypi_0               pypi
numpy        1.19.5   pypi_0               pypi
time         1.8      h516909a_0           conda-forge

astropy             conda-forge/linux-64::astropy-3.2.3-py38h516909a_0
attrs               conda-forge/noarch::attrs-21.2.0-pyhd8ed1ab_0
click               conda-forge/linux-64::click-8.0.3-py38h578d9bd_0
coverage            pkgs/main/linux-64::coverage-5.5-py38h27cfd23_2
hypothesis          conda-forge/noarch::hypothesis-6.23.4-pyhd8ed1ab_0
iniconfig           conda-forge/noarch::iniconfig-1.1.1-pyh9f0ad1d_0
more-itertools      conda-forge/noarch::more-itertools-8.10.0-pyhd8ed1ab_0
packaging           conda-forge/noarch::packaging-21.0-pyhd8ed1ab_0
pluggy              conda-forge/linux-64::pluggy-1.0.0-py38h578d9bd_1
psutil              pkgs/main/linux-64::psutil-5.8.0-py38h27cfd23_1
py                  conda-forge/noarch::py-1.10.0-pyhd3deb0d_0
pyparsing           conda-forge/noarch::pyparsing-2.4.7-pyh9f0ad1d_0
pytest              conda-forge/linux-64::pytest-6.2.5-py38h578d9bd_0
pytest-arraydiff    conda-forge/noarch::pytest-arraydiff-0.3-py_0
pytest-astropy      conda-forge/noarch::pytest-astropy-0.9.0-pyhd8ed1ab_0
pytest-astropy-he~  conda-forge/noarch::pytest-astropy-header-0.1.2-py_0
pytest-cov          conda-forge/noarch::pytest-cov-3.0.0-pyhd8ed1ab_0
pytest-doctestplus  conda-forge/noarch::pytest-doctestplus-0.11.0-pyhd8ed1ab_0
pytest-filter-sub~  conda-forge/noarch::pytest-filter-subpackage-0.1.1-py_0
pytest-mock         conda-forge/noarch::pytest-mock-3.6.1-pyhd8ed1ab_0
pytest-openfiles    conda-forge/noarch::pytest-openfiles-0.5.0-py_0
pytest-remotedata   conda-forge/noarch::pytest-remotedata-0.3.2-pyh9f0ad1d_0
sortedcontainers    conda-forge/noarch::sortedcontainers-2.4.0-pyhd8ed1ab_0
toml                conda-forge/noarch::toml-0.10.2-pyhd8ed1ab_0
For nwells/smithlab added
conda install -c schrodinger pymol-bundle

  package                    |            build
  ---------------------------|-----------------
  apbs-1.5                   | h14c3975_3              277 KB   schrodinger
  biopython-1.78             | py38h7b6447c_0          2.1 MB
  bzip2-1.0.8                | h516909a_3              398 KB   conda-forge
  collada2gltf-2.1.4         | h6bb024c_0              3.2 MB   schrodinger
  conda-4.10.3               | py38h578d9bd_0          3.1 MB   conda-forge
  curl-7.71.1                | hbc83047_1              140 KB
  dbus-1.13.18               | hb2f20db_0              504 KB
  expat-2.4.1                | h2531618_2              168 KB
  fontconfig-2.13.1          | he4413a7_1000           327 KB   conda-forge
  freemol-1.158              | py_2                    6 KB     schrodinger
  freetype-2.10.4            | h7ca028e_0              912 KB   conda-forge
  glew-2.0.0                 | 0                       660 KB   schrodinger
  glib-2.69.0                | h5202010_0              1.7 MB
  gst-plugins-base-1.14.0    | hbbd80ab_1              4.8 MB
  gstreamer-1.14.0           | h28cd5cc_2              3.2 MB
  h5py-2.10.0                | nompi_py38hafa665b_105  1.1 MB   conda-forge
  hdf4-4.2.13                | h3ca952b_2              714 KB
  hdf5-1.10.6                | nompi_h7c3c948_1111     3.1 MB   conda-forge
  icu-58.2                   | hf484d3e_1000           22.6 MB  conda-forge
  jpeg-9d                    | h36c2ea0_0              264 KB   conda-forge
  krb5-1.18.2                | h173b8e3_0              1.3 MB
  libcurl-7.71.1             | h20c2e04_1              305 KB
  libglu-9.0.0               | he1b5a44_1001           413 KB   conda-forge
  libholoplaycore-0.1.0_rc4  | 1                       325 KB   schrodinger
  libnetcdf-4.7.4            | nompi_h56d31a8_107      1.3 MB   conda-forge
  libpng-1.6.37              | h21135ba_2              306 KB   conda-forge
  libssh2-1.9.0              | hab1572f_5              225 KB   conda-forge
  libtiff-4.0.10             | hc3755c2_1005           602 KB   conda-forge
  libuuid-2.32.1             | h14c3975_1000           26 KB    conda-forge
  libxcb-1.13                | h14c3975_1002           396 KB   conda-forge
  libxml2-2.9.10             | hb55368b_3              1.2 MB
  lz4-c-1.9.2                | he1b5a44_3              203 KB   conda-forge
  mengine-1                  | h14c3975_1              676 KB   schrodinger
  mpeg_encode-1              | h14c3975_1              106 KB   schrodinger
  mtz2ccp4_px-1.0            | h9ac9557_3              547 KB   schrodinger
  olefile-0.46               | pyh9f0ad1d_1            32 KB    conda-forge
  pcre-8.45                  | h295c915_0              207 KB
  pdb2pqr-2.1.2+pymol        | py_0                    236 KB   schrodinger
  pillow-6.2.1               | py38h6b7be26_0          637 KB   conda-forge
  pmw-2.0.1+3                | py_3                    60 KB    schrodinger
  pthread-stubs-0.4          | h36c2ea0_1001           5 KB     conda-forge
  pycollada-0.7.1+bdf414c7   | py_1                    80 KB    schrodinger
  pykerberos-1.2.1           | py38h27cfd23_2          259 KB
  pymol-2.4.1                | py38h4463551_0          8.5 MB   schrodinger
  pymol-bundle-2.4.1         | 0                       16 KB    schrodinger
  pymol-web-examples-2.4     | 1                       1.9 MB   schrodinger
  pyqt-5.9.2                 | py38h05f1152_4          4.5 MB
  qt-5.9.7                   | h5867ecd_1              68.5 MB
  rigimol-1.3                | 2                       489 KB   schrodinger
  sip-4.19.13                | py38he6710b0_0          277 KB
  xorg-libxau-1.0.9          | h14c3975_0              13 KB    conda-forge
  xorg-libxdmcp-1.1.3        | h516909a_0              18 KB    conda-forge
  zstd-1.4.5                 | h9ceee32_0              619 KB
Then — Henk 2021/11/24 13:16
# using pip3
Successfully installed magenta-2.1.3 python-rtmidi-1.1.2
Successfully installed keras-2.7.0 libclang-12.0.0 tensorboard-2.7.0 \
  tensorflow-2.7.0 tensorflow-estimator-2.7.0 tensorflow-io-gcs-filesystem-0.22.0
Added
— Henk 2022/01/03 14:35
# https://github.com/merenlab/anvio/issues/1479
conda install -c bioconda samtools=1.9 --force-reinstall

The following NEW packages will be INSTALLED:
  gettext     conda-forge/linux-64::gettext-0.19.8.1-hf34092f_1004
  htslib      bioconda/linux-64::htslib-1.9-h4da6232_3
  libdeflate  bioconda/linux-64::libdeflate-1.2-h516909a_1
  libglib     conda-forge/linux-64::libglib-2.66.3-hbe7bbb4_0
  libiconv    conda-forge/linux-64::libiconv-1.16-h516909a_0

The following packages will be UPDATED:
  cffi      pkgs/main::cffi-1.14.3-py38h261ae71_2 --> conda-forge::cffi-1.14.4-py38ha312104_0
  libedit   pkgs/main::libedit-3.1.20191231-h14c3~ --> conda-forge::libedit-3.1.20191231-h46ee950_2
  samtools  1.7-1 --> 1.9-h10a08f8_12

The following packages will be SUPERSEDED by a higher-priority channel:
  glib     pkgs/main::glib-2.69.0-h5202010_0 --> conda-forge::glib-2.66.3-h58526e2_0
  libffi   pkgs/main::libffi-3.3-he6710b0_2 --> conda-forge::libffi-3.2.1-he1b5a44_1007
  ncurses  pkgs/main::ncurses-6.2-he6710b0_1 --> conda-forge::ncurses-6.1-hf484d3e_1002
  python   pkgs/main::python-3.8.5-h7579374_1 --> conda-forge::python-3.8.3-cpython_he5300dc_0
  sqlite   pkgs/main::sqlite-3.33.0-h62c20be_0 --> conda-forge::sqlite-3.32.3-hcee41ef_1

$ samtools --version
samtools 1.9
Using htslib 1.9
Copyright (C) 2018 Genome Research Ltd.
[hmeij@greentail52 ~]$ iqtree --version
IQ-TREE multicore version 2.0.3 for Linux 64-bit built Dec 20 2020
Successfully installed GridDataFormats-1.0.1 MDAnalysis-2.3.0 fasteners-0.18 mmtf-python-1.1.3 mrcfile-1.4.3 msgpack-1.0.4 numpy-1.23.4
For Lammps (starrlab) 23April2023, consult install file
# Note: you must add this line to your env
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/pkgs/cudatoolkit-11.0.221-h6bb024c_0/lib:$LD_LIBRARY_PATH

# Note2: had to install 'conda install cudnn' and make two links to higher level
# libraries (a must not do) for libcudnn.so.8 and libcusolver.so.11

[hmeij@greentail52 ~]$ which python
/share/apps/CENTOS7/amber/miniconda3/bin/python
[hmeij@greentail52 ~]$ pip list | grep tensorflow
mesh-tensorflow               0.1.19
tensorflow                    2.8.0
tensorflow-addons             0.15.0
tensorflow-datasets           4.4.0
tensorflow-estimator          2.7.0
tensorflow-gan                2.1.0
tensorflow-hub                0.12.0
tensorflow-io-gcs-filesystem  0.32.0
tensorflow-metadata           1.4.0
tensorflow-probability       0.15.0
[hmeij@greentail52 ~]$ which pacemaker
/usr/bin/which: no pacemaker in (...failed to install at first, see readme install file)
Pytraj is a wrapper for cpptraj
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH

# An environment resource file for CPPTRAJ has been created:
#   /zfshomes/apps/CENTOS7/amber/pytraj/cpptraj/cpptraj.sh
# You may 'source' this file to set up your environment for CPPTRAJ.

Installed /zfshomes/apps/CENTOS7/python/3.8.3/lib/python3.8/site-packages/pytraj-2.0.6.dev0-py3.8-linux-x86_64.egg
Processing dependencies for pytraj==2.0.6.dev0
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /share/apps/CENTOS7/python/3.8.3/bin
Installing f2py3 script to /share/apps/CENTOS7/python/3.8.3/bin
Installing f2py3.8 script to /share/apps/CENTOS7/python/3.8.3/bin
Using /zfshomes/apps/CENTOS7/python/3.8.3/lib/python3.8/site-packages
Finished processing dependencies for pytraj==2.0.6.dev0
cottontail
and greentail52
So I remember where I put the tar repo file
[root@cottontail2 ~]# ll /share/apps/src/cottontail2/openhpc/
total 8290421
-rw------- 1 root root     562276 Mar 17 09:34 Install_guide-Rocky8-Warewulf-SLURM-2.4-x86_64.pdf
-rw-r--r-- 1 root root      10720 Mar 17 09:35 ohpc-release-2-1.el8.x86_64.rpm
-rw-r--r-- 1 root root 8531681280 Mar 17 09:33 OpenHPC-2.4.EL_8.x86_64.tar

# this is all that is needed for ohpc base compute client
# see the recipe for cct2 and follow compute node only ingredients
[root@cottontail2 ~]# ll /share/apps/src/cottontail2/ohpc-compute/
total 95196
-rw-r--r-- 1 root root  2696360 Feb 10 09:21 hwloc-ohpc-2.7.0-3.9.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root   266252 Feb 10 09:22 lmod-ohpc-8.7.6-12.3.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root     7268 Feb 10 09:22 ohpc-base-compute-2.6.1-5.1.ohpc.2.6.1.x86_64.rpm
-rw-r--r-- 1 root root     8204 Feb 10 09:23 ohpc-filesystem-2.6-2.3.ohpc.2.6.noarch.rpm
-rw-r--r-- 1 root root    10720 Feb 10 09:23 ohpc-release-2-1.el8.x86_64.rpm
-rw-r--r-- 1 root root     7124 Feb 10 09:27 ohpc-slurm-client-2.6.1-5.1.ohpc.2.6.1.x86_64.rpm
-rw-r--r-- 1 root root 74326532 Feb 10 09:23 singularity-ohpc-3.7.1-5.1.ohpc.2.1.x86_64.rpm
-rw-r--r-- 1 root root    22168 Feb 10 09:24 slurm-contribs-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root   247392 Feb 10 09:24 slurm-example-configs-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 18764472 Feb 10 09:24 slurm-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root   176480 Feb 10 09:28 slurm-pam_slurm-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root   785112 Feb 10 09:25 slurm-slurmd-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
# which icc icx icpc icpx dpcpp ifort ifx mpirun mpicc python clang clang++
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/icc
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/icx
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/icpc
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/icpx
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/dpcpp
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/ifort
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/ifx
/share/apps/CENTOS7/intel/oneapi/intelpython/latest/bin/mpirun
/share/apps/CENTOS7/intel/oneapi/mpi/2021.1.1/bin/mpicc
/share/apps/CENTOS7/intel/oneapi/intelpython/latest/bin/python
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/clang
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/clang++

# in addition these were installed
/share/apps/CENTOS7/intel/oneapi/mkl/2021.1.1
/share/apps/CENTOS7/intel/oneapi/pytorch/1.5.0
/share/apps/CENTOS7/intel/oneapi/tensorflow/2.2.0
# native OS installation for rstudio /bin/R
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH" export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH" Successfully installed macs-1.4.3 # note: macs2 Python version must >= 3.6! # or change python version to # ls /share/apps/CENTOS6/python/2.7.9/bin/macs2 /share/apps/CENTOS6/python/2.7.9/bin/macs2 # or change python version to which python: /share/apps/CENTOS7/python/3.8.3/bin/python pip list: Package Version --------------- ------- bowtie 0.11.0 MACS2 2.2.7.1 numpy 1.18.5 picard 2.3.2 scipy 1.4.1 export PATH="/share/apps/CENTOS6/sequencingtools/bowtie2-2.4.1-linux-x86_64:$PATH" export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:/share/apps/CENTOS6/gcc/4.8.4/lib:$LD_LIBRARY_PATH which gcc: /share/apps/CENTOS6/gcc/4.8.4/bin/gcc which python: /share/apps/CENTOS6/miniconda2/bin/python which perl: /share/apps/CENTOS6/miniconda2/bin/perl export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH which R: /share/apps/CENTOS6/R/3.6.1b/bin/R ldd /home/apps/CENTOS6/R/3.6.1b/lib64/R/bin/exec/R java --version: java 12.0.2 2019-07-16 # /share/apps/java/latest -> jdk-12.0.2 java -jar /share/apps/CENTOS6/sequencingtools/picard-2.23.0/picard.jar -h USAGE: PicardCommandLine <program name> [-h] export PATH=/share/apps/CENTOS6/sequencingtools/bedtools-2.29.2/bin:$PATH bedtools --version: bedtools v2.29.2
exx96
provides access to docker containers.
Samtools is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:
# add to environment
export PATH=/share/apps/CENTOS6/samtools/1.10/bin:$PATH
export PATH=/share/apps/CENTOS6/bcftools/1.10.2/bin:$PATH
export PATH=/share/apps/CENTOS6/htslib/1.10.2/bin:$PATH
# does not seem to work against htslib version above...
export HTSSRC=/share/apps/CENTOS6/ANGSD/htslib
export PATH=/share/apps/CENTOS6/ANGSD/angsd:$PATH
# with local java jre1.8.0_212
export PATH=/share/apps/CENTOS6/beast/2.6.0/bin:$PATH
export PATH=/share/apps/CENTOS7/sratools/sratoolkit.2.11.0-centos_linux64/bin:$PATH
These are all part of Miniconda3 (CentOS 7); to set up the environment consult
They are also available via Miniconda2 (CentOS 6); consult
[hmeij@greentail52 ~]$ conda list | egrep "adapterremoval|bwa|bowtie"
adapterremoval   2.3.2    hb7ba0dd_0       bioconda
bowtie2          2.2.5    py38hed8969a_7   bioconda
bwa              0.7.17   hed695b0_7       bioconda
# env
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"

The following NEW packages will be INSTALLED:
  mapdamage2  bioconda/linux-64::mapdamage2-2.0.6-1
  pysam       bioconda/linux-64::pysam-0.16.0.1-py27hc729bab_3
  seqtk       bioconda/linux-64::seqtk-1.3-h5bf99c6_3

[hmeij@petaltail ~]$ mapDamage --version
2.0.6
/share/apps/CENTOS7/pcangsd/1.1.0
source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH

which pcangsd
/share/apps/CENTOS7/amber/miniconda3/bin/pcangsd
pcangsd --version
pcangsd 1.10
$ pwd
/share/apps/CENTOS7/angsd/0.940/angsd
$ ls -l angsd
-rwxr-xr-x 1 hmeij its 5517248 Jul  6 14:11 angsd
$ ./angsd
        -> angsd version: 0.940-dirty (htslib: 1.16) build(Jul  6 2023 14:11:30)
# add path to $PATH
[hmeij@greentail52 tmp]$ ll /share/apps/CENTOS7/raxml/8.2.12/
total 3202
-rwxr-xr-x 1 hmeij its 1177632 Apr 25 14:49 raxmlHPC-AVX
-rwxr-xr-x 1 hmeij its 1177632 Apr 25 14:49 raxmlHPC-AVX2
-rwxr-xr-x 1 hmeij its 1152040 Apr 25 14:49 raxmlHPC_serial
-rwxr-xr-x 1 hmeij its 1185704 Apr 25 14:49 raxmlHPC-SSE3
For both apps above, java is v20; the developers want and have tested against v17.
For both apps you may find R locally in the operating system, or a more recent version at
CD-HIT is a widely used program for clustering biological sequences to reduce sequence redundancy and improve the performance of other sequence analyses.
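A minimal clustering sketch, assuming cd-hit is on your PATH (no install location is listed on this page); file names and thresholds are placeholders:

# cluster proteins at 90% identity, 8 threads, 16 GB memory limit
cd-hit -i proteins.fasta -o proteins_nr90 -c 0.9 -n 5 -T 8 -M 16000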
Open-Source Computational Structural Biology Framework
OpenStructure relies on these applications and Python 2.7.15 (see python2 section).
-- Boost version: 1.68.0
-- Found the following Boost libraries:
--   python
--   unit_test_framework
--   filesystem
--   system
--   iostreams
--   program_options
--   regex
--   thread
-- Found ZLIB: /share/apps/CENTOS6/zlib/1.2.11/lib/libz.so (found version "1.2.11")
-- Found PNG: /share/apps/CENTOS6/lpng/1.6.37/lib/libpng.so (found version "1.2.49")
-- Found EIGEN: /share/apps/CENTOS6/eigen/3.2.10
-- Numpy headers found
-- Found FFTW: /share/apps/CENTOS6/fftw/3.3.8-enable-single/lib/libfftw3f.so
-- Found TIFF: /share/apps/CENTOS6/tiff/4.0.9/lib/libtiff.so (found version "4.0.9")

# setup the environment
egrep ^export /share/apps/CENTOS6/openstructure/1.9-install.txt

ost --help
Usage: ost [ost options] [script to execute] [script parameters]
   or  ost [action name] [action options]
Remarkably complex compilations.
An Easy and Professional Tool to Learn & Teach Programming with Python
Phylogenetic Analysis by Maximum Likelihood (PAML)…programs for phylogenetic analyses of DNA or protein sequences using maximum likelihood.
From website:
“
We now have an up to date webservice at http://bio2byte.be/acpype/
(but it does not have the amb2gmx funcionality).
To run acpype, locally, with its all functionalities, you need ANTECHAMBER from package AmberTools and Open Babel if your input files are of PDB format.
However, if one wants acpype just to emulate amb2gmx.pl, one needs nothing at all but Python.
“
— Henk 2021/02/18 14:41
# needed for _ctypes to compile properly for https retrievals
yum install openssl openssl-devel
yum install libffi-devel libffi

tar -xvf ../Python-3.8.3.tgz
cd Python-3.8.3/
./configure --prefix=/share/apps/CENTOS7/python/3.8.3 --enable-optimizations
make
make install
pip3 install --upgrade pip

# make symbolic links python3->python, pip3->pip
# drop all the packages in from previous versions
# https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#python3

export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH

which pip3 python3
/share/apps/CENTOS7/python/3.8.3/bin/pip3
/share/apps/CENTOS7/python/3.8.3/bin/python3
Added to both 3.5.2 versions
pycuda
— Henk 2020/02/18 08:45
Added to both 3.5.2 versions
Successfully installed
boto-2.49.0 boto3-1.12.0 botocore-1.15.0 gensim-3.8.1 jmespath-0.9.4 s3transfer-0.3.3 smart-open-1.9.0
joblib, scikit-learn, sklearn
Requirement already satisfied:
keras keras-preprocessing keras-applications
tensorflow (requires centos7)
— Henk 2020/02/17 13:02
Added to both 3.5.2 versions
jupyter-notebook
only runs in 3.5.2 (non-mpi)
— Henk 2019/07/23 09:04
Successfully installed Send2Trash-1.5.0 ipywidgets-7.5.0 jupyter-1.0.0 jupyter-client-5.3.1 jupyter-console-6.0.0 notebook-6.0.0 prometheus-client-0.7.1 terminado-0.8.2 widgetsnbextension-3.5.0 json5-0.8.5 jupyterlab-1.0.2 jupyterlab-server-1.0.0 tornado-6.0.3
Added the following modules, all needed for Spyder, to both python versions 3.5.2 and 3.5.2-mpi.
— Henk 2019/04/25 09:21
Successfully installed PyQT5-5.12.1 PyQt5-sip-4.19.15 Successfully installed Jinja2-2.10.1 MarkupSafe-1.1.1 alabaster-0.7.12 asn1crypto-0.24.0 astroid-2.2.5 attrs-19.1.0 babel-2.6.0 backcall-0.1.0 bleach-3.1.0 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 cloudpickle-0.8.1 cryptography-2.6.1 decorator-4.4.0 defusedxml-0.6.0 docutils-0.14 entrypoints-0.3 idna-2.8 imagesize-1.1.0 ipykernel-5.1.0 ipython-7.4.0 ipython-genutils-0.2.0 isort-4.3.17 jedi-0.13.3 jeepney-0.4 jsonschema-3.0.1 jupyter-client-5.2.4 jupyter-core-4.4.0 keyring-19.0.1 lazy-object-proxy-1.3.1 mccabe-0.6.1 mistune-0.8.4 nbconvert-5.4.1 nbformat-4.4.0 numpydoc-0.9.1 packaging-19.0 pandocfilters-1.4.2 parso-0.4.0 pexpect-4.7.0 pickleshare-0.7.5 prompt-toolkit-2.0.9 psutil-5.6.1 ptyprocess-0.6.0 pycodestyle-2.5.0 pycparser-2.19 pyflakes-2.1.1 pygments-2.3.1 pylint-2.3.1 pyqtwebengine-5.12.1 pyrsistent-0.14.11 pyzmq-18.0.1 qtawesome-0.5.7 qtconsole-4.4.3 qtpy-1.7.0 requests-2.21.0 rope-0.14.0 secretstorage-3.1.1 snowballstemmer-1.2.1 sphinx-2.0.1 sphinxcontrib-applehelp-1.0.1 sphinxcontrib-devhelp-1.0.1 sphinxcontrib-htmlhelp-1.0.2 sphinxcontrib-jsmath-1.0.1 sphinxcontrib-qthelp-1.0.2 sphinxcontrib-serializinghtml-1.1.3 spyder-3.3.4 spyder-kernels-0.4.4 testpath-0.4.2 tornado-6.0.2 traitlets-4.3.2 typed-ast-1.3.4 urllib3-1.24.2 wcwidth-0.1.7 webencodings-0.5.1 wrapt-1.11.1 wurlitzer-1.0.2
Environment
export PATH=/share/apps/CENTOS6/openmpi/3.1.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/3.1.3/lib:$LD_LIBRARY_PATH
which mpicc
/share/apps/CENTOS6/openmpi/3.1.3/bin/mpicc

CC=mpicc LDSHARED="mpicc -shared" ./configure --prefix=/share/apps/CENTOS6/python/3.5.2-mpi

export PATH=/share/apps/CENTOS6/python/3.5.2-mpi/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2-mpi/lib:$LD_LIBRARY_PATH

Packages

Successfully installed nltk-3.4
Successfully installed numpy-1.16.2
Successfully installed scipy-1.2.1
Successfully installed Cython-0.29.6
Successfully installed jenkspy-0.1.5
Successfully installed pandas-0.24.2 python-dateutil-2.8.0 pytz-2018.9
Successfully installed argparse-1.4.0
Installing collected packages: readline
Successfully installed mpi4py-3.0.1
Successfully installed emcee-2.2.1
Installed /home/apps/CENTOS6/python/3.5.2-mpi/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg
Successfully installed cycler-0.10.0
Successfully installed kiwisolver-1.0.1
Successfully installed matplotlib-3.0.3 pyparsing-2.3.1
Successfully installed seaborn-0.9.0
Successfully installed subprocess32-3.5.3
Successfully installed astropy-3.1.2
Successfully installed pathlib2-2.3.3
Successfully installed scandir-1.10.0
Successfully installed os0-0.2.14
Successfully installed pickle-utils-0.1
Successfully installed emcee-2.2.1 Requirement already satisfied: mpi4py in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (2.0.0) Successfully installed argparse-1.4.0 Successfully installed readline-6.2.4.1 Requirement already satisfied: pandas in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (0.21.0) Requirement already satisfied: python-dateutil>=2 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2.6.1) Requirement already satisfied: pytz>=2011k in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2017.3) Requirement already satisfied: numpy>=1.9.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (1.15.4) Requirement already satisfied: six>=1.5 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from python-dateutil>=2->pandas) (1.11.0) Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg Successfully installed cycler-0.10.0 kiwisolver-1.0.1 matplotlib-3.0.3 pyparsing-2.3.1 seaborn-0.9.0 Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/subprocess32-3.5.3-py3.5.egg Successfully installed astropy-3.1.2 Successfully installed pathlib2-2.3.3 Requirement already satisfied: matplotlib in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (3.0.3) Requirement already satisfied: cycler>=0.10 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (0.10.0) Requirement already satisfied: numpy>=1.10.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.15.4) Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.3.1) Requirement already satisfied: python-dateutil>=2.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.6.1) Requirement already satisfied: kiwisolver>=1.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.0.1) Requirement already satisfied: six in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from cycler>=0.10->matplotlib) (1.11.0) Requirement already satisfied: setuptools in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from kiwisolver>=1.0.1->matplotlib) (20.10.1)
export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
— Henk 2017/02/06 10:45 Added to 3.5
Copying mpi4py.egg-info to /share/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/mpi4py-2.0.0-py3.5.egg-info

# but I had to change the code and I do not know what the impact might be
# src/mpi4py.MPI.c
/* __pyx_v_rd = MPI_CONVERSION_FN_NULL; */
__pyx_v_rd = 0;

# Use this env pointing to old MPI versions
export MPI_DIR=/share/apps/openmpi/1.2+intel-10
export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export PATH=/share/apps/openmpi/1.2+intel-10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH
These versions of python can be selected by setting your PYTHONPATH and PYTHONHOME environment variables. For syntax, see the Autodock section.
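A minimal sketch of what that looks like for the 3.5.2 install (the Autodock section remains the authoritative syntax):

export PYTHONHOME=/share/apps/CENTOS6/python/3.5.2
export PYTHONPATH=/share/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages
export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH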
FastQC A quality control tool for high throughput sequence data.
java -version
# latest JDK installed, in default PATH
which java
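A minimal FastQC run, with hypothetical input file names; the -o output directory must exist:

mkdir -p fastqc_out
fastqc sample_R1.fastq.gz sample_R2.fastq.gz -o fastqc_out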
kallisto is a program for quantifying abundances of transcripts from bulk and single-cell RNA-Seq data, or more generally of target sequences using high-throughput sequencing reads.
# env
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"

bwa:   0.7.17-h84994c4_5     bioconda
perl:  5.26.2-h14c3975_1002  conda-forge
bowtie     bioconda/linux-64::bowtie-1.2.2-py27h2d50403_1
tbb        conda-forge/linux-64::tbb-2019.4-h6bb024c_0
star       bioconda/linux-64::star-2.7.0d-0
hdf5       conda-forge/linux-64::hdf5-1.10.3-hba1933b_1001
kallisto   bioconda/linux-64::kallisto-0.45.0-hdcc98e5_0

# --- Henk 2021/08/19 08:55
# Added packages below for Prof Coolon's lab (skalra)
conda list | egrep -i "samtools|bowtie2|bedtools|cufflinks"
bedtools    2.30.0    h7d7f7ad_1       bioconda
bowtie2     2.3.5.1   py27he513fc3_0   bioconda
cufflinks   2.2.1     py27_2           bioconda
samtools    1.13      h8c37831_0

# fcohan lab
[hmeij@petaltail ~]$ iqtree --version
IQ-TREE multicore version 2.1.4-beta COVID-edition for Linux 64-bit built Jun 24 2021
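A minimal kallisto index/quant sketch with the miniconda2 environment above sourced; file names are placeholders:

# build an index once, then quantify paired-end reads against it
kallisto index -i transcripts.idx transcripts.fasta
kallisto quant -i transcripts.idx -o quant_out -t 8 reads_1.fastq.gz reads_2.fastq.gz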
Trinity assembles transcript sequences from Illumina RNA-Seq data.
Trinity seems to need makeblastdb. BLAST finds regions of similarity between biological sequences.
export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH

# python 2.7 modules (these do not work with Trinity... bowtie2, jellyfish, salmon?)
bowtie2-2.3.4.3   | py27he860b03_1   11.9 MB   bioconda
jellyfish           conda-forge/linux-64::jellyfish-0.6.1-py27h14c3975_1000
bzip2-1.0.6       | h14c3975_1002    415 KB    conda-forge
icu-58.2          | hf484d3e_1000    22.6 MB   conda-forge
jemalloc-5.1.0    | hf484d3e_1000    10.2 MB   conda-forge
libboost-1.67.0   | h46d08c1_4       20.9 MB
salmon-0.13.1     | h86b0361_0       4.2 MB    bioconda
xz-5.2.4          | h14c3975_1001    366 KB    conda-forge

# trinity modules
Inchworm: has been Installed Properly
Chrysalis: has been Installed Properly
QuantifyGraph: has been Installed Properly
GraphFromFasta: has been Installed Properly
ReadsToTranscripts: has been Installed Properly
parafly: has been Installed Properly
slclust: has been Installed Properly
collectl: has been Installed Properly
Apparently bowtie, jellyfish, and salmon are not found as python modules. Install binaries from
and put them in the miniconda2 directory as that is the python used.
#!/bin/bash
# submit via 'bsub < run_trinity'
#BSUB -n 24
#BSUB -R "span[hosts=1]"
#BSUB -q mw256fd
#BSUB -J trinity
#BSUB -o trinity_assembly
#BSUB -e trinity_stderr

export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
export PATH="/share/apps/CENTOS6/python/3.5.2/samtools-1.9/bin:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/bowtie2-2.3.5-linux-x86_64:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/jellyfish-2.2.10:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/lib:$LD_LIBRARY_PATH"

# unique job scratch dirs
MYSANSCRATCH=/sanscratch/$LSB_JOBID
MYLOCALSCRATCH=/localscratch/$LSB_JOBID
MYLOCALSCRATCH5TB=/localscratch5tb/$LSB_JOBID
export MYSANSCRATCH MYLOCALSCRATCH MYLOCALSCRATCH5TB

# cd to remote working directory, only exists on mw256fd nodes
cd $MYLOCALSCRATCH5TB
pwd

Trinity --seqType fq --max_memory 240G \
 --left /sanscratch/tearley/seqs/illumina/fwd.fastq.gz \
 --right /sanscratch/tearley/seqs/illumina/rev.fastq.gz --CPU 24 \
 --long_reads /sanscratch/tearley/seqs/pacbio/ccs.fasta

# copy results to home (fix this brute force, keep results only...)
scp -r ./trinity_out_dir ~/trinity_out_dir_$LSB_JOBID
All of these are located in /share/apps/CENTOS6/python/3.5.2/
Deepbinner: a deep convolutional neural network barcode demultiplexer for Oxford Nanopore reads
Canu: a single molecule sequence assembler for genomes large and small.
Eigen: Eigen is a C++ template library for linear algebra: matrices, vectors, numerical solvers, and related algorithms.
Nanopolish : Software package for signal-level analysis of Oxford Nanopore sequencing data.
BioPython: A set of freely available tools for biological computation
Samtools: is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:
Minimap2: A versatile pairwise aligner for genomic and spliced nucleotide sequences
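To pick these up, put that python tree first in your environment (the same env as in the Python section above); note that some tools live in their own subdirectories there, e.g. samtools-1.9/bin as used in the Trinity script above. The exact binary names that resolve are not listed on this page, so verify with which:

export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/python/3.5.2/samtools-1.9/bin:$PATH
which canu minimap2 nanopolish samtools   # check what actually resolves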
Caution Consult file /share/apps/src/cottontail2/ZENO_configure
regarding the fix that was applied to SphereCenterModel/NanoFLANNSort.cc
which is not a fix at all. Your mileage may vary.
# zeno-mpi was compiled with this MPI flavor
/share/apps/CENTOS6/miniconda2/bin/mpicc
/share/apps/CENTOS6/miniconda2/bin/mpicxx

# also setup these
export NANOFLANN_DIR=/share/apps/CENTOS6/nanoflann
export SPRNG_DIR=/share/apps/CENTOS6/sprng5
Newer CPU+GPU standalone library version (top of page)
— Henk 2021/06/29 14:38
Newer CPU only version installed in Miniconda3 (top of page)
Miniconda3 is using python 3.8
— Henk 2021/06/22 15:41
Accelerated Library for Analysing Radio Interferometer Observations (CPU only)
[hmeij@cottontail ~]$ conda --version
conda 4.5.4

# $HOME/.bashrc for galario
# added by Miniconda2 installer
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"

  package                |            build
  -----------------------|-----------------
  certifi-2021.5.30      | py37h89c1867_0        141 KB    conda-forge
  libblas-3.8.0          | 17_openblas           11 KB     conda-forge
  libcblas-3.8.0         | 17_openblas           11 KB     conda-forge
  liblapack-3.8.0        | 17_openblas           11 KB     conda-forge
  libopenblas-0.3.10     | pthreads_hb3c22a3_4   7.8 MB    conda-forge
  libstdcxx-ng-9.3.0     | h6de172a_19           4.0 MB    conda-forge
  numpy-1.21.0           | py37h038b26d_0        6.1 MB    conda-forge
  pandas-1.2.5           | py37h219a48f_0        11.8 MB   conda-forge
  pip-21.1.3             | pyhd8ed1ab_0          1.1 MB    conda-forge
  python-3.7.3           | h5b0a415_0            35.7 MB   conda-forge
  python-dateutil-2.8.1  | py_0                  220 KB    conda-forge
  python_abi-3.7         | 2_cp37m               4 KB      conda-forge
  pytz-2021.1            | pyhd8ed1ab_0          239 KB    conda-forge
  setuptools-49.6.0      | py37h89c1867_3        947 KB    conda-forge
  six-1.16.0             | pyh6c4a22f_0          14 KB     conda-forge
  wheel-0.36.2           | pyhd3deb0d_0          31 KB     conda-forge
FreeSurfer is a software package for the analysis and visualization of structural and functional neuroimaging data from cross-sectional or longitudinal studies
FSL is a comprehensive library of analysis tools for FMRI, MRI and DTI brain imaging data.
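Both packages ship their own environment scripts; a hedged sketch with hypothetical install prefixes (adjust to wherever they were actually unpacked on this cluster):

export FREESURFER_HOME=/share/apps/CENTOS7/freesurfer     # assumed location
source $FREESURFER_HOME/SetUpFreeSurfer.sh
export SUBJECTS_DIR=$HOME/freesurfer_subjects

export FSLDIR=/share/apps/CENTOS7/fsl                     # assumed location
source $FSLDIR/etc/fslconf/fsl.sh
export PATH=$FSLDIR/bin:$PATH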
Molecular Modeling Software
NOTE: Lots of errors in compiling serial mode.
NOTE2: When compiling campari_threads ewaldo.f90 fails terribly
FastTree infers approximately-maximum-likelihood phylogenetic trees from alignments of nucleotide or protein sequences.
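A minimal run, assuming FastTree is on your PATH; the alignment file name is a placeholder:

# nucleotide alignment with the GTR model; omit -nt -gtr for protein alignments
FastTree -nt -gtr alignment.fasta > tree.nwk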
Mathematics software, a viable free open source alternative to Magma, Maple, Mathematica and Matlab.
# compiled against CentOS 6.4 (sharptail6) using this python version
export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
The National Center for Biotechnology Information advances science and health by providing access to biomedical and genomic information.
Bayesian hierarchical models using Markov Chain Monte Carlo (MCMC) simulation that can be controlled from R.
Compiled for every queue, for usage consult BLCR Checkpoint in OL3 and BLCR Checkpoint in OL3 for serial and parallel approaches.
Astronomy package for exoplanet searches.
Installed into Python 2.7.10 on CentOS6.x, making sure CentOS gcc and gfortran are first in path.
# select vanilla python to use for centos6.x 64 bit
# also installed easy_install and pulled down new Numpy
export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH

# notes: had to change --opt flag
# had to use the MAC line for compilation to avoid the invalid ELF header error
python setup.py config_fc --fcompiler=gnu95 --opt="-03" \
  --f90flags="-cpp -fopenmp -march=native -mno-avx" build
python setup.py install
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/PyTransit-1.0-py2.7.egg-info

# later added
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scipy-0.16.1-py2.7.egg-info

# make sure correct mpi tools
env MPICC=/path/to/mpicc pip install mpi4py
For MPI4py had to force MPI-2
export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH
python setup.py build --configure
python setup.py install
# And added on 09Aug2017
# https://github.com/lkreidberg/batman
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/batman_package-2.4.5-py2.7.egg-info
There is also a version in /share/apps/CENTOS6/blcr_soft with PyTransit and Emcee
Intel compiler library for parallel jobs.
Add to your LD_LIBRARY_PATH
Automatically generates schematic diagrams of protein-ligand interactions for a given PDB file.
Next generation parallel sequencing technologies made chromatin immunoprecipitation followed by sequencing (ChIP-Seq) a popular strategy to study genome-wide protein-DNA interactions, while creating challenges for analysis algorithms.
Should use Python 2.7.9 (see python section on setting up environment) on CentOS6 nodes.
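A minimal environment sketch for that python (the macs2 binary lives in this tree, see the sequencing-tools environment earlier on this page):

export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
which python macs2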
MODELLER is used for homology or comparative modeling of protein three-dimensional structures.
Please read the license at https://salilab.org/modeller/registration.html
How to use modeller: read /share/apps/modeller/9.13/INFO.TXT or visit web site.
Miriad, a package of highly-specialized radio astronomy software.
# Define your Miriad environment by invoking the initialization scripts
% . MIRRC.sh      # in sh-like shells, or
% source MIRRC
Compiled with /share/apps/python/2.6.1/bin/python …
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/astropy-0.3.1-py2.6-linux-x86_64.egg
Processing dependencies for astropy==0.3.1
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for astropy==0.3.1
For the latest version of emcee+mpi4py (03/15/2019) see the section of python 3.5.2 https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#python
Which is also used by the biology group for https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#sequencing_tools
Compiled with /share/apps/python/2.6.1/bin/python …
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/emcee-2.1.0-py2.6.egg
Processing dependencies for emcee==2.1.0
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0

Writing /share/apps/python/2.6.1/lib/python2.6/site-packages/mpi4py-1.3.1-py2.6.egg-info

# it found /home/apps/openmpi/1.2+intel-10/bin/mpicc so be sure to use mpirun from there too
export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH
Also added emcee to python 2.7.10, see the section on pyTransit
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/argparse-1.4.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/pandas2-0.0.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/subprocess32-3.5.3-py2.6-linux-x86_64.egg
(Rosetta 2.2, 3.0 and 3.5 also are installed)
Compiled with /share/apps/python/2.6.1/bin/python …
Please read the license at /share/apps/pyrosetta/2014wk05/License.2014wk05
Note: although it probably does not matter, the python compile was performed under CentOS 6 because the latest phenix version requires that. So target the mw256 and mw256fd queues.
Gromacs-2020.3 for GPU (RTX2080S), exx96 queue
Gromacs-2018 for GPU (RTX2080S), exx96 queue
Gromacs-2018 for GPU (K20/GTX1080)
Gromacs 5.1.4: there is only one version
Gromacs 5.1.2: there are four versions
mdrun has been replaced by gmx
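In practice the old standalone mdrun invocation becomes a subcommand of the gmx wrapper; a hedged sketch with placeholder file names:

# Gromacs 4.x style
mdrun -deffnm md
# Gromacs 5.x and newer: everything hangs off the gmx wrapper
gmx grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr
gmx mdrun -deffnm md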
For the latest version there are 6 compilations. Gromacs developers are pretty verbal in terms of “compile on the host where you are going to run it”. For v5.0.1, the nomenclature is like this:
The file /share/apps/gromacs/build-5.0.1.sh will show you notes on compiling. Update: the GPU versions failed to compile with icc/icpc but did compile with gcc. Will investigate this strange behavior later. — Meij, Henk 2014/10/01 15:33
# for all versions
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/gromacs/5.0.1[-icc or -mpi][-hp12 or -mw256 or -mwgpu]/bin/GMXRC.sh
# for all "icc versions" you are done

# for all "mpi versions" add this
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH

# for all "gpu versions" add this
export PATH=\
/cm/shared/apps/gcc/4.7.0/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/lammps/cuda/2013-01-27/:/cm/shared/apps/amber/amber12/bin:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/usr/lib64/qt-3.3/bin:\
/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/sbin:/usr/sbin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:/cm/shared/apps/mvapich2/gcc/64/1.6/sbin
export LD_LIBRARY_PATH=\
/cm/shared/apps/gcc/4.7.0/lib:/cm/shared/apps/gcc/4.7.0/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:/cm/shared/apps/amber/amber12/lib:\
/cm/shared/apps/amber/amber12/lib64:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/libs/current/lib64:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:/cm/shared/apps/mvapich2/gcc/64/1.6/lib
# source for Intel compiler setup
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
# and source the GMXRC for your shell
. /share/apps/gromacs/4.6-icc[or -mpi]/bin/GMXRC.[bash|zsh|csh]
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
Found the Intel fortran compiler, so you may have to source the following (albeit it should be platform independent):
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
# SERVER
export PATH=/home/apps/python/2.6.1/bin/:$PATH
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/eon-0.0.0-py2.6.egg

# CLIENT
eonclient -h
Usage: eonclient [options] inputConfile [outputConfile]

Job Type:
  -m    Minimization of inputConfile saves to outputConfile
  -s    Single point energy of inputConfile
  -o    Optimization method [default: qm]

Required Options:
  -p    The potential (e.g. qsc, lj, eam_al)
SAS (v9.2, 64-bit) is now available on the compute clusters. This software is an enormous statistical package and lots more; see http://www.sas.com. It should be invoked in batch mode on the compute nodes just like all other software (i.e. a command like “sas filename.sas” in job submission scripts; a minimal sketch follows below). Some pointers, code, a manual and an online tutor can be found at
The QAC has well trained tutors in SAS, for more info http://qac.wesleyan.edu
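A minimal batch submission sketch for SAS, modeled on the other LSF scripts on this page; queue, resources, and file names are placeholders:

#!/bin/bash
# submit via 'bsub < run_sas'
#BSUB -n 1
#BSUB -q mw256fd
#BSUB -J sasjob
#BSUB -o sasjob.stdout
#BSUB -e sasjob.stderr
cd ~/saswork
sas myanalysis.sas     # writes myanalysis.log and myanalysis.lst in this directory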
Note: Lapack and Blas packages are needed and were installed on host greentail only. Hence the program will only work on this host. If it needs to be available on compute nodes, please let me know.
yum install unixODBC
This package works together with mysql-connector-odbc-5.1.8-1.rhel5.x86_64.rpm (from http://dev.mysql.com/downloads/connector/odbc/, installed in the OS) and pyodbc (from http://code.google.com/p/pyodbc/downloads/list, installed into python 2.7.2).
export PATH=/share/apps/go/bin:$PATH
Notes: “coulomb” did not compile with OpenMPI, MVAPICH1 or 2, but finally compiled against MVAPICH 0.99 (upgraded to 1.2)… the rest was compiled with gcc and 6g (from the Go language). In order to run the MPI binary, do not use our lsf/lava wrappers but build up the command line with -np and -machinefile parameters (a sketch follows the environment below).
export GOROOT=/share/apps/go
export PATH=/share/apps/go/bin:/share/apps/mvapich/1.2+gnu-4.1.2+gen2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/mvapich/1.2+gnu-4.1.2+gen2/lib:$LD_LIBRARY_PATH
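A hedged sketch of building that command line yourself (the host file, rank count and binary name are placeholders):

# 4 ranks spread over the hosts listed in ./machinefile
mpirun -np 4 -machinefile ./machinefile ./coulomb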
export PATH=/share/apps/fftw/3.3.6/bin:$PATH export LD_LIBRARY_PATH=/share/apps/fftw/3.3.6/lib:$LD_LIBRARY_PATH
Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view http://www.fftw.org/
Compiled as follows, you may have to set up your environment similarly:
export PATH=/share/apps/intel/cce/10.0.025/bin:\
/share/apps/openmpi/1.2+intel-10/bin:\
$PATH
export LD_LIBRARY_PATH=/share/apps/intel/cce/10.0.025/lib:\
/share/apps/openmpi/1.2+intel-10/lib:\
/share/apps/intel/cmkl/10.0.011/lib/em64t:\
$LD_LIBRARY_PATH
./configure --prefix=/share/apps/fftw/2.1.5_mpi \
--enable-mpi CC=icc CXX=icpc F77=ifort FC=ifort
You may have to set the following:
export MKL_HOME=/share/apps/intel/cmkl/9.1.021
export AMBERHOME=/share/apps/amber/9serial+ifort+plumed
export LD_LIBRARY_PATH=/share/apps/intel/cmkl/9.1.021/lib/64:$LD_LIBRARY_PATH
7Feb2024
25Apr2023
# CPU
export PATH=/share/apps/CENTOS7/gcc/6.5.0/bin/:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/6.5.0/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
which mpirun mpicc python gcc

# CPU + GPU (exx96 queue, may run on test and amber128 queues)
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
which nvcc

# related for starrlab
# https://pacemaker.readthedocs.io/en/latest/pacemaker/install/
[hmeij@cottontail2 ~]$ module load miniconda3/py39
[hmeij@cottontail2 ~]$ which pacemaker
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/pacemaker
[hmeij@cottontail2 ~]$ pacemaker --version
pacemaker/pyace version: 0.2.7+77.g8b40c11
ace_evaluator version: 2022.6.27
[hmeij@cottontail2 ~]$ python
Python 3.9.10 | packaged by conda-forge | (main, Feb 1 2022, 21:24:11) [GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyace as py
>>>
USER_QUIP
This is a version of lammps (29Oct20, last stable) that contains all packages with user-quip.
It does compile with MPI (mpif90), not ifort, on CentOS7. It was tricky; the solution for MPI was the answer from mpifort --showme:link.
Read, if you wish
Environments
# serial - cpu only (although quippy compiled, it refused to install)
# MPI STUBS is compiled in
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
/share/apps/CENTOS7/lammps/29Oct2020/lmp_serial-with-quip.cpu -h

# mpi - cpu only
# gfortran_openmpi
export QUIP_LIBRARY=/share/apps/CENTOS7/lammps/QUIP-public/9Apr2021/linux_x86_64_gfortran_openmpi
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
/share/apps/CENTOS7/lammps/29Oct2020/lmp_mpi-with-quip.cpu -h

# mpi - cpu+gpu
# gfortran_openmpi arch
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export QUIP_LIBRARY=/share/apps/CENTOS7/lammps/QUIP-public/9Apr2021/linux_x86_64_gfortran_openmpi
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
# the "make" versions of lmp with quip do not work, deleted them
# did find a reference on the nvidia dev forum: the problem disappears when using cmake (vs make)
# these lmp binaries do work
/share/apps/CENTOS7/lammps/29Oct2020/single_single/bin/lmp
/share/apps/CENTOS7/lammps/29Oct2020/single_double/bin/lmp
/share/apps/CENTOS7/lammps/29Oct2020/double_double/bin/lmp
# however I had to load the non-mpi version of libquip.a into lmp
# which seems ok given parallel work is driven by -suffix gpu
# sample script: ~hmeij/k20redo/run.rtx.lammps2
CPU only - CentOS6
CPU only OR CPU+GPU for exx96 queue - CentOS7
# CPU
/share/apps/CENTOS7/lammps/3Mar2020/lmp_serial+user-omp+user-reaxc
/share/apps/CENTOS7/lammps/3Mar2020/lmp_mpi+user-omp+user-reaxc
# CPU+GPU
/share/apps/CENTOS7/lammps/3Mar2020/lmp_mpi-single-single_kokkos_cuda_mpi+user-omp+user-reaxc
# env
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
CPU ONLY v2 CentOS7
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
CPU ONLY
CPU or GPU
# simple colloid example
CPU run -n1
Performance: 493311 tau/day, 1141 timesteps/s
GPU run gpu4=1 double-double: 1.1x
Performance: 571354 tau/day, 1322 timesteps/s
GPU run gpu4=1 single-double: 1.2x
Performance: 608498 tau/day, 1408 timesteps/s
GPU run gpu4=1 single-single: 1.3x
Performance: 623405 tau/day, 1443 timesteps/s
precision=N to set the precision level
  N = 1 for single precision (default)
  N = 2 for double precision
  N = 3 for positions in double precision
  N = 4 for positions and velocities in double precision
To use any of these you will need to set up the environment as detailed below (PATH, LD_LIBRARY_PATH). Here is the environment showing the version of the software used in the compilation of lmp_cuda_double. The other versions used the same compiler (ICC) and MPI version (OpenMPI 1.8.4).
Note: According to the docs with USER-CUDA you must have 1 cpu MPI process for 1 GPU, so 1:1, 2:2 …
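For example, per that 1:1 rule, a node with two GPUs would get two MPI ranks (a hedged sketch; the hostfile and input file names are placeholders, and any input-script or package switches your run needs are omitted):

# 2 GPUs -> 2 MPI processes (1:1)
mpirun -np 2 -machinefile ./hostfile lmp_cuda_double -in in.colloid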
### USER-CUDA sample script /home/hmeij/sharptail/run_user-cuda.gpu
export PATH=/home/apps/bin:\
/bin:\
/usr/bin:\
/usr/local/sbin:\
/usr/sbin:\
/sbin:\
/usr/sbin:\
/usr/lib64/qt-3.3/bin:\
/cm/local/apps/cuda50/libs/304.54/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:\
/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/sbin:\
/cm/shared/apps/lammps/cuda/2013-01-27/:\
/share/apps/CENTOS6/openmpi/1.8.4/bin:\
/share/apps/CENTOS6/python/2.7.9/bin:\
/share/apps/CENTOS6/lammps/10Aug15:\
/home/hmeij/bin:\
$PATH
export LD_LIBRARY_PATH=/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib:\
/cm/shared/apps/cuda50/libs/current/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:\
/share/apps/CENTOS6/openmpi/1.8.4/lib/:\
/share/apps/CENTOS6/python/2.7.9/lib:\
/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/local/apps/cuda50/libs/304.54/lib/:\
$LD_LIBRARY_PATH
go to How to submit GPU jobs with Lammps
Note: all versions have package yes-colloid installed.
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.4.4/bin:$PATH
Please consult the README file to set up your environment properly. The '22Jun11' LAMMPS version was compiled with Intel compilers v12 and the OpenMPI mpicc compiler of v1.4.3
Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view http://www.fftw.org/
Compiled with a variety of compilers and MPI flavors. Your mileage may vary as serious warnings were generated at compile time. However, lmp_serial_wes and lmp_tbird_wes compiled without any errors.
You may have to redefine your PATH and LD_LIBRARY_PATH; please consult /share/apps/lammps/30Mar10/README
Pizza.py is a loosely integrated collection of tools, many of which provide pre- and post-processing capability for the LAMMPS molecular dynamics, ChemCell cell modeling, and SPPARKS kinetic Monte Carlo packages. (yea, “loosely” is right). In order for this to work a new python flavor was installed. Please read the section about python/2.6.1.tcltk
in the python section here
Please note the appropriate citation and reference guides for each library (found on web site).
Note: compiled with ifort
Note: compiled with gcc
Note: Contact David Beveridge for more information (dbeveridge@wes)
Please note, there is an 'Aborted' event in one of the demos; glibc throws an error, documented in the demo_output.txt file. It seems to still generate output though.
Note: compiled with ifort
Note: compiled with ifort
Note: For internal, non-commercial, research and development purposes only. Consult the LICENSE file and Citing information. Users need to set their environment, examples shown.
Version below only works on queues mw256 and mw256fd
# csh users:
source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env
# bash users:
source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env.sh
Version below only works on queue hp12
# bash users: source /share/apps/phenix/phenix-1.5-2/phenix_env.sh
Note: our hardware/platform not supported, compiled with ifort (minor warnings)
Note: wham compiled with warnings using icc. wham-2d failed to compile.
* program: vina v 1.0 beta 03 (for molecular docking and virtual screening)
export PATH=/share/apps/autodock:$PATH
# or
export PATH=/share/apps/autodock/autodock_vina_1_1_2_linux_x86/bin:$PATH
Also linked in via /home/apps/bin which is in default PATH.
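A hedged example invocation once vina is on your PATH (the receptor, ligand and config file names are placeholders; see vina --help for the full option list):

vina --receptor receptor.pdbqt --ligand ligand.pdbqt --config conf.txt --out docked.pdbqt --log docked.log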
The dependencies for PyPat … drumroll … incredible!
In order to run some PyPat programs or invoke some of the underlying programs you may have to set some of the published environment variables listed below.
Python
export PYTHONHOME=/home/apps/python/2.6.1
ImageMagick
NumPy & Scipy
NumPy was configured against the CMKL BLAS and LAPACK by uncommenting the lines for [cmkl] in site.cfg
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH export PYTHONHOME=/home/apps/python/2.6.1
It appears you also need to set your PATH to find the appropriate binary:
export PATH=/home/apps/python/2.6.1/bin:$PATH
Also upgraded and installed:
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/scipy-0.11.0-py2.6-linux-x86_64.egg
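A quick hedged sanity check, assuming the PATH/LD_LIBRARY_PATH settings shown above so the 2.6.1 interpreter is picked up:

which python
python -c "import numpy, scipy; print numpy.__version__, scipy.__version__"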
NumPy requires the installation of Nose.
Nose
MatPlotLib
Note: needed to add the location of the numpy python egg to the basedir array so that setup.py could find the egg.
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH export PYTHONHOME=/home/apps/python-2.6.1
PyMOL
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python-2.6.1
export PYMOL_PATH=/home/apps/python-2.6.1/lib/python-2.6.1/site-packages/pymol
PyMOL required the presence of libglut. Installed the freeglut-devel-2.2.0-14.x86_64 package (rpm -ivh freeglut-devel-2.2.0-14.x86_64.rpm).
PyMOL libraries have also been installed in the following location:
export LD_LIBRARY_PATH=/home/apps/pymol/1.1:$LD_LIBRARY_PATH
The following was not required by PyPat, but was requested for installation from http://sitemaker.umich.edu/carlsonlab/resources.html
Automated Flooding and Clustering source code
Global wRMSD Fit source code
Local wRMSD Fit source code
wRMSD input preparation
PyMOL Rendering plug-in
rpm -ql package_name
centos6 (pmv, adt, vision, pythonsh)
below is rhel5 (pmv, adt, vision, pythonsh)
This package works with Autodock. This is a 64 bit installation against glibc 2.5+. You will need to adjust your PYTHONHOME and PYTHONPATH variables.
.cshrc: set path = (/home/apps/mgltools/1.5.2/bin $path) .bashrc export PATH=/home/apps/mgltools/1.5.2/bin:$PATH
Please try to run these on host greentail (centos 5.x) for interactive use, or cottontail2 (centos6.x) for interactive use. Linked in via /home/apps/bin.
Note: Compiled against v10 intel compilers, 64 bit. Program clearly configured for 32 bit operation.
Note: compiled with ifort -f77rtl -o name.exe name.f
Warning: when compiling micm, ifort warns of 2-digit year.
Note: configured with option --with-python, which installed into /usr/lib/python2.4 (also copied to /home/apps/python/2.6.1/site_packages).
Version 2.7b1 was compiled from source against OpenMPI (1.2+intel-10). You may have to add the OpenMPI lib/ directory to your LD_LIBRARY_PATH.
namd2 (as well as some others)
The following two installs are binary distros:
namd2 (as well as some others)
namd2 (as well as some others)
Note: version 2.6 is swallowtail's version compiled against Topspin MPI.
namd2 (as well as some others)
export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64
There is a sample page here: LINK
Note: this is the compiled copy of swallowtail and seems to find the libraries properly. NAB is now part of AmberTools so the newest nab program can be found in the /home/apps/amber/10+openmpi-1.2+intel-10/bin location.
nab
Users should set the NABHOME environment variable. Programs were compiled with icc with the -mpi flag.
If using csh/tcsh add to ~/.cshrc
or if using sh/bash add to ~/.bashrc
In order to make this the default in your PATH, for example …
export PATH=/home/apps/imagemagick/6.4.8-5/bin:$PATH
in your shell or add it to ~/.bashrc. The latest version is linked in via /home/apps/bin
Note: Please read the license agreement.
man mutt
man gnuplot
Make the programs available like so:
export PATH=/home/apps/qt/4.5.0/bin:$PATH
/share/apps/CENTOS6/R/R-3.6.1_configure
install.packages('lme4', dependencies = TRUE)
install.packages('tidyverse', dependencies = TRUE)
install.packages('mice', dependencies = TRUE)
install.packages('rpart', dependencies = TRUE)
install.packages('party', dependencies = TRUE)
install.packages('caret', dependencies = TRUE)
install.packages('nnet', dependencies = TRUE)
install.packages('e1071', dependencies = TRUE)
install.packages('xgboost', dependencies = TRUE)
install.packages('kernlab', dependencies = TRUE)
install.packages('randomForest', dependencies = TRUE)
export PATH=/share/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:/share/apps/CENTOS6/gcc/4.8.4/lib:$LD_LIBRARY_PATH
#export PATH=/share/apps/java/jdk-12.0.2/bin:$PATH
#export LD_LIBRARY_PATH=/share/apps/java/jdk-12.0.2/lib:$LD_LIBRARY_PATH
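A quick hedged check that the environment above picks up the intended R and the installed packages (the script name is a placeholder):

which R
Rscript -e 'library(lme4); sessionInfo()'   # lme4 was installed above
Rscript myscript.R                          # placeholder batch script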
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/3.5.1/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.5.1/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/java/jdk1.7.0_21/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/java/jdk1.7.0_21/lib:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/3.4.3/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.4.3/lib64:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export MKL_HOME=/share/apps/intel/composerxe/mkl
Native CentOS R is most up to date on node swallowtail for interactive use. It also has a bunch of packages installed.
Note: All versions were compiled with Blas and Lapack libraries found in the Cluster Math Kernel Libraries.
Set your environment as follows for 3.1.0
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export PATH=/share/apps/R/3.1.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/3.1.0/lib64/R/lib:$LD_LIBRARY_PATH
Set your environment as follows for 2.14.1
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export PATH=/share/apps/R/2.14.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/2.14.1/lib64/R/lib:$LD_LIBRARY_PATH
Set your environment as follows for 2.8.1
. /share/apps/intel/cce/10.0.025/bin/iccvars.sh
. /share/apps/intel/fce/10.0.025/bin/ifortvars.sh
export PATH=/share/apps/R/2.8.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
Please consult the README file in install dir to set your environment.
Small program to show the concept of parallel programming using the n-body physics problem. Compiled against old Topspin MPI.
export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64:$LD_LIBRARY_PATH
Note: users of dowser should add one of the following lines …
if using csh/tcsh add to ~/.cshrc
or if using sh/bash add to ~/.bashrc
Amber 22
Details on how to run on CentOS7 can be found at
Amber 20
AMBER 18
Compiled on n33 with a /usr/local/n37-cuda-9.2 soft link; on exx96 we're mixing.
# compiled on n33
[hmeij@n33 amber18]$ ls -l bin/pmemd*
-rwxr-xr-x 1 hmeij its  3549803 Mar 30 10:19 bin/pmemd
lrwxrwxrwx 1 hmeij its       15 Mar 30 11:57 bin/pmemd.cuda -> pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 93895208 Mar 30 12:11 bin/pmemd.cuda_DPFP
-rwxr-xr-x 1 hmeij its 94734272 Mar 30 13:07 bin/pmemd.cuda_DPFP.MPI
lrwxrwxrwx 1 hmeij its       19 Mar 30 12:51 bin/pmemd.cuda.MPI -> pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its 75579744 Mar 30 11:57 bin/pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 76418760 Mar 30 12:51 bin/pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its  4304392 Mar 30 11:23 bin/pmemd.MPI
# some quick results | Average timings for all steps:
~hmeij/k20redo/run.amber18 - mwgpu - K20      | ns/day = 2.84   seconds/ns = 30385.24
~hmeij/k20redo/run.amber18 - exx96 - RTX2080S | ns/day = 11.66  seconds/ns = 7408.08
~hmeij/k20redo/run.amber16 - exx96 - RTX2080S | ns/day = 7.09   seconds/ns = 12185.91
AMBER 16
AMBER 16
# next line important for Amber python modules, miniconda
source /share/apps/CENTOS6/amber/amber16/amber.sh
export AMBERHOME=/share/apps/CENTOS6/amber/amber16
export PATH=$AMBERHOME/bin:$PATH
export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib/:$LD_LIBRARY_PATH
AMBER 12
Has been compiled using Intel v12 compilers and OpenMPI 1.4.4, hence you need the code below to set up your environment. This v12 version of Amber is for serial and parallel runs on CPUs only (not GPUs). The appropriate wrapper is the lava.openmpi.wrapper script. Scripts running the v9 and v10 versions of Amber are identical to this v12 version once paths have been adjusted.
export AMBERHOME=/share/apps/amber/12cpu-only
export PATH=$AMBERHOME/bin:$PATH
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
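With that environment in place, a hedged parallel example (normally you would go through the lava.openmpi.wrapper script; the input, topology and coordinate file names are placeholders):

# 8-way parallel sander run; -O overwrites existing output files
mpirun -np 8 sander.MPI -O -i md.in -p prmtop -c inpcrd -o md.out -r restrt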
AMBER 9
Please invoke with appropriate wrapper script.
Some related details about the Amber installation(s) and how to run Amber jobs on the cluster can be found at this Link. This is old information but may help you along.
For sander.MPI and pmemd
please add the following line to your ~/.bashrc file
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH
AMBER 10
Please invoke with appropriate wrapper script.
Some related details about the Amber installation(s) and how to run Amber jobs on the cluster can be found at this Link. This is old information but may help you along.
For sander.MPI and pmemd
please add the following line to your ~/.bashrc file (I compiled 9 and 10 against same CMKL version to avoid confusion).
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH
AmberTools 1.2 has been installed.
RIP Scripts have been installed; consult http://boscoh.com/rip. Location: /home/apps/amber/rip-scripts. In order to use them, you will need to point to Python 2.6.1, which contains the NumPy egg.
(PyRosetta v2.011 is also installed)
Both versions below were compiled with /share/apps/python/2.6.1/bin/python …
Please read the license at /share/apps/rosetta/LICENSE-3.0
Please read the license at /share/apps/rosetta/LICENSE-3.0
Note: this is a 64-bit compilation using python 2.6.1 and therefore is an “unsupported” configuration.
Please read the license at /share/apps/rosetta/LICENSE-2.2.0
Note: this is a 64-bit compilation (done on swallowtail) using target gcc64
and therefore is an “unsupported” configuration.
EMT64 x86_64 version (for mw256[fd], mw128, tinymem) (g16)
EMT64 x86_64 version (for hp12)
export gdvroot="/share/apps/gaussian/gdvh11"
. $gdvroot/gdv/bsd/gdv.profile
EMT64 x86_64 version (for swallowtail/petaltail) (g09)
AMD64 x86_64 version (for sharptail) (g09)
AMD64 x86_64 version (for sharptail) (g09)
AMD64 & EMT64 x86_64 version (for both swallowtail/petaltail and sharptail clusters)
Please note that if run from sharptail, you must explicitly enable X forwarding when connecting to this host, like so
ssh -X sharptail
ENVIRONMENT
Users will want to add the following to their .login file:
setenv g09root "dir"
setenv GAUSS_SCRDIR "scr-dir"
source $g09root/g09/bsd/g09.login   # sets up G09 run environment.
The <dir> in the first command is the location of the g09 directory. For example, if the path to this directory is /usr/local/g09, then set g09root to /usr/local. The <scr-dir> in the second command is a directory to be used by default for Gaussian 09 scratch files. There should be plenty of disk space available at this location.
Bourne-type shell users would use these commands instead of the ones above:
g09root="dir"
GAUSS_SCRDIR="scr-dir"
export g09root GAUSS_SCRDIR
. $g09root/g09/bsd/g09.profile
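Once the environment is sourced, a typical batch invocation looks like this (the input file name is a placeholder):

# run Gaussian 09 on an input file; output goes to the .log file
g09 < myjob.com > myjob.log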
DOCUMENTATION
Note: in order to use this application you must be a member of group gaussian. Please send a request to the administrator.
Details about the Gaussian installation and how to run Gaussian jobs on the cluster can be found at this Link. Please read that page.
You will need to add the bin/ and lib/ directories to your PATH and LD_LIBRARY_PATH environment variables (examples on this page); for instance:
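A hedged template (the package and version directories are placeholders; substitute the actual install path):

export PATH=/share/apps/<package>/<version>/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/<package>/<version>/lib:$LD_LIBRARY_PATH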
For CentOS6.x (mw256[fd] queues), compiled with cmake/gcc
For RHEL5.x and CentOS5.x (hp12 queue), compiled with ICC/ifort
(autodock4, autogrid4)
Note: Requires OpenBabel. You need to add the following lines to your ~/.bashrc file or program.
export PATH=/home/apps/autodock:/home/apps/openbabel/2.1.1/bin:$PATH
export PYTHONPATH=/home/apps/openbabel/2.1.1/site-packages:$PYTHONPATH
export PERL5LIB=/home/apps/openbabel/2.1.1/lib/perl:$PERL5LIB
export LD_LIBRARY_PATH=/home/apps/openbabel/2.1.1/lib:$LD_LIBRARY_PATH
Note: The perl plugin will not compile; a message has been sent to the list.
— Meij, Henk 2008/01/25 15:39
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
For v1.4.3/v1.4.4 you will have to set:
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
# also change (not sure it matters); LD_LIBRARY_PATH is modified by the above scripts
export MKL_HOME=/share/apps/intel/composerxe/mkl
Warning: end of life, please use OpenMPI.
These are pretty old; I suggest you do not use them.
Note: this is the mpich MPI suite of libraries that came with OCS 4.1.1 (swallowtail), in case users compiled programs against these libraries. Not supported anymore in OCS 5.1; suggest using the latest OpenMPI flavor.
Versions are installed with the most recent linked into users' environments via /share/apps/bin. JAVAHOME points to /home/apps/java/java, which is the latest version. To set another flavor as your default java, here is an example:
# update
[hmeij@greentail52 ~]$ java --version
java 20.0.2 2023-07-18

export JAVAHOME=/home/apps/j2sdk1.4.2_17/bin:$PATH
export CLASSPATH=$JAVAHOME/lib:$CLASSPATH
drwxrwxr-x  9 root root 4096 Aug 20  2007 j2sdk1.4.2_07
drwxr-xr-x  9 root root 4096 Oct  5 06:05 jdk1.5.0_14
drwxr-xr-x  9 root root 4096 Jun 14  2007 jdk1.6.0_02
drwxr-xr-x  7 root root 4096 Nov 10  2005 jre1.5.0_06
drwxr-xr-x 20 root root 4096 Feb 20 11:20 netbeans-6.0.1
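For example, to pick one of the flavors listed above as your default (hedged; jdk1.6.0_02 is just one of the directories shown and the parent path follows the pattern of the example above, so adjust as needed):

# put the chosen JDK first on PATH, then verify
export PATH=/home/apps/jdk1.6.0_02/bin:$PATH
java -version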
Note: the 2.7 branch is end of life as of 01/01/2020.
Note: since the latest update to CentOS 6.10 the native python in the OS is /usr/bin/python v2.6.6
export PATH=/share/apps/CENTOS6/python/2.7.15/bin:$PATH export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.15/lib:$LD_LIBRARY_PATH
blcr_soft can be checkpointed — Henk 2017/02/06 10:47
Added to /share/apps/CENTOS6/blcr/python/2.7.10 …
export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH
...or...
export PYTHONHOME=/share/apps/CENTOS6/blcr_soft/python/2.7.10
export PYTHONPATH=/home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages
export PATH=$PYTHONHOME/bin:$PATH

REBOUND: https://github.com/hannorein/rebound ; http://rebound.readthedocs.io/en/latest/
REBOUNDx: https://github.com/dtamayo/reboundx

Installing collected packages: pandas
Successfully installed pandas-0.19.2
Added to /share/apps/CENTOS6/python/2.7.10 — Henk 2019/03/15 11:14
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg Adding scipy 0.16.1 to easy-install.pth file Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/argparse-1.4.0-py2.7.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/readline-6.2.4.1-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/emcee-3.0rc2-py2.7.egg Adding mpi4py 2.0.0 to easy-install.pth file Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.0-py2.7-linux-x86_64.egg Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scandir-1.10.0-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/subprocess32-3.5.3-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas-0.24.2-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/seaborn-0.9.0-py2.7.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.16.2-py2.7-linux-x86_64.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/os0-0.2.14-py2.7.egg Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pickle_utils-0.1-py2.7.egg
Added to /share/apps/CENTOS6/blcr_soft/python/2.7.10 — Henk 2019/03/16 08:40
Successfully installed argparse-1.4.0 Successfully installed readline-6.2.4.1 Successfully installed numpy-1.16.2 Successfully installed scipy-1.2.1 emcee in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/emcee-2.1.0-py2.7.egg (2.1.0) mpi4py in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (2.0.0) pandas in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (0.19.2) pytz>=2011k in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from pandas) (2016.3) numpy>=1.7.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from pandas) (1.16.2) python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from pandas) (2.5.2) six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->pandas) (1.10.0) Installed /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg matplotlib in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.1-py2.7-linux-x86_64.egg (1.5.1) numpy>=1.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from matplotlib) (1.16.2) python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from matplotlib) (2.5.2) pytz in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from matplotlib) (2016.3) cycler in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/cycler-0.10.0-py2.7.egg (from matplotlib) (0.10.0) pyparsing!=2.0.0,!=2.0.4,>=1.5.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pyparsing-2.1.1-py2.7.egg (from matplotlib) (2.1.1) six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->matplotlib) (1.10.0) Installing collected packages: seaborn Successfully installed subprocess32-3.5.3 astropy in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/astropy-1.1.2-py2.7-linux-x86_64.egg (1.1.2) numpy>=1.6.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from astropy) (1.16.2) Successfully installed pathlib2-2.3.3 scandir-1.10.0 Successfully installed os0-0.2.14 Successfully installed pickle-utils-0.1
Note: v 2.7.9 also contains Numpy 1.9.2, Cython 0.22 and MACS2 2.1.0
Note: Should not be used on queue hp12
Note: Successfully installed insane-1.0rc1 simopt-0.1
export PYTHONHOME=/home/apps/CentOS6/python/2.7.9
export PYTHONPATH=/home/apps/CentOS6/python/2.7.9/site-packages:$PYTHONPATH
export PATH=$PYTHONHOME/bin:$PATH
v 2.7.2 also contains pyodbc which relies on unixODBC and a mysql connector, consult section unixODBC.
v. 2.7.2 also contains
export PYTHONHOME=/home/apps/python/2.7.2
export PYTHONPATH=/home/apps/python/2.7.2/site-packages
export PATH=$PYTHONHOME/bin:$PATH
Note: there is another version of Python (2.5) hiding inside MGLtools at /home/apps/mgltools/1.5.2/bin/python. Read the section on MGLtools for more information.
For all these installations you will need to set $PATH and $LD_LIBRARY_PATH, look for samples on this page …
wget ftp://ftp.nluug.nl/mirror/languages/gcc/releases/gcc-9.2.0/gcc-9.2.0.tar.gz
tar zxvf gcc-9.2.0.tar.gz
cd gcc-9.2.0
./contrib/download_prerequisites
./configure --prefix=/share/apps/CENTOS6/gcc/9.2.0 --disable-libgomp --disable-multilib
make
make install
Note: GMP, MPFR and MPC are built into GCC. There are also standalone packages.
[hmeij@petaltail gcc]$ export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
[hmeij@petaltail gcc]$ export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
[hmeij@petaltail gcc]$ which gcc
/share/apps/CENTOS6/gcc/4.8.4/bin/gcc
[hmeij@petaltail gcc]$ gcc test.c
[hmeij@petaltail gcc]$ ./a.out
Hello world
For all these installations you will need to set $PATH and $LD_LIBRARY_PATH, look for samples on this page …
Note: Used the gcc-build approach so you will not need references to GMP, MPFR and MPC using GCC.
Note: Used the gcc-build approach so you will not need references to GMP, MPFR and MPC using GCC.
UPDATE 2021: the latest versions of icc/icx and ifort/ifx are inside OneAPI; search this page.
UPDATE 2016: Parallel Studio Cluster Edition installed.
UPDATE, 3 years later: we bought a new version; things change.
/share/apps/bin
Note:
export PATH=/share/apps/stata/17:$PATH
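A hedged batch example once Stata 17 is on your PATH (the do-file name is a placeholder):

# run Stata non-interactively; a log file named after the do-file is written in the cwd
stata -b do analysis.do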