\\
**[[cluster:0|Back]]**
====== Software ======
[[cluster:215|OpenHPC Software]] for **Rocky 8**
Below is all CentOS 6 and/or 7 software, which may or may not run under Rocky 8.
IMPORTANT NOTE: **Since moving to the TrueNAS/ZFS appliance all references to** ''/home/apps'' **should be replaced with** ''/share/apps'' **which points to** ''/zfshomes/apps''
--- //[[hmeij@wesleyan.edu|Henk]] 2020/07/31 15:08//
The listings below will be updated as software is installed; the location is /share/apps/ or /home/apps/ (the same filesystem).
Please note that what you find in **/usr/local/bin** is local to the head node, while **/home/apps/bin** is available cluster-wide (and is in your default PATH).
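All of the sections below switch between tool stacks by prepending directories to ''PATH''. A minimal self-contained demo of why order matters (using a temporary directory and a made-up ''mytool'' in place of the cluster paths):

```shell
# stand-in for e.g. /share/apps/bin; any directory works
mkdir -p /tmp/pathdemo/bin
printf '#!/bin/sh\necho cluster-wide\n' > /tmp/pathdemo/bin/mytool
chmod +x /tmp/pathdemo/bin/mytool

# prepend so this copy wins over anything later in PATH
export PATH=/tmp/pathdemo/bin:$PATH
command -v mytool   # -> /tmp/pathdemo/bin/mytool
mytool              # -> cluster-wide
```

The same logic applies to ''LD_LIBRARY_PATH'': the first matching library found wins.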
Most of the compilations have been performed by Intel's icc/ifort compilers.
Old compilations, mostly located in /share/apps, are RHEL 5.5 \\
Newer compilations are done with CentOS 6.10 (since summer 2013) \\
on either cottontail2 or node n33 located at /share/apps/CENTOS6 \\
Some compilations require 7.x and are in /share/apps/CENTOS7 \\
====== Julia ======
* https://julialang.org/
* /share/apps/julia/julia-1/10.4
* link in /share/apps/bin
* runs on centos7 and rocky8
====== Miniconda3-py311 ======
  * Starting a new branch; the other Miniconda3 section listed on this page is quite loaded
* Python 3.11
source /share/apps/CENTOS7/miniconda3-py311/etc/profile.d/conda.sh
or
export PATH=/share/apps/CENTOS7/miniconda3-py311/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3-py311/lib:$LD_LIBRARY_PATH
which mpirun python conda
conda list
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
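For batch use, the exports above can go straight into a submit script. A hedged sketch for a Slurm-style scheduler (the directives, resource requests, and ''my_script.py'' are assumptions; adjust to your site and scheduler):

```shell
#!/bin/bash
#SBATCH --job-name=conda-test     # assumption: Slurm syntax
#SBATCH --partition=exx96         # queue from the notes above
#SBATCH --gres=gpu:1              # one GPU, if the node has them

# miniconda3-py311 environment, as documented above
export PATH=/share/apps/CENTOS7/miniconda3-py311/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3-py311/lib:$LD_LIBRARY_PATH

# cuda environment
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

which python
python my_script.py   # my_script.py is a placeholder
```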
  * conda install (same packages as listed on the openhpc [[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:215#miniconda3-py311|miniconda3-py311]] page)
* cudatoolkit=10.2 cudnn=7 for exx96 queue
  * same error ("module cudatoolkit not found"), but CuPy is installed and does load
* https://docs.cupy.dev/en/stable/overview.html
* thayerlab
# pollacklab
[hmeij@greentail52 ~]$ source /share/apps/CENTOS7/miniconda3-py311/etc/profile.d/conda.sh
[hmeij@greentail52 ~]$ conda activate sage
(sage) [hmeij@greentail52 ~]$ sage --version
SageMath version 10.0, Release Date: 2023-05-20
(sage) [hmeij@greentail52 ~]$ sage
┌────────────────────────────────────────────────────────────────────┐
│ SageMath version 10.0, Release Date: 2023-05-20 │
│ Using Python 3.11.4. Type "help()" for help. │
└────────────────────────────────────────────────────────────────────┘
sage: exit
(sage) [hmeij@greentail52 ~]$ conda deactivate
[hmeij@greentail52 ~]$
* **mdtraj, matplotlib, pandas**
[hmeij@greentail52 ~]$ which python
/share/apps/CENTOS7/miniconda3-py311/bin/python
[hmeij@greentail52 ~]$ python
Python 3.11.4 (main, Jul 5 2023, 13:45:01) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mdtraj as mt
>>>
* ** seaborn**
* https://seaborn.pydata.org/index.html
  * also in miniconda3/py311 on the rocky8 module (used pip3)
patsy conda-forge/noarch::patsy-0.5.4-pyhd8ed1ab_0
seaborn conda-forge/noarch::seaborn-0.13.0-hd8ed1ab_0
seaborn-base conda-forge/noarch::seaborn-base-0.13.0-pyhd8ed1ab_0
statsmodels conda-forge/linux-64::statsmodels-0.14.0-py311h1f0f07a_3
* joblib (thayerlab)
joblib conda-forge/noarch::joblib-1.4.2-pyhd8ed1ab_0
====== Masurca ======
* module: masurca/4.1.0
* https://github.com/alekseyzimin/masurca/releases
* supports openmp (maybe)
* queues mwgpu, exx96 (centos 7)
export PATH=/share/apps/CENTOS7/masurca/4.1.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/masurca/4.1.0/lib:$LD_LIBRARY_PATH
[hmeij@cottontail2 ~]$ masurca --version
version 4.1.0
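MaSuRCA is driven by a configuration file rather than direct command-line options; a hedged sketch of the usual two-step run (''config.txt'' is a placeholder you create from the template shipped with the release):

```shell
export PATH=/share/apps/CENTOS7/masurca/4.1.0/bin:$PATH
# masurca reads the configuration file and generates an assemble.sh script
masurca config.txt
./assemble.sh
```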
====== Trimmomatic ======
* http://www.usadellab.org/cms/?page=trimmomatic
* binary download and stage
* java -jar trimmomatic-0.39.jar ...
/share/apps/CENTOS6/Trimmomatic-0.39/
[root@cottontail Trimmomatic-0.39]# ll
total 176
drwxr-xr-x 2 root root 8 Apr 8 2019 adapters
-rw-r--r-- 1 root root 35147 May 16 2018 LICENSE
-rw-r--r-- 1 root root 128502 Apr 8 2019 trimmomatic-0.39.jar
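A typical paired-end invocation, following the Trimmomatic manual (read file names are placeholders; adjust the adapter file and thread count to your data):

```shell
java -jar /share/apps/CENTOS6/Trimmomatic-0.39/trimmomatic-0.39.jar PE -threads 4 \
  reads_R1.fastq.gz reads_R2.fastq.gz \
  out_R1_paired.fastq.gz out_R1_unpaired.fastq.gz \
  out_R2_paired.fastq.gz out_R2_unpaired.fastq.gz \
  ILLUMINACLIP:/share/apps/CENTOS6/Trimmomatic-0.39/adapters/TruSeq3-PE.fa:2:30:10 \
  LEADING:3 TRAILING:3 SLIDINGWINDOW:4:15 MINLEN:36
```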
====== HPC SDK ======
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:212
* Nvidia SDK v 22.2
====== Magenta ======
An open source research project exploring the role of machine learning as a tool in the creative process. Magenta is distributed as an open source Python library, powered by TensorFlow. This library includes utilities for manipulating source data (primarily music and images), using this data to train machine learning models, and finally generating new content from these models.
* https://magenta.tensorflow.org/
* installed in miniconda3's python3
* queues: mwgpu, amber128, exx96
* (MUSC/hgonzalez)
====== EasyBuild ======
EasyBuild is a software build and installation framework that allows you to manage (scientific) software on High Performance Computing (HPC) systems in an efficient way.
* supports 2469 different software packages (incl. toolchains, bundles)
* local page with lots of details **[[cluster:209|EasyBuild]]**
Easybuild uses ''environment-modules'' to control the environment. Load the module of interest and all dependencies will be taken care of. See below.
**Installed modules** and their dependencies
* astropy/4.2.1-fosscuda-2020b
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:209#astropy
* contains package mpi4py, multithreading
* import erfa (needs to be loaded too), added later (pycuda, h5py for helios)
* to find ''libcuda.so.1'' set export path below
* emcee/2.2.1-foss-2019a
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:209#emcee
* contains package mpi4py, multithreading
* PyCUDA/2020.1-fosscuda-2020b
To use a module, first discover the available modules, then load the desired one. Run on ''greentail52'' (debugging) or in submit scripts on queue exx96 (mwgpu and amber128 nodes do not work).
# cuda libs location
export LD_LIBRARY_PATH=\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/lib64:\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/lib:\
/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/targets/x86_64-linux/lib/stubs:\
$LD_LIBRARY_PATH
# discover all modules
module use /sanscratch/CENTOS7/easybuild/4.4.2/modules/all
# view availability
module avail
# load
module load module_name
# list modules loaded
module list
# check, start app, import package (by full module name)
# note that these modules use different python versions
which python
python
>>> help('modules')
# unload all modules (better to exit the shell and start a new one;
# purge will also unload system modules...)
module purge
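Putting the steps above together, a session using one of the installed modules might look like this (module name taken from the list above; the import is just a sanity check and assumes the module provides its own python):

```shell
# cuda stubs so libcuda.so.1 resolves, as noted above
export LD_LIBRARY_PATH=/sanscratch/CENTOS7/easybuild/4.4.2/software/CUDAcore/11.1.1/lib64:$LD_LIBRARY_PATH
module use /sanscratch/CENTOS7/easybuild/4.4.2/modules/all
module load astropy/4.2.1-fosscuda-2020b
which python
python -c "import astropy; print(astropy.__version__)"
```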
====== Hoomd ======
HOOMD-blue is a Python package that runs simulations of particle systems on CPUs and GPUs.
* https://hoomd-blue.readthedocs.io/en/latest/index.html
  * requires both miniconda3 (centos7) **and** cuda-10.2 (so queue exx96 only)
* python3.x included
* starrlab
Set up env for miniconda and cuda
export CUDAHOME=/usr/local/cuda-10.2
export PATH=/usr/local/cuda-10.2/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda-10.2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda-10.2/lib64:$LD_LIBRARY_PATH
source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH
which mpirun python conda
which nvcc
# installation (not sure how this works, does the * not get expanded in shell?)
# seems to have worked forcing gpu package install on greentail52
conda install -c conda-forge hoomd=*=*gpu*
$ conda list | egrep -i "cuda|hoomd"
cudatoolkit 11.0.221 h6bb024c_0
hoomd 2.9.4 gpu_py38h02d30ca_1 conda-forge
# ahh, hamed found an error
conda install -c conda-forge cudatoolkit=10.2
The following packages will be DOWNGRADED:
cudatoolkit 11.0.221-h6bb024c_0 --> 10.2.89-hfd86e86_1
hoomd 2.9.4-gpu_py38h02d30ca_1 --> 2.9.4-gpu_py38h5bdc439_1
* installed/updated were...
ca-certificates-2021.10.8 | ha878542_0 139 KB conda-forge
certifi-2021.10.8 | py38h578d9bd_0 145 KB conda-forge
conda-4.10.3 | py38h578d9bd_2 3.0 MB conda-forge
cudatoolkit-11.0.221 | h6bb024c_0 622.9 MB
hoomd-2.9.4 |gpu_py38h02d30ca_1 46.2 MB conda-forge
openssl-1.1.1h | h516909a_0 2.1 MB conda-forge
tbb-2020.2 | hc9558a2_0 1.4 MB conda-forge
* and
conda install -c conda-forge gsd
The following NEW packages will be INSTALLED:
_openmp_mutex conda-forge/linux-64::_openmp_mutex-4.5-1_gnu
gsd conda-forge/linux-64::gsd-2.5.1-py38h6c62de6_0
libgomp conda-forge/linux-64::libgomp-11.2.0-h1d223b6_11
The following packages will be UPDATED:
certifi 2021.10.8-py38h578d9bd_0 --> 2021.10.8-py38h578d9bd_1
conda 4.10.3-py38h578d9bd_2 --> 4.10.3-py38h578d9bd_4
libgcc-ng pkgs/main::libgcc-ng-9.1.0-hdf63c60_0 --> conda-forge::libgcc-ng-11.2.0-h1d223b6_11
openssl 1.1.1h-h516909a_0 --> 1.1.1l-h7f98852_0
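A minimal sanity check that the GPU build is active, a sketch using the HOOMD-blue 2.x API (run on an exx96 node with the environment above; errors out if no GPU is visible):

```python
import hoomd

# request GPU execution explicitly rather than letting hoomd auto-select
hoomd.context.initialize("--mode=gpu")
print(hoomd.__version__)
```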
====== Miniconda2 ======
* miniconda2 (python 2.7) is hiding in section
* "Kallisto & Trinity & FastQC"
* some packages require centos 6 (like cufflinks)
  * many packages are also available in the miniconda3 section
  * miniconda3 requires centos 7 (python 3.9)
====== Sequencing Tools #4 ======
Added to miniconda3, see section "Miniconda3"
* Requires centos7 so queues mwgpu, amber128, exx96
* debug server greentail52
* PREFIX=/share/apps/CENTOS7/miniconda3
* python 3.9
* jcoolon lab
# environment
source /share/apps/CENTOS7/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3/lib:$LD_LIBRARY_PATH
libgcc-7.2.0 | h69d50b8_2 304 KB conda-forge
samtools-1.7 | 1 1.0 MB bioconda
bowtie2-2.2.5 | py38hed8969a_7 11.8 MB bioconda
perl-5.26.2 | h36c2ea0_1008 15.4 MB conda-forge
bedtools-2.30.0 | hc088bd4_0 14.0 MB bioconda
# cufflinks requires centos 6, weird, so also added to miniconda2,
# see section "Kallisto & Trinity & FastQC"
====== Sequencing Tools #3 ======
Suite of tools for high-accuracy basecalling, assembly, polishing, and alignment. Installed in a specific miniconda3 environment (see below).
* Requires centos7 so queues mwgpu, amber128, exx96
* debug server greentail52
* PREFIX=/share/apps/CENTOS7/miniconda3
* python 3.9
* gcc 9.2
  * cmake 3.12.1
* chernofflab
# environment for all modules
source /share/apps/CENTOS7/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/miniconda3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/gcc/9.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/cmake/3.12.1/bin:$PATH
which conda python pip mpirun gcc cmake
python --version
/share/apps/CENTOS7/miniconda3/bin/conda
/share/apps/CENTOS7/miniconda3/bin/python
/share/apps/CENTOS7/miniconda3/bin/pip
/share/apps/CENTOS7/miniconda3/bin/mpirun
/share/apps/CENTOS7/gcc/9.2.0/bin/gcc
/share/apps/CENTOS7/cmake/3.12.1/bin/cmake
** Guppy Server **
* https://github.com/nanoporetech/megalodon, find community page
* standalone build
* Local accelerated basecalling for Nanopore data
* v5.0.11
export PATH=/share/apps/CENTOS7/guppy/5.0.11/ont-guppy-cpu/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/guppy/5.0.11/ont-guppy-cpu/lib:$LD_LIBRARY_PATH
ldd `which guppy_basecall_server`
# cpu; note as user start server on scratch server greentail52
guppy_basecall_server --config dna_r9.4.1_450bps_fast.cfg -p 5555 -l /tmp/guppy &
# check if running, use ''kill'' command to terminate
lsof -i:5555
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
guppy_bas 80002 bchernoff 12u IPv4 959121280 0t0 TCP *:personal-agent (LISTEN)
# nodes can reach this server at this IP or start their own on localhost:port
192.168.102.251 greentail52-eth0
10.10.102.251 greentail52-eth1 (preferred)
** Guppy Client **
* https://github.com/nanoporetech/pyguppyclient
* miniconda3 python3.9
* Python client library for Guppy.
* Full Python client library for communicating with guppy_basecall_server.
* v0.1.0
pip install cython PyUnicode timer
pip install pyguppyclient
Successfully installed flatbuffers-1.11 ont-fast5-api-3.3.0 pyguppyclient-0.0.6 pyzmq-17.1.2
note: from pyguppyclient import GuppyBasecallerClient, yield_reads
success, but do not know how to execute the example
** Medaka **
* https://github.com/nanoporetech/medaka
* miniconda3
* Medaka is a tool to create consensus sequences and variant calls from nanopore sequencing data.
* v1.4.3
# To activate this environment, use
# $ conda activate medaka
# To deactivate an active environment, use
# $ conda deactivate
# requires
(medaka)$ conda list | egrep -i "samtools|minimap2|tabix|bgzip"
minimap2 2.21 h5bf99c6_0 bioconda
pbgzip 2016.08.04 h36cd882_2 bioconda
py-bgzip 0.4.0 py38ha8cb210_0 conda-forge
samtools 1.12 h9aed4be_1 bioconda
tabix 1.11 hdfd78af_0 bioconda
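Typical usage goes through the bundled ''medaka_consensus'' wrapper inside the activated environment (file names and thread count are placeholders):

```shell
conda activate medaka
medaka_consensus -i basecalled_reads.fastq -d draft_assembly.fasta \
  -o medaka_out -t 4
conda deactivate
```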
** Flye **
Somewhat annoying: no miniconda3 package (python 3.9, gcc 9.2). Built locally (without installation) on centos7 greentail52.
* setup env as for all packages in #3, see top of section
* v2.9.2 (June 2023)
* python bin/flye
* change the PATHS accordingly
/share/apps/CENTOS7/flye/2.9.2/bin/flye --version
2.9.2-b1794
* v2.9.1 (October 2022)
* python bin/flye
* change the PATHs accordingly
/share/apps/CENTOS7/flye/2.9.1/bin/flye --version
2.9.1-b1780
* v2.9 (August 2021)
* python bin/flye
export PATH=/share/apps/CENTOS7/flye/2.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/flye/2.9/lib:$LD_LIBRARY_PATH
which flye
/share/apps/CENTOS7/flye/2.9/bin/flye
flye --version
2.9-b1774
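A typical assembly run with the 2.9 install above (Nanopore raw reads assumed; file names and thread count are placeholders):

```shell
export PATH=/share/apps/CENTOS7/flye/2.9/bin:$PATH
flye --nano-raw reads.fastq.gz --out-dir flye_out --threads 8
```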
** Filtlong **
* https://github.com/rrwick/Filtlong
* binary
* Filtlong is a tool for filtering long reads by quality.
* v0.2.1
# note: fails with gcc 9.2 but the native gcc 4.8.5 is ok; UNSET the gcc92/miniconda3 env
export PATH=/share/apps/CENTOS7/Filtlong/0.2.1/bin:$PATH
ldd `which filtlong`
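Typical usage, following the Filtlong README (keep the best 90% of bases and drop reads under 1 kb; file names are placeholders):

```shell
export PATH=/share/apps/CENTOS7/Filtlong/0.2.1/bin:$PATH
filtlong --min_length 1000 --keep_percent 90 input.fastq.gz | gzip > output.fastq.gz
```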
** Bamtools**
* https://github.com/pezmaster31/bamtools
* standalone build
* C++ API & command-line toolkit for working with BAM data
* v2.5.2
export PATH=/share/apps/CENTOS7/bamtools/2.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/bamtools/2.5.2/lib:$LD_LIBRARY_PATH
ldd `which bamtools`
** BBmap **
* location: /share/apps/CENTOS7/bbmap/39.01
* java programs
* https://sourceforge.net/projects/bbmap/
** Pilon **
* location: /share/apps/CENTOS7/pilon/
* just one big jar file
* latest java is in $PATH
[hmeij@greentail52 pilon]$ java -jar pilon-1.24.jar --help
Pilon version 1.24 Thu Jan 28 13:00:45 2021 -0500
Usage: pilon --genome genome.fasta [--frags frags.bam] [--jumps jumps.bam] [--unpaired unpaired.bam]
[...other options...]
pilon --help for option details
** haslr **
* miniconda3 env
* https://github.com/vpc-ccg/haslr
* conda install -c bioconda haslr
The following NEW packages will be INSTALLED:
_openmp_mutex conda-forge/linux-64::_openmp_mutex-4.5-2_gnu
fastutils bioconda/linux-64::fastutils-0.3-hdcf5f25_4
haslr bioconda/linux-64::haslr-0.8a1-py39hd65a603_4
k8 bioconda/linux-64::k8-0.2.5-hdcf5f25_4
libgomp conda-forge/linux-64::libgomp-13.2.0-h807b86a_2
libzlib conda-forge/linux-64::libzlib-1.2.13-hd590300_5
minia bioconda/linux-64::minia-3.2.6-hdcf5f25_3
minimap2 bioconda/linux-64::minimap2-2.26-he4a0461_1
pluggy conda-forge/noarch::pluggy-1.3.0-pyhd8ed1ab_0
ruamel.yaml conda-forge/linux-64::ruamel.yaml-0.17.32-py39hd1e30aa_0
ruamel.yaml.clib conda-forge/linux-64::ruamel.yaml.clib-0.2.7-py39h72bdee0_1
toolz conda-forge/noarch::toolz-0.12.0-pyhd8ed1ab_0
Compiled from source; do **not** source miniconda3, we are using the standalone python3
export PATH=/share/apps/CENTOS7/gcc/9.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/9.2.0/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
[hmeij@greentail52 20231006]$ which python
/share/apps/CENTOS7/python/3.8.3/bin/python
[hmeij@greentail52 20231006]$ pwd
/share/apps/CENTOS7/haslr/20231006
[hmeij@greentail52 20231006]$ ll bin
total 11448
-rwxr-xr-x 1 hmeij its 123448 Oct 6 15:01 fastutils
-rwxr-xr-x 1 hmeij its 3823032 Oct 6 15:04 haslr_assemble
-rwxr-xr-x 1 hmeij its 19949 Oct 6 14:48 haslr.py
-rwxr-xr-x 1 hmeij its 20969840 Oct 6 14:59 minia
-rwxr-xr-x 1 hmeij its 23856 Oct 6 15:01 minia_nooverlap
-rwxr-xr-x 1 hmeij its 951922 Oct 6 14:59 minimap2
** stLFR **
* same centos 7 environment
* https://github.com/BGI-Qingdao/stLFR_GapCloser
cd /share/apps/CENTOS7/stLFR/20230929
[hmeij@greentail52 20230929]$ ./Release/stLFR_GapCloser -h
Version:
1.00
Contact:
dengli1@genomics.cn && guolidong@genomics.cn
Usage:
stLFR_GapCloser [options]
...
# and another request by Luis
# https://sourceforge.net/projects/soapdenovo2/files/SOAPdenovo2
[hmeij@greentail52 stLFR]$ pwd
/share/apps/CENTOS7/stLFR
[hmeij@greentail52 stLFR]$ ll GapCloser-bin-v1.12-r6/
total 175
-rwxr-xr-x 1 hmeij its 184713 Jan 11 2013 GapCloser
-rw-r--r-- 1 hmeij its 44533 Jan 11 2013 GapCloser_Manual.pdf
** BUSCO **
* https://busco.ezlab.org/busco_userguide.html#docker-image
* conda package fails on glib 2.17, weird
[hmeij@n79 ~]$ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
ezlabgva/busco v5.5.0_cv1 30d3e831aba7 2 months ago 2.88GB
[hmeij@n79 busco_wd]$ docker run --rm -v $(pwd):/busco_wd -u $(id -u):$(id -g) ezlabgva/busco:v5.5.0_cv1 busco -i /busco_wd/assembly.fasta -l actinopterygii_odb10 -o Busco_Output -m geno -f
2023-10-28 20:30:00 INFO: ***** Start a BUSCO v5.5.0 analysis, current time: 10/28/2023 20:30:00 *****
2023-10-28 20:30:00 INFO: Configuring BUSCO with local environment
2023-10-28 20:30:00 INFO: Mode is genome
2023-10-28 20:30:00 INFO: 'Force' option selected; overwriting previous results directory
2023-10-28 20:30:00 INFO: Downloading information on latest versions of BUSCO data...
that's a problem, trying to reach the internet ...
** picard **
* https://github.com/broadinstitute/picard
* Warning: using Java 20 but only Java 17 has been tested.
picard centos7, weird
> Failed to load native library 'libnative-platform.so' for Linux amd64.
picard centos8
[hmeij@cottontail2 20231108]$ pwd
/share/apps/CENTOS8/picard/20231108
[hmeij@cottontail2 20231108]$ ls -l build/libs/
total 119872
-rw-r--r-- 1 hmeij its 62520451 Nov 8 11:46 picard-3.1.0-3-ga9194bd97-SNAPSHOT-all.jar
-rw-r--r-- 1 hmeij its 1856980 Nov 8 11:01 picard-3.1.0-3-ga9194bd97-SNAPSHOT.jar
-rw-r--r-- 1 hmeij its 62520451 Nov 8 11:46 picard.jar
[hmeij@cottontail2 20231108]$
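Example invocation of the built jar (SortSam is one of the standard picard tools; file names are placeholders, and newer picard versions prefer ''-I''/''-O'' style arguments):

```shell
java -jar /share/apps/CENTOS8/picard/20231108/build/libs/picard.jar SortSam \
  I=input.bam O=sorted.bam SORT_ORDER=coordinate
```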
====== Galario CPU+GPU ======
* c/c++ compiled library with FFTW3
* should be gpu and cpu able (not sure how to work it yet)
* install notes at /share/apps/CENTOS7/galario/1.2.2-install
* sample program at /zfshomes/hmeij/amhughes/run-gpu
* requires centos7
* queues: mwgpu, amber128, exx96
* debug node greentail52
**GPU is nonfunctional**
Conversion of python code to C binaries was successful \\
read the history.txt file
--- //[[hmeij@wesleyan.edu|Henk]] 2021/07/07 15:38//
# sample scripts in /zfshomes/hmeij/amhughes
# read README, it's complicated
====== VSEARCH ======
VSEARCH stands for vectorized search, as the tool takes advantage of parallelism in the form of SIMD vectorization ... an alternative to USEARCH; offers search and clustering algorithms that are often orders of magnitude faster than BLAST.
* https://github.com/torognes/vsearch
* requires centos7
* centos7 queues; mwgpu, amber128, exx96
* debug host greentail52
* /share/apps/CENTOS7/vsearch/2.17.0/bin/vsearch
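Example: cluster sequences at 97% identity, a common OTU-picking step (file names are placeholders; see the vsearch manual for the full option set):

```shell
/share/apps/CENTOS7/vsearch/2.17.0/bin/vsearch \
  --cluster_fast sequences.fasta --id 0.97 \
  --centroids centroids.fasta --uc clusters.uc
```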
====== DeepChem ======
DeepChem is a deep learning toolkit built for bio/chem research.
* First installed latest Miniconda
* /share/apps/CENTOS7/amber/miniconda3, see below
* put the code block below in ~/.bashrc and log back in
* https://deepchem.io/
* installed stable/conda/gpu=no
* pip installed tensorflow-2.5
* Note: miniconda contains (Your python: python=3.8)
* "conda install -c ambermd pytraj" fails to install
* CentOS7 so queues mwgpu, amber128, exx96
* or for debugging use greentail52
# add to ~/.bashrc and log back in, then check: 'which python pip conda'
# >>> conda initialize >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$('/share/apps/CENTOS7/amber/miniconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)"
if [ $? -eq 0 ]; then
eval "$__conda_setup"
else
if [ -f "/share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh" ]; then
. "/share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh"
else
export PATH="/share/apps/CENTOS7/amber/miniconda3/bin:$PATH"
fi
fi
unset __conda_setup
# <<< conda initialize <<<
Successfully installed absl-py-0.12.0 astunparse-1.6.3
cachetools-4.2.2 flatbuffers-1.12 gast-0.4.0 google-auth-1.30.2
google-auth-oauthlib-0.4.4 google-pasta-0.2.0 grpcio-1.34.1
h5py-3.1.0 keras-nightly-2.5.0.dev2021032900 keras-preprocessing-1.1.2
markdown-3.3.4 numpy-1.19.5 oauthlib-3.1.1 opt-einsum-3.3.0
protobuf-3.17.3 pyasn1-0.4.8 pyasn1-modules-0.2.8 requests-oauthlib-1.3.0
rsa-4.7.2 tensorboard-2.5.0 tensorboard-data-server-0.6.1
tensorboard-plugin-wit-1.8.0 tensorflow-2.5.0 tensorflow-estimator-2.5.0
termcolor-1.1.0 typing-extensions-3.7.4.3 werkzeug-2.0.1 wrapt-1.12.1
====== Miniconda3 ======
* add to ~/.bashrc and log back in, then check: 'which python pip conda'
  * the conda initialization code block is listed under the DeepChem section above
* likely will only run on centos7, so
* queues: mwgpu, amber128, exx96
* for debug host use greentail52
** Due to technical limitations, the conda package does not support GPUs ** according to the web site. You must build the libraries manually for GPU support; see the "Galario CPU+GPU" section above.
source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH
which mpirun python conda
# https://github.com/mtazzari/galario (CPU only apparently)
conda install -c conda-forge galario
fftw conda-forge/linux-64::fftw-3.3.8-nompi_hfc0cae8_1114
galario conda-forge/linux-64::galario-1.2.2-py38h9d68f60_1002
libblas conda-forge/linux-64::libblas-3.9.0-8_openblas
libcblas conda-forge/linux-64::libcblas-3.9.0-8_openblas
libgfortran-ng conda-forge/linux-64::libgfortran-ng-7.5.0-h14aa051_19
libgfortran4 conda-forge/linux-64::libgfortran4-7.5.0-h14aa051_19
liblapack conda-forge/linux-64::liblapack-3.9.0-8_openblas
libopenblas conda-forge/linux-64::libopenblas-0.3.12-pthreads_hb3c22a3_1
numpy conda-forge/linux-64::numpy-1.19.4-py38hf0fd68c_1
python_abi conda-forge/linux-64::python_abi-3.8-1_cp38
scipy conda-forge/linux-64::scipy-1.5.3-py38h828c644_0
mpi conda-forge/linux-64::mpi-1.0-openmpi
mpi4py conda-forge/linux-64::mpi4py-3.0.3-py38h2e4b629_2
openmpi pkgs/main/linux-64::openmpi-4.0.2-hb1b8bf9_1
pandas conda-forge/linux-64::pandas-1.1.4-py38h0ef3d22_0
python-dateutil conda-forge/noarch::python-dateutil-2.8.1-py_0
pytz conda-forge/noarch::pytz-2021.1-pyhd8ed1ab_0
schwimmbad conda-forge/linux-64::schwimmbad-0.3.2-py38h578d9bd_0
For astronomy group (amhughes and students) added ('conda list' to see all packages installed)
argparse 1.4.0 pypi_0 pypi
corner 2.2.1 pypi_0 pypi
emcee 3.0.2 pypi_0 pypi
galario 1.2.2 py38h9d68f60_1002 conda-forge
matplotlib 3.4.2 pypi_0 pypi
numpy 1.19.5 pypi_0 pypi
time 1.8 h516909a_0 conda-forge
astropy conda-forge/linux-64::astropy-3.2.3-py38h516909a_0
attrs conda-forge/noarch::attrs-21.2.0-pyhd8ed1ab_0
click conda-forge/linux-64::click-8.0.3-py38h578d9bd_0
coverage pkgs/main/linux-64::coverage-5.5-py38h27cfd23_2
hypothesis conda-forge/noarch::hypothesis-6.23.4-pyhd8ed1ab_0
iniconfig conda-forge/noarch::iniconfig-1.1.1-pyh9f0ad1d_0
more-itertools conda-forge/noarch::more-itertools-8.10.0-pyhd8ed1ab_0
packaging conda-forge/noarch::packaging-21.0-pyhd8ed1ab_0
pluggy conda-forge/linux-64::pluggy-1.0.0-py38h578d9bd_1
psutil pkgs/main/linux-64::psutil-5.8.0-py38h27cfd23_1
py conda-forge/noarch::py-1.10.0-pyhd3deb0d_0
pyparsing conda-forge/noarch::pyparsing-2.4.7-pyh9f0ad1d_0
pytest conda-forge/linux-64::pytest-6.2.5-py38h578d9bd_0
pytest-arraydiff conda-forge/noarch::pytest-arraydiff-0.3-py_0
pytest-astropy conda-forge/noarch::pytest-astropy-0.9.0-pyhd8ed1ab_0
pytest-astropy-he~ conda-forge/noarch::pytest-astropy-header-0.1.2-py_0
pytest-cov conda-forge/noarch::pytest-cov-3.0.0-pyhd8ed1ab_0
pytest-doctestplus conda-forge/noarch::pytest-doctestplus-0.11.0-pyhd8ed1ab_0
pytest-filter-sub~ conda-forge/noarch::pytest-filter-subpackage-0.1.1-py_0
pytest-mock conda-forge/noarch::pytest-mock-3.6.1-pyhd8ed1ab_0
pytest-openfiles conda-forge/noarch::pytest-openfiles-0.5.0-py_0
pytest-remotedata conda-forge/noarch::pytest-remotedata-0.3.2-pyh9f0ad1d_0
sortedcontainers conda-forge/noarch::sortedcontainers-2.4.0-pyhd8ed1ab_0
toml conda-forge/noarch::toml-0.10.2-pyhd8ed1ab_0
For nwells/smithlab added
conda install -c schrodinger pymol-bundle
package | build
---------------------------|-----------------
apbs-1.5 | h14c3975_3 277 KB schrodinger
biopython-1.78 | py38h7b6447c_0 2.1 MB
bzip2-1.0.8 | h516909a_3 398 KB conda-forge
collada2gltf-2.1.4 | h6bb024c_0 3.2 MB schrodinger
conda-4.10.3 | py38h578d9bd_0 3.1 MB conda-forge
curl-7.71.1 | hbc83047_1 140 KB
dbus-1.13.18 | hb2f20db_0 504 KB
expat-2.4.1 | h2531618_2 168 KB
fontconfig-2.13.1 | he4413a7_1000 327 KB conda-forge
freemol-1.158 | py_2 6 KB schrodinger
freetype-2.10.4 | h7ca028e_0 912 KB conda-forge
glew-2.0.0 | 0 660 KB schrodinger
glib-2.69.0 | h5202010_0 1.7 MB
gst-plugins-base-1.14.0 | hbbd80ab_1 4.8 MB
gstreamer-1.14.0 | h28cd5cc_2 3.2 MB
h5py-2.10.0 |nompi_py38hafa665b_105 1.1 MB conda-forge
hdf4-4.2.13 | h3ca952b_2 714 KB
hdf5-1.10.6 |nompi_h7c3c948_1111 3.1 MB conda-forge
icu-58.2 | hf484d3e_1000 22.6 MB conda-forge
jpeg-9d | h36c2ea0_0 264 KB conda-forge
krb5-1.18.2 | h173b8e3_0 1.3 MB
libcurl-7.71.1 | h20c2e04_1 305 KB
libglu-9.0.0 | he1b5a44_1001 413 KB conda-forge
libholoplaycore-0.1.0_rc4 | 1 325 KB schrodinger
libnetcdf-4.7.4 |nompi_h56d31a8_107 1.3 MB conda-forge
libpng-1.6.37 | h21135ba_2 306 KB conda-forge
libssh2-1.9.0 | hab1572f_5 225 KB conda-forge
libtiff-4.0.10 | hc3755c2_1005 602 KB conda-forge
libuuid-2.32.1 | h14c3975_1000 26 KB conda-forge
libxcb-1.13 | h14c3975_1002 396 KB conda-forge
libxml2-2.9.10 | hb55368b_3 1.2 MB
lz4-c-1.9.2 | he1b5a44_3 203 KB conda-forge
mengine-1 | h14c3975_1 676 KB schrodinger
mpeg_encode-1 | h14c3975_1 106 KB schrodinger
mtz2ccp4_px-1.0 | h9ac9557_3 547 KB schrodinger
olefile-0.46 | pyh9f0ad1d_1 32 KB conda-forge
pcre-8.45 | h295c915_0 207 KB
pdb2pqr-2.1.2+pymol | py_0 236 KB schrodinger
pillow-6.2.1 | py38h6b7be26_0 637 KB conda-forge
pmw-2.0.1+3 | py_3 60 KB schrodinger
pthread-stubs-0.4 | h36c2ea0_1001 5 KB conda-forge
pycollada-0.7.1+bdf414c7 | py_1 80 KB schrodinger
pykerberos-1.2.1 | py38h27cfd23_2 259 KB
pymol-2.4.1 | py38h4463551_0 8.5 MB schrodinger
pymol-bundle-2.4.1 | 0 16 KB schrodinger
pymol-web-examples-2.4 | 1 1.9 MB schrodinger
pyqt-5.9.2 | py38h05f1152_4 4.5 MB
qt-5.9.7 | h5867ecd_1 68.5 MB
rigimol-1.3 | 2 489 KB schrodinger
sip-4.19.13 | py38he6710b0_0 277 KB
xorg-libxau-1.0.9 | h14c3975_0 13 KB conda-forge
xorg-libxdmcp-1.1.3 | h516909a_0 18 KB conda-forge
zstd-1.4.5 | h9ceee32_0 619 KB
Then
--- //[[hmeij@wesleyan.edu|Henk]] 2021/11/24 13:16//
# using pip3
Successfully installed magenta-2.1.3 python-rtmidi-1.1.2
Successfully installed keras-2.7.0 libclang-12.0.0 tensorboard-2.7.0 \
tensorflow-2.7.0 tensorflow-estimator-2.7.0 tensorflow-io-gcs-filesystem-0.22.0
Added\\
--- //[[hmeij@wesleyan.edu|Henk]] 2022/01/03 14:35//\\
# https://github.com/merenlab/anvio/issues/1479
conda install -c bioconda samtools=1.9 --force-reinstall
The following NEW packages will be INSTALLED:
gettext conda-forge/linux-64::gettext-0.19.8.1-hf34092f_1004
htslib bioconda/linux-64::htslib-1.9-h4da6232_3
libdeflate bioconda/linux-64::libdeflate-1.2-h516909a_1
libglib conda-forge/linux-64::libglib-2.66.3-hbe7bbb4_0
libiconv conda-forge/linux-64::libiconv-1.16-h516909a_0
The following packages will be UPDATED:
cffi pkgs/main::cffi-1.14.3-py38h261ae71_2 --> conda-forge::cffi-1.14.4-py38ha312104_0
libedit pkgs/main::libedit-3.1.20191231-h14c3~ --> conda-forge::libedit-3.1.20191231-h46ee950_2
samtools 1.7-1 --> 1.9-h10a08f8_12
The following packages will be SUPERSEDED by a higher-priority channel:
glib pkgs/main::glib-2.69.0-h5202010_0 --> conda-forge::glib-2.66.3-h58526e2_0
libffi pkgs/main::libffi-3.3-he6710b0_2 --> conda-forge::libffi-3.2.1-he1b5a44_1007
ncurses pkgs/main::ncurses-6.2-he6710b0_1 --> conda-forge::ncurses-6.1-hf484d3e_1002
python pkgs/main::python-3.8.5-h7579374_1 --> conda-forge::python-3.8.3-cpython_he5300dc_0
sqlite pkgs/main::sqlite-3.33.0-h62c20be_0 --> conda-forge::sqlite-3.32.3-hcee41ef_1
$ samtools --version
samtools 1.9
Using htslib 1.9
Copyright (C) 2018 Genome Research Ltd.
* **iqtree** fcohanlab/jwang06
[hmeij@greentail52 ~]$ iqtree --version
IQ-TREE multicore version 2.0.3 for Linux 64-bit built Dec 20 2020
* https://userguide.mdanalysis.org/stable/installation.html
* calterlab, annika (used pip to install)
Successfully installed
GridDataFormats-1.0.1 MDAnalysis-2.3.0
fasteners-0.18 mmtf-python-1.1.3
mrcfile-1.4.3 msgpack-1.0.4 numpy-1.23.4
For **Lammps** (starrlab) 23April2023, consult install file
* /share/apps/CENTOS7/lammps/25Apr2023.install
* make yes-ml-pace
* make yes-gpu
* https://pacemaker.readthedocs.io/en/latest/pacemaker/install/
# Note: you must add this line to your env
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/pkgs/cudatoolkit-11.0.221-h6bb024c_0/lib:$LD_LIBRARY_PATH
# Note2: had to run 'conda install cudnn' and make two links
# to higher-level libraries (normally a must-not-do)
# for libcudnn.so.8 and libcusolver.so.11
[hmeij@greentail52 ~]$ which python
/share/apps/CENTOS7/amber/miniconda3/bin/python
[hmeij@greentail52 ~]$ pip list | grep tensorflow
mesh-tensorflow 0.1.19
tensorflow 2.8.0
tensorflow-addons 0.15.0
tensorflow-datasets 4.4.0
tensorflow-estimator 2.7.0
tensorflow-gan 2.1.0
tensorflow-hub 0.12.0
tensorflow-io-gcs-filesystem 0.32.0
tensorflow-metadata 1.4.0
tensorflow-probability 0.15.0
[hmeij@greentail52 ~]$ which pacemaker
/usr/bin/which: no pacemaker in (...failed to install at first, see readme install file)
====== pytraj ======
Pytraj is a wrapper for cpptraj
* https://amber-md.github.io/pytraj/latest/installation.html#installation
* From AMBER distribution (Linux, OSX)
* pytraj is included in AMBER (version >= 16): ambermd.org
* so below I opted for source code installation with python 3.8.3
  * git clone https://github.com/Amber-MD/pytraj; cd pytraj
* python ./setup.py install
* requires CentOS7 so queues mwgpu, amber128 and exx96
* or for debugging use greentail52
* /share/apps/CENTOS7/amber/pytraj
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
# An environment resource file for CPPTRAJ has been created:
/zfshomes/apps/CENTOS7/amber/pytraj/cpptraj/cpptraj.sh
# You may 'source' this file to set up your environment for CPPTRAJ.
Installed /zfshomes/apps/CENTOS7/python/3.8.3/lib/python3.8/site-packages/pytraj-2.0.6.dev0-py3.8-linux-x86_64.egg
Processing dependencies for pytraj==2.0.6.dev0
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /share/apps/CENTOS7/python/3.8.3/bin
Installing f2py3 script to /share/apps/CENTOS7/python/3.8.3/bin
Installing f2py3.8 script to /share/apps/CENTOS7/python/3.8.3/bin
Using /zfshomes/apps/CENTOS7/python/3.8.3/lib/python3.8/site-packages
Finished processing dependencies for pytraj==2.0.6.dev0
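A minimal pytraj session sketch (the topology and trajectory file names are placeholders; requires the python 3.8.3 environment above):

```python
import pytraj as pt

# load a trajectory together with its Amber topology,
# then compute RMSD against the first frame
traj = pt.load("traj.nc", top="prmtop")
print(traj)
print(pt.rmsd(traj, ref=0))
```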
====== Ovito ======
* https://www.ovito.org
* /usr/local/bin/ovito -> /usr/local/ovito-basic-3.5.0-x86_64/bin/ovito
* on nodes ''cottontail'' and ''greentail52''
* uses freeglut/freeglut-devel libraries for opengl
====== OpenHPC ======
So I remember where I put the tar repo file
* OpenHPC v2.4
* Rocky 8.5
* Slurm 20.11.8
* Warewulf 3.9
* x86_64
[root@cottontail2 ~]# ll /share/apps/src/cottontail2/openhpc/
total 8290421
-rw------- 1 root root 562276 Mar 17 09:34 Install_guide-Rocky8-Warewulf-SLURM-2.4-x86_64.pdf
-rw-r--r-- 1 root root 10720 Mar 17 09:35 ohpc-release-2-1.el8.x86_64.rpm
-rw-r--r-- 1 root root 8531681280 Mar 17 09:33 OpenHPC-2.4.EL_8.x86_64.tar
# this is all that is needed for ohpc base compute client
# see the recipe for cct2 and follow compute node only ingredients
[root@cottontail2 ~]# ll /share/apps/src/cottontail2/ohpc-compute/
total 95196
-rw-r--r-- 1 root root 2696360 Feb 10 09:21 hwloc-ohpc-2.7.0-3.9.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 266252 Feb 10 09:22 lmod-ohpc-8.7.6-12.3.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 7268 Feb 10 09:22 ohpc-base-compute-2.6.1-5.1.ohpc.2.6.1.x86_64.rpm
-rw-r--r-- 1 root root 8204 Feb 10 09:23 ohpc-filesystem-2.6-2.3.ohpc.2.6.noarch.rpm
-rw-r--r-- 1 root root 10720 Feb 10 09:23 ohpc-release-2-1.el8.x86_64.rpm
-rw-r--r-- 1 root root 7124 Feb 10 09:27 ohpc-slurm-client-2.6.1-5.1.ohpc.2.6.1.x86_64.rpm
-rw-r--r-- 1 root root 74326532 Feb 10 09:23 singularity-ohpc-3.7.1-5.1.ohpc.2.1.x86_64.rpm
-rw-r--r-- 1 root root 22168 Feb 10 09:24 slurm-contribs-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 247392 Feb 10 09:24 slurm-example-configs-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 18764472 Feb 10 09:24 slurm-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 176480 Feb 10 09:28 slurm-pam_slurm-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
-rw-r--r-- 1 root root 785112 Feb 10 09:25 slurm-slurmd-ohpc-22.05.2-14.1.ohpc.2.6.x86_64.rpm
====== OneAPI ======
* Requires CentOS 7 so queues mwgpu, amber128, exx96
* debug node greentail52
* Production version is installed in /share/apps/CENTOS7/intel/oneapi (via **n78**)
* source /share/apps/CENTOS7/intel/oneapi/setvars.sh
# which icc icx icpc icpx dpcpp ifort ifx mpirun mpicc python clang clang++
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/icc
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/icx
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/icpc
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/icpx
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/dpcpp
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/intel64/ifort
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/ifx
/share/apps/CENTOS7/intel/oneapi/intelpython/latest/bin/mpirun
/share/apps/CENTOS7/intel/oneapi/mpi/2021.1.1/bin/mpicc
/share/apps/CENTOS7/intel/oneapi/intelpython/latest/bin/python
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/clang
/share/apps/CENTOS7/intel/oneapi/compiler/2021.1.1/linux/bin/clang++
# in addition these were installed
/share/apps/CENTOS7/intel/oneapi/mkl/2021.1.1
/share/apps/CENTOS7/intel/oneapi/pytorch/1.5.0
/share/apps/CENTOS7/intel/oneapi/tensorflow/2.2.0
* __Test__ version is installed on local disk on host **greentail52**
* source /opt/intel/oneapi/setvars.sh
* Read [[cluster:203|ICC vs ICX]]
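After sourcing ''setvars.sh'' all of the compilers listed above are on PATH. A minimal compile sketch (the source filenames are placeholders you supply):

```shell
# Hedged sketch: compile with the oneAPI classic and LLVM-based compilers.
source /share/apps/CENTOS7/intel/oneapi/setvars.sh
icc   -O2 -o hello_icc hello.c    # classic C/C++ compiler
icx   -O2 -o hello_icx hello.c    # LLVM-based replacement (see ICC vs ICX page)
ifort -O2 -o hello_f   hello.f90  # classic Fortran compiler
mpicc -O2 -o hello_mpi hello.c    # MPI compiler wrapper
```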
====== StringTie2 ======
* Requires CentOS 7 so queues mwgpu, amber128, exx96
* /share/apps/CENTOS7/stringtie2/stringtie2 binary
* https://github.com/skovaka/stringtie2
====== Rstudio ======
* http://our-sanscratch-server:8787/
* https://support.rstudio.com/hc/en-us/articles/200552316-Configuring-the-Server
* centos7 R v 3.6.0
* use your HPC username and credentials
# native OS installation for rstudio
/bin/R
====== DMTCP ======
* /share/apps/CENTOS7/dmtcp/3.0.0
* this came from the CRAC source, trying to install gpu checkpoint tool
* [[cluster:198|GPU checkpoint/restart]]
* but plugin will not compile, checking with developers 12 Dec 2020
* Installed as a package in the OS on all nodes
* v2.5.2
* dmtcp_launch ... etc
====== EMBOSS ======
* /share/apps/CENTOS6/EMBOSS
* /share/apps/CENTOS6/EMBOSS/6.6.0.install (read to set up the environment)
* https://www.ebi.ac.uk/services
* jfabry/kthayer
====== Sequencing Tools #2 ======
* macs python package (macs2 see below)
* http://bowtie-bio.sourceforge.net/bowtie2/index.shtml
* https://deweylab.github.io/RSEM/
* https://broadinstitute.github.io/picard/
* https://bedtools.readthedocs.io/en/latest/content/installation.html
* padillalab
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
Successfully installed macs-1.4.3 # note: macs2 Python version must be >= 3.6!
# or change python version to
# ls /share/apps/CENTOS6/python/2.7.9/bin/macs2
/share/apps/CENTOS6/python/2.7.9/bin/macs2
# or change python version to
which python: /share/apps/CENTOS7/python/3.8.3/bin/python
pip list: Package Version
--------------- -------
bowtie 0.11.0
MACS2 2.2.7.1
numpy 1.18.5
picard 2.3.2
scipy 1.4.1
export PATH="/share/apps/CENTOS6/sequencingtools/bowtie2-2.4.1-linux-x86_64:$PATH"
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:/share/apps/CENTOS6/gcc/4.8.4/lib:$LD_LIBRARY_PATH
which gcc: /share/apps/CENTOS6/gcc/4.8.4/bin/gcc
which python: /share/apps/CENTOS6/miniconda2/bin/python
which perl: /share/apps/CENTOS6/miniconda2/bin/perl
export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
which R: /share/apps/CENTOS6/R/3.6.1b/bin/R
ldd /home/apps/CENTOS6/R/3.6.1b/lib64/R/bin/exec/R
java --version: java 12.0.2 2019-07-16 # /share/apps/java/latest -> jdk-12.0.2
java -jar /share/apps/CENTOS6/sequencingtools/picard-2.23.0/picard.jar -h
USAGE: PicardCommandLine [-h]
export PATH=/share/apps/CENTOS6/sequencingtools/bedtools-2.29.2/bin:$PATH
bedtools --version: bedtools v2.29.2
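All of the export lines above rely on the same rule: directories prepended to PATH are searched first, so the first directory containing the command wins. A self-contained illustration (no cluster paths needed):

```shell
# Two dummy copies of the same tool; the directory listed first in PATH wins.
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
printf '#!/bin/sh\necho A\n' > "$tmp/a/tool"
printf '#!/bin/sh\necho B\n' > "$tmp/b/tool"
chmod +x "$tmp/a/tool" "$tmp/b/tool"
# a/ is prepended first, so its copy shadows the one in b/
picked=$(PATH="$tmp/a:$tmp/b:$PATH" tool)
echo "$picked"   # prints A
```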
====== Deep Learning Software Stack ======
* queue ''exx96'' provides access to docker containers and
* **digits, tensorflow, pytorch, caffe, rapidsai**
* cuda (several versions)
* jupyterlab
* portainer
* consult [[cluster:193|Docker Containers Usage]]
* QAC
====== Paleogenomics Tools ======
Samtools is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:
* http://www.htslib.org/download/
* Samtools: Reading/writing/editing/indexing/viewing SAM/BAM/CRAM format
* BCFtools: Reading/writing BCF2/VCF/gVCF files and calling/filtering/summarising SNP and short indel sequence variants
* HTSlib: A C library for reading/writing high-throughput sequencing data
* Archeology (kbrunson)
# add to environment
export PATH=/share/apps/CENTOS6/samtools/1.10/bin:$PATH
export PATH=/share/apps/CENTOS6/bcftools/1.10.2/bin:$PATH
export PATH=/share/apps/CENTOS6/htslib/1.10.2/bin:$PATH
* https://github.com/ANGSD/angsd
* Program for analysing NGS data
# does not seem to work against htslib version above...
export HTSSRC=/share/apps/CENTOS6/ANGSD/htslib
export PATH=/share/apps/CENTOS6/ANGSD/angsd:$PATH
* https://www.beast2.org/
* BEAST 2 is a cross-platform program for Bayesian phylogenetic analysis of molecular sequences.
# with local java jre1.8.0_212
export PATH=/share/apps/CENTOS6/beast/2.6.0/bin:$PATH
* **sratools** https://github.com/ncbi/sra-tools/
* The SRA Toolkit and SDK from NCBI is a collection of tools and libraries for using data in the INSDC Sequence Read Archives.
* requires centos7 (glibc) so queues mwgpu, amber128, exx96
export PATH=/share/apps/CENTOS7/sratools/sratoolkit.2.11.0-centos_linux64/bin:$PATH
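A usage sketch for the SRA Toolkit once it is on PATH; these need network access, and the accession below is a placeholder, substitute your own:

```shell
# Hedged sketch: fetch a run and convert it to FASTQ with the SRA Toolkit.
export PATH=/share/apps/CENTOS7/sratools/sratoolkit.2.11.0-centos_linux64/bin:$PATH
prefetch SRR000001        # download the .sra archive (placeholder accession)
fasterq-dump SRR000001    # convert to FASTQ files in the current directory
```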
* **adapterremoval** https://github.com/MikkelSchubert/adapterremoval
* **bwa** https://github.com/lh3/bwa
* **bowtie2** http://bowtie-bio.sourceforge.net/bowtie2/index.shtml
These are all part of Miniconda3 (centos7), to setup the environment consult
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#miniconda3
* queues: mwgpu, amber128, exx96, debug server greentail52
They are also available via Miniconda2 (centos6), consult
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#miniconda2
* queues: hp12, mw256fd, tinymem, debug server swallowtail
[hmeij@greentail52 ~]$ conda list | egrep "adapterremoval|bwa|bowtie"
adapterremoval 2.3.2 hb7ba0dd_0 bioconda
bowtie2 2.2.5 py38hed8969a_7 bioconda
bwa 0.7.17 hed695b0_7 bioconda
* **mapDamage2**
* https://github.com/ginolhac/mapDamage/issues/25
* the docker container is broken
* requires centos6 so queues hp12, mw128; may run elsewhere
* conda install -c 'bioconda' mapdamage2
# env
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
The following NEW packages will be INSTALLED:
mapdamage2 bioconda/linux-64::mapdamage2-2.0.6-1
pysam bioconda/linux-64::pysam-0.16.0.1-py27hc729bab_3
seqtk bioconda/linux-64::seqtk-1.3-h5bf99c6_3
[hmeij@petaltail ~]$ mapDamage --version
2.0.6
* **PCAngsd**, requires miniconda3's python
* https://github.com/Rosemeis/pcangsd
* apparently the repo is needed ''/share/apps/CENTOS7/pcangsd/1.1.0''
source /share/apps/CENTOS7/amber/miniconda3/etc/profile.d/conda.sh
export PATH=/share/apps/CENTOS7/amber/miniconda3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/amber/miniconda3/lib:$LD_LIBRARY_PATH
which pcangsd
/share/apps/CENTOS7/amber/miniconda3/bin/pcangsd
pcangsd --version
pcangsd 1.10
* **angsd**, requires miniconda3, same environment as pcangsd
* http://www.popgen.dk/angsd/index.php/RealSFS
* add location of angsd to PATH env
$ pwd
/share/apps/CENTOS7/angsd/0.940/angsd
$ ls -l angsd
-rwxr-xr-x 1 hmeij its 5517248 Jul 6 14:11 angsd
$ ./angsd
-> angsd version: 0.940-dirty (htslib: 1.16) build(Jul 6 2023 14:11:30)
* **Standard RAxML version**
* https://github.com/stamatak/standard-RAxML
# add path to $PATH
[hmeij@greentail52 tmp]$ ll /share/apps/CENTOS7/raxml/8.2.12/
total 3202
-rwxr-xr-x 1 hmeij its 1177632 Apr 25 14:49 raxmlHPC-AVX
-rwxr-xr-x 1 hmeij its 1177632 Apr 25 14:49 raxmlHPC-AVX2
-rwxr-xr-x 1 hmeij its 1152040 Apr 25 14:49 raxmlHPC_serial
-rwxr-xr-x 1 hmeij its 1185704 Apr 25 14:49 raxmlHPC-SSE3
* **qualimap**
* http://qualimap.conesalab.org/
* java & R app, probably will work in centos7 and rocky8
* /share/apps/CENTOS7/qualimap/2.3
* **gatk**
* https://github.com/broadinstitute/gatk
* java & R app, probably will work in centos7 and rocky8
* /share/apps/CENTOS7/gatk/4.4.0.0
For both apps above java is v20; the developers want, and tested with, v17
For both apps you may find R locally in the operating system, or a recent version at
* (rocky 8) module load R/4.1.2
====== CD-HIT ======
CD-HIT is a widely used program for clustering biological sequences to reduce sequence redundancy and improve the performance of other sequence analyses.
* https://github.com/weizhongli/cdhit/wiki
* /share/apps/CENTOS6/cd-hit-v4.8.1-2019-0228/
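A usage sketch, assuming the binaries sit at the top of the install directory above; the input/output names are placeholders:

```shell
# Hedged sketch: cluster protein sequences at 90% identity with CD-HIT.
export PATH=/share/apps/CENTOS6/cd-hit-v4.8.1-2019-0228:$PATH
cd-hit -i proteins.fasta -o clustered -c 0.9 -n 5
# -c identity threshold, -n word length; see the wiki for guidance on pairing them
```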
====== OpenStructure ======
Open-Source Computational Structural Biology Framework
* v 1.9
* https://openstructure.org/
* recipe at /share/apps/CENTOS6/openstructure/1.9-install.txt
* Starrlab
OpenStructure relies on these applications and Python 2.7.15 (see python2 section).
-- Boost version: 1.68.0
-- Found the following Boost libraries:
-- python
-- unit_test_framework
-- filesystem
-- system
-- iostreams
-- program_options
-- regex
-- thread
-- Found ZLIB: /share/apps/CENTOS6/zlib/1.2.11/lib/libz.so (found version "1.2.11")
-- Found PNG: /share/apps/CENTOS6/lpng/1.6.37/lib/libpng.so (found version "1.2.49")
-- Found EIGEN: /share/apps/CENTOS6/eigen/3.2.10
-- Numpy headers found
-- Found FFTW: /share/apps/CENTOS6/fftw/3.3.8-enable-single/lib/libfftw3f.so
-- Found TIFF: /share/apps/CENTOS6/tiff/4.0.9/lib/libtiff.so (found version "4.0.9")
# setup the environment
egrep ^export /share/apps/CENTOS6/openstructure/1.9-install.txt
ost --help
Usage:
ost [ost options] [script to execute] [script parameters]
or
ost [action name] [action options]
Remarkably complex compilations.
====== PyCharm EDU ======
An Easy and Professional Tool to Learn & Teach Programming with Python
* https://www.jetbrains.com/pycharm-edu/
* /share/apps/pycharm/2018.03/bin
====== Paml ======
Phylogenetic Analysis by Maximum Likelihood (PAML)...programs for phylogenetic analyses of DNA or protein sequences using maximum likelihood.
* /share/apps/paml/paml4.9i
* add the bin/ directory to your PATH (see examples on this page)
* http://abacus.gene.ucl.ac.uk/software/paml.html
* fcolanlab (wmei, ndelgaudio)
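Following the PATH examples elsewhere on this page, a minimal sketch; the control file is a placeholder you provide, and the bin/ layout is assumed from the install note above:

```shell
# Hedged sketch: put PAML on PATH and run codeml with your control file.
export PATH=/share/apps/paml/paml4.9i/bin:$PATH
codeml codeml.ctl   # codeml.ctl is a placeholder control file
```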
====== Spyder ======
* Spyder: The Scientific Python Development Environment
* https://docs.spyder-ide.org/index.html
* See Python 3.5.2 for more details
====== Python3 ======
* Python 3.8.3
* /share/apps/CENTOS7/python/3.8.3 (so queues mwgpu, exx96)
* most packages found on this page
* picked up mpirun from miniconda2 apps (that may be a problem)
* Added to 3.8.3 "acpype"
* https://github.com/alanwilter/acpype
From website:\\
"
We now have an up to date webservice at http://bio2byte.be/acpype/ \\
(but it does not have the amb2gmx funcionality).
To run acpype, locally, with its all functionalities, you need ANTECHAMBER from package AmberTools and Open Babel if your input files are of PDB format.
However, if one wants acpype just to emulate amb2gmx.pl, one needs nothing at all but Python.
"\\
--- //[[hmeij@wesleyan.edu|Henk]] 2021/02/18 14:41//
# needed for _ctypes to compile properly for https retrievals
yum install openssl openssl-devel
yum install libffi-devel libffi
tar -xvf ../Python-3.8.3.tgz
cd Python-3.8.3/
./configure --prefix=/share/apps/CENTOS7/python/3.8.3 --enable-optimizations
make
make install
pip3 install --upgrade pip
# make symbolic links python3->python, pip3->pip
# drop all the packages in from previous versions
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#python3
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
which pip3 python3
/share/apps/CENTOS7/python/3.8.3/bin/pip3
/share/apps/CENTOS7/python/3.8.3/bin/python3
Added to both 3.5.2 versions\\
pycuda\\
--- //[[hmeij@wesleyan.edu|Henk]] 2020/02/18 08:45//
Added to both 3.5.2 versions\\
Successfully installed\\
boto-2.49.0 boto3-1.12.0 botocore-1.15.0 gensim-3.8.1 jmespath-0.9.4 s3transfer-0.3.3 smart-open-1.9.0\\
joblib, scikit-learn, sklearn\\
\\
Requirement already satisfied:\\
keras keras-preprocessing keras-applications\\
tensorflow (requires centos7)\\
--- //[[hmeij@wesleyan.edu|Henk]] 2020/02/17 13:02//
Added to both 3.5.2 versions\\
''jupyter-notebook'' only runs in 3.5.2 (non-mpi)\\
--- //[[hmeij@wesleyan.edu|Henk]] 2019/07/23 09:04//
Successfully installed
Send2Trash-1.5.0
ipywidgets-7.5.0
jupyter-1.0.0
jupyter-client-5.3.1
jupyter-console-6.0.0
notebook-6.0.0
prometheus-client-0.7.1
terminado-0.8.2
widgetsnbextension-3.5.0
json5-0.8.5
jupyterlab-1.0.2
jupyterlab-server-1.0.0
tornado-6.0.3
Added to both python versions 3.5.2 and 3.5.2-mpi the following modules all needed for Spyder.\\
--- //[[hmeij@wesleyan.edu|Henk]] 2019/04/25 09:21//
Successfully installed PyQT5-5.12.1 PyQt5-sip-4.19.15
Successfully installed
Jinja2-2.10.1 MarkupSafe-1.1.1 alabaster-0.7.12 asn1crypto-0.24.0 astroid-2.2.5
attrs-19.1.0 babel-2.6.0 backcall-0.1.0 bleach-3.1.0 certifi-2019.3.9 cffi-1.12.3
chardet-3.0.4 cloudpickle-0.8.1 cryptography-2.6.1 decorator-4.4.0 defusedxml-0.6.0
docutils-0.14 entrypoints-0.3 idna-2.8 imagesize-1.1.0 ipykernel-5.1.0
ipython-7.4.0 ipython-genutils-0.2.0 isort-4.3.17 jedi-0.13.3 jeepney-0.4
jsonschema-3.0.1 jupyter-client-5.2.4 jupyter-core-4.4.0 keyring-19.0.1
lazy-object-proxy-1.3.1 mccabe-0.6.1 mistune-0.8.4 nbconvert-5.4.1 nbformat-4.4.0
numpydoc-0.9.1 packaging-19.0 pandocfilters-1.4.2 parso-0.4.0 pexpect-4.7.0
pickleshare-0.7.5 prompt-toolkit-2.0.9 psutil-5.6.1 ptyprocess-0.6.0 pycodestyle-2.5.0
pycparser-2.19 pyflakes-2.1.1 pygments-2.3.1 pylint-2.3.1 pyqtwebengine-5.12.1
pyrsistent-0.14.11 pyzmq-18.0.1 qtawesome-0.5.7 qtconsole-4.4.3 qtpy-1.7.0 requests-2.21.0
rope-0.14.0 secretstorage-3.1.1 snowballstemmer-1.2.1 sphinx-2.0.1 sphinxcontrib-applehelp-1.0.1
sphinxcontrib-devhelp-1.0.1 sphinxcontrib-htmlhelp-1.0.2 sphinxcontrib-jsmath-1.0.1
sphinxcontrib-qthelp-1.0.2 sphinxcontrib-serializinghtml-1.1.3
spyder-3.3.4 spyder-kernels-0.4.4
testpath-0.4.2 tornado-6.0.2 traitlets-4.3.2 typed-ast-1.3.4 urllib3-1.24.2
wcwidth-0.1.7 webencodings-0.5.1 wrapt-1.11.1 wurlitzer-1.0.2
* program: python v 3.5.2 mpi enabled
* /share/apps/CENTOS6/python/3.5.2-mpi (used in astronomy with emcee and mpi4py)
Environment
export PATH=/share/apps/CENTOS6/openmpi/3.1.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/3.1.3/lib:$LD_LIBRARY_PATH
which mpicc
/share/apps/CENTOS6/openmpi/3.1.3/bin/mpicc
CC=mpicc LDSHARED="mpicc -shared" ./configure --prefix=/share/apps/CENTOS6/python/3.5.2-mpi
export PATH=/share/apps/CENTOS6/python/3.5.2-mpi/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2-mpi/lib:$LD_LIBRARY_PATH
Packages
Successfully installed nltk-3.4
Successfully installed numpy-1.16.2
Successfully installed scipy-1.2.1
Successfully installed Cython-0.29.6
Successfully installed jenkspy-0.1.5
Successfully installed pandas-0.24.2 python-dateutil-2.8.0 pytz-2018.9
Successfully installed argparse-1.4.0
Installing collected packages: readline
Successfully installed mpi4py-3.0.1
Successfully installed emcee-2.2.1
Installed /home/apps/CENTOS6/python/3.5.2-mpi/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg
Successfully installed cycler-0.10.0
Successfully installed kiwisolver-1.0.1
Successfully installed matplotlib-3.0.3 pyparsing-2.3.1
Successfully installed seaborn-0.9.0
Successfully installed subprocess32-3.5.3
Successfully installed astropy-3.1.2
Successfully installed pathlib2-2.3.3
Successfully installed scandir-1.10.0
Successfully installed os0-0.2.14
Successfully installed pickle-utils-0.1
* program: python v 3.5.2 not mpi enabled
* /share/apps/CENTOS6/python/3.5.2 (heavily used in DNA sequencing tools)
* python3, pip3, nltk v3.2.1, numpy v1.11.2
* Cython-0.27.3, jenkspy-0.1.4
* Pandas-0.21.0
* OpenPyXL
* and also ...
Successfully installed emcee-2.2.1
Requirement already satisfied: mpi4py in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (2.0.0)
Successfully installed argparse-1.4.0
Successfully installed readline-6.2.4.1
Requirement already satisfied: pandas in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (0.21.0)
Requirement already satisfied: python-dateutil>=2 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2.6.1)
Requirement already satisfied: pytz>=2011k in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2017.3)
Requirement already satisfied: numpy>=1.9.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (1.15.4)
Requirement already satisfied: six>=1.5 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from python-dateutil>=2->pandas) (1.11.0)
Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg
Successfully installed cycler-0.10.0 kiwisolver-1.0.1 matplotlib-3.0.3 pyparsing-2.3.1 seaborn-0.9.0
Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/subprocess32-3.5.3-py3.5.egg
Successfully installed astropy-3.1.2
Successfully installed pathlib2-2.3.3
Requirement already satisfied: matplotlib in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (3.0.3)
Requirement already satisfied: cycler>=0.10 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (0.10.0)
Requirement already satisfied: numpy>=1.10.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.15.4)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.3.1)
Requirement already satisfied: python-dateutil>=2.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.6.1)
Requirement already satisfied: kiwisolver>=1.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.0.1)
Requirement already satisfied: six in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from cycler>=0.10->matplotlib) (1.11.0)
Requirement already satisfied: setuptools in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from kiwisolver>=1.0.1->matplotlib) (20.10.1)
* install dir: /share/apps/CENTOS6/python/3.5.2
* docs: http://www.python.org/doc/
* www: http://www.python.org/
export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
--- //[[hmeij@wesleyan.edu|Henk]] 2017/02/06 10:45//
Added to 3.5
Copying mpi4py.egg-info to /share/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/mpi4py-2.0.0-py3.5.egg-info
but I had to change the code and I do not know what the impact might be
src/mpi4py.MPI.c
/* __pyx_v_rd = MPI_CONVERSION_FN_NULL; */
__pyx_v_rd = 0;
Use this env pointing to old MPI versions
export MPI_DIR=/share/apps/openmpi/1.2+intel-10
export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export PATH=/share/apps/openmpi/1.2+intel-10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH
These versions of python can be set with your PYTHONPATH and PYTHONHOME env variables. For syntax, see the Autodock section.
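For example, to point a session at the 3.5.2 build (the site-packages path is the conventional layout; verify on the node):

```shell
# Hedged sketch: select the CentOS 6 python 3.5.2 build for this session.
export PYTHONHOME=/share/apps/CENTOS6/python/3.5.2
export PYTHONPATH=$PYTHONHOME/lib/python3.5/site-packages
export PATH=$PYTHONHOME/bin:$PATH
export LD_LIBRARY_PATH=$PYTHONHOME/lib:$LD_LIBRARY_PATH
```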
* program: python v 3.0.1
* install dir: /share/apps/python/3.0.1
* docs: http://www.python.org/doc/
* www: http://www.python.org/
====== Kallisto & Trinity & FastQC======
**FastQC** A quality control tool for high throughput sequence data.
* ''java -version'' latest JDK installed, in default PATH ''which java''
* location: /share/apps/FastQC/0.11.8
* https://www.bioinformatics.babraham.ac.uk/projects/fastqc/
* requires centos6, may not run on centos7
* queues: hp12, tinymem, mw256fd mw128
* debug server swallowtail
* biolab (coolonlab, tearley)
**kallisto** is a program for quantifying abundances of transcripts from bulk and single-cell RNA-Seq data, or more generally of target sequences using high-throughput sequencing reads.
* https://pachterlab.github.io/kallisto/about
* also installed using miniconda2 packages (see below, set the PATH variable)
# env
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
bwa: 0.7.17-h84994c4_5 bioconda
perl: 5.26.2-h14c3975_1002 conda-forge
bowtie bioconda/linux-64::bowtie-1.2.2-py27h2d50403_1
tbb conda-forge/linux-64::tbb-2019.4-h6bb024c_0
star bioconda/linux-64::star-2.7.0d-0
hdf5 conda-forge/linux-64::hdf5-1.10.3-hba1933b_1001
kallisto bioconda/linux-64::kallisto-0.45.0-hdcc98e5_0
# --- //[[hmeij@wesleyan.edu|Henk]] 2021/08/19 08:55//
# Added packages below for Prof Coolon's lab (skalra)
conda list | egrep -i "samtools|bowtie2|bedtools|cufflinks"
bedtools 2.30.0 h7d7f7ad_1 bioconda
bowtie2 2.3.5.1 py27he513fc3_0 bioconda
cufflinks 2.2.1 py27_2 bioconda
samtools 1.13 h8c37831_0
# fcohan lab
[hmeij@petaltail ~]$ iqtree --version
IQ-TREE multicore version 2.1.4-beta COVID-edition for Linux 64-bit built Jun 24 2021
**Trinity** assembles transcript sequences from Illumina RNA-Seq data.
* https://github.com/trinityrnaseq/trinityrnaseq/wiki
* also installed using miniconda2 packages (see above, set the PATH variable)
Trinity seems to need //makeblastdb//. BLAST finds regions of similarity between biological sequences.
* ftp://ftp.ncbi.nlm.nih.gov/blast/executables/blast+/LATEST/
* https://blast.ncbi.nlm.nih.gov/Blast.cgi
* add to PATH /share/apps/CENTOS6/ncbi-blast-2.9.0+/bin
* add '/share/apps/CENTOS6/glibc-2.14/lib' to LD_LIBRARY_PATH
export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH
python 2.7 modules (these do not work with Trinity...bowtie2, jellyfish, salmon?)
bowtie2-2.3.4.3 | py27he860b03_1 11.9 MB bioconda
jellyfish conda-forge/linux-64::jellyfish-0.6.1-py27h14c3975_1000
bzip2-1.0.6 | h14c3975_1002 415 KB conda-forge
icu-58.2 | hf484d3e_1000 22.6 MB conda-forge
jemalloc-5.1.0 | hf484d3e_1000 10.2 MB conda-forge
libboost-1.67.0 | h46d08c1_4 20.9 MB
salmon-0.13.1 | h86b0361_0 4.2 MB bioconda
xz-5.2.4 | h14c3975_1001 366 KB conda-forge
trinity modules
Inchworm: has been Installed Properly
Chrysalis: has been Installed Properly
QuantifyGraph: has been Installed Properly
GraphFromFasta: has been Installed Properly
ReadsToTranscripts: has been Installed Properly
parafly: has been Installed Properly
slclust: has been Installed Properly
collectl: has been Installed Properly
Apparently bowtie, jellyfish, and salmon are not found as python modules. Install binaries from
* https://github.com/trinityrnaseq/trinityrnaseq/wiki/Installing-Trinity
and put them in the miniconda2 directory as that is the python used.
* Sample Run (must run on mw256fd using local scratch 5TB)
#!/bin/bash
# submit via 'bsub < run_trinity'
#BSUB -n 24
#BSUB -R "span[hosts=1]"
#BSUB -q mw256fd
#BSUB -J trinity
#BSUB -o trinity_assembly
#BSUB -e trinity_stderr
export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
export PATH="/share/apps/CENTOS6/python/3.5.2/samtools-1.9/bin:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/bowtie2-2.3.5-linux-x86_64:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/jellyfish-2.2.10:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/lib:$LD_LIBRARY_PATH"
# unique job scratch dirs
MYSANSCRATCH=/sanscratch/$LSB_JOBID
MYLOCALSCRATCH=/localscratch/$LSB_JOBID
MYLOCALSCRATCH5TB=/localscratch5tb/$LSB_JOBID
export MYSANSCRATCH MYLOCALSCRATCH MYLOCALSCRATCH5TB
# cd to remote working directory, only exists on mw256fd nodes
cd $MYLOCALSCRATCH5TB
pwd
Trinity --seqType fq --max_memory 240G \
--left /sanscratch/tearley/seqs/illumina/fwd.fastq.gz \
--right /sanscratch/tearley/seqs/illumina/rev.fastq.gz --CPU 24 \
--long_reads /sanscratch/tearley/seqs/pacbio/ccs.fasta
# copy results to home (fix this brute force, keep results only...)
scp -r ./trinity_out_dir ~/trinity_out_dir_$LSB_JOBID
====== Sequencing Tools #1 ======
All of these are located in /share/apps/CENTOS6/python/3.5.2/
**Deepbinner**: a deep convolutional neural network barcode demultiplexer for Oxford Nanopore reads
* set up environment, read Deepbinner.install
* deepbinner --help
* https://github.com/rrwick/Deepbinner
* fcohanlab (ndelgaudio)
**Canu**: a single molecule sequence assembler for genomes large and small.
* set up environment, read Deepbinner.install
* canu --help
* https://github.com/marbl/canu
**Eigen**: Eigen is a C++ template library for linear algebra: matrices, vectors, numerical solvers, and related algorithms.
* http://eigen.tuxfamily.org/index.php?title=Main_Page
* required for nanopolish which had a lot of problems
* read nanopolish.install
**Nanopolish**: Software package for signal-level analysis of Oxford Nanopore sequencing data.
* https://github.com/jts/nanopolish
* this build had a lot of problems
* remove all reference to openmp in this package
* Nicole reports it works; probably use on command line on cottontail2 only
**BioPython**: A set of freely available tools for biological computation
* https://biopython.org/
* Also needs Gnu parallel
* https://www.gnu.org/software/parallel/
**Samtools**: is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:
* Samtools Reading/writing/editing/indexing/viewing SAM/BAM/CRAM format
* BCFtools Reading/writing BCF2/VCF/gVCF files and calling/filtering/summarising SNP and short indel sequence variants
* HTSlib A C library for reading/writing high-throughput sequencing data
* https://www.htslib.org/
* read samtools.install
**Minimap2**: A versatile pairwise aligner for genomic and spliced nucleotide sequences
* https://lh3.github.io/minimap2/
* read minimap2.install
====== ZENO ======
* A software tool based on Monte Carlo methods, ZENO computes material properties at the nanoscale
* https://github.com/usnistgov/ZENO
* /share/apps/CENTOS6/zeno5
**Caution** Consult file ''/share/apps/src/cottontail2/ZENO_configure'' regarding the fix that was applied to ''SphereCenterModel/NanoFLANNSort.cc'', which is not a fix at all. Your mileage may vary.
# zeno-mpi was compiled with this MPI flavor
/share/apps/CENTOS6/miniconda2/bin/mpicc
/share/apps/CENTOS6/miniconda2/bin/mpicxx
# also setup these
export NANOFLANN_DIR=/share/apps/CENTOS6/nanoflann
export SPRNG_DIR=/share/apps/CENTOS6/sprng5
====== nanoflann ======
* Nearest Neighbor library (header only)
* does not need compiling except for test suite (requires c++11 and gcc 4.8.2 devtools)
* test suite in build/bin directory
* /share/apps/CENTOS6/nanoflann
====== SPRNG ======
* The Scalable Parallel Random Number Generators Library (SPRNG)
* http://sprng.org/
* /share/apps/CENTOS6/sprng5/
* no mpi (fails)
====== Galario ======
Newer CPU+GPU standalone library version (top of page)\\
--- //[[hmeij@wesleyan.edu|Henk]] 2021/06/29 14:38//
Newer CPU only version installed in Miniconda3 (top of page)\\
Miniconda3 is using python 3.8\\
--- //[[hmeij@wesleyan.edu|Henk]] 2021/06/22 15:41//
Accelerated Library for Analysing Radio Interferometer Observations (CPU only)
* /share/apps/CENTOS6/miniconda2
* contains python 2.7.15
* packages galario, numpy, scipy, matplotlib, argparse, corner, time and others ('conda list' for full listing)
* packages mpi4py, emcee, astropy
[hmeij@cottontail ~]$ conda --version
conda 4.5.4
$HOME/.bashrc for galario
# added by Miniconda2 installer
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
package | build
---------------------------|-----------------
certifi-2021.5.30 | py37h89c1867_0 141 KB conda-forge
libblas-3.8.0 | 17_openblas 11 KB conda-forge
libcblas-3.8.0 | 17_openblas 11 KB conda-forge
liblapack-3.8.0 | 17_openblas 11 KB conda-forge
libopenblas-0.3.10 |pthreads_hb3c22a3_4 7.8 MB conda-forge
libstdcxx-ng-9.3.0 | h6de172a_19 4.0 MB conda-forge
numpy-1.21.0 | py37h038b26d_0 6.1 MB conda-forge
pandas-1.2.5 | py37h219a48f_0 11.8 MB conda-forge
pip-21.1.3 | pyhd8ed1ab_0 1.1 MB conda-forge
python-3.7.3 | h5b0a415_0 35.7 MB conda-forge
python-dateutil-2.8.1 | py_0 220 KB conda-forge
python_abi-3.7 | 2_cp37m 4 KB conda-forge
pytz-2021.1 | pyhd8ed1ab_0 239 KB conda-forge
setuptools-49.6.0 | py37h89c1867_3 947 KB conda-forge
six-1.16.0 | pyh6c4a22f_0 14 KB conda-forge
wheel-0.36.2 | pyhd3deb0d_0 31 KB conda-forge
====== FreeSurfer ======
FreeSurfer is a software package for the analysis and visualization of structural and functional neuroimaging data from cross-sectional or longitudinal studies
* http://freesurfer.net/fswiki/FreeSurferWiki
* /share/apps/CENTOS6/freesurfer/6.0.0
* gpu version local to n78
* sample programs /home/hmeij/freesurfer
* mindlab (ploui et al)
====== FSL ======
FSL is a comprehensive library of analysis tools for FMRI, MRI and DTI brain imaging data.
* https://fsl.fmrib.ox.ac.uk/fsl/fslwiki
* /share/apps/CENTOS6/fls/5.0.10/bin
* gpu version local on n78
* sample programs /home/hmeij/fsl
* mindlab (ploui et al)
====== Campari ======
Molecular Modeling Software
* www: http://campari.sourceforge.net/V3/index.html
* /share/apps/CENTOS6/campari/3
* serial and threads version
* no mpi version
* Uses FFTW 3.3.6 (see section to set env)
NOTE: Lots of errors in compiling serial mode.
NOTE2: When compiling campari_threads ewaldo.f90 fails terribly
====== FastTree ======
FastTree infers approximately-maximum-likelihood phylogenetic trees from alignments of nucleotide or protein sequences.
* www: http://www.microbesonline.org/fasttree v2.1
* /share/apps/bin/
====== SageMath ======
Mathematics software, a viable free open source alternative to Magma, Maple, Mathematica and Matlab.
* http://www.sagemath.org/
* /share/apps/sage/7.6
* executables: sage (linked in via /share/apps/bin/sage)
# compiled against CentOS 6.4 (sharptail6) using this python version
export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
====== Blast ======
The National Center for Biotechnology Information advances science and health by providing access to biomedical and genomic information.
* https://www.ncbi.nlm.nih.gov/guide/data-software/
* /share/apps/ncbi-blast-2.5.0+/bin
* executables: blastn, blastp, etc
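A minimal sketch (file names are placeholders): build a nucleotide database from a FASTA file, then search it:

```shell
export PATH=/share/apps/ncbi-blast-2.5.0+/bin:$PATH
# format the reference sequences as a blast database
makeblastdb -in ref.fna -dbtype nucl -out refdb
# tabular output (-outfmt 6) is convenient for downstream parsing
blastn -query query.fna -db refdb -outfmt 6 -out hits.tsv
```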
====== JAGS ======
Bayesian hierarchical models using Markov Chain Monte Carlo (MCMC) simulation that can be controlled from R.
* v4.2.0 libs and startup script at /share/apps/CENTOS6/jags/4.2.0
* Probably need to set PATH and LD_LIBRARY_PATH, examples on this page
* http://mcmc-jags.sourceforge.net/
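An environment sketch, assuming the usual bin/ and lib/ layout under the install directory:

```shell
export PATH=/share/apps/CENTOS6/jags/4.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/jags/4.2.0/lib:$LD_LIBRARY_PATH
```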
====== BLCR ======
Compiled for every queue, for usage consult [[cluster:147|BLCR Checkpoint in OL3]] and [[cluster:148|BLCR Checkpoint in OL3]] for serial and parallel approaches.
* program: cr_run, cr_restart and cr_checkpoint
* install dir: /share/apps/blcr/0.8.5/queue
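The three programs are typically used together; a minimal sketch (the PID and context file name are placeholders, consult the pages linked above for the serial and parallel details):

```shell
# start the application under checkpoint control
cr_run ./my_program &
# checkpoint the running process by PID, writing a context file
cr_checkpoint -f context.1234 1234
# later, restart from that context file
cr_restart context.1234
```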
====== PyTransit ======
Astronomy package for exoplanet searches.
* www: https://github.com/hpparvi/PyTransit
* astrolab
Installed into Python 2.7.10 on CentOS6.x,
making sure CentOS gcc and gfortran are first in path.
# select vanilla python to use for centos6.x 64 bit
# also installed easy_install and pulled down new Numpy
export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH
# notes: had to change --opt flag
# had to use the MAC line for compilation to avoid the invalid ELF header error
python setup.py config_fc --fcompiler=gnu95 --opt="-O3" \
--f90flags="-cpp -fopenmp -march=native -mno-avx" build
python setup.py install
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/PyTransit-1.0-py2.7.egg-info
# later added
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scipy-0.16.1-py2.7.egg-info
# make sure correct mpi tools
env MPICC=/path/to/mpicc pip install mpi4py
For MPI4py had to force MPI-2
export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH
python setup.py build --configure
python setup.py install
And added on 09Aug2017
https://github.com/lkreidberg/batman
Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/batman_package-2.4.5-py2.7.egg-info
There is also a version in /share/apps/CENTOS6/blcr_soft with PyTransit and Emcee
====== OpenMP ======
Intel compiler library for parallel jobs.
* www: http://openmp.org
* /share/apps/CENTOS6/libomp/
Add to your LD_LIBRARY_PATH
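For example (assuming the library files sit directly in that directory):

```shell
export LD_LIBRARY_PATH=/share/apps/CENTOS6/libomp:$LD_LIBRARY_PATH
```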
====== LigPlus ======
Automatically generates schematic diagrams of protein-ligand interactions for a given PDB file.
* www: https://www.ebi.ac.uk/thornton-srv/software/LigPlus
* /share/apps/LigPlus/lib/exe_linux
====== MACS ======
Next-generation parallel sequencing technologies made chromatin immunoprecipitation followed by sequencing (ChIP-Seq) a popular strategy to study genome-wide protein-DNA interactions, while creating challenges for analysis algorithms.
* www: http://liulab.dfci.harvard.edu/MACS/
* /share/apps/CENTOS6/python/lib/python2.7/site-packages/
Should use Python 2.7.9 (see python section on setting up environment) on CentOS6 nodes.
====== Modeller ======
MODELLER is used for homology or comparative modeling of protein three-dimensional structures.
* program: mod9.13 (either invoked directly or with python, see below)
* install dir: /home/apps/modeller/9.13
* examples, demos and guides also posted
* www: https://salilab.org/modeller/
Please read the license at **''https://salilab.org/modeller/registration.html''**
How to use modeller: read /share/apps/modeller/9.13/INFO.TXT or visit web site.
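A sketch of the direct invocation (the script name is a placeholder); Modeller writes a matching log file in the working directory:

```shell
# run a Modeller python script directly
mod9.13 build_model.py
# progress and results appear in build_model.log
```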
====== Miriad ======
Miriad, a package of highly-specialized radio astronomy software.
* program: see below, binary download, no requirement checks.
* www: http://www.atnf.csiro.au/computing/software/miriad/INSTALL.html
* astrolab (kflaherty)
# Define your Miriad environment by invoking the initialization scripts
% . MIRRC.sh
# in sh-like shells, or
% source MIRRC
====== Astropy ======
Compiled with /share/apps/python/2.6.1/bin/python ...
* program: see below
* www: http://www.astropy.org/
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/astropy-0.3.1-py2.6-linux-x86_64.egg
Processing dependencies for astropy==0.3.1
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for astropy==0.3.1
====== Emcee + MPI4py ======
For the latest version of emcee+mpi4py (03/15/2019) see the section of python 3.5.2
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#python
Which is also used by the biology group for
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#sequencing_tools
Compiled with /share/apps/python/2.6.1/bin/python ...
* program: see below
* www: http://dan.iel.fm/emcee
* astrolab (kflaherty)
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/emcee-2.1.0-py2.6.egg
Processing dependencies for emcee==2.1.0
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0
Writing /share/apps/python/2.6.1/lib/python2.6/site-packages/mpi4py-1.3.1-py2.6.egg-info
# it found /home/apps/openmpi/1.2+intel-10/bin/mpicc so be sure to use mpirun from there too
export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH
Also added emcee to python 2.7.10, see the section on pyTransit
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/argparse-1.4.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/pandas2-0.0.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/subprocess32-3.5.3-py2.6-linux-x86_64.egg
====== PyRosetta ======
(Rosetta 2.2, 3.0 and 3.5 also are installed)
Compiled with /share/apps/python/2.6.1/bin/python ...
* program: PyRosetta Bundle 2014wk05 in bin subdirectory
* install dir: /home/apps/pyrosetta/2014wk05
* doxygen in html dir, user guide also posted
* demos and tools also posted
* www: http://www.rosettacommons.org/
Please read the license at **''/share/apps/pyrosetta/2014wk05/License.2014wk05''**
Note: although it probably does not matter, the python compile was performed under CentOS6 because latest phenix version requires that. So target queues ''mw256'' and ''mw256fd''.
====== Gromacs ======
Gromacs-2020.3 for GPU (RTX2080S)
* ''exx96'' queue
* see /share/apps/CENTOS7/gromacs/2020.3.install for env
* change env in script below
Gromacs-2018 for GPU (RTX2080S)
* ''exx96'' queue
* **/zfshomes/hmeij/k20redo/run.rtx sample script**
Gromacs-2018 for GPU (K20/GTX1080)
* [[cluster:164|GTX 1080 Ti]]
* [[cluster:173|K20 Redo Usage]]
Gromacs 5.1.4: there is only one version
* cpu plus mvapich2 mpi
* uses the gpu env as below but do not request a gpu resource
* can run on both mwgpu and mw256fd in cpu/mpi mode
* sample /home/hmeij/jobs/gpu/gromacs2016.cpu.mpi
Gromacs 5.1.2: there are four versions
* cpu only (thread_mpi), cpu-mpi (openmpi), cpu-gpu (thread_mpi), mpi-gpu (openmpi)
* compilations are done a bit differently switching to mpicc/mpiCC for mpi versions
* consult the file /share/apps/CENTOS6/gromacs/build-5.1.2.txt to set up your environment
* There is a template script at **/home/hmeij/jobs/gpu/gromacs2016.sub**
* it gets as far as launching gromacs at least
* no more ''mdrun'' replaced by ''gmx''
* and use the correct wrapper program, read [[cluster:119|Submitting GPU Jobs]]
For the latest version there are 6 compilations. The Gromacs developers are pretty vocal about "compile on a host where you are going to run it". For v5.0.1, the nomenclature is like this:
* /path/to/gromacs/5.0.1-[cpu only "icc"|mpi|destination queue(s)]
* you will need to setup the proper environment, see below
* /share/apps/gromacs/5.0.1-icc-hp12
* /share/apps/gromacs/5.0.1-mpi-hp12
* /share/apps/gromacs/5.0.1-icc-mw256 [all queues starting with this string]
* /share/apps/gromacs/5.0.1-mpi-mw256 [all queues starting with this string]
* /share/apps/gromacs/5.0.1-icc-mwgpu
* /share/apps/gromacs/5.0.1-mpi-mwgpu
The file /share/apps/gromacs/build-5.0.1.sh will show you notes on compiling.
Update: The GPU versions failed to compile with icc/icpc but did compile with gcc. Will investigate this strange behavior later.
--- //[[hmeij@wesleyan.edu|Meij, Henk]] 2014/10/01 15:33//
# for all versions
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/gromacs/5.0.1-[icc|mpi]-[hp12|mw256|mwgpu]/bin/GMXRC.sh
# for all "icc versions" you are done
# for all "mpi versions" add this
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
# for all "gpu versions" add this
export PATH=\
/cm/shared/apps/gcc/4.7.0/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/lammps/cuda/2013-01-27/:/cm/shared/apps/amber/amber12/bin:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/usr/lib64/qt-3.3/bin:\
/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/sbin:/usr/sbin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:/cm/shared/apps/mvapich2/gcc/64/1.6/sbin
export LD_LIBRARY_PATH=\
/cm/shared/apps/gcc/4.7.0/lib:/cm/shared/apps/gcc/4.7.0/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:/cm/shared/apps/amber/amber12/lib:\
/cm/shared/apps/amber/amber12/lib64:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/libs/current/lib64:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:/cm/shared/apps/mvapich2/gcc/64/1.6/lib
[[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:119| 4.6 GPU Job examples]]
* program: mdrun (v4.6)
* install dir: /share/apps/gromacs/4.6-icc
* docs: on the web
* www: http://www.gromacs.org
* Compiled with ICC, you'll need this line (replace dot with source in csh)
* Can run native ThreadMPI for parallel jobs but confined to a single node.
# source for Intel compiler setup
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
# and source the GMXRC for your shell
. /share/apps/gromacs/4.6-icc[or -mpi]/bin/GMXRC.[bash|zsh|csh]
* program: mdrun_mpi (v4.6)
* install dir: /share/apps/gromacs/4.6-mpi
* docs: on the web
* www: http://www.gromacs.org
* Compiled against OpenMPI, you'll need these lines in addition to lines above
* Can run parallel jobs across multiple nodes (use the lava wrapper program as usual)
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
====== Mathematica ======
[[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:119|GPU version]]
* program v13.2.1 installed
* program v13.2 installed
* program: mcc, math, mathematica (v 12.2)
* install dir: /share/apps/CENTOS7/mathematica/12.2
* docs: on the web
* www: http://www.wolfram.com/mathematica/
* Centos7, use queues mwgpu, amber128 or exx96
* For GUI launch on greentail52
* Parallel kernels
* http://reference.wolfram.com/mathematica/ParallelTools/tutorial/ConnectionMethods.html
* Parallel Computing
* http://reference.wolfram.com/mathematica/guide/ParallelComputing.html
====== Eigen ======
* program: headers files in /share/apps/eigen/3.0.5 or with cmake in /share/apps/eigen/3.0.5/build
* install dir: see above
* docs: on the web
* www: http://eigen.tuxfamily.org/index.php?title=Main_Page
The configure step found the Intel Fortran compiler, so you may have to source these scripts (although Eigen itself should be compiler independent):
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
====== EON ======
* program: eonclient (client) and python (version 2.6.1, see PyPAL section, requires Numpy)
* install dir: client and server, see above
* docs: on the web
* www: http://theory.cm.utexas.edu/eon/
SERVER
export PATH=/home/apps/python/2.6.1/bin/:$PATH
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/eon-0.0.0-py2.6.egg
CLIENT
eonclient -h
Usage: eonclient [options] inputConfile [outputConfile]
Job Type:
-m Minimization of inputConfile saves to outputConfile
-s Single point energy of inputConfile
-o Optimization method [default: qm]
Required Options:
-p The potential (e.g. qsc, lj, eam_al)
====== SAS ======
SAS (v9.2, 64 bit) is now available on the compute clusters. This software is an enormous statistical package and lots more, see http://www.sas.com. It should be invoked in batch mode on the compute nodes just like all other software (i.e., a command like "sas filename.sas" in job submission scripts). Some pointers, code, manuals and an online tutor can be found at
http://sas.wesleyan.edu
The QAC has well-trained tutors in SAS; for more info see http://qac.wesleyan.edu
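A minimal batch submission sketch in the LSF/lava style used on this cluster (the queue, job name and file names are placeholders):

```shell
#!/bin/bash
#BSUB -q hp12
#BSUB -J sasjob
#BSUB -o sasjob.%J.out
# batch-mode invocation as described above
sas filename.sas
```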
====== Wordom ======
* program: wordom v0.22 32 bit
* install dir: /usr/local/bin/ on greentail only
* docs: /usr/local/docs/wordom_ug.pdf
* www: http://wordom.sourceforge.net/
Note: Lapack and Blas packages are needed and were installed on host greentail only. Hence the program will only work on this host. If it needs to be available on compute nodes, please let me know.
====== unixODBC ======
* program: API with which to access Data Sources
* install dir: installed with ''yum install unixODBC''
* docs:
* www: http://www.unixodbc.org/
This package works together with mysql-connector-odbc-5.1.8-1.rhel5.x86_64.rpm (from http://dev.mysql.com/downloads/connector/odbc/ installed in the OS) and pyodbc (from http://code.google.com/p/pyodbc/downloads/list installed into python 2.7.2).
====== OMSSA ======
* program: Open Mass Spectrometry Search Algorithm (OMSSA), from NCBI.
* install dir: /home/apps/omssa
* docs:
* www: http://pubchem.ncbi.nlm.nih.gov/omssa
====== go language ======
* program: g6
* install dir: /home/apps/go
* docs: doc/ in install_dir
* www: http://golang.org/
export PATH=/share/apps/go/bin:$PATH
====== Correlation Analysis ======
* program: several, consult the README file
* install dir: /share/apps/mmdevel
* docs: browse install dir
* www: http://atlas.physbio.mssm.edu/~tom/correl.php
Notes: "coulomb" did not compile with OpenMPI, MVAPICH1 or 2, but finally compiled against MVAPICH 0.99 (upgraded to 1.2)... the rest was compiled with gcc and 6g (from the go language). In order to run the MPI binary, do not use our lsf/lava wrappers but build up the command line with the -np and -machinefile parameters.
export GOROOT=/share/apps/go
export PATH=/share/apps/go/bin:/share/apps/mvapich/1.2+gnu-4.1.2+gen2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/mvapich/1.2+gnu-4.1.2+gen2/lib:$LD_LIBRARY_PATH
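A sketch of launching the MPI binary directly, as described above (the host names, binary name and its arguments are placeholders; the trailing ''...'' stands for program-specific options):

```shell
export PATH=/share/apps/mvapich/1.2+gnu-4.1.2+gen2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/mvapich/1.2+gnu-4.1.2+gen2/lib:$LD_LIBRARY_PATH
# build a machinefile by hand instead of using the lsf/lava wrappers
printf "n1\nn2\nn3\nn4\n" > machinefile
mpirun -np 4 -machinefile machinefile /share/apps/mmdevel/coulomb ...
```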
====== FFTW MPI ======
* 3.3.6 compiled with icc/ifort
* ./configure --prefix=/share/apps/fftw/3.3.6 \
* CC=icc CXX=icpc F77=ifort FC=ifort CFLAGS=-O3 \
* --enable-openmp --enable-threads
* NOTE: --enable-mpi did not work
export PATH=/share/apps/fftw/3.3.6/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/fftw/3.3.6/lib:$LD_LIBRARY_PATH
* program: libfftw_mpi.a
* install dir: /home/apps/fftw/2.1.5_mpi
* docs: also consult the lammps section for FFTW
* www: see below
Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view http://www.fftw.org/
Compiled as follows, you may have to set up your environment similarly:
export PATH=/share/apps/intel/cce/10.0.025/bin:\
/share/apps/openmpi/1.2+intel-10/bin:\
$PATH
export LD_LIBRARY_PATH=/share/apps/intel/cce/10.0.025/lib:\
/share/apps/openmpi/1.2+intel-10/lib:\
/share/apps/intel/cmkl/10.0.011/lib/em64t:\
$LD_LIBRARY_PATH
./configure --prefix=/share/apps/fftw/2.1.5_mpi \
--enable-mpi CC=icc CXX=icpc F77=ifort FC=ifort
====== Plumed ======
* v2.3
* on node n78:/usr/local/plumed-2.3
* compiled with mpich3 mpicxx
* v 1.x
* program: a portable plug-in for free energy calculations with molecular dynamics
* install dir: /home/apps/amber/9serial+ifort+plumed
* docs: inside install dir
* www: http://merlino.mi.infn.it/~plumed/PLUMED/Home.html
You may have to set the following:
export MKL_HOME=/share/apps/intel/cmkl/9.1.021
export AMBERHOME=/share/apps/amber/9serial+ifort+plumed
export LD_LIBRARY_PATH=/share/apps/intel/cmkl/9.1.021/lib/64:$LD_LIBRARY_PATH
====== SDA ======
* program: Simulation of Diffusional Association, programs in bin/ directory
* install dir: /home/apps/sda/5.01
* docs: inside install dir
* www: http://projects.villa-bosch.de/mcmsoft/sda/5.01/index.html
====== LAMMPS CPU/GPU ======
** 7Feb2024 **
* /share/apps/CENTOS7/lammps/7Feb2024 (feature version vs latest stable)
* lmp_serial and lmp_mpi
* 3 cuda versions with 3 different precision settings
* compiled against cuda 10.2 on exx96, may work in mwgpu cuda-11.2
* notes: /share/apps/CENTOS7/lammps/7Feb2024.install (setup env, look for export statements)
* does **not** support openmp, look for the OpenHPC version for that
** 25Apr2023**
* /share/apps/CENTOS7/lammps/25Apr2023
* lmp_serial and lmp_mpi
* 3 cuda versions with 3 different precision settings
* notes: /share/apps/CENTOS7/lammps/25Apr2023.install
* does **not** support openmp, look for the OpenHPC version for that
* 05/24/2023 remade with package CREATION added
* also read miniconda3 section for tensorflow/pacemaker info
# CPU
export PATH=/share/apps/CENTOS7/gcc/6.5.0/bin/:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/gcc/6.5.0/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
which mpirun mpicc python gcc
# CPU + GPU (exx96 queue, may run on test and amber128 queues)
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
which nvcc
# related for starrlab
# https://pacemaker.readthedocs.io/en/latest/pacemaker/install/
[hmeij@cottontail2 ~]$ module load miniconda3/py39
[hmeij@cottontail2 ~]$ which pacemaker
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/pacemaker
[hmeij@cottontail2 ~]$ pacemaker --version
pacemaker/pyace version: 0.2.7+77.g8b40c11
ace_evaluator version: 2022.6.27
[hmeij@cottontail2 ~]$ python
Python 3.9.10 | packaged by conda-forge | (main, Feb 1 2022, 21:24:11)
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyace as py
>>>
** USER_QUIP **
This is a version of lammps (29Oct20, last stable) that contains all packages with user-quip.
It does compile with MPI (mpif90), not ifort, on CentOS 7 ... this was tricky; the solution for MPI came from the ''mpifort --showme:link'' output.
* Requires CentOS 7 so queues: mwgpu, amber128, exx96 for cpu-only binaries
* Requires CentOS 7 plus Cuda 10.2, so queue: exx96 for cpu+gpu binaries
Read, if you wish
* /share/apps/CENTOS7/lammps/QUIP-public.install
* /share/apps/CENTOS7/lammps/29Oct2020.install
**Environments**
# serial - cpu only (although quippy compiled, it refused to install)
# MPI STUBS is compiled in
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
/share/apps/CENTOS7/lammps/29Oct2020/lmp_serial-with-quip.cpu -h
# mpi - cpu only
# gfortran_openmpi
export QUIP_LIBRARY=/share/apps/CENTOS7/lammps/QUIP-public/9Apr2021/linux_x86_64_gfortran_openmpi
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
/share/apps/CENTOS7/lammps/29Oct2020/lmp_mpi-with-quip.cpu -h
# mpi - cpu+gpu
# gfortran_openmpi arch
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export QUIP_LIBRARY=/share/apps/CENTOS7/lammps/QUIP-public/9Apr2021/linux_x86_64_gfortran_openmpi
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
# the "make" versions of lmp with quip do not work, deleted them
# a reference on an nvidia dev forum notes the problem disappears when using cmake (vs make)
# these lmp binaries do work
/share/apps/CENTOS7/lammps/29Oct2020/single_single/bin/lmp
/share/apps/CENTOS7/lammps/29Oct2020/single_double/bin/lmp
/share/apps/CENTOS7/lammps/29Oct2020/double_double/bin/lmp
# however I had to load the non-mpi version of libquip.a into lmp
# which seems ok given parallel work is driven by -suffix gpu
# sample script: ~hmeij/k20redo/run.rtx.lammps2
** CPU only ** -Centos6
* /share/apps/CENTOS6/lammps/29Oct20.install
* read for env and compilation env: lmp_mpi
** CPU only OR CPU+GPU** for ''exx96'' queue - CentOS7
* 3Mar2020 added USER-REAXC
* For KOKKOS MPI aware version /share/apps/CENTOS7/openmpi/4.0.4-ucx with UCX
* git download of latest did not work (lots of fortran errors?)
* latest tarball is still 3Mar2020 (?)
* added user-reaxc (but had to remove package python)
* "kokkos" version includes GPU accelerator
* please read **/share/apps/CENTOS7/lammps/3Mar2020.install **
# CPU
/share/apps/CENTOS7/lammps/3Mar2020/lmp_serial+user-omp+user-reaxc
/share/apps/CENTOS7/lammps/3Mar2020/lmp_mpi+user-omp+user-reaxc
# CPU+GPU
/share/apps/CENTOS7/lammps/3Mar2020/lmp_mpi-single-single_kokkos_cuda_mpi+user-omp+user-reaxc
# env
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS7/python/3.8.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/python/3.8.3/lib:$LD_LIBRARY_PATH
export CUDA_HOME=/usr/local/cuda
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib:$LD_LIBRARY_PATH
** CPU ONLY v2 CentOS7 **
* add user-omp on next compile ... done ... binaries with string +user-omp
* /share/apps/CENTOS7/lammps/3Mar2020
* lmp_serial, lmp_mpi
* //latest// openmpi version as of today v4.0.4
* consult this file for packages installed and compilations
* /share/apps/CENTOS7/lammps/03Mar2020.install
* should work only on exx96 and mwgpu queues
export PATH=/share/apps/CENTOS7/openmpi/4.0.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS7/openmpi/4.0.4/lib:$LD_LIBRARY_PATH
** CPU ONLY **
* /share/apps/CENTOS6/lammps/3Mar2020
* lmp_mpi, lmp_serial
* consult this file for packages installed and compilations
* /share/apps/CENTOS6/lammps/03Mar2020.install
* should work on all queues
* sample script in ~hmeij/k20redo/run.cpu.lammps
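With the environment from the install notes set up, a run can be sketched as (the input file name and core count are placeholders):

```shell
mpirun -np 8 /share/apps/CENTOS6/lammps/3Mar2020/lmp_mpi -in in.colloid
```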
** CPU or GPU **
* added lmp_mpi-single-single+user-omp
* /share/apps/CENTOS7/lammps/3Mar2020
* lmp_mpi-[double-double|single-double|single-single]
* consult this file for packages installed and compilations
* /share/apps/CENTOS7/lammps/03Mar2020.install
* CUDA_ARCH = sm_75 (probably only works on queue exx96)
* **sample script in ~hmeij/k20redo/run.rtx.lammps**
# simple colloid example
CPU run -n1
Performance: 493311 tau/day, 1141 timesteps/s
GPU run gpu4=1 double-double: 1.1x
Performance: 571354 tau/day, 1322 timesteps/s
GPU run gpu4=1 single-double: 1.2x
Performance: 608498 tau/day, 1408 timesteps/s
GPU run gpu4=1 single-single: 1.3x
Performance: 623405 tau/day, 1443 timesteps/s
* /share/apps/CENTOS6/lammps/22Aug18
* consult [[cluster:161|lammps-11Aug17]] page
* /share/apps/CENTOS6/lammps/31MAR17
* **lmp_serial_icc, lmp_mpi (with Kiss FFT)**
* **lmp_gpu_single, lmp_gpu_double and lmp_gpu_mixed** with the accelerator library (arch=35) consult [[http://lammps.sandia.gov/doc/Section_accelerate.html]]
* Please read the lammps-install.txt file, it did not all go according to docs.
* Sample job: /home/hmeij/lammps/colloid-gpu/run.gpu2
* **lmp_gnu, lmp_gpu** done the "malaysian" way, consult the lammps-install-usm.txt file
* this serial and gpu version will have more packages
* /share/apps/CENTOS6/blcr_soft/lammps/16Feb16
* can be checkpointed
* /share/apps/CENTOS6/lammps/10AUG15
* **lmp_serial_icc, lmp_mpi, lmp_mpi-with-fft**
* **lmp_cuda_single, lmp_cuda_double** with the USER-CUDA accelerator library (arch=35) consult [[http://lammps.sandia.gov/doc/accelerate_cuda.html]]
* **lmp_mpi.[1,2,3,4]** correspond to the precision levels below
precision=N to set the precision level
N = 1 for single precision (default)
N = 2 for double precision
N = 3 for positions in double precision
N = 4 for positions and velocities in double precision
To use any of these you will need to setup the environment, like detailed below (PATH, LD_LIBRARY_PATH). Here is the environment showing you the version of the software used in compilation of ''lmp_cuda_double''. The other versions used the same compiler (ICC) and MPI version (OpenMPI 1.8.4).
Note: According to the docs with USER-CUDA you must have 1 cpu MPI process for 1 GPU, so 1:1, 2:2 ...
### USER-CUDA sample script /home/hmeij/sharptail/run_user-cuda.gpu
export PATH=/home/apps/bin:\
/bin:\
/usr/bin:\
/usr/local/sbin:\
/usr/sbin:\
/sbin:\
/usr/sbin:\
/usr/lib64/qt-3.3/bin:\
/cm/local/apps/cuda50/libs/304.54/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:\
/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/sbin:\
/cm/shared/apps/lammps/cuda/2013-01-27/:\
/share/apps/CENTOS6/openmpi/1.8.4/bin:\
/share/apps/CENTOS6/python/2.7.9/bin:\
/share/apps/CENTOS6/lammps/10Aug15:\
/home/hmeij/bin:\
$PATH
export LD_LIBRARY_PATH=/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib:\
/cm/shared/apps/cuda50/libs/current/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:\
/share/apps/CENTOS6/openmpi/1.8.4/lib/:\
/share/apps/CENTOS6/python/2.7.9/lib:\
/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/local/apps/cuda50/libs/304.54/lib/:\
$LD_LIBRARY_PATH
[[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:119|go to How to submit GPU jobs with Lammps ]]
* program: (compiled with mpicxx, different than previous versions)
* install dir: /home/apps/CENTOS6/lammps/**9Dec14** ... consult the README file to set up environment
Note: all versions have package yes-colloid installed.
* **lmp_serial_icc, lmp_mpi, lmp_mpi-with-fft** (compiled with icc v12 and mpicxx v1.4.4)
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
* **lmp_serial, lmp_openmpi** (compiled with gcc 4.8.4 and mpicxx 1.8.4)
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.8.4/bin:$PATH
* program: lmp_serialwes, lmp_linuxwes, lmp_tbirdwes (linux and tbird compiled with openmpi 1.4.4)
* install dir: /home/apps/lammps/**01Feb14** ... consult the README file to set up environment
* program: lmp_serialwes, lmp_linuxwes, lmp_tbirdwes (linux and tbird compiled with openmpi 1.4.4)
* install dir: /home/apps/lammps/**06Feb12** ... consult the README file
* program: lmp_something_wes
* install dir: /home/apps/lammps/22Jun11
* docs: online at web site
* www: http://lammps.sandia.gov/
Please consult the README file to set up your environment properly. The '22Jun11' LAMMPS version was compiled with Intel compilers v12 and the OpenMPI mpicc compiler of v1.4.3
* program: lmp_something_wes
* install dir: /home/apps/lammps/30Mar10
* docs: online at web site
* www: http://lammps.sandia.gov/
Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view [[http://www.fftw.org/|http://www.fftw.org/]]
Compiled with a variety of compilers and MPI flavors. Your mileage may vary as serious warnings were generated at compile time. However, lmp_serial_wes and lmp_tbird_wes compiled without any errors.
You may have to redefine your PATH and LD_LIBRARY_PATH, please consult
/share/apps/lammps/30Mar10/README
* program: pizza.py
* install dir: /home/apps/lammps/pizza-9Oct15
* docs: inside install dir
* www: http://www.sandia.gov/~sjplimp/pizza.html
* program: pizza.py
* install dir: /home/apps/lammps/pizza-24Nov10
* docs: inside install dir
* www: http://www.sandia.gov/~sjplimp/pizza.html
Pizza.py is a loosely integrated collection of tools, many of which provide pre- and post-processing capability for the LAMMPS molecular dynamics, ChemCell cell modeling, and SPPARKS kinetic Monte Carlo packages. (yea, "loosely" is right). In order for this to work a new python flavor was installed. Please read the section about ''python/2.6.1.tcltk'' in the python section [[cluster:73#python|here]]
====== Libraries ======
* fortran functions of all kind
* install dir: /home/apps/libraries/libs (links to all)
* docs: on the web
Please note the appropriate citation and reference guides for each library (found on web site).
* fft, fast fourier transforms
* libfftpack.a
* http://www.scd.ucar.edu/css/software/fftpack5
* lapack 3.2.1
* .../lapack/3.2.1/ [blas_intel.a, lapack_intel.a, tmglib_intel.a]
* http://www.netlib.org/lapack/
* minpack (version less)
* .../minpack/libminpack_intel.a
* http://www.netlib.org/minpack/index.html
* quadpack (version less)
* .../quadpack/libquadpack_intel.a
* http://www.netlib.org/quadpack/index.html
* ranlib.f 1.1
* .../ranlib.f/librandpack_intel.a
* http://www.netlib.org/random/
Note: compiled with ifort
====== GSL ======
* program: GNU Scientific Library (GSL) is a numerical library for C and C++ programmers.
* install dir: /home/apps/gsl/1.9
* docs: in man/ directory of install dir
* www: http://www.gnu.org/software/gsl/
Note: compiled with gcc
====== Newbend ======
* program: see the demo_ouput.txt files
* install dir: /home/apps/newbend
* docs: none
* www: none
Note: Contact David Beveridge for more information (dbeveridge@wes)
Please note, there is an 'Aborted' event in one of the demos. ''glibc'' throws an error documented in the demo_output.txt file. Seems to still generate output though.
====== Curves+ ======
* program: cur+, canal, cdif, sumr
* install dir: /home/apps/amber/Curves
* docs: on web site
* www: http://gbio-pbil.ibcp.fr/Curves_plus/Curves+.html
Note: compiled with ifort
====== Naccess ======
* program: naccess
* install dir: /home/apps/naccess/2.1.1
* docs: on web site
* www: http://www.bioinf.manchester.ac.uk/naccess/
Note: compiled with ifort
====== Phenix ======
Note: For internal, non-commercial, research and development purposes only. Consult the LICENSE file and Citing information. Users need to set their environment, examples shown.
Version below only works on queues ''mw256'' and ''mw256fd''
* program: phenix linux 2.6, x86_64, FC12
* install dir: /home/apps/phenix/phenix-1.8.4
* docs: in documentation/ directory of install dir
* www: http://www.phenix-online.org
# csh users:
source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env
# bash users:
source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env.sh
Version below only works on queue ''hp12''
* program: phenix, linux kernel 2.6, x86_64
* install dir: /home/apps/phenix/phenix-1.5-2
* docs: in documentation/ directory of install dir
* www: http://www.phenix-online.org
# bash users:
source /share/apps/phenix/phenix-1.5-2/phenix_env.sh
====== Expokit ======
* program: sample_? located in fortran subdirectory
* install dir: /home/apps/expokit
* docs: on the web
* www: http://www.maths.uq.edu.au/expokit/download.html
Note: our hardware/platform not supported, compiled with ifort (minor warnings)
====== WHAM ======
* program: wham (Weighted Histogram Analysis Method)
* install dir: /home/apps/wham/2.0.2
* docs: /home/apps/wham/2.0.2/docs
* www: http://membrane.urmc.rochester.edu/Software/WHAM/WHAM.html
Note: ''wham'' compiled with warnings using icc. ''wham-2d'' failed to compile.
====== Emacs ======
* on head node only in /usr/bin
* version 21.4.1
====== Autodock Vina ======
* we also have the autodock4 program - check below (search for autodock4)
* update with 1.1.2 in /share/apps/autodock
* moved the beta version out of the way and linked to 1.1.2 version
* program: vina v 1.0 beta 03 (for molecular docking and virtual screening)
* install dir: /home/apps/bin
* docs:
* www: http://vina.scripps.edu
export PATH=/share/apps/autodock:$PATH
# or
export PATH=/share/apps/autodock/autodock_vina_1_1_2_linux_x86/bin:$PATH
Also linked in via /home/apps/bin which is in default PATH.
====== NetBeans ======
* program: several, v 6.0.1
* install dir: /usr/local/netbeans/6.0.1
* install dir: /usr/local/glassfish/v2ur1 http (28080), https (28181), admin (24848)
* install dir: /usr/local/apache-tomcat/6.0.14
* docs:
* www: http://netbeans.org
====== BaseMap ======
* program: Geometry Engine - Open Source (GEOS)
* install dir: /usr/local/lib v 2.2.3
* docs:
* www: http://geos.refractions.net
* program: Basemap (plot on map projections (with coastlines and political boundaries) using matplotlib)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/basemap-0.99.3-py2.6.egg-info
* docs:
* www: http://matplotlib.sf.net/basemap/doc/html
====== PyPat ======
* program: (python egg, version 1.0)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/PyPat-1.0-py2.6.egg-info
* docs: /home/apps/docs/pypat-1.0
* www: http://sitemaker.umich.edu/carlsonlab/home.html
The dependencies for PyPat ... drumroll ... incredible!
In order to run some PyPat programs or invoke some of the underlying programs you may have to set some of the published environment variables listed below.
**Python**
* Installed 2.6.1 (see notes below)
* Must be compiled with zlib module (unsure)
* Must have Python-PMW (PyMol requirement, unsure)
export PYTHONHOME=/home/apps/python/2.6.1
**ImageMagick**
* Latest default version, see below.
** NumPy & Scipy**
* program: (python egg, version 1.2.1)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.2.1-py2.6.egg-info
* docs: guide posted on web site
* www: http://numpy.scipy.org
NumPy was configured against the CMKL BLAS and LAPACK by uncommenting the lines for [cmkl] in site.cfg
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python/2.6.1
It appears you also need to set your PATH to find the appropriate binary:
export PATH=/home/apps/python/2.6.1/bin:$PATH
Also upgraded and installed:\\
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg\\
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/scipy-0.11.0-py2.6-linux-x86_64.egg\\
NumPy requires the installation of Nose.
**Nose**
* program: (python egg, version 0.10.4)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/nose-0.10.4-py2.6.egg-info
* docs: guide posted on web site
* www: http://somethingaboutorange.com/mrl/projects/nose
**MatPlotLib**
* program: (python egg, version 0.98.5.2)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/matplotlib-0.98.5.2.egg-info
* docs:
* www: http://sourceforge.net/projects/matplotlib
Note: needed to add the location of the numpy python egg to the ''basedir'' array so that setup.py could find the egg.
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python/2.6.1
**PyMOL**
* program: (python egg, versionless)
* egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/pymol-1.1.0-py2.6.egg-info
* docs:
* www: http://sourceforge.net/projects/pymol
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python/2.6.1
export PYMOL_PATH=/home/apps/python/2.6.1/lib/python2.6/site-packages/pymol
PyMOL required the presence of ''libglut''. Installed ''rpm -ivh freeglut-devel-2.2.0-14.x86_64.rpm'' package.
PyMOL libraries have also been installed in the following location:
export LD_LIBRARY_PATH=/home/apps/pymol/1.1:$LD_LIBRARY_PATH
The following was not required by PyPat, but was requested for installation from [[http://sitemaker.umich.edu/carlsonlab/resources.html|http://sitemaker.umich.edu/carlsonlab/resources.html]]
**Automated Flooding and Clustering source code**
* program: (python egg build from source code)
* egg: /home/apps/python/2.6.1/lib/python2.6/site-packages/AutoClustering-1.0-py2.6.egg-info
* www: carlson web site
**Global wRMSD Fit source code**
* program: (python source code)
* file: /home/apps/python/scripts/global_wrmsd_v8.3.py
* www: carlson web site
* required: /home/apps/python/2.6.1/lib/python2.6/site-packages/biopython-1.54b-py2.6.egg-info
* required: Numpy (set PATH, PYTHONPATH and LD_LIBRARY_PATH)
**Local wRMSD Fit source code**
* program: (python source code)
* file: /home/apps/python/scripts/local_wrmsd_v07.py
* www: carlson web site
* required: same as Global
**wRMSD input preparation**
* program: (python source code)
* file: /home/apps/python/scripts/prep_for wrmsd.py
* www: carlson web site
**PyMOL Rendering plug-in**
* program: (python source code)
* file: /home/apps/python/scripts/rendering.[py|pyc]
* www: carlson web site
====== Mesa ======
* program: open source implementation of the OpenGL specification, a system for rendering 3D graphics; v 6.5.1-7.7
* install dir: (rpm based, ''rpm -ql package_name'')
* mesa-GL
* mesa-libGL
* mesa-libGL-devel
* mesa-libGLU
* mesa-libGLU-devel
* www: http://www.mesa3d.org
====== MGLTools ======
centos6
* program: MGLTools v 1.5.7 (''pmv'',''adt'',''vision'',''pythonsh'')
* install dir: /home/apps/CENTOS6/mgltools/1.5.7
* docs: http://mgltools.scripps.edu/documentation
* www: http://mgltools.scripps.edu/
below is rhel5
* program: MGLTools v 1.5.2 (''pmv'',''adt'',''vision'',''pythonsh'')
* install dir: /home/apps/mgltools/1.5.2
* docs: http://mgltools.scripps.edu/documentation
* www: http://mgltools.scripps.edu/
This package works with Autodock. This is a 64 bit installation against glibc 2.5+. You will need to adjust your PYTHONHOME and PYTHONPATH variables.
.cshrc:
set path = (/home/apps/mgltools/1.5.2/bin $path)
.bashrc
export PATH=/home/apps/mgltools/1.5.2/bin:$PATH
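The note above also mentions adjusting PYTHONHOME and PYTHONPATH. A minimal bash sketch; the ''MGLToolsPckgs'' subdirectory name is an assumption, verify it under the install dir:

```shell
# MGLTools ships its own Python; point the interpreter at the install tree
export PATH=/home/apps/mgltools/1.5.2/bin:$PATH
# assumed locations -- check the install dir before relying on them
export PYTHONHOME=/home/apps/mgltools/1.5.2
export PYTHONPATH=/home/apps/mgltools/1.5.2/MGLToolsPckgs:$PYTHONPATH
```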
====== VMD ======
Please try to run these on the appropriate host:
* 1.8.6 on ''greentail'' (centos 5.x) for interactive use
* 1.9.2 on ''cottontail2'' (centos 6.x) for interactive use
* set your PATH and LD_LIBRARY_PATH accordingly
* program: Visual Molecular Dynamics v1.8.6 or v1.9.2
* install dir: /home/apps/CENTOS/vmd/[1.8.6|1.9.2]
* plugins dir: /home/apps/CENTOS/vmd/[1.8.6|1.9.2]/plugins
* docs: http://www.ks.uiuc.edu/Research/vmd/current/docs.html
* www: http://www.ks.uiuc.edu/Research/vmd/
* ImageMagick (latest default version)
* NetPBM v10.47.05-11
* install dir: (RPM based: netpbm netpbm-devel netpbm-progs)
* docs: rpm -ql netpbm
====== MPlayer ======
* program: MPlayer Media Player (mplayer, mencoder)
* executable location: /home/apps/mplayer/2009-03-24 (SVN based)
* docs: http://linuxhelp.150m.com/installs/mplayer.htm
* www: http://www.mplayerhq.hu
Linked in via /home/apps/bin
====== PCASuite ======
* program: PCAsuite - Molecular Modeling and Bioinformatics Group
* executable location: /home/apps/bin/[pcazip, pcaunzip, pczdump]
* docs: http://mmb.pcb.ub.es/software/pcasuite.html
* www: http://mmb.pcb.ub.es
Note: Compiled against v10 intel compilers, 64 bit. Program clearly configured for 32 bit operation.
====== Perl ======
* **/usr/bin/perl**, version 5.8.8
* Additional modules installed:
* Bundle::CPAN
* IO::Compress::Bzip2
* Archive::Zip
* PAR
* PAR::Dist
* PAR::Packer
* PDL (Scientific computing using the Perl Data Language)
====== Chem Tools ======
* programs: gcmc.exe, mmc.exe , micm.exe
* install dir: /home/apps/bin
* no documentation, contact the Beveridge research group
Note: compiled with ''ifort -f77rtl -o name.exe name.f''
Warning: when compiling micm, ifort warns of 2-digit year.
====== Mead ======
* programs: consult the bin/ directory
* install dir: /home/apps/mead/2.2.8a
* web: http://www.stjuderesearch.org/apps/mead/
Note: configured with option --with-python which installed into /usr/lib/python2.4 (also copied to /home/apps/python/2.6.1/site_packages)
====== Mopac ======
* program: mopac.exe (run via command ./runmopac "jobname", expects a "jobname.dat" file)
* install dir: /home/apps/mopac/7
* example, docs: inside install_dir
* www: http://www.webmo.net/support/binaries_linux.html#mopac7
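Putting the invocation described above together, a sketch of a run; ''myjob'' is a placeholder job name, and copying the wrapper to a writable directory is an assumption about how you will want to work:

```shell
# copy the wrapper and your input to a directory you can write to
cp /home/apps/mopac/7/runmopac .
./runmopac myjob     # expects myjob.dat in the working directory
```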
====== NAMD ======
[[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:119|GPU version]]
Version 2.7b1 was compiled from source against OpenMPI (1.2+intel-10). You may have to add the OpenMPI lib/ directory to your LD_LIBRARY_PATH.
* program: ''namd2'' (as well as some others)
* install dir: /home/apps/namd/NAMD_2.7b1_Source/Linux-86_64-icc/
* docs: on web site
* www: http://www.ks.uiuc.edu/Research/namd/
The following two installs are binary distributions:
* program: ''namd2'' (as well as some others)
* install dir: /home/apps/namd/NAMD_2.7b1_Linux-x86_64
* docs: on web site
* www: http://www.ks.uiuc.edu/Research/namd/
* program: ''namd2'' (as well as some others)
* install dir: /home/apps/namd/NAMD_2.7b1_Linux-x86-TCP
* docs: on web site
* www: http://www.ks.uiuc.edu/Research/namd/
Note: version 2.6 is swallowtail's version compiled against //Topspin// MPI.
* program: ''namd2'' (as well as some others)
* install dir: /home/apps/namd/NAMD_2.6_Source/Linux-amd64-MPI/
* docs: on web site
* www: http://www.ks.uiuc.edu/Research/namd/
export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64
There is a sample page here: [[cluster:69|LINK]]
====== NAB ======
Note: this is the compiled copy of swallowtail and seems to find the libraries properly. NAB is now part of AmberTools so the newest ''nab'' program can be found in the ''/home/apps/amber/10+openmpi-1.2+intel-10/bin'' location.
* program: several programs in bin/ directories; main compiler ''nab''
* install dir: /home/apps/nab/5.1.2
* docs:doc and examples directories in install dir
* www: http://www.scripps.edu/mb/case/
Users should set the NABHOME environment variable. Programs were compiled with ''icc'' with the ''-mpi'' flag.
If using csh/tcsh add to ~/.cshrc
* source /home/apps/nab/5.1.2/env.csh
or if using sh/bash add to ~/.bashrc
* . /home/apps/nab/5.1.2/env.sh
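Once the environment script is sourced, compiling a NAB program might look like the sketch below; ''hello.nab'' is a hypothetical input file, and the ''-o'' flag is assumed to behave like the usual compiler flag:

```shell
. /home/apps/nab/5.1.2/env.sh   # sets NABHOME and PATH
nab -o hello hello.nab          # translate the NAB source to C and link it
./hello
```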
====== libsndfile ======
* program: C library for reading and writing files containing sampled sound.
* install dir: /home/apps/libsndfile/1.0.19
* docs: see below
* www: http://www.mega-nerd.com/libsndfile
====== Lame ======
* program: lame (high quality MPEG audio layer III (MP3) encoder)
* install dir: /home/apps/lame/398.2
* docs: see below
* www: http://lame.sourceforge.net
====== twoLame ======
* program: twolame (optimized MPEG audio layer II (MP2) encoder)
* install dir: /home/apps/twolame/0.3.10
* docs: see below
* www: http://www.twolame.org
====== ImageMagick ======
* program: ImageMagick (montage, animate, and more)
* install dir: /home/apps/imagemagick/[6.4.8-5 and 6.5.0.2]/bin
* docs: inside install_dir/share
* www: http://www.imagemagick.org/
In order to make this the default in your PATH, for example ...
export PATH=/home/apps/imagemagick/6.4.8-5/bin:$PATH
in your shell or add it to ~/.bashrc. The latest version is linked in via ''/home/apps/bin''
====== gdb ======
* program: Gnu Debugger v 6.8
* install dir: /usr/local/gdb/6.8 (linked via /usr/local/bin)
* docs: see below
* www: http://www.gnu.org/software/gdb/
====== ddd ======
* program: Data Display Debugger v 3.3.12
* install dir: /usr/local/ddd/3.3.12
* docs: see below
* www: http://www.gnu.org/software/ddd/
* program: Data Display Debugger v 3.3.9
* install dir: /usr/local/ddd/3.3.9
* docs: see below
* www: http://www.gnu.org/software/ddd/
====== Delphi ======
* program: delphi v4 release 1.1 (precompiled binaries)
* install dir: /home/apps/delphi/DELPHI_2004_LINUX
* docs: examples and data dirs inside install_dir
* www: http://wiki.c2b2.columbia.edu/honiglab_public/index.php/Main_Page
Note: Please read the license agreement.
====== Grace ======
* program: Grace v 5.1.22 (with X11, EPS, MIF, JPEG, PNG support)
* install dir: /usr/local/grace/5.1.22/grace ... default, linked via /usr/local/bin
* docs: user guide is located in doc/ inside install dir
* www: http://plasma-gate.weizmann.ac.il/Grace/
* program: Grace v 5.1.20 (with X11, EPS, MIF, JPEG, PNG support)
* install dir: /usr/local/grace/5.1.20/grace
* docs: user guide is located in doc/ inside install dir
* www: http://plasma-gate.weizmann.ac.il/Grace/
====== Mutt ======
* program: Mutt Mail User Agent v 1.5.18 (for those reading email locally)
* install dir: /usr/local
* docs: ''man mutt''
* www: http://www.mutt.org
====== Gnuplot ======
* program: Gnuplot v 4.2.2
* install dir: /home/apps/gnuplot/4.2.4 (linked in via /share/apps/bin)
* docs: ''man gnuplot''
* www: http://www.gnuplot.info
====== Qt ======
* program: Qt v 4.5.0
* install dir: /home/apps/qt/4.5.0
* docs: doc/html inside install_dir
* www: http://trolltech.com
* program: Qt v 4.3.0
* install dir: /home/apps/qt/4.3.0
* docs: doc/html inside install_dir
* www: http://trolltech.com
Make the programs available like so:
export PATH=/home/apps/qt/4.5.0/bin:$PATH
====== R & R/parallel ======
* https://edwards.sdsu.edu/research/c11-on-centos-6/
* C++11 on CentOS 6, two versions of gcc
* read the R-3.6.1_configure file in ''/share/apps/CENTOS6/R/R-3.6.1_configure''
install.packages('lme4', dependencies = TRUE)
install.packages('tidyverse', dependencies = TRUE)
install.packages('mice', dependencies = TRUE)
install.packages('rpart', dependencies = TRUE)
install.packages('party', dependencies = TRUE)
install.packages('caret', dependencies = TRUE)
install.packages('nnet', dependencies = TRUE)
install.packages('e1071', dependencies = TRUE)
install.packages('xgboost', dependencies = TRUE)
install.packages('kernlab', dependencies = TRUE)
install.packages('randomForest', dependencies = TRUE)
* R 3.6.1 (running against "packages", had to comment out libcurl version check)
* New java jdk-12.0.2 (by default in your path)
* packages lme4 and tidyverse installed (dependencies=TRUE)
* https://www.r-bloggers.com/what-are-the-best-machine-learning-packages-in-r/
* packages: mice, rpart, party, caret, randomForest, nnet, e1071, kernlab, xgboost
* note: had to make a link to omp.h from gcc 4.8.4 to OS native gcc...works?
export PATH=/share/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:/share/apps/CENTOS6/gcc/4.8.4/lib:$LD_LIBRARY_PATH
#export PATH=/share/apps/java/jdk-12.0.2/bin:$PATH
#export LD_LIBRARY_PATH=/share/apps/java/jdk-12.0.2/lib:$LD_LIBRARY_PATH
* R 3.5.1 (new libcurl package)
* Uses same old java home 1.7.0_21 (jdk or jre?), should remember to update
* package lme4 installed
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/3.5.1/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.5.1/lib64:$LD_LIBRARY_PATH
export PATH=/share/apps/java/jdk1.7.0_21/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/java/jdk1.7.0_21/lib:$LD_LIBRARY_PATH
* R 3.4.3 (major pain with small packages ahead of distro versions, consult)
* Uses JAVA_HOME: /home/apps/java/java (points to jdk1.7.0_21, should update that)
export PATH=/home/apps/CENTOS6/R/3.4.3/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.4.3/lib64:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export MKL_HOME=/share/apps/intel/composerxe/mkl
**Native CentOS R is most up to date on node swallowtail for interactive use. It also has a bunch of packages installed.**
Note: All versions were compiled with the BLAS and LAPACK libraries found in the Cluster Math Kernel Libraries.
* program: R v3.1.0 (default)
* install dir: /home/apps/R/3.1.0
* docs:
* www: http://www.r-project.org/
Set your environment as follows for 3.1.0
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export PATH=/share/apps/R/3.1.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/3.1.0/lib64/R/lib:$LD_LIBRARY_PATH
Set your environment as follows for 2.14.1
* contains modules: Igraph and Multicore
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export MKL_HOME=/share/apps/intel/composerxe/mkl
export PATH=/share/apps/R/2.14.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/2.14.1/lib64/R/lib:$LD_LIBRARY_PATH
* program: R v2.8.1
* install dir: /home/apps/R/2.8.1
* docs:
* www: http://www.r-project.org/
Set your environment as follows for 2.8.1
. /share/apps/intel/cce/10.0.025/bin/iccvars.sh
. /share/apps/intel/fce/10.0.025/bin/ifortvars.sh
export PATH=/share/apps/R/2.8.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
* program: Rparallel v 0.61
* install dir: /home/apps/RParallel
* docs:
* www: http://www.rparallel.org/
Please consult the README file in install dir to set your environment.
====== Galaxsee ======
* program: Galaxsee
* install dir: /home/apps/Gal
* docs: http://www.shodor.org/master/galaxsee/
* docs: http://bccd.cs.uni.edu/mediawiki/index.php/GalaxSee
Small program to show the concept of parallel programming using an n-body physics problem. Compiled against the old Topspin MPI.
export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64:$LD_LIBRARY_PATH
====== Dowser ======
* program: several programs in bin/ and bin/linux/ directories
* install dir: /home/apps/dowser
* docs: DOCS/ directory in install dir
* www: http://hekto.med.unc.edu:8080/HERMANS/software/DOWSER/index.html
Note: users of dowser should add one of the following lines ...
if using csh/tcsh add to ~/.cshrc
* source /home/apps/dowser/dowserinit.csh
or if using sh/bash add to ~/.bashrc
* . /home/apps/dowser/dowserinit.sh
====== Amber ======
** Amber 22 **
Details on how to run on CentOS7 can be found at
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#amber22
** Amber 20 **
* Two versions compiled,
* one for cuda 8.0/mpich3 for queue amber128
* one for queues mwgpu and exx96 (cuda 9.2/openmpi).
* Consult the run.amber20 script to set up environment
* /share/apps/CENTOS7/amber/amber20/run.amber20
* Both are installed in /usr/local/amber20
** AMBER 18 **
* **/share/apps/CENTOS7/amber/amber18**
* compiled on ''n33'' with a ''/usr/local/n37-cuda-9.2'' soft link
* view compile steps at /share/apps/CENTOS7/amber/amber18.install
* when submitting on ''exx96'' we're mixing
* 10.2 cuda drivers
* 9.2 toolkits
* some speed results below K20 vs RTX2080s, Amber 16 vs 18
* sample jobs at ~hmeij/k20redo/run.amber[16|18]
# compiled on n33
[hmeij@n33 amber18]$ ls -l bin/pmemd*
-rwxr-xr-x 1 hmeij its 3549803 Mar 30 10:19 bin/pmemd
lrwxrwxrwx 1 hmeij its 15 Mar 30 11:57 bin/pmemd.cuda -> pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 93895208 Mar 30 12:11 bin/pmemd.cuda_DPFP
-rwxr-xr-x 1 hmeij its 94734272 Mar 30 13:07 bin/pmemd.cuda_DPFP.MPI
lrwxrwxrwx 1 hmeij its 19 Mar 30 12:51 bin/pmemd.cuda.MPI -> pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its 75579744 Mar 30 11:57 bin/pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 76418760 Mar 30 12:51 bin/pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its 4304392 Mar 30 11:23 bin/pmemd.MPI
# some quick results
| Average timings for all steps:
~hmeij/k20redo/run.amber18 - mwgpu - K20
| ns/day = 2.84 seconds/ns = 30385.24
~hmeij/k20redo/run.amber18 - exx96 - RTX2080S
| ns/day = 11.66 seconds/ns = 7408.08
~hmeij/k20redo/run.amber16 - exx96 - RTX2080S
| ns/day = 7.09 seconds/ns = 12185.91
**AMBER 16**
* on node n78 in /usr/local
* cuda 8 for GTX1080Ti GPUs
* plumed2.3
**AMBER 16**
* program: v 16 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
* install dir: /home/apps/CENTOS6/amber/amber16
* includes AmberTools 17, python miniconda
* docs: doc dir inside install_dir, please read Amber17.pdf
* www: http://amber.scripps.edu
* sample script: ~hmeij/amber/cd/run.cpu
* no gpu capabilities, requires cuda 7.5+
# next line important for Amber python modules, miniconda
source /share/apps/CENTOS6/amber/amber16/amber.sh
export AMBERHOME=/share/apps/CENTOS6/amber/amber16
export PATH=$AMBERHOME/bin:$PATH
export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib/:$LD_LIBRARY_PATH
[[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:119|GPU version]]
**AMBER 12**
* program: v12 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
* install dir: /home/apps/amber/12cpuonly
* includes AmberTools 1.3
* docs: doc dir inside install_dir, please read Amber12.pdf
* www: http://amber.scripps.edu
Has been compiled using Intel v12 compilers and OpenMPI 1.4.4, hence you need the code below to set up your environment. This v12 version of Amber is for serial and parallel runs on CPUs only (not GPUs). The appropriate wrapper is the lava.openmpi.wrapper script. Scripts running the v9 and v10 versions of Amber are identical to this v12 version once paths have been adjusted.
export AMBERHOME=/share/apps/amber/12cpu-only
export PATH=$AMBERHOME/bin:$PATH
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
**AMBER 9**
* program: version 9 of sander.MPI, pmemd (compilation details in path)
* install dir: /home/apps/amber/9+openmpi-1.2+intel9
* docs: doc dir inside install_dir
* www: http://amber.scripps.edu
Please invoke with appropriate wrapper script.
Some related details about the Amber installation(s) and __how to run Amber jobs__ on the cluster can be found at this **[[cluster:42|Link]]**. This is old information but may help you along.
For sander.MPI //and// ''pmemd'' please add the following line to your ~/.bashrc file
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH
**AMBER 10**
* program: v 10 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
* install dir: /home/apps/amber/10+openmpi-1.2+intel-10
* includes AmberTools 1.2
* docs: doc dir inside install_dir, please read Amber10.pdf
* www: http://amber.scripps.edu
Please invoke with appropriate wrapper script.
Some related details about the Amber installation(s) and __how to run Amber jobs__ on the cluster can be found at this **[[cluster:42|Link]]**. This is old information but may help you along.
For sander.MPI //and// ''pmemd'' please add the following line to your ~/.bashrc file (I compiled 9 and 10 against same CMKL version to avoid confusion).
export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH
AmberTools 1.2 has been installed.
RIP Scripts has been installed, consult http://boscoh.com/rip
Location: /home/apps/amber/rip-scripts
In order to use, you will need to point to Python 2.6.1 which contains the Numpy egg.
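A minimal sketch of pointing at that Python; the paths are taken from the Python and NumPy sections above:

```shell
export PATH=/home/apps/python/2.6.1/bin:$PATH
export PYTHONHOME=/home/apps/python/2.6.1
# the numpy egg lives in this site-packages tree
export PYTHONPATH=/home/apps/python/2.6.1/lib/python2.6/site-packages:$PYTHONPATH
```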
====== Rosetta ======
(PyRosetta v2.011 is also installed)
Both versions below were compiled with /share/apps/python/2.6.1/bin/python ...
* program: Rosetta 3.5 (see compiler note below)
* install dir: /home/apps/rosetta/3.5
* others: database, biotools, demos, bin (add latter to $PATH)
* www: http://www.rosettacommons.org/
* docs online at https://www.rosettacommons.org/manuals/archive/rosetta3.5_user_guide/
Please read the license at **''/share/apps/rosetta/LICENSE-3.0''**
* program: Rosetta 3.0 (see compiler note below)
* install dir: /home/apps/rosetta/3.0
* others: database, biotools, demos, bin (add latter to $PATH)
* www: http://www.rosettacommons.org/
Please read the license at **''/share/apps/rosetta/LICENSE-3.0''**
Note: this is a 64-bit compilation using python 2.6.1 and therefore is an "unsupported" configuration.
* program: Rosetta++ 2.2.0 (see compiler note below)
* install dir: /home/apps/rosetta/2.2.0
* scripts, database & docs: inside install_dir
* www: http://www.rosettacommons.org/
Please read the license at **''/share/apps/rosetta/LICENSE-2.2.0''**
Note: this is a 64-bit compilation (done on swallowtail) using target ''gcc64'' and therefore is an "unsupported" configuration.
====== Gaussian ======
EMT64 x86_64 version (for mw256[fd], mw128, tinymem)
* program: Gaussian 16 (''g16'')
* install dir: /home/apps/gaussian/g16root
EMT64 x86_64 version (for hp12)
* program: Gaussian GDV H11.1
* install dir: /share/apps/gaussian/gdvh11
export gdvroot="/share/apps/gaussian/gdvh11"
. $gdvroot/gdv/bsd/gdv.profile
EMT64 x86_64 version (for swallowtail/petaltail)
* program: Gaussian 09 Revision (''g09'')
* install dir: /home/apps/gaussian/g09root
AMD64 x86_64 version (for sharptail)
* program: Gaussian 09 Revision (''g09'')
* install dir: /home/apps/gaussian/g09root_amd64
AMD64 x86_64 version (for sharptail)
* program: Linda 8.2 (''g09'')
* install dir: /home/apps/gaussian/g09root_amd64_linda
AMD64 & EMT64 x86_64 version (for both swallowtail/petaltail and sharptail clusters)
* program: GaussView v5
* install dir: 'gv' in all g09root directories listed above
Please note that if run from sharptail, you must explicitly enable X-forwarding when connecting to this host, like so
ssh -X sharptail
ENVIRONMENT
Users will want to add the following to their .login file:
setenv g09root "dir"
setenv GAUSS_SCRDIR "scr-dir"
source $g09root/g09/bsd/g09.login # sets up G09 run environment.
The ''dir'' in the first command is the location of the g09 directory.
For example, if the path to this directory is /usr/local/g09, then
set g09root to /usr/local. The ''scr-dir'' in the second command is
a directory to be used by default for Gaussian 09 scratch files. There
should be plenty of disk space available at this location.
Bourne-type shell users would use these commands instead of the
ones above:
g09root="dir"
GAUSS_SCRDIR="scr-dir"
export g09root GAUSS_SCRDIR
. $g09root/g09/bsd/g09.profile
DOCUMENTATION
* Gaussian 09 User's Reference:
* http://www.gaussian.com/g_tech/g_ur/g09help.htm
* G09 IOPs Reference:
* http://www.gaussian.com/g_tech/g_iops/iops_toc.htm
* G09 Linda:
* http://www.gaussian.com/g_prod/linda.htm
* GaussView 5 Reference:
* http://www.gaussian.com/g_tech/gv5ref/gv5ref_toc.htm
Note: in order to use this application you must be a member of group ''gaussian''. Please send request to the administrator.
Details about the Gaussian installation and __how to run Gaussian jobs__ on the cluster can be found at this **[[cluster:47|Link]]**. Please read that page.
====== Openbabel ======
You will need to add the bin/ and lib/ directories to your PATH and LD_LIBRARY_PATH environment variables (examples on this page)
For CentOS6.x (mw256[fd] queues), compiled with cmake/gcc
* program Openbabel (v 2.3.2)
* install dir: /share/apps/CENTOS6/openbabel/2.3.2
* docs: check web site
For RHEL5.x and CentOS5.x (hp12 queue), compiled with ICC/ifort
* program Openbabel (v 2.2.1)
* install dir: /share/apps/openbabel/2.2.1
* docs: check web site
* program Openbabel (v 2.1.1)
* install dir: /share/apps/openbabel/2.1.1
* docs: check web site
====== AutoDock ======
* program: AutoDock 4.0.1 and AutoGrid 4.0.0 (''autodock4, autogrid4'')
* install dir: /home/apps/autodock
* docs: unknown, check web site
* www: http://autodock.scripps.edu/
Note. Requires Openbabel. You need to add the following lines to your ~/.bashrc file or program.
export PATH=/home/apps/autodock:/home/apps/openbabel/2.1.1/bin:$PATH
export PYTHONPATH=/home/apps/openbabel/2.1.1/site-packages:$PYTHONPATH
export PERL5LIB=/home/apps/openbabel/2.1.1/lib/perl:$PERL5LIB
export LD_LIBRARY_PATH=/home/apps/openbabel/2.1.1/lib:$LD_LIBRARY_PATH
Note. The perl plugin will not compile. A message has been sent to the list.\\
--- //[[hmeij@wesleyan.edu|Meij, Henk]] 2008/01/25 15:39//
====== MPI ======
===== OpenMPI =====
* program: OpenMPI v 3.1.3
* install dir: /share/apps/CENTOS6/openmpi/3.1.3
* program: OpenMPI v 1.10.2
* install dir: /share/apps/CENTOS6/openmpi/1.10.2
* CHECKPOINTING SUPPORT (use in combination with BLCR; consult [[cluster:148|BLCR Checkpoint in OL3]])
* program: OpenMPI 1.6.5
* install dir: /share/apps/CENTOS6/openmpi-1.6.5.cr
* program: OpenMPI v 1.8.4 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/CENTOS6/openmpi/1.8.4/bin/ (compiled with gcc 4.8.4 !!!)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
* program: OpenMPI v 1.4.4 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/openmpi/1.4.4+intel-12/bin/ (compiled with icc/ifort intel v12)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
* program: OpenMPI v 1.4.3 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/openmpi/1.4.3+intel-12/bin/ (compiled with icc/ifort intel v12)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
For v1.4.3/v1.4.4 you will have to set:
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/intel/composerxe/bin/ifortvars.sh intel64
#also change (not sure it matters), LD_LIBRARY_PATH is modified by above scripts
export MKL_HOME=/share/apps/intel/composerxe/mkl
* program: OpenMPI v 1.3.1 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/openmpi/1.3.1+intel-10/bin/ (compiled with icc/ifort intel v10)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
* program: OpenMPI v 1.2 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...) DEFAULT
* install dir: /home/apps/openmpi/1.2+intel-9/bin/ (compiled with icc/ifort intel v9)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
* program: OpenMPI v 1.2 (default)(programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/openmpi/1.2+intel-10/bin/ (compiled with icc/ifort intel v10)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
* program: OpenMPI v 1.2 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /opt/openmpi/gnu/bin/ (compiled with default gcc 4.1.2)
* docs: inside install dir in dirs man & share
* www: http://www.open-mpi.org/
===== MVAMPICH =====
* program: Mvapich v 0.9.9 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/mvampich/0.9.9 (compiled with intel v 10)
* docs: inside install dir in dirs man & doc
* www: http://mvapich.cse.ohio-state.edu/index.shtml
===== LAM =====
** Warning ** end-of-life, please use OpenMPI
* program: LAM v 7.1.3 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /home/apps/lam/7.1.3 (includes xmpi, compiled with intel v 10)
* docs: inside install dir in dirs man & doc
* www: http://www.lam-mpi.org/
===== Other Gnu MPI Flavors =====
* program: Mpich1 v 1.2.7 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /opt/mpich1/gnu/bin/ (compiled with default gcc 4.1.2)
* program: Mpich2 v 1.0.6 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /opt/mpich2/gnu/bin/ (compiled with default gcc 4.1.2)
* program: Mvapich1 v 1.2.7 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /opt/mvapich1/gnu/bin/ (compiled with default gcc 4.1.2)
* program: Mvapich2 v 2.1.0 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun ...)
* install dir: /opt/mvapich2/gnu/bin/ (compiled with default gcc 4.1.2)
These are quite old; we suggest you do not use them.
===== OFED =====
* program: OFED v 1.0.0
* install dir: /opt/ofed/gnu/bin/mpi-selector (compiled with default gcc 4.1.2)
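The mpi-selector utility that ships with OFED lets you list and pick a default MPI stack. A guarded sketch (the stack name is hypothetical; `mpi-selector --list` shows the real ones registered on a node):

```shell
# mpi-selector is only present where OFED installed it; guard the calls.
if command -v mpi-selector >/dev/null 2>&1; then
    mpi-selector --list                  # show registered MPI stacks
    # mpi-selector --set openmpi-1.2     # hypothetical name; sets your default
    MPI_SEL=present
else
    MPI_SEL=absent
fi
echo "mpi-selector: $MPI_SEL"
```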
===== TopSpin =====
* program: mpicc, mpirun etc
* install dir: /usr/local/topspin
Note: this is the mpich MPI suite of libraries that came with OCS 4.1.1 (swallowtail), in case users compiled programs against these libraries. It is no longer supported in OCS 5.1; we suggest using the latest OpenMPI flavor.
====== Java ======
* program: java, javac etc
* install dir: /home/apps/java/ holding versions 1.4.2, 1.5.0 and 1.6.0
* docs: inside install_dir
* www: http://java.sun.com/
Versions are installed with the most recent linked into users' environments via /share/apps/bin. JAVAHOME points to ''/home/apps/java/java'' which is the latest version. To set another flavor as your default java, here is an example:
# update
[hmeij@greentail52 ~]$ java --version
java 20.0.2 2023-07-18
export JAVAHOME=/home/apps/java/j2sdk1.4.2_17
export PATH=$JAVAHOME/bin:$PATH
export CLASSPATH=$JAVAHOME/lib:$CLASSPATH
drwxrwxr-x 9 root root 4096 Aug 20 2007 j2sdk1.4.2_07
drwxr-xr-x 9 root root 4096 Oct 5 06:05 jdk1.5.0_14
drwxr-xr-x 9 root root 4096 Jun 14 2007 jdk1.6.0_02
drwxr-xr-x 7 root root 4096 Nov 10 2005 jre1.5.0_06
drwxr-xr-x 20 root root 4096 Feb 20 11:20 netbeans-6.0.1
* program: NetBeans IDE 6.0 (netbeans)
* install dir: /share/apps/java/
* docs:
* www: http://www.netbeans.org/
====== Python2 ======
**Note 2.7 branch is end of life 01/01/2020**
Note: since the latest update to CentOS 6.10 the native python in the OS is /usr/bin/python v2.6.6
* Python 2.7.15
* compiled with --enable-shared
* numpy and scipy packages, plus:
* Successfully installed Send2Trash-1.5.0 backports-abc-0.5 futures-3.3.0 ipaddress-1.0.22
* ipykernel-4.10.1 jupyter-client-5.3.4 jupyterlab-0.33.12 jupyterlab-launcher-0.11.2
* notebook-5.7.8 prometheus-client-0.7.1 python-dateutil-2.8.0 singledispatch-3.4.0.3
* subprocess32-3.5.4 terminado-0.8.2 tornado-5.1.1
export PATH=/share/apps/CENTOS6/python/2.7.15/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.15/lib:$LD_LIBRARY_PATH
* 2.7.15 embedded in miniconda2 installation, see Galario software entry.
* 2.7.10 two versions, the one in ''blcr_soft'' can be checkpointed
* /share/apps/CentOS6/python/2.7.10
* /share/apps/CENTOS6/blcr_soft/python/2.7.10
--- //[[hmeij@wesleyan.edu|Henk]] 2017/02/06 10:47//
Added to /share/apps/CENTOS6/blcr_soft/python/2.7.10 ...
export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH
...or...
export PYTHONHOME=/share/apps/CENTOS6/blcr_soft/python/2.7.10
export PYTHONPATH=/home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages
export PATH=$PYTHONHOME/bin:$PATH
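PYTHONHOME points the interpreter at its install prefix and PYTHONPATH adds the site-packages tree to the import path. A quick way to confirm which build you are about to run (a sketch, nothing cluster-specific):

```shell
# After exporting the variables, verify PATH resolution order.
export PYTHONHOME=/share/apps/CENTOS6/blcr_soft/python/2.7.10
export PYTHONPATH=$PYTHONHOME/lib/python2.7/site-packages
export PATH=$PYTHONHOME/bin:$PATH
echo "first PATH entry: ${PATH%%:*}"
# on a node: which python && python -c 'import sys; print(sys.prefix)'
```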
REBOUND: https://github.com/hannorein/rebound ; http://rebound.readthedocs.io/en/latest/
REBOUNDx: https://github.com/dtamayo/reboundx
Installing collected packages: pandas
Successfully installed pandas-0.19.2
Added to /share/apps/CENTOS6/python/2.7.10
--- //[[hmeij@wesleyan.edu|Henk]] 2019/03/15 11:14//
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg
Adding scipy 0.16.1 to easy-install.pth file
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/argparse-1.4.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/readline-6.2.4.1-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/emcee-3.0rc2-py2.7.egg
Adding mpi4py 2.0.0 to easy-install.pth file
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.0-py2.7-linux-x86_64.egg
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scandir-1.10.0-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/subprocess32-3.5.3-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas-0.24.2-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/seaborn-0.9.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.16.2-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/os0-0.2.14-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pickle_utils-0.1-py2.7.egg
Added to /share/apps/CENTOS6/blcr_soft/python/2.7.10
--- //[[hmeij@wesleyan.edu|Henk]] 2019/03/16 08:40//
Successfully installed argparse-1.4.0
Successfully installed readline-6.2.4.1
Successfully installed numpy-1.16.2
Successfully installed scipy-1.2.1
emcee in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/emcee-2.1.0-py2.7.egg (2.1.0)
mpi4py in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (2.0.0)
pandas in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (0.19.2)
pytz>=2011k in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from pandas) (2016.3)
numpy>=1.7.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from pandas) (1.16.2)
python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from pandas) (2.5.2)
six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->pandas) (1.10.0)
Installed /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg
matplotlib in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.1-py2.7-linux-x86_64.egg (1.5.1)
numpy>=1.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from matplotlib) (1.16.2)
python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from matplotlib) (2.5.2)
pytz in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from matplotlib) (2016.3)
cycler in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/cycler-0.10.0-py2.7.egg (from matplotlib) (0.10.0)
pyparsing!=2.0.0,!=2.0.4,>=1.5.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pyparsing-2.1.1-py2.7.egg (from matplotlib) (2.1.1)
six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->matplotlib) (1.10.0)
Installing collected packages: seaborn
Successfully installed subprocess32-3.5.3
astropy in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/astropy-1.1.2-py2.7-linux-x86_64.egg (1.1.2)
numpy>=1.6.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from astropy) (1.16.2)
Successfully installed pathlib2-2.3.3 scandir-1.10.0
Successfully installed os0-0.2.14
Successfully installed pickle-utils-0.1
* program: python v 2.7.9
* install dir: /home/apps/CentOS6/python/2.7.9
* docs: http://www.python.org/doc/
* www: http://www.python.org/
Note: v 2.7.9 also contains Numpy 1.9.2 and Cython 0.22 and MACS2 2.1.0
Note: Should not be used on queue hp12
Note: Successfully installed insane-1.0rc1 simopt-0.1
export PYTHONHOME=/home/apps/CentOS6/python/2.7.9
export PYTHONPATH=/home/apps/CentOS6/python/2.7.9/site-packages:$PYTHONPATH
export PATH=$PYTHONHOME/bin:$PATH
* program: python v 2.7.2
* install dir: /home/apps/python/2.7.2
* docs: http://www.python.org/doc/
* www: http://www.python.org/
v 2.7.2 also contains pyodbc which relies on unixODBC and a mysql connector, consult section unixODBC.
v 2.7.2 also contains:
* Pythoscape: http://www.rbvi.ucsf.edu/trac/Pythoscape
export PYTHONHOME=/home/apps/python/2.7.2
export PYTHONPATH=/home/apps/python/2.7.2/site-packages
export PATH=$PYTHONHOME/bin:$PATH
* program: python v 2.6.1
* install dir: /home/apps/python/2.6.1
* docs: http://www.python.org/doc/
* www: http://www.python.org/
- setuptools egg (pknots)
- MySQL-python egg (pknots)
- Pmw egg (PyMol)
* program: python v 2.6.1 with Tcl/Tk and Togl required for PyOpenGL (see lammps section)
* install dir: /home/apps/python/2.6.1.tcltk
* docs: http://www.python.org/doc/
* www: http://www.python.org/
- Although older versions of Tcl/Tk and Togl exist in the RedHat OS, newer versions are also installed in /usr/local
- python 2.6.1 was recompiled against the above and staged in install dir
- next setuptools was installed in site-packages
- setup.py then installed the PyOpenGL package
* program: python v 2.5.4
* install dir: /home/apps/python/2.5.4
* docs: http://www.python.org/doc/
* www: http://www.python.org/
* program: default python v 2.4.3
* install dir: /usr/bin
* docs: http://www.python.org/doc/
* www: http://www.python.org/
Note: there is another version of Python (2.5) hiding inside MGLtools at ''/home/apps/mgltools/1.5.2/bin/python''. Read the section on MGLtools for more information.
====== Compilers ======
===== G95 Compiler =====
* program: G95 v 0.9.1 compiler (g95) compiled with GCC v 4.1.2 (core)
* install dir: /home/apps/g95/0.9.1
* docs: G95Manual.pdf inside install_dir
* www: http://www.g95.org
===== GNU Compilers =====
For all these installations you will need to set $PATH and $LD_LIBRARY_PATH; look for samples on this page ...
* program: GCC v 9.2.0
* install dir: /share/apps/CENTOS6/gcc/9.2.0
* note: libgomp disabled for nanopolish compile attempt
wget ftp://ftp.nluug.nl/mirror/languages/gcc/releases/gcc-9.2.0/gcc-9.2.0.tar.gz
tar zxvf gcc-9.2.0.tar.gz
cd gcc-9.2.0
./contrib/download_prerequisites
./configure --prefix=/share/apps/CENTOS6/gcc/9.2.0 --disable-libgomp --disable-multilib
make
make install
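After `make install` finishes, put the new toolchain first on your paths; a sketch mirroring the 4.8.4 example further down:

```shell
# Use the freshly installed GCC 9.2.0 (prefix from the configure line above).
GCC_PREFIX=/share/apps/CENTOS6/gcc/9.2.0
export PATH=$GCC_PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$GCC_PREFIX/lib64:$LD_LIBRARY_PATH
# on a node: which gcc && gcc --version
```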
* program: GCC v 4.8.4
* install dir: /home/apps/CENTOS6/gcc/4.8.4
* docs: man inside install_dir
* www: http://gcc.gnu.org
Note: GMP, MPFR and MPC are built into GCC. There are also standalone packages.
[hmeij@petaltail gcc]$ export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
[hmeij@petaltail gcc]$ export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
[hmeij@petaltail gcc]$ which gcc
/share/apps/CENTOS6/gcc/4.8.4/bin/gcc
[hmeij@petaltail gcc]$ gcc test.c
[hmeij@petaltail gcc]$ ./a.out
Hello world
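The `test.c` used in the session above is not shown on this page; any minimal hello-world will do, for example (written here with a heredoc, and compiled only if a gcc is on the PATH):

```shell
# Create a minimal test.c, then compile and run it if gcc is available.
cat > test.c <<'EOF'
#include <stdio.h>
int main(void) { printf("Hello world\n"); return 0; }
EOF
if command -v gcc >/dev/null 2>&1; then
    gcc test.c -o a.out && ./a.out
fi
```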
* program: GCC v 4.8.0
* install dir: /home/apps/gcc/4.8.0
* docs: man inside install_dir
* www: http://gcc.gnu.org
Note: Used the gcc-build approach, so no separate references to GMP, MPFR and MPC are needed when using GCC.
* program: GCC v 4.7.2
* also needs /share/apps/gcc/gmp/4.3.2
* and /share/apps/gcc/mpfr/2.4.2
* or /share/apps/gcc/mpc/0.8.1
* install dir: /home/apps/gcc/4.7.2
* docs: man inside install_dir
* www: http://gcc.gnu.org
Note: Used the gcc-build approach, so no separate references to GMP, MPFR and MPC are needed when using GCC.
* program: GCC v 4.3.3
* install dir: /home/apps/gcc/4.3.3
* docs: man inside install_dir
* www: http://gcc.gnu.org
* program: GCC v 4.2.0
* install dir: /home/apps/gcc/4.2.0
* docs: man inside install_dir
* www: http://gcc.gnu.org
* program: default GCC v 4.1.2 compilers (gcc, g++, no fortran)
* docs: rpm -ql gcc
* program: GCC v 4.0.3
* install dir: /home/apps/gcc/4.0.3 (core only)
* docs: man inside install_dir
* www: http://gcc.gnu.org
===== Intel =====
UPDATE 2021
latest versions of icc/icx and ifort/ifx are inside OneAPI; search this page
UPDATE 2016
parallel studio cluster edition installed
* 2 ifort and 2 icc float licenses
* icc/ifort are in your path
* for location do something like 'env | grep ^INTEL'
UPDATE, 3 years later, we bought a new version, things change
* ifort and icc are all together now with MKL and other stuff ...
* consult the README file in /share/apps/intel/composerxe/
* you will need to source the environment as explained in the README files
* the default icc/ifort will remain the old version for now
* ...
* can be used on all login nodes as well ...
* install dir: /home/apps/intel/[cc|cce|fc|fce|idbe]/version/bin
* cc|fc 32 bit compilers
* cce|fce 64 bit compilers
* v 09.1.043 was our evaluation copy and will remain
* v 09.1.047 was our first installed version
* v 10.0.25 is the default in your $PATH
* docs: man 'compiler_name' and 'compiler_name -help'
* also in install_dir/compiler_name/version/doc
* http://software.intel.com/en-us/intel-sdp-home/
====== Cluster Math Kernel Libraries ======
* latest version in OneAPI, search this page
* see Intel parallel studio
* /home/apps/intel/cmkl
* 9.1.021 ... [old]
* 9.1.023 ...
* 10.0.011 ... default, set in $MKL_HOME
====== Intel Threading Building Blocks ======
* latest version is in OneAPI, search this page
* see Intel parallel studio
* tied to our Intel C compiler license ... v 2.0
* relevant files sourced at login (setting up LIBRARY_PATH and DYLD_LIBRARY_PATH)
* install_dir: /home/apps/intel/tbb/2.0
* docs: docs/ directory inside install_dir
* web: [[http://www.intel.com/cd/software/products/asmo-na/eng/threading/threadbuildblocks/294797.htm|TBB]]
====== Matlab ======
* installed R2023a
* installed R2022a
* installed R2021a
* program: Matlab, link to binary is 'matlab' in ''/share/apps/bin''
* R2020b, campus wide license so no more restrictions on licenses
* No Distributed Computing Engine or Matlab Parallel (for now)
* install dir: /share/apps/CENTOS7/matlab/R2020bISO
* docs & www: http://mathworks.com
* CentOS7 required for use on queues mwgpu, amber128 and exx96
* For the GUI, launch on greentail52
Note:
* Matlab can do GPU computations
* http://www.mathworks.com/discovery/matlab-gpu.html
* Matlab can do Hadoop computations
* http://www.mathworks.com/help/matlab/mapreduce.html
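For non-interactive runs (e.g. inside a scheduler job) Matlab can also be driven from the command line; a sketch, assuming `matlab` resolves via /share/apps/bin and using the `-batch` flag of the recent releases listed above (`myscript.m` is a hypothetical script name):

```shell
# Non-interactive invocation of a Matlab script (hypothetical name myscript.m);
# -batch runs the script without the desktop and exits with its status.
MATLAB_CMD='matlab -batch "myscript"'
echo "$MATLAB_CMD"   # place this command in your scheduler job script
```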
====== Stata ======
* program stata[-mp|-se] in your $PATH, version 18
* v17 will still run; add '17' to the command, like 'stata-mp17'
* so that makes for 12 licenses 8-)
* program: Stata v 17, 6-user network license, includes stata, stata-mp, stata-se
* install dir: /share/apps/stata/
* docs: at web site
* http://stata.com
export PATH=/share/apps/stata/17:$PATH
\\
**[[cluster:0|Back]]**