
Software

The listings below are updated as software is installed. The location is /share/apps/ or /home/apps/ (the same location, two names).

Please note that /usr/local/bin is local to the head node, while /home/apps/bin (which is in your default PATH) is available cluster wide.

Most of the compilations were performed with Intel's icc/ifort compilers.

Old compilations, mostly located in /share/apps, were built under RHEL 5.5.
Newer compilations (since summer 2013) are built under CentOS 6.10,
on either cottontail2 or node n33, and are located at /share/apps/CENTOS6.
Some compilations require CentOS 7.x and are in /share/apps/CENTOS7.

Deep Learning Software Stack

  • queue exx96 provides access to docker containers and
    • digits, tensorflow, pytorch, caffe, rapidsai
    • cuda (several versions)
    • jupyterlab
    • portainer

Paleogenomics Tools

Samtools is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:

  • Samtools: Reading/writing/editing/indexing/viewing SAM/BAM/CRAM format
  • BCFtools: Reading/writing BCF2/VCF/gVCF files and calling/filtering/summarising SNP and short indel sequence variants
  • HTSlib: A C library for reading/writing high-throughput sequencing data
# add to environment
export PATH=/share/apps/CENTOS6/samtools/1.10/bin:$PATH
export PATH=/share/apps/CENTOS6/bcftools/1.10.2/bin:$PATH
export PATH=/share/apps/CENTOS6/htslib/1.10.2/bin:$PATH
# does not seem to work against htslib version above...
export HTSSRC=/share/apps/CENTOS6/ANGSD/htslib
export PATH=/share/apps/CENTOS6/ANGSD/angsd:$PATH
  • BEAST 2 is a cross-platform program for Bayesian phylogenetic analysis of molecular sequences.
# with local java jre1.8.0_212
export PATH=/share/apps/CENTOS6/beast/2.6.0/bin:$PATH

CD-HIT

CD-HIT is a widely used program for clustering biological sequences to reduce sequence redundancy and improve the performance of other sequence analyses.
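
A minimal usage sketch (file names are placeholders; the install location is not listed on this page, so adjust your PATH first):

# cluster at 90% sequence identity; -n 5 is the recommended word size for that threshold
cd-hit -i input.fasta -o clustered.fasta -c 0.9 -n 5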

OpenStructure

Open-Source Computational Structural Biology Framework

OpenStructure relies on these applications and Python 2.7.15 (see python2 section).

-- Boost version: 1.68.0
-- Found the following Boost libraries:
--   python
--   unit_test_framework
--   filesystem
--   system
--   iostreams
--   program_options
--   regex
--   thread
-- Found ZLIB: /share/apps/CENTOS6/zlib/1.2.11/lib/libz.so (found version "1.2.11")
-- Found PNG: /share/apps/CENTOS6/lpng/1.6.37/lib/libpng.so (found version "1.2.49")
-- Found EIGEN: /share/apps/CENTOS6/eigen/3.2.10
-- Numpy headers found
-- Found FFTW: /share/apps/CENTOS6/fftw/3.3.8-enable-single/lib/libfftw3f.so
-- Found TIFF: /share/apps/CENTOS6/tiff/4.0.9/lib/libtiff.so (found version "4.0.9")


# setup the environment
egrep ^export /share/apps/CENTOS6/openstructure/1.9-install.txt

ost --help
Usage:

    ost [ost options] [script to execute] [script parameters]

or
    ost [action name] [action options]

Remarkably complex compilations.

PyCharm EDU

An Easy and Professional Tool to Learn & Teach Programming with Python

Paml

Phylogenetic Analysis by Maximum Likelihood (PAML)…programs for phylogenetic analyses of DNA or protein sequences using maximum likelihood.

Spyder

Python3

Added to both versions
pycuda
Henk 2020/02/18 08:45

Added to both 3.5.2 versions
Successfully installed
boto-2.49.0 boto3-1.12.0 botocore-1.15.0 gensim-3.8.1 jmespath-0.9.4 s3transfer-0.3.3 smart-open-1.9.0
joblib, scikit-learn, sklearn

Requirement already satisfied:
keras keras-preprocessing keras-applications
tensorflow (requires centos7)
Henk 2020/02/17 13:02

Added to both 3.5.2 versions
jupyter-notebook only runs in 3.5.2 (non-mpi)
Henk 2019/07/23 09:04

Successfully installed
Send2Trash-1.5.0
ipywidgets-7.5.0
jupyter-1.0.0
jupyter-client-5.3.1
jupyter-console-6.0.0
notebook-6.0.0
prometheus-client-0.7.1
terminado-0.8.2
widgetsnbextension-3.5.0
json5-0.8.5 
jupyterlab-1.0.2 
jupyterlab-server-1.0.0 
tornado-6.0.3

Added the following modules, all needed for Spyder, to both python versions 3.5.2 and 3.5.2-mpi.
Henk 2019/04/25 09:21

Successfully installed PyQT5-5.12.1 PyQt5-sip-4.19.15

Successfully installed 
Jinja2-2.10.1 MarkupSafe-1.1.1 alabaster-0.7.12 asn1crypto-0.24.0 astroid-2.2.5 
attrs-19.1.0 babel-2.6.0 backcall-0.1.0 bleach-3.1.0 certifi-2019.3.9 cffi-1.12.3 
chardet-3.0.4 cloudpickle-0.8.1 cryptography-2.6.1 decorator-4.4.0 defusedxml-0.6.0 
docutils-0.14 entrypoints-0.3 idna-2.8 imagesize-1.1.0 ipykernel-5.1.0 
ipython-7.4.0 ipython-genutils-0.2.0 isort-4.3.17 jedi-0.13.3 jeepney-0.4 
jsonschema-3.0.1 jupyter-client-5.2.4 jupyter-core-4.4.0 keyring-19.0.1 
lazy-object-proxy-1.3.1 mccabe-0.6.1 mistune-0.8.4 nbconvert-5.4.1 nbformat-4.4.0 
numpydoc-0.9.1 packaging-19.0 pandocfilters-1.4.2 parso-0.4.0 pexpect-4.7.0 
pickleshare-0.7.5 prompt-toolkit-2.0.9 psutil-5.6.1 ptyprocess-0.6.0 pycodestyle-2.5.0 
pycparser-2.19 pyflakes-2.1.1 pygments-2.3.1 pylint-2.3.1 pyqtwebengine-5.12.1 
pyrsistent-0.14.11 pyzmq-18.0.1 qtawesome-0.5.7 qtconsole-4.4.3 qtpy-1.7.0 requests-2.21.0 
rope-0.14.0 secretstorage-3.1.1 snowballstemmer-1.2.1 sphinx-2.0.1 sphinxcontrib-applehelp-1.0.1 
sphinxcontrib-devhelp-1.0.1 sphinxcontrib-htmlhelp-1.0.2 sphinxcontrib-jsmath-1.0.1 
sphinxcontrib-qthelp-1.0.2 sphinxcontrib-serializinghtml-1.1.3 
spyder-3.3.4 spyder-kernels-0.4.4 
testpath-0.4.2 tornado-6.0.2 traitlets-4.3.2 typed-ast-1.3.4 urllib3-1.24.2 
wcwidth-0.1.7 webencodings-0.5.1 wrapt-1.11.1 wurlitzer-1.0.2
  • program: python v 3.5.2 mpi enabled
    • /share/apps/CENTOS6/python/3.5.2-mpi (used in astronomy with emcee and mpi4py)

Environment

  export PATH=/share/apps/CENTOS6/openmpi/3.1.3/bin:$PATH
  export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/3.1.3/lib:$LD_LIBRARY_PATH
  which mpicc
/share/apps/CENTOS6/openmpi/3.1.3/bin/mpicc

CC=mpicc LDSHARED="mpicc -shared" ./configure --prefix=/share/apps/CENTOS6/python/3.5.2-mpi

export PATH=/share/apps/CENTOS6/python/3.5.2-mpi/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2-mpi/lib:$LD_LIBRARY_PATH


Packages

Successfully installed nltk-3.4
Successfully installed numpy-1.16.2
Successfully installed scipy-1.2.1
Successfully installed Cython-0.29.6
Successfully installed jenkspy-0.1.5
Successfully installed pandas-0.24.2 python-dateutil-2.8.0 pytz-2018.9

Successfully installed argparse-1.4.0
Installing collected packages: readline
Successfully installed mpi4py-3.0.1
Successfully installed emcee-2.2.1
Installed /home/apps/CENTOS6/python/3.5.2-mpi/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg
Successfully installed cycler-0.10.0
Successfully installed kiwisolver-1.0.1
Successfully installed matplotlib-3.0.3 pyparsing-2.3.1
Successfully installed seaborn-0.9.0
Successfully installed subprocess32-3.5.3
Successfully installed astropy-3.1.2
Successfully installed pathlib2-2.3.3
Successfully installed scandir-1.10.0
Successfully installed os0-0.2.14
Successfully installed pickle-utils-0.1
  • program: python v 3.5.2 not mpi enabled
    • /share/apps/CENTOS6/python/3.5.2 (heavily used in DNA sequencing tools)
    • python3, pip3, nltk v3.2.1, numpy v1.11.2
    • Cython-0.27.3, jenkspy-0.1.4
    • Pandas-0.21.0
    • OpenPyXL
    • and also …
Successfully installed emcee-2.2.1
Requirement already satisfied: mpi4py in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (2.0.0)

Successfully installed argparse-1.4.0
Successfully installed readline-6.2.4.1
Requirement already satisfied: pandas in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (0.21.0)
Requirement already satisfied: python-dateutil>=2 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2.6.1)
Requirement already satisfied: pytz>=2011k in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (2017.3)
Requirement already satisfied: numpy>=1.9.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from pandas) (1.15.4)
Requirement already satisfied: six>=1.5 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from python-dateutil>=2->pandas) (1.11.0)

Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/pandas2-0.0.0-py3.5.egg
Successfully installed cycler-0.10.0 kiwisolver-1.0.1 matplotlib-3.0.3 pyparsing-2.3.1 seaborn-0.9.0

Installed /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/subprocess32-3.5.3-py3.5.egg

Successfully installed astropy-3.1.2
Successfully installed pathlib2-2.3.3

Requirement already satisfied: matplotlib in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (3.0.3)
Requirement already satisfied: cycler>=0.10 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (0.10.0)
Requirement already satisfied: numpy>=1.10.0 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.15.4)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.3.1)
Requirement already satisfied: python-dateutil>=2.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (2.6.1)
Requirement already satisfied: kiwisolver>=1.0.1 in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from matplotlib) (1.0.1)
Requirement already satisfied: six in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from cycler>=0.10->matplotlib) (1.11.0)
Requirement already satisfied: setuptools in /home/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages (from kiwisolver>=1.0.1->matplotlib) (20.10.1)

export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH

Henk 2017/02/06 10:45 Added to 3.5

Copying mpi4py.egg-info to /share/apps/CENTOS6/python/3.5.2/lib/python3.5/site-packages/mpi4py-2.0.0-py3.5.egg-info

but I had to change the code, and I do not know what the impact might be:

src/mpi4py.MPI.c
/*  __pyx_v_rd = MPI_CONVERSION_FN_NULL; */
  __pyx_v_rd = 0;

Use this environment, which points to the old MPI version:

 export MPI_DIR=/share/apps/openmpi/1.2+intel-10
 export PATH=/share/apps/CENTOS6/python/3.5.2/bin:$PATH
 export PATH=/share/apps/openmpi/1.2+intel-10/bin:$PATH
 export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/3.5.2/lib:$LD_LIBRARY_PATH
 export LD_LIBRARY_PATH=/share/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH

These versions of python can be set with your PYTHONPATH and PYTHONHOME env variables. For syntax, see the Autodock section.

Kallisto & Trinity & FastQC

FastQC: a quality control tool for high-throughput sequence data.

kallisto is a program for quantifying abundances of transcripts from bulk and single-cell RNA-Seq data, or more generally of target sequences using high-throughput sequencing reads. A usage sketch for both tools follows the package listing below.

export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"

  bwa:             0.7.17-h84994c4_5    bioconda
  perl:            5.26.2-h14c3975_1002 conda-forge

  bowtie             bioconda/linux-64::bowtie-1.2.2-py27h2d50403_1
  tbb                conda-forge/linux-64::tbb-2019.4-h6bb024c_0

  star               bioconda/linux-64::star-2.7.0d-0

  hdf5               conda-forge/linux-64::hdf5-1.10.3-hba1933b_1001
  kallisto           bioconda/linux-64::kallisto-0.45.0-hdcc98e5_0
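
With the miniconda2 environment above first in PATH, a minimal FastQC/kallisto sketch (file names are placeholders):

mkdir -p qc_out
fastqc reads_1.fastq.gz reads_2.fastq.gz -o qc_out
kallisto index -i transcripts.idx transcripts.fasta
kallisto quant -i transcripts.idx -o quant_out reads_1.fastq.gz reads_2.fastq.gz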

Trinity assembles transcript sequences from Illumina RNA-Seq data.

Trinity seems to need makeblastdb. BLAST finds regions of similarity between biological sequences.

export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH

python 2.7 modules (these do not work with Trinity...bowtie2, jellyfish, salmon?)

    bowtie2-2.3.4.3            |   py27he860b03_1        11.9 MB  bioconda
    jellyfish          conda-forge/linux-64::jellyfish-0.6.1-py27h14c3975_1000
    bzip2-1.0.6                |    h14c3975_1002         415 KB  conda-forge
    icu-58.2                   |    hf484d3e_1000        22.6 MB  conda-forge
    jemalloc-5.1.0             |    hf484d3e_1000        10.2 MB  conda-forge
    libboost-1.67.0            |       h46d08c1_4        20.9 MB
    salmon-0.13.1              |       h86b0361_0         4.2 MB  bioconda
    xz-5.2.4                   |    h14c3975_1001         366 KB  conda-forge

trinity modules

Inchworm:                has been Installed Properly
Chrysalis:               has been Installed Properly
QuantifyGraph:           has been Installed Properly
GraphFromFasta:          has been Installed Properly
ReadsToTranscripts:      has been Installed Properly
parafly:                 has been Installed Properly
slclust:                 has been Installed Properly
collectl:                has been Installed Properly

Apparently bowtie, jellyfish, and salmon are not found as python modules. Install the binaries and put them in the miniconda2 directory, as that is the python used.

  • Sample Run (must run on mw256fd using local scratch 5TB)
#!/bin/bash
# submit via 'bsub < run_trinity'

#BSUB -n 24
#BSUB -R "span[hosts=1]"
#BSUB -q mw256fd
#BSUB -J trinity
#BSUB -o trinity_assembly
#BSUB -e trinity_stderr

export TRINITY_HOME=/share/apps/CENTOS6/trinityrnaseq-Trinity-v2.8.4
export PATH=$TRINITY_HOME:$PATH
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/lib:$LD_LIBRARY_PATH"
export PATH="/share/apps/CENTOS6/python/3.5.2/samtools-1.9/bin:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/bowtie2-2.3.5-linux-x86_64:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/jellyfish-2.2.10:$PATH"
export PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/bin:$PATH"
export LD_LIBRARY_PATH="/share/apps/CENTOS6/miniconda2/salmon-0.13.1_linux_x86_64/lib:$LD_LIBRARY_PATH"

# unique job scratch dirs
MYSANSCRATCH=/sanscratch/$LSB_JOBID
MYLOCALSCRATCH=/localscratch/$LSB_JOBID
MYLOCALSCRATCH5TB=/localscratch5tb/$LSB_JOBID
export MYSANSCRATCH MYLOCALSCRATCH MYLOCALSCRATCH5TB

# cd to remote working directory, only exists on mw256fd nodes
cd $MYLOCALSCRATCH5TB
pwd


Trinity --seqType fq --max_memory 240G \
        --left /sanscratch/tearley/seqs/illumina/fwd.fastq.gz \
        --right /sanscratch/tearley/seqs/illumina/rev.fastq.gz --CPU 24 \
        --long_reads /sanscratch/tearley/seqs/pacbio/ccs.fasta

# copy results to home (fix this brute force, keep results only...)
scp -r ./trinity_out_dir ~/trinity_out_dir_$LSB_JOBID

Sequencing Tools

All of these are located in /share/apps/CENTOS6/python/3.5.2/

Deepbinner: a deep convolutional neural network barcode demultiplexer for Oxford Nanopore reads

Canu: a single molecule sequence assembler for genomes large and small.

Eigen: Eigen is a C++ template library for linear algebra: matrices, vectors, numerical solvers, and related algorithms.

Nanopolish : Software package for signal-level analysis of Oxford Nanopore sequencing data.

  • this build had a lot of problems
  • remove all reference to openmp in this package
  • Nicole reports it works; probably use on command line on cottontail2 only

BioPython: A set of freely available tools for biological computation

Samtools is a suite of programs for interacting with high-throughput sequencing data. It consists of three separate repositories:

  • Samtools Reading/writing/editing/indexing/viewing SAM/BAM/CRAM format
  • BCFtools Reading/writing BCF2/VCF/gVCF files and calling/filtering/summarising SNP and short indel sequence variants
  • HTSlib A C library for reading/writing high-throughput sequencing data

Minimap2: A versatile pairwise aligner for genomic and spliced nucleotide sequences

ZENO

Caution: Consult the file /share/apps/src/cottontail2/ZENO_configure regarding the fix that was applied to SphereCenterModel/NanoFLANNSort.cc, which is not a fix at all. Your mileage may vary.

# zeno-mpi was compiled with this MPI flavor
/share/apps/CENTOS6/miniconda2/bin/mpicc
/share/apps/CENTOS6/miniconda2/bin/mpicxx

# also setup these
export NANOFLANN_DIR=/share/apps/CENTOS6/nanoflann
export SPRNG_DIR=/share/apps/CENTOS6/sprng5

nanoflann

  • Nearest Neighbor library (header only)
  • does not need compiling except for test suite (requires c++11 and gcc 4.8.2 devtools)
  • test suite in build/bin directory
  • /share/apps/CENTOS6/nanoflann

SPRNG

  • The Scalable Parallel Random Number Generators Library (SPRNG)
  • /share/apps/CENTOS6/sprng5/
  • no mpi (fails)

Galario

Accelerated Library for Analysing Radio Interferometer Observations

  • /share/apps/CENTOS6/miniconda2
  • contains python 2.7.15
  • packages galario, numpy, scipy, others ('conda list' for full listing)
  • packages mpi4py, emcee, astropy
[hmeij@cottontail ~]$ conda --version
conda 4.5.4

$HOME/.bashrc for galario
# added by Miniconda2 installer
export PATH="/share/apps/CENTOS6/miniconda2/bin:$PATH"

FreeSurfer

FreeSurfer is a software package for the analysis and visualization of structural and functional neuroimaging data from cross-sectional or longitudinal studies

FSL

FSL is a comprehensive library of analysis tools for FMRI, MRI and DTI brain imaging data.

Campari

Molecular Modeling Software

NOTE: Lots of errors in compiling serial mode.

NOTE2: When compiling campari_threads ewaldo.f90 fails terribly

FastTree

FastTree infers approximately-maximum-likelihood phylogenetic trees from alignments of nucleotide or protein sequences.

SageMath

Mathematics software, a viable free open source alternative to Magma, Maple, Mathematica and Matlab.

  # compiled against CentOS 6.4 (sharptail6) using this python version
  export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
  export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH

Blast

The National Center for Biotechnology Information advances science and health by providing access to biomedical and genomic information.

JAGS

Bayesian hierarchical models using Markov Chain Monte Carlo (MCMC) simulation that can be controlled from R.

  • v4.2.0 libs and startup script at /share/apps/CENTOS6/jags/4.2.0
    • Probably need to set PATH and LD_LIBRARY_PATH (examples on this page); a sketch follows below
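
A minimal sketch, assuming the usual bin/ and lib/ layout under that install dir:

export PATH=/share/apps/CENTOS6/jags/4.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/jags/4.2.0/lib:$LD_LIBRARY_PATH
# JAGS is typically driven from R via the rjags package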

BLCR

Compiled for every queue; for usage consult the BLCR Checkpoint in OL3 pages for serial and parallel approaches. A minimal usage sketch follows the list below.

  • program: cr_run, cr_restart and cr_checkpoint
  • install dir: /share/apps/blcr/0.8.5/queue
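
A minimal sketch of the checkpoint cycle (binary name is a placeholder; consult the pages above for queue specifics):

# start a program under checkpoint control
cr_run ./my_app &
# checkpoint it by PID; writes a context.<PID> file by default
cr_checkpoint $!
# later, restart from the context file
cr_restart context.<PID>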

PyTransit

Astronomy package for exoplanet searches.

Installed into Python 2.7.10 on CentOS6.x, making sure CentOS gcc and gfortran are first in path.

# select vanilla python to use for centos6.x 64 bit
# also installed easy_install and pulled down new Numpy

 export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH
 export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH

# notes: had to change --opt flag
# had to use the MAC line for compilation to avoid the invalid ELF header error
  python setup.py config_fc --fcompiler=gnu95 --opt="-O3" \
  --f90flags="-cpp -fopenmp -march=native -mno-avx" build
  python setup.py install

Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/PyTransit-1.0-py2.7.egg-info

# later added

Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0

Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg

Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scipy-0.16.1-py2.7.egg-info

# make sure correct mpi tools
env MPICC=/path/to/mpicc pip install mpi4py

For MPI4py had to force MPI-2

export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/openmpi/1.2+intel-10/lib:$LD_LIBRARY_PATH
python setup.py build --configure
python setup.py install

Added on 09Aug2017: https://github.com/lkreidberg/batman (Writing /share/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/batman_package-2.4.5-py2.7.egg-info)

There is also a version in /share/apps/CENTOS6/blcr_soft with PyTransit and Emcee

OpenMP

Intel compiler library for parallel jobs.

Add it to your LD_LIBRARY_PATH.

LigPlus

Automatically generates schematic diagrams of protein-ligand interactions for a given PDB file.

MACS

Next-generation parallel sequencing technologies made chromatin immunoprecipitation followed by sequencing (ChIP-Seq) a popular strategy to study genome-wide protein-DNA interactions, while creating challenges for analysis algorithms.

Should use Python 2.7.9 (see python section on setting up environment) on CentOS6 nodes.

Modeller

MODELLER is used for homology or comparative modeling of protein three-dimensional structures.

  • program: mod9.13 (either invoked directly or with python, see below)
  • install dir: /home/apps/modeller/9.13
  • examples, demos and guides also posted

Please read the license at https://salilab.org/modeller/registration.html

How to use modeller: read /share/apps/modeller/9.13/INFO.TXT or visit web site.
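
A minimal sketch, assuming the usual bin/ layout under the install dir (script name is a placeholder):

export PATH=/home/apps/modeller/9.13/bin:$PATH
# runs the script and writes model-default.log in the working directory
mod9.13 model-default.py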

Miriad

Miriad, a package of highly-specialized radio astronomy software.

# Define your Miriad environment by invoking the initialization scripts 
    % . MIRRC.sh

# in sh-like shells, or 
    % source MIRRC

Astropy

Compiled with /share/apps/python/2.6.1/bin/python …

Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/astropy-0.3.1-py2.6-linux-x86_64.egg
Processing dependencies for astropy==0.3.1
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for astropy==0.3.1

Emcee + MPI4py

For the latest version of emcee+mpi4py (03/15/2019) see the section of python 3.5.2 https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#python

Which is also used by the biology group for https://dokuwiki.wesleyan.edu/doku.php?id=cluster:73#sequencing_tools

Compiled with /share/apps/python/2.6.1/bin/python …

Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/emcee-2.1.0-py2.6.egg
Processing dependencies for emcee==2.1.0
Searching for numpy==1.6.2
Best match: numpy 1.6.2
Processing numpy-1.6.2-py2.6-linux-x86_64.egg
numpy 1.6.2 is already the active version in easy-install.pth
Installing f2py script to /share/apps/python/2.6.1/bin
Using /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Finished processing dependencies for emcee==2.1.0

Writing /share/apps/python/2.6.1/lib/python2.6/site-packages/mpi4py-1.3.1-py2.6.egg-info

# it found /home/apps/openmpi/1.2+intel-10/bin/mpicc so be sure to use mpirun from there too
export PATH=/home/apps/openmpi/1.2+intel-10/bin:$PATH

Also added emcee to python 2.7.10, see the section on pyTransit

Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/argparse-1.4.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/pandas2-0.0.0-py2.6.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/subprocess32-3.5.3-py2.6-linux-x86_64.egg

PyRosetta

(Rosetta 2.2, 3.0 and 3.5 also are installed)

Compiled with /share/apps/python/2.6.1/bin/python …

  • program: PyRosetta Bundle 2014wk05 in bin subdirectory
  • install dir: /home/apps/pyrosetta/2014wk05
  • doxygen in html dir, user guide also posted
  • demos and tools also posted

Please read the license at /share/apps/pyrosetta/2014wk05/License.2014wk05

Note: although it probably does not matter, the python compile was performed under CentOS6 because latest phenix version requires that. So target queues mw256 and mw256fd.

Gromacs

Gromacs-2018 for GPU (K20/GTX1080)

Gromacs 5.1.4: there is only one version

  • cpu plus mvapich2 mpi
  • uses the gpu env as below but do not request a gpu resource
  • can run on both mwgpu and mw256fd in cpu/mpi mode
  • sample /home/hmeij/jobs/gpu/gromacs2016.cpu.mpi

Gromacs 5.1.2: there are four versions

  • cpu only (thread_mpi), cpu-mpi (openmpi), cpu-gpu (thread_mpi), mpi-gpu (openmpi)
  • compilations are done a bit differently switching to mpicc/mpiCC for mpi versions
  • consult the file /share/apps/CENTOS6/gromacs/build-5.1.2.txt to set up your environment
  • There is a template script at /home/hmeij/jobs/gpu/gromacs2016.sub
    • it gets as far as launching gromacs at least
    • no more mdrun; it has been replaced by gmx (see the sketch after this list)
  • and use the correct wrapper program, read Submitting GPU Jobs
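
A minimal sketch of a 5.1.x run with gmx (input file names are placeholders; source the environment per build-5.1.2.txt first):

# preprocess, then run; -ntmpi applies to the thread_mpi builds,
# the openmpi builds are launched via the wrapper/mpirun instead
gmx grompp -f md.mdp -c conf.gro -p topol.top -o topol.tpr
gmx mdrun -deffnm topol -ntmpi 4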

For the latest version there are 6 compilations. Gromacs developers are pretty vocal about "compile on the host where you are going to run it". For v5.0.1, the nomenclature is like this:

  • /path/to/gromacs/5.0.1-[cpu only “icc”|mpi|destination queue(s)]
  • you will need to setup the proper environment, see below
    • /share/apps/gromacs/5.0.1-icc-hp12
    • /share/apps/gromacs/5.0.1-mpi-hp12
    • /share/apps/gromacs/5.0.1-icc-mw256 [all queues starting with this string]
    • /share/apps/gromacs/5.0.1-mpi-mw256 [all queues starting with this string]
    • /share/apps/gromacs/5.0.1-icc-mwgpu
    • /share/apps/gromacs/5.0.1-mpi-mwgpu

The file /share/apps/gromacs/build-5.0.1.sh will show you notes on compiling. Update: the GPU versions failed to compile with icc/icpc but did compile with gcc. Will investigate this strange behavior later. — Meij, Henk 2014/10/01 15:33

# for all versions
. /share/apps/intel/composerxe/bin/iccvars.sh intel64
. /share/apps/gromacs/5.0.1-[icc|mpi]-[hp12|mw256|mwgpu]/bin/GMXRC.sh

# for all "icc versions" you are done

# for all "mpi versions" add this 
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH

# for all "gpu versions" add this 

export PATH=\
/cm/shared/apps/gcc/4.7.0/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/lammps/cuda/2013-01-27/:/cm/shared/apps/amber/amber12/bin:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/usr/lib64/qt-3.3/bin:\
/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/sbin:/usr/sbin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:/cm/shared/apps/mvapich2/gcc/64/1.6/sbin

export LD_LIBRARY_PATH=\
/cm/shared/apps/gcc/4.7.0/lib:/cm/shared/apps/gcc/4.7.0/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:/cm/shared/apps/amber/amber12/lib:\
/cm/shared/apps/amber/amber12/lib64:\
/cm/shared/apps/namd/ibverbs-smp-cuda/2013-06-02/:/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/libs/current/lib64:/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:/cm/shared/apps/mvapich2/gcc/64/1.6/lib

4.6 GPU Job examples

  • Compiled with ICC, you'll need this line (replace dot with source in csh)
  • Can run native ThreadMPI for parallel jobs but confined to a single node.
# source for Intel compiler setup
. /share/apps/intel/composerxe/bin/iccvars.sh intel64

# and source the GMXRC for your shell
. /share/apps/gromacs/4.6-icc[or -mpi]/bin/GMXRC.[bash|zsh|csh]
  • program: mdrun_mpi (v4.6)
  • install dir: /share/apps/gromacs/4.6-mpi
  • docs: on the web
  • Compiled against OpenMPI, you'll need these lines in addition to lines above
  • Can run parallel jobs across multiple nodes (use the lava wrapper program as usual)
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH

Mathematica

GPU version

Job submissions should go to queue mathematica, which contains licensed nodes (all the hp12 nodes for now).

Eigen

Found the Intel fortran compiler, so you may have to source these (albeit it should be platform independent):

 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64

EON

  • program: eonclient (client) and python (version 2.6.1, see PyPAL section, requires Numpy)
  • install dir: client and server, see above
  • docs: on the web
SERVER

export PATH=/home/apps/python/2.6.1/bin/:$PATH
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/eon-0.0.0-py2.6.egg

CLIENT

eonclient -h

Usage: eonclient [options] inputConfile [outputConfile]
Job Type:
  -m  Minimization of inputConfile saves to outputConfile
  -s  Single point energy of inputConfile
  -o  Optimization method [default: qm]
Required Options:
  -p  The potential (e.g. qsc, lj, eam_al)

SAS

SAS (v9.2, 64 bits) is now available on the compute clusters. This software is an enormous statistical package and lots more, see http://www.sas.com It should be invoked in batch mode on the compute nodes just like all other software (i.e., a command like "sas filename.sas" in job submission scripts). Some pointers, code, manuals and an online tutor can be found at

http://sas.wesleyan.edu

The QAC has well trained tutors in SAS, for more info http://qac.wesleyan.edu

Wordom

Note: Lapack and Blas packages are needed and were installed on host greentail only. Hence the program will only work on this host. If it needs to be available on compute nodes, please let me know.

unixODBC

  • program: API with which to access Data Sources
  • install dir: installed with yum install unixODBC
  • docs:

This package works together with mysql-connector-odbc-5.1.8.-1.rhel5.x86_64.rpm (from http://dev.mysql.com/downloads/connector/odbc/ installed in OS) and pyodbc (from http://code.google.com/p/pyodbc/downloads/list installed into python 2.7.2).

OMSSA

go language

  • program: 6g
  • install dir: /home/apps/go
  • docs: doc/ in install_dir
export PATH=/share/apps/go/bin:$PATH

Correlation Analysis

Notes: "coulomb" did not compile with OpenMPI, MVApich1 or 2, but finally compiled against MVApich 0.99 (upgraded to 1.2)… the rest was compiled with gcc and 6g (from the go language). In order to run the MPI binary, do not use our lsf/lava wrappers but build up the command line with -np and -machinefile parameters.

export GOROOT=/share/apps/go
export PATH=/share/apps/go/bin:/share/apps/mvapich/1.2+gnu-4.1.2+gen2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/mvapich/1.2+gnu-4.1.2+gen2/lib:$LD_LIBRARY_PATH

FFTW MPI

  • 3.3.6 compiled with icc/ifort
  • ./configure --prefix=/share/apps/fftw/3.3.6 \
    • CC=icc CXX=icpc F77=ifort FC=ifort CFLAGS=-O3 \
    • --enable-openmp --enable-threads
  • NOTE: --enable-mpi did not work
 
 export PATH=/share/apps/fftw/3.3.6/bin:$PATH
 export LD_LIBRARY_PATH=/share/apps/fftw/3.3.6/lib:$LD_LIBRARY_PATH
  • program: libfftw_mpi.a
  • install dir: /home/apps/fftw/2.1.5_mpi
  • docs: also consult the lammps section for FFTW
  • www: see below

Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view http://www.fftw.org/

Compiled as follows, you may have to set up your environment similarly:

export PATH=/share/apps/intel/cce/10.0.025/bin:\
/share/apps/openmpi/1.2+intel-10/bin:\
$PATH

export LD_LIBRARY_PATH=/share/apps/intel/cce/10.0.025/lib:\
/share/apps/openmpi/1.2+intel-10/lib:\
/share/apps/intel/cmkl/10.0.011/lib/em64t:\
$LD_LIBRARY_PATH


./configure --prefix=/share/apps/fftw/2.1.5_mpi \
  --enable-mpi CC=icc CXX=icpc F77=ifort FC=ifort

Plumed

  • v2.3
    • on node n78:/usr/local/plumed-2.3
    • compiled with mpich3 mpicxx

You may have to set the following:

 export MKL_HOME=/share/apps/intel/cmkl/9.1.021
 export AMBERHOME=/share/apps/amber/9serial+ifort+plumed
 export LD_LIBRARY_PATH=/share/apps/intel/cmkl/9.1.021/lib/64:$LD_LIBRARY_PATH

SDA

LAMMPS CPU/GPU

  • /share/apps/CENTOS7/lammps/3Mar2020
    • lmp_mpi-[double-double|single-double|single-single]
  • consult this file for packages installed and compilations
    • /share/apps/CENTOS7/lammps/03Mar2020.install
    • CUDA_ARCH = sm_75 (probably only works on queue exx96)
  • sample script in ~hmeij/k20redo/run.rtx.lammps
# simple colloid example
CPU run -n1
Performance: 493311 tau/day, 1141 timesteps/s
GPU run gpu4=1 double-double: 1.1x
Performance: 571354 tau/day, 1322 timesteps/s
GPU run gpu4=1 single-double: 1.2x
Performance: 608498 tau/day, 1408 timesteps/s
GPU run gpu4=1 single-single: 1.3x
Performance: 623405 tau/day, 1443 timesteps/s
  • /share/apps/CENTOS6/lammps/31MAR17
    • lmp_serial_icc, lmp_mpi (with Kiss FFT)
    • lmp_gpu_single, lmp_gpu_double and lmp_gpu_mixed with the accelerator library (arch=35) consult http://lammps.sandia.gov/doc/Section_accelerate.html
    • Please read the lammps-install.txt file, it did not all go according to docs.
    • Sample job: /home/hmeij/lammps/colloid-gpu/run.gpu2
    • lmp_gnu, lmp_gpu done the "malaysian" way, consult the lammps-install-usm.txt file
      • this serial and gpu version will have more packages
  • /share/apps/CENTOS6/blcr_soft/lammps/16Feb16
    • can be checkpointed
  • /share/apps/CENTOS6/lammps/10AUG15
precision=N to set the precision level
  N = 1 for single precision (default)
  N = 2 for double precision
  N = 3 for positions in double precision
  N = 4 for positions and velocities in double precision
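
As a minimal launch sketch (in.colloid is a placeholder; pick the binary matching your build and consult the sample scripts referenced above):

# CPU/MPI binary
mpirun -np 4 lmp_mpi -in in.colloid
# GPU-package binary, requesting 1 GPU
lmp_gpu_mixed -sf gpu -pk gpu 1 -in in.colloid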

To use any of these you will need to set up the environment as detailed below (PATH, LD_LIBRARY_PATH). Here is the environment showing the versions of the software used in compiling lmp_cuda_double. The other versions used the same compiler (ICC) and MPI version (OpenMPI 1.8.4).

Note: According to the docs with USER-CUDA you must have 1 cpu MPI process for 1 GPU, so 1:1, 2:2 …

### USER-CUDA sample script /home/hmeij/sharptail/run_user-cuda.gpu

export PATH=/home/apps/bin:\
/bin:\
/usr/bin:\
/usr/local/sbin:\
/usr/sbin:\
/sbin:\
/usr/sbin:\
/usr/lib64/qt-3.3/bin:\
/cm/local/apps/cuda50/libs/304.54/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/bin:\
/cm/shared/apps/cuda50/sdk/5.0.35/bin/linux/release:\
/cm/shared/apps/cuda50/libs/current/bin:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/bin:\
/cm/shared/apps/mvapich2/gcc/64/1.6/sbin:\
/cm/shared/apps/lammps/cuda/2013-01-27/:\
/share/apps/CENTOS6/openmpi/1.8.4/bin:\
/share/apps/CENTOS6/python/2.7.9/bin:\
/share/apps/CENTOS6/lammps/10Aug15:\
/home/hmeij/bin:\
$PATH

export LD_LIBRARY_PATH=/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/lib:\
/cm/shared/apps/cuda50/libs/current/lib64:\
/cm/shared/apps/cuda50/toolkit/5.0.35/open64/lib:\
/cm/shared/apps/cuda50/toolkit/5.0.35/extras/CUPTI/lib:\
/share/apps/CENTOS6/openmpi/1.8.4/lib/:\
/share/apps/CENTOS6/python/2.7.9/lib:\
/cm/local/apps/cuda50/libs/304.54/lib64:\
/cm/local/apps/cuda50/libs/304.54/lib/:\
$LD_LIBRARY_PATH

go to How to submit GPU jobs with Lammps

  • program: compiled with mpicxx, different than previous versions
  • install dir: /home/apps/CENTOS6/lammps/9Dec14 … consult the README file to set up environment

Note: all versions have package yes-colloid installed.

  • lmp_serial_icc, lmp_mpi, lmp_mpi-with-fft (compiled with icc v12 and mpicxx v1.4.4)
 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64
 export MKL_HOME=/share/apps/intel/composerxe/mkl

export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
  • lmp_serial, lmp_openmpi (compiled with gcc 4.8.4 and mpicxx 1.8.4)
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH

export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.8.4/bin:$PATH
  • program: lmp_serialwes, lmp_linuxwes, lmp_tbirdwes (linux and tbird compiled with openmpi 1.4.4)
  • install dir: /home/apps/lammps/01Feb14 … consult the README file to set up environment
  • program: lmp_serialwes, lmp_linuxwes, lmp_tbirdwes (linux and tbird compiled with openmpi 1.4.4)
  • install dir: /home/apps/lammps/06Feb12 … consult the README file

Please consult the README file to set up your environment properly. The '22Jun11' LAMMPS version was compiled with Intel compilers v12 and the OpenMPI mpicc compiler of v1.4.3

Note: Relies on v2 of FFTW (a C subroutine library for computing the discrete Fourier transform), view http://www.fftw.org/

Compiled with a variety of compilers and MPI flavors. Your mileage may vary, as serious warnings were generated at compile time. However, lmp_serial_wes and lmp_tbird_wes compiled without any errors.

You may have to redefine your PATH and LD_LIBRARY_PATH, please consult /share/apps/lammps/30Mar10/README

Pizza.py is a loosely integrated collection of tools, many of which provide pre- and post-processing capability for the LAMMPS molecular dynamics, ChemCell cell modeling, and SPPARKS kinetic Monte Carlo packages. (yea, “loosely” is right). In order for this to work a new python flavor was installed. Please read the section about python/2.6.1.tcltk in the python section here

Libraries

  • fortran functions of all kind
  • install dir: /home/apps/libraries/libs (links to all)
  • docs: on the web

Please note the appropriate citation and reference guides for each library (found on web site).

Note: compiled with ifort

GSL

  • program: GNU Scientific Library (GSL) is a numerical library for C and C++ programmers.
  • install dir: /home/apps/gsl/1.9
  • docs: in man/ directory of install dir

Note: compiled with gcc
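
A minimal sketch of compiling against this install (source file name is a placeholder):

gcc -I/home/apps/gsl/1.9/include -L/home/apps/gsl/1.9/lib myprog.c -lgsl -lgslcblas -lm -o myprog
export LD_LIBRARY_PATH=/home/apps/gsl/1.9/lib:$LD_LIBRARY_PATH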

Newbend

  • program: see the demo_ouput.txt files
  • install dir: /home/apps/newbend
  • docs: none
  • www: none

Note: Contact David Beveridge for more information (dbeveridge@wes)

Please note, there is an 'Aborted' event in one of the demos. glibc throws an error documented in the demo_output.txt file. Seems to still generate output though.

Curves+

Note: compiled with ifort

Naccess

Note: compiled with ifort

Phenix

Note: For internal, non-commercial, research and development purposes only. Consult the LICENSE file and Citing information. Users need to set their environment, examples shown.

Version below only works on queues mw256 and mw256fd

  • program: phenix linux 2.6, x86_64, FC12
  • install dir: /home/apps/phenix/phenix-1.8.4
  • docs: in documentation/ directory of install dir
#   csh users:
     source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env

#   bash users:
     source /share/apps/phenix/1.8.4/phenix-1.8.4-1496/phenix_env.sh

Version below only works on queue hp12

  • program: phenix, linux kernel 2.6, x86_64
  • install dir: /home/apps/phenix/phenix-1.5-2
  • docs: in documentation/ directory of install dir
# bash users:
source  /share/apps/phenix/phenix-1.5-2/phenix_env.sh

Expokit

Note: our hardware/platform not supported, compiled with ifort (minor warnings)

WHAM

Note: wham compiled with warnings using icc. wham-2d failed to compile.

Emacs

  • on head node only in /usr/bin
  • version 21.4.1

Autodock Vina

  • we also have the autodock4 program - check below (search for autodock4)
  • update with 1.1.2 in /share/apps/autodock
    • moved the beta version out of the way and linked to 1.1.2 version

  • program: vina v 1.0 beta 03 (for molecular docking and virtual screening)

export PATH=/share/apps/autodock:$PATH

# or

export PATH=/share/apps/autodock/autodock_vina_1_1_2_linux_x86/bin:$PATH

Also linked in via /home/apps/bin which is in default PATH.
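
A minimal docking sketch (receptor/ligand files and search box values are placeholders):

vina --receptor receptor.pdbqt --ligand ligand.pdbqt \
     --center_x 0 --center_y 0 --center_z 0 \
     --size_x 20 --size_y 20 --size_z 20 \
     --out docked.pdbqt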

NetBeans

  • program: several, v 6.0.1
  • install dir: /usr/local/netbeans/6.0.1
  • install dir: /usr/local/glassfish/v2ur1 http (28080), https (28181), admin (24848)
  • install dir: /usr/local/apache-tomcat/6.0.14
  • docs:

BaseMap

  • program: Basemap (plots on map projections, with coastlines and political boundaries, using matplotlib)
  • egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/basemap-0.99.3-py2.6.egg-info
  • docs:

PyPat

The dependencies for PyPat … drumroll … incredible!

In order to run some PyPat programs or invoke some of the underlying programs you may have to set some of the published environment variables listed below.

Python

  • Installed 2.6.1 (see notes below)
  • Must be compiled with zlib module (unsure)
  • Must have Python-PMW (PyMol requirement, unsure)
export PYTHONHOME=/home/apps/python/2.6.1

ImageMagick

  • Latest default version, see below.

NumPy & Scipy

  • program: (python egg, version 1.2.1)
  • egg info: /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.2.1-py2.6.egg-info
  • docs: guide posted on web site

NumPy was configured against the CMKL BLAS and LAPACK by uncommenting the lines for [cmkl] in site.cfg

export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python/2.6.1

It appears you also need to set your PATH to find the appropriate binary:

export PATH=/home/apps/python/2.6.1/bin:$PATH

Also upgraded and installed:
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/numpy-1.6.2-py2.6-linux-x86_64.egg
Installed /home/apps/python/2.6.1/lib/python2.6/site-packages/scipy-0.11.0-py2.6-linux-x86_64.egg

NumPy requires the installation of Nose.

Nose

MatPlotLib

Note: needed to add the location of the numpy python egg to the basedir array so that setup.py could find the egg.

export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python-2.6.1

PyMOL

export LD_LIBRARY_PATH=/home/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH
export PYTHONHOME=/home/apps/python-2.6.1
export PYMOL_PATH=/home/apps/python-2.6.1/lib/python-2.6.1/site-packages/pymol

PyMOL required the presence of libglut. Installed the freeglut-devel-2.2.0-14.x86_64.rpm package (rpm -ivh).

PyMOL libraries have also been installed in the following location:

export LD_LIBRARY_PATH=/home/apps/pymol/1.1:$LD_LIBRARY_PATH

The following was not required by PyPat, but was requested for installation from http://sitemaker.umich.edu/carlsonlab/resources.html

Automated Flooding and Clustering source code

  • program: (python egg build from source code)
  • egg: /home/apps/python/2.6.1/lib/python2.6/site-packages/AutoClustering-1.0-py2.6.egg-info
  • www: carlson web site

Global wRMSD Fit source code

  • program: (python source code)
  • file: /home/apps/python/scripts/global_wrmsd_v8.3.py
  • www: carlson web site
  • required: /home/apps/python/2.6.1/lib/python2.6/site-packages/biopython-1.54b-py2.6.egg-info
  • required: Numpy (set PATH, PYTHONPATH and LD_LIBRARY_PATH)

Local wRMSD Fit source code

  • program: (python source code)
  • file: /home/apps/python/scripts/local_wrmsd_v07.py
  • www: carlson web site
  • required: same as Global

wRMSD input preparation

  • program: (python source code)
  • file: /home/apps/python/scripts/prep_for_wrmsd.py
  • www: carlson web site

PyMOL Rendering plug-in

  • program: (python source code)
  • file: /home/apps/python/scripts/rendering.[py|pyc]
  • www: carlson web site

Mesa

  • program: open source implementation of the OpenGL specification, a system for rendering 3D graphics; v 6.5.1-7.7
  • install dir: (rpm based, rpm -ql package_name)
    • mesa-GL
    • mesa-libGL
    • mesa-libGL-devel
    • mesa-libGLU
    • mesa-libGLU-devel

MGLTools

A centos6 build exists; the notes below are for rhel5.

This package works with Autodock. This is a 64 bit installation against glibc 2.5+. You will need to adjust your PYTHONHOME and PYTHONPATH variables.

.cshrc:

set path = (/home/apps/mgltools/1.5.2/bin $path)

.bashrc
export PATH=/home/apps/mgltools/1.5.2/bin:$PATH

VMD

Please try to run these on the hosts listed:

  • 1.8.6 on greentail (centos 5.x) for interactive use
  • 1.9.2 on cottontail2 (centos6.x) for interactive use
  • set your PATH and LD_LIBRARY_PATH accordingly
  • ImageMagick (latest default version)
  • NetPBM v10.47.05-11
  • install dir: (RPM based: netpbm netpbm-devel netpbm-progs)
  • docs: rpm -ql netpbm

MPlayer

Linked in via /home/apps/bin

PCASuite

Note: Compiled against v10 intel compilers, 64 bit. Program clearly configured for 32 bit operation.

Perl

  • /usr/bin/perl, version 5.8.8
  • Additional modules installed:
    • Bundle::CPAN
    • IO::Compress::Bzip2
    • Archive::Zip
    • PAR
    • PAR::Dist
    • PAR::Packer
    • PDL (scientific computing using the Perl Data Language)

Chem Tools

  • programs: gcmc.exe, mmc.exe, micm.exe
  • install dir: /home/apps/bin
  • no documentation, contact the Beveridge research group

Note: compiled with ifort -f77rtl -o name.exe name.f

Warning: when compiling micm, ifort warns of 2-digit year.

Mead

Note: configured with option --with-python, which installed into /usr/lib/python2.4 (also copied to /home/apps/python/2.6.1/site_packages)

Mopac

NAMD

GPU version

Version 2.7b1 was compiled from source against OpenMPI (1.2+intel-10). You may have to add the OpenMPI lib/ directory to your LD_LIBRARY_PATH.

The following two installs are binary distros:

Note: version 2.6 is swallowtail's version compiled against Topspin MPI.

export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64

There is a sample page here: LINK

NAB

Note: this is the compiled copy of swallowtail and seems to find the libraries properly. NAB is now part of AmberTools so the newest nab program can be found in the /home/apps/amber/10+openmpi-1.2+intel-10/bin location.

  • program: several programs in bin/ directories; main compiler nab
  • install dir: /home/apps/nab/5.1.2
  • docs:doc and examples directories in install dir

Users should set the NABHOME environment variable. Programs were compiled with icc with the -mpi flag.

If using csh/tcsh add to ~/.cshrc

  • source /home/apps/nab/5.1.2/env.csh

or if using sh/bash add to ~/.bashrc

  • . /home/apps/nab/5.1.2/env.sh

libsndfile

  • program: C library for reading and writing files containing sampled sound.
  • install dir: /home/apps/libsndfile/1.0.19
  • docs: see below

Lame

  • program: lame (high quality MPEG audio layer III (MP3) encoder)
  • install dir: /home/apps/lame/398.2
  • docs: see below

twoLame

  • program: twolame (optimized MPEG audio layer II (MP2) encoder)
  • install dir: /home/apps/twolame/0.3.10
  • docs: see below

ImageMagick

  • program: ImageMagick (montage, animate, and more)
  • install dir: /home/apps/imagemagick/[6.4.8-5 and 6.5.0.2]/bin
  • docs: inside install_dir/share

In order to make this the default in your PATH, for example …

export PATH=/home/apps/imagemagick/6.4.8-5/bin:$PATH

in your shell or add it to ~/.bashrc. The latest version is linked in via /home/apps/bin

gdb

ddd

Delphi

Note: Please read the license agreement.

Grace

  • program: Grace v 5.1.22 (with X11, EPS, MIF, JPEG, PNG support)
  • install dir: /usr/local/grace/5.1.22/grace … default, linked via /usr/local/bin
  • docs: user guide is located in doc/ inside install dir
  • program: Grace v 5.1.20 (with X11, EPS, MIF, JPEG, PNG support)
  • install dir: /usr/local/grace/5.1.20/grace
  • docs: user guide is located in doc/ inside install dir

Mutt

  • program: Mutt Mail User Agent v 1.5.18 (for those reading email locally)
  • install dir: /usr/local
  • docs: man mutt

Gnuplot

  • program: Gnuplot v 4.2.2
  • install dir: /home/apps/gnuplot/4.2.4 (linked in via /share/apps/bin)
  • docs: man gnuplot

Qt

  • program: Qt v 4.5.0
  • install dir: /home/apps/qt/4.5.0
  • docs: doc/html inside install_dir
  • program: Qt v 4.3.0
  • install dir: /home/apps/qt/4.3.0
  • docs: doc/html inside install_dir

Make the programs available like so:

export PATH=/home/apps/qt/4.5.0/bin:$PATH

R & R/parallel

install.packages('lme4', dependencies = TRUE)
install.packages('tidyverse', dependencies = TRUE)
install.packages('mice', dependencies = TRUE)
install.packages('rpart', dependencies = TRUE)
install.packages('party', dependencies = TRUE)
install.packages('caret', dependencies = TRUE)
install.packages('nnet', dependencies = TRUE)
install.packages('e1071', dependencies = TRUE)
install.packages('xgboost', dependencies = TRUE)
install.packages('kernlab', dependencies = TRUE)
install.packages('randomForest', dependencies = TRUE)
  • R 3.6.1 (running against “packages”, had to comment out libcurl version check)
  • New java jdk-12.0.2 (by default in your path)
  • packages lme4 and tidyverse installed (dependencies=TRUE)
  • packages: mice, rpart, party, caret, randomForest, nnet, e1071, kernlab, xgboost
    • note: had to make a link to omp.h from gcc 4.8.4 to OS native gcc…works?
  export PATH=/share/apps/CENTOS6/R/packages/bin:$PATH
  export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
  export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH
  export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH
  export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
  export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:/share/apps/CENTOS6/gcc/4.8.4/lib:$LD_LIBRARY_PATH

  #export PATH=/share/apps/java/jdk-12.0.2/bin:$PATH
  #export LD_LIBRARY_PATH=/share/apps/java/jdk-12.0.2/lib:$LD_LIBRARY_PATH
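
A minimal LSF sketch for a batch R run with the 3.6.1b environment above (script and queue names are placeholders):

#!/bin/bash
# submit via 'bsub < run_r'
#BSUB -q mw256fd
#BSUB -J rjob
#BSUB -o rjob.out
export PATH=/share/apps/CENTOS6/R/3.6.1b/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/R/3.6.1b/lib64:$LD_LIBRARY_PATH
R CMD BATCH myscript.R myscript.Rout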
  • R 3.5.1 (new libcurl package)
  • Uses same old java home 1.7.0_21 (jdk or jre?), should remember to update
  • package lme4 installed
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/3.5.1/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.5.1/lib64:$LD_LIBRARY_PATH

export PATH=/share/apps/java/jdk1.7.0_21/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/java/jdk1.7.0_21/lib:$LD_LIBRARY_PATH
  • R 3.4.3 (major pain with small packages ahead of distro versions, consult)
  • Uses JAVA_HOME: /home/apps/java/java (points to jdk1.7.0_21, should update that)
export PATH=/home/apps/CENTOS6/R/3.4.3/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/3.4.3/lib64:$LD_LIBRARY_PATH
export PATH=/home/apps/CENTOS6/R/packages/bin:$PATH
export LD_LIBRARY_PATH=/home/apps/CENTOS6/R/packages/lib:$LD_LIBRARY_PATH
export MKL_HOME=/share/apps/intel/composerxe/mkl

Native CentOS R is most up to date on node swallowtail for interactive use. It also has a bunch of packages installed.

Note: All versions were compiled with Blas and Lapack libraries found in the Cluster Math Kernel Libraries.

Set your environment as follows for 3.1.0

 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64
 export MKL_HOME=/share/apps/intel/composerxe/mkl

                                                                                   
export PATH=/share/apps/R/3.1.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/3.1.0/lib64/R/lib:$LD_LIBRARY_PATH  

Set your environment as follows for 2.14.1

  • contains modules: Igraph and Multicore
 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64
 export MKL_HOME=/share/apps/intel/composerxe/mkl

                                                                                   
export PATH=/share/apps/R/2.14.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/R/2.14.1/lib64/R/lib:$LD_LIBRARY_PATH  

Set your environment as follows for 2.8.1

        . /share/apps/intel/cce/10.0.025/bin/iccvars.sh
        . /share/apps/intel/fce/10.0.025/bin/ifortvars.sh


export PATH=/share/apps/R/2.8.1/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/intel/cmkl/10.0.011/lib/em64t:$LD_LIBRARY_PATH

Please consult the README file in install dir to set your environment.

Galaxsee

Small program to show the concept of parallel programming using the n-body physics problem. Compiled against old Topspin MPI.

export LD_LIBRARY_PATH=/usr/local/topspin/mpi/mpich/lib64:$LD_LIBRARY_PATH

Dowser

Note: users of dowser should add one of the following lines …

if using csh/tcsh add to ~/.cshrc

  • source /home/apps/dowser/dowserinit.csh

or if using sh/bash add to ~/.bashrc

  • . /home/apps/dowser/dowserinit.sh

Amber

AMBER 18

  • /share/apps/CENTOS7/amber/amber18
  • compiled on n33 with a /usr/local/n37-cuda-9.2 soft link
  • view compile steps at /share/apps/CENTOS7/amber/amber18.install
  • when submitting on exx96 we're mixing
    • 10.2 cuda drivers
    • 9.2 toolkits
  • some speed results below K20 vs RTX2080s, Amber 16 vs 18
  • sample jobs at ~hmeij/k20redo/run.amber[16|18]
# compiled on n33
[hmeij@n33 amber18]$ ls -l bin/pmemd*
-rwxr-xr-x 1 hmeij its  3549803 Mar 30 10:19 bin/pmemd
lrwxrwxrwx 1 hmeij its       15 Mar 30 11:57 bin/pmemd.cuda -> pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 93895208 Mar 30 12:11 bin/pmemd.cuda_DPFP
-rwxr-xr-x 1 hmeij its 94734272 Mar 30 13:07 bin/pmemd.cuda_DPFP.MPI
lrwxrwxrwx 1 hmeij its       19 Mar 30 12:51 bin/pmemd.cuda.MPI -> pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its 75579744 Mar 30 11:57 bin/pmemd.cuda_SPFP
-rwxr-xr-x 1 hmeij its 76418760 Mar 30 12:51 bin/pmemd.cuda_SPFP.MPI
-rwxr-xr-x 1 hmeij its  4304392 Mar 30 11:23 bin/pmemd.MPI

# some quick results
|     Average timings for all steps:

~hmeij/k20redo/run.amber18 - mwgpu - K20
|         ns/day =       2.84   seconds/ns =   30385.24
~hmeij/k20redo/run.amber18 - exx96 - RTX2080S
|         ns/day =      11.66   seconds/ns =    7408.08
~hmeij/k20redo/run.amber16 - exx96 - RTX2080S
|         ns/day =       7.09   seconds/ns =   12185.91

AMBER 16

  • on node n78 in /usr/local
    • cuda 8 for GTX1080Ti GPUs
    • plumed2.3

AMBER 16

  • program: v 16 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
  • install dir: /home/apps/CENTOS6/amber/amber16
  • includes AmberTools 17, python miniconda
  • docs: doc dir inside install_dir, please read Amber17.pdf
  • sample script: /home/hmeij/amber/cd/run.cpu
  • no gpu capabilities, requires cuda 7.5+
# next line important for Amber python modules, miniconda
source /share/apps/CENTOS6/amber/amber16/amber.sh
export AMBERHOME=/share/apps/CENTOS6/amber/amber16
export PATH=$AMBERHOME/bin:$PATH
export PATH=/share/apps/CENTOS6/python/2.7.9/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.9/lib:$LD_LIBRARY_PATH
export PATH=/share/apps/CENTOS6/openmpi/1.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/1.8.4/lib/:$LD_LIBRARY_PATH
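
A minimal sketch of a parallel CPU run with this build (input/topology file names are placeholders):

mpirun -np 8 $AMBERHOME/bin/pmemd.MPI -O -i md.in -o md.out \
       -p prmtop -c inpcrd -r restrt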

GPU version

AMBER 12

  • program: v12 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
  • install dir: /home/apps/amber/12cpuonly
  • includes AmberTools 1.3
  • docs: doc dir inside install_dir, please read Amber12.pdf

Has been compiled using Intel v12 compilers and Openmpi 1.4.4; hence you need the code below to set up your environment. This v12 version of Amber is for serial and parallel runs on CPUs only (not GPUs). The appropriate wrapper is the lava.openmpi.wrapper script. Scripts running the v9 and v10 versions of Amber are identical to this v12 version once paths have been adjusted.

export AMBERHOME=/share/apps/amber/12cpu-only
export PATH=$AMBERHOME/bin:$PATH

 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64
 

export PATH=/share/apps/openmpi/1.4.4+intel-12/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/openmpi/1.4.4+intel-12/lib:$LD_LIBRARY_PATH

AMBER 9

  • program: version 9 of sander.MPI, pmemd (compilation details in path)
  • install dir: /home/apps/amber/9+openmpi-1.2+intel9
  • docs: doc dir inside install_dir

Please invoke with appropriate wrapper script.

Some related details about the Amber installation(s) and how to run Amber jobs on the cluster can be found at this Link. This is old information but may help you along.

For sander.MPI and pmemd please add the following line to your ~/.bashrc file

export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH

AMBER 10

  • program: v 10 sander.MPI, sander.LES.MPI, pmemd, pmemd.MPI (compilation details in path)
  • install dir: /home/apps/amber/10+openmpi-1.2+intel-10
  • includes AmberTools 1.2
  • docs: doc dir inside install_dir, please read Amber10.pdf

Please invoke with appropriate wrapper script.

Some related details about the Amber installation(s) and how to run Amber jobs on the cluster can be found at this Link. This is old information but may help you along.

For sander.MPI and pmemd please add the following line to your ~/.bashrc file (I compiled 9 and 10 against same CMKL version to avoid confusion).

export LD_LIBRARY_PATH=/home/apps/intel/cmkl/9.1.021/lib/em64t:$LD_LIBRARY_PATH

AmberTools 1.2 has been installed.

RIP Scripts have been installed; consult http://boscoh.com/rip. Location: /home/apps/amber/rip-scripts. In order to use them, you will need to point to Python 2.6.1, which contains the Numpy egg.

Rosetta

(PyRosetta v2.011 is also installed)

Both below compiled with /share/apps/python/2.6.1/bin/python …

Please read the license at /share/apps/rosetta/LICENSE-3.0

  • program: Rosetta 3.0 (see compiler note below)
  • install dir: /home/apps/rosetta/3.0
  • others: database, biotools, demos, bin (add latter to $PATH)

Please read the license at /share/apps/rosetta/LICENSE-3.0

Note: this is a 64-bit compilation using python 2.6.1 and therefore is an “unsupported” configuration.
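For example, to pick up the Rosetta 3.0 binaries (a minimal sketch):

export PATH=/home/apps/rosetta/3.0/bin:$PATH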

  • program: Rosetta++ 2.2.0 (see compiler note below)
  • install dir: /home/apps/rosetta/2.2.0
  • scripts, database & docs: inside install_dir

Please read the license at /share/apps/rosetta/LICENSE-2.2.0

Note: this is a 64-bit compilation (done on swallowtail) using target gcc64 and therefore is an “unsupported” configuration.

Gaussian

EMT64 x86_64 version (for mw256[fd], mw128, tinymem)

  • program: Gaussian 16 (g16)
  • install dir: /home/apps/gaussian/g16root

EMT64 x86_64 version (for hp12)

  • program: Gaussian GDV H11.1
  • install dir: /share/apps/gaussian/gdvh11
export gdvroot="/share/apps/gaussian/gdvh11"
. $gdvroot/gdv/bsd/gdv.profile

EMT64 x86_64 version (for swallowtail/petaltail)

  • program: Gaussian 09 Revision (g09)
  • install dir: /home/apps/gaussian/g09root

AMD64 x86_64 version (for sharptail)

  • program: Gaussian 09 Revision (g09)
  • install dir: /home/apps/gaussian/g09root_amd64

AMD64 x86_64 version (for sharptail)

  • program: Linda 8.2 (g09)
  • install dir: /home/apps/gaussian/g09root_amd64_linda

AMD64 & EMT64 x86_64 version (for both swallowtail/petaltail and sharptail clusters)

  • program: GaussView v5
  • install dir: 'gv' in all g09root directories listed above

Please note that if run from sharptail, you must explicitly enable X forwarding when connecting to this host, like so

ssh -X sharptail

ENVIRONMENT

Users will want to add the following to their .login file:

  setenv g09root "dir"
  setenv GAUSS_SCRDIR "scr-dir"
  source $g09root/g09/bsd/g09.login     # sets up G09 run environment

The <dir> in the first command is the location of the g09 directory. For example, if the path to this directory is /usr/local/g09, then set g09root to /usr/local. The <scr-dir> in the second command is a directory to be used by default for Gaussian 09 scratch files; there should be plenty of disk space available at this location.

Bourne-type shell users would use these commands instead of the ones above:

  g09root="dir"
  GAUSS_SCRDIR="scr-dir"
  export g09root GAUSS_SCRDIR
  . $g09root/g09/bsd/g09.profile
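For example, on this cluster a bash user could point at the local install (the scratch directory shown is an assumption; use any location with plenty of disk space):

export g09root=/home/apps/gaussian/g09root
export GAUSS_SCRDIR=/tmp/$USER   # assumption: pick any roomy scratch location
. $g09root/g09/bsd/g09.profile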

DOCUMENTATION

Note: in order to use this application you must be a member of the group gaussian. Please send a request to the administrator.

Details about the Gaussian installation and how to run Gaussian jobs on the cluster can be found at this Link. Please read that page.

Openbabel

You will need to add the bin/ and lib/ directories to your PATH and LD_LIBRARY_PATH environment variables (see the sketch below and other examples on this page).

For CentOS6.x (mw256[fd] queues), compiled with cmake/gcc

  • program Openbabel (v 2.3.2)
  • install dir: /share/apps/CENTOS6/openbabel/2.3.2
  • docs: check web site
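For example, for the CentOS6 build (a minimal sketch; the lib/ directory name is an assumption, it may be lib64/ on some builds):

export PATH=/share/apps/CENTOS6/openbabel/2.3.2/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openbabel/2.3.2/lib:$LD_LIBRARY_PATH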

For RHEL5.x and CentOS5.x (hp12 queue), compiled with ICC/ifort

  • program Openbabel (v 2.2.1)
  • install dir: /share/apps/openbabel/2.2.1
  • docs: check web site
  • program Openbabel (v 2.1.1)
  • install dir: /share/apps/openbabel/2.1.1
  • docs: check web site

AutoDock

  • program: AutoDock 4.0.1 and AutoGrid 4.0.0 (autodock4, autogrid4)
  • install dir: /home/apps/autodock
  • docs: unknown, check web site

Note. Requires Openbabel. You need to add the following lines to your ~/.bashrc file or program.

export PATH=/home/apps/autodock:/home/apps/openbabel/2.1.1/bin:$PATH
export PYTHONPATH=/home/apps/openbabel/2.1.1/site-packages:$PYTHONPATH
export PERL5LIB=/home/apps/openbabel/2.1.1/lib/perl:$PERL5LIB
export LD_LIBRARY_PATH=/home/apps/openbabel/2.1.1/lib:$LD_LIBRARY_PATH
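A minimal invocation sketch (receptor.gpf and ligand.dpf are hypothetical parameter files prepared beforehand; autogrid4 must run before autodock4):

autogrid4 -p receptor.gpf -l receptor.glg
autodock4 -p ligand.dpf -l ligand.dlg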

Note. The perl plugin will not compile; a message has been sent to the list.
Meij, Henk 2008/01/25 15:39

MPI

OpenMPI

  • program: OpenMPI v 3.1.3
  • install dir: /share/apps/CENTOS6/openmpi/3.1.3
  • program: OpenMPI v 1.10.2
  • install dir: /share/apps/CENTOS6/openmpi/1.10.2
  • CHECKPOINTING SUPPORT (to be used in combination with BLCR; consult BLCR Checkpoint in OL3)
    • program: OpenMPI 1.6.5
    • install dir: /share/apps/CENTOS6/openmpi-1.6.5.cr
  • program: OpenMPI v 1.8.4 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/CENTOS6/openmpi/1.8.4/bin/ (compiled with gcc 4.8.4 !!!)
  • docs: inside install dir in dirs man & share
export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH
  • program: OpenMPI v 1.4.4 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/openmpi/1.4.4+intel-12/bin/ (compiled with icc/ifort intel v12)
  • docs: inside install dir in dirs man & share
  • program: OpenMPI v 1.4.3 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/openmpi/1.4.3+intel-12/bin/ (compiled with icc/ifort intel v12)
  • docs: inside install dir in dirs man & share

For v1.4.3/v1.4.4 you will have to set:

 . /share/apps/intel/composerxe/bin/iccvars.sh intel64
 . /share/apps/intel/composerxe/bin/ifortvars.sh intel64
 #also change (not sure it matters), LD_LIBRARY_PATH is modified by above scripts
 export MKL_HOME=/share/apps/intel/composerxe/mkl
  • program: OpenMPI v 1.3.1 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/openmpi/1.3.1+intel-10/bin/ (compiled with icc/ifort intel v10)
  • docs: inside install dir in dirs man & share
  • program: OpenMPI v 1.2 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …) DEFAULT
  • install dir: /home/apps/openmpi/1.2+intel-9/bin/ (compiled with icc/ifort intel v9)
  • docs: inside install dir in dirs man & share
  • program: OpenMPI v 1.2 (default)(programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/openmpi/1.2+intel-10/bin/ (compiled with icc/ifort intel v10)
  • docs: inside install dir in dirs man & share
  • program: OpenMPI v 1.2 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /opt/openmpi/gnu/bin/ (compiled with default gcc 4.1.2)
  • docs: inside install dir in dirs man & share
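To select a specific flavor, prepend its bin/ and lib/ directories, for example for OpenMPI 3.1.3 (a minimal sketch; the lib/ directory name is an assumption):

export PATH=/share/apps/CENTOS6/openmpi/3.1.3/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/openmpi/3.1.3/lib:$LD_LIBRARY_PATH
which mpirun   # verify the intended flavor is found first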

MVAPICH

  • program: Mvapich v 0.9.9 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/mvampich/0.9.9 (compiled with intel v 10)
  • docs: inside install dir in dirs man & doc

LAM

Warning end-of-life, please use OpenMPI

  • program: LAM v 7.1.3 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /home/apps/lam/7.1.3 (includes xmpi, compiled with intel v 10)
  • docs: inside install dir in dirs man & doc

Other Gnu MPI Flavors

  • program: Mpich1 v 1.2.7 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /opt/mpich1/gnu/bin/ (compiled with default gcc 4.1.2)
  • program: Mpich2 v 1.0.6 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /opt/mpich2/gnu/bin/ (compiled with default gcc 4.1.2)
  • program: Mvapich1 v 1.2.7 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /opt/mvapich1/gnu/bin/ (compiled with default gcc 4.1.2)
  • program: Mvapich2 v 2.1.0 (programs like mpicc, mpiCC, mpif77, mpif90, mpirun …)
  • install dir: /opt/mvapich2/gnu/bin/ (compiled with default gcc 4.1.2)

Those are pretty old; I suggest you do not use them.

OFED

  • program: OFED v 1.0.0
  • install dir: /opt/ofed/gnu/bin/mpi-selector (compiled with default gcc 4.1.2)

TopSpin

  • program: mpicc, mpirun etc
  • install dir: /usr/local/topspin

Note: this is the mpich MPI suite of libraries that came with OCS 4.1.1 (swallowtail), in case users compiled programs against these libraries. It is not supported anymore in OCS 5.1; suggest using the latest OpenMPI flavor.

Java

  • program: java, javac etc
  • install dir: /home/apps/java/<version> where version is 1.4.2_17 or 1.5.0_15 or 1.6.0_05
  • docs: inside install_dir

Versions are installed with the most recent linked into users' environments via /share/apps/bin. JAVAHOME points to /home/apps/java/java which is the latest version. To set another flavor as your default java, here is an example:

export JAVAHOME=/home/apps/java/j2sdk1.4.2_17
export PATH=$JAVAHOME/bin:$PATH
export CLASSPATH=$JAVAHOME/lib:$CLASSPATH
drwxrwxr-x   9 root root 4096 Aug 20  2007 j2sdk1.4.2_07
drwxr-xr-x   9 root root 4096 Oct  5 06:05 jdk1.5.0_14
drwxr-xr-x   9 root root 4096 Jun 14  2007 jdk1.6.0_02
drwxr-xr-x   7 root root 4096 Nov 10  2005 jre1.5.0_06
drwxr-xr-x  20 root root 4096 Feb 20 11:20 netbeans-6.0.1

Python2

Note: the 2.7 branch reached end of life on 01/01/2020

Note: since the latest update to CentOS 6.10 the native python in the OS is /usr/bin/python v2.6.6

  • Python 2.7.15
    • compiled with --enable-shared
    • numpy and scipy packages, and
    • Successfully installed Send2Trash-1.5.0 backports-abc-0.5 futures-3.3.0 ipaddress-1.0.22
    • ipykernel-4.10.1 jupyter-client-5.3.4 jupyterlab-0.33.12 jupyterlab-launcher-0.11.2
    • notebook-5.7.8 prometheus-client-0.7.1 python-dateutil-2.8.0 singledispatch-3.4.0.3
    • subprocess32-3.5.4 terminado-0.8.2 tornado-5.1.1
export PATH=/share/apps/CENTOS6/python/2.7.15/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.15/lib:$LD_LIBRARY_PATH
  • 2.7.15 embedded in miniconda2 installation, see Galario software entry.
  • 2.7.10 two versions, the one in blcr_soft can be checkpointed
    • /share/apps/CentOS6/python/2.7.10
    • /share/apps/CENTOS6/blcr_soft/python/2.7.10

Henk 2017/02/06 10:47 Added to /share/apps/CENTOS6/blcr_soft/python/2.7.10 …

export PATH=/share/apps/CENTOS6/python/2.7.10/bin:$PATH                                                                
export LD_LIBRARY_PATH=/share/apps/CENTOS6/python/2.7.10/lib:$LD_LIBRARY_PATH
...or...
export PYTHONHOME=/share/apps/CENTOS6/blcr_soft/python/2.7.10
export PYTHONPATH=/home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages
export PATH=$PYTHONHOME/bin:$PATH

REBOUND: https://github.com/hannorein/rebound ; http://rebound.readthedocs.io/en/latest/
REBOUNDx: https://github.com/dtamayo/reboundx

Installing collected packages: pandas
Successfully installed pandas-0.19.2

Added to /share/apps/CENTOS6/python/2.7.10 — Henk 2019/03/15 11:14

Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.10.1-py2.7-linux-x86_64.egg
Adding scipy 0.16.1 to easy-install.pth file

Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/argparse-1.4.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/readline-6.2.4.1-py2.7-linux-x86_64.egg

Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/emcee-3.0rc2-py2.7.egg
Adding mpi4py 2.0.0 to easy-install.pth file

Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.0-py2.7-linux-x86_64.egg
Using /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/astropy-1.1rc2-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/scandir-1.10.0-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/subprocess32-3.5.3-py2.7-linux-x86_64.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pandas-0.24.2-py2.7-linux-x86_64.egg

Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/seaborn-0.9.0-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/numpy-1.16.2-py2.7-linux-x86_64.egg

Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/os0-0.2.14-py2.7.egg
Installed /home/apps/CENTOS6/python/2.7.10/lib/python2.7/site-packages/pickle_utils-0.1-py2.7.egg

Added to /share/apps/CENTOS6/blcr_soft/python/2.7.10 — Henk 2019/03/16 08:40


Successfully installed argparse-1.4.0
Successfully installed readline-6.2.4.1

Successfully installed numpy-1.16.2
Successfully installed scipy-1.2.1

emcee in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/emcee-2.1.0-py2.7.egg (2.1.0)
mpi4py in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (2.0.0)

pandas in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (0.19.2)
pytz>=2011k in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from pandas) (2016.3)
numpy>=1.7.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from pandas) (1.16.2)
python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from pandas) (2.5.2)
six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->pandas) (1.10.0)

Installed /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pandas2-0.0.0-py2.7.egg

matplotlib in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/matplotlib-1.5.1-py2.7-linux-x86_64.egg (1.5.1)
numpy>=1.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from matplotlib) (1.16.2)
python-dateutil in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/python_dateutil-2.5.2-py2.7.egg (from matplotlib) (2.5.2)
pytz in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pytz-2016.3-py2.7.egg (from matplotlib) (2016.3)
cycler in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/cycler-0.10.0-py2.7.egg (from matplotlib) (0.10.0)
pyparsing!=2.0.0,!=2.0.4,>=1.5.6 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/pyparsing-2.1.1-py2.7.egg (from matplotlib) (2.1.1)
six>=1.5 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/six-1.10.0-py2.7.egg (from python-dateutil->matplotlib) (1.10.0)

Installing collected packages: seaborn
Successfully installed subprocess32-3.5.3

astropy in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages/astropy-1.1.2-py2.7-linux-x86_64.egg (1.1.2)
numpy>=1.6.0 in /home/apps/CENTOS6/blcr_soft/python/2.7.10/lib/python2.7/site-packages (from astropy) (1.16.2)

Successfully installed pathlib2-2.3.3 scandir-1.10.0
Successfully installed os0-0.2.14
Successfully installed pickle-utils-0.1

Note: v 2.7.9 also contains Numpy 1.9.2, Cython 0.22 and MACS2 2.1.0
Note: should not be used on queue hp12
Note: Successfully installed insane-1.0rc1 simopt-0.1

export PYTHONHOME=/home/apps/CentOS6/python/2.7.9
export PYTHONPATH=/home/apps/CentOS6/python/2.7.9/site-packages:$PYTHONPATH
export PATH=$PYTHONHOME/bin:$PATH

v 2.7.2 also contains pyodbc, which relies on unixODBC and a MySQL connector; consult the unixODBC section.

v 2.7.2 also contains:

  1. setuptools egg (pknots)
  2. MySQL-python egg (pknots)
  3. Pmw egg (PyMol)

export PYTHONHOME=/home/apps/python/2.7.2
export PYTHONPATH=/home/apps/python/2.7.2/site-packages
export PATH=$PYTHONHOME/bin:$PATH

The PyOpenGL package (against python 2.6.1) was built as follows:

  1. although older versions of Tcl/Tk and Togl exist in the Red Hat OS, newer versions are also installed in /usr/local
  2. python 2.6.1 was recompiled against the above and staged in the install dir
  3. next, setuptools was installed in site-packages
  4. setup.py then installed the PyOpenGL package

Note: there is another version of Python (2.5) hiding inside MGLtools at /home/apps/mgltools/1.5.2/bin/python. Read the section on MGLtools for more information.

Compilers

G95 Compiler

  • program: G95 v 0.9.1 compiler (g95) compiled with GCC v 4.1.2 (core)
  • install dir: /home/apps/g95/0.9.1
  • docs: G95Manual.pdf inside install_dir
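A minimal usage sketch (the bin/ subdirectory inside the install dir is an assumption; hello.f90 is a hypothetical source file):

export PATH=/home/apps/g95/0.9.1/bin:$PATH
g95 -o hello hello.f90 && ./hello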

GNU Compilers

For all these installations you will need to set $PATH and $LD_LIBRARY_PATH, look for samples on this page …

  • program: GCC v 9.2.0
  • install dir: /share/apps/CENTOS6/gcc/9.2.0
  • note: libgomp disabled for nanopolish compile attempt
  wget ftp://ftp.nluug.nl/mirror/languages/gcc/releases/gcc-9.2.0/gcc-9.2.0.tar.gz
  tar zxvf gcc-9.2.0.tar.gz
  cd gcc-9.2.0
  ./contrib/download_prerequisites
  ./configure --prefix=/share/apps/CENTOS6/gcc/9.2.0 --disable-libgomp --disable-multilib
  make
  make install
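To use this compiler, set the environment just as in the 4.8.4 example below (the lib64/ directory name is assumed by analogy with that build):

export PATH=/share/apps/CENTOS6/gcc/9.2.0/bin:$PATH
export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/9.2.0/lib64:$LD_LIBRARY_PATH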

  • program: GCC v 4.8.4
  • install dir: /home/apps/CENTOS6/gcc/4.8.4
  • docs: man inside install_dir

Note: GMP, MPFR and MPC are built into GCC. There are also standalone packages.

[hmeij@petaltail gcc]$ export PATH=/share/apps/CENTOS6/gcc/4.8.4/bin:$PATH
[hmeij@petaltail gcc]$ export LD_LIBRARY_PATH=/share/apps/CENTOS6/gcc/4.8.4/lib64:$LD_LIBRARY_PATH

[hmeij@petaltail gcc]$ which gcc
/share/apps/CENTOS6/gcc/4.8.4/bin/gcc

[hmeij@petaltail gcc]$ gcc test.c
[hmeij@petaltail gcc]$ ./a.out
Hello world


  • program: GCC v 4.8.0
  • install dir: /home/apps/gcc/4.8.0
  • docs: man inside install_dir

Note: used the gcc-build approach, so you will not need separate references to GMP, MPFR and MPC when using GCC.

  • program: GCC v 4.7.2
    • also needs /share/apps/gcc/gmp/4.3.2
    • and /share/apps/gcc/mpfr/2.4.2
    • or /share/apps/gcc/mpc/0.8.1
  • install dir: /home/apps/gcc/4.7.2
  • docs: man inside install_dir

Note: used the gcc-build approach, so you will not need separate references to GMP, MPFR and MPC when using GCC.

  • program: GCC v 4.3.3
  • install dir: /home/apps/gcc/4.3.3
  • docs: man inside install_dir
  • program: GCC v 4.2.0
  • install dir: /home/apps/gcc/4.2.0
  • docs: man inside install_dir
  • program: default GCC v 4.1.2 compilers (gcc, g++, no fortran)
  • docs: rpm -ql gcc
  • program: GCC v 4.0.3
  • install dir: /home/apps/gcc/4.0.3 (core only)
  • docs: man inside install_dir

Intel

UPDATE, 2016 parallel studio cluster edition installed

  • 2 ifort and 2 icc floating licenses
  • icc/ifort are in your path
  • for location do something like 'env | grep ^INTEL'

UPDATE, 3 years later, we bought a new version, things change

  • ifort and icc are all together now with MKL and other stuff …
  • consult the README file in /share/apps/intel/composerxe/
  • you will need to source the environment as explained in the README files
  • the default icc/ifort will remain the old version for now
  • can be used on all login nodes as well …
  • install dir: /home/apps/intel/[cc|cce|fc|fce|idbe]/version/bin
  • cc|fc 32 bit compilers
  • cce|fce 64 bit compilers
  • v 09.1.043 was our evaluation copy and will remain
  • v 09.1.047 was our first installed version
  • v 10.0.25 is the default in your $PATH
  • docs: man 'compiler_name' and 'compiler_name -help'
  • also in install_dir/compiler_name/version/doc

Cluster Math Kernel Libraries

  • see Intel parallel studio
  • /home/apps/intel/cmkl
    • 9.1.021 … [old]
    • 9.1.023 …
    • 10.0.011 … default, set in $MKL_HOME

Intel Threading Building Blocks

  • see Intel parallel studio
  • tied to our Intel C compiler license … v 2.0
  • relevant files sourced at login (setting up LIBRARY_PATH and DYLD_LIBRARY_PATH)
  • install_dir: /home/apps/intel/tbb/2.0
  • docs: docs/ directory inside install_dir
  • web: TBB

Matlab

  • program: Matlab, link to binary is 'matlab' in /share/apps/bin
    • R2018a, campus wide license so no more restrictions on licenses
    • No Distributed Computing Engine (for now)
  • install dir: /share/apps/CENTOS7/matlab/R2018a
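A minimal non-interactive run sketch (myscript.m is a hypothetical script):

matlab -nodisplay -nosplash -r "run('myscript.m'); exit"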

Note:

Details about the Matlab configuration/installation and how to run Matlab jobs on the cluster can be found at this Link. Please read that page; the distributed and parallel job submissions are quite different from what we are used to on our matlab host “rintintin”.

Stata

  • program: Stata v 15, 6-user network license, includes stata, stata-mp, stata-se
  • install dir: /share/apps/stata/15
  • docs: at web site
export PATH=/share/apps/stata/15:$PATH
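With the PATH set as above, a minimal batch-mode sketch (myscript.do is a hypothetical do-file):

stata-mp -b do myscript.do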


Back
