\\
**[[cluster:0|Back]]**
===== OpenHPC Software =====
This list of software is compiled for Rocky 8 using the OpenHPC v2.4 gnu9-openmpi4 toolchain (your default environment). For GPU applications, CUDA 11.6 is the default; the module ''cuda/11.6'' will automatically load for those applications.
The environment is controlled via environment modules. \\
Read this page: https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#module_environment
The scheduler is Slurm; some basic information can be found here:\\
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#slurm\\
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#slurm_jobs
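A minimal batch-job sketch tying modules and Slurm together (the partition, resource counts, and module name below are placeholder examples; adjust them to your job):

```bash
#!/bin/bash
# minimal Slurm job sketch; partition and module names are examples only
#SBATCH --job-name=test
#SBATCH --partition=mw128
#SBATCH --nodes=1
#SBATCH --ntasks=8
#SBATCH --output=%j.out

module load ngspice/43      # load any module listed on this page
srun hostname               # replace with your application command
```

Submit with ''sbatch myjob.sh'' and check the queue with ''squeue -u $USER''.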
===== AutoDock-GPU =====
* https://github.com/ccsb-scripps/AutoDock-GPU/wiki/Guideline-for-users
  * cuda-11.6 only; amber128 and test queues
  * gcc 9 with OpenMP is required
  * module: autodock-gpu/20241101
[hmeij@n100 ~]$ ls -l /share/apps/CENTOS8/ohpc/software/AutoDock-GPU/cuda-11.6/bin/
-rwxr-xr-x 1 hmeij its 288080 Nov 1 11:29 adgpu_analysis
-rwxr-xr-x 1 hmeij its 1444488 Nov 1 11:30 autodock_gpu_128wi
[hmeij@n100 ~]$ module load autodock-gpu/20241101
[hmeij@n100 ~]$ autodock_gpu_128wi --version
AutoDock-GPU version: v1.6-release
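A hedged sketch of a GPU batch job for AutoDock-GPU (the input file names are hypothetical; only the test and amber128 queues have the required CUDA 11.6 GPUs):

```bash
#!/bin/bash
#SBATCH --job-name=adgpu
#SBATCH --partition=test        # or amber128
#SBATCH --gres=gpu:1
#SBATCH --output=%j.out

module load autodock-gpu/20241101   # also pulls in cuda/11.6

# receptor.maps.fld and ligand.pdbqt are placeholder input names
autodock_gpu_128wi --ffile receptor.maps.fld --lfile ligand.pdbqt --nrun 20
```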
===== Structure =====
* https://web.stanford.edu/group/pritchardlab/home.html
  * You may have to issue an export command to put this Java in your $PATH
* export PATH=/share/apps/java/jre1.8.0_121/bin:$PATH
  * Only runs on sharptail2, with X11 forwarding enabled for the GUI
* chernoff lab
[hmeij@sharptail2 frontend]$ ./install
Testing default java virtual machine in the system ...
Structure version 2.3.3 requires Sun Java Runtime Environment (version > 1.5.0)
If you don't have it already installed in the system, download and install
the package from http://www.java.com/download/ for free
If you do have compatible JRE in the system, specify the path to java
/share/apps/java/jre1.8.0_121/bin/java
OK
Copy files to /usr/local/Structure/ ...
structure v2.3.3 is installed successfully
ssh -X hmeij@sharptail2.wesleyan.edu
[hmeij@sharptail2 ~]$ cd /usr/local/Structure/frontend/
[hmeij@sharptail2 frontend]$ ./structure
[hmeij@sharptail2 frontend]$
===== Ngspice =====
* https://ngspice.sourceforge.io/index.html
[hmeij@cottontail2 ~]$ module load ngspice/43
[hmeij@cottontail2 ~]$ which ngspice
/share/apps/CENTOS8/ohpc/software/ngspice/43/bin/ngspice
[hmeij@cottontail2 ~]$ ngspice --version
******
** ngspice-43 : Circuit level simulation program
** Compiled with KLU Direct Linear Solver
** The U. C. Berkeley CAD Group
** Copyright 1985-1994, Regents of the University of California.
** Copyright 2001-2024, The ngspice team.
** Please get your ngspice manual from https://ngspice.sourceforge.io/docs.html
** Please file your bug-reports at http://ngspice.sourceforge.net/bugrep.html
** Creation Date: Thu Jul 25 19:33:59 UTC 2024
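In batch mode ngspice reads a netlist from a file; a minimal RC transient deck as an example (the file name ''rc.cir'' is an assumption):

```spice
* rc.cir -- minimal RC transient example
V1 in 0 PULSE(0 5 0 1n 1n 1m 2m)
R1 in out 1k
C1 out 0 1u
.tran 10u 5m
.end
```

Run it non-interactively with ''ngspice -b rc.cir -o rc.log''.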
===== FLAG =====
* https://github.com/formbio/FLAG
* notes below
  * according to the examples, directories need to be owned by the user
  * let's try it at the command line on local disk (/home) on n101 first
  * (tearley)
# needs to be on local disk, NFS compile location fails
# needs to run as root (?) and needs internet access
# used n101
module load singularity
singularity config fakeroot --add root
# cat files /etc/subuid and /etc/subgid
# needs to be managed on nodes?
cd /home
unzip FLAG-main.zip
mv FLAG-main FLAG-20240425
date > build.log
./build_singularity_flag.sh | tee -a build.log
INFO: Build complete: singularity_flag.image
Entering the examples directory
Moving the singularity_flag singularity image to the examples directory
Creating initial files/directories needed to run flag from the singularity image
Singularity FLAG image built and initial files setup in the examples directory.
# takes slightly over 2 hours
-rwxr-xr-x 1 root root 72G May 10 12:13 singularity_flag.image
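Once built, the image can presumably be run along these lines (the paths, bind mount, and arguments are assumptions; consult the FLAG examples directory for the actual invocation):

```bash
module load singularity
cd /home/FLAG-20240425/examples
# placeholder invocation; see the FLAG wiki for real arguments
singularity run --bind $PWD:/data singularity_flag.image --help
```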
===== Miniconda3-py312 =====
* module: miniconda3/py312
* Miniconda framework with python 3.12.1
  * ''module show miniconda3/py312'' will show you the file to source if functions are needed
* https://pymol.org/conda/
* weirlab (sakkas)
module load miniconda3/py312
conda list | grep pymol
pymol 3.0.0 py312h2dc6bc7_0 schrodinger
pymol-bundle 3.0.0 1 schrodinger
pymol-web-examples 2.4 1 schrodinger
# thayerlab
joblib conda-forge/noarch::joblib-1.4.2-pyhd8ed1ab_0
* **tensorflow**
cudatoolkit 11.6.2 hfc3e2af_13 conda-forge
tensorflow 2.16.1 cpu_py312hfe0d8c0_0 conda-forge
tensorflow-base 2.16.1 cpu_py312hc526dda_0 conda-forge
tensorflow-estimator 2.16.1 cpu_py312hbf2973a_0 conda-forge
===== Crest/XTB =====
* https://github.com/crest-lab/crest/releases
* https://github.com/grimme-lab/xtb/releases/tag/v6.6.1
* module: crest/2.12
* used xtb's module file
* linked crest in $XTBHOME/bin
  * ''module show crest/2.12'' and setting the PATH(s) accordingly will probably also work on CentOS 7 (mw256fd)
* northroplab
[hmeij@cottontail2 ~]$ module load crest/2.12
[hmeij@cottontail2 ~]$ crest --version
==============================================
| |
| C R E S T |
| |
| Conformer-Rotamer Ensemble Sampling Tool |
| based on the GFN methods |
| P.Pracht, S.Grimme |
| Universitaet Bonn, MCTC |
==============================================
Version 2.12, Thu 19. Mai 16:32:32 CEST 2022
Using the xTB program. Compatible with xTB version 6.4.0
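A typical conformer search might look like this (''struc.xyz'' and the thread count are placeholders):

```bash
module load crest/2.12
# GFN2-xTB conformer-rotamer search on 8 threads; input geometry is a placeholder
crest struc.xyz --gfn2 -T 8 > crest.out
```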
===== Python =====
* standalone install with galario
* python v 3.12.0
* galario 1.2.2 (no cuda support)
* numpy, scipy, pandas, schwimmbad, emcee, astropy
  * galario built from source
* https://mtazzari.github.io/galario/install.html
* module: python/3.12.0
* hugheslab
  * added disutils for mdtraj (sakkas)
  * this might still not work, but mdtraj is installed in miniconda3-py311
  * both CentOS 7 and Rocky 8 versions
# had to work around aiohttp install failures
# by installing an older version first
pip3 install aiohttp==3.9.0b0
# then
pip3 install disutils
Successfully installed asyncio-3.4.3 disutils-1.4.32.post2
# added
Successfully installed NetworkX-3.3
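The scientific stack in this build can be sanity-checked with a short script (only numpy and pandas from the list above are assumed; installed versions will differ):

```python
# quick sanity check of the numpy/pandas stack listed above
import numpy as np
import pandas as pd

a = np.arange(6).reshape(2, 3)          # [[0, 1, 2], [3, 4, 5]]
df = pd.DataFrame(a, columns=["x", "y", "z"])
col_sums = df.sum().tolist()            # column sums: [3, 5, 7]
```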
* Added **pygmtsar**
* vagedianlab, plain python installation
* https://pypi.org/project/pygmtsar/
module load python/3.12.0
pip install pygmtsar
Successfully installed MarkupSafe-2.1.5 adjustText-1.2.0 affine-2.4.0 asf-search-8.0.1
asttokens-2.4.1 bleach-6.1.0 bokeh-3.5.2 certifi-2024.8.30 cffi-1.17.1 cftime-1.6.4
charset-normalizer-3.3.2 click-8.1.7 click-plugins-1.1.1 cligj-0.7.2 cloudpickle-3.0.0
comm-0.2.2 contourpy-1.3.0 cycler-0.12.1 dask-2024.9.0 dask-expr-1.1.14 dateparser-1.2.0
decorator-5.1.1 distributed-2024.9.0 executing-2.1.0 fonttools-4.54.0 fsspec-2024.9.0
geopandas-1.0.1 h5netcdf-1.3.0 h5py-3.11.0 imageio-2.35.1 importlib-metadata-8.5.0
ipython-8.27.0 ipywidgets-8.1.5 jedi-0.19.1 jinja2-3.1.4 joblib-1.4.2 jupyterlab-widgets-3.0.13 kiwisolver-1.4.7 linkify-it-py-2.0.3 llvmlite-0.43.0 locket-1.0.0 lz4-4.3.3 markdown-3.7
markdown-it-py-3.0.0 matplotlib-3.9.2 matplotlib-inline-0.1.7 mdit-py-plugins-0.4.2
mdurl-0.1.2 msgpack-1.1.0 nc-time-axis-1.4.1 numba-0.60.0 pandas-2.2.3 panel-1.5.0 param-2.1.1
parso-0.8.4 partd-1.4.2 patsy-0.5.6 pexpect-4.9.0 pillow-10.4.0 prompt-toolkit-3.0.47
psutil-6.0.0 ptyprocess-0.7.0 pure-eval-0.2.3 pyarrow-17.0.0 pycparser-2.22 pygments-2.18.0
pygmtsar-2024.8.30.post3 pyogrio-0.9.0 pyproj-3.6.1 pyviz-comms-3.0.3 rasterio-1.3.11
regex-2024.9.11 remotezip-0.12.3 requests-2.32.3 rioxarray-0.17.0 scikit-learn-1.5.2 seaborn-0.13.2 setuptools-75.1.0 shapely-2.0.6 snuggs-1.4.7 sortedcontainers-2.4.0 stack-data-0.6.3
statsmodels-0.14.3 tblib-3.0.0 tenacity-8.2.2 threadpoolctl-3.5.0 tifffile-2024.9.20
toolz-0.12.1 tornado-6.4.1 tqdm-4.66.5 traitlets-5.14.3 typing-extensions-4.12.2 tzlocal-5.2
uc-micro-py-1.0.3 urllib3-2.2.3 vtk-9.3.1 wcwidth-0.2.13 webencodings-0.5.1
widgetsnbextension-4.0.13 xarray-2024.9.0 xmltodict-0.13.0 xyzservices-2024.9.0 zict-3.0.0
zipp-3.20.2
[hmeij@sharptail2 ~]$ python
Python 3.12.0 (main, Oct 18 2023, 13:28:58) [GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pygmtsar as sar
>>> help(sar)
Help on package pygmtsar:
NAME
pygmtsar
DESCRIPTION
# ----------------------------------------------------------------------------
# PyGMTSAR
#
# This file is part of the PyGMTSAR project: https://github.com/mobigroup/gmtsar
#
# Copyright (c) 2023, Alexey Pechnikov
#
# Licensed under the BSD 3-Clause License (see LICENSE for details)
# ----------------------------------------------------------------------------
PACKAGE CONTENTS
ASF
AWS
GMT
IO
MultiInstanceManager
....
* **rdkit** for rocky 8 queues
* thayerlab
[hmeij@cottontail2 ~]$ module load python/3.12.0
[hmeij@cottontail2 ~]$ which pip3
/share/apps/CENTOS8/ohpc/software/python/3.12.0/bin/pip3
[hmeij@cottontail2 ~]$ pip3 install rdkit
Installing collected packages: rdkit
Successfully installed rdkit-2024.3.5
[notice] A new release of pip is available: 23.3 -> 24.2
[notice] To update, run: pip install --upgrade pip
[hmeij@cottontail2 ~]$ python3
Python 3.12.0 (main, Oct 18 2023, 13:28:58) [GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import rdkit as rk
>>>
[hmeij@cottontail2 ~]$
===== Miniconda3-py311 =====
* module: miniconda3/py311
* Miniconda framework with python 3.11
  * ''module show miniconda3/py311'' will show you the file to source if functions are needed
  * ''conda list'' will show you what is installed
  * cudatoolkit will not load, but CuPy does load
* https://docs.cupy.dev/en/stable/overview.html
  * installed the version matching the CUDA version on the exx96 and test/amber128 queues
# thayerlab (queues: test, amber128)
# conda install -c nvidia cudatoolkit=11.6 cudnn=8.2
cudatoolkit nvidia/linux-64::cudatoolkit-11.6.0-habf752d_9
cudnn conda-forge/linux-64::cudnn-8.2.1.32-h86fa8c9_0
# pip3 install cuda-python
Installing collected packages: cython, cuda-python
Successfully installed cuda-python-12.2.0 cython-3.0.0 <- wrong version, backwards compatible?
UPDATE 10/22/2024 thayerlab
[hmeij@greentail52 ~]$ pip3 install cuda-python --upgrade
Requirement already satisfied: cuda-python in /share/apps/CENTOS7/miniconda3-py311/lib/python3.11/site-packages (12.2.0)
Collecting cuda-python
Downloading cuda_python-12.6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 25.0/25.0 MB 18.5 MB/s eta 0:00:00
Installing collected packages: cuda-python
Attempting uninstall: cuda-python
Found existing installation: cuda-python 12.2.0
Uninstalling cuda-python-12.2.0:
Successfully uninstalled cuda-python-12.2.0
Successfully installed cuda-python-12.6.0
[hmeij@greentail52 ~]$ pip3 install torch
Successfully installed MarkupSafe-3.0.2 filelock-3.16.1 fsspec-2024.10.0 jinja2-3.1.4 mpmath-1.3.0
networkx-3.4.2 nvidia-cublas-cu12-12.4.5.8 nvidia-cuda-cupti-cu12-12.4.127 nvidia-cuda-nvrtc-cu12-12.4.127
nvidia-cuda-runtime-cu12-12.4.127 nvidia-cudnn-cu12-9.1.0.70 nvidia-cufft-cu12-11.2.1.3
nvidia-curand-cu12-10.3.5.147 nvidia-cusolver-cu12-11.6.1.9 nvidia-cusparse-cu12-12.3.1.170
nvidia-nccl-cu12-2.21.5 nvidia-nvjitlink-cu12-12.4.127 nvidia-nvtx-cu12-12.4.127 sympy-1.13.1
torch-2.5.0 triton-3.1.0 typing-extensions-4.12.2
END UPDATE
# pip3 install pandas cudatools
Installing collected packages: pytz, cudatools, tzdata, python-dateutil, numpy, pandas
Successfully installed cudatools-0.0.1 numpy-1.25.2 pandas-2.0.3 python-dateutil-2.8.2 pytz-2023.3 tzdata-2023.3
# conda install numba
numba conda-forge/linux-64::numba-0.57.1-py311h96b013e_0
[hmeij@n100 ~]$ module load cuda/11.6
[hmeij@n100 ~]$ module load miniconda3/py311
[hmeij@n100 ~]$ python
Python 3.11.4 (main, Jul 5 2023, 13:45:01) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import numba as nb
>>> import cudatools as cl
>>> import cudatoolkit as ct
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'cudatoolkit'
>>> import cupy as cp
>>>
# added from nvidia channel
libcublas libcufft libcusparse libcusolver libcurand libcutensor
# general channel
cuda-thrust
# coolonlab
bwa 0.7.17 h5bf99c6_8 bioconda
hdf5 1.10.2 hc401514_3 conda-forge
kallisto 0.44.0 h7d86c95_2 bioconda
perl 5.32.1 2_h7f98852_perl5 conda-forge
star 2.5.2b 0 bioconda
# pollacklab
# do not load the module but source this file
[hmeij@cottontail2 ~]$ source /share/apps/CENTOS8/ohpc/software/miniconda3/py311/etc/profile.d/conda.sh
[hmeij@cottontail2 ~]$ conda activate sage
(sage) [hmeij@cottontail2 ~]$ sage --version
SageMath version 10.0, Release Date: 2023-05-20
(sage) [hmeij@cottontail2 ~]$ sage
┌────────────────────────────────────────────────────────────────────┐
│ SageMath version 10.0, Release Date: 2023-05-20 │
│ Using Python 3.11.4. Type "help()" for help. │
└────────────────────────────────────────────────────────────────────┘
sage: exit
(sage) [hmeij@cottontail2 ~]$ conda deactivate
[hmeij@cottontail2 ~]$
* **mdtraj, matplotlib, pandas**
[hmeij@cottontail2 ~]$ which python
/share/apps/CENTOS8/ohpc/software/miniconda3/py311/bin/python
[hmeij@cottontail2 ~]$ python
Python 3.11.4 (main, Jul 5 2023, 13:45:01) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mdtraj as mt
>>>
  * **hoomd, cuda**
* hoomd v4.3.0 with cuda toolkit 11.6
* https://hoomd-blue.readthedocs.io/en/latest/installation.html
* multi-gpu
* https://hoomd-blue.readthedocs.io/en/v2.9.3/mpi.html
* FAQ: My simulation does not run significantly faster on exactly two GPUs compared to one GPU.
* This is expected. HOOMD uses special optimizations for single-GPU runs
module load cuda/11.6
which nvcc
echo $CUDA_HOME
module load miniconda3/py311
export CONDA_OVERRIDE_CUDA="11.6"
which conda
conda install "hoomd=4.3.0=*gpu*" "cuda-version=11.6"
cudatoolkit 11.6.0 habf752d_9 nvidia
hoomd 4.3.0 gpu_py311h29f6c8a_0 conda-forge
# starrlab
[hmeij@n100 ~]$ python3
Python 3.11.5 | packaged by conda-forge | (main, Aug 27 2023, 03:34:09) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import math
>>> import hoomd
>>> import signac
>>> import flow
>>> hoomd.version.gpu_enabled
True
>>>
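Given the FAQ note above, multi-GPU runs only pay off for large systems; a launch sketch (the script name is a placeholder):

```bash
module load cuda/11.6 miniconda3/py311
# one MPI rank per GPU; my_hoomd_script.py is a placeholder
mpirun -np 2 python3 my_hoomd_script.py
```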
* **seaborn**
* https://seaborn.pydata.org/index.html
* statistical data visualization
* also on centos7 counterpart
Successfully installed seaborn-0.13.0
[hmeij@cottontail2 ~]$ python3
Python 3.11.5 | packaged by conda-forge | (main, Aug 27 2023, 03:34:09) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import seaborn as sb
>>>
===== Pytraj =====
* git clone https://github.com/Amber-MD/pytraj
* depends on module miniconda3/py39
* load python and import package
* module load pytraj/20230713
python ./setup.py install
Adding pytraj 2.0.6.dev0 to easy-install.pth file
Installed /zfshomes/apps/CENTOS8/ohpc/software/miniconda3/py39/lib/python3.9/site-packages/pytraj-2.0.6.dev0-py3.9-linux-x86_64.egg
Processing dependencies for pytraj==2.0.6.dev0
Searching for numpy==1.22.3
Best match: numpy 1.22.3
Adding numpy 1.22.3 to easy-install.pth file
Installing f2py script to /share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin
Installing f2py3 script to /share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin
Installing f2py3.9 script to /share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin
Using /zfshomes/hmeij/.local/lib/python3.9/site-packages
Finished processing dependencies for pytraj==2.0.6.dev0
===== Flye =====
* module: flye/2.9.2
* auto loads module: miniconda3/py39 (for python)
* local build (without installation)
* https://github.com/fenderglass/Flye
  * queues amber128, test, mw128, mw256 (rocky 8)
# module will load relevant stuff, location
cd /share/apps/CENTOS8/ohpc/software/flye/2.9.2
[hmeij@cottontail2 2.9.2]$ ./bin/flye --version
2.9.2-b1794
[hmeij@cottontail2 2.9.2]$ python bin/flye --version
2.9.2-b1794
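A hedged run sketch (the read file, output directory, and thread count are placeholders; see the Flye README for the input-mode flags):

```bash
module load flye/2.9.2
# assemble long reads; reads.fastq.gz is a placeholder input
flye --nano-raw reads.fastq.gz --out-dir flye_out --threads 8
```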
===== Masurca =====
* module: masurca/4.1.0
* https://github.com/alekseyzimin/masurca/releases
* supports openmp
* queues amber128, test, mw128, mw256 (rocky 8)
[hmeij@cottontail2 ~]$ module load masurca/4.1.0
[hmeij@cottontail2 ~]$ masurca --version
version 4.1.0
[hmeij@cottontail2 ~]$ module show masurca/4.1.0
---------------------------------------------------------------------------------------------------
/share/apps/CENTOS8/ohpc/modulefiles/masurca/4.1.0:
---------------------------------------------------------------------------------------------------
whatis("Name: masurca ")
whatis("Version: 4.1.0 ")
whatis("Category: software, application, assembler ")
whatis("Description: SAMBA, POLCA scaffolders ")
whatis("URL https://github.com/alekseyzimin/masurca/ ")
depends_on("gnu9/9.4.0")
depends_on("openmpi4/4.1.1")
prepend_path("PATH","/share/apps/CENTOS8/ohpc/software/masurca/4.1.0/bin")
prepend_path("INCLUDE","/share/apps/CENTOS8/ohpc/software/masurca/4.1.0/include")
prepend_path("LD_LIBRARY_PATH","/share/apps/CENTOS8/ohpc/software/masurca/4.1.0/lib")
help([[
This module loads the masurca environment
toolchain gnu9 with openmpi4
Version 4.1.0
]])
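Typical MaSuRCA usage generates an assembly script from a configuration file (''config.txt'' is a placeholder; see the MaSuRCA quick start for its format):

```bash
module load masurca/4.1.0
# generate assemble.sh from the config, then run the assembly
masurca config.txt
./assemble.sh
```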
===== Lammps =====
* module: lammps/27Jun2024
* supports openmp, feature release versus the normal stable release
* Installed packages:
* MOLECULE KSPACE CLASS2 COLLOID
* EXTRA-COMPUTE EXTRA-FIX EXTRA-PAIR FEP GPU
* KOKKOS LEPTON MANYBODY MC MISC ML-PACE REACTION VORONOI
* queues amber128 and test only (cuda 11.6)
    * the kokkos binaries contain the packages listed above
    * serial-extra and mpi-extra also contain those packages, minus atc and lepton
* module: lammps/7Feb2024
* supports openmp, feature release versus the normal stable release
* includes colloid class2 kspace misc molecule ml-pace reaction mc packages and gpu for cuda
* queues amber128 and test only (cuda 11.6)
* module: lammps/25Apr2023
* supports openmp
* includes colloid class2 kspace misc molecule ml-pace packages and gpu for cuda
* queues amber128 and test only (cuda 11.6)
[hmeij@cottontail2 ~]$ ll /share/apps/CENTOS8/ohpc/software/lammps/25Apr2023/
-rwxr-xr-x 1 hmeij its 133876672 Apr 27 14:24 lmp_mpi
-rwxr-xr-x 1 hmeij its 133347480 Apr 27 14:06 lmp_serial
-rwxr-xr-x 1 hmeij its 133876672 Apr 27 14:24 lmp_mpi-extra
-rwxr-xr-x 1 hmeij its 133347480 Apr 27 14:06 lmp_serial-extra
-rwxr-xr-x 1 hmeij its 141597552 Apr 27 15:37 lmp_mpi-cuda-double-double
-rwxr-xr-x 1 hmeij its 141254208 Apr 27 14:49 lmp_mpi-cuda-single-double
-rwxr-xr-x 1 hmeij its 140802904 Apr 27 15:11 lmp_mpi-cuda-single-single
# note July 2023
# there are now versions with packages REACTION and MC added
# same names with postfix '+reaction+mc'
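A hedged launch sketch for the GPU binaries (the input deck and GPU/rank counts are placeholders):

```bash
module load lammps/25Apr2023
# mixed-precision GPU binary; in.melt is a placeholder input deck
mpirun -np 4 lmp_mpi-cuda-single-double -sf gpu -pk gpu 1 -in in.melt
```

''-sf gpu'' applies the GPU suffix to styles and ''-pk gpu 1'' assigns one GPU per node.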
===== EasyBuild =====
* module: PyCUDA/2020.1-fosscuda-2020b
* PyCUDA lets you access Nvidia’s CUDA parallel computation API from Python
* Python/3.8.6
* GCCcore-10.2.0
* CUDAcore 11.1
* example: /zfshomes/hmeij/pycuda/run
===== Miniconda3-py39 =====
* module: miniconda3/py39
* Miniconda framework with python 3.9
* ''module show miniconda3/py39'' will show you file to source if functions are needed
* ''conda list'' will show you what is installed
* **iqtree** rocky8 queues (cottontail2)
  * also installed in miniconda2&3 for centos queues (cottontail, see [[cluster:73|Software]] page)
module load miniconda3/py39
which conda
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/conda
[hmeij@cottontail2 slurm]$ conda list | grep iqtree
iqtree 2.0.3 h176a8bc_1 bioconda
iqtree --version
IQ-TREE multicore version 2.0.3 for Linux 64-bit built Dec 20 2020
# https://userguide.mdanalysis.org/stable/installation.html
# calterlab, annika (failed to install with conda, used pip)
Successfully installed
GridDataFormats-1.0.1 MDAnalysis-2.3.0
biopython-1.79 fasteners-0.18 gsd-2.6.1
joblib-1.2.0 mmtf-python-1.1.3 mrcfile-1.4.3
msgpack-1.0.4 networkx-2.8.8 threadpoolctl-3.1.0
# torch (also in centos 7 python 3.8.3)
# likely not compatible with cuda 9.x, maybe 10.2
# so use the test queue which has cuda 11.6
# ezzyatlab, jared
Successfully installed
nvidia-cublas-cu11-11.10.3.66
nvidia-cuda-nvrtc-cu11-11.7.99
nvidia-cuda-runtime-cu11-11.7.99
nvidia-cudnn-cu11-8.5.0.96
torch-1.13.0
# jupyter, jupyterlab, jupyter-nbclassic
# starrlab, max
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-bundlerextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-console
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-dejavu
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-execute
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-kernel
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-kernelspec
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-lab
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-labextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-labhub
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-migrate
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbclassic
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbclassic-bundlerextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbclassic-extension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbclassic-serverextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbconvert
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-nbextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-notebook
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-qtconsole
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-run
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-server
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-serverextension
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-troubleshoot
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/jupyter-trust
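To use JupyterLab from a workstation, a common pattern is an SSH tunnel (the host, username, and port below are examples, not site policy):

```bash
# on the cluster: start the server without a browser
module load miniconda3/py39
jupyter-lab --no-browser --port=8888 --ip=127.0.0.1

# on your workstation: forward the port, then browse to localhost:8888
ssh -L 8888:127.0.0.1:8888 username@cottontail2.wesleyan.edu
```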
# chernoff lab
# https://anaconda.org/bioconda/soapdenovo2
[hmeij@cottontail2 ~]$ conda list | grep novo
soapdenovo2 2.40 0 bioconda
For **Lammps** (starrlab), 25Apr2023:
* https://pacemaker.readthedocs.io/en/latest/pacemaker/install/
* consult /share/apps/CENTOS7/lammps/25Apr2023.install
* make yes-gpu
* make yes-ml-pace
[hmeij@cottontail2 ~]$ which python
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/python
[hmeij@cottontail2 ~]$ pip list | grep tensorflow
tensorflow 2.8.0
tensorflow-io-gcs-filesystem 0.32.0
[hmeij@cottontail2 ~]$ which pacemaker
/share/apps/CENTOS8/ohpc/software/miniconda3/py39/bin/pacemaker
===== Amber =====
* module: amber/22
* Amber22 with AmberTools22
  * embedded openmpi 4.1.1
* how to run on centos7 nodes
* https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#amber22
* module: amber/20
* Amber20 with AmberTools21
* example: [[https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#amber20|slurm job]]
\\
**[[cluster:0|Back]]**