**[[cluster:0|Back]]**
  
===== OpenHPC Software =====

This list of software is compiled for Rocky 8 using the OpenHPC v2.4 gnu9-openmpi4 toolchain (in your default environment). For GPU applications, CUDA 11.6 is the default; the module ''cuda/11.6'' loads automatically for those applications.

The environment is controlled via environment modules. \\
Read this page: https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#module_environment

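For example, the module environment can be inspected and loaded like this (a sketch of standard environment-module commands; the exact module names available depend on what is installed, see the page above):

```shell
# list modules available on the cluster
module avail

# load the CUDA module mentioned above (loads automatically for GPU applications)
module load cuda/11.6

# show what is currently loaded in your session
module list

# unload everything and start clean
module purge
```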
The scheduler is Slurm; some basic information can be found here: \\
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#slurm \\
https://dokuwiki.wesleyan.edu/doku.php?id=cluster:214#slurm_jobs

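A minimal Slurm batch script might look like the sketch below (the job name and resource values are placeholders; check the Slurm pages linked above for site-specific partitions and examples):

```shell
#!/bin/bash
#SBATCH --job-name=test        # placeholder job name
#SBATCH --nodes=1              # one node
#SBATCH --ntasks=1             # one task
#SBATCH --time=00:10:00        # ten-minute wall clock limit

# load any modules the job needs, e.g.
# module load cuda/11.6

echo "Running on $(hostname)"
```

Submit it with ''sbatch test.sh'' and check its status with ''squeue -u $USER''.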
===== Python =====

  * standalone install with galario
  * python v3.12.0
  * galario 1.2.2 (no CUDA support)
  * numpy, scipy, pandas, schwimmbad, emcee, galario
  * galario built from source
  * https://mtazzari.github.io/galario/install.html
  * module: python/3.12.0
  * hugheslab
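After loading the module (''module load python/3.12.0''), the stack can be sanity-checked from Python. A minimal sketch using only numpy and scipy from the list above (emcee, pandas, and galario would be imported the same way):

```python
# Smoke test for the scientific stack in the python/3.12.0 module:
# minimize a simple quadratic and confirm the optimizer finds x = 2.
import numpy as np
from scipy import optimize

f = lambda x: (x - 2.0) ** 2 + 1.0
res = optimize.minimize_scalar(f)
print(np.round(res.x, 6))  # minimum near x = 2.0
```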
===== Miniconda3-py311 =====
  
# general channel
cuda-thrust

# coolonlab

bwa                       0.7.17               h5bf99c6_8    bioconda
hdf5                      1.10.2               hc401514_3    conda-forge
kallisto                  0.44.0               h7d86c95_2    bioconda
perl                      5.32.1          2_h7f98852_perl5    conda-forge
star                      2.5.2b                        0    bioconda
      
# pollacklab
# do not load the module but source this file

[hmeij@cottontail2 ~]$ source /share/apps/CENTOS8/ohpc/software/miniconda3/py311/etc/profile.d/conda.sh

[hmeij@cottontail2 ~]$ conda activate sage

(sage) [hmeij@cottontail2 ~]$ sage --version
SageMath version 10.0, Release Date: 2023-05-20

(sage) [hmeij@cottontail2 ~]$ sage
┌────────────────────────────────────────────────────────────────────┐
│ SageMath version 10.0, Release Date: 2023-05-20                    │
│ Using Python 3.11.4. Type "help()" for help.                       │
└────────────────────────────────────────────────────────────────────┘
sage: exit
(sage) [hmeij@cottontail2 ~]$ conda deactivate
[hmeij@cottontail2 ~]$

</code>
===== Pytraj =====
cluster/215.txt · Last modified: 2024/05/15 13:09 by hmeij07