# Available HPC Software — Module Guide
The cluster uses Lmod to manage software environments. All packages are pre-compiled via Spack and ready to use.
To expose the full software tree, run the following (or add it to your `~/.bashrc`):

```shell
module use /nfs/software/sci/spack/modulefiles/*
```
## Common Module Commands

| Command | Description |
|---|---|
| `module avail` | List all available modules |
| `module load <name>` | Load a module |
| `module unload <name>` | Unload a module |
| `module swap <old> <new>` | Swap one loaded module for another |
| `module list` | Show currently loaded modules |
| `module purge` | Unload all loaded modules |
| `module spider <name>` | Search all modules, including those not currently visible |
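A typical session combining these commands might look like the following. The module names are taken from the tables below; run `module avail` to confirm the exact names on your system:

```shell
# Start from a clean environment
module purge

# Load a compiler, then an MPI stack built against it
module load compilers/gcc-13.3.0
module load openmpi-5.0.2/gcc-13.3.0-ucx-slurm

# Verify what is loaded
module list

# Later, switch Open MPI versions in one step
module swap openmpi-5.0.2/gcc-13.3.0-ucx-slurm openmpi-4.1.7/gcc-13.3.0-ucx-slurm
```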
## Architecture-Specific Modules
Available on all node types (Westmere, Broadwell, Cascade Lake, Ice Lake, Sapphire Rapids, Zen4):
| Module | Description |
|---|---|
| `openmpi-4.1.7/gcc-13.3.0-ucx-slurm` | Open MPI 4.1.7 with UCX and SLURM support |
| `openmpi-5.0.2/gcc-13.3.0-ucx-slurm` | Open MPI 5.0.2 with UCX and SLURM support |
| `osu-micro-benchmarks-7.5/gcc-13.3.0` | OSU MPI benchmarking suite (latency, bandwidth) |
| `ucx-1.17.0/gcc-13.3.0` | UCX communication framework (InfiniBand, TCP) |
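Because these Open MPI builds are SLURM-integrated, a batch job can launch ranks with `srun` directly. A minimal sketch of a two-node OSU latency run follows; the job-name, node counts, and time limit are illustrative, and you may need site-specific `#SBATCH` options (e.g. a partition or account) not shown here:

```shell
#!/bin/bash
#SBATCH --job-name=osu-latency
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=1
#SBATCH --time=00:10:00

# Build a clean environment with matching MPI and benchmark modules
module purge
module load openmpi-5.0.2/gcc-13.3.0-ucx-slurm
module load osu-micro-benchmarks-7.5/gcc-13.3.0

# With SLURM-integrated Open MPI, srun starts the MPI ranks directly
srun osu_latency
```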
## General-Purpose Modules

### Compilers
| Module | Description |
|---|---|
| `compilers/gcc-13.3.0` | GCC 13.3.0 |
| `compilers/intel-oneapi-compilers-2024.2.0` | Intel oneAPI compilers (icx, icpx, ifx) |
### MPI
| Module | Description |
|---|---|
| `libs/openmpi-5.0.5` | Open MPI 5.0.5 (standalone) |
| `libs/openmpi-5.0.5` (D) | Open MPI 5.0.5 with SLURM integration |
| `libs/intel-oneapi-mpi-2021.14.0` | Intel MPI 2021.14.0 |
### Math & Linear Algebra
| Module | Description |
|---|---|
| `libs/openblas-0.3.28` | OpenBLAS — optimized BLAS/LAPACK |
| `libs/intel-oneapi-mkl-2024.2.0` | Intel MKL — BLAS, LAPACK, FFT |
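A sketch of compiling against OpenBLAS after loading the modules above. The source file name is hypothetical, and this assumes the module places OpenBLAS on the compiler's default search paths; if it does not, add explicit `-I`/`-L` flags pointing at the module's installation prefix:

```shell
module load compilers/gcc-13.3.0
module load libs/openblas-0.3.28

# dgemm_example.c is a placeholder for your own BLAS-calling code.
# -lopenblas provides both the BLAS and LAPACK symbols.
gcc -O2 dgemm_example.c -o dgemm_example -lopenblas
```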
### I/O & Applications
| Module | Description |
|---|---|
| `libs/hdf5-1.14.5` | HDF5 — hierarchical data format library |
| `apps/python-3.11.0` | Python 3.11.0 |
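As a quick check that these modules work, you can compile an HDF5 program with `h5cc` (HDF5's compiler wrapper, which supplies the include and link flags) and confirm the Python interpreter version. The C source file name is a placeholder:

```shell
module load libs/hdf5-1.14.5
module load apps/python-3.11.0

# my_io_example.c stands in for your own HDF5-using code
h5cc my_io_example.c -o my_io_example

# The module's interpreter should report Python 3.11.0
python3 --version
```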
*(D)* marks the default module, i.e. the variant Lmod loads when no version is specified.