HPC@Mines Module System

HPC@Mines provides a module system, which lets you set up the environment for running an application with one or two simple commands. Module commands can be run from the command line or placed in your .bashrc file. The primary module command is

module load Name_of_module_to_load

This loads a module, which sets up your environment to run a particular application. Typically this involves changing your PATH environment variable and possibly your LD_LIBRARY_PATH variable. There are also modules for setting up one of several different programming environments, though most users will not need to change their programming environment.
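As a sketch of the environment edits a module load performs (the install path here is hypothetical, used for illustration only, not a real HPC@Mines location):

```shell
# Sketch: what loading an application module effectively does.
# APP_ROOT is a hypothetical install directory, not a real path on HPC@Mines.
APP_ROOT=/opt/apps/example/1.0
export PATH="$APP_ROOT/bin:$PATH"                            # executables become visible
export LD_LIBRARY_PATH="$APP_ROOT/lib:${LD_LIBRARY_PATH:-}"  # shared libraries resolve
echo "$PATH" | cut -d: -f1   # -> /opt/apps/example/1.0/bin
```

A module unload reverses these edits, which is one reason modules are preferable to hand-editing PATH in your .bashrc.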

As we build applications, we also create module files to make running those applications easier.

Available Modules

There are two ways to see available modules.

First: Visit the links for your system:

  • Modules for AuN
  • Modules for Mc2
  • Modules for Mio

Second: Run the command:

module avail

Running the command on a node gives you the most up-to-date list; the web pages may be out of sync.

The full list contains modules for user applications, utilities, and different programming environments. You can limit the list to just the modules created for running applications.

To see a list of modules built for various user programs, run the command:

module avail Apps

"Apps" restricts the listing to only user programs. For example, on AuN we currently (Mon Sep 30 15:51:48 MDT 2013) get:

[aun001 job-scripts]$ module avail Apps
---------------------------------------------------------------- /usr/share/Modules/modulefiles ----------------------------------------------------------------
dot         module-cvs  module-info modules     null        use.own

----------------------------------------------------------------------- /etc/modulefiles -----------------------------------------------------------------------

----------------------------------------------------------------------- /opt/modulefiles -----------------------------------------------------------------------
Apps/AmberMD/amber12-ambertools12       Core/Intel                              PrgEnv/python/com/enthought/2.7.2_7.1-2
Apps/AmberMD/amber12-ambertools13       PrgEnv/Debug/ddt-4.1                    PrgEnv/python/gcc/2.7.7
Apps/LAMMPS/9Sep13                      PrgEnv/Debug/ddt-4.2                    PrgEnv/python/gcc/3.4.1
Apps/LIGGGHTS/2.3.7                     PrgEnv/gcc/gcc-4.6.2                    PrgEnv/tau/2.23.1_5.3.2
Apps/MaterialsStudio/6.0                PrgEnv/gcc/gcc-4.8.0                    impi/gcc/4.0.1
Apps/RGWBS/Jan2_2009                    PrgEnv/hdf5/gcc/1.8.11-shared           impi/gcc/4.0.3
Apps/abinit/5.4                         PrgEnv/hdf5/intel/1.8.11                impi/gcc/4.1.0
Apps/abinit/6.10.2                      PrgEnv/intel/11.1.0                     impi/gcc/4.1.1
Apps/abinit/6.10.2-BM                   PrgEnv/intel/12.0.0                     impi/intel/4.0.1
Apps/abinit/7.4.1                       PrgEnv/intel/12.1.0                     impi/intel/4.0.3
Apps/gromacs/4.5.5-BM                   PrgEnv/intel/13.0.1                     impi/intel/4.1.0
Apps/gromacs/4.6.5/openmpi              PrgEnv/intel/13.1.2                     impi/intel/4.1.1
Apps/nwchem/6.3                         PrgEnv/intel/default                    mvapich/gcc/1.9
Apps/q-chem/4.1                         PrgEnv/libs/fftw/3.3.3                  mvapich/intel/1.9
Apps/q-chem/4.2                         PrgEnv/libs/opencl/1.2                  openmpi/gcc/1.4.3
Apps/qwalk/0.97.0                       PrgEnv/libs/szlib                       openmpi/gcc/1.6.5
Apps/siesta/3.0                         PrgEnv/libs/zlib                        openmpi/intel/1.4.2-11.1.0
Apps/siesta/3.1                         PrgEnv/netcdf/C/4.0.1                   openmpi/intel/1.4.3-qlc
Apps/siesta/3.2                         PrgEnv/netcdf/C/4.3.0                   openmpi/intel/1.6.5
Apps/siesta/3.2-pl3-test                PrgEnv/netcdf/C/4.3c                    openmpi/intel/1.6.5_12.1.0
Apps/vasp/5.2.12                        PrgEnv/netcdf/FORTRAN/4.2               openmpi/intel/1.6.5_slurm
Apps/vasp/5.2.12-BM                     PrgEnv/netcdf/combined/4.2c_ifort       openmpi/intel/default
Apps/vasp/5.2.12-OMPI                   PrgEnv/netcdf/combined/C-4.3.0_F-4.2    utility
Apps/vasp/5.3.3                         PrgEnv/pgi/14.6
Core/Devel                              PrgEnv/python/com/anaconda/2.7.6_2.0.0

Each entry lists the application name and version number. "-BM" indicates that the version was built by IBM for benchmarking.

On Mc2 we have a module to facilitate running LAMMPS. The command:

module load Apps/LAMMPS/22jun07-BM

loads the module, which adds the LAMMPS executable directory to your PATH and sets an environment variable pointing to the program's install directory. The modules we build for user applications all set such a variable for the program of interest, LAMMPSROOT in this case.

The "ROOT" directory for most of these modules contains run scripts and data sets that you can use as templates for your own scripts.


For example, back to LAMMPS, we have:

[mc2 job-scripts]$ ls -l $LAMMPSROOT
total 65688
drwxrwsr-x 4 root    bluemdev      512 Sep 23 15:34 job-scripts
-rwxr-xr-x 1 ibmtest bluemdev 67249077 Jul 12 04:05 lmp
drwxr-sr-x 2 root    bluemdev     8192 Sep 23 11:13 potentials


[mc2 job-scripts]$ ls -R $LAMMPSROOT/job-scripts
128n16p  32n16p  README

128n16p:
cuu3  input.128nodes  lammps.cmd

32n16p:
cuu3  input.32nodes  lammps.cmd
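Since $LAMMPSROOT is only defined once the module is loaded, the sketch below uses throwaway stand-in directories; the idea is simply to copy a template job directory into your own space and edit the copies there:

```shell
# Build a stand-in for "$LAMMPSROOT/job-scripts/32n16p" (illustration only;
# on Mc2 you would copy the real template directory set up by the module).
SRC="$(mktemp -d)/32n16p"
mkdir -p "$SRC"
touch "$SRC/lammps.cmd" "$SRC/input.32nodes"

# Copy the template into a working area; edit the copies, not the originals.
DEST="$(mktemp -d)"
cp -r "$SRC" "$DEST"
ls "$DEST/32n16p"   # lists the copied template files
```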

A module can be preloaded or loaded as part of your run script. That is, you can put a module load command in your .bashrc file or enter it on the command line before you submit a script, or you can put the module load command in the script itself. The advantage of preloading is that it gives you access to the environment in which your program will run before you submit the script, in particular the "ROOT" environment variable and the path to the executable. Loading the module as part of your script ensures that the environment is set up properly for the application, independent of your login session.
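For instance, a run script that loads its own module could look like the following sketch (the scheduler directives, module name, and launch command are illustrative assumptions, not exact HPC@Mines settings):

```shell
# Write a minimal job script that loads its module itself, so the batch
# environment is correct no matter what the login session looks like.
# Directives, module name, and srun usage are assumptions for illustration.
cat > lammps.cmd <<'EOF'
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --time=01:00:00
module load Apps/LAMMPS/9Sep13
srun "$LAMMPSROOT/lmp" < input.32nodes
EOF
grep -c 'module load' lammps.cmd   # -> 1
```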

From here we suggest you follow the steps in the HPC@Mines Quick Start Guide, if you have not already done so.


GNU Compiling on Mc2

Before using the GNU compilers you will need to load two modules using the following commands:

module load PrgEnv/gcc/gcc-4.4.6.bgq
module load PrgEnv/MPI/gcc/gcc.bgq


Last Updated: 03/16/2018 14:41:45