Master Node
- 2× Intel Xeon E5-2640v4 (2.4 GHz, 10-core)
- 64GB RAM
- 2× 2TB 7200RPM HDDs, RAID1
- Gigabit Ethernet
- Infiniband
Storage Node
- 2× Intel Xeon E5-2650v4 (2.2 GHz, 12-core)
- 64GB RAM
- 2× 480GB SSDs (sdb and sdc, RAID, / and /boot)
- 12× 4TB 7200RPM HDDs (logical volume group vg-data, /data, RAID6)
- Gigabit Ethernet
- Infiniband
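A quick sanity check on the /data array above: RAID6 reserves two disks' worth of capacity for parity, so the 12-disk volume provides (12 − 2) × 4 TB of raw usable space.

```python
# RAID6 usable-capacity check for the 12 x 4TB /data array.
# RAID6 dedicates two disks' worth of space to parity.
def raid6_usable_tb(num_disks: int, disk_tb: float) -> float:
    if num_disks < 4:
        raise ValueError("RAID6 requires at least 4 disks")
    return (num_disks - 2) * disk_tb

print(raid6_usable_tb(12, 4))  # 40 TB raw, before filesystem overhead
```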
Compute Nodes (12)
- 10 nodes:
- 2× Intel Xeon E5-2640v4 (2.4 GHz, 10-core)
- 64GB RAM
- 64GB SATA DOM (local filesystem)
- 2 nodes:
- 2× Intel Xeon Cascade Lake Gold 5215 (2.5 GHz, 10-core)
- 128GB RAM
- 64GB SATA DOM (local filesystem)
Gigabit Ethernet Switch
- Cisco Catalyst 2960-X
Infiniband Switch
- Mellanox SwitchX-2
Software
- Scheduler:
- SLURM:
- version 22.05.9
- SLURM Documentation (https://slurm.schedmd.com/archive/slurm-22.05.9/)
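Jobs are submitted to SLURM as batch scripts. The sketch below is a minimal example; the partition name is a placeholder, since this page does not list the cluster's partitions (`sinfo` shows the real ones).

```shell
#!/bin/bash
# Minimal SLURM batch script. The partition name "compute" is an
# assumption -- run `sinfo` on the cluster to see actual partitions.
#SBATCH --job-name=hello
#SBATCH --partition=compute
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --mem=1G
#SBATCH --time=00:05:00
#SBATCH --output=hello_%j.out

hostname
```

Submit with `sbatch hello.sbatch` and monitor with `squeue -u $USER`.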
- Open OnDemand:
- Build and run jobs, access a shell or an interactive session in your browser
- Documentation (https://openondemand.org/)
- Access Grinnell's Cluster (https://hpc.grinnell.edu) (on-campus network connection required)
- MPI:
- OpenMPI:
- version 1.8.8 (gcc build)
- Chemistry:
- WebMO
- Application directory: /data/home/webmo/webmo
- Globals config file: /data/home/webmo/webmo/globals.ini
- Gaussian 09 (http://gaussian.com)
- Application directory: /usr/local/gaussian
- NBO6
- Application directory: /usr/local/nbo6
- Symbolic link: /usr/local/gaunbo6 → /usr/local/nbo6/bin/gaunbo6
- GROMACS
- Load with `module load gromacs2023.2`
- Application directory: /usr/local/gromacs
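A sketch of running GROMACS from the module named above; the input file names (`md.mdp`, `conf.gro`, `topol.top`) are placeholders for an actual prepared system.

```shell
# Load the GROMACS module listed on this page, then preprocess and run.
# Input file names below are placeholders.
module load gromacs2023.2
gmx grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr   # build run input
srun gmx mdrun -deffnm md                                 # run under SLURM
```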
- Physics:
- Charm++