GROMACS

License information

GROMACS is Free Software, available under the GNU Lesser General Public License (LGPL), version 2.1. You can redistribute it and/or modify it under the terms of the LGPL as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version.

See the "About it" page on the GROMACS web site.

User documentation

A note about the GPU versions.

Two different versions of GROMACS exist for AMD GPUs.

  • The GROMACS developers use SYCL for their AMD GPU implementation. These versions are, or will become, part of the official GROMACS distribution channels.
  • AMD maintains a HIP port of the CUDA version. It is unclear, though, to what extent that branch will see further development as GROMACS evolves, and the GROMACS developers do not provide support for this version.

According to tests in June 2023, the HIP port offered 15-25% more performance than the SYCL port, but the official GROMACS release is more thoroughly tested and is supported by the development team.

Check the technical documentation of the EasyConfigs, further down this page, to find out on which branch of GROMACS each recipe is based.

Alternatively, you can load one of the CSC-compiled versions, which are available as modules; batch script templates for different use cases are also provided. Expect the performance of a single GCD to exceed that of a 128-core CPU node. Please consult the instructions on how to enable CSC-installed modules on LUMI. The CSC-compiled versions are supported by CSC rather than by the LUMI User Support Team.
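
For example, enabling the CSC software collection and loading such a module could look roughly as follows (the module path and names are only a sketch; the CSC instructions are authoritative):

module use /appl/local/csc/modulefiles    # make the CSC software collection visible
module avail gromacs                      # list the CSC-compiled GROMACS modules
module load gromacs                       # load one; pick an explicit version in practice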

A note about the CPU versions with PLUMED after the March/April 2023 system maintenance/update

After the March/April 2023 system update, building PLUMED broke, so those EasyConfigs were replaced with two versions: one without Python support and one with a different way of enabling support for the cray-python modules. The corresponding GROMACS EasyConfigs have therefore also been replaced.

This was done because it is also unclear whether the Python support is even needed when PLUMED is used with GROMACS.

See also the page on PLUMED.

Training materials

User-installable modules (and EasyConfigs)

Install with the EasyBuild-user module:

eb <easyconfig> -r

To access the module help after installation, and to be reminded for which stacks and partitions the module is installed, use module spider GROMACS/<version>.
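
A minimal sketch of the full workflow (the stack version, partition and EasyConfig name below are placeholders; substitute the ones that apply to your case):

module load LUMI/23.09 partition/G    # select a software stack and partition
module load EasyBuild-user            # set up EasyBuild for installation in your own space
eb GROMACS-<version>.eb -r            # install the chosen EasyConfig and missing dependencies
module spider GROMACS/<version>       # afterwards: see where the module is available and read its help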

EasyConfig:

Technical documentation

GROMACS and PLUMED

PLUMED is software that can be combined with GROMACS. It works via a patch that has to be applied to the GROMACS sources before compiling GROMACS.

It is rather difficult to figure out which versions of PLUMED and GROMACS can be combined. One option is to look at the tags in the PLUMED GitHub repository and check the contents of the patches subdirectory for each tag. Another way to find out which GROMACS versions are supported is to load the PLUMED module and run plumed-patch -l to get a list of the included engines.
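
For example (the PLUMED module name and version are deliberately left generic here):

module load PLUMED          # load an installed PLUMED module (add the version/toolchain you use)
plumed-patch -l             # list the MD engines, and thus GROMACS versions, this PLUMED can patch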

PLUMED   GROMACS release series
         2019     2020     2021     2022     2023     2024
2.9.2                                2022.5   2023.5   2024.3
2.9.0             2020.7   2021.7   2022.5   2023
2.8.3    2019.6   2020.7   2021.7   2022.5
2.8.0    2019.6   2020.6   2021.4
2.7.4    2019.6   2020.6   2021.4
2.7.3    2019.6   2020.6   2021.4
2.7.2    2019.6   2020.6   2021
2.7.1    2019.6   2020.5   2021
2.7.0    2019.6   2020.4
2.6.4    2019.6   2020.4
2.6.3    2019.6   2020.4
2.6.2    2019.6   2020.4
2.6.1    2019.6   2020.2
2.6.0    2019.4

GROMACS and GPU

EasyBuild

Version 2020.6 for CPE 21.08

  • The EasyConfig is a straightforward port of the CSCS one, with some information borrowed from the UAntwerpen EasyConfig.

  • We added a bash function, gromacs-completion, that can be used to turn on command completion for GROMACS (a short usage sketch follows this list).

  • Note that the EasyConfig does not run the GROMACS tests, presumably because they require an mpirun script and/or should be run in the context of a suitable compute job.

  • The AMD version does not support cray-hugepages. Activating it causes the Cray compiler wrapper to add a linker option that the linker does not accept.
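
A short usage sketch for the completion helper mentioned above (the module name is a placeholder; use the version you installed):

module load GROMACS/<version>    # load the module installed from this EasyConfig
gromacs-completion               # bash function provided by the module; enables completion for gmx
# Afterwards, pressing <TAB> after 'gmx ' completes GROMACS subcommands in an interactive bash shell.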

Version 2021.3 for CPE 21.08

  • We started from our own EasyConfig for 2020.6, but had to omit GMX_PREFER_STATIC_LIBS and add BUILD_SHARED_LIBS=ON to the CMake options to avoid an error message about building GMXAPI (see the sketch after this list).

  • Note that the EasyConfig does not run the GROMACS tests, presumably because they require an mpirun script and/or should be run in the context of a suitable compute job.

  • The AMD version does not support cray-hugepages. Activating it causes the Cray compiler wrapper to add a linker option that the linker does not accept.
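
For illustration only, the change between the two recipes amounts to roughly the following CMake options (the value of the old flag is an assumption; this is not the complete option list used by the EasyConfigs):

cmake .. -DGMX_PREFER_STATIC_LIBS=ON ...   # used in the 2020.6 recipe, now omitted
cmake .. -DBUILD_SHARED_LIBS=ON ...        # used in the 2021.3 recipe to avoid the GMXAPI build error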

Version 2020.4 with PLUMED 2.6.4 for CPE 21.08

  • The integration of the PLUMED patch is based on the CSCS EasyConfigs. We still compile single and double precision versions as the CSCS version does this.

    Old documents claim that one should only run GROMACS in double precision when using PLUMED, but we also found comments on PLUMED 2.x stating that PLUMED itself always runs in double precision, even when used with a single-precision GROMACS, to avoid numerical problems that can occur in single precision in some of the PLUMED routines.

  • Note that it was not possible to get GROMACS 2020.X to work with the cpeGNU 21.12 environment, which is based on GCC 11.2.0.

Version 2021.5 for CPE 21.12

  • Compiled with cpeGNU, cpeCray and cpeAOCC but not yet benchmarked.

  • The 2020 versions did not work with the GNU compiler in LUMI/21.12.

Version 2021.4 with PLUMED 2.7.4 and 2.8.0 for CPE 21.12 and later

  • As PLUMED 2.8.0 is a .0 version, we decided to also offer recipes for the latest 2.7 version at the time of development.

  • After the March/April 2023 system update, building PLUMED broke, so those EasyConfigs were replaced with two versions: one without Python support and one with a different way of enabling support for the cray-python modules. The corresponding GROMACS EasyConfigs have therefore also been replaced.

GROMACS-2023-dev-cpeGNU-22.08-MPI-GPU

  • This is an EasyConfig for AMD's own, unofficial HIP port of GROMACS, a version that is not supported by the main GROMACS developers, who prefer SYCL for AMD GPU support. It is derived from AMD's container recipes.

GROMACS 2023.2 and 2023.3 with AMD GPU support for CPE 22.12

  • There are different choices for building GROMACS with AMD GPU acceleration on LUMI, following the GROMACS installation guide:
    • The EasyConfig files for the 2023.2 release use the hipSYCL GPU backend with ROCm v5.2.3.
    • These versions should only be built with the AMD EasyBuild toolchain.
    • The MPI versions are the recommended ones on LUMI.
    • The HeFFTe variant allows offloading to multiple GPUs (it relies on rocFFT), with direct GPU communication and PME decomposition across multiple GPUs.
    • The VkFFT variant is faster but does not support PME decomposition. It is recommended for single-GPU runs (standalone or ensemble) or for multi-GPU runs with exactly one separate PME rank (i.e., the -npme 1 runtime option); see the sketch after this list.
    • thread-MPI is for single-node use only and does not support direct GPU communication; it is recommended only for single-GPU use.
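
As a sketch of a multi-GPU run with the VkFFT variant and one separate PME rank (the Slurm resource options and input file name are placeholders; the mdrun flags are standard GROMACS options):

export GMX_ENABLE_DIRECT_GPU_COMM=1    # request direct GPU communication in the MPI build
srun gmx_mpi mdrun -nb gpu -pme gpu -npme 1 -s topol.tpr    # offload nonbonded and PME work, one dedicated PME rank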

Version 2021.7 for CPE 23.09

  • Trivial version bump of our EasyConfigs for 2021.5 and 2021.6.

Version 2022.6 for CPE 23.09

  • For now a trivial version bump of the 2021.5/2021.6 series.

Version 2023.3 for CPE 23.09

  • For now a trivial version bump of the 2021.5/2021.6 series.

Release 2024.1 for CPE 23.09

  • For running on AMD GPUs, the recommended variants are:
    • Multiple GPUs: AdaptiveCpp 23.10.0 with ROCm 5.4.6 and instant submission (enabled by default)

Release 2024.3 for CPE 24.03

  • Ports of the previous versions.

  • The PLUMED manual states that PLUMED is incompatible with thread-MPI, so thread-MPI has been explicitly turned off in the modules with PLUMED support (see the sketch after this list).

  • Turned off GMXAPI support in cpeCray and cpeAMD builds as those libraries wouldn't work with GCC-compiled Python distributions anyway due to OpenMP runtime conflicts.
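
Illustrative CMake settings corresponding to the two changes above (not the full option set of the EasyConfigs):

cmake .. -DGMX_MPI=ON -DGMX_THREAD_MPI=OFF ...   # PLUMED-enabled builds: thread-MPI disabled
cmake .. -DGMXAPI=OFF ...                        # cpeCray and cpeAMD builds: gmxapi disabled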

Archived EasyConfigs

The EasyConfigs below are additional EasyConfigs that are not directly available on the system for installation. Users are advised to use the newer ones; the archived ones are unsupported. They are still provided as a source of information, e.g., to help understand the configuration that was used for earlier work on the system.