AMD Toolchain with SPACK


SPACK HPC Applications

Spack allows you to customize how your software is built through the packages.yaml file. You can direct Spack to prefer particular implementations of virtual dependencies (e.g., MPI or BLAS/LAPACK), or you can direct Spack to prefer to build with particular compilers. You can also direct Spack to use external software installations already present on your system.

You can either set build preferences specifically for one package, or you can specify that certain settings should apply to all packages. The types of settings you can customize are described in detail below.

Spack’s build defaults are defined in etc/spack/defaults/packages.yaml. You can override them in ~/.spack/packages.yaml or in an installation’s etc/spack/packages.yaml.
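As an illustrative sketch of such overrides (the specific package names, compiler names, and providers below are assumptions for illustration, not taken from this document), a user-level ~/.spack/packages.yaml might look like this:

```yaml
# Hypothetical ~/.spack/packages.yaml illustrating build preferences.
packages:
  # Preferences for one specific package:
  gromacs:
    variants: +openmp
  # Preferences applied to all packages:
  all:
    # Prefer these compilers, in order:
    compiler: [aocc, gcc]
    # Prefer these providers for virtual dependencies:
    providers:
      mpi: [openmpi, mpich]
      blas: [amdblis]
      lapack: [amdlibflame]
```

With a file like this in place, a plain spack install of a package that depends on the virtual mpi or blas packages will prefer the listed providers unless the command line says otherwise.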

Note: For more details, please check the official Spack documentation page.

External Packages

Spack can be configured to use externally-installed packages rather than building its own packages. This may be desirable if machines ship with system packages, such as a customized MPI that should be used instead of Spack building its own MPI.

External packages are configured through the packages.yaml file found in a Spack installation’s etc/spack/ or a user’s ~/.spack/ directory.

Here’s an example of an external configuration:

Sample packages.yaml

packages:
  openmpi:
    externals:
    - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
      prefix: /opt/openmpi-1.4.3
    - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
      prefix: /opt/openmpi-1.4.3-debug
    - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
      prefix: /opt/openmpi-1.6.5-intel

This example lists three installations of Open MPI, one built with GCC, one built with GCC and debug information, and another built with the Intel® compiler. If Spack is asked to build a package that uses one of these MPIs as a dependency, it will use the pre-installed Open MPI in the given directory. Note that the specified path is the top-level install prefix, not the bin subdirectory.
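If you want Spack to always use the pre-installed MPI and never fall back to building its own, the entry can additionally be marked non-buildable. A minimal sketch, reusing one of the specs and prefixes from the example above:

```yaml
packages:
  openmpi:
    externals:
    - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
      prefix: /opt/openmpi-1.6.5-intel
    # With buildable: false, Spack must satisfy openmpi from the
    # externals listed above and will error out rather than build
    # Open MPI from source.
    buildable: false
```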

Automatically Find External Packages

You can run the spack external find command to search for system-provided packages and add them to packages.yaml. After running this command your packages.yaml may include new entries:

packages:
  cmake:
    externals:
    - spec: cmake@3.17.2
      prefix: /usr

Generally this is useful for detecting a small set of commonly-used packages; for now this is generally limited to finding build-only dependencies. Specific limitations include:

  • Packages are not discoverable by default: For a package to be discoverable with spack external find, its package recipe needs to implement special detection logic. See the Spack documentation for more details.
  • The current implementation only collects and examines executable files, so it is typically only useful for build/run dependencies (in some cases, if a library package also provides an executable, it may be possible to extract a meaningful Spec by running the executable – for example the compiler wrappers in MPI implementations).
  • The logic does not search through module files; it can only detect packages with executables defined in PATH. You can help Spack locate externals that use module files by loading associated modules for packages that you want Spack to know about before running spack external find.
  • Spack does not overwrite existing entries in the package configuration: If there is an external defined for a spec at any configuration scope, then Spack will not add a new external entry (spack config blame packages can help locate all external entries).
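For externals that are normally accessed through module files, an entry in packages.yaml can reference the module instead of an install prefix, and Spack will query the module to locate the installation. A sketch of such an entry, using the module name that appears in the steps below:

```yaml
packages:
  openmpi:
    externals:
    - spec: openmpi@4.0.5%aocc@2.2.0
      # Spack resolves the install location from this module
      # instead of an explicit prefix.
      modules: [openmpi/aocc22/4.0.5]
```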

Steps to add an externally built Open MPI installation to Spack

On clusters where applications are already installed, load them with the module command and add them to Spack using the spack external find command.

Steps to add pre-built Open MPI to Spack
# Load the openmpi module
$ module load openmpi/aocc22/4.0.5
# find the available packages
$ spack external find
==> The following specs have been detected on this system and added to /home/smoharan/.spack/packages.yaml
-- no arch / aocc@2.2.0 -----------------------------------------
# Now you can find the entries of these packages in packages.yaml file
# To locate all external entries

$ spack config blame packages
# Now you can use this Open MPI to build other applications as well
$ spack -d install -v hpcg%aocc@2.2.0 +openmp cflags="-O3" target=zen2 ^openmpi@4.0.5