AMD Toolchain with SPACK

Introduction

Open MPI, from the open source project https://www.open-mpi.org/, is an implementation of the Message Passing Interface (MPI) used for communication between processes on parallel and distributed computers.

Install these packages before building Open MPI with Spack:

  • Install the XPMEM package
  • Install the UCX package
  • Add third-party packages such as hcoll and mxm to Spack’s packages.yaml file

Build XPMEM using Spack

XPMEM is a Linux® kernel extension used for efficient shared memory communication. This kernel extension package must be built with the GCC compiler.

Source Code: http://gitlab.com/hjelmn/xpmem

XPMEM
# Format For Building XPMEM
$ spack -d install -v xpmem@<Version Number> %gcc@<Version Number>
# Example: For Building XPMEM 2.6.5-36 with default GCC 8.3.1
$ spack -d install -v xpmem@2.6.5-36 %gcc@8.3.1

Specifications and Dependencies

Symbol Meaning
-d To enable debug output (*)
-v To enable verbose output (*)
@ To specify version number
% To specify compiler

Flags marked (*) are optional
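As a minimal sketch, the install spec from the format above can be assembled from shell variables so the same command works across versions. The version numbers below are the examples from this section, not requirements:

```shell
# Sketch: assemble the XPMEM install spec from version variables.
# These versions are the examples above; substitute your own.
XPMEM_VERSION="2.6.5-36"
GCC_VERSION="8.3.1"
SPEC="xpmem@${XPMEM_VERSION} %gcc@${GCC_VERSION}"
# Print the assembled command (drop -d / -v if you do not want debug/verbose output)
echo "spack -d install -v ${SPEC}"
```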

Build UCX using Spack

UCX is a unified communication abstraction that provides high-performance communication services over a variety of network interconnects and shared memory technologies. We recommend compiling UCX with GCC only.

Source Code: http://www.openucx.org/downloads

UCX
# Format For Building UCX
$ spack -d install -v ucx@<Version Number> %gcc@<Version Number>
# Example: For Building UCX 1.9.0 with default GCC 8.3.1
$ spack -d install -v ucx@1.9.0 %gcc@8.3.1 +thread_multiple +knem +xpmem +cma +rc +ud +dc +mlx5-dv +ib-hw-tm +dm +cm

Note: UCX target-specific flags, such as AVX, are enabled by default.

Specifications and Dependencies

Symbol Meaning
-d To enable debug output (*)
-v To enable verbose output (*)
@ To specify version number
% To specify compiler
+thread_multiple To enable thread support in UCP and UCT
+knem To enable KNEM support
+xpmem To enable XPMEM support
+cma To enable Cross Memory Attach
+rc To compile with IB Reliable Connection support
+dc To compile with IB Dynamic Connection support
+ud To compile with IB Unreliable Datagram support
+mlx5-dv To compile with mlx5 Direct Verbs support
+ib-hw-tm To compile with IB Tag Matching support
+dm To compile with Device Memory support
+cm To compile with IB Connection Manager support

Flags marked (*) are optional
Note: The +cm option may not work on RHEL 8 and above
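Because +cm may not work on RHEL 8 and above, the variant list can be assembled conditionally. The sketch below hardcodes RHEL_MAJOR for illustration; on a real system it would come from /etc/os-release:

```shell
# Sketch: build the UCX variant list, appending +cm only on RHEL 7 and older.
UCX_VERSION="1.9.0"
GCC_VERSION="8.3.1"
VARIANTS="+thread_multiple +knem +xpmem +cma +rc +ud +dc +mlx5-dv +ib-hw-tm +dm"
RHEL_MAJOR=7   # illustrative value; read VERSION_ID from /etc/os-release in practice
if [ "${RHEL_MAJOR}" -lt 8 ]; then
    VARIANTS="${VARIANTS} +cm"
fi
echo "spack -d install -v ucx@${UCX_VERSION} %gcc@${GCC_VERSION} ${VARIANTS}"
```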

Adding third party packages to Spack

Before installing Open MPI using Spack, AMD recommends that users register the optional Mellanox® packages hcoll and mxm as external packages.

third party packages
Step 1: The command below opens the packages.yaml file for editing.
$ spack config edit packages
Step 2: Add the hcoll and mxm specs and prefixes to packages.yaml, for example:
packages:
  hcoll:
    externals:
    - spec: hcoll@5.2.2
      prefix: /opt/mellanox/hcoll
  mxm:
    externals:
    - spec: mxm@5.2.2
      prefix: /opt/mxm
Step 3: The command below confirms whether hcoll@<Version Number> has been added as an external package.
$ spack spec hcoll
Input spec
--------------------------------
hcoll
Concretized
--------------------------------
hcoll@5.2.2%gcc@8.3.1 arch=linux-centos8-zen
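The prefixes registered in packages.yaml must point at real installations. A small sketch, using the example prefixes from the snippet above, checks that they exist before the edit:

```shell
# Sketch: verify that the hcoll/mxm prefixes exist on this system before
# registering them as externals. The paths are the examples from packages.yaml;
# adjust them to match your Mellanox installation.
STATUS=""
for prefix in /opt/mellanox/hcoll /opt/mxm; do
    if [ -d "${prefix}" ]; then
        STATUS="${STATUS}found:${prefix} "
    else
        STATUS="${STATUS}missing:${prefix} "
    fi
done
echo "${STATUS}"
```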

Build Open MPI using Spack

OpenMPI
# Format For Building Open MPI
$ spack -d install -v openmpi@<Version Number> %aocc@<Version Number> fabrics=<List> <List fabrics, versions and target architecture details>
# Example: For Building Open MPI 4.0.5 with AOCC 3.1.0
$ spack -d install -v openmpi@4.0.5+cxx %aocc@3.1.0 target=zen3 fabrics=xpmem,knem,ucx,mxm,hcoll ^xpmem@2.6.5-36%gcc@8.3.1 target=zen ^knem@1.1.4%gcc@8.3.1 target=zen ^ucx@1.9.0%gcc@8.3.1 target=zen
# Example: For Building Open MPI 4.0.5 with AOCC 3.0.0
$ spack -d install -v openmpi@4.0.5+cxx %aocc@3.0.0 target=zen3 fabrics=xpmem,knem,ucx,mxm,hcoll ^xpmem@2.6.5-36%gcc@8.3.1 target=zen ^knem@1.1.4%gcc@8.3.1 target=zen ^ucx@1.9.0%gcc@8.3.1 target=zen
# Example: For Building Open MPI 4.0.5 with AOCC 2.3.0
$ spack -d install -v openmpi@4.0.5+cxx %aocc@2.3.0 target=zen2 fabrics=xpmem,knem,ucx,mxm,hcoll ^xpmem@2.6.5-36%gcc@8.3.1 target=zen ^knem@1.1.4%gcc@8.3.1 target=zen ^ucx@1.9.0%gcc@8.3.1 target=zen
# Example: For Building Open MPI 4.0.5 with AOCC 2.2.0
$ spack -d install -v openmpi@4.0.5+cxx %aocc@2.2.0 target=zen2 fabrics=xpmem,knem,ucx,mxm,hcoll ^xpmem@2.6.5-36%gcc@8.3.1 target=zen ^knem@1.1.4%gcc@8.3.1 target=zen ^ucx@1.9.0%gcc@8.3.1 target=zen

Note: knem, xpmem, and ucx must always be installed with the GCC compilers. In the build commands above, knem, xpmem, and ucx are compiled with GCC 8.3.1; change the GCC version in your Spack build command to match the preferred GCC available on your system. To check the available compilers, use the command “spack compilers”. The “target” option for knem, xpmem, and ucx must be set according to the GCC version used: for GCC 8 and lower, use “target=zen”; for GCC 9 and 10, use “target=zen2”; for GCC 11 and higher, use “target=zen3”.

Symbol Meaning
-d To enable debug output (*)
-v To enable verbose output (*)
@ To specify version number
% To specify compiler
target= To specify architecture (For cross compilation)
fabrics= To pass fabrics (*)
xpmem To build with XPMEM kernel module support (*)
knem To build knem Linux® kernel module support (*)
ucx To build with Unified Communication X library support (*)
mxm To build with Mellanox® Messaging support (*)
hcoll To build with hcoll (Mellanox® Hierarchical Collectives) (*)
^xpmem@2.6.5-36 Use xpmem version 2.6.5-36; the default is 2.6.5 (*)
^knem@1.1.4 Use knem version 1.1.4 (the default) (*)
^ucx@1.9.0 Use ucx version 1.9.0 (the default) (*)

Flags marked (*) are optional but highly recommended
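The GCC-version-to-target mapping from the note above can be sketched as a small shell helper. GCC_MAJOR is hardcoded for illustration; on a real system it can be queried with gcc -dumpversion:

```shell
# Sketch: pick the knem/xpmem/ucx "target" value from the GCC major version,
# following the note above (zen for GCC <= 8, zen2 for 9-10, zen3 for >= 11).
GCC_MAJOR=8   # illustrative; in practice: GCC_MAJOR=$(gcc -dumpversion | cut -d. -f1)
if [ "${GCC_MAJOR}" -ge 11 ]; then
    TARGET=zen3
elif [ "${GCC_MAJOR}" -ge 9 ]; then
    TARGET=zen2
else
    TARGET=zen
fi
echo "target=${TARGET}"
```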