2. Installation and Configuration¶
2.1. Installing on Linux/Mac with conda-forge¶
Conda is an open source package management system and environment management system for installing multiple versions of software packages and their dependencies and switching easily between them. If you have conda installed on your system, OpenMC can be installed via the conda-forge channel. First, add the conda-forge channel with:
conda config --add channels conda-forge
To list the versions of OpenMC that are available on the conda-forge channel, in your terminal window or an Anaconda Prompt run:
conda search openmc
OpenMC can then be installed with:
conda create -n openmc-env openmc
This will install OpenMC in a conda environment called openmc-env. To activate the environment, run:
conda activate openmc-env
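To confirm the installation worked, one quick check (assuming the environment created above is active) is to import the Python API and print its version:

```shell
# With the openmc-env environment active, confirm the Python API imports
# and report the installed version
python -c "import openmc; print(openmc.__version__)"
```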
2.2. Installing on Linux/Mac/Windows with Docker¶
OpenMC can be easily deployed using Docker on any Windows, Mac, or Linux system. With Docker running, execute the following command in the shell to download and run a Docker image with the most recent release of OpenMC from DockerHub:
docker run openmc/openmc:latest
This will take several minutes to run depending on your internet download speed. The command will place you in an interactive shell running in a Docker container with OpenMC installed.
Note
The docker run command supports many options for spawning containers, including mounting volumes from the host filesystem, which many users will find useful.
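As a sketch of the volume-mounting option mentioned above (the host path here is hypothetical and should be replaced with your own):

```shell
# Mount a host directory into the container so that input and output
# files persist outside the container, and open an interactive shell
docker run -it -v /path/to/my_model:/openmc_workdir openmc/openmc:latest
```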
2.3. Installing from Source using Spack¶
Spack is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. Please follow Spack’s setup guide to configure the Spack system.
The OpenMC Spack recipe has been configured with variants that match most options provided in the CMakeLists.txt file. To see a list of these variants and other information use:
spack info openmc
Note
By default, OpenMC builds with -O2 -g flags, which are equivalent to a CMake build type of RelWithDebInfo. In addition, MPI is OFF while OpenMP is ON.
It is recommended to install OpenMC with the Python API. Information about this Spack recipe can be found with the following command:
spack info py-openmc
Note
The only variant for the Python API is mpi.
The most basic installation of OpenMC can be accomplished by entering the following command:
spack install py-openmc
Caution
When installing any Spack package, dependencies are assumed to be at configured defaults unless otherwise specified in the specification on the command line. In the above example, assuming the default options weren't changed in Spack's package configuration, py-openmc will link against a non-optimized, non-MPI openmc. Even if an optimized openmc was built separately, Spack will rebuild openmc with optimization OFF. Thus, if you are trying to link against dependencies that were configured differently from the defaults, ^openmc[variants] will have to be present in the command.
For a more performant build of OpenMC with optimization turned ON and MPI provided by OpenMPI, the following command can be used:
spack install py-openmc+mpi ^openmc+optimize ^openmpi
Note
+mpi is automatically forwarded to OpenMC.
Tip
When installing py-openmc, Spack's preferred Python will be used. For example, assuming Spack's preferred Python is 3.8.7, to build py-openmc against the latest Python 3.7 instead, ^python@3.7.0:3.7.99 should be added to the specification on the command line. Additionally, a compiler type and version can be specified at the end of the command using %gcc@<version>, %intel@<version>, etc.
A useful feature of Spack is the ability to inspect the dependency tree before installation using Spack's spec tool:
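For example, reusing the optimized MPI specification from above:

```shell
# Show the fully resolved dependency tree for this spec without installing it
spack spec py-openmc+mpi ^openmc+optimize ^openmpi
```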
Once installed, environment/lmod modules can be generated, or Spack's load feature can be used to access the installed packages.
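The load feature is a one-liner:

```shell
# Make the installed package available in the current shell session
spack load py-openmc
```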
2.4. Installing from Source¶
2.4.1. Prerequisites¶
Required
A C/C++ compiler such as gcc
OpenMC’s core codebase is written in C++. The source files have been tested to work with a wide variety of compilers. If you are using a Debian-based distribution, you can install the g++ compiler using the following command:
sudo apt install g++
CMake cross-platform build system
The compiling and linking of source files is handled by CMake in a platform-independent manner. If you are using Debian or a Debian derivative such as Ubuntu, you can install CMake using the following command:
sudo apt install cmake
HDF5 Library for portable binary output format
OpenMC uses HDF5 for many input/output files. As such, you will need to have HDF5 installed on your computer. The installed version will need to have been compiled with the same compiler you intend to compile OpenMC with. If compiling with gcc from the APT repositories, users of Debian derivatives can install HDF5 and/or parallel HDF5 through the package manager:
sudo apt install libhdf5-dev
Parallel versions of the HDF5 library called libhdf5-mpich-dev and libhdf5-openmpi-dev exist, which are built against MPICH and OpenMPI, respectively. To link against a parallel HDF5 library, make sure to set the HDF5_PREFER_PARALLEL CMake option, e.g.:
CXX=mpicxx.mpich cmake -DHDF5_PREFER_PARALLEL=on ..
Note that the exact package names may vary depending on your particular distribution and version.
If you are building HDF5 from source in conjunction with MPI, we recommend that your HDF5 installation be built with parallel I/O features. An example of configuring HDF5 is listed below:
CC=mpicc ./configure --enable-parallel
You may omit --enable-parallel if you want to compile HDF5 in serial.
Optional
An MPI implementation for distributed-memory parallel runs
To compile with support for parallel runs on a distributed-memory architecture, you will need to have a valid implementation of MPI installed on your machine. The code has been tested and is known to work with the latest versions of both OpenMPI and MPICH. OpenMPI and/or MPICH can be installed on Debian derivatives with:
sudo apt install mpich libmpich-dev
sudo apt install openmpi-bin libopenmpi-dev
git version control software for obtaining source code
DAGMC toolkit for simulation using CAD-based geometries
OpenMC supports particle tracking in CAD-based geometries via the Direct Accelerated Geometry Monte Carlo (DAGMC) toolkit (installation instructions). For use in OpenMC, only the MOAB_DIR and BUILD_TALLY variables need to be specified in the CMake configuration step when building DAGMC. This option also allows unstructured mesh tallies on tetrahedral MOAB meshes. In addition to turning this option on, the path to the DAGMC installation should be specified as part of the CMAKE_PREFIX_PATH variable:
cmake -Ddagmc=on -DCMAKE_PREFIX_PATH=/path/to/dagmc/installation
libMesh mesh library framework for numerical simulations of partial differential equations
This optional dependency enables support for unstructured mesh tally filters using libMesh meshes. Any 3D element type supported by libMesh can be used, but the implementation is currently restricted to collision estimators. In addition to turning this option on, the path to the libMesh installation should be specified as part of the CMAKE_PREFIX_PATH variable:
CXX=mpicxx cmake -Dlibmesh=on -DCMAKE_PREFIX_PATH=/path/to/libmesh/installation
Note that libMesh is most commonly compiled with MPI support. If that is the case, then OpenMC should be compiled with MPI support as well.
2.4.2. Obtaining the Source¶
All OpenMC source code is hosted on GitHub. You can download the source code directly from GitHub or, if you have the git version control software installed on your computer, you can use git to obtain the source code. The latter method has the benefit that it is easy to receive updates directly from the GitHub repository. GitHub has a good set of instructions for how to set up git to work with GitHub since this involves setting up ssh keys. With git installed and set up, the following command will download the full source code from the GitHub repository:
git clone --recurse-submodules https://github.com/openmc-dev/openmc.git
By default, the cloned repository will be set to the development branch. To switch to the source of the latest stable release, run the following commands:
cd openmc
git checkout master
2.4.3. Build Configuration¶
Compiling OpenMC with CMake is carried out in two steps. First, cmake is run to determine the compiler, check whether optional packages (MPI, HDF5) are available, generate a list of dependencies between source files so that they may be compiled in the correct order, and generate a normal Makefile. The Makefile is then used by make to actually carry out the compile and linking commands. A typical out-of-source build would thus look something like the following:
mkdir build && cd build
cmake ..
make
Note that first a build directory is created as a subdirectory of the source directory. The Makefile in the top-level directory will automatically perform an out-of-source build with default options.
2.4.3.1. CMakeLists.txt Options¶
The following options are available in the CMakeLists.txt file:
- debug
- Enables debugging when compiling. The flags added are dependent on which compiler is used.
- profile
- Enables profiling using the GNU profiler, gprof.
- optimize
- Enables high-optimization using compiler-dependent flags. For gcc and Intel C++, this compiles with -O3.
- openmp
- Enables shared-memory parallelism using the OpenMP API. The C++ compiler being used must support OpenMP. (Default: on)
- dagmc
- Enables use of CAD-based DAGMC geometries and MOAB unstructured mesh tallies. Please see the note about DAGMC in the optional dependencies list for more information on this feature. The installation directory for DAGMC should also be defined as DAGMC_ROOT in the CMake configuration command. (Default: off)
- libmesh
- Enables the use of unstructured mesh tallies with libMesh. (Default: off)
- coverage
- Compile and link code instrumented for coverage analysis. This is typically used in conjunction with gcov.
To set any of these options (e.g. turning on debug mode), the following form should be used:
cmake -Ddebug=on /path/to/openmc
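Multiple options from the list above can be combined in a single configuration command. A sketch of an optimized build with DAGMC support (the DAGMC path is hypothetical):

```shell
# Optimized build with DAGMC geometry support enabled
cmake -Doptimize=on -Ddagmc=on \
      -DCMAKE_PREFIX_PATH=/path/to/dagmc/installation /path/to/openmc
```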
2.4.3.2. Compiling with MPI¶
To compile with MPI, set the CXX environment variable to the path to the MPI C++ wrapper. For example, in a bash shell:
export CXX=mpicxx
cmake /path/to/openmc
Note that in many shells, environment variables can be set for a single command, i.e.
CXX=mpicxx cmake /path/to/openmc
2.4.3.3. Selecting HDF5 Installation¶
CMakeLists.txt searches for the h5cc or h5pcc HDF5 C wrapper on your PATH environment variable and subsequently uses it to determine library locations and compile flags. If you have multiple installations of HDF5 or one that does not appear on your PATH, you can set the HDF5_ROOT environment variable to the root directory of the HDF5 installation, e.g.
export HDF5_ROOT=/opt/hdf5/1.8.15
cmake /path/to/openmc
This will cause CMake to search first in /opt/hdf5/1.8.15/bin for h5cc/h5pcc before it searches elsewhere. As noted above, an environment variable can typically be set for a single command, i.e.
HDF5_ROOT=/opt/hdf5/1.8.15 cmake /path/to/openmc
2.4.4. Compiling on Linux and Mac OS X¶
To compile OpenMC on Linux or Mac OS X, run the following commands from within the root directory of the source code:
mkdir build && cd build
cmake ..
make
make install
This will build an executable named openmc and install it (by default in /usr/local/bin). If you do not have administrative privileges, you can install OpenMC locally by specifying an install prefix when running cmake:
cmake -DCMAKE_INSTALL_PREFIX=$HOME/.local ..
The CMAKE_INSTALL_PREFIX variable can be changed to any path for which you have write access.
2.4.5. Compiling on Windows 10¶
Recent versions of Windows 10 include a subsystem for Linux that allows one to run Bash within Ubuntu running in Windows. First, follow the installation guide here to get Bash on Ubuntu on Windows set up. Once you are within bash, obtain the necessary prerequisites via apt. Finally, follow the instructions for compiling on Linux.
2.4.6. Testing Build¶
To run the test suite, you will first need to download a pre-generated cross section library along with windowed multipole data. Please refer to our Test Suite documentation for further details.
2.5. Installing Python API¶
If you installed OpenMC using Conda, no further steps are necessary in order to use OpenMC's Python API. However, if you are installing from source, the Python API is not installed by default when make install is run because in many situations it doesn't make sense to install a Python package in the same location as the openmc executable (for example, if you are installing the package into a virtual environment). The easiest way to install the openmc Python package is to use pip, which is included by default in Python 3.4+. From the root directory of the OpenMC distribution/repository, run:
pip install .
pip will first check that all required third-party packages have been installed, and if they are not present, they will be installed by downloading the appropriate packages from the Python Package Index (PyPI). However, do note that since pip runs the setup.py script, which requires NumPy, you will have to first install NumPy:
pip install numpy
2.5.1. Installing in “Development” Mode¶
If you are primarily doing development with OpenMC, it is strongly recommended to install the Python package in “editable” mode.
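With pip, an editable install is a one-liner, run from the root of the repository:

```shell
# Install in editable mode: changes to the Python source are picked up
# immediately, without reinstalling the package
pip install -e .
```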
2.5.2. Prerequisites¶
The Python API works with Python 3.5+. In addition to Python itself, the API relies on a number of third-party packages. All prerequisites can be installed using Conda (recommended), pip, or through the package manager in most Linux distributions.
Required
- NumPy
- NumPy is used extensively within the Python API for its powerful N-dimensional array.
- SciPy
- SciPy’s special functions, sparse matrices, and spatial data structures are used for several optional features in the API.
- pandas
- Pandas is used to generate tally DataFrames as demonstrated in an example notebook.
- h5py
- h5py provides Python bindings to the HDF5 library. Since OpenMC outputs various HDF5 files, h5py is needed to provide access to data within these files from Python.
- Matplotlib
- Matplotlib is used to provide plotting functionality in the API, such as the Universe.plot() method and the openmc.plot_xs() function.
- uncertainties
- The uncertainties package is used for decay data in the openmc.data module.
- lxml
- lxml is used for the openmc-validate-xml script and various other parts of the Python API.
Optional
- mpi4py
- mpi4py provides Python bindings to MPI for running distributed-memory parallel runs. This package is needed if you plan on running depletion simulations in parallel using MPI.
- Cython
- Cython is used for resonance reconstruction for ENDF data converted to openmc.data.IncidentNeutron.
- vtk
- The Python VTK bindings are needed to convert voxel and track files to VTK format.
- pytest
- The pytest framework is used for unit testing the Python API.
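A small helper script (not part of OpenMC; just a sketch) can report which of the prerequisites listed above are missing from the current environment:

```python
import importlib.util

def missing_packages(names):
    """Return the subset of names that cannot be imported in this environment."""
    return [name for name in names if importlib.util.find_spec(name) is None]

# Required third-party packages for the OpenMC Python API
required = ["numpy", "scipy", "pandas", "h5py", "matplotlib", "uncertainties", "lxml"]
print("Missing required packages:", missing_packages(required))
```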
If you are running simulations that require OpenMC's Python bindings to the C API (including depletion and CMFD), it is recommended to build h5py (and mpi4py, if you are using MPI) using the same compilers and HDF5 version as for OpenMC. Thus, the install process would proceed as follows:
mkdir build && cd build
HDF5_ROOT=<path to HDF5> CXX=<path to mpicxx> cmake ..
make
make install
cd ..
MPICC=<path to mpicc> pip install mpi4py
HDF5_DIR=<path to HDF5> pip install --no-binary=h5py h5py
If you are using parallel HDF5, you’ll also need to make sure the right MPI wrapper is used when installing h5py:
CC=<path to mpicc> HDF5_MPI=ON HDF5_DIR=<path to HDF5> pip install --no-binary=h5py h5py
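To verify that the resulting h5py build actually has MPI support, one quick check is:

```shell
# Prints True if this h5py build was compiled against a parallel (MPI) HDF5
python -c "import h5py; print(h5py.get_config().mpi)"
```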
2.6. Configuring Input Validation with GNU Emacs nXML mode¶
The GNU Emacs text editor has a built-in mode that extends functionality for editing XML files. One of the features in nXML mode is the ability to perform real-time validation of XML files against a RELAX NG schema. The OpenMC source contains RELAX NG schemas for each type of user input file. In order for nXML mode to know about these schemas, you need to tell emacs where to find a "locating files" description. Adding the following lines to your ~/.emacs file will enable real-time validation of XML input files:
(require 'rng-loc)
(add-to-list 'rng-schema-locating-files "~/openmc/schemas.xml")
Make sure to replace the last string on the second line with the path to the schemas.xml file in your own OpenMC source directory.