Enzo - multi-physics hydrodynamic astrophysical calculations

The enzo environment can be activated by executing:

$ module load enzo

The pre-built enzo executable is compatible with problems such as run/Hydro/Hydro-3D or run/Cosmology/SphericalInfall.

Assuming that the problem being solved is a run/Hydro/Hydro-3D-like problem, the following script runs the job on the cluster using 32 cores in total. The tasks are spread across 4 nodes with 8 cores on each node:

#!/bin/bash

## specify the job and project name
#SBATCH --job-name=enzo
#SBATCH --account=9639717

## specify the required resources
#SBATCH --partition=normal
#SBATCH --nodes=4
#SBATCH --ntasks-per-node=8
#SBATCH --cpus-per-task=1
#SBATCH --mem=32000
#SBATCH --time=0-01:00:00

module purge
module load enzo
srun --mpi=pmix enzo.exe SphericalInfall.enzo

After creating the job file job.sh, submit the job to the scheduler:

$ sbatch job.sh
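
The state of a submitted job can be checked with the standard SLURM commands, for example:

$ squeue -u $USER              # list your queued and running jobs
$ scontrol show job <jobid>    # detailed information about a specific job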

To compile enzo from the source code, see BuildEnzoFromSource. To compile enzo for a custom problem, see BuildEnzoFromSourceCustomProblem.

Building Enzo from source

External references

http://enzo-project.org/BootCamp.html
https://grackle.readthedocs.io/en/latest/
https://grackle.readthedocs.io/en/latest/Installation.html#downloading

Dependencies

  • mercurial

  • serial version of hdf5 1.8.15-patch1

  • openmpi

  • gcc-9.1.0

  • libtool (for grackle)

  • Intel compiler (optional)

All these dependencies/prerequisites can be loaded through:

$ module load enzo/prerequisites
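
The loaded modules can be verified with:

$ module list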
  • download (or clone with mercurial) the enzo source code and extract it

$ wget https://bitbucket.org/enzo/enzo-dev/get/enzo-2.5.tar.gz
$ tar -xzvf enzo-2.5.tar.gz
$ cd enzo-enzo-dev-2984068d220f
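
Alternatively, a sketch of cloning the repository with mercurial (the repository URL is assumed from the download link above):

$ hg clone https://bitbucket.org/enzo/enzo-dev
$ cd enzo-dev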
  • configure it

  $ ./configure
  $ make machine-octopus
  $ make show-config

The make machine-octopus command will modify the file Make.config.machine
  • after preparing the build makefiles, execute

    $ make -j8
    

    to compile and produce the enzo executable

  • The installation has been tested with the problem CollapseTestNonCosmological, located in the directory ENZO_SOURCE_TOPDIR/run/Hydro/Hydro-3D/CollapseTestNonCosmological/.
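
    A minimal sketch of running this test (e.g. inside an interactive or batch job), assuming enzo.exe is produced in src/enzo/ and the parameter file is named CollapseTestNonCosmological.enzo (check the actual file name in that directory):

    $ cd run/Hydro/Hydro-3D/CollapseTestNonCosmological
    $ cp ../../../../src/enzo/enzo.exe .
    $ mpirun -np 4 ./enzo.exe CollapseTestNonCosmological.enzo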

Build enzo for a custom problem

To build enzo for a sample simulation, e.g. run/CosmologySimulation/AMRCosmology, first load the enzo environment:

$ module load enzo

Note

this loaded version of enzo might not be compatible with your custom problem, so a simulation run might fail. However, the enzo executable from the loaded enzo environment is needed to produce the Enzo_Build file required to compile enzo for your custom problem. Alternatively, you can write an Enzo_Build file yourself if you know what you are doing.

  • execute

    $ enzo AMRCosmology.enzo
    

    in the directory run/CosmologySimulation/AMRCosmology

  • this will generate an Enzo_Build file

  • copy Enzo_Build into the enzo source code directory as a named settings file

    $ cp run/CosmologySimulation/AMRCosmology/Enzo_Build src/enzo/Make.settings.AMRCosmology
    
  • configure the make files with the new settings

$ make machine-octopus
$ make show-config
$ make load-config-AMRCosmology
$ make show-config

This will change the content of Make.config.override

  • build enzo

    $ make -j8
    
  • copy the produced enzo.exe to the problem directory
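
    For example, assuming the executable is produced in src/enzo/ and you are in the top-level source directory:

    $ cp src/enzo/enzo.exe run/CosmologySimulation/AMRCosmology/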

  • for AMRCosmology, the inits.exe and ring.exe executables must also be built. From the top-level source directory:

# to build inits.exe
$ cd src/inits
$ make -j8

# to build ring.exe (src/ring, relative to src/inits)
$ cd ../ring
$ make -j8
  • also copy input/cool_rates.in into the simulation directory AMRCosmology and follow the instructions in notes.txt to start the simulation
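
    For example, from the top-level source directory:

    $ cp input/cool_rates.in run/CosmologySimulation/AMRCosmology/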

Note

make sure to use ./enzo.exe (the executable built for your problem) instead of just enzo.exe (the executable from the default enzo environment) in the script job.sh, as sketched below.
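
For this example, the corresponding line in job.sh would be a variation of the following (a sketch, using the parameter file from above):

srun --mpi=pmix ./enzo.exe AMRCosmology.enzo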

Build enzo with the Intel compiler

To produce the enzo Makefile with the needed Intel compiler flags:

  • create a copy of the octopus makefile

    $ cp Make.mach.octopus Make.mach.octopus-intel-2019u3
    
  • make the following changes in Make.mach.octopus-intel-2019u3

    LOCAL_GRACKLE_INSTALL = /apps/sw/grackle-3.2-intel-2019u3
    MACH_CC_MPI    = mpiicc  # C compiler when using MPI
    MACH_CXX_MPI   = mpiicpc # C++ compiler when using MPI
    MACH_FC_MPI    = mpiifort   # Fortran 77 compiler when using MPI
    MACH_F90_MPI   = mpiifort   # Fortran 90 compiler when using MPI
    MACH_LD_MPI    = mpiicpc # Linker when using MPI
    MACH_CC_NOMPI  = icc   # C compiler when not using MPI
    MACH_CXX_NOMPI = icpc  # C++ compiler when not using MPI
    MACH_FC_NOMPI  = ifort # Fortran 77 compiler when not using MPI
    MACH_F90_NOMPI = ifort # Fortran 90 compiler when not using MPI
    MACH_LD_NOMPI  = icpc  # Linker when not using MPI
    MACH_OPT_AGGRESSIVE  = -O3 -xHost
    
    #-----------------------------------------------------------------------
    # Precision-related flags
    #-----------------------------------------------------------------------
    
    MACH_FFLAGS_INTEGER_32 = -i4
    MACH_FFLAGS_INTEGER_64 = -i8
    MACH_FFLAGS_REAL_32    = -r4
    MACH_FFLAGS_REAL_64    = -r8
    
    LOCAL_LIBS_MACH   = -limf -lifcore # Machine-dependent libraries
    
  • to compile with the new Intel makefile

    $ module load intel/2019u3
    $ module load grackle/3.2-intel
    $ make machine-octopus-intel-2019u3
    $ make opt-high
    
    # (optional) for aggressive optimization use the following instead;
    # before publishing results obtained with aggressive optimization,
    # check the scientific results against runs compiled with opt-high
    # or even opt-debug
    $ make opt-aggressive
    
    # compile enzo
    $ make -j
    

Build and install Grackle

  • download the Grackle source code and extract it (or clone it from GitHub)

create a copy of the makefile Make.mach.linux-gnu

$ cp Make.mach.linux-gnu Make.mach.octopus

in the octopus makefile, specify the path to HDF5 and the install prefix:

MACH_FILE  = Make.mach.octopus
LOCAL_HDF5_INSTALL = /apps/sw/hdf/hdf5-1.8.15-patch1-serial-gcc-7.2
MACH_INSTALL_PREFIX = /apps/sw/grackle/grackle-3.2

To build and install grackle, execute:

$ cd grackle/src/clib
$ make machine-octopus
$ make
$ make install
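
If enzo is later linked against this grackle build as a shared library, the library directory under the install prefix may need to be on LD_LIBRARY_PATH at run time; a sketch using the install prefix set above:

$ export LD_LIBRARY_PATH=/apps/sw/grackle/grackle-3.2/lib:$LD_LIBRARY_PATH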

Compile Grackle with the Intel compiler

Copy the makefile and create one specific to the Intel compiler:

$ cp Make.mach.octopus Make.mach.octopus-intel-2019u3

Then set the following in Make.mach.octopus-intel-2019u3:

MACH_CC_NOMPI  = icc   # C compiler
MACH_CXX_NOMPI = icpc  # C++ compiler
MACH_FC_NOMPI  = ifort # Fortran 77
MACH_F90_NOMPI = ifort # Fortran 90
MACH_LD_NOMPI  = icc   # Linker
MACH_INSTALL_PREFIX = /apps/sw/grackle/grackle-3.2-intel-2019u3

To build and install it, execute:

$ module unload gcc
$ module load intel/2019u3
$ cd grackle/src/clib
$ make machine-octopus-intel-2019u3
$ make
$ make install

Using Yt to postprocess Enzo snapshots

Yt can be used either in interactive mode by submitting an interactive job, through Jupyter notebooks, or via a Python script in a regular batch job. A simple visualization can be produced by executing the following (after a job has been allocated):

import yt

ds = yt.load("/home/john/my_enzo_simulation/DD0000/DD0000")  # path to an Enzo snapshot
print("Redshift =", ds.current_redshift)
# create a slice plot of the gas density (example field) and save it as a .png
p = yt.SlicePlot(ds, "z", ("gas", "density"))
p.save()

A .png file will be saved to disk in this case. In an interactive job the images can be viewed live.
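
For the interactive route, a session can first be requested with salloc before starting python; the partition and resource values below are placeholders, and whichever module provides python with yt on your cluster should be loaded:

$ salloc --partition=normal --ntasks=1 --time=01:00:00
$ python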