.. _installation:

Installation
============

There are currently two components to :obj:`SkellySim`: the Python portion and the actual binary. The Python portion is used for generating config files and precompute data, as well as for visualization. The binary (:obj:`C++`) portion actually runs the simulation and generates field data. These two components are completely separate, though we provide them together in the :obj:`singularity` images below. This separation allows you to analyze and visualize simulation data locally without having to install the full simulation program.

Singularity (beginner recommended: contains everything)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

For non-technical or new users just trying it out, we recommend simply using our provided :obj:`singularity` container. For cluster environments, we do recommend compiling from source if possible, though the singularity container should be more than adequate for single-node jobs. If you absolutely need multi-node simulations, please compile using your cluster's MPI resources (feel free to file a `github issue `_ if you run into problems).

We provide two builds: one for the AVX instruction set and one for AVX2. An AVX512 binary rarely performs enough better to justify maintaining it. If you don't know what these are, AVX is the safer bet and should still provide good performance. If you're on a really ancient processor and even AVX is too modern, reach out via our github issue page and we can provide a generic build. We don't officially support M1 Macs (you could possibly run under Rosetta, though we wouldn't recommend it) and have not tested on Intel Macs, though it will likely work there.

- `Latest version (AVX) `_
- `Latest version (AVX2) `_

Running commands in :obj:`singularity` containers is straightforward. First you need to actually install `singularity `_. At :obj:`flatiron` and many other computing centers, this is already available via the :obj:`module` system (e.g.
:obj:`module load singularity`). Then, any command you would typically run directly in the shell, you just prefix with :obj:`singularity exec /path/to/image.sif`.

Note that this only works as-is if the path you're writing to is somewhere under your home directory, since singularity does not bind directories outside of home by default; in some environments, working out of home isn't advisable anyway. Other paths (such as :obj:`ceph` paths at FI) might need to be bound explicitly. To bind the current working directory for read/write access, change the command to something more like this:

.. code-block:: bash

    singularity exec -B $(realpath $PWD) /path/to/skellysim_container.sif python3 gen_config.py

where :obj:`-B $(realpath $PWD)` tells singularity to bind the current directory `outside the container` to the same path `inside the container`. Regardless, the workflow for working within a singularity environment will look roughly like this:

.. code-block:: bash

    singularity exec /path/to/skellysim_container.sif python3 gen_config.py
    singularity exec /path/to/skellysim_container.sif skelly_precompute skelly_config.toml
    singularity exec /path/to/skellysim_container.sif mpirun skelly_sim

Python modules and scripts (advanced usage)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

virtualenv
----------

Install the python portion in your virtual environment, your conda environment, or with the :obj:`pip3 --user` option. For a virtualenv:

.. highlight:: bash

.. code-block:: bash

    module load python  # if you're using modules
    python3 -m venv /path/to/my/virtualenv
    source /path/to/my/virtualenv/bin/activate
    pip3 install git+https://github.com/flatironinstitute/SkellySim

Conda
-----

.. highlight:: bash
.. code-block:: bash

    conda create -n myenvname
    conda activate myenvname
    pip3 install git+https://github.com/flatironinstitute/SkellySim

Simulation binary (advanced usage)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Due to the complicated dependencies and the performance differences depending on what machine you compile for, it is difficult to provide general-purpose binaries. If you don't need any of this and don't want to deal with it, please just use the singularity builds. To get optimal performance, or to use multi-node MPI, you must build :obj:`SkellySim` and its dependencies from source.

Building from source
--------------------

Requirements:

- `Trilinos 13 `_ (with Kokkos, Belos, Teuchos, and Tpetra)
- `PVFMM `_
- `STKFMM `_
- BLAS/LAPACK (OpenBLAS, MKL, or your implementation of choice)
- FFTW (FFTW3 or MKL-fftw)
- cmake (>= 3.10)
- modern gcc (>= 7). Intel compilers should work, but in my tests they are almost never worth the hassle.

A more detailed explanation will be added here later; in the meantime, please consult the `singularity build script `_ for a general outline of how to build :obj:`PVFMM + STKFMM + Trilinos + SkellySim`.
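As a very rough illustration, the build order implied by the requirements above can be sketched as follows. This is a hypothetical outline, not the authoritative recipe: the install prefix, the clone location, and the cmake flags are all placeholder assumptions, and the real flag set (especially for Trilinos) lives in the singularity build script.

.. code-block:: bash

    # HYPOTHETICAL sketch only -- paths and cmake flags are placeholders;
    # consult the singularity build script for the authoritative recipe.
    PREFIX=/path/to/install

    # 1. Build and install PVFMM, then STKFMM (which depends on PVFMM),
    #    then Trilinos with Kokkos, Belos, Teuchos, and Tpetra enabled,
    #    all into $PREFIX.
    # 2. Point SkellySim at those installs and build it:
    git clone https://github.com/flatironinstitute/SkellySim
    cmake -S SkellySim -B SkellySim/build \
          -DCMAKE_BUILD_TYPE=Release \
          -DCMAKE_PREFIX_PATH="$PREFIX"   # where PVFMM/STKFMM/Trilinos landed
    cmake --build SkellySim/build -j

:obj:`CMAKE_PREFIX_PATH` is the standard cmake mechanism for locating dependencies installed outside the default system paths; whether SkellySim needs additional project-specific cache variables is not covered here.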