SRC/CT Simulation
From GlueXWiki
Revision as of 12:22, 10 January 2025 by Boyu (Talk | contribs) (→Run the simulation on the batch system)
Introduction
- Following the approach of the GlueX collaboration, we use gluex_MCwrapper to automate the entire simulation pipeline. It includes event generation, GEANT simulation of the GlueX detector, smearing of the events, reconstruction of the events, and skimming of the events with the reaction filter.
- However, because our custom generators are not part of the official Hall D software builds, we cannot use the standard website to submit simulation requests. Instead, we have to build some of the software manually and run the MCwrapper on the batch farm ourselves.
- Below is a brief tutorial on this process in the ifarm environment. Contact Jackson Pybus (jrpybus@jlab.org) or Bo Yu (boyu@jlab.org) for access to the software repositories or for more information on the SRC/CT simulation.
Download the software
- Get the software version stacks specific to SRC/CT
$ git clone https://github.com/frankboyu/halld_versions_srcct.git
- Get the halld_sim specific to SRC/CT
$ git clone https://github.com/JacksonPybus/halld_sim_srcct.git
- Get the gluex_MCwrapper specific to SRC/CT
$ git clone https://github.com/frankboyu/gluex_MCwrapper_srcct.git
Compile the software
- Use the newest recon-compatible version stack in halld_versions_srcct. As of November 2024, we use recon_srcct-2021_11-ver01_4.xml as the example below.
- Edit the following two lines to point to your downloaded halld_sim_srcct and gluex_MCwrapper_srcct directories:
<package name="gluex_MCwrapper" home="/absolute/path/to/your/dir/gluex_MCwrapper_srcct"/>
<package name="halld_sim" home="/absolute/path/to/your/dir/halld_sim_srcct"/>
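If you prefer, the two home attributes can be rewritten with sed instead of a text editor. The snippet below is a minimal sketch: SRCCT_DIR is a placeholder for wherever you cloned the repositories, and version_demo.xml is a stand-in for your copy of recon_srcct-2021_11-ver01_4.xml so the snippet is self-contained; on ifarm you would apply the same sed expressions to the real version XML.

```shell
#!/bin/bash
# Sketch: rewrite the home attributes of the two SRC/CT packages with sed.
# SRCCT_DIR is a placeholder for your checkout directory.
SRCCT_DIR=/absolute/path/to/your/dir
XML=version_demo.xml

# Stand-in for the two package lines in the real version XML.
cat > "$XML" <<'EOF'
<package name="gluex_MCwrapper" home="/old/mcwrapper"/>
<package name="halld_sim" home="/old/halld_sim"/>
EOF

# Rewrite the home attribute of each package in place.
sed -i \
  -e "s|\(<package name=\"gluex_MCwrapper\"[^>]*home=\"\)[^\"]*|\1${SRCCT_DIR}/gluex_MCwrapper_srcct|" \
  -e "s|\(<package name=\"halld_sim\"[^>]*home=\"\)[^\"]*|\1${SRCCT_DIR}/halld_sim_srcct|" \
  "$XML"

cat "$XML"
```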
- Enter the GlueX singularity container
$ singularity exec --bind /cvmfs,/scratch,/u/home,/usr/local,/scigroup/mcwrapper,/lustre/enp/swif2,/work/osgpool,/work/halld,/work/halld2,/cache/halld,/volatile/halld,/group/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash
- Set up the environment
$ source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh
$ gxenv halld_versions_srcct/recon_srcct-2021_11-ver01_4.xml
- Compile halld_sim_srcct
$ cd halld_sim_srcct/src
$ scons install -j32
- gluex_MCwrapper is a set of scripts and does not need to be compiled
Edit the config files
- To run the simulation, we need four config files as input to MCwrapper:
- Generator config file: settings for the event generator
- Reconstruction config file: options used when reconstructing the events
- Skim config file: reaction lines for the reaction filter skim
- Wrapper config file: the main MCwrapper settings, passed to gluex_MC.py
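As an illustration, a wrapper config might look like the sketch below. The key names follow the example MC.config shipped with gluex_MCwrapper, and every path and value here is a placeholder; check the example config in your gluex_MCwrapper_srcct checkout for the authoritative list of keys and their defaults.

```
# wrapper.cfg -- illustrative sketch only; verify key names against the
# example MC.config in your gluex_MCwrapper_srcct checkout.
VARIATION=mc
GENERATOR=genr8                           # or your custom SRC/CT generator
GENERATOR_CONFIG=/path/to/generator.cfg   # the generator config file above
GEANT_VERSION=4
BKG=None
NCORES=8
DATA_OUTPUT_BASE_DIR=/volatile/halld/home/yourname/sim_output
# Pipeline stage toggles: 1 = run the stage, 0 = skip it
GENR=1
GEANT=1
SMEAR=1
RECON=1
```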
Run the simulation interactively
- Enter the GlueX singularity container
$ singularity exec --bind /cvmfs,/scratch,/u/home,/usr/local,/scigroup/mcwrapper,/lustre/enp/swif2,/work/osgpool,/work/halld,/work/halld2,/cache/halld,/volatile/halld,/group/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash
- Set up the environment
$ source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh
$ gxenv halld_versions_srcct/recon_srcct-2021_11-ver01_4.xml
- Run the MCwrapper
$ gluex_MC.py wrapper.cfg <RUN> <EVENTS> batch=0 per_file=1000000
<RUN> is the run number you want to simulate. <EVENTS> is the number of events to generate. batch=0 runs the simulation interactively in the terminal. per_file=1000000 sets the maximum number of events in one hddm/root file.
For more flags to configure the MCwrapper, refer to the gluex_MCwrapper documentation
Run the simulation on the batch system
- Set up the environment
$ source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh
$ gxenv $HALLD_VERSIONS/version.xml
$ export MCWRAPPER_CENTRAL=/absolute/path/to/your/gluex/mcwrapper/
$ export PATH=/absolute/path/to/your/gluex/mcwrapper/:${PATH}
Since swif2 is no longer supported inside the container, we have to run the MCwrapper and submit the jobs directly outside the container. We therefore use the default version stack and only replace the MCwrapper with our custom version.
- Submit the simulation job
$ gluex_MC.py wrapper.cfg <RUN> <EVENTS> batch=1 per_file=1000000
The flags have the same meanings as in the interactive case. batch=1 submits the jobs, but you need to run the swif2 workflow manually. batch=2 submits the jobs and runs the workflow automatically.
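With batch=1, the submitted jobs sit in the swif2 workflow until it is started by hand. A typical sequence is sketched below; <WORKFLOW_NAME> is a placeholder for the workflow name that gluex_MC.py reports (or the one set in your wrapper config), and these commands only work on the JLab farm where swif2 is available.

```
$ swif2 run -workflow <WORKFLOW_NAME>       # start (or resume) the workflow
$ swif2 status -workflow <WORKFLOW_NAME>    # check job progress
```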