HOWTO use MCWrapper on AlmaLinux9 nodes
Latest revision as of 11:59, 13 August 2024
Introduction
How to use MCWrapper on an interactive AlmaLinux9 node. Both submitting and running the jobs currently have to be done in a CentOS7 container.
WORK IN PROGRESS
PREREQUISITE
1. Check which shell you are using: echo $SHELL
2. Check that the swif2 and singularity commands exist. If they do not, add /usr/local to your PATH in
.bashrc (if echo $SHELL gives bash): export PATH=/usr/local:$PATH
or
.cshrc (if echo $SHELL gives tcsh): setenv PATH /usr/local:$PATH
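The prerequisite checks above can be sketched in bash (tcsh users would use setenv instead of export):

```shell
# Report the current login shell
echo "$SHELL"

# If swif2 or singularity is missing from PATH, prepend /usr/local (bash syntax)
for cmd in swif2 singularity; do
    if ! command -v "$cmd" >/dev/null 2>&1; then
        export PATH=/usr/local:$PATH
        break
    fi
done
```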
Run MCWrapper interactively
1. Open your favorite $SHELL in the CentOS7 container:
singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash
or
singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 tcsh
2. Set up gxenv:
source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh
or
source /group/halld/Software/build_scripts/gluex_env_boot_jlab.csh
3. Set up the environment with the latest official XML file, based on ccdb v1:
gxenv /group/halld/www/halldweb/html/halld_versions/version_5.17.0.xml
4. Once in the container, check that you can access all your files. If not, add the missing path ([my stuff path]) to the bind list: "--bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld,[my stuff path]"
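Extending the bind list can be done by building the option in a variable before launching the container. MYDATA below is a placeholder assumption for whatever extra path you need visible inside the container; the command is echoed rather than executed:

```shell
# Bind list from this HOWTO
BINDS="/cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld"

# Hypothetical extra path you also need inside the container (replace with your own)
MYDATA="/work/halld/home/$USER"

# Append it and print the full command instead of running it
BINDS="$BINDS,$MYDATA"
echo singularity exec --bind "$BINDS" /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash
```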
Run MCWrapper to register jobs
1. Open your favorite shell in the CentOS7 container:
singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash
or
singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 tcsh
2. Set up gxenv:
source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh
or
source /group/halld/Software/build_scripts/gluex_env_boot_jlab.csh
3. Set up the environment with the latest official XML file:
gxenv /group/halld/www/halldweb/html/halld_versions/version_5.17.0.xml
4. Once in the container, check that you can access all your files. If not, add the missing path ([my stuff path]) to the bind list: "--bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld,[my stuff path]"
5. Set MCWRAPPER_CENTRAL to your local version:
git clone https://github.com/JeffersonLab/gluex_MCwrapper
By default, MakeMC.sh, MakeMC.csh, and gluex_MC.py are not executable, so do not forget to run chmod +x MakeMC.sh MakeMC.csh gluex_MC.py
Then point MCWRAPPER_CENTRAL at your local copy:
setenv MCWRAPPER_CENTRAL [path to your local version]
or
export MCWRAPPER_CENTRAL=[path to your local version]
If it does not work, copy the Python steering file from here: /work/halld/home/ijaegle/public/FromBoToMe/my_gluex_MC.py
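The MCWRAPPER_CENTRAL setup can be sketched as follows (bash syntax; tcsh users would use setenv MCWRAPPER_CENTRAL instead of export):

```shell
# Get a local copy of MCwrapper (skip the clone if the directory already exists)
[ -d gluex_MCwrapper ] || git clone https://github.com/JeffersonLab/gluex_MCwrapper || true

# The wrapper scripts are not executable by default, so mark them executable
chmod +x gluex_MCwrapper/MakeMC.sh gluex_MCwrapper/MakeMC.csh gluex_MCwrapper/gluex_MC.py 2>/dev/null || true

# Point MCWRAPPER_CENTRAL at the local copy
export MCWRAPPER_CENTRAL="$PWD/gluex_MCwrapper"
echo "$MCWRAPPER_CENTRAL"
```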
6. In your MC.cfg, use
BATCH_SYSTEM=swif2cont
and
OS=el9
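Putting that last step together, the relevant excerpt of MC.cfg looks like this (only the two keys discussed in this HOWTO are shown; the rest of your configuration stays unchanged):

```
# MC.cfg excerpt: submit via swif2 from inside the container, targeting AlmaLinux9
BATCH_SYSTEM=swif2cont
OS=el9
```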