HOWTO use MCWrapper on AlmaLinux9 nodes

From GlueXWiki
Latest revision as of 11:59, 13 August 2024

Introduction

How to use MCWrapper on an interactive AlmaLinux9 node. Both submitting and running the jobs currently have to be done in a CentOS7 container.

WORK IN PROGRESS

PREREQUISITE

1. Check which shell you are using: echo $SHELL

2. Check whether the swif2 and singularity commands exist; if they do not, add to your

 .bashrc (if echo $SHELL gives bash): export PATH=/usr/local:$PATH

or

 .cshrc (if echo $SHELL gives tcsh): setenv PATH /usr/local:$PATH
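The two checks above can be sketched for a bash session as follows (the helper name ensure_on_path is ours, not part of MCWrapper; /usr/local is the path named above):

```shell
# Hedged sketch of the prerequisite check: probe for swif2 and
# singularity, and prepend /usr/local to PATH only when one is missing.
ensure_on_path() {
  cmd=$1
  if ! command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd not found; adding /usr/local to PATH"
    export PATH=/usr/local:$PATH
  fi
}
ensure_on_path swif2
ensure_on_path singularity
```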

RUN MCWrapper interactively

1. Open your favorite $SHELL in the CentOS7 container:

singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash

or

singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 tcsh

2. Set up gxenv:

source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh

or

source /group/halld/Software/build_scripts/gluex_env_boot_jlab.csh

3. Set up the environment with the latest official XML file based on ccdb v1:

gxenv /group/halld/www/halldweb/html/halld_versions/version_5.17.0.xml

4. Once inside the container, check that you can access all your files; if not, add the corresponding path - [my stuff path] - to the bind list: "--bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld,[my stuff path]"
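A quick way to run this check from inside the container (bash; the directory below is a hypothetical example - substitute the actual location of your files):

```shell
# Probe one directory you need; /work/halld/home/$USER is only an
# illustrative path, replace it with wherever your files live.
dir=/work/halld/home/$USER
if [ -r "$dir" ]; then
  echo "ok: $dir is visible in the container"
else
  echo "missing: restart singularity with $dir added to --bind"
fi
```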

Run MCWrapper to register jobs

1. Enter the CentOS7 container with your favorite $SHELL:

singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 bash

or

singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_prod:v1 tcsh

2. Set up gxenv:

source /group/halld/Software/build_scripts/gluex_env_boot_jlab.sh

or

source /group/halld/Software/build_scripts/gluex_env_jlab.csh

3. Set up the environment with the latest official XML file based on ccdb v1:

gxenv /group/halld/www/halldweb/html/halld_versions/version_5.17.0.xml

4. Once inside the container, check that you can access all your files; if not, add the corresponding path - [my stuff path] - to the bind list: "--bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/u/home,/usr/local,/work/halld,/group/halld/Software,/work/osgpool/halld,[my stuff path]"

5. Set MCWRAPPER_CENTRAL to your local version

git clone https://github.com/JeffersonLab/gluex_MCwrapper

By default MakeMC.sh, MakeMC.csh, and gluex_MC.py are not executable, so do not forget to run:

 chmod +x MakeMC.sh MakeMC.csh gluex_MC.py

 setenv MCWRAPPER_CENTRAL /path/to/your/local/gluex_MCwrapper

or

 export MCWRAPPER_CENTRAL=/path/to/your/local/gluex_MCwrapper

If it does not work, copy this python steering file: /work/halld/home/ijaegle/public/FromBoToMe/my_gluex_MC.py
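The whole of step 5 can be sketched as follows (bash syntax; it assumes the git clone above was run in the current directory):

```shell
# Sketch of step 5: after cloning gluex_MCwrapper, make its scripts
# executable and point MCWRAPPER_CENTRAL at the checkout.
if [ -d gluex_MCwrapper ]; then
  cd gluex_MCwrapper
  chmod +x MakeMC.sh MakeMC.csh gluex_MC.py
  export MCWRAPPER_CENTRAL=$PWD
  echo "MCWRAPPER_CENTRAL=$MCWRAPPER_CENTRAL"
fi
```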

6. In your MC.cfg, use

BATCH_SYSTEM=swif2cont 

and

OS=el9
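A minimal sanity check for these two settings (bash; assumes your configuration file is named MC.cfg and sits in the current directory):

```shell
# Verify MC.cfg (file name assumed) carries the two settings above.
if [ -f MC.cfg ] \
   && grep -q '^BATCH_SYSTEM=swif2cont' MC.cfg \
   && grep -q '^OS=el9' MC.cfg; then
  echo "MC.cfg is configured for swif2cont on el9"
else
  echo "MC.cfg is missing BATCH_SYSTEM=swif2cont or OS=el9"
fi
```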