GlueXWiki — GlueX Software Meeting, March 25, 2024 (last revised 2024-03-25 by Aaustreg)
<hr />
<div>GlueX Software Meeting<br>
Monday, March 25, 2024<br>
11:00 am EDT<br>
F326/327<br>

<div class="mw-collapsible mw-collapsed">
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]
<div class="mw-collapsible-content">
Mark Ito is inviting you to a scheduled ZoomGov meeting.

Topic: GlueX Software
Time: This is a recurring meeting Meet anytime

Join ZoomGov Meeting
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09

Meeting ID: 160 636 9159
Passcode: 888788
One tap mobile
+16692545252,,1618692159# US (San Jose)
+16468287666,,1618692159# US (New York)

Dial by your location
+1 669 254 5252 US (San Jose)
+1 646 828 7666 US (New York)
+1 669 216 1590 US (San Jose)
+1 551 285 1373 US
833 568 8864 US Toll-free
Meeting ID: 160 636 9159
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9

Join by SIP
1618692159@sip.zoomgov.com

Join by H.323
161.199.138.10 (US West)
161.199.136.10 (US East)
Meeting ID: 160 636 9159
Passcode: 888788

</div>
</div>

==Agenda==

# Announcements
#* Read-only XRootD server for offsite production (/cache/halld and /volatile/halld)
#** Access through SciTokens; enrollment via a ServiceNow ticket [https://jlab.servicenowservices.com/kb?sys_kb_id=a44f048c1b1ed910f0b4dc6ce54bcb32&id=kb_article_view&sysparm_rank=1&sysparm_tsqueryId=8686ae7b87f84e506cc0eca80cbb35e6]
#** Will soon be added to the external DNS server
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_March_11,_2024 minutes and action items]
# OS upgrade to Alma9:
#* Current farm: 46 Alma9 nodes and 193 CentOS7 nodes
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, CentOS7, RHEL8, Alma9, and in containers
#* [https://halldweb.jlab.org/talks/2024/CentOS7vsALMA9GlueXSoftware.pdf Detailed CentOS7 vs. Alma9 comparison] (Beni)
#* Crash in CCShower_factory ([https://github.com/JeffersonLab/halld_recon/issues/789 Issue #789]): fixed via [https://github.com/JeffersonLab/halld_recon/pull/790 PR #790]
# Container updates
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 Docker container]
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30/gluextest (almalinux_9) Docker container]
# Discussion of software upgrade projects:
#* JANA2 (Nathan/Raiqa)
#* RCDB/CCDB (Dmitry)
#** CCDB 1.06.11 can read the database with Python 3, but not write to it
#** CCDB 2 currently cannot write either
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]
#* Geant4 (Richard):
#** Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port
#* ROOT
# Review of recent issues and pull requests:
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)

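The SciToken-based XRootD access announced above relies on standard token discovery: clients such as xrdcp locate the token following the WLCG Bearer Token Discovery order (a literal token in <code>BEARER_TOKEN</code>, then a file named by <code>BEARER_TOKEN_FILE</code>, then per-user <code>bt_u&lt;uid&gt;</code> files). A minimal sketch of that lookup order, useful for checking where a client will find your token; the JLab-specific enrollment steps and server name are in the ServiceNow article linked above, not assumed here:

```python
import os

def discover_bearer_token():
    """Return a bearer token (SciToken) following the WLCG Bearer Token
    Discovery order, or None if no token is found."""
    # 1. Literal token in the environment.
    tok = os.environ.get("BEARER_TOKEN")
    if tok:
        return tok.strip()
    # 2. Explicit file path in the environment.
    path = os.environ.get("BEARER_TOKEN_FILE")
    if path and os.path.isfile(path):
        with open(path) as f:
            return f.read().strip()
    # 3. Well-known per-user locations: $XDG_RUNTIME_DIR/bt_u<uid>, then /tmp/bt_u<uid>.
    uid = os.getuid()
    candidates = []
    if os.environ.get("XDG_RUNTIME_DIR"):
        candidates.append(os.path.join(os.environ["XDG_RUNTIME_DIR"], f"bt_u{uid}"))
    candidates.append(f"/tmp/bt_u{uid}")
    for c in candidates:
        if os.path.isfile(c):
            with open(c) as f:
                return f.read().strip()
    return None
```

With a token in place, a read then looks like <code>xrdcp root://&lt;server&gt;//cache/halld/&lt;path&gt; .</code>, where the server name is whatever is published once the external DNS entry exists.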
== Notes ==

* Write access to the XRootD server would be useful for offsite production
** Similar to MCwrapper
** Post-processing to merge files before writing them to cache is necessary
* Tools to discover nondeterministic behavior in C++ code:
** [https://herbgrind.ucsd.edu/ Herbgrind]
** [https://github.com/rjones30/dilog dilog] (R. Jones)

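A concrete example of the kind of nondeterminism these tools hunt for: floating-point addition is not associative, so the same addends accumulated in a different order (e.g. by a different thread interleaving) can give a different rounded result. Sketched in Python for brevity; the effect is identical for C++ doubles:

```python
# Floating-point addition is not associative: 1.0 is smaller than the
# spacing between adjacent doubles near 1e16, so it is lost in rounding
# unless the large terms cancel first.
addends = [1e16, 1.0, -1e16]

in_order  = sum(addends)             # (1e16 + 1.0) rounds to 1e16, then -1e16 -> 0.0
reordered = sum([1e16, -1e16, 1.0])  # large terms cancel first              -> 1.0

print(in_order, reordered)  # 0.0 1.0
```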
== Action Items ==
# Documentation
#* Improve documentation on Singularity containers:
#** supply the Alma9 container through CVMFS
#** modify batch-submission scripts
# Software Upgrades
#* halld_recon:
#** $HALLD_RECON_HOME/src/BMS is deprecated; remove it from the repo?
#* JANA2 (Nathan):
#** implement JANA2 in build_scripts; provide a version.xml for general testing
#** Nathan will focus on the transition now
#** Use the Alma9 container?
#* CCDB 2.0 (Dmitry):
#** Check the Alma9 container
#** Implement a version check in v1; test with v2
#** Need to test the CCDB database version update; need instructions / a command from Dmitry (Sean)
#* Geant4
#** Use the newest version approved by Richard
#** Upgrade the Alma9 build first, then try to build on CentOS7
#* ROOT
#** Upgrade the Alma9 build first, then try to build on CentOS7</div>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 25, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* Read-only XRootD server for offsite production (/cache/halld and /volatile/halld)<br />
#** access through scitokens, enrollment via service now ticket [https://jlab.servicenowservices.com/kb?sys_kb_id=a44f048c1b1ed910f0b4dc6ce54bcb32&id=kb_article_view&sysparm_rank=1&sysparm_tsqueryId=8686ae7b87f84e506cc0eca80cbb35e6]<br />
#** Will soon be added to external DNS server<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_March_11,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Current farm: 46 Alma9 nodes and 193 CentOS7 nodes<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Detailed CentOS7 vs Alma9 comparison (Beni)<br />
#* Crash in CCShower_factory ([https://github.com/JeffersonLab/halld_recon/issues/789 Issue #789]): fixed via [https://github.com/JeffersonLab/halld_recon/pull/790 PR #790]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan/Raiqa)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read database with python3, but not write<br />
#** CCDB 2 can currently not write either<br />
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use Alma9 container?<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_25,_2024&diff=125217GlueX Software Meeting, March 25, 20242024-03-22T21:53:37Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 25, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* Read-only XRootD server<br />
#** access through scitokens, enrollment via service now ticket [https://jlab.servicenowservices.com/kb?sys_kb_id=a44f048c1b1ed910f0b4dc6ce54bcb32&id=kb_article_view&sysparm_rank=1&sysparm_tsqueryId=8686ae7b87f84e506cc0eca80cbb35e6]<br />
#** Will soon be added to external DNS server<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_March_11,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Current farm: 46 Alma9 nodes and 193 CentOS7 nodes<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Detailed CentOS7 vs Alma9 comparison (Beni)<br />
#* Crash in CCShower_factory ([https://github.com/JeffersonLab/halld_recon/issues/789 Issue #789]): fixed via [https://github.com/JeffersonLab/halld_recon/pull/790 PR #790]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan/Raiqa)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read database with python3, but not write<br />
#** CCDB 2 can currently not write either<br />
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use Alma9 container?<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_25,_2024&diff=125216GlueX Software Meeting, March 25, 20242024-03-22T21:49:43Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 25, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* Read-only XRootD server, access through scitokens, enrollment via service now ticket [https://jlab.servicenowservices.com/kb?sys_kb_id=a44f048c1b1ed910f0b4dc6ce54bcb32&id=kb_article_view&sysparm_rank=1&sysparm_tsqueryId=8686ae7b87f84e506cc0eca80cbb35e6]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_March_11,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Current farm: 46 Alma9 nodes and 193 CentOS7 nodes<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Detailed CentOS7 vs Alma9 comparison (Beni)<br />
#* Crash in CCShower_factory ([https://github.com/JeffersonLab/halld_recon/issues/789 Issue #789]): fixed via [https://github.com/JeffersonLab/halld_recon/pull/790 PR #790]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan/Raiqa)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read database with python3, but not write<br />
#** CCDB 2 can currently not write either<br />
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use Alma9 container?<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_25,_2024&diff=125215GlueX Software Meeting, March 25, 20242024-03-22T21:49:24Z<p>Aaustreg: /* Action Items */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 25, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* Read-only XRootD server, access through scitokens, enrollment via service now ticket [https://jlab.servicenowservices.com/kb?sys_kb_id=a44f048c1b1ed910f0b4dc6ce54bcb32&id=kb_article_view&sysparm_rank=1&sysparm_tsqueryId=8686ae7b87f84e506cc0eca80cbb35e6]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_March_11,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Current farm: 46 Alma9 nodes and 193 CentOS7 nodes<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Detailed CentOS7 vs Alma9 comparison (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan/Raiqa)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read database with python3, but not write<br />
#** CCDB 2 can currently not write either<br />
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with the Alma9 rollout<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through OASIS on CVMFS, or JLab's own server? (Richard)<br />
* Tokens for XRootD? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated; remove it from the repo?<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** Nathan will focus on the transition now<br />
#** Use Alma9 container?<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check the Alma9 container<br />
#** Implement a version check in v1, test with v2<br />
#** Need to test the CCDB DB version update; instructions / command needed from Dmitry (Sean)<br />
#* Geant4<br />
#** Use the newest version approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Offline_Software_Meetings&diff=125212GlueX Offline Software Meetings2024-03-22T21:40:34Z<p>Aaustreg: /* Offline Meetings in 2024 */</p>
<hr />
<div>=Regularly Scheduled Meetings=<br />
<br />
== Offline Meetings in 2024 ==<br />
<br />
* [[GlueX Software Meeting, March 25, 2024 | March 25, 2024]]<br />
* [[GlueX Software Meeting, March 11, 2024 | March 11, 2024]]<br />
* [[GlueX Software Meeting, February 12, 2024 | February 12, 2024]]<br />
* [[GlueX Software Meeting, January 29, 2024 | January 29, 2024]]<br />
<br />
== Offline Meetings in 2023 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 18, 2023 | December 18, 2023]]<br />
* [[GlueX Software Meeting, November 20, 2023 | November 20, 2023]]<br />
* [[GlueX Software Meeting, November 6, 2023 | November 6, 2023]]<br />
* [[GlueX Software Meeting, October 23, 2023 | October 23, 2023]]<br />
* [[GlueX Software Meeting, October 9, 2023 | October 9, 2023]]<br />
* [[GlueX Software Meeting, September 11, 2023 | September 11, 2023]]<br />
|<br />
* [[GlueX Software Meeting, August 28, 2023 | August 28, 2023]]<br />
* [[GlueX Software Meeting, August 14, 2023 | August 14, 2023]]<br />
* [[GlueX Software Meeting, March 27, 2023 | March 27, 2023]]<br />
* [[GlueX Software Meeting, March 13, 2023 | March 13, 2023]]<br />
* [[GlueX Software Meeting, January 30, 2023 | January 30, 2023]]<br />
|}<br />
<br />
== Offline Meetings in 2022 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 19, 2022 | December 19, 2022]]<br />
* [[GlueX Software Meeting, November 7, 2022 | November 7, 2022]]<br />
* [[GlueX Software Meeting, October 10, 2022 | October 10, 2022]]<br />
* [[GlueX Software Meeting, August 29, 2022 | August 29, 2022]]<br />
* [[GlueX Software Meeting, August 15, 2022 | August 15, 2022]]<br />
* [[GlueX Software Meeting, July 18, 2022 | July 18, 2022]]<br />
|<br />
* [[GlueX Software Meeting, May 9, 2022 | May 9, 2022]]<br />
* [[GlueX Software Meeting, April 27, 2022 | April 27, 2022]]<br />
* [[GlueX Software Meeting, April 13, 2022 | April 13, 2022]]<br />
* [[GlueX Software Meeting, June 6, 2022 | June 6, 2022]]<br />
* [[GlueX Software Meeting, March 30, 2022 | March 30, 2022]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2022 | March 16, 2022]]<br />
* [[GlueX Software Meeting, March 2, 2022 | March 2, 2022]]<br />
* [[GlueX Software Meeting, February 16, 2022 | February 16, 2022]]<br />
* [[GlueX Software Meeting, February 2, 2022 | February 2, 2022]]<br />
* [[GlueX Software Meeting, January 18, 2022 | January 18, 2022]]<br />
|}<br />
<br />
== Offline Meetings in 2021 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 20, 2021 | December 20, 2021]]<br />
* [[GlueX Software Meeting, December 6, 2021 | December 6, 2021]]<br />
* [[GlueX Software Meeting, November 8, 2021, 2021 | November 8, 2021]]<br />
* [[GlueX Software Meeting, October 25, 2021, 2021 | October 25, 2021]]<br />
|<br />
* [[GlueX Software Meeting, October 11, 2021, 2021 | October 11, 2021]]<br />
* [[GlueX Software Meeting, September 27, 2021, 2021 | September 27, 2021]]<br />
* [[GlueX Software Meeting, August 31, 2021 | August 31, 2021]]<br />
* [[GlueX Software Meeting, August 17, 2021 | August 17, 2021]]<br />
* [[GlueX Software Meeting, July 20, 2021 | July 20, 2021]]<br />
|<br />
* [[GlueX Software Meeting, July 6, 2021 | July 6, 2021]]<br />
* [[GlueX Software Meeting, June 22, 2021 | June 22, 2021]]<br />
* [[GlueX Software Meeting, May 11, 2021 | May 11, 2021]]<br />
* [[GlueX Software Meeting, April 28, 2021 | April 28, 2021]]<br />
* [[GlueX Software Meeting, March 30, 2021 | March 30, 2021]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2021 | March 16, 2021]]<br />
* [[GlueX Software Meeting, March 2, 2021 | March 2, 2021]]<br />
* [[GlueX Software Meeting, February 2, 2021 | February 2, 2021]]<br />
* [[GlueX Software Meeting, January 19, 2021 | January 19, 2021]]<br />
* [[GlueX Software Meeting, January 5, 2021 | January 5, 2021]]<br />
|}<br />
<br />
== Offline Meetings in 2020 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 8, 2020 | December 8, 2020]]<br />
* [[GlueX Software Meeting, November 24, 2020 | November 24, 2020]]<br />
* [[GlueX Software Meeting, November 10, 2020 | November 10, 2020]]<br />
* [[GlueX Software Meeting, October 13, 2020 | October 13, 2020]]<br />
|<br />
* [[GlueX Software Meeting, September 29, 2020 | September 29, 2020]]<br />
* [[GlueX Software Meeting, September 15, 2020 | September 15, 2020]]<br />
* [[GlueX Software Meeting, September 1, 2020 | September 1, 2020]]<br />
* [[GlueX Software Meeting, August 18, 2020 | August 18, 2020]]<br />
* [[GlueX Software Meeting, August 4, 2020 | August 4, 2020]]<br />
* [[GlueX Software Meeting, July 21, 2020 | July 21, 2020]]<br />
|<br />
* [[GlueX Software Meeting, July 7, 2020 | July 7, 2020]]<br />
* [[GlueX Software Meeting, June 9, 2020 | June 9, 2020]]<br />
* [[GlueX Software Meeting, May 26, 2020 | May 26, 2020]]<br />
* [[GlueX Software Meeting, April 28, 2020 | April 28, 2020]]<br />
* [[GlueX Software Meeting, April 14, 2020 | April 14, 2020]]<br />
* [[GlueX Software Meeting, March 31, 2020 | March 31, 2020]]<br />
|<br />
* [[GlueX Software Meeting, March 17, 2020 | March 17, 2020]]<br />
* [[GlueX Software Meeting, March 3, 2020 | March 3, 2020]]<br />
* [[GlueX Software Meeting, February 18, 2020 | February 18, 2020]]<br />
* [[GlueX Software Meeting, February 4, 2020 | February 4, 2020]] <br />
* [[GlueX Software Meeting, January 21, 2020|January 21, 2020]]<br />
* [[GlueX Software Meeting, January 7, 2020|January 7, 2020]]<br />
|}<br />
<br />
== Offline Meetings in 2019 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 10, 2019|December 10, 2019]]<br />
* [[GlueX Software Meeting, November 26, 2019|November 26, 2019]]<br />
* [[GlueX Software Meeting, November 12, 2019|November 12, 2019]]<br />
* [[GlueX Software Meeting, October 29, 2019|October 29, 2019]]<br />
* [[GlueX Software Meeting, October 15, 2019|October 15, 2019]]<br />
|<br />
* [[GlueX Software Meeting, September 17, 2019|September 17, 2019]]<br />
* [[GlueX Software Meeting, September 3, 2019|September 3, 2019]]<br />
* [[GlueX Software Meeting, August 20, 2019|August 20, 2019]]<br />
* [[GlueX Software Meeting, August 6, 2019|August 6, 2019]]<br />
* [[GlueX Software Meeting, July 23, 2019|July 23, 2019]]<br />
|<br />
* [[GlueX Software Meeting, July 9, 2019|July 9, 2019]]<br />
* [[GlueX Software Meeting, June 25, 2019|June 25, 2019]]<br />
* [[GlueX Software Meeting, June 11, 2019|June 11, 2019]]<br />
* [[GlueX Software Meeting, May 28, 2019|May 28, 2019]]<br />
* [[GlueX Software Meeting, April 30, 2019|April 30, 2019]]<br />
|<br />
* [[GlueX Software Meeting, April 16, 2019|April 16, 2019]]<br />
* [[GlueX Software Meeting, March 5, 2019|March 5, 2019]]<br />
* [[GlueX Software Meeting, February 5, 2019|February 5, 2019]]<br />
* [[GlueX Software Meeting, January 22, 2019|January 22, 2019]]<br />
* [[GlueX Software Meeting, January 8, 2019|January 8, 2019]]<br />
|}<br />
<br />
== Offline Meetings in 2018 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 11, 2018|December 11, 2018]]<br />
* [[GlueX Software Meeting, November 13, 2018|November 13, 2018]]<br />
* [[GlueX Software Meeting, October 30, 2018|October 30, 2018]]<br />
* [[GlueX Offline Software Meeting, October 16, 2018|October 16, 2018]]<br />
* [[GlueX Offline Software Meeting, October 2, 2018|October 2, 2018]]<br />
* [[GlueX Offline Meeting, September 18, 2018|September 18, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, September 4, 2018|September 4, 2018]]<br />
* [[GlueX Offline Meeting, August 21, 2018|August 21, 2018]]<br />
* [[GlueX Offline Meeting, August 7, 2018|August 7, 2018]]<br />
* [[GlueX Offline Meeting, July 24, 2018|July 24, 2018]]<br />
* [[GlueX Offline Meeting, July 13, 2018|July 13, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, June 29, 2018|June 29, 2018]]<br />
* [[GlueX Offline Meeting, June 15, 2018|June 15, 2018]]<br />
* [[GlueX Offline Meeting, June 1, 2018|June 1, 2018]]<br />
* [[GlueX Offline Meeting, May 18, 2018|May 18, 2018]]<br />
* [[GlueX Offline Meeting, May 4, 2018|May 4, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, April 6, 2018|April 6, 2018]]<br />
* [[GlueX Offline Meeting, March 9, 2018|March 9, 2018]]<br />
* [[GlueX Offline Meeting, February 9, 2018|February 9, 2018]]<br />
* [[GlueX Offline Meeting, January 26, 2018|January 26, 2018]]<br />
* [[GlueX Offline Meeting, January 10, 2018|January 10, 2018]]<br />
|}<br />
<br />
== Offline Meetings in 2017 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 13, 2017|December 13, 2017]]<br />
* [[GlueX Offline Meeting, November 29, 2017|November 29, 2017]]<br />
* [[GlueX Offline Meeting, November 15, 2017|November 15, 2017]]<br />
* [[GlueX Offline Meeting, November 1, 2017|November 1, 2017]]<br />
* [[GlueX Offline Meeting, October 4, 2017|October 4, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, September 20, 2017|September 20, 2017]]<br />
* [[GlueX Offline Meeting, September 6, 2017|September 6, 2017]]<br />
* [[GlueX Offline Meeting, August 23, 2017|August 23, 2017]]<br />
* [[GlueX Offline Meeting, August 9, 2017|August 9, 2017]]<br />
* [[GlueX Offline Meeting, July 26, 2017|July 26, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, July 12, 2017|July 12, 2017]]<br />
* [[GlueX Offline Meeting, June 28, 2017|June 28, 2017]]<br />
* [[GlueX Offline Meeting, June 14, 2017|June 14, 2017]]<br />
* [[GlueX Offline Meeting, May 31, 2017|May 31, 2017]]<br />
* [[GlueX Offline Meeting, April 19, 2017|April 19, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, March 22, 2017|March 22, 2017]]<br />
* [[GlueX Offline Meeting, March 8, 2017|March 8, 2017]]<br />
* [[GlueX Offline Meeting, February 22, 2017|February 22, 2017]]<br />
* [[GlueX Offline Meeting, February 1, 2017|February 1, 2017]]<br />
* [[GlueX Offline Meeting, January 18, 2017|January 18, 2017]]<br />
|}<br />
<br />
== Offline Meetings in 2016 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 21, 2016|December 21, 2016]]<br />
* [[GlueX Offline Meeting, December 7, 2016|December 7, 2016]]<br />
* [[GlueX Offline Meeting, November 9, 2016|November 9, 2016]]<br />
* [[GlueX Offline Meeting, October 26, 2016|October 26, 2016]]<br />
* [[GlueX Offline Meeting, October 12, 2016|October 12, 2016]]<br />
* [[GlueX Offline Meeting, September 28, 2016|September 28, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, September 14, 2016|September 14, 2016]]<br />
* [[GlueX Offline Meeting, August 31, 2016|August 31, 2016]]<br />
* [[GlueX Offline Meeting, August 17, 2016|August 17, 2016]]<br />
* [[GlueX Offline Meeting, August 3, 2016|August 3, 2016]]<br />
* [[GlueX Offline Meeting, July 20, 2016|July 20, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, July 6, 2016|July 6, 2016]]<br />
* [[GlueX Offline Meeting, June 8, 2016|June 8, 2016]]<br />
* [[GlueX Offline Meeting, May 25, 2016|May 25, 2016]]<br />
* [[GlueX Offline Meeting, April 27, 2016|April 27, 2016]]<br />
* [[GlueX Offline Meeting, April 13, 2016|April 13, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, March 30, 2016|March 30, 2016]]<br />
* [[GlueX Offline Meeting, March 2, 2016|March 2, 2016]]<br />
* [[GlueX Offline Meeting, February 3, 2016|February 3, 2016]]<br />
* [[GlueX Offline Meeting, January 20, 2016|January 20, 2016]]<br />
* [[GlueX Offline Meeting, January 6, 2016|January 6, 2016]]<br />
|}<br />
<br />
== Offline Meetings in 2015 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 9, 2015|December 9, 2015]]<br />
* [[GlueX Offline Meeting, November 11, 2015|November 11, 2015]]<br />
* [[GlueX Offline Meeting, October 28, 2015|October 28, 2015]]<br />
* [[GlueX Offline Meeting, October 14, 2015|October 14, 2015]]<br />
* [[GlueX Offline Meeting, September 30, 2015|September 30, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, September 16, 2015|September 16, 2015]]<br />
* [[GlueX Offline Meeting, September 2, 2015|September 2, 2015]]<br />
* [[GlueX Offline Meeting, August 19, 2015|August 19, 2015]]<br />
* [[GlueX Offline Meeting, August 5, 2015|August 5, 2015]]<br />
* [[GlueX Offline Meeting, July 22, 2015|July 22, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, July 8, 2015|July 8, 2015]]<br />
* [[GlueX Offline Meeting, June 24, 2015|June 24, 2015]]<br />
* [[GlueX Offline Meeting, June 10, 2015|June 10, 2015]]<br />
* [[GlueX Offline Meeting, May 27, 2015|May 27, 2015]]<br />
* [[GlueX Offline Meeting, April 29, 2015|April 29, 2015]]<br />
* [[GlueX Offline Meeting, April 15, 2015|April 15, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, April 1, 2015|April 1, 2015]]<br />
* [[GlueX Offline Meeting, March 18, 2015|March 18, 2015]]<br />
* [[GlueX Offline Meeting, March 4, 2015|March 4, 2015]]<br />
* [[GlueX Offline Meeting, February 4, 2015|February 4, 2015]]<br />
* [[GlueX Offline Meeting, January 21, 2015|January 21, 2015]]<br />
* [[GlueX Offline Meeting, January 7, 2015|January 7, 2015]]<br />
|}<br />
<br />
== Offline Meetings in 2014 ==<br />
<br />
<table><tr><td><br />
* [[GlueX Offline Meeting, December 10, 2014|December 10, 2014]]<br />
* [[GlueX Offline Meeting, November 12, 2014|November 12, 2014]]<br />
* [[GlueX Offline Meeting, October 29, 2014|October 29, 2014]]<br />
* [[GlueX Offline Meeting, October 15, 2014|October 15, 2014]]<br />
* [[GlueX Offline Meeting, September 17, 2014|September 17, 2014]]<br />
* [[GlueX Offline Meeting, September 3, 2014|September 3, 2014]]<br />
* [[GlueX Offline Meeting, August 20, 2014|August 20, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, August 6, 2014|August 6, 2014]]<br />
* [[GlueX Offline Meeting, July 23, 2014|July 23, 2014]]<br />
* [[GlueX Offline Meeting, July 9, 2014|July 9, 2014]]<br />
* [[GlueX Offline Meeting, June 25, 2014|June 25, 2014]]<br />
* [[GlueX Offline Meeting, June 11, 2014|June 11, 2014]]<br />
* [[GlueX Offline Meeting, May 28, 2014|May 28, 2014]]<br />
* [[GlueX Offline Meeting, April 30, 2014|April 30, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, April 16, 2014|April 16, 2014]]<br />
* [[GlueX Offline Meeting, April 2, 2014|April 2, 2014]]<br />
* [[GlueX Offline Meeting, March 19, 2014|March 19, 2014]]<br />
* [[GlueX Offline Meeting, March 5, 2014|March 5, 2014]] (canceled, JLab network outage)<br />
* [[GlueX Offline Meeting, February 5, 2014|February 5, 2014]]<br />
* [[GlueX Offline Meeting, January 22, 2014|January 22, 2014]]<br />
* [[GlueX Offline Meeting, January 8, 2014|January 8, 2014]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2013 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 11, 2013|December 11, 2013]]<br />
* [[GlueX Offline Meeting, November 13, 2013|November 13, 2013]]<br />
* [[GlueX Offline Meeting, October 30, 2013|October 30, 2013]]<br />
* [[GlueX Offline Meeting, October 16, 2013|October 16, 2013]]<br />
* [[GlueX Offline Meeting, September 18, 2013|September 18, 2013]]<br />
* [[GlueX Offline Meeting, September 4, 2013|September 4, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 21, 2013|August 21, 2013]]<br />
* [[GlueX Offline Meeting, August 7, 2013|August 7, 2013]]<br />
* [[GlueX Offline Meeting, July 24, 2013|July 24, 2013]]<br />
* [[GlueX Offline Meeting, June 26, 2013|June 26, 2013]]<br />
* [[GlueX Offline Meeting, June 12, 2013|June 12, 2013]]<br />
* [[GlueX Offline Meeting, May 15, 2013|May 15, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 1, 2013|May 1, 2013]]<br />
* [[GlueX Offline Meeting, April 17, 2013|April 17, 2013]]<br />
* [[GlueX Offline Meeting, April 3, 2013|April 3, 2013]]<br />
* [[GlueX Offline Meeting, March 20, 2013|March 20, 2013]]<br />
* [[GlueX Offline Meeting, February 6, 2013|February 6, 2013]]<br />
* [[GlueX Offline Meeting, January 23, 2013|January 23, 2013]]<br />
* [[GlueX Offline Meeting, January 9, 2013|January 9, 2013]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2012 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 12, 2012|December 12, 2012]]<br />
* [[GlueX Offline Meeting, November 28, 2012|November 28, 2012]] (ARC 428)<br />
* [[GlueX Offline Meeting, November 14, 2012|November 14, 2012]]<br />
* [[GlueX Offline Meeting, October 31, 2012|October 31, 2012]]<br />
* [[GlueX Offline Meeting, October 17, 2012|October 17, 2012]]<br />
* [[GlueX Offline Meeting, October 3, 2012|October 3, 2012]]<br />
* [[GlueX Offline Meeting, September 19, 2012|September 19, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, September 5, 2012|September 5, 2012]]<br />
* [[GlueX Offline Meeting, August 22, 2012|August 22, 2012]]<br />
* [[GlueX Offline Meeting, August 8, 2012|August 8, 2012]]<br />
* [[GlueX Offline Meeting, July 25, 2012|July 25, 2012]]<br />
* [[GlueX Offline Meeting, July 11, 2012|July 11, 2012]]<br />
* [[GlueX Offline Meeting, June 27, 2012|June 27, 2012]]<br />
* [[GlueX Offline Meeting, June 13, 2012|June 13, 2012]]<br />
* [[GlueX Offline Meeting, May 30, 2012|May 30, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 16, 2012|May 16, 2012]]<br />
* [[GlueX Offline Meeting, April 18, 2012|April 18, 2012]]<br />
* [[GlueX Offline Meeting, March 21, 2012|March 21, 2012]]<br />
* [[GlueX Offline Meeting, February 22, 2012|February 22, 2012]]<br />
* [[GlueX Offline Meeting, February 8, 2012|February 8, 2012]]<br />
* [[GlueX Offline Meeting, January 25, 2012|January 25, 2012]]<br />
</td><td width=250><br />
</td></tr></table><br />
<br />
== Offline Meetings in 2011 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 14, 2011|December 14, 2011]]<br />
* [[GlueX Offline Meeting, November 30, 2011|November 30, 2011]]<br />
* [[GlueX Offline Meeting, November 16, 2011|November 16, 2011]]<br />
* [[GlueX Offline Meeting, November 2, 2011|November 2, 2011]]<br />
* [[GlueX Offline Meeting, October 19, 2011|October 19, 2011]] (canceled: Lehman Review)<br />
* [[GlueX Offline Meeting, September 21, 2011|September 21, 2011]]<br />
* [[GlueX Offline Meeting, September 7, 2011|September 7, 2011]]<br />
* [[GlueX Offline Meeting, August 24, 2011|August 24, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2011|August 10, 2011]]<br />
* [[GlueX Offline Meeting, July 27, 2011|July 27, 2011]]<br />
* [[GlueX Offline Meeting, July 13, 2011|July 13, 2011]]<br />
* [[GlueX Offline Meeting, June 29, 2011|June 29, 2011]]<br />
* [[GlueX Offline Meeting, June 15, 2011|June 15, 2011]]<br />
* [[GlueX Offline Meeting, June 1, 2011|June 1, 2011]]<br />
* [[GlueX Offline Meeting, May 18, 2011|May 18, 2011]]<br />
* [[GlueX Offline Meeting, April 20, 2011|April 20, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 6, 2011|April 6, 2011]]<br />
* [[GlueX Offline Meeting, March 23, 2011|March 23, 2011]]<br />
* [[GlueX Offline Meeting, March 9, 2011|March 9, 2011]]<br />
* [[GlueX Offline Meeting, February 23, 2011|February 23, 2011]]<br />
* [[GlueX Offline Meeting, February 9, 2011|February 9, 2011]]<br />
* [[GlueX Offline Meeting, January 26, 2011|January 26, 2011]]<br />
* [[GlueX Offline Meeting, January 12, 2011|January 12, 2011]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2010 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2010|December 15, 2010]]<br />
* [[GlueX Offline Meeting, December 1, 2010|December 1, 2010]]<br />
* [[GlueX Offline Meeting, November 17, 2010|November 17, 2010]]<br />
* [[GlueX Offline Meeting, November 2, 2010|November 2, 2010]]<br />
* [[GlueX Offline Meeting, October 19, 2010|October 19, 2010]]<br />
* [[GlueX Offline Meeting, October 5, 2010|October 5, 2010]]<br />
* [[GlueX Offline Meeting, September 21, 2010|September 21, 2010]]<br />
* [[GlueX Offline Meeting, August 24, 2010|August 24, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2010|August 10, 2010]]<br />
* [[GlueX Offline Meeting, July 27, 2010|July 27, 2010]]<br />
* [[GlueX Offline Meeting, July 13, 2010|July 13, 2010]]<br />
* [[GlueX Offline Meeting, June 29, 2010|June 29, 2010]]<br />
* [[GlueX Offline Meeting, June 15, 2010|June 15, 2010]]<br />
* [[GlueX Offline Meeting, June 1, 2010|June 1, 2010]]<br />
* [[GlueX Offline Meeting, May 18, 2010|May 18, 2010]]<br />
* [[GlueX Offline Meeting, May 4, 2010|May 4, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 20, 2010|April 20, 2010]]<br />
* [[GlueX Offline Meeting, April 6, 2010|April 6, 2010]]<br />
* [[GlueX Offline Meeting, March 23, 2010|March 23, 2010]]<br />
* [[GlueX Offline Meeting, March 9, 2010|March 9, 2010]]<br />
* [[GlueX Offline Meeting, February 23, 2010|February 23, 2010]]<br />
* [[GlueX Offline Meeting, February 9, 2010|February 9, 2010]]<br />
* [[GlueX Offline Meeting, January 12, 2010|January 12, 2010]]<br />
</td></tr></table><br />
== Offline Meetings in 2009 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2009|December 15, 2009]]<br />
* [[GlueX Offline Meeting, December 1, 2009|December 1, 2009]]<br />
* [[GlueX Offline Meeting, November 17, 2009|November 17, 2009]]<br />
* [[GlueX Offline Meeting, November 4, 2009|November 4, 2009]]<br />
* [[GlueX Offline Meeting, October 21, 2009|October 21, 2009]]<br />
</td><td width=250><br />
* [[ October 7, 2009 Software ]]<br />
* [[ September 23, 2009 Software ]]<br />
* [[ August 26, 2009 Software ]]<br />
* [[ August 12, 2009 Software ]]<br />
* [[ July 29, 2009 Software ]]<br />
* [[ July 1, 2009 Software ]]<br />
* [[ June 17, 2009 Software ]]<br />
</td><td width=250><br />
* [[ May 20, 2009 Software ]]<br />
* [[ May 6, 2009 Software ]]<br />
* [[ April 22, 2009 Software ]]<br />
* [[ April 8, 2009 Software ]]<br />
* [[ March 11, 2009 Software ]]<br />
* [[ Feburary 25, 2009 Software | February 25, 2009 Software ]]<br />
* [[ February 11, 2009 Software ]]<br />
* [[ January 14, 2009 Software ]]<br />
</td></tr></table><br />
== Offline Meetings in 2008 ==<br />
<table><tr><td width=250><br />
* [[ December 17, 2008 Software ]]<br />
* [[ December 3, 2008 Software ]]<br />
* [[ November 18, 2008 Software ]]<br />
* [[ October 8, 2008 Software ]]<br />
* <s>[[ September 12, 2008 Software ]]</s><br />
</td><td width=250><br />
* [[ August 29, 2008 Software ]]<br />
* [[ August 15, 2008 Software ]]<br />
* [[ August 1, 2008 Software ]]<br />
* [[ July 18, 2008 Software ]]<br />
* [[ July 3, 2008 Software ]]<br />
* [[ June. 6, 2008 Software ]]<br />
</td><td width=250><br />
* [[ May. 23, 2008 Software ]]<br />
* [[ Feb. 29, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 22, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 15, 2008 Tracking CDC/FDC ]]<br />
* [[February 8, 2008 Software]]<br />
* [[January 25, 2008 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2007 ==<br />
<table><tr><td width=250><br />
* [[December 7, 2007 Software]]<br />
* [[November 30, 2007 Software]]<br />
* [[November 13, 2007 Software]]<br />
* [[October 19, 2007 Software]]<br />
* [[September 21,2007 Software]]<br />
* [[September 11,2007 Software]]<br />
* [[August 21,2007 Software]]<br />
* [[August 14, 2007 Software]]<br />
</td><td width=250><br />
* [[July 31, 2007 Software]]<br />
* [[July 17, 2007 Software]]<br />
* [[June 5, 2007 Software]]<br />
* [[May 22, 2007 Software]]<br />
* [[May 1, 2007 Software]]<br />
* [[April 17, 2007 Software]]<br />
* [[April 10, 2007 Software]]<br />
* [[March 20, 2007 Software]]<br />
</td><td width=250><br />
* [[March 13, 2007 Software]]<br />
* [[February 27, 2007 Software]]<br />
* [[February 20, 2007 Software]]<br />
* [[February 13, 2007 Software]]<br />
* [[February 6, 2007 Software]]<br />
* [[January 30, 2007 Software]]<br />
* [[January 16, 2007 Software]]<br />
* [[January 8, 2007 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2006 ==<br />
<table><tr><td width=250><br />
* [[December 18, 2006 Software]]<br />
* [[December 11, 2006 Software]]<br />
* [[December 4, 2006 Software]]<br />
</td><td width=250><br />
* [[September 6, 2006 Software]]<br />
* [[August 28, 2006 Software]]<br />
* [[August 14, 2006 Software]]<br />
* [[August 7, 2006 Software]]<br />
</td><td width=250><br />
* [[July 31, 2006 Software]]<br />
* [[July 10, 2006 Software]]<br />
* [[July 5,2006 Software]]<br />
* [[May 8, 2006 Software]]<br />
</td></tr></table><br />
<br />
=Special Meetings=<br />
* [[fADC Emulation Meeting, August 26, 2015]]<br />
* [[Data Plan Meeting, February 13, 2014]]<br />
* [[Particle Decay Chain Meeting, September 11, 2013]]<br />
* [[GlueX and the OSG, Meeting on Resource Contribution, March 31, 2017]]</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=HOWTO_use_AmpTools_on_the_JLab_farm_GPUs&diff=125057HOWTO use AmpTools on the JLab farm GPUs2024-03-11T19:22:18Z<p>Aaustreg: /* AmpTools Compilation with CUDA */</p>
<hr />
<div>=== Access through SLURM ===<br />
<br />
JLab currently provides NVIDIA Titan RTX and T4 cards on the sciml19 and sciml21 nodes, and 4 NVIDIA A100 (80 GB) cards on each of the two sciml23 nodes. The nodes can be accessed through SLURM, where N is the number of requested cards (1-4):<br />
>salloc --gres gpu:TitanRTX:N --partition gpu --nodes 1 --mem-per-cpu=4G<br />
or<br />
>salloc --gres gpu:T4:N --partition gpu --nodes 1 --mem-per-cpu=4G<br />
or<br />
>salloc --gres gpu:A100:N --partition gpu --nodes 1 --mem-per-cpu=4G<br />
The default memory request is 512MB per CPU, which is often too small.<br />
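The three salloc invocations above differ only in the card type. As a sketch (in sh/bash syntax; the helper name is hypothetical), the command line can be built from the card type and count:<br />

```shell
# Hypothetical helper: build the salloc line for a given card type and count.
# It only prints the command; run the printed line on a login node to allocate.
gpu_alloc_cmd() {
    card=$1
    n=${2:-1}
    echo "salloc --gres gpu:${card}:${n} --partition gpu --nodes 1 --mem-per-cpu=4G"
}

gpu_alloc_cmd A100 2
```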
<br />
An interactive shell (e.g. bash) on the node with requested allocation can be opened with srun:<br />
>srun --pty bash<br />
Information about the cards, CUDA version, and usage is displayed with this command:<br />
<pre><br />
>nvidia-smi<br />
<br />
+-----------------------------------------------------------------------------+<br />
| NVIDIA-SMI 418.87.01 Driver Version: 418.87.01 CUDA Version: 10.1 |<br />
|-------------------------------+----------------------+----------------------+<br />
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |<br />
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |<br />
|===============================+======================+======================|<br />
| 0 TITAN RTX Off | 00000000:3E:00.0 Off | N/A |<br />
| 41% 27C P8 2W / 280W | 0MiB / 24190MiB | 0% Default |<br />
+-------------------------------+----------------------+----------------------+<br />
<br />
+-----------------------------------------------------------------------------+<br />
| Processes: GPU Memory |<br />
| GPU PID Type Process name Usage |<br />
|=============================================================================|<br />
| No running processes found |<br />
+-----------------------------------------------------------------------------+<br />
</pre><br />
<br />
=== AmpTools Compilation with CUDA ===<br />
This example was done in csh for the Titan RTX cards available on sciml1902.<br><br />
'''The compilation does not have to be performed on a machine with GPUs. We chose the interactive node ifarm1901 here.'''<br />
<br />
'''1)''' Clone the latest AmpTools repository<br />
git clone git@github.com:mashephe/AmpTools.git<br />
<br />
'''2)''' Set AMPTOOLS directory<br />
setenv AMPTOOLS_HOME $PWD/AmpTools/<br />
setenv AMPTOOLS $AMPTOOLS_HOME/AmpTools/<br />
<br />
'''3)''' Load the CUDA environment module (source <code>/etc/profile.d/modules.csh</code> first if the <code>module</code> command is not found)<br />
module add cuda<br />
setenv CUDA_INSTALL_PATH /apps/cuda/11.4.2/<br />
With the advent of AlmaLinux9 at JLab, the modules were moved from /apps to /cvmfs:<br />
module use /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles<br />
module load cuda<br />
 setenv CUDA_INSTALL_PATH /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/cuda/12.2.2/<br />
'''4)''' Set AMPTOOLS directory<br />
setenv AMPTOOLS $PWD/AmpTools<br />
<br />
'''5)''' Put root-config in your path<br />
setenv PATH $ROOTSYS/bin:$PATH<br />
<br />
'''6)''' Set the appropriate architecture for the CUDA compiler (info e.g. [https://arnon.dk/matching-sm-architectures-arch-and-gencode-for-various-nvidia-cards/ here])<br />
setenv GPU_ARCH sm_75 (for T4 and TitanRTX)<br />
setenv GPU_ARCH sm_80 (for A100)<br />
For older (pre-0.13) versions of AmpTools, edit the Makefile and adjust the line:<br />
CUDA_FLAGS := -m64 -arch=sm_75<br />
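The card-to-architecture mapping above can be expressed as a small sh/bash helper (the function name is hypothetical, and it covers only the card types listed here); csh users would keep using setenv as shown:<br />

```shell
# Hypothetical helper: map the SLURM card name to the nvcc architecture value.
# Covers only the card types listed above.
pick_gpu_arch() {
    case "$1" in
        T4|TitanRTX) echo sm_75 ;;
        A100)        echo sm_80 ;;
        *) echo "unknown card: $1" >&2; return 1 ;;
    esac
}

GPU_ARCH=$(pick_gpu_arch A100)
echo "$GPU_ARCH"   # prints sm_80
```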
<br />
'''7)''' Build main AmpTools library with GPU support<br />
cd $AMPTOOLS_HOME<br />
make gpu<br />
<br />
=== halld_sim Compilation with GPU ===<br />
<br />
The GPU-dependent part of halld_sim is libraries/AMPTOOLS_AMPS/, where the GPU kernels are located. With the environment set up as above, compile the full halld_sim; the build will recognize the AmpTools GPU flag and build the necessary libraries and executables to run on the GPU:<br />
cd $HALLD_SIM_HOME/src/<br />
scons -u install -j8<br />
<br />
=== Performing Fits Interactively ===<br />
<br />
With the environment set up as above, the fit executable is run the same way as on a CPU:<br />
fit -c YOURCONFIG.cfg<br />
where YOURCONFIG.cfg is your usual config file. Additional command-line parameters can be used as needed.<br />
<br />
== Combining GPU and MPI ==<br />
<br />
To utilize multiple GPUs in the same fit, you'll need both the AmpTools and halld_sim libraries compiled with GPU and MPI support. To complete the steps below, you'll need to be logged into one of the sciml nodes with GPU support (as described above).<br />
<br />
=== AmpTools ===<br />
<br />
Build the main AmpTools library with GPU and MPI support (note the "mpigpu" option). If mpicxx is missing, load it with "module load mpi/openmpi3-x86_64"<br />
cd $AMPTOOLS_HOME<br />
make mpigpu<br />
<br />
=== halld_sim ===<br />
<br />
With the environment set up as above, only the fitMPI executable needs to be recompiled; the build will recognize the AmpTools GPU and MPI flags and produce the executable to run on the GPU with MPI:<br />
cd $HALLD_SIM_HOME/src/programs/AmplitudeAnalysis/fitMPI/<br />
scons -u install<br />
<br />
=== Performing Fits Interactively ===<br />
<br />
The fitMPI executable is run with mpirun, the same as on a CPU:<br />
<br />
mpirun fitMPI -c YOURCONFIG.cfg<br />
<br />
If you're using Slurm, it will recognize how many GPUs you've reserved and assign the number of parallel processes to make use of those GPUs.<br />
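As a sketch of how the process count relates to the GPU count (assuming one worker rank per GPU plus one manager rank, and that SLURM sets SLURM_GPUS_ON_NODE inside the allocation), the -np value could also be derived explicitly:<br />

```shell
# Sketch (assumptions noted above): derive the MPI rank count from the
# SLURM GPU allocation; defaults to 1 GPU when run outside an allocation.
ngpu=${SLURM_GPUS_ON_NODE:-1}
echo "mpirun -np $((ngpu + 1)) fitMPI -c YOURCONFIG.cfg"
```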
<br />
== Submitting Batch Jobs ==<br />
This example script can be submitted via Slurm using the <code>sbatch</code> command. Replace WORKDIR with the full path to an existing directory that contains the configuration file FILE.cfg, and replace ENV.csh with a shell setup script.<br />
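One way to manage the substitution (a sketch; the template and output file names are hypothetical) is to keep WORKDIR as a literal placeholder in a template and fill it in with sed at submission time:<br />

```shell
# Sketch: write a two-line template containing the WORKDIR placeholder,
# then substitute the real path. File names here are hypothetical.
workdir=/tmp/demo_fit   # replace with your real, existing work directory
printf '#SBATCH --chdir=WORKDIR\n#SBATCH --error=WORKDIR/log/fit.err\n' > fit_template.sbatch
sed "s|WORKDIR|$workdir|g" fit_template.sbatch > fit_job.sbatch
cat fit_job.sbatch
```

The resulting fit_job.sbatch would then be submitted with sbatch.<br />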
<br />
#!/bin/csh <br />
#SBATCH --nodes=1 <br />
#SBATCH --partition=gpu <br />
#SBATCH --gres=gpu:A100:1 <br />
#SBATCH --cpus-per-task=1 <br />
#SBATCH --ntasks-per-core=1 <br />
#SBATCH --threads-per-core=1 <br />
#SBATCH --mem=10GB <br />
#SBATCH --time=8:00:00 <br />
#SBATCH --ntasks=2 <br />
#SBATCH --chdir=WORKDIR <br />
#SBATCH --error=WORKDIR/log/fit.err <br />
#SBATCH --output=WORKDIR/log/fit.out <br />
#SBATCH --job-name=MyGPUfit <br />
<br />
source ENV.csh <br />
fit -c WORKDIR/FILE.cfg -m 1000000 -r 10</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125054GlueX Software Meeting, March 11, 20242024-03-11T16:11:17Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_12,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Nightly builds moved from /u/scratch to /volatile/halld; the recon test, pull request test, and b1pi test have been adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* [https://halldweb.jlab.org/talks/2024/GlueXSoftwareStackonALMA9.pdf Detailed comparison of reconstruction on CentOS7 vs Alma9] (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read the database with python3, but not write to it<br />
#** CCDB 2 currently cannot write either<br />
#** [https://markito3.wordpress.com/2020/02/11/rolling-out-ccdb-2-0/ Mark's plan from 2020]<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_12,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Move nightly builds from /u/scratch to /volatile/halld done, recon test, pull request test and b1pi test adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* CUDA libraries not available right now, ticket submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read the database with python3, but not write to it<br />
#** CCDB 2 currently cannot write either<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125041GlueX Software Meeting, March 11, 20242024-03-08T22:22:51Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_12,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* [https://halldweb.jlab.org/halld_versions/version_5.15.0.xml version_5.15.0.xml] builds and runs on RHEL7, Centos7, RHEL8, Alma9 and containers<br />
#* Move nightly builds from /u/scratch to /volatile/halld done, recon test, pull request test and b1pi test adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* CUDA libraries not available right now, ticket submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read the database with python3, but not write to it<br />
#** CCDB 2 currently cannot write either<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125040GlueX Software Meeting, March 11, 20242024-03-08T22:18:45Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_12,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Move nightly builds from /u/scratch to /volatile/halld done, recon test, pull request test and b1pi test adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* CUDA libraries not available right now, ticket submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#** CCDB 1.06.11 can read the database with python3, but not write to it<br />
#** CCDB 2 currently cannot write either<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=125039GlueX Software Meeting, February 12, 20242024-03-08T22:03:37Z<p>Aaustreg: </p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Incomplete draft]<br />
#* Report from the February SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-02.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125038GlueX Software Meeting, March 11, 20242024-03-08T22:03:22Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_12,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Move nightly builds from /u/scratch to /volatile/halld done, recon test, pull request test and b1pi test adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* CUDA libraries not available right now, ticket submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125032GlueX Software Meeting, March 11, 20242024-03-08T16:08:58Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_14,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Move nightly builds from /u/scratch to /volatile/halld done, recon test, pull request test and b1pi test adapted<br />
#* /apps -> /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw<br />
#* CUDA libraries not available right now, ticket submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125031GlueX Software Meeting, March 11, 20242024-03-08T16:08:27Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_14,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Move of nightly builds from /u/scratch to /volatile/halld is done; the recon test, pull-request test, and b1pi test have been adapted<br />
#* /apps -> /cvmfs/<br />
#* CUDA libraries are not available right now; a ticket has been submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125030GlueX Software Meeting, March 11, 20242024-03-08T16:06:03Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_14,_2024 minutes and action items]<br />
# OS Upgrade to Alma9:<br />
#* Move of nightly builds from /u/scratch to /volatile/halld is done; the recon test, pull-request test, and b1pi test have been adapted<br />
#* CUDA libraries are not available right now; a ticket has been submitted<br />
#* Detailed comparison of reconstruction on CentOS7 vs Alma9 (Beni)<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_March_11,_2024&diff=125029GlueX Software Meeting, March 11, 20242024-03-08T16:02:01Z<p>Aaustreg: Created page with "GlueX Software Meeting<br> Monday, March 11, 2024<br> 11:00 am EDT<br> F326/327<br> <div class="mw-collapsible mw-collapsed"> Zoom Meeting ID: 160 636 9159 Passcode: 888788 [..."</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, March 11, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Final Report]<br />
#* Report from the March SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-03.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_February_14,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Offline_Software_Meetings&diff=125028GlueX Offline Software Meetings2024-03-08T16:00:06Z<p>Aaustreg: /* Offline Meetings in 2024 */</p>
<hr />
<div>=Regularly Scheduled Meetings=<br />
<br />
== Offline Meetings in 2024 ==<br />
<br />
* [[GlueX Software Meeting, March 11, 2024 | March 11, 2024]]<br />
* [[GlueX Software Meeting, February 12, 2024 | February 12, 2024]]<br />
* [[GlueX Software Meeting, January 29, 2024 | January 29, 2024]]<br />
<br />
== Offline Meetings in 2023 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 18, 2023 | December 18, 2023]]<br />
* [[GlueX Software Meeting, November 20, 2023 | November 20, 2023]]<br />
* [[GlueX Software Meeting, November 6, 2023 | November 6, 2023]]<br />
* [[GlueX Software Meeting, October 23, 2023 | October 23, 2023]]<br />
* [[GlueX Software Meeting, October 9, 2023 | October 9, 2023]]<br />
* [[GlueX Software Meeting, September 11, 2023 | September 11, 2023]]<br />
|<br />
* [[GlueX Software Meeting, August 28, 2023 | August 28, 2023]]<br />
* [[GlueX Software Meeting, August 14, 2023 | August 14, 2023]]<br />
* [[GlueX Software Meeting, March 27, 2023 | March 27, 2023]]<br />
* [[GlueX Software Meeting, March 13, 2023 | March 13, 2023]]<br />
* [[GlueX Software Meeting, January 30, 2023 | January 30, 2023]]<br />
|}<br />
<br />
== Offline Meetings in 2022 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 19, 2022 | December 19, 2022]]<br />
* [[GlueX Software Meeting, November 7, 2022 | November 7, 2022]]<br />
* [[GlueX Software Meeting, October 10, 2022 | October 10, 2022]]<br />
* [[GlueX Software Meeting, August 29, 2022 | August 29, 2022]]<br />
* [[GlueX Software Meeting, August 15, 2022 | August 15, 2022]]<br />
* [[GlueX Software Meeting, July 18, 2022 | July 18, 2022]]<br />
|<br />
* [[GlueX Software Meeting, June 6, 2022 | June 6, 2022]]<br />
* [[GlueX Software Meeting, May 9, 2022 | May 9, 2022]]<br />
* [[GlueX Software Meeting, April 27, 2022 | April 27, 2022]]<br />
* [[GlueX Software Meeting, April 13, 2022 | April 13, 2022]]<br />
* [[GlueX Software Meeting, March 30, 2022 | March 30, 2022]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2022 | March 16, 2022]]<br />
* [[GlueX Software Meeting, March 2, 2022 | March 2, 2022]]<br />
* [[GlueX Software Meeting, February 16, 2022 | February 16, 2022]]<br />
* [[GlueX Software Meeting, February 2, 2022 | February 2, 2022]]<br />
* [[GlueX Software Meeting, January 18, 2022 | January 18, 2022]]<br />
|}<br />
<br />
== Offline Meetings in 2021 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 20, 2021 | December 20, 2021]]<br />
* [[GlueX Software Meeting, December 6, 2021 | December 6, 2021]]<br />
* [[GlueX Software Meeting, November 8, 2021, 2021 | November 8, 2021]]<br />
* [[GlueX Software Meeting, October 25, 2021, 2021 | October 25, 2021]]<br />
|<br />
* [[GlueX Software Meeting, October 11, 2021, 2021 | October 11, 2021]]<br />
* [[GlueX Software Meeting, September 27, 2021, 2021 | September 27, 2021]]<br />
* [[GlueX Software Meeting, August 31, 2021 | August 31, 2021]]<br />
* [[GlueX Software Meeting, August 17, 2021 | August 17, 2021]]<br />
* [[GlueX Software Meeting, July 20, 2021 | July 20, 2021]]<br />
|<br />
* [[GlueX Software Meeting, July 6, 2021 | July 6, 2021]]<br />
* [[GlueX Software Meeting, June 22, 2021 | June 22, 2021]]<br />
* [[GlueX Software Meeting, May 11, 2021 | May 11, 2021]]<br />
* [[GlueX Software Meeting, April 28, 2021 | April 28, 2021]]<br />
* [[GlueX Software Meeting, March 30, 2021 | March 30, 2021]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2021 | March 16, 2021]]<br />
* [[GlueX Software Meeting, March 2, 2021 | March 2, 2021]]<br />
* [[GlueX Software Meeting, February 2, 2021 | February 2, 2021]]<br />
* [[GlueX Software Meeting, January 19, 2021 | January 19, 2021]]<br />
* [[GlueX Software Meeting, January 5, 2021 | January 5, 2021]]<br />
|}<br />
<br />
== Offline Meetings in 2020 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 8, 2020 | December 8, 2020]]<br />
* [[GlueX Software Meeting, November 24, 2020 | November 24, 2020]]<br />
* [[GlueX Software Meeting, November 10, 2020 | November 10, 2020]]<br />
* [[GlueX Software Meeting, October 13, 2020 | October 13, 2020]]<br />
|<br />
* [[GlueX Software Meeting, September 29, 2020 | September 29, 2020]]<br />
* [[GlueX Software Meeting, September 15, 2020 | September 15, 2020]]<br />
* [[GlueX Software Meeting, September 1, 2020 | September 1, 2020]]<br />
* [[GlueX Software Meeting, August 18, 2020 | August 18, 2020]]<br />
* [[GlueX Software Meeting, August 4, 2020 | August 4, 2020]]<br />
* [[GlueX Software Meeting, July 21, 2020 | July 21, 2020]]<br />
|<br />
* [[GlueX Software Meeting, July 7, 2020 | July 7, 2020]]<br />
* [[GlueX Software Meeting, June 9, 2020 | June 9, 2020]]<br />
* [[GlueX Software Meeting, May 26, 2020 | May 26, 2020]]<br />
* [[GlueX Software Meeting, April 28, 2020 | April 28, 2020]]<br />
* [[GlueX Software Meeting, April 14, 2020 | April 14, 2020]]<br />
* [[GlueX Software Meeting, March 31, 2020 | March 31, 2020]]<br />
|<br />
* [[GlueX Software Meeting, March 17, 2020 | March 17, 2020]]<br />
* [[GlueX Software Meeting, March 3, 2020 | March 3, 2020]]<br />
* [[GlueX Software Meeting, February 18, 2020 | February 18, 2020]]<br />
* [[GlueX Software Meeting, February 4, 2020 | February 4, 2020]] <br />
* [[GlueX Software Meeting, January 21, 2020|January 21, 2020]]<br />
* [[GlueX Software Meeting, January 7, 2020|January 7, 2020]]<br />
|}<br />
<br />
== Offline Meetings in 2019 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 10, 2019|December 10, 2019]]<br />
* [[GlueX Software Meeting, November 26, 2019|November 26, 2019]]<br />
* [[GlueX Software Meeting, November 12, 2019|November 12, 2019]]<br />
* [[GlueX Software Meeting, October 29, 2019|October 29, 2019]]<br />
* [[GlueX Software Meeting, October 15, 2019|October 15, 2019]]<br />
|<br />
* [[GlueX Software Meeting, September 17, 2019|September 17, 2019]]<br />
* [[GlueX Software Meeting, September 3, 2019|September 3, 2019]]<br />
* [[GlueX Software Meeting, August 20, 2019|August 20, 2019]]<br />
* [[GlueX Software Meeting, August 6, 2019|August 6, 2019]]<br />
* [[GlueX Software Meeting, July 23, 2019|July 23, 2019]]<br />
|<br />
* [[GlueX Software Meeting, July 9, 2019|July 9, 2019]]<br />
* [[GlueX Software Meeting, June 25, 2019|June 25, 2019]]<br />
* [[GlueX Software Meeting, June 11, 2019|June 11, 2019]]<br />
* [[GlueX Software Meeting, May 28, 2019|May 28, 2019]]<br />
* [[GlueX Software Meeting, April 30, 2019|April 30, 2019]]<br />
|<br />
* [[GlueX Software Meeting, April 16, 2019|April 16, 2019]]<br />
* [[GlueX Software Meeting, March 5, 2019|March 5, 2019]]<br />
* [[GlueX Software Meeting, February 5, 2019|February 5, 2019]]<br />
* [[GlueX Software Meeting, January 22, 2019|January 22, 2019]]<br />
* [[GlueX Software Meeting, January 8, 2019|January 8, 2019]]<br />
|}<br />
<br />
== Offline Meetings in 2018 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 11, 2018|December 11, 2018]]<br />
* [[GlueX Software Meeting, November 13, 2018|November 13, 2018]]<br />
* [[GlueX Software Meeting, October 30, 2018|October 30, 2018]]<br />
* [[GlueX Offline Software Meeting, October 16, 2018|October 16, 2018]]<br />
* [[GlueX Offline Software Meeting, October 2, 2018|October 2, 2018]]<br />
* [[GlueX Offline Meeting, September 18, 2018|September 18, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, September 4, 2018|September 4, 2018]]<br />
* [[GlueX Offline Meeting, August 21, 2018|August 21, 2018]]<br />
* [[GlueX Offline Meeting, August 7, 2018|August 7, 2018]]<br />
* [[GlueX Offline Meeting, July 24, 2018|July 24, 2018]]<br />
* [[GlueX Offline Meeting, July 13, 2018|July 13, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, June 29, 2018|June 29, 2018]]<br />
* [[GlueX Offline Meeting, June 15, 2018|June 15, 2018]]<br />
* [[GlueX Offline Meeting, June 1, 2018|June 1, 2018]]<br />
* [[GlueX Offline Meeting, May 18, 2018|May 18, 2018]]<br />
* [[GlueX Offline Meeting, May 4, 2018|May 4, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, April 6, 2018|April 6, 2018]]<br />
* [[GlueX Offline Meeting, March 9, 2018|March 9, 2018]]<br />
* [[GlueX Offline Meeting, February 9, 2018|February 9, 2018]]<br />
* [[GlueX Offline Meeting, January 26, 2018|January 26, 2018]]<br />
* [[GlueX Offline Meeting, January 10, 2018|January 10, 2018]]<br />
|}<br />
<br />
== Offline Meetings in 2017 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 13, 2017|December 13, 2017]]<br />
* [[GlueX Offline Meeting, November 29, 2017|November 29, 2017]]<br />
* [[GlueX Offline Meeting, November 15, 2017|November 15, 2017]]<br />
* [[GlueX Offline Meeting, November 1, 2017|November 1, 2017]]<br />
* [[GlueX Offline Meeting, October 4, 2017|October 4, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, September 20, 2017|September 20, 2017]]<br />
* [[GlueX Offline Meeting, September 6, 2017|September 6, 2017]]<br />
* [[GlueX Offline Meeting, August 23, 2017|August 23, 2017]]<br />
* [[GlueX Offline Meeting, August 9, 2017|August 9, 2017]]<br />
* [[GlueX Offline Meeting, July 26, 2017|July 26, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, July 12, 2017|July 12, 2017]]<br />
* [[GlueX Offline Meeting, June 28, 2017|June 28, 2017]]<br />
* [[GlueX Offline Meeting, June 14, 2017|June 14, 2017]]<br />
* [[GlueX Offline Meeting, May 31, 2017|May 31, 2017]]<br />
* [[GlueX Offline Meeting, April 19, 2017|April 19, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, March 22, 2017|March 22, 2017]]<br />
* [[GlueX Offline Meeting, March 8, 2017|March 8, 2017]]<br />
* [[GlueX Offline Meeting, February 22, 2017|February 22, 2017]]<br />
* [[GlueX Offline Meeting, February 1, 2017|February 1, 2017]]<br />
* [[GlueX Offline Meeting, January 18, 2017|January 18, 2017]]<br />
|}<br />
<br />
== Offline Meetings in 2016 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 21, 2016|December 21, 2016]]<br />
* [[GlueX Offline Meeting, December 7, 2016|December 7, 2016]]<br />
* [[GlueX Offline Meeting, November 9, 2016|November 9, 2016]]<br />
* [[GlueX Offline Meeting, October 26, 2016|October 26, 2016]]<br />
* [[GlueX Offline Meeting, October 12, 2016|October 12, 2016]]<br />
* [[GlueX Offline Meeting, September 28, 2016|September 28, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, September 14, 2016|September 14, 2016]]<br />
* [[GlueX Offline Meeting, August 31, 2016|August 31, 2016]]<br />
* [[GlueX Offline Meeting, August 17, 2016|August 17, 2016]]<br />
* [[GlueX Offline Meeting, August 3, 2016|August 3, 2016]]<br />
* [[GlueX Offline Meeting, July 20, 2016|July 20, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, July 6, 2016|July 6, 2016]]<br />
* [[GlueX Offline Meeting, June 8, 2016|June 8, 2016]]<br />
* [[GlueX Offline Meeting, May 25, 2016|May 25, 2016]]<br />
* [[GlueX Offline Meeting, April 27, 2016|April 27, 2016]]<br />
* [[GlueX Offline Meeting, April 13, 2016|April 13, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, March 30, 2016|March 30, 2016]]<br />
* [[GlueX Offline Meeting, March 2, 2016|March 2, 2016]]<br />
* [[GlueX Offline Meeting, February 3, 2016|February 3, 2016]]<br />
* [[GlueX Offline Meeting, January 20, 2016|January 20, 2016]]<br />
* [[GlueX Offline Meeting, January 6, 2016|January 6, 2016]]<br />
|}<br />
<br />
== Offline Meetings in 2015 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 9, 2015|December 9, 2015]]<br />
* [[GlueX Offline Meeting, November 11, 2015|November 11, 2015]]<br />
* [[GlueX Offline Meeting, October 28, 2015|October 28, 2015]]<br />
* [[GlueX Offline Meeting, October 14, 2015|October 14, 2015]]<br />
* [[GlueX Offline Meeting, September 30, 2015|September 30, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, September 16, 2015|September 16, 2015]]<br />
* [[GlueX Offline Meeting, September 2, 2015|September 2, 2015]]<br />
* [[GlueX Offline Meeting, August 19, 2015|August 19, 2015]]<br />
* [[GlueX Offline Meeting, August 5, 2015|August 5, 2015]]<br />
* [[GlueX Offline Meeting, July 22, 2015|July 22, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, July 8, 2015|July 8, 2015]]<br />
* [[GlueX Offline Meeting, June 24, 2015|June 24, 2015]]<br />
* [[GlueX Offline Meeting, June 10, 2015|June 10, 2015]]<br />
* [[GlueX Offline Meeting, May 27, 2015|May 27, 2015]]<br />
* [[GlueX Offline Meeting, April 29, 2015|April 29, 2015]]<br />
* [[GlueX Offline Meeting, April 15, 2015|April 15, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, April 1, 2015|April 1, 2015]]<br />
* [[GlueX Offline Meeting, March 18, 2015|March 18, 2015]]<br />
* [[GlueX Offline Meeting, March 4, 2015|March 4, 2015]]<br />
* [[GlueX Offline Meeting, February 4, 2015|February 4, 2015]]<br />
* [[GlueX Offline Meeting, January 21, 2015|January 21, 2015]]<br />
* [[GlueX Offline Meeting, January 7, 2015|January 7, 2015]]<br />
|}<br />
<br />
== Offline Meetings in 2014 ==<br />
<br />
<table><tr><td><br />
* [[GlueX Offline Meeting, December 10, 2014|December 10, 2014]]<br />
* [[GlueX Offline Meeting, November 12, 2014|November 12, 2014]]<br />
* [[GlueX Offline Meeting, October 29, 2014|October 29, 2014]]<br />
* [[GlueX Offline Meeting, October 15, 2014|October 15, 2014]]<br />
* [[GlueX Offline Meeting, September 17, 2014|September 17, 2014]]<br />
* [[GlueX Offline Meeting, September 3, 2014|September 3, 2014]]<br />
* [[GlueX Offline Meeting, August 20, 2014|August 20, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, August 6, 2014|August 6, 2014]]<br />
* [[GlueX Offline Meeting, July 23, 2014|July 23, 2014]]<br />
* [[GlueX Offline Meeting, July 9, 2014|July 9, 2014]]<br />
* [[GlueX Offline Meeting, June 25, 2014|June 25, 2014]]<br />
* [[GlueX Offline Meeting, June 11, 2014|June 11, 2014]]<br />
* [[GlueX Offline Meeting, May 28, 2014|May 28, 2014]]<br />
* [[GlueX Offline Meeting, April 30, 2014|April 30, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, April 16, 2014|April 16, 2014]]<br />
* [[GlueX Offline Meeting, April 2, 2014|April 2, 2014]]<br />
* [[GlueX Offline Meeting, March 19, 2014|March 19, 2014]]<br />
* [[GlueX Offline Meeting, March 5, 2014|March 5, 2014]] (canceled, JLab network outage)<br />
* [[GlueX Offline Meeting, February 5, 2014|February 5, 2014]]<br />
* [[GlueX Offline Meeting, January 22, 2014|January 22, 2014]]<br />
* [[GlueX Offline Meeting, January 8, 2014|January 8, 2014]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2013 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 11, 2013|December 11, 2013]]<br />
* [[GlueX Offline Meeting, November 13, 2013|November 13, 2013]]<br />
* [[GlueX Offline Meeting, October 30, 2013|October 30, 2013]]<br />
* [[GlueX Offline Meeting, October 16, 2013|October 16, 2013]]<br />
* [[GlueX Offline Meeting, September 18, 2013|September 18, 2013]]<br />
* [[GlueX Offline Meeting, September 4, 2013|September 4, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 21, 2013|August 21, 2013]]<br />
* [[GlueX Offline Meeting, August 7, 2013|August 7, 2013]]<br />
* [[GlueX Offline Meeting, July 24, 2013|July 24, 2013]]<br />
* [[GlueX Offline Meeting, June 26, 2013|June 26, 2013]]<br />
* [[GlueX Offline Meeting, June 12, 2013|June 12, 2013]]<br />
* [[GlueX Offline Meeting, May 15, 2013|May 15, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 1, 2013|May 1, 2013]]<br />
* [[GlueX Offline Meeting, April 17, 2013|April 17, 2013]]<br />
* [[GlueX Offline Meeting, April 3, 2013|April 3, 2013]]<br />
* [[GlueX Offline Meeting, March 20, 2013|March 20, 2013]]<br />
* [[GlueX Offline Meeting, February 6, 2013|February 6, 2013]]<br />
* [[GlueX Offline Meeting, January 23, 2013|January 23, 2013]]<br />
* [[GlueX Offline Meeting, January 9, 2013|January 9, 2013]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2012 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 12, 2012|December 12, 2012]]<br />
* [[GlueX Offline Meeting, November 28, 2012|November 28, 2012]] (ARC 428)<br />
* [[GlueX Offline Meeting, November 14, 2012|November 14, 2012]]<br />
* [[GlueX Offline Meeting, October 31, 2012|October 31, 2012]]<br />
* [[GlueX Offline Meeting, October 17, 2012|October 17, 2012]]<br />
* [[GlueX Offline Meeting, October 3, 2012|October 3, 2012]]<br />
* [[GlueX Offline Meeting, September 19, 2012|September 19, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, September 5, 2012|September 5, 2012]]<br />
* [[GlueX Offline Meeting, August 22, 2012|August 22, 2012]]<br />
* [[GlueX Offline Meeting, August 8, 2012|August 8, 2012]]<br />
* [[GlueX Offline Meeting, July 25, 2012|July 25, 2012]]<br />
* [[GlueX Offline Meeting, July 11, 2012|July 11, 2012]]<br />
* [[GlueX Offline Meeting, June 27, 2012|June 27, 2012]]<br />
* [[GlueX Offline Meeting, June 13, 2012|June 13, 2012]]<br />
* [[GlueX Offline Meeting, May 30, 2012|May 30, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 16, 2012|May 16, 2012]]<br />
* [[GlueX Offline Meeting, April 18, 2012|April 18, 2012]]<br />
* [[GlueX Offline Meeting, March 21, 2012|March 21, 2012]]<br />
* [[GlueX Offline Meeting, February 22, 2012|February 22, 2012]]<br />
* [[GlueX Offline Meeting, February 8, 2012|February 8, 2012]]<br />
* [[GlueX Offline Meeting, January 25, 2012|January 25, 2012]]<br />
</td><td width=250><br />
</td></tr></table><br />
<br />
== Offline Meetings in 2011 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 14, 2011|December 14, 2011]]<br />
* [[GlueX Offline Meeting, November 30, 2011|November 30, 2011]]<br />
* [[GlueX Offline Meeting, November 16, 2011|November 16, 2011]]<br />
* [[GlueX Offline Meeting, November 2, 2011|November 2, 2011]]<br />
* [[GlueX Offline Meeting, October 19, 2011|October 19, 2011]] (canceled: Lehman Review)<br />
* [[GlueX Offline Meeting, September 21, 2011|September 21, 2011]]<br />
* [[GlueX Offline Meeting, September 7, 2011|September 7, 2011]]<br />
* [[GlueX Offline Meeting, August 24, 2011|August 24, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2011|August 10, 2011]]<br />
* [[GlueX Offline Meeting, July 27, 2011|July 27, 2011]]<br />
* [[GlueX Offline Meeting, July 13, 2011|July 13, 2011]]<br />
* [[GlueX Offline Meeting, June 29, 2011|June 29, 2011]]<br />
* [[GlueX Offline Meeting, June 15, 2011|June 15, 2011]]<br />
* [[GlueX Offline Meeting, June 1, 2011|June 1, 2011]]<br />
* [[GlueX Offline Meeting, May 18, 2011|May 18, 2011]]<br />
* [[GlueX Offline Meeting, April 20, 2011|April 20, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 6, 2011|April 6, 2011]]<br />
* [[GlueX Offline Meeting, March 23, 2011|March 23, 2011]]<br />
* [[GlueX Offline Meeting, March 9, 2011|March 9, 2011]]<br />
* [[GlueX Offline Meeting, February 23, 2011|February 23, 2011]]<br />
* [[GlueX Offline Meeting, February 9, 2011|February 9, 2011]]<br />
* [[GlueX Offline Meeting, January 26, 2011|January 26, 2011]]<br />
* [[GlueX Offline Meeting, January 12, 2011|January 12, 2011]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2010 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2010|December 15, 2010]]<br />
* [[GlueX Offline Meeting, December 1, 2010|December 1, 2010]]<br />
* [[GlueX Offline Meeting, November 17, 2010|November 17, 2010]]<br />
* [[GlueX Offline Meeting, November 2, 2010|November 2, 2010]]<br />
* [[GlueX Offline Meeting, October 19, 2010|October 19, 2010]]<br />
* [[GlueX Offline Meeting, October 5, 2010|October 5, 2010]]<br />
* [[GlueX Offline Meeting, September 21, 2010|September 21, 2010]]<br />
* [[GlueX Offline Meeting, August 24, 2010|August 24, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2010|August 10, 2010]]<br />
* [[GlueX Offline Meeting, July 27, 2010|July 27, 2010]]<br />
* [[GlueX Offline Meeting, July 13, 2010|July 13, 2010]]<br />
* [[GlueX Offline Meeting, June 29, 2010|June 29, 2010]]<br />
* [[GlueX Offline Meeting, June 15, 2010|June 15, 2010]]<br />
* [[GlueX Offline Meeting, June 1, 2010|June 1, 2010]]<br />
* [[GlueX Offline Meeting, May 18, 2010|May 18, 2010]]<br />
* [[GlueX Offline Meeting, May 4, 2010|May 4, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 20, 2010|April 20, 2010]]<br />
* [[GlueX Offline Meeting, April 6, 2010|April 6, 2010]]<br />
* [[GlueX Offline Meeting, March 23, 2010|March 23, 2010]]<br />
* [[GlueX Offline Meeting, March 9, 2010|March 9, 2010]]<br />
* [[GlueX Offline Meeting, February 23, 2010|February 23, 2010]]<br />
* [[GlueX Offline Meeting, February 9, 2010|February 9, 2010]]<br />
* [[GlueX Offline Meeting, January 12, 2010|January 12, 2010]]<br />
</td></tr></table><br />
== Offline Meetings in 2009 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2009|December 15, 2009]]<br />
* [[GlueX Offline Meeting, December 1, 2009|December 1, 2009]]<br />
* [[GlueX Offline Meeting, November 17, 2009|November 17, 2009]]<br />
* [[GlueX Offline Meeting, November 4, 2009|November 4, 2009]]<br />
* [[GlueX Offline Meeting, October 21, 2009|October 21, 2009]]<br />
</td><td width=250><br />
* [[ October 7, 2009 Software ]]<br />
* [[ September 23, 2009 Software ]]<br />
* [[ August 26, 2009 Software ]]<br />
* [[ August 12, 2009 Software ]]<br />
* [[ July 29, 2009 Software ]]<br />
* [[ July 1, 2009 Software ]]<br />
* [[ June 17, 2009 Software ]]<br />
</td><td width=250><br />
* [[ May 20, 2009 Software ]]<br />
* [[ May 6, 2009 Software ]]<br />
* [[ April 22, 2009 Software ]]<br />
* [[ April 8, 2009 Software ]]<br />
* [[ March 11, 2009 Software ]]<br />
* [[ Feburary 25, 2009 Software | February 25, 2009 Software ]]<br />
* [[ February 11, 2009 Software ]]<br />
* [[ January 14, 2009 Software ]]<br />
</td></tr></table><br />
== Offline Meetings in 2008 ==<br />
<table><tr><td width=250><br />
* [[ December 17, 2008 Software ]]<br />
* [[ December 3, 2008 Software ]]<br />
* [[ November 18, 2008 Software ]]<br />
* [[ October 8, 2008 Software ]]<br />
* <s>[[ September 12, 2008 Software ]]</s><br />
</td><td width=250><br />
* [[ August 29, 2008 Software ]]<br />
* [[ August 15, 2008 Software ]]<br />
* [[ August 1, 2008 Software ]]<br />
* [[ July 18, 2008 Software ]]<br />
* [[ July 3, 2008 Software ]]<br />
* [[ June. 6, 2008 Software ]]<br />
</td><td width=250><br />
* [[ May. 23, 2008 Software ]]<br />
* [[ Feb. 29, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 22, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 15, 2008 Tracking CDC/FDC ]]<br />
* [[February 8, 2008 Software]]<br />
* [[January 25, 2008 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2007 ==<br />
<table><tr><td width=250><br />
* [[December 7, 2007 Software]]<br />
* [[November 30, 2007 Software]]<br />
* [[November 13, 2007 Software]]<br />
* [[October 19, 2007 Software]]<br />
* [[September 21,2007 Software]]<br />
* [[September 11,2007 Software]]<br />
* [[August 21,2007 Software]]<br />
* [[August 14, 2007 Software]]<br />
</td><td width=250><br />
* [[July 31, 2007 Software]]<br />
* [[July 17, 2007 Software]]<br />
* [[June 5, 2007 Software]]<br />
* [[May 22, 2007 Software]]<br />
* [[May 1, 2007 Software]]<br />
* [[April 17, 2007 Software]]<br />
* [[April 10, 2007 Software]]<br />
* [[March 20, 2007 Software]]<br />
</td><td width=250><br />
* [[March 13, 2007 Software]]<br />
* [[February 27, 2007 Software]]<br />
* [[February 20, 2007 Software]]<br />
* [[February 13, 2007 Software]]<br />
* [[February 6, 2007 Software]]<br />
* [[January 30, 2007 Software]]<br />
* [[January 16, 2007 Software]]<br />
* [[January 8, 2007 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2006 ==<br />
<table><tr><td width=250><br />
* [[December 18, 2006 Software]]<br />
* [[December 11, 2006 Software]]<br />
* [[December 4, 2006 Software]]<br />
</td><td width=250><br />
* [[September 6, 2006 Software]]<br />
* [[August 28, 2006 Software]]<br />
* [[August 14, 2006 Software]]<br />
* [[August 7, 2006 Software]]<br />
</td><td width=250><br />
* [[July 31, 2006 Software]]<br />
* [[July 10, 2006 Software]]<br />
* [[July 5,2006 Software]]<br />
* [[May 8, 2006 Software]]<br />
</td></tr></table><br />
<br />
=Special Meetings=<br />
* [[fADC Emulation Meeting, August 26, 2015]]<br />
* [[Data Plan Meeting, February 13, 2014]]<br />
* [[Particle Decay Chain Meeting, September 11, 2013]]<br />
* [[GlueX and the OSG, Meeting on Resource Contribution, March 31, 2017]]</div>
Aaustreg
https://halldweb.jlab.org/wiki/index.php?title=Automatic_Builds_of_GlueX_Software&diff=125027
Automatic Builds of GlueX Software
2024-03-08T15:07:16Z
<p>Aaustreg: /* Nightly Code Analysis (scan-build) */</p>
<hr />
<div>== Nightly Build ==<br />
<br />
Every night, a complete build of the source tree is performed on several platforms at the lab.<br />
<br />
* The builds are located in the directory /volatile/halld/gluex/nightly on the JLab CUE. Every day a new directory, named by date, is created there, for example, /volatile/halld/gluex/nightly/2020-09-23. In turn, this directory contains a separate directory for each platform, e.g., Linux_RHEL7-x86_64-gcc4.8.5.<br />
* Each platform-specific directory contains a copy of the version set file used in the build. There is also a "version.xml" in the same directory that is a soft link to this version set file. For example:<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
is a soft link to<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version_2024-03-07.xml<br />
* Since the volatile disk cleaning job deletes files unread for more than 60 days, builds older than that are usually deleted.<br />
* The hdds, halld_recon, halld_sim, hdgeant4, and gluex_root_analysis packages are built. The master branch of each is used.<br />
* The script run is /home/gluex/bin/nightly.sh. It is scheduled as a cron job for the [[GlueX shared account on the JLab CUE|"gluex" account]] on sandd1.jlab.org. The job runs at midnight daily.<br />
* The cron job runs the builds on the various platforms, as username gluex, in parallel. The current platforms are:<br />
** sandd1.jlab.org (RedHat Enterprise Linux 7, x86_64)<br />
** ifarm1802.jlab.org (CentOS 7.7, x86_64)<br />
** ifarm9.jlab.org (Alma 9, x86_64)<br />
* Log files of the builds are created in the daily directory, for example, /volatile/halld/gluex/nightly/2024-03-07/halld_ifarm9.log .<br />
* A summary of errors and warnings from the last build is available at https://halldweb.jlab.org/nightly/nightly_build_errors.txt . This file is only updated for builds that have errors or warnings. For clean builds, no summary log is produced.<br />
** The last seven logs produced are archived in the same web directory: https://halldweb.jlab.org/nightly/<br />
* The summary of errors and warnings is also sent to the "nightly_build" [[Simple Email Lists|simple email list]].<br />
* To use one of the nightly builds, you can set up the environment as follows (assuming you want to use the build from March 7, 2024):<br />
gxenv /volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
The [https://halldweb.jlab.org/docs/build_scripts_web/node6.html#SECTION00062400000000000000 gxenv command] is described in the [https://halldweb.jlab.org/docs/build_scripts_web/ Build Scripts document]. An alternate pair of environment-setting commands is<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.sh 2024-03-07<br />
for bash and<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.csh 2024-03-07<br />
for tcsh.<br />
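If you simply want the most recent nightly rather than a specific date, a small helper can pick the latest dated directory before calling gxenv. This is a convenience sketch only; latest_nightly is a hypothetical function, not part of the official build_scripts:

```shell
# Pick the most recent dated nightly directory under the given root.
# Dated names (YYYY-MM-DD) sort lexicographically == chronologically.
latest_nightly() {
    ls -1d "$1"/20??-??-?? 2>/dev/null | sort | tail -n 1
}

# usage (bash):
#   gxenv "$(latest_nightly /volatile/halld/gluex/nightly)/Linux_RHEL7-x86_64-gcc4.8.5/version.xml"
```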
<br />
===Note on ssh scheme===<br />
<br />
As mentioned above, although the cron job runs on jlabl1, the builds are all actually done on other nodes. To do this without having to supply a passphrase, the cron job uses a special ssh private/public key pair that allows only the target script on the remote node (and no other command) to run, and only if the ssh connection comes from jlabl1 and the target account holds the appropriate public key. This key has no passphrase associated with it<ref>If it did, then that passphrase would have to somehow be incorporated into scripts, a practice which is generally discouraged for security reasons.</ref> and thus can be used from a cron job. The remote target script is mentioned only in the authorized_keys file of the remote account; only the ssh invocation appears in the script (/home/gluex/bin/nightly.sh) on the local host (jlabl1).<br />
<br />
Note that this special key pair is not the one used for standard ssh connections to the gluex account on the CUE; the standard pair has a passphrase. This passphrase-less technique is described in a [http://www.linuxjournal.com/article/8257 2005 Linux Journal article] and a [http://cybermashup.com/2013/05/14/restrict-ssh-logins-to-a-single-command/ CyberMashup blog article].<br />
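The restriction described above is typically achieved with a forced-command entry in the remote account's authorized_keys file. The following is a hypothetical illustration of the pattern; the script path, option list, and key material are placeholders, not the actual GlueX configuration:

```shell
# Hypothetical ~/.ssh/authorized_keys entry on the remote build node:
#
#   command="/home/gluex/bin/nightly_remote.sh",from="jlabl1.jlab.org",no-port-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAA...keydata... gluex-nightly
#
# With such an entry, any connection authenticated by the paired private key
# runs only nightly_remote.sh, regardless of what command the client asks
# for, and only when the connection originates from jlabl1.
```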
<br />
==Github Pull Request Tests==<br />
<br />
Whenever a pull request for the halld_recon or halld_sim Github repositories is created, a test build is generated at JLab. Test builds are also generated when an open pull request receives new commits on its associated branch.<br />
<br />
* The notification from Github is passed along via the "webhook" functionality. This particular webhook is configured to send an HTTP POST message to halldweb.jlab.org/cgi-bin/build/build_sim-recon_wehook.py . The web server halldweb is configured to allow python scripts to run in that directory and requires a user/password combination specific to that directory, using simple HTTP authentication.<br />
* The python cgi-script looks for the relevant pull request event, and starts a test build on the sandd1 VM using the same ssh scheme as the nightly builds described above.<br />
* These test builds use the software stored in /group/halld/Software/build_scripts/pull_request<br />
* The ssh command from the python script on halldweb calls build_pull_request_service_sshwrapper.sh. The procedure is managed by build_pull_request_service.sh, while the actual test build is performed in build_pull_request.sh.<br />
* The test builds are stored in /work/halld/pull_request_test. Debug logs are stored in this directory as ssh_log^REPO^BRANCH and env^BRANCH, where "REPO" is "halld_recon" or "halld_sim" as appropriate and "BRANCH" is the name of the branch associated with the pull request. The test builds and build logs are stored in the directory REPO^BRANCH.<br />
* Several simple run-time tests with simulation and data are performed in test_pull_request.sh.<br />
* The results of the test are posted as a comment on the pull request using leave_pull_request_comment.py, which sends a simple command to the Github REST API over HTTPS using the python requests library (note that this script uses python3).<br />
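The comment-posting step can be sketched as follows. This is an illustrative shell rendition of what leave_pull_request_comment.py does in Python; the repository name, pull-request number, and DRY_RUN switch are assumptions for illustration. The only piece taken from the GitHub REST API is the issue-comments endpoint (pull-request comments are posted through the issues API):

```shell
# Sketch of posting a comment on a pull request via the GitHub REST API.
post_pr_comment() {
    repo="$1"; pr="$2"; body="$3"
    url="https://api.github.com/repos/${repo}/issues/${pr}/comments"
    if [ -n "$DRY_RUN" ]; then
        # Print the request target instead of sending it, so the sketch can
        # be exercised without a token or network access.
        echo "POST ${url}"
    else
        curl -s -X POST \
             -H "Authorization: token ${GITHUB_TOKEN}" \
             -H "Accept: application/vnd.github+json" \
             -d "{\"body\": \"${body}\"}" \
             "${url}"
    fi
}

# usage: DRY_RUN=1 post_pr_comment JeffersonLab/halld_recon 1234 "Build OK"
```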
<br />
==Nightly Code Analysis (scan-build)==<br />
The [http://clang-analyzer.llvm.org scan-build] analyzer, which uses features built into the [http://clang.llvm.org clang] compiler, is run nightly to produce a web report on potential problems identified in the code. Here are some details:<br />
<br />
* Results of the latest analysis can always be found here: [https://halldweb.jlab.org/scan-build/LATEST https://halldweb.jlab.org/scan-build/LATEST]<br />
* The scan-build program is run via cron job using the ''gluex'' account and is run only on ifarm1401<br />
* The cron job runs the script ''/home/gluex/bin/nightly-scan-build.csh'' which is maintained in svn here: [https://halldsvn.jlab.org/repos/trunk/home/gluex/bin https://halldsvn.jlab.org/repos/trunk/home/gluex/bin]<br />
* This piggybacks on the nightly build of sim-recon located in ''/volatile/halld/gluex/nightly/''. It requires that the setenv.csh script exist for the gcc compiler, so that it can be sourced to set up basics like ROOT, XERCES, JANA, and CCDB in the environment. (This could be changed to use gluex_env instead, and may be in the future.)<br />
* This requires the clang compiler. It is currently hardwired to use the 3.7.0 compiler installed in ''/group/halld/Software/ExternalPackages/clang-llvm/llvm_clang_3.7.0''<br />
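For reference, a typical scan-build invocation simply wraps the normal build command so the analyzer can intercept each compile. The output path below is an illustrative assumption; the actual flags and build command live in nightly-scan-build.csh:

```shell
# Illustrative only: run the static analyzer around the usual scons build,
# writing the HTML report to the directory given by -o.
scan-build -o /tmp/scan-build-report scons install
```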
<br />
<br />
----<br />
<br />
<references/></div>
<hr />
<div>== Nightly Build ==<br />
<br />
Every night a complete build of the source directory is done on several platforms at the lab.<br />
<br />
* The builds are located in the directory /volatile/halld/gluex/nightly on the JLab CUE. Every day a new directory, named by date, is created in this directory, for example, /volatile/halld/gluex/nightly/2020-09-23. In turn, in this directory, there is a separate directory created for each platform, e.g., Linux_RHEL7-x86_64-gcc4.8.5.<br />
* Each platform-specific directory contains a copy of the version set file used in the build. There is also a "version.xml" in the same directory that is a soft link to this version set file. For example:<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
is a soft link to<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version_2020-09-23.xml<br />
* Since the volatile disk cleaning job deletes files unread for more than 60 days, builds older than that are usually deleted.<br />
* The hdds, halld_recon, halld_sim, hdgeant4, and gluex_root_analysis packages are built. The master branch of each is used.<br />
* The script run is /home/gluex/bin/nightly.sh. It is scheduled as a cron job for the [[GlueX shared account on the JLab CUE|"gluex" account]] on sandd1.jlab.org. The job runs at midnight daily.<br />
* The cron job runs the builds on the various platforms, as username gluex, in parallel. The current platforms are:<br />
** sandd1.jlab.org (RedHat Enterprise Linux 7, x86_64)<br />
** ifarm1802.jlab.org (CentOS 7.7, x86_64)<br />
** ifarm9.jlab.org (Alma 9, x86_64)<br />
* Log files of the builds are created in the daily directory, for example, /volatile/halld/gluex/nightly/2024-03-07/halld_ifarm9.log .<br />
* A summary of errors and warnings from the last build is available at https://halldweb.jlab.org/nightly/nightly_build_errors.txt . This file is only updated for builds that have errors or warnings. For clean builds, no summary log is produced.<br />
** The last seven logs produced are archived in the same web directory: https://halldweb.jlab.org/nightly/<br />
* The summary of errors and warnings is also sent to the "nightly_build" [[Simple Email Lists|simple email list]].<br />
* To use one of the nightly builds, you can set up the environment as follows (assuming you want to use the build from September 23, 2020):<br />
gxenv /volatile/halld/gluex/nightly/2020-09-23/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
The [https://halldweb.jlab.org/docs/build_scripts_web/node6.html#SECTION00062400000000000000 gxenv command] is described in the [https://halldweb.jlab.org/docs/build_scripts_web/ Build Scripts document]. An alternate pair of environment-setting commands is<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.sh 2024-03-07<br />
for bash and<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.csh 2024-03-07<br />
for tcsh.<br />
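The scheduling itself amounts to a one-line crontab entry in the gluex account. The line below is a sketch: the midnight schedule and script path follow the description above, while the log redirection target is an assumption.

```shell
# Hypothetical crontab entry for the gluex account on sandd1.
# Runs the nightly build script at midnight daily; the redirection
# target is illustrative, not the actual configuration.
0 0 * * * /home/gluex/bin/nightly.sh >> /volatile/halld/gluex/nightly/cron.log 2>&1
```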
<br />
===Note on ssh scheme===<br />
<br />
As mentioned above, although the cron job runs on jlabl1, the builds are all actually done on other nodes. To do this without having to supply a passphrase, the cron job uses a special ssh private/public key pair that allows only the target script on the remote node (and no other command) to run, and only if the ssh connection comes from jlabl1 and the target account holds the appropriate public key. This key has no passphrase associated with it<ref>If it did, then that passphrase would have to somehow be incorporated into scripts, a practice which is generally discouraged for security reasons.</ref> and thus can be used from a cron job. The remote target script is only mentioned in the authorized_keys file of the remote account. Only the ssh invocation is seen in the script (/home/gluex/bin/nightly.sh) on the local host (jlabl1).<br />
<br />
Note that this special key pair is not the one used for standard ssh connections to the gluex account on the CUE. The standard pair has a passphrase. This passphrase-less technique is described in a [http://www.linuxjournal.com/article/8257 2005 Linux Journal article] and a [http://cybermashup.com/2013/05/14/restrict-ssh-logins-to-a-single-command/ CyberMashup blog article].<br />
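Concretely, the restriction described above can be expressed with a forced command in the remote account's authorized_keys file. The entry below is an illustrative sketch only: the hostname pattern, script path, and key material are placeholders, not the actual entry.

```shell
# Sketch of an ~/.ssh/authorized_keys entry on the remote build node.
# "command=" forces this one script to run regardless of what the client
# requests; "from=" accepts connections only from the originating host.
from="jlabl1.jlab.org",command="/home/gluex/bin/nightly_build.sh",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAA...placeholder-key... gluex-nightly
```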
<br />
==Github Pull Request Tests==<br />
<br />
Whenever a pull request for the halld_recon or halld_sim Github repositories is created, a test build is generated at JLab. Test builds are also generated if an open pull request has new commits to its associated branch.<br />
<br />
* The notification from Github is passed along via the "webhook" functionality. This particular webhook is configured to send an HTTP POST message to halldweb.jlab.org/cgi-bin/build/build_sim-recon_wehook.py . The web server halldweb is configured to allow Python scripts to run in that directory and requires a user/password combination specific to that directory, using simple HTTP authentication.<br />
* The Python CGI script looks for the relevant pull request event and starts a test build on the sandd1 VM using the same ssh scheme as the nightly builds described above.<br />
* These test builds use the software stored in /group/halld/Software/build_scripts/pull_request<br />
* The ssh command from the python script on halldweb calls build_pull_request_service_sshwrapper.sh. The procedure is managed by build_pull_request_service.sh, while the actual test build is performed in build_pull_request.sh.<br />
* The test builds are stored in /work/halld/pull_request_test. Debug logs are stored in this directory as ssh_log^REPO^BRANCH and env^BRANCH, where "REPO" is "halld_recon" or "halld_sim" as appropriate and "BRANCH" is the name of the branch associated with the pull request. The test builds and build logs are stored in the directory REPO^BRANCH.<br />
* Several simple run-time tests with simulation and data are performed in test_pull_request.sh<br />
* The results of the test are posted as a comment on the pull request using leave_pull_request_comment.py, which sends a simple command to the Github REST API over HTTPS using the python requests library (note that this script uses python3).<br />
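The event-filtering step can be sketched as a small predicate. This is a hedged illustration of the logic described above, not the actual CGI script; the function name is an assumption, while the event name and actions follow GitHub's documented webhook payloads.

```python
def is_relevant_pr_event(event_type, payload):
    """Return True for pull request events that should trigger a test build:
    a newly opened pull request, or new commits pushed to an open one."""
    if event_type != "pull_request":
        return False
    # GitHub sends action "opened" for new pull requests and "synchronize"
    # when the associated branch receives new commits.
    return payload.get("action") in ("opened", "synchronize")
```

In a real webhook delivery, the event type arrives in the X-GitHub-Event header of the POST request; the production script would also confirm the repository is halld_recon or halld_sim before launching a build over ssh.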
<br />
==Nightly Code Analysis (scan-build)==<br />
The [http://clang-analyzer.llvm.org scan-build] analyzer, which uses features built into the [http://clang.llvm.org clang] compiler, is run nightly to produce a web report on potential problems identified in the code. Here are some details:<br />
<br />
* Results of the latest analysis can always be found here: [https://halldweb.jlab.org/scan-build/LATEST https://halldweb.jlab.org/scan-build/LATEST]<br />
* The scan-build program is run via cron job using the ''gluex'' account and is run only on ifarm1401<br />
* The cron job runs the script ''/home/gluex/bin/nightly-scan-build.csh'' which is maintained in svn here: [https://halldsvn.jlab.org/repos/trunk/home/gluex/bin https://halldsvn.jlab.org/repos/trunk/home/gluex/bin]<br />
* This piggybacks on the nightly build of sim-recon located in ''/u/scratch/gluex/nightly/''. It requires that the setenv.csh script for the gcc compiler exist, so that it can be sourced to set up basics such as ROOT, XERCES, JANA, and CCDB in the environment. (This could be changed to use gluex_env instead and may be in the future.)<br />
* This requires the clang compiler. It is currently hardwired to use the 3.7.0 compiler installed in ''/group/halld/Software/ExternalPackages/clang-llvm/llvm_clang_3.7.0''<br />
<br />
<br />
----<br />
<br />
<references/></div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=Automatic_Builds_of_GlueX_Software&diff=125007Automatic Builds of GlueX Software2024-03-07T16:49:19Z<p>Aaustreg: /* Nightly Build */</p>
<hr />
<div>== Nightly Build ==<br />
<br />
Every night a complete build of the source directory is done on several platforms at the lab.<br />
<br />
* The builds are located in the directory /volatile/halld/gluex/nightly on the JLab CUE. Every day a new directory, named by date, is created in this directory, for example, /volatile/halld/gluex/nightly/2020-09-23. In turn, in this directory, there is a separate directory created for each platform, e.g., Linux_RHEL7-x86_64-gcc4.8.5.<br />
* Each platform-specific directory contains a copy of the version set file used in the build. There is also a "version.xml" in the same directory that is a soft link to this version set file. For example:<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
is a soft link to<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version_2020-09-23.xml<br />
* Since the volatile disk cleaning job deletes files unread for more than 60 days, builds older than that are usually deleted.<br />
* The hdds, halld_recon, halld_sim, hdgeant4, and gluex_root_analysis packages are built. The master branch of each is used.<br />
* The script run is /home/gluex/bin/nightly.sh. It is scheduled as a cron job for the [[GlueX shared account on the JLab CUE|"gluex" account]] on sandd1.jlab.org. The job runs at midnight daily.<br />
* The cron job runs the builds on the various platforms, as username gluex, in parallel. The current platforms are:<br />
** sandd1.jlab.org (RedHat Enterprise Linux 7, x86_64)<br />
** ifarm1802.jlab.org (CentOS 7.7, x86_64)<br />
** ifarm9.jlab.org (Alma 9, x86_64)<br />
* Log files of the builds are created in the daily directory, for example, /volatile/halld/gluex/nightly/2020-09-23/halld_jlabl5.log .<br />
* A summary of errors and warnings from the last build is available at https://halldweb.jlab.org/nightly/nightly_build_errors.txt . This file is only updated for builds that have errors or warnings. For clean builds, no summary log is produced.<br />
** The last seven logs produced are archived in the same web directory: https://halldweb.jlab.org/nightly/<br />
* The summary of errors and warnings is also sent to the "nightly_build" [[Simple Email Lists|simple email list]].<br />
* To use one of the nightly builds, you can set up the environment as follows (assuming you want to use the build from September 23, 2020):<br />
gxenv /volatile/halld/gluex/nightly/2020-09-23/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
The [https://halldweb.jlab.org/docs/build_scripts_web/node6.html#SECTION00062400000000000000 gxenv command] is described in the [https://halldweb.jlab.org/docs/build_scripts_web/ Build Scripts document]. An alternate pair of environment-setting commands is<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.sh 2024-03-07<br />
for bash and<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.csh 2024-03-07<br />
for tcsh.<br />
<br />
===Note on ssh scheme===<br />
<br />
As mentioned above, although the cron job runs on jlabl1, the builds are all actually done on other nodes. To do this without having to supply a passphrase, the cron job uses a special ssh private/public key pair that allows only the target script on the remote node (and no other command) to run, and only if the ssh connection comes from jlabl1 and the target account holds the appropriate public key. This key has no passphrase associated with it<ref>If it did, then that passphrase would have to somehow be incorporated into scripts, a practice which is generally discouraged for security reasons.</ref> and thus can be used from a cron job. The remote target script is only mentioned in the authorized_keys file of the remote account. Only the ssh invocation is seen in the script (/home/gluex/bin/nightly.sh) on the local host (jlabl1).<br />
<br />
Note that this special key pair is not the one used for standard ssh connections to the gluex account on the CUE. The standard pair has a passphrase. This passphrase-less technique is described in a [http://www.linuxjournal.com/article/8257 2005 Linux Journal article] and a [http://cybermashup.com/2013/05/14/restrict-ssh-logins-to-a-single-command/ CyberMashup blog article].<br />
<br />
==Github Pull Request Tests==<br />
<br />
Whenever a pull request for the halld_recon or halld_sim Github repositories is created, a test build is generated at JLab. Test builds are also generated if an open pull request has new commits to its associated branch.<br />
<br />
* The notification from Github is passed along via the "webhook" functionality. This particular webhook is configured to send an HTTP POST message to halldweb.jlab.org/cgi-bin/build/build_sim-recon_wehook.py . The web server halldweb is configured to allow Python scripts to run in that directory and requires a user/password combination specific to that directory, using simple HTTP authentication.<br />
* The Python CGI script looks for the relevant pull request event and starts a test build on the sandd1 VM using the same ssh scheme as the nightly builds described above.<br />
* These test builds use the software stored in /group/halld/Software/build_scripts/pull_request<br />
* The ssh command from the python script on halldweb calls build_pull_request_service_sshwrapper.sh. The procedure is managed by build_pull_request_service.sh, while the actual test build is performed in build_pull_request.sh.<br />
* The test builds are stored in /work/halld/pull_request_test. Debug logs are stored in this directory as ssh_log^REPO^BRANCH and env^BRANCH, where "REPO" is "halld_recon" or "halld_sim" as appropriate and "BRANCH" is the name of the branch associated with the pull request. The test builds and build logs are stored in the directory REPO^BRANCH.<br />
* Several simple run-time tests with simulation and data are performed in test_pull_request.sh<br />
* The results of the test are posted as a comment on the pull request using leave_pull_request_comment.py, which sends a simple command to the Github REST API over HTTPS using the python requests library (note that this script uses python3).<br />
<br />
==Nightly Code Analysis (scan-build)==<br />
The [http://clang-analyzer.llvm.org scan-build] analyzer, which uses features built into the [http://clang.llvm.org clang] compiler, is run nightly to produce a web report on potential problems identified in the code. Here are some details:<br />
<br />
* Results of the latest analysis can always be found here: [https://halldweb.jlab.org/scan-build/LATEST https://halldweb.jlab.org/scan-build/LATEST]<br />
* The scan-build program is run via cron job using the ''gluex'' account and is run only on ifarm1401<br />
* The cron job runs the script ''/home/gluex/bin/nightly-scan-build.csh'' which is maintained in svn here: [https://halldsvn.jlab.org/repos/trunk/home/gluex/bin https://halldsvn.jlab.org/repos/trunk/home/gluex/bin]<br />
* This piggybacks on the nightly build of sim-recon located in ''/u/scratch/gluex/nightly/''. It requires that the setenv.csh script for the gcc compiler exist, so that it can be sourced to set up basics such as ROOT, XERCES, JANA, and CCDB in the environment. (This could be changed to use gluex_env instead and may be in the future.)<br />
* This requires the clang compiler. It is currently hardwired to use the 3.7.0 compiler installed in ''/group/halld/Software/ExternalPackages/clang-llvm/llvm_clang_3.7.0''<br />
<br />
<br />
----<br />
<br />
<references/></div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=Automatic_Builds_of_GlueX_Software&diff=125006Automatic Builds of GlueX Software2024-03-07T16:48:38Z<p>Aaustreg: /* Nightly Build */</p>
<hr />
<div>== Nightly Build ==<br />
<br />
Every night a complete build of the source directory is done on several platforms at the lab.<br />
<br />
* The builds are located in the directory /volatile/halld/gluex/nightly on the JLab CUE. Every day a new directory, named by date, is created in this directory, for example, /volatile/halld/gluex/nightly/2020-09-23. In turn, in this directory, there is a separate directory created for each platform, e.g., Linux_RHEL7-x86_64-gcc4.8.5.<br />
* Each platform-specific directory contains a copy of the version set file used in the build. There is also a "version.xml" in the same directory that is a soft link to this version set file. For example:<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
is a soft link to<br />
/volatile/halld/gluex/nightly/2024-03-07/Linux_RHEL7-x86_64-gcc4.8.5/version_2020-09-23.xml<br />
* Since the scratch disk cleaning job deletes files unread for more than 60 days, builds older than that are usually deleted.<br />
* The hdds, halld_recon, halld_sim, hdgeant4, and gluex_root_analysis packages are built. The master branch of each is used.<br />
* The script run is /home/gluex/bin/nightly.sh. It is scheduled as a cron job for the [[GlueX shared account on the JLab CUE|"gluex" account]] on sandd1.jlab.org. The job runs at midnight daily.<br />
* The cron job runs the builds on the various platforms, as username gluex, in parallel. The current platforms are:<br />
** sandd1.jlab.org (RedHat Enterprise Linux 7, x86_64)<br />
** ifarm1802.jlab.org (CentOS 7.7, x86_64)<br />
** ifarm9.jlab.org (Alma 9, x86_64)<br />
* Log files of the builds are created in the daily directory, for example, /volatile/halld/gluex/nightly/2020-09-23/halld_jlabl5.log .<br />
* A summary of errors and warnings from the last build is available at https://halldweb.jlab.org/nightly/nightly_build_errors.txt . This file is only updated for builds that have errors or warnings. For clean builds, no summary log is produced.<br />
** The last seven logs produced are archived in the same web directory: https://halldweb.jlab.org/nightly/<br />
* The summary of errors and warnings is also sent to the "nightly_build" [[Simple Email Lists|simple email list]].<br />
* To use one of the nightly builds, you can set up the environment as follows (assuming you want to use the build from September 23, 2020):<br />
gxenv /volatile/halld/gluex/nightly/2020-09-23/Linux_RHEL7-x86_64-gcc4.8.5/version.xml<br />
The [https://halldweb.jlab.org/docs/build_scripts_web/node6.html#SECTION00062400000000000000 gxenv command] is described in the [https://halldweb.jlab.org/docs/build_scripts_web/ Build Scripts document]. An alternate pair of environment-setting commands is<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.sh 2024-03-07<br />
for bash and<br />
source /group/halld/Software/build_scripts/gluex_env_nightly.csh 2024-03-07<br />
for tcsh.<br />
<br />
===Note on ssh scheme===<br />
<br />
As mentioned above, although the cron job runs on jlabl1, the builds are all actually done on other nodes. To do this without having to supply a passphrase, the cron job uses a special ssh private/public key pair that allows only the target script on the remote node (and no other command) to run, and only if the ssh connection comes from jlabl1 and the target account holds the appropriate public key. This key has no passphrase associated with it<ref>If it did, then that passphrase would have to somehow be incorporated into scripts, a practice which is generally discouraged for security reasons.</ref> and thus can be used from a cron job. The remote target script is only mentioned in the authorized_keys file of the remote account. Only the ssh invocation is seen in the script (/home/gluex/bin/nightly.sh) on the local host (jlabl1).<br />
<br />
Note that this special key pair is not the one used for standard ssh connections to the gluex account on the CUE. The standard pair has a passphrase. This passphrase-less technique is described in a [http://www.linuxjournal.com/article/8257 2005 Linux Journal article] and a [http://cybermashup.com/2013/05/14/restrict-ssh-logins-to-a-single-command/ CyberMashup blog article].<br />
<br />
==Github Pull Request Tests==<br />
<br />
Whenever a pull request for the halld_recon or halld_sim Github repositories is created, a test build is generated at JLab. Test builds are also generated if an open pull request has new commits to its associated branch.<br />
<br />
* The notification from Github is passed along via the "webhook" functionality. This particular webhook is configured to send an HTTP POST message to halldweb.jlab.org/cgi-bin/build/build_sim-recon_wehook.py . The web server halldweb is configured to allow Python scripts to run in that directory and requires a user/password combination specific to that directory, using simple HTTP authentication.<br />
* The Python CGI script looks for the relevant pull request event and starts a test build on the sandd1 VM using the same ssh scheme as the nightly builds described above.<br />
* These test builds use the software stored in /group/halld/Software/build_scripts/pull_request<br />
* The ssh command from the python script on halldweb calls build_pull_request_service_sshwrapper.sh. The procedure is managed by build_pull_request_service.sh, while the actual test build is performed in build_pull_request.sh.<br />
* The test builds are stored in /work/halld/pull_request_test. Debug logs are stored in this directory as ssh_log^REPO^BRANCH and env^BRANCH, where "REPO" is "halld_recon" or "halld_sim" as appropriate and "BRANCH" is the name of the branch associated with the pull request. The test builds and build logs are stored in the directory REPO^BRANCH.<br />
* Several simple run-time tests with simulation and data are performed in test_pull_request.sh<br />
* The results of the test are posted as a comment on the pull request using leave_pull_request_comment.py, which sends a simple command to the Github REST API over HTTPS using the python requests library (note that this script uses python3).<br />
<br />
==Nightly Code Analysis (scan-build)==<br />
The [http://clang-analyzer.llvm.org scan-build] analyzer, which uses features built into the [http://clang.llvm.org clang] compiler, is run nightly to produce a web report on potential problems identified in the code. Here are some details:<br />
<br />
* Results of the latest analysis can always be found here: [https://halldweb.jlab.org/scan-build/LATEST https://halldweb.jlab.org/scan-build/LATEST]<br />
* The scan-build program is run via cron job using the ''gluex'' account and is run only on ifarm1401<br />
* The cron job runs the script ''/home/gluex/bin/nightly-scan-build.csh'' which is maintained in svn here: [https://halldsvn.jlab.org/repos/trunk/home/gluex/bin https://halldsvn.jlab.org/repos/trunk/home/gluex/bin]<br />
* This piggybacks on the nightly build of sim-recon located in ''/u/scratch/gluex/nightly/''. It requires that the setenv.csh script for the gcc compiler exist, so that it can be sourced to set up basics such as ROOT, XERCES, JANA, and CCDB in the environment. (This could be changed to use gluex_env instead and may be in the future.)<br />
* This requires the clang compiler. It is currently hardwired to use the 3.7.0 compiler installed in ''/group/halld/Software/ExternalPackages/clang-llvm/llvm_clang_3.7.0''<br />
<br />
<br />
----<br />
<br />
<references/></div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=2024-03-07&diff=1250052024-03-072024-03-07T15:46:05Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>== Meeting Coordinates ==<br />
<br />
The meeting will be on Thursday, March 7, 2024 at 11:00 AM ET. For those in person at JLab, room F326 is reserved.<br />
<br />
==== Connecting ====<br />
<br />
To connect to the meeting, please go to [https://halldweb.jlab.org/wiki-private/index.php/Connect_to_ZoomGov_Meetings the listing of Zoom meetings on the private Wiki] and select the GlueX/Hall D Biweekly Meeting link at the top of the list.<br />
<br />
== Agenda ==<br />
<br />
# Announcements<br />
#* Next Collaboration Meeting and Physics Fest: May 14-16<br />
#* [https://forms.gle/UVHiUKrVN6pe3d6z8 Physics Discussion Questions] <br />
# Run Coordination Report (Alexandre/Lubomir/Eugene)<br />
# Polarized Target White Paper (Mark)<br />
# Draft PAC Proposals Under Review (Naomi)<br />
#* [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6397 GlueX-III]<br />
#* [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6354 Alpha Minus]<br />
# FCAL2 Construction Report (Sasha)<br />
# Status Report from Publication Review Committees<br />
#* &pi;<sub>1</sub> Upper Limit (Peter/Farah/Mike McC.)<br />
#* Compton Cross Section (David/Michael D./Axel)<br />
#* Y(2175) Search (Simon/Chandra/Edmundo)<br />
#* SDME in &gamma; p &rarr; &pi;<sup>-</sup>&Delta;<sup>++</sup> (Ken/Will/Amy)<br />
# Specific areas where specialized help is needed -- contact Matt if you are interested in contributing to these<br />
#* OSG data processing (w/ Richard,Igal,Alex)<br />
#* volunteers for data quality monitoring<br />
#* TAGH calibration, hardware maintenance, etc. <br />
#* Systematic studies of kinematic fitting, efficiency of &chi;<sup>2</sup> cuts, etc., in data and MC<br />
#** maintenance of kinematic fitting framework<br />
<br />
==== Additional brief reports from working groups as needed ====<br />
<br />
* Software and Analysis:<br />
** Production and Calibration Meeting: [https://halldweb.jlab.org/wiki-private/index.php/March_6,_2024,_Calibration_%26_Production March 6, 2024] (Naomi/Igal)<br />
** Offline Meeting: [[GlueX_Software_Meeting,_February_12,_2024 | February 12, 2024]] (Alex)<br />
** Particle ID Working Group: Last meeting [[GlueX_PID_Meeting,_May_16,_2023 | May 16, 2023]] (Simon)<br />
<br />
* Topical Meetings<br />
** Physics Working Group: [https://halldweb.jlab.org/wiki-private/index.php/January_9,_2024,_Physics_Working_Group January 9, 2024] (Sean)<br />
** Amplitude Analysis: [https://halldweb.jlab.org/wiki-private/index.php/Amplitude_Analysis_Meeting,_March_4,_2024 March 4, 2024] (Alex A./Malte)<br />
** Cross-section: [https://halldweb.jlab.org/wiki-private/index.php/February_12,_2024,_Cross_Section_Working_Group February 12, 2024] (Justin/Susan)<br />
** Beam Asymmetry: [https://halldweb.jlab.org/wiki-private/index.php/Beam_Asymmetry_Meeting_2024-03-04 March 4, 2024] (Zisis/Simon)<br />
** PrimEx Working Group: [https://halldweb.jlab.org/wiki-private/index.php/Mar_1,_2024_PrimEx-Eta_Meeting March 1, 2024] <br />
** Di-lepton/Rare Working Group: [https://halldweb.jlab.org/wiki-private/index.php/Dilepton_Meeting,_January_25,_2024 January 25, 2024] (Sean/Lubomir)<br />
** CPP/NPP Working Group: [[Π_polarizability_Meeting_Mar_6,_2024 | March 6, 2024]]<br />
** JEF: [[JEF_meeting,_March_1,_2024 | March 1, 2024]] (Simon/Liping/Sasha/Zisis)<br />
<br />
* Hardware Working Groups:<br />
** Engineering Activities (Tim)<br />
** Electronics and Firmware (Fernando)<br />
** Beamline (Richard)<br />
** Tracking: [https://halldweb.jlab.org/wiki-private/index.php/Tracking-2-15-2024 February 15, 2024] (Naomi/Lubomir)<br />
** DIRC (Justin)<br />
** Online Systems: DAQ (Sergey)<br />
** Controls: [https://halldweb.jlab.org/wiki/index.php/Controls_Meeting_25-Jan-2024 January 25, 2024] (Hovanes)<br />
** Trigger (Sasha)<br />
** Calorimetry: [[February_14,_2023_Calorimeter | February 14, 2024]] (Zisis/Mark/Sasha)<br />
** Start Counter (Beni)<br />
** TOF (Paul)</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=Offline_Monitoring_Data_Validation&diff=124903Offline Monitoring Data Validation2024-02-28T16:54:56Z<p>Aaustreg: /* Analysis */</p>
<hr />
<div>This page contains the procedure for checking if production runs are of good quality and can be used for physics analysis.<br />
<br />
= Run Periods =<br />
<br />
* [[RunPeriod-2016-02 Validation]]<br />
* [[RunPeriod-2016-10 Validation]]<br />
* [[RunPeriod-2017-01 Validation]]<br />
* [[RunPeriod-2018-01 Validation]]<br />
* [[RunPeriod-2018-08 Validation]]<br />
* [[RunPeriod-2019-01 Validation]]<br />
* [[RunPeriod-2019-11 Validation]]<br />
* [[RunPeriod-2021-08 Validation]]<br />
* [[RunPeriod-2022-05 Validation]]<br />
* [[RunPeriod-2022-08 Validation]]<br />
* [[RunPeriod-2023-01 Validation]]<br />
<br />
= Procedure =<br />
<br />
For each production run, do the following:<br />
<br />
* Go to the [https://halldweb.jlab.org/data_monitoring/Plot_Browser.html Offline Run Browser] page.<br />
* Follow the steps outlined in the checklist below.<br />
* Workers should check each plot for their assigned subsystem and leave notes in the corresponding spreadsheet if any significant deviations are seen<br />
* On the spreadsheet, enter "Y" in the "Overall Quality" field if all monitoring histograms are acceptable, otherwise enter "N"<br />
* We will iterate this procedure until the process converges.<br />
<br />
=== Expert Actions ===<br />
<br />
* Certify that each subsystem is okay<br />
* Set run status in RCDB based on monitoring results<br />
** (script provided)<br />
<br />
=== Run Statuses ===<br />
<br />
* -1 - unchecked<br />
* 0 - rejected (not physics-quality)<br />
* 1 - approved<br />
* 2 - approved long/"mode 8" data<br />
* 3 - calibration / systematic studies<br />
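As a hedged sketch of how the expert script might translate spreadsheet verdicts into these codes (the function and its inputs are illustrative; the actual RCDB-setting script is not reproduced here):

```python
# Status codes from the list above.
STATUS = {"unchecked": -1, "rejected": 0, "approved": 1,
          "approved_long": 2, "calibration": 3}

def run_status(overall_quality, long_mode=False, calibration=False):
    """Map a spreadsheet "Overall Quality" verdict ('Y'/'N', or None if the
    run has not been checked) to an RCDB run-status code."""
    if overall_quality is None:
        return STATUS["unchecked"]
    if calibration:
        return STATUS["calibration"]
    if overall_quality.upper() != "Y":
        return STATUS["rejected"]
    return STATUS["approved_long"] if long_mode else STATUS["approved"]
```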
<br />
= Checklist = <br />
<br />
[https://docs.google.com/spreadsheets/d/1QUiZxz-dTaEehrQ_XFHic5jAz27sHHv4yo0LCBRQiD8/edit#gid=1665922131 Example Monitoring Spreadsheet]<br />
<br />
Reference run for 2017-01: 30780 <br />
<br />
Reference run for 2018-01: 40933/4<br />
<br />
Reference run for 2018-08: 51388<br />
<br />
Reference run for 2019-11: 71463 (150 nA), 71469 (250 nA), 71724 (350 nA)<br />
<br />
Reference run for 2023-01: 120888<br />
<br />
===Expert list ===<br />
Experts should update the table as tasks are completed<br />
{| class="wikitable" style="margin:auto"<br />
|+ Instructions status by subdetector<br />
|-<br />
! Subdetector !! Plots !! Instructions !! Expert(s)<br />
|-<br />
| BCAL || Good || Good || Mark Dalton, Zisis Papandreou<br />
|-<br />
| CDC || Good || Good || Naomi Jarvis<br />
|-<br />
| FCAL || ? || ? || Mark Dalton, Malte Albrecht, Igal Jaegle<br />
|-<br />
| FDC || Good || Good || Lubomir Pentchev<br />
|-<br />
| PS || ? || ? || Alex Somov, Olga Cortes<br />
|-<br />
| SC || ? || ? || Beni Zihlmann<br />
|-<br />
| TAGH || ? || ? || Alex Somov, Bo Yu<br />
|-<br />
| TAGM || ? || ? || Richard Jones, Ellie Prather<br />
|-<br />
| TOF || ? || ? || Paul Eugenio, Beni Zihlmann<br />
|-<br />
| RF || ? || ? || Sean Dobbs, Beni Zihlmann<br />
|-<br />
| Timing || ? || ? || Sean Dobbs<br />
|-<br />
| Analysis || Good || Good || Alex Austregesilo<br />
|}<br />
<br />
===General Notes===<br />
* Diamond and amorphous (AMO) runs have different beam energy spectra, which leads to differences in reaction yield distributions which depend on the kinematics of the produced particles.<br />
* [https://halldweb.jlab.org/wiki/index.php/RunPeriod-2019-11_Validation#Monitoring_Launch_Checks The list of experts on different detector/calibration]<br />
<br />
===BCAL===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_occupancy.png link] ]<br />
* Check Hit Efficiency - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_hist_eff.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''BCAL Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_hist_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_hist_eff.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''BCAL Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
The BCAL is used to measure the energy and time of showers.<br />
<br />
# '''Occupancy''': This should be approximately flat. There can be hot channels when the baseline drifts. <br />
# '''Hit Efficiency''': This should be approximately flat. If there are features we should understand why.<br />
<br />
</div><br />
</div><br />
<br />
===CDC===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_occupancy.png link] ]<br />
* Check Time-to-distance - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CDCTimeToDistance.png link] ]<br />
* Check dE/dx - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_dedx.png link] ]<br />
<br />
* Check Efficiency- Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_eff.png link] ]<br />
<br />
<!--<br />
* Check CDC ROC amp - Reference: [ [https://halldweb.jlab.org/work/halld2/calibration/RunPeriod-2018-08/mon_ver08/Run071724/verify/verify_CDC_roc_amp.png link] ]<br />
* Check CDC ROC netamp - Reference: [ [https://halldweb.jlab.org/work/halld2/calibration/RunPeriod-2018-08/mon_ver08/Run071724/verify/verify_CDC_roc_netamp.png link] ]<br />
* Check CDC ROC hits - Reference: [ [https://halldweb.jlab.org/work/halld2/calibration/RunPeriod-2018-08/mon_ver08/Run071724/verify/verify_CDC_roc_hits.png link] ]<br />
* Check CDC rawtime vs strawnum. - Reference: [ [https://halldweb.jlab.org/work/halld2/calibration/RunPeriod-2018-08/mon_ver08/Run071724/verify/verify_CDC_rawtime_vs_strawnum.png link] ]<br />
--><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CDC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CDCTimeToDistance.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CDCTimeToDistance.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_dedx.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_dedx.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/CDC_eff.png" height="200"/></a><br />
<br />
<br />
<!--<br />
<a href="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_amp.png"><img src="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_amp.png" height="200"/></a><br />
<a href="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_netamp.png"><img src="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_netamp.png" height="200"/></a><br />
<a href="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_hits.png"><img src="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_roc_hits.png" height="200"/></a><br />
<a href="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_rawtime_vs_strawnum.png"><img src="/work/halld2/calibration/RunPeriod-2018-01/ver18/Run042577/verify/verify_CDC_rawtime_vs_strawnum.png" height="200"/></a><br />
--><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CDC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
CDC Occupancy: There should be a uniform decrease in intensity from the center of the detector outward. Random white cells scattered throughout occur when not enough data were collected, e.g. empty-target runs, trigger tests, or no beam. Several contiguous white, dark blue, or bright yellow cells that don't match the neighboring cells are a problem. <br />
<br /> <br />
Time-to-distance: 𝛿, the change in length of the LOCA caused by the straw deformation, is<br />
plotted against the measured drift time, t<sub>drift</sub>. The color scale indicates the distance of<br />
closest approach between the track and the wire, obtained from the tracking software.<br />
The red lines are contours of the time-to-distance function for constant drift distances<br />
from 1.5 mm to 8 mm, in steps of 0.5 mm. They should lie on top of the dark blue contour lines separating the color blocks. <br />
For the plot of residuals vs drift time, the mean should be less than 15 µm and the sigma should be less than 150 µm.<br />
<br/><br />
dE/dx: At 1.5 GeV/c the fitted peak mean should be within 1% of 2.02 keV/cm. <br />
<br/><br />
Efficiency: The efficiency should be 0.98 or higher at 0 cm DOCA, gradually fall to 0.97 at approximately 0.5 cm, and then drop more steeply, passing through 0.9 at approximately 0.64 cm. <br />
</html><br />
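The numeric pass criteria above (dE/dx peak within 1% of 2.02 keV/cm; residual mean below 15 µm and sigma below 150 µm) can be encoded as a quick tolerance check. This is a hedged sketch, not part of the monitoring software; the function names and calling convention are illustrative, and the fitted values would come from the monitoring histograms linked above.

```python
DEDX_NOMINAL = 2.02   # keV/cm, expected dE/dx peak mean at 1.5 GeV/c
DEDX_REL_TOL = 0.01   # 1% relative tolerance

def dedx_ok(fitted_mean):
    """True if the fitted dE/dx peak mean is within 1% of nominal."""
    return abs(fitted_mean - DEDX_NOMINAL) / DEDX_NOMINAL < DEDX_REL_TOL

def residuals_ok(mean_um, sigma_um):
    """True if time-to-distance residuals pass: |mean| < 15 um, sigma < 150 um."""
    return abs(mean_um) < 15.0 and sigma_um < 150.0
```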
<br />
</div><br />
</div><br />
<br />
===FCAL===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__fcal_digOcc2D.png link] ]<br />
<!--<br />
* Check FCAL Hits 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/fcal_hit_energy.png link] ]<br />
* Check FCAL Hits 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/fcal_hit_timing.png link] ]<br />
* Check FCAL Clusters 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/fcal_cluster_space.png link] ]<br />
* Check FCAL Recon. 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/HistMacro_FCALReconstruction_p1.png link] ]<br />
* Check FCAL Recon. 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/HistMacro_FCALReconstruction_p2.png link] ]<br />
* Check Recon. FCAL Matching - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver06/Run071724/HistMacro_Matching_FCAL.png link] ]<br />
--><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FCAL Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__fcal_digOcc2D.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__fcal_digOcc2D.png" height="200"/></a><br />
<!--<br />
<a href="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_hit_energy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_hit_energy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_hit_timing.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_hit_timing.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_cluster_space.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/fcal_cluster_space.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/HistMacro_FCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/HistMacro_FCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2018-08/mon_ver08/Run071724/HistMacro_FCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/HistMacro_FCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/HistMacro_Matching_FCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2019-11/mon_ver08/Run071724/HistMacro_Matching_FCAL.png" height="200"/></a><br />
--><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FCAL Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
The FCAL is used for neutral-particle detection and pion identification.<br />
* '''Check Occupancy''':<br />
<br />
</div><br />
</div><br />
<br />
===FDC===<br />
* Check Package 1 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P1_pseudo_occupancy.png link] ]<br />
* Check Package 2 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P2_pseudo_occupancy.png link] ]<br />
* Check Package 3 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P3_pseudo_occupancy.png link] ]<br />
* Check Package 4 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P4_pseudo_occupancy.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FDC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P1_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P1_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P2_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P2_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P3_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P3_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P4_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/FDC_P4_pseudo_occupancy.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FDC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
There are two HV sectors, in Package 2 cell 6 (28 wires) and Package 3 cell 4 (20 wires), that are always OFF and appear in the occupancy plots as empty sectors. There are also strips with lower or no efficiency that are always there, mostly in Packages 3 and 4 (see the reference plots), which is also normal. What is not normal are groups of wires (on the order of 8 to 24 wires) that are noisy. They will show as brighter stripes in the occupancy. The problem is that they may lock up the F1TDCs; this has happened several times in past years. In general, look for groups of channels that are overactive or have lower efficiency.<br />
<br/><br/><br />
The reference plots show pseudohits, generated from the track reconstruction. If you find an abnormality, it would be helpful to check two more histograms to find the underlying cause: 'FDC Hit Occupancy' will show if any channels are missing, and 'HLDT Drift Chamber Timing' (bottom right plot) will show TDC time shifts.<br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===PS===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/PS_occupancy.png link] ]<br />
* Check Timing Alignment - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PSTimingAlignment.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''PS Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/PS_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/PS_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PSTimingAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PSTimingAlignment.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''PS Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===SC===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/ST_occupancy.png link] ]<br />
* Check Recon. SC 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p1.png link] ]<br />
* Check Recon. SC 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p2.png link] ]<br />
* Check Recon. SC Matching - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_SC.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''SC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/ST_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/ST_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_SCReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_SC.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_SC.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''SC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TAGH===<br />
* Check Tagger Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGGER_occupancy.png link] ]<br />
* Check TAGH Hits 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGH_hit2.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGH Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGGER_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGGER_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGH_hit2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TAGH_hit2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGH Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TAGM===<br />
* Check Timing ADC-RF - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_adc_rf_all.png link] ]<br />
* Check Timing T-ADC - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_t_adc_all.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGM Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_adc_rf_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_adc_rf_all.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_t_adc_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/__TAGM_TW_t_adc_all.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGM Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TOF===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TOF_occupancy.png link] ]<br />
* Check TOF Matching 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF.png link] ]<br />
* Check TOF Matching 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF2.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TOF Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TOF_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/TOF_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF2.png" ><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Matching_TOF2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TOF Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===RF===<br />
* Check timing offsets - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_RF_p2.png link] ]<br />
** Should be centered around zero<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''RF Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_RF_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_RF_p2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
===Timing===<br />
* Check HLDT Calorimeter Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CalorimeterTiming.png link] ]<br />
* Check HLDT Drift Chamber Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackingTiming.png link] ]<br />
* Check HLDT PID System Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PIDSystemTiming.png link] ]<br />
* Check HLDT Tagger Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerTiming.png link] ]<br />
* Check HLDT Tagger/RF Align 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerRFAlignment2.png link] ]<br />
* Check HLDT Tagger/SC Align - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerSCAlignment.png link] ]<br />
* Check HLDT Track-Matched Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackMatchedTiming.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Timing Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CalorimeterTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_CalorimeterTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackingTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackingTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PIDSystemTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_PIDSystemTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerRFAlignment2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerRFAlignment2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerSCAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TaggerSCAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackMatchedTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_TrackMatchedTiming.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Timing Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
* Calorimeter Timing - The right two plots aren't aligned at zero because not all corrections are currently applied. If there is a 32 ns shift in part of this data, please note this.<br />
* Drift Chamber Timing - In each case, the main peaks should line up at zero, but often have other structures. Ignore the first few bins of the lower left plot (they mostly say something about the noise in the detector). There can be 32 ns shifts in the lower right plot.<br />
* PID System Timing - Nothing to note yet.<br />
* Tagger Timing - The signal to background levels of the left two plots depend on the electron beam current.<br />
* Track Matched Timing - Some overlap here with the tracking timing. The new plots should be centered at zero.<br />
* Tagger/RF Timing - Look for the nice "picket fences" on the right two plots, and that in the bottom left plot each channel peaks at zero.<br />
* Tagger/SC Timing - Should be similar to Tagger/RF Timing but with larger resolution.<br />
<br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===Analysis===<br />
* Tracking 1 - [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p1.png link] ]<br />
* Tracking 3 - [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p3.png link] ]<br />
* Check BCAL pi0 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_inv_mass.png link] ]<br />
* Check BCAL/FCAL pi0 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_fcal_inv_mass.png link] ]<br />
* Check p+2pi - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p2pi.png link] ]<br />
* Check p+3pi - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p3pi.png link] ]<br />
* Check p+pi0g - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_ppi0gamma.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Analysis Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p3.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_Tracking_p3.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_fcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/bcal_fcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p2pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p2pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p3pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_p3pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_ppi0gamma.png"><img src="/work/halld2/data_monitoring/RunPeriod-2023-01/mon_ver07/Run120888/HistMacro_ppi0gamma.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Analysis Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
Generally, these plots will differ between diamond and amorphous radiator running. References for non-diamond plots should probably be added.<br />
<br />
* Tracking 1 - There should be some mild dependence on beam current and radiator. Note that the spikes in the upper right plot occur because we fit 4 hypotheses to each track by default. The lower left plot does have a peak at zero.<br />
* Tracking 3 - All four plots should have the pion band around 2 keV/cm. Only the top left one should have an additional banana-shaped band for the protons.<br />
* Check BCAL pi0 - The fitted peak should be near the correct pi0 mass of 135 MeV.<br />
* Check BCAL/FCAL pi0 - The fitted peak should be lower than the correct pi0 mass, I think because the wrong vertex is used.<br />
* Check p+2pi - The top middle plot should have a sin(2phi) shape for diamond runs. Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
* Check p+3pi - Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
* Check p+pi0g - Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
** Note that these yields are sensitive to the tagger range used! This changes for different beam current settings.<br />
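As a self-contained illustration of the pi0-mass check, the toy sketch below simulates an invariant-mass spectrum and locates the peak. The 8 MeV resolution and flat background are assumptions for the toy only, not the actual monitoring fit.

```python
import random
from collections import Counter

random.seed(7)
# Toy invariant-mass sample: Gaussian pi0 peak over a flat background (MeV).
masses = [random.gauss(135.0, 8.0) for _ in range(20000)]
masses += [random.uniform(50.0, 250.0) for _ in range(10000)]

# 2 MeV-wide bins from 50 to 250 MeV; peak = center of the most populated bin.
bins = Counter(int((m - 50.0) // 2.0) for m in masses if 50.0 <= m < 250.0)
peak_bin = max(bins, key=bins.get)
peak = 50.0 + 2.0 * peak_bin + 1.0

assert abs(peak - 135.0) < 5.0  # peak should sit near the nominal pi0 mass
print(f"peak near {peak:.0f} MeV")
```

A real check would fit the histogram (e.g. Gaussian plus polynomial background) rather than take the maximum bin, but the acceptance criterion is the same: the extracted peak should sit near 135 MeV.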
<br />
<html><br />
</html><br />
<br />
</div><br />
</div></div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=Offline_Monitoring_Data_Validation_PrimEx&diff=124717Offline Monitoring Data Validation PrimEx2024-02-15T21:01:42Z<p>Aaustreg: /* Checklist */</p>
<hr />
<div>This page contains the procedure for checking whether PrimEx-&eta; production runs are of good quality and can be used for physics analysis. This page is a work in progress and should not (yet) be used as a reference.<br />
<br />
= Run Periods =<br />
<br />
* [[RunPeriod-2019-01 Validation]] (PrimEx-&eta; Phase I)<br />
* [[RunPeriod-2021-08 Validation]] (PrimEx-&eta; Phase II)<br />
* [[RunPeriod-2022-08 Validation]] (PrimEx-&eta; Phase III)<br />
<br />
= Procedure =<br />
<br />
For each production run, do the following:<br />
<br />
* Go to the [https://halldweb.jlab.org/data_monitoring/Plot_Browser.html?RunPeriod=RunPeriod-2022-08&Version=mon_ver10 Offline Run Browser] page.<br />
* Follow the steps outlined in the checklist below.<br />
* Workers should check each plot for their assigned subsystem and leave notes in the corresponding spreadsheet if any significant deviations are seen<br />
* On the spreadsheet, enter "Y" in the "Overall Quality" field if all monitoring histograms are acceptable, otherwise enter "N"<br />
* We will iterate this procedure until the process converges <br />
<br />
=== Expert Actions ===<br />
<br />
* Certify that each subsystem is okay<br />
* Set run status in RCDB based on monitoring results<br />
** (script provided)<br />
<br />
=== Run Statuses ===<br />
<br />
* -1 - unchecked<br />
* 0 - rejected (not physics-quality)<br />
* 1 - approved<br />
* 2 - approved long/"mode 8" data<br />
* 3 - calibration / systematic studies<br />
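When scripting RCDB updates, the status codes above can be kept as a small lookup. This is a hedged sketch for illustration; the actual RCDB condition name and the provided update script may differ.

```python
# RCDB run-status codes as listed above; labels are for display only.
RUN_STATUS = {
    -1: "unchecked",
    0: "rejected (not physics-quality)",
    1: "approved",
    2: "approved long/mode 8 data",
    3: "calibration / systematic studies",
}

def status_label(code):
    """Human-readable label for an RCDB run-status code."""
    return RUN_STATUS.get(code, "unknown")
```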
<br />
= Checklist = <br />
<br />
[https://docs.google.com/spreadsheets/d/16BolnC0BXGySv3bZpYzzp8kYxaagdYyAUkw-jC6Y-l8/edit?usp=sharing Example Monitoring Spreadsheet (template, incomplete)]<br />
<br />
=== General Notes ===<br />
* Reference runs are listed for each target type<br />
** Be Empty: 110453 (left out of monitoring launches 01-11)<br />
** Be Full: 110551<br />
** He Empty: 111917<br />
** He Full: 111884<br />
<br />
===TEMP: To Do===<br />
Experts should update the table as tasks are completed<br />
{| class="wikitable" style="margin:auto"<br />
|+ Instructions status by subdetector<br />
|-<br />
! Subdetector !! Plots !! Instructions !! Expert(s)<br />
|-<br />
| BCAL || Confirm existing ones are relevant || Confirm existing ones are accurate || Mark Dalton, Zisis Papandreou, Igal Jaegle<br />
|-<br />
| CCAL || Select relevant plots || Write instructions for volunteers || Drew Smith<br />
|-<br />
| CDC || Good || Good || Naomi Jarvis<br />
|-<br />
| FCAL || Good? || Good? || Mark Dalton, Malte Albrecht, Igal Jaegle<br />
|-<br />
| FDC || Good || Good || Lubomir Pentchev<br />
|-<br />
| PS || Good || Good || Alex Somov, Olga Cortes<br />
|-<br />
| SC || good, with one example showing issues || done || Beni Zihlmann<br />
|-<br />
| TAGH || Good || Good || Alex Somov, Bo Yu<br />
|-<br />
| TAGM || Good || Good || Richard Jones, Ellie Prather<br />
|-<br />
| TOF || look good || done || Paul Eugenio, Beni Zihlmann<br />
|-<br />
| RF || Good || Good || Sean Dobbs, Beni Zihlmann<br />
|-<br />
| Timing || Good || Good || Sean Dobbs<br />
|-<br />
| Analysis || Confirm existing ones are relevant || Confirm existing ones are accurate || Igal Jaegle<br />
|}<br />
<br />
===BCAL===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_occupancy.png link] ]<br />
* Check Hit Efficiency - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_hist_eff.png link] ]<br />
* Check Recon. BCAL 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p1.png link] ]<br />
* Check Recon. BCAL 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p2.png link] ]<br />
* Check BCAL Matching - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_BCAL.png link] ]<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''BCAL Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_hist_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_hist_eff.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_BCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_BCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_BCAL.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_hist_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_hist_eff.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_BCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_BCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_BCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_BCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_BCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_BCAL.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_hist_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_hist_eff.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_BCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_BCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_BCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_BCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_BCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_BCAL.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''BCAL Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
The BCAL measures the energy and time of showers.<br />
The energy measurement is vital for neutral particles, and it is also used for charged-particle PID.<br />
<br /><br />
BCAL Matching: These plots are for charged particles that are tracked in the drift chambers and projected to the BCAL. The z position along the BCAL is calculated from the time difference between the upstream and downstream hits and compared to the position extrapolated from the drift-chamber tracks.<br /> <br />
BCAL/Track Delta z = (z determined from the up-down hit time difference in the BCAL) - (z determined from the extrapolation of tracks in the drift chambers)<br />
Projected BCAL Hit-Z = z determined by extrapolating tracks in the drift chambers to the BCAL.<br />
The match rate is the ratio (number of BCAL hits matching the extrapolation of a drift-chamber track) / (number of drift-chamber tracks pointing at the BCAL).<br />
</html><br />
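The Delta z and match-rate definitions above can be sketched as a short calculation. This is illustrative only; the function and variable names are hypothetical and not part of the GlueX software, where these quantities come from the reconstruction.<br />

```python
# Illustrative sketch of the BCAL matching quantities described above.
# All names here are hypothetical; the real values are produced by the
# GlueX reconstruction software, not by this snippet.

def bcal_track_delta_z(z_bcal_time, z_track_extrap):
    """BCAL/Track Delta z = (z from the up-down hit time difference
    in the BCAL) - (z from extrapolating the drift-chamber track)."""
    return z_bcal_time - z_track_extrap

def bcal_match_rate(n_matched_hits, n_tracks_at_bcal):
    """Match rate = matched BCAL hits / tracks pointing at the BCAL."""
    return n_matched_hits / n_tracks_at_bcal

# A well-matched track should give a small Delta z residual (cm):
print(bcal_track_delta_z(152.3, 150.1))
# Example match rate for 930 matched hits out of 1000 pointing tracks:
print(bcal_match_rate(930, 1000))
```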
<br />
</div><br />
</div><br />
<br />
===CCAL===<br />
* List of plots to check goes here<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CCAL Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
links to CCAL plots go here<br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
links to CCAL plots go here<br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
links to CCAL plots go here<br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CCAL Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
Instructions for monitoring volunteers go here.<br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===CDC===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_occupancy.png link] ]<br />
* Check Time-to-distance - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CDCTimeToDistance.png link] ]<br />
* Check dE/dx - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_dedx.png link] ]<br />
* Check Efficiency - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_eff.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CDC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CDCTimeToDistance.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CDCTimeToDistance.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_dedx.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_dedx.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/CDC_eff.png" height="200"/></a><br />
</html><br />
<br />
'''Absent Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/HistMacro_CDCTimeToDistance.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/HistMacro_CDCTimeToDistance.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_dedx.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_dedx.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver12/Run110453/CDC_eff.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_CDCTimeToDistance.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_CDCTimeToDistance.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_dedx.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_dedx.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/CDC_eff.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_CDCTimeToDistance.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_CDCTimeToDistance.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_dedx.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_dedx.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_eff.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/CDC_eff.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''CDC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
General notes on empty/absent Be target runs: the statistics are very low. Mark these runs good unless something looks odd or bad that is not due to the lack of statistics.<br /> <br />
CDC Occupancy: There should be a uniform decrease in intensity from the center of the detector outward. Random white cells scattered throughout occur when not enough data were collected, e.g. in empty-target runs, trigger tests, or runs with no beam. Several contiguous white, dark blue, or bright yellow cells that do not match the neighboring cells are a problem.<br />
<br/> <br />
Time-to-distance: 𝛿, the change in the length of the LOCA (line of closest approach) caused by the straw deformation, is plotted against the measured drift time, t<sub>drift</sub>. The color scale indicates the distance of closest approach between the track and the wire, obtained from the tracking software. The red lines are contours of the time-to-distance function for constant drift distances from 1.5 mm to 8 mm, in steps of 0.5 mm. They should lie on top of the dark blue contour lines separating the color blocks.<br />
For the plot of residuals vs. drift time, the mean should be less than 15 &mu;m and the sigma should be less than 150 &mu;m.<br />
<br/><br />
dE/dx: At 1.5 GeV/c the fitted peak mean should be within 1% of 2.02 keV/cm.<br />
<br/><br />
Efficiency: The efficiency should be 0.98 or higher at a DOCA of 0 cm, fall gradually to 0.97 at approximately 0.5 mm, and then fall more steeply, passing through 0.9 at approximately 0.64 cm.<br />
</html><br />
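The numerical thresholds quoted above can be expressed as simple pass/fail checks. This is a sketch for orientation only; the names are hypothetical, and in practice the checks are made by eye against the reference plots.<br />

```python
# Illustrative pass/fail checks for the CDC numbers quoted above.
# Function and parameter names are hypothetical.

def residuals_ok(mean_um, sigma_um):
    """Residuals vs. drift time: |mean| < 15 um and sigma < 150 um."""
    return abs(mean_um) < 15 and sigma_um < 150

def dedx_peak_ok(peak_kev_per_cm, nominal=2.02, tolerance=0.01):
    """Fitted dE/dx peak at 1.5 GeV/c within 1% of 2.02 keV/cm."""
    return abs(peak_kev_per_cm - nominal) / nominal < tolerance

print(residuals_ok(8.0, 120.0))   # within both limits
print(dedx_peak_ok(2.03))         # |2.03 - 2.02| / 2.02 is about 0.5%
```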
<br />
</div><br />
</div><br />
<br />
===FCAL===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/__fcal_digOcc2D.png link] ]<br />
<!--* Check FCAL Hits 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_energy.png link] ]<br />
* Check FCAL Hits 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_timing.png link] ]<br />
* Check FCAL Clusters 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_cluster_space.png link] ]<br />
* Check FCAL Recon. 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p1.png link] ]<br />
* Check FCAL Recon. 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p2.png link] ]<br />
* Check Recon. FCAL Matching - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_Matching_FCAL.png link] ]<br />
--><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FCAL Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/__fcal_digOcc2D.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/__fcal_digOcc2D.png" height="200"/></a><br />
<!-- <a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_energy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_energy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_timing.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_hit_timing.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_cluster_space.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/fcal_cluster_space.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_FCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_Matching_FCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run110551/HistMacro_Matching_FCAL.png" height="200"/></a><br />
--><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/__fcal_digOcc2D.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/__fcal_digOcc2D.png" height="200"/></a><br />
<!-- <a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_hit_energy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_hit_energy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_hit_timing.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_hit_timing.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_cluster_space.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/fcal_cluster_space.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_FCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_FCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_FCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_FCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_Matching_FCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111884/HistMacro_Matching_FCAL.png" height="200"/></a><br />
--><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/__fcal_digOcc2D.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/__fcal_digOcc2D.png" height="200"/></a><br />
<!-- <a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_hit_energy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_hit_energy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_hit_timing.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_hit_timing.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_cluster_space.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/fcal_cluster_space.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_FCALReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_FCALReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_FCALReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_FCALReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_Matching_FCAL.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver11/Run111917/HistMacro_Matching_FCAL.png" height="200"/></a><br />
--><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FCAL Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
The FCAL is used for neutral-particle detection and pion identification.<br /><br /><br />
<br />
Check Occupancy:<br /><br />
For monitoring purposes, the occupancy of the detector needs to be checked only once per run, so this check is required only in the first monitoring launch of each run period.<br />
The goal is to find any blocks that do not deliver a signal during a run; these must be marked as dead channels in the Monte Carlo simulation for that specific run. Watch out for individual blocks as well as groups of 16 channels in a 4x4 arrangement, which indicate a faulty fADC.<br /><br /><br />
<br />
Other quantities of interest are the location of the pi0 peak in the two-photon invariant mass, the location and width of the timing peak, and the quality of the matching between charged tracks and hits in the FCAL. All of these quantities will be checked in an automated fashion in the future; outliers should then be inspected carefully, which involves the following sets of plots in addition to the occupancy:<br /><br /><br />
<!--<br />
Check FCAL Hits 1<br /><br />
Check FCAL Hits 2<br /><br />
Check FCAL Clusters 1<br /><br />
Check FCAL Recon. 1<br /><br />
Check FCAL Recon. 2<br /><br />
Check Recon. FCAL Matching<br /><br />
<br /><br />
--><br />
</html><br />
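The "groups of 16 channels in a 4x4 arrangement" symptom described above lends itself to a simple scan. This is an illustrative sketch, not part of the monitoring software; the grid layout and function name are hypothetical.<br />

```python
# Illustrative scan for fully dead 4x4 blocks in an FCAL-style
# occupancy grid, as described above (a dead 4x4 group of channels
# suggests a faulty fADC). Grid layout and names are hypothetical.

def find_dead_4x4_blocks(occupancy):
    """Return the top-left (row, col) of every 4x4 block whose
    channels all have zero counts."""
    dead = []
    rows, cols = len(occupancy), len(occupancy[0])
    for r in range(rows - 3):
        for c in range(cols - 3):
            if all(occupancy[r + i][c + j] == 0
                   for i in range(4) for j in range(4)):
                dead.append((r, c))
    return dead

# Build an 8x8 toy grid with one simulated dead fADC at rows 2-5,
# columns 4-7:
grid = [[5] * 8 for _ in range(8)]
for i in range(4):
    for j in range(4):
        grid[2 + i][4 + j] = 0
print(find_dead_4x4_blocks(grid))  # [(2, 4)]
```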
<br />
</div><br />
</div><br />
<br />
===FDC===<br />
* Check Package 1 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P1_pseudo_occupancy.png link] ]<br />
* Check Package 2 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P2_pseudo_occupancy.png link] ]<br />
* Check Package 3 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P3_pseudo_occupancy.png link] ]<br />
* Check Package 4 Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P4_pseudo_occupancy.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FDC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P1_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P1_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P2_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P2_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P3_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P3_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P4_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/FDC_P4_pseudo_occupancy.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P1_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P1_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P2_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P2_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P3_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P3_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P4_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/FDC_P4_pseudo_occupancy.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P1_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P1_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P2_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P2_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P3_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P3_pseudo_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P4_pseudo_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/FDC_P4_pseudo_occupancy.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''FDC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
There are two HV sectors, in Package 2 cell 6 (28 wires) and Package 3 cell 4 (20 wires), that are always OFF and appear in the occupancy plots as empty sectors. There are also strips with low or no efficiency that are always present, mostly in Packages 3 and 4 (see the reference plots), which is also normal. What is not normal are noisy groups of wires (on the order of 8 to 24 wires). They show up as brighter stripes in the occupancy plots. The problem is that they may lock up the F1TDCs; this has happened several times in past years. In general, look for groups of channels that are overactive or have low efficiency.<br />
<p><br />
The reference plots show pseudohits, generated from the track reconstruction. If you find an abnormality, it is helpful to check two more histograms to find the underlying cause: 'FDC Hit Occupancy' will show whether any channels are missing, and 'HLDT Drift Chamber Timing' (bottom-right plot) will show TDC time shifts.</p><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===PS===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/PS_occupancy.png link] ]<br />
* Check Timing Alignment - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PSTimingAlignment.png link] ]<br />
* Check PS Pair Energy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__PSPair_PSC_PS_PS_E.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''PS Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/PS_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/PS_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PSTimingAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PSTimingAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__PSPair_PSC_PS_PS_E.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__PSPair_PSC_PS_PS_E.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/PS_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/PS_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_PSTimingAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_PSTimingAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__PSPair_PSC_PS_PS_E.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__PSPair_PSC_PS_PS_E.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/PS_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/PS_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_PSTimingAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_PSTimingAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__PSPair_PSC_PS_PS_E.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__PSPair_PSC_PS_PS_E.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''PS Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<b>PS Occupancy:</b> The PS occupancy (bottom) should be fairly flat, apart from a couple of bad channels. The PSC occupancy (top) should show similar rates in the TDC and ADC, with the same shape as the reference histogram.<br><br />
<br />
<b>PS Timing:</b> All plots should be centered at zero. The right column reflects the tagger energy; the bottom-right plot is empty (should be updated?).<br><br />
<br />
<b>PS Pair Energy:</b> Should have a triangle-like shape similar to the reference.<br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===SC===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/ST_occupancy.png link] ]<br />
* Check Recon. SC 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p1.png link] ]<br />
* Check Recon. SC 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p2.png link] ]<br />
* Check Recon. SC Matching - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_SC.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''SC Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/ST_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/ST_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_SCReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_SC.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_SC.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/ST_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/ST_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_SCReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_SCReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_SCReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_SCReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_SC.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_SC.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/ST_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/ST_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_SCReconstruction_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_SCReconstruction_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_SCReconstruction_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_SCReconstruction_p2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_SC.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_SC.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''SC Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<b>Occupancy plots:</b> look for gaps indicative of missing counters. The reference plot for the Be target is an example with some missing counters; the other targets show all counters active.<br><br />
<b>SC1:</b> look for gaps in the 2D histogram (missing counters); in the top middle dE/dx plot, protons should be visible (band curving downward between 0 and 1). <br><br />
<b>SC2:</b> the 1D histograms of time differences (bottom row) should peak at zero. <br><br />
<b>SC Matching:</b> the track-SC match rate (bottom middle 1D histogram) should be a flat-ish curve above 80%. <br><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TAGH===<br />
* Check Tagger Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGGER_occupancy.png link] ]<br />
* Check TAGH Hits 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGH_hit2.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGH Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGGER_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGGER_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGH_hit2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TAGH_hit2.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TAGGER_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TAGGER_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TAGH_hit2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TAGH_hit2.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TAGGER_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TAGGER_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TAGH_hit2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TAGH_hit2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGH Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<b>Tagger occupancy:</b> TAGM - Generally the fADC and TDC occupancies should be similar and mostly flat, with maybe a small increase in rates with column number. There can be a small step in the TDC occupancy. TAGH - expect the choppy pattern in the reference image, which reflects the varying size of the different counters, and a steep increase at large counter number.<br><br />
<br />
<b>TAGH Hits 2:</b> This plot is complicated - the main thing to look for is the time(TDC)-time(ADC) vs. channel plot to be centered around zero. Keep an eye out for any extra or unusual dead channels.<br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TAGM===<br />
* Check Timing ADC-RF - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_adc_rf_all.png link] ]<br />
* Check Timing T-ADC - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_t_adc_all.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGM Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_adc_rf_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_adc_rf_all.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_t_adc_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/__TAGM_TW_t_adc_all.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__TAGM_TW_adc_rf_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__TAGM_TW_adc_rf_all.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__TAGM_TW_t_adc_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/__TAGM_TW_t_adc_all.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__TAGM_TW_adc_rf_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__TAGM_TW_adc_rf_all.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__TAGM_TW_t_adc_all.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/__TAGM_TW_t_adc_all.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TAGM Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
Generally both distributions should be centered near zero. There is some variation in intensity due to the shape of the photon beam energy dependence (coherent peak) and the inefficiency of some of the channels.<br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===TOF===<br />
* Check Occupancy - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TOF_occupancy.png link] ]<br />
* Check TOF Matching 1 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF.png link] ]<br />
* Check TOF Matching 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF2.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TOF Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TOF_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/TOF_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Matching_TOF2.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TOF_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/TOF_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_TOF.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_TOF.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_TOF2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Matching_TOF2.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TOF_occupancy.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/TOF_occupancy.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_TOF.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_TOF.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_TOF2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Matching_TOF2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''TOF Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
<b>Occupancy plot:</b> look for "gaps" indicative of missing counters<br><br />
<b>Matching TOF Hits 1 (Time-Based Tracks):</b> X and Y (rightmost plots) distributions should be centered close to zero (vertical)<br><br />
<b>TOF matching rate (1D histograms):</b> There should be regions above 80%; gaps in the 2D histograms indicate incorrect timing calibrations between the ADC and TDC.<br><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===RF===<br />
* Check Timing Offsets - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_RF_p2.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''RF Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_RF_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_RF_p2.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_RF_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_RF_p2.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_RF_p2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_RF_p2.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''RF Instructions'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
<html><br />
The timing offsets should be centered around zero.<br />
</html><br />
</div><br />
</div><br />
<br />
===Timing===<br />
* Check HLDT Calorimeter Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CalorimeterTiming.png link] ]<br />
* Check HLDT Drift Chamber Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackingTiming.png link] ]<br />
* Check HLDT PID System Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PIDSystemTiming.png link] ]<br />
* Check HLDT Tagger Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerTiming.png link] ]<br />
* Check HLDT Tagger/RF Align 2 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerRFAlignment2.png link] ]<br />
* Check HLDT Tagger/SC Align - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerSCAlignment.png link] ]<br />
* Check HLDT Track-Matched Timing - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackMatchedTiming.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Timing Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CalorimeterTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_CalorimeterTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackingTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackingTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PIDSystemTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_PIDSystemTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerRFAlignment2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerRFAlignment2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerSCAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TaggerSCAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackMatchedTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_TrackMatchedTiming.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_CalorimeterTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_CalorimeterTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TrackingTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TrackingTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_PIDSystemTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_PIDSystemTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerRFAlignment2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerRFAlignment2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerSCAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TaggerSCAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TrackMatchedTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_TrackMatchedTiming.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_CalorimeterTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_CalorimeterTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TrackingTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TrackingTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_PIDSystemTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_PIDSystemTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerTiming.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerRFAlignment2.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerRFAlignment2.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerSCAlignment.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TaggerSCAlignment.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TrackMatchedTiming.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_TrackMatchedTiming.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Timing Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
Note that the relative size of peaks can change between different running conditions.<br />
<br />
* Calorimeter Timing - The right two plots aren't aligned at zero because not all corrections are currently applied. If there is a 32 ns shift in part of this data, please note this.<br />
* Drift Chamber Timing - In each case, the main peaks should line up at zero, but often have other structures. Ignore the first few bins of the lower left plot (they mostly say something about the noise in the detector). There can be 32 ns shifts in the lower right plot.<br />
* PID System Timing - Nothing to note yet.<br />
* Tagger Timing - The signal to background levels of the left two plots depend on the electron beam current.<br />
* Track Matched Timing - Some overlap here with the tracking timing. The new plots should be centered at zero.<br />
* Tagger/RF Timing - Look for the nice "picket fences" on the right two plots, and that in the bottom left plot each channel peaks at zero.<br />
* Tagger/SC Timing - Should be similar to Tagger/RF Timing but with larger resolution.<br />
<br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div><br />
<br />
===Analysis===<br />
* Tracking 1 - [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p1.png link] ]<br />
* Tracking 3 - [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p3.png link] ]<br />
* Check BCAL pi0 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_inv_mass.png link] ]<br />
* Check BCAL/FCAL pi0 - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_fcal_inv_mass.png link] ]<br />
* Check p+2pi - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p2pi.png link] ]<br />
* Check p+3pi - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p3pi.png link] ]<br />
* Check p+pi0g - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_ppi0gamma.png link] ]<br />
* Check CCAL Comp - Reference: [ [https://halldweb.jlab.org/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver13/Run112001/ccal_comp.png link] ]<br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Analysis Reference Plots'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
'''Beryllium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p3.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_Tracking_p3.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_fcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/bcal_fcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p2pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p2pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p3pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_p3pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_ppi0gamma.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run110551/HistMacro_ppi0gamma.png" height="200"/></a><br />
</html><br />
<br />
'''Full Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Tracking_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Tracking_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Tracking_p3.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_Tracking_p3.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_fcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/bcal_fcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_p2pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_p2pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_p3pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_p3pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_ppi0gamma.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111884/HistMacro_ppi0gamma.png" height="200"/></a><br />
</html><br />
<br />
'''Empty Liquid Helium Target'''<br />
<br />
<html><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Tracking_p1.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Tracking_p1.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Tracking_p3.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_Tracking_p3.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_fcal_inv_mass.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/bcal_fcal_inv_mass.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_p2pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_p2pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_p3pi.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_p3pi.png" height="200"/></a><br />
<a href="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_ppi0gamma.png"><img src="/work/halld2/data_monitoring/RunPeriod-2022-08/mon_ver10/Run111917/HistMacro_ppi0gamma.png" height="200"/></a><br />
</html><br />
</div><br />
</div><br />
<br />
<div class="toccolours mw-collapsible mw-collapsed"><br />
<br />
'''Analysis Notes'''<br />
<br />
<div class="mw-collapsible-content"><br />
<br />
Generally in these plots, there will be a difference between diamond and amorphous radiator running. Should probably add some references for non-diamond plots.<br />
<br />
* Tracking 1 - There should be some mild dependence on beam current and radiator. Note that the spikes in the upper right plot appear because we fit 4 hypotheses to a track by default. The lower left plot does have a peak at zero.<br />
* Tracking 3 - <br />
* Check BCAL pi0 - The fitted peak should be near the correct pi0 mass of 135 MeV.<br />
* Check BCAL/FCAL pi0 - The fitted peak should be lower than the correct pi0 mass, likely because the wrong vertex is used.<br />
* Check p+2pi - The top middle plot should have a sin(2phi) shape for diamond runs. Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
* Check p+3pi - Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
* Check p+pi0g - Note that the yields in the top right plot vary from run to run on the order of 10-20%. <br />
** Note that these yields are sensitive to the tagger range used! This changes for different beam current settings.<br />
<br />
<html><br />
</html><br />
<br />
</div><br />
</div></div>
Aaustreg
https://halldweb.jlab.org/wiki/index.php?title=GlueX-Collaboration-Feb-2024&diff=124680
GlueX-Collaboration-Feb-2024
2024-02-13T18:53:07Z
<p>Aaustreg: /* Thursday, February 22 */</p>
<hr />
<div>== GlueX Winter 2024 Collaboration Meeting ==<br />
<br />
<font size="+1">February 21-23, 2024 Hybrid </font><br />
<br />
The GlueX Collaboration Meeting will be held jointly online and in person at Florida State University.<br />
<br />
To connect to the meeting, please go to [https://halldweb.jlab.org/wiki-private/index.php/Connect_to_ZoomGov_Meetings the listing of Zoom meetings on the private Wiki] and select the GlueX/Hall D Biweekly Meeting link at the top of the list.<br />
<br />
== Registration ==<br />
<br />
Please register for the meeting [https://forms.gle/FPo8thHY4j6THMFm9 using this link]. Note that there is a $30 registration fee for in-person participants to offset the cost of refreshments for breaks.<br />
<br />
== Travel Logistics ==<br />
<br />
The in-person session will be hosted at Florida State University. It is recommended to make travel arrangements well in advance, since the meeting coincides with the Florida state legislative session, which can place some demand on travel infrastructure.<br />
<br />
* [http://hadron.physics.fsu.edu/~sdobbs/suggested_hotels.html Suggested Hotels & Other Logistics]<br />
<br />
== Special Events ==<br />
<br />
The meeting will be immediately preceded by a [[Workshop on Polarized Target Studies with Real Photons in Hall D]], which will take place on Wednesday morning.<br />
<br />
== Agenda ==<br />
<br />
<br />
<br />
=== Wednesday, February 21 ===<br />
<br />
* '''Session I: Overview and Collaboration Business''' (Chair: Justin Stevens)<br />
** 13:30 (10+5): Opening Remarks (Matt Shepherd)<br />
** 13:45 (20+5): Hall D Status Report (Eugene Chudakov)<br />
** 14:10 (15+5): Report from the Collaboration Board (Naomi Jarvis)<br />
** 14:30 (20+5): Status of Charged Pion Polarizability Reconstruction and Analysis (Albert Fabrizi)<br />
** 14:55 (20+5): Summary of Production, Monitoring, and Calibration Activities (Naomi Jarvis)<br />
<br />
*''15:20 - 15:50: Coffee Break''<br />
<br />
* '''Session II: Engineering and FCAL2''' (Chair: )<br />
** 15:50 (20+5): Hall D Engineering Overview (Tim Whitlatch) <br />
** 16:15 (20+5): FCAL Assembly (Malte Albrecht)<br />
** 16:40 (20+5): ECAL Assembly (Sasha Somov)<br />
** 17:05 (20+5): FCAL2 Software and Analysis (Simon Taylor)<br />
** 17:30 (15+5): Tracking and Alignment Update (Simon Taylor)<br />
<br />
=== Thursday, February 22 ===<br />
<br />
* '''Session I: Software and Analysis''' (Chair: )<br />
** 9:00 (20+5): Software Report / Summary of SciComp Review (Alex Austregesilo)<br />
** 9:25 (20+5): Using AI in Analysis (Daniel Lersch)<br />
** 9:50 (20+5): Status of Hydra (Thomas Britton)<br />
** 10:15 (20+5): Report from the Physics Coordinator (Sean Dobbs)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Cross Sections''' (Chair: )<br />
** 11:10 (15+5): &eta; Cross Section (Jon Zarling)<br />
** 11:30 (20+5): &Lambda;(1405) Overview (Reinhard Schumacher)<br />
** 11:55 (20+5): &Xi; Analyses (Jesse Hernandez/Volker Crede)<br />
<br />
* ''12:20 - 13:30: Lunch''<br />
<br />
* '''Session III: Amplitude Analysis''' (Chair: )<br />
** 13:30 (20+5): a<sub>2</sub> Cross Section (Malte)<br />
** 13:55 (30+5): Ambiguities in 2 Pseudoscalar and Vector Pseudoscalar (Edmundo Barriga, Jiawei Guo, Kevin Scheuer)<br />
** 14:30 (20+5): Non-parametric PWA for 2 PS (Lawrence)<br />
** 14:55 (20+5): K<sup>+</sup>K<sup>-</sup>&pi;<sup>0</sup> Analysis (Mike Dugger)<br />
<br />
* ''15:20 - 15:40: Coffee Break''<br />
<br />
* '''Session IV: Amplitude Analysis and Beam Asymmetries''' (Chair: )<br />
** 15:40 (20+5): &omega; Dalitz Plot Analysis (Saheli Rakshit/Volker Crede)<br />
** 16:05 (20+5): &eta;' Dalitz Plot Analysis (Olga Cortes)<br />
** 16:30 (20+5): Polarized Moments (Boris)<br />
** 16:55 (20+5): &eta; Beam Asymmetry (Tolga Erbora)<br />
** 17:20 (20+5): &eta;' Beam Asymmetry (Churamani Paudel)<br />
<br />
=== Friday, February 23 ===<br />
<br />
* '''Session I: Cross Sections''' (Chair: )<br />
** 09:00 (20+5): Cross Section Systematics (Justin Stevens)<br />
** 09:25 (20+5): f<sub>1</sub>(1285) (Tyler Viducic) <br />
** 09:50 (15+5): f<sub>1</sub>(1420) (Daniel Barton)<br />
** 10:10 (20+5): &phi;&eta; (Darius Darulis)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Amplitude Analysis''' (Chair: )<br />
** 11:00 (20+5): &Delta;<sup>++</sup> SDMEs (Farah Afzal)<br />
** 11:25 (20+5): &omega;&pi;<sup>-</sup>&Delta;<sup>++</sup> (Amy Schertz)<br />
** 11:50 (20+5): &omega;&pi;<sup>0</sup> (Kevin Scheuer)<br />
<br />
* ''12:15 - 13:30: Lunch''<br />
<br />
* '''Session III: Future Plans''' (Chair: )<br />
** 13:30 (25+5): Summary of Polarized Target Workshop and Beam Polarization Proposal (Mark Dalton) <br />
** 14:00 (20+5): Charmonium in future GlueX running and Gravitational Form Factors (Lubomir Pentchev)<br />
** 14:25 (20): [Open/Discussion of GlueX-III Proposal]<br />
<br />
* ''14:45 - 15:15: Coffee Break''<br />
<br />
* '''Session IV: Cross Sections''' (Chair: )<br />
** 15:15 (20+5): J/&psi; Interpretation (Igor Strakovsky)<br />
** 15:40 (20+5): PrimEx 1 (Igal Jaegle?)<br />
** 16:05 (20+5): PrimEx 2 (Drew Smith?)<br />
** 16:30: Closing Remarks</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX-Collaboration-Feb-2024&diff=124679GlueX-Collaboration-Feb-20242024-02-13T18:51:13Z<p>Aaustreg: /* Thursday, February 22 */</p>
<hr />
<div>== GlueX Winter 2024 Collaboration Meeting ==<br />
<br />
<font size="+1">February 21&ndash;23, 2024 (Hybrid)</font><br />
<br />
The GlueX Collaboration Meeting will be held jointly online and in person at Florida State University.<br />
<br />
To connect to the meeting, please go to [https://halldweb.jlab.org/wiki-private/index.php/Connect_to_ZoomGov_Meetings the listing of Zoom meetings on the private Wiki] and select the GlueX/Hall D Biweekly Meeting link at the top of the list.<br />
<br />
== Registration ==<br />
<br />
Please register for the meeting [https://forms.gle/FPo8thHY4j6THMFm9 using this link]. Note that there is a $30 registration fee for in-person participants to offset the cost of refreshments for breaks.<br />
<br />
== Travel Logistics ==<br />
<br />
The in-person session will be hosted at Florida State University. It is recommended to make travel arrangements well in advance: the meeting takes place during the Florida state legislative session, which can place some demand on travel infrastructure.<br />
<br />
* [http://hadron.physics.fsu.edu/~sdobbs/suggested_hotels.html Suggested Hotels & Other Logistics]<br />
<br />
== Special Events ==<br />
<br />
The meeting will be immediately preceded by a [[Workshop on Polarized Target Studies with Real Photons in Hall D]], which will take place on Wednesday morning.<br />
<br />
== Agenda ==<br />
<br />
<br />
<br />
=== Wednesday, February 21 ===<br />
<br />
* '''Session I: Overview and Collaboration Business''' (Chair: Justin Stevens)<br />
** 13:30 (10+5): Opening Remarks (Matt Shepherd)<br />
** 13:45 (20+5): Hall D Status Report (Eugene Chudakov)<br />
** 14:10 (15+5): Report from the Collaboration Board (Naomi Jarvis)<br />
** 14:30 (20+5): Status of Charged Pion Polarizability Reconstruction and Analysis (Albert Fabrizi)<br />
** 14:55 (20+5): Summary of Production, Monitoring, and Calibration Activities (Naomi Jarvis)<br />
<br />
* ''15:25 - 15:55: Coffee Break''<br />
<br />
* '''Session II: Engineering and FCAL2''' (Chair: )<br />
** 15:50 (20+5): Hall D Engineering Overview (Tim Whitlatch) <br />
** 16:15 (20+5): FCAL Assembly (Malte Albrecht)<br />
** 16:40 (20+5): ECAL Assembly (Sasha Somov)<br />
** 17:05 (20+5): FCAL2 Software and Analysis (Simon Taylor)<br />
** 17:30 (15+5): Tracking and Alignment Update (Simon Taylor)<br />
<br />
=== Thursday, February 22 ===<br />
<br />
* '''Session I: Software and Analysis''' (Chair: )<br />
** 9:00 (20+5): Software Report / Summary of SciComp Review (Alex Austregesilo)<br />
** 9:25 (20+5): Using AI in Analysis (Daniel Lersch)<br />
** 9:50 (20+5): Status of Hydra (Thomas Britton)<br />
** 10:15 (20+5): Report from the Physics Coordinator (Sean Dobbs)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Cross Sections''' (Chair: )<br />
** 11:10 (15+5): &eta; Cross Section (Jon Zarling)<br />
** 11:30 (20+5): &Lambda;(1405) Overview (Reinhard Schumacher)<br />
** 11:55 (20+5): &Xi; Analyses (Jesse Hernandez/Volker Crede)<br />
<br />
* ''12:20 - 13:30: Lunch''<br />
<br />
* '''Session III: Amplitude Analysis''' (Chair: )<br />
** 13:30 (30+5): Ambiguities in 2 Pseudoscalar and Vector Pseudoscalar (Edmundo Barriga, Jiawei Guo, Kevin Scheuer)<br />
** 14:05 (20+5): a<sub>2</sub> Cross Section (Malte)<br />
** 14:30 (20+5): Non-parametric PWA for 2 PS (Lawrence)<br />
** 14:55 (20+5): K<sup>+</sup>K<sup>-</sup>&pi;<sup>0</sup> Analysis (Mike Dugger)<br />
<br />
* ''15:20 - 15:40: Coffee Break''<br />
<br />
* '''Session IV: Amplitude Analysis and Beam Asymmetries''' (Chair: )<br />
** 15:40 (20+5): &omega; Dalitz Plot Analysis (Saheli Rakshit/Volker Crede)<br />
** 16:05 (20+5): &eta;' Dalitz Plot Analysis (Olga Cortes)<br />
** 16:30 (20+5): Polarized Moments (Boris)<br />
** 16:55 (20+5): &eta; Beam Asymmetry (Tolga Erbora)<br />
** 17:20 (20+5): &eta;' Beam Asymmetry (Churamani Paudel)<br />
<br />
=== Friday, February 23 ===<br />
<br />
* '''Session I: Cross Sections''' (Chair: )<br />
** 09:00 (20+5): Cross Section Systematics (Justin Stevens)<br />
** 09:25 (20+5): f<sub>1</sub>(1285) (Tyler Viducic) <br />
** 09:50 (15+5): f<sub>1</sub>(1420) (Daniel Barton)<br />
** 10:10 (20+5): &phi;&eta; (Darius Darulis)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Amplitude Analysis''' (Chair: )<br />
** 11:00 (20+5): &Delta;<sup>++</sup> SDMEs (Farah Afzal)<br />
** 11:25 (20+5): &omega;&pi;<sup>-</sup>&Delta;<sup>++</sup> (Amy Schertz)<br />
** 11:50 (20+5): &omega;&pi;<sup>0</sup> (Kevin Scheuer)<br />
<br />
* ''12:15 - 13:30: Lunch''<br />
<br />
* '''Session III: Future Plans''' (Chair: )<br />
** 13:30 (25+5): Summary of Polarized Target Workshop and Beam Polarization Proposal (Mark Dalton) <br />
** 14:00 (20+5): Charmonium in future GlueX running and Gravitational Form Factors (Lubomir Pentchev)<br />
** 14:25 (20): [Open/Discussion of GlueX-III Proposal]<br />
<br />
* ''14:45 - 15:15: Coffee Break''<br />
<br />
* '''Session IV: Cross Sections''' (Chair: )<br />
** 15:15 (20+5): J/&psi; Interpretation (Igor Strakovsky)<br />
** 15:40 (20+5): PrimEx 1 (Igal Jaegle?)<br />
** 16:05 (20+5): PrimEx 2 (Drew Smith?)<br />
** 16:30: Closing Remarks</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX-Collaboration-Feb-2024&diff=124678GlueX-Collaboration-Feb-20242024-02-13T18:48:56Z<p>Aaustreg: /* Thursday, February 22 */</p>
<hr />
<div>== GlueX Winter 2024 Collaboration Meeting ==<br />
<br />
<font size="+1">February 21&ndash;23, 2024 (Hybrid)</font><br />
<br />
The GlueX Collaboration Meeting will be held jointly online and in person at Florida State University.<br />
<br />
To connect to the meeting, please go to [https://halldweb.jlab.org/wiki-private/index.php/Connect_to_ZoomGov_Meetings the listing of Zoom meetings on the private Wiki] and select the GlueX/Hall D Biweekly Meeting link at the top of the list.<br />
<br />
== Registration ==<br />
<br />
Please register for the meeting [https://forms.gle/FPo8thHY4j6THMFm9 using this link]. Note that there is a $30 registration fee for in-person participants to offset the cost of refreshments for breaks.<br />
<br />
== Travel Logistics ==<br />
<br />
The in-person session will be hosted at Florida State University. It is recommended to make travel arrangements well in advance: the meeting takes place during the Florida state legislative session, which can place some demand on travel infrastructure.<br />
<br />
* [http://hadron.physics.fsu.edu/~sdobbs/suggested_hotels.html Suggested Hotels & Other Logistics]<br />
<br />
== Special Events ==<br />
<br />
The meeting will be immediately preceded by a [[Workshop on Polarized Target Studies with Real Photons in Hall D]], which will take place on Wednesday morning.<br />
<br />
== Agenda ==<br />
<br />
<br />
<br />
=== Wednesday, February 21 ===<br />
<br />
* '''Session I: Overview and Collaboration Business''' (Chair: Justin Stevens)<br />
** 13:30 (10+5): Opening Remarks (Matt Shepherd)<br />
** 13:45 (20+5): Hall D Status Report (Eugene Chudakov)<br />
** 14:10 (15+5): Report from the Collaboration Board (Naomi Jarvis)<br />
** 14:30 (20+5): Status of Charged Pion Polarizability Reconstruction and Analysis (Albert Fabrizi)<br />
** 14:55 (20+5): Summary of Production, Monitoring, and Calibration Activities (Naomi Jarvis)<br />
<br />
* ''15:25 - 15:55: Coffee Break''<br />
<br />
* '''Session II: Engineering and FCAL2''' (Chair: )<br />
** 15:50 (20+5): Hall D Engineering Overview (Tim Whitlatch) <br />
** 16:15 (20+5): FCAL Assembly (Malte Albrecht)<br />
** 16:40 (20+5): ECAL Assembly (Sasha Somov)<br />
** 17:05 (20+5): FCAL2 Software and Analysis (Simon Taylor)<br />
** 17:30 (15+5): Tracking and Alignment Update (Simon Taylor)<br />
<br />
=== Thursday, February 22 ===<br />
<br />
* '''Session I: Software and Analysis''' (Chair: )<br />
** 9:00 (20+5): Software Report / Summary of SciComp Review (Alex Austregesilo)<br />
** 9:25 (20+5): Using AI in Analysis (Daniel Lersch)<br />
** 9:50 (20+5): Status of Hydra (Thomas Britton)<br />
** 10:15 (20+5): Report from the Physics Coordinator (Sean Dobbs)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Cross Sections''' (Chair: )<br />
** 11:10 (15+5): &eta; Cross Section (Jon Zarling)<br />
** 11:30 (20+5): &Lambda;(1405) Overview (Reinhard Schumacher)<br />
** 11:55 (20+5): &Xi; Analyses (Jesse Hernandez/Volker Crede)<br />
<br />
* ''12:20 - 13:30: Lunch''<br />
<br />
* '''Session III: Amplitude Analysis''' (Chair: )<br />
** 13:30 (20+5): Polarized Moments (Boris)<br />
** 13:55 (20+5): a<sub>2</sub> Cross Section (Malte)<br />
** 14:20 (20+5): Non-parametric PWA for 2 PS (Lawrence)<br />
** 14:45 (20+5): K<sup>+</sup>K<sup>-</sup>&pi;<sup>0</sup> Analysis (Mike Dugger)<br />
<br />
* ''15:10 - 15:40: Coffee Break''<br />
<br />
* '''Session IV: Amplitude Analysis and Beam Asymmetries''' (Chair: )<br />
** 15:40 (20+5): &omega; Dalitz Plot Analysis (Saheli Rakshit/Volker Crede)<br />
** 16:05 (20+5): &eta;' Dalitz Plot Analysis (Olga Cortes)<br />
** 16:30 (30+5): Ambiguities in 2 Pseudoscalar and Vector Pseudoscalar (Edmundo Barriga, Jiawei Guo, Kevin Scheuer)<br />
** 17:05 (20+5): &eta; Beam Asymmetry (Tolga Erbora)<br />
** 17:30 (20+5): &eta;’ Beam Asymmetry (Churamani Paudel)<br />
<br />
=== Friday, February 23 ===<br />
<br />
* '''Session I: Cross Sections''' (Chair: )<br />
** 09:00 (20+5): Cross Section Systematics (Justin Stevens)<br />
** 09:25 (20+5): f<sub>1</sub>(1285) (Tyler Viducic) <br />
** 09:50 (15+5): f<sub>1</sub>(1420) (Daniel Barton)<br />
** 10:10 (20+5): &phi;&eta; (Darius Darulis)<br />
<br />
* ''10:40 - 11:10: Coffee Break''<br />
<br />
* '''Session II: Amplitude Analysis''' (Chair: )<br />
** 11:00 (20+5): &Delta;<sup>++</sup> SDMEs (Farah Afzal)<br />
** 11:25 (20+5): &omega;&pi;<sup>-</sup>&Delta;<sup>++</sup> (Amy Schertz)<br />
** 11:50 (20+5): &omega;&pi;<sup>0</sup> (Kevin Scheuer)<br />
<br />
* ''12:15 - 13:30: Lunch''<br />
<br />
* '''Session III: Future Plans''' (Chair: )<br />
** 13:30 (25+5): Summary of Polarized Target Workshop and Beam Polarization Proposal (Mark Dalton) <br />
** 14:00 (20+5): Charmonium in future GlueX running and Gravitational Form Factors (Lubomir Pentchev)<br />
** 14:25 (20): [Open/Discussion of GlueX-III Proposal]<br />
<br />
* ''14:45 - 15:15: Coffee Break''<br />
<br />
* '''Session IV: Cross Sections''' (Chair: )<br />
** 15:15 (20+5): J/&psi; Interpretation (Igor Strakovsky)<br />
** 15:40 (20+5): PrimEx 1 (Igal Jaegle?)<br />
** 16:05 (20+5): PrimEx 2 (Drew Smith?)<br />
** 16:30: Closing Remarks</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124662GlueX Software Meeting, February 12, 20242024-02-12T23:16:19Z<p>Aaustreg: /* Action Items */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Incomplete draft]<br />
#* Report from the February SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-02.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with the Alma9 rollout<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through OASIS on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Improve documentation on singularity containers:<br />
#** supply Alma9 container through CVMFS<br />
#** modify batch submission scripts<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#*** Fixed, evaluate effect on analysis trees<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124642GlueX Software Meeting, February 12, 20242024-02-09T18:54:15Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#** [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=6314 Incomplete draft]<br />
#* Report from the February SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-02.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with the Alma9 rollout<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through OASIS on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124641GlueX Software Meeting, February 12, 20242024-02-09T18:53:10Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting: [https://halldweb.jlab.org/talks/2024/scicomp/SciOps+ENP%202024-02.pdf slides]<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with the Alma9 rollout<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through OASIS on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124638GlueX Software Meeting, February 12, 20242024-02-09T16:42:55Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with the Alma9 rollout<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
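A minimal sketch of how the oasis CVMFS path above could be used, and how a token-authenticated xrootd read might look (assumes the CVMFS client and the environment-modules package are available on the node; the module name "root", the token location, and the server name are illustrative assumptions, not taken from this page):<br />

```shell
# Make the oasis el9 modulefiles visible, then load ROOT from CVMFS.
# The modulefiles path is the one quoted above.
module use /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles
module avail root      # list the ROOT builds published under oasis
module load root       # hypothetical module name

# For reads from the scitokens-protected XRootD server, xrootd clients
# follow the WLCG bearer-token discovery convention:
export BEARER_TOKEN_FILE=$HOME/.gluex/xrootd.token   # hypothetical token location
xrdcp root://SERVER//cache/halld/SOME_FILE .         # server/file names are placeholders
```

Whether apps come from oasis or a JLab-hosted CVMFS repository only changes the path passed to module use.<br />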
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** Implement JANA2 in build_scripts, provide a version.xml for general testing<br />
#** Nathan will focus on the transition now<br />
#** Use the default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check the Alma9 container<br />
#** Implement the version check in v1, test with v2<br />
#** Need to test the CCDB DB version update; instructions / command needed from Dmitry (Sean)<br />
#* Geant4<br />
#** Use the newest version approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on CentOS7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124637GlueX Software Meeting, February 12, 20242024-02-09T16:42:33Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124636GlueX Software Meeting, February 12, 20242024-02-09T16:41:59Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124635GlueX Software Meeting, February 12, 20242024-02-09T16:41:45Z<p>Aaustreg: /* Agenda */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124634GlueX Software Meeting, February 12, 20242024-02-09T16:41:26Z<p>Aaustreg: </p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
#* Report from the February SciComp Meeting<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_January_29,_2024 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_February_12,_2024&diff=124633GlueX Software Meeting, February 12, 20242024-02-09T16:38:08Z<p>Aaustreg: Created page with "GlueX Software Meeting<br> Monday, February 14, 2024<br> 11:00 am EDT<br> F326/327<br> <div class="mw-collapsible mw-collapsed"> Zoom Meeting ID: 160 636 9159 Passcode: 88878..."</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, February 12, 2024<br><br />
11:00 am EST<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* release [https://halldweb.jlab.org/halld_versions/version_5.14.2.xml version_5.14.2.xml]: Successfully deployed on CentOS7 and AlmaLinux9<br />
#** [https://halldweb.jlab.org/halld_versions/version_5.14.0.xml version_5.14.0.xml]: Diracxx did not choose the right compiler on CentOS7<br />
#** [https://halldweb.jlab.org/halld_versions/version_5.14.1.xml version_5.14.1.xml]: CCDB 1.07.00 is very slow, reverted to 1.06.10<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_December_18,_2023 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or Jlab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Offline_Software_Meetings&diff=124632GlueX Offline Software Meetings2024-02-09T16:35:38Z<p>Aaustreg: /* Offline Meetings in 2024 */</p>
<hr />
<div>=Regularly Scheduled Meetings=<br />
<br />
== Offline Meetings in 2024 ==<br />
<br />
* [[GlueX Software Meeting, February 12, 2024 | February 12, 2024]]<br />
* [[GlueX Software Meeting, January 29, 2024 | January 29, 2024]]<br />
<br />
== Offline Meetings in 2023 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 18, 2023 | December 18, 2023]]<br />
* [[GlueX Software Meeting, November 20, 2023 | November 20, 2023]]<br />
* [[GlueX Software Meeting, November 6, 2023 | November 6, 2023]]<br />
* [[GlueX Software Meeting, October 23, 2023 | October 23, 2023]]<br />
* [[GlueX Software Meeting, October 9, 2023 | October 9, 2023]]<br />
* [[GlueX Software Meeting, September 11, 2023 | September 11, 2023]]<br />
|<br />
* [[GlueX Software Meeting, August 28, 2023 | August 28, 2023]]<br />
* [[GlueX Software Meeting, August 14, 2023 | August 14, 2023]]<br />
* [[GlueX Software Meeting, March 27, 2023 | March 27, 2023]]<br />
* [[GlueX Software Meeting, March 13, 2023 | March 13, 2023]]<br />
* [[GlueX Software Meeting, January 30, 2023 | January 30, 2023]]<br />
|}<br />
<br />
== Offline Meetings in 2022 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 19, 2022 | December 19, 2022]]<br />
* [[GlueX Software Meeting, November 7, 2022 | November 7, 2022]]<br />
* [[GlueX Software Meeting, October 10, 2022 | October 10, 2022]]<br />
* [[GlueX Software Meeting, August 29, 2022 | August 29, 2022]]<br />
* [[GlueX Software Meeting, August 15, 2022 | August 15, 2022]]<br />
* [[GlueX Software Meeting, July 18, 2022 | July 18, 2022]]<br />
|<br />
* [[GlueX Software Meeting, June 6, 2022 | June 6, 2022]]<br />
* [[GlueX Software Meeting, May 9, 2022 | May 9, 2022]]<br />
* [[GlueX Software Meeting, April 27, 2022 | April 27, 2022]]<br />
* [[GlueX Software Meeting, April 13, 2022 | April 13, 2022]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2022 | March 16, 2022]]<br />
* [[GlueX Software Meeting, March 2, 2022 | March 2, 2022]]<br />
* [[GlueX Software Meeting, February 16, 2022 | February 16, 2022]]<br />
* [[GlueX Software Meeting, February 2, 2022 | February 2, 2022]]<br />
* [[GlueX Software Meeting, January 18, 2022 | January 18, 2022]]<br />
|}<br />
<br />
== Offline Meetings in 2021 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 20, 2021 | December 20, 2021]]<br />
* [[GlueX Software Meeting, December 6, 2021 | December 6, 2021]]<br />
* [[GlueX Software Meeting, November 8, 2021 | November 8, 2021]]<br />
* [[GlueX Software Meeting, October 25, 2021 | October 25, 2021]]<br />
|<br />
* [[GlueX Software Meeting, October 11, 2021 | October 11, 2021]]<br />
* [[GlueX Software Meeting, September 27, 2021 | September 27, 2021]]<br />
* [[GlueX Software Meeting, August 31, 2021 | August 31, 2021]]<br />
* [[GlueX Software Meeting, August 17, 2021 | August 17, 2021]]<br />
* [[GlueX Software Meeting, July 20, 2021 | July 20, 2021]]<br />
|<br />
* [[GlueX Software Meeting, July 6, 2021 | July 6, 2021]]<br />
* [[GlueX Software Meeting, June 22, 2021 | June 22, 2021]]<br />
* [[GlueX Software Meeting, May 11, 2021 | May 11, 2021]]<br />
* [[GlueX Software Meeting, April 28, 2021 | April 28, 2021]]<br />
* [[GlueX Software Meeting, March 30, 2021 | March 30, 2021]]<br />
|<br />
* [[GlueX Software Meeting, March 16, 2021 | March 16, 2021]]<br />
* [[GlueX Software Meeting, March 2, 2021 | March 2, 2021]]<br />
* [[GlueX Software Meeting, February 2, 2021 | February 2, 2021]]<br />
* [[GlueX Software Meeting, January 19, 2021 | January 19, 2021]]<br />
* [[GlueX Software Meeting, January 5, 2021 | January 5, 2021]]<br />
|}<br />
<br />
== Offline Meetings in 2020 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 8, 2020 | December 8, 2020]]<br />
* [[GlueX Software Meeting, November 24, 2020 | November 24, 2020]]<br />
* [[GlueX Software Meeting, November 10, 2020 | November 10, 2020]]<br />
* [[GlueX Software Meeting, October 13, 2020 | October 13, 2020]]<br />
|<br />
* [[GlueX Software Meeting, September 29, 2020 | September 29, 2020]]<br />
* [[GlueX Software Meeting, September 15, 2020 | September 15, 2020]]<br />
* [[GlueX Software Meeting, September 1, 2020 | September 1, 2020]]<br />
* [[GlueX Software Meeting, August 18, 2020 | August 18, 2020]]<br />
* [[GlueX Software Meeting, August 4, 2020 | August 4, 2020]]<br />
* [[GlueX Software Meeting, July 21, 2020 | July 21, 2020]]<br />
|<br />
* [[GlueX Software Meeting, July 7, 2020 | July 7, 2020]]<br />
* [[GlueX Software Meeting, June 9, 2020 | June 9, 2020]]<br />
* [[GlueX Software Meeting, May 26, 2020 | May 26, 2020]]<br />
* [[GlueX Software Meeting, April 28, 2020 | April 28, 2020]]<br />
* [[GlueX Software Meeting, April 14, 2020 | April 14, 2020]]<br />
* [[GlueX Software Meeting, March 31, 2020 | March 31, 2020]]<br />
|<br />
* [[GlueX Software Meeting, March 17, 2020 | March 17, 2020]]<br />
* [[GlueX Software Meeting, March 3, 2020 | March 3, 2020]]<br />
* [[GlueX Software Meeting, February 18, 2020 | February 18, 2020]]<br />
* [[GlueX Software Meeting, February 4, 2020 | February 4, 2020]] <br />
* [[GlueX Software Meeting, January 21, 2020|January 21, 2020]]<br />
* [[GlueX Software Meeting, January 7, 2020|January 7, 2020]]<br />
|}<br />
<br />
== Offline Meetings in 2019 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 10, 2019|December 10, 2019]]<br />
* [[GlueX Software Meeting, November 26, 2019|November 26, 2019]]<br />
* [[GlueX Software Meeting, November 12, 2019|November 12, 2019]]<br />
* [[GlueX Software Meeting, October 29, 2019|October 29, 2019]]<br />
* [[GlueX Software Meeting, October 15, 2019|October 15, 2019]]<br />
|<br />
* [[GlueX Software Meeting, September 17, 2019|September 17, 2019]]<br />
* [[GlueX Software Meeting, September 3, 2019|September 3, 2019]]<br />
* [[GlueX Software Meeting, August 20, 2019|August 20, 2019]]<br />
* [[GlueX Software Meeting, August 6, 2019|August 6, 2019]]<br />
* [[GlueX Software Meeting, July 23, 2019|July 23, 2019]]<br />
|<br />
* [[GlueX Software Meeting, July 9, 2019|July 9, 2019]]<br />
* [[GlueX Software Meeting, June 25, 2019|June 25, 2019]]<br />
* [[GlueX Software Meeting, June 11, 2019|June 11, 2019]]<br />
* [[GlueX Software Meeting, May 28, 2019|May 28, 2019]]<br />
* [[GlueX Software Meeting, April 30, 2019|April 30, 2019]]<br />
|<br />
* [[GlueX Software Meeting, April 16, 2019|April 16, 2019]]<br />
* [[GlueX Software Meeting, March 5, 2019|March 5, 2019]]<br />
* [[GlueX Software Meeting, February 5, 2019|February 5, 2019]]<br />
* [[GlueX Software Meeting, January 22, 2019|January 22, 2019]]<br />
* [[GlueX Software Meeting, January 8, 2019|January 8, 2019]]<br />
|}<br />
<br />
== Offline Meetings in 2018 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Software Meeting, December 11, 2018|December 11, 2018]]<br />
* [[GlueX Software Meeting, November 13, 2018|November 13, 2018]]<br />
* [[GlueX Software Meeting, October 30, 2018|October 30, 2018]]<br />
* [[GlueX Offline Software Meeting, October 16, 2018|October 16, 2018]]<br />
* [[GlueX Offline Software Meeting, October 2, 2018|October 2, 2018]]<br />
* [[GlueX Offline Meeting, September 18, 2018|September 18, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, September 4, 2018|September 4, 2018]]<br />
* [[GlueX Offline Meeting, August 21, 2018|August 21, 2018]]<br />
* [[GlueX Offline Meeting, August 7, 2018|August 7, 2018]]<br />
* [[GlueX Offline Meeting, July 24, 2018|July 24, 2018]]<br />
* [[GlueX Offline Meeting, July 13, 2018|July 13, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, June 29, 2018|June 29, 2018]]<br />
* [[GlueX Offline Meeting, June 15, 2018|June 15, 2018]]<br />
* [[GlueX Offline Meeting, June 1, 2018|June 1, 2018]]<br />
* [[GlueX Offline Meeting, May 18, 2018|May 18, 2018]]<br />
* [[GlueX Offline Meeting, May 4, 2018|May 4, 2018]]<br />
|<br />
* [[GlueX Offline Meeting, April 6, 2018|April 6, 2018]]<br />
* [[GlueX Offline Meeting, March 9, 2018|March 9, 2018]]<br />
* [[GlueX Offline Meeting, February 9, 2018|February 9, 2018]]<br />
* [[GlueX Offline Meeting, January 26, 2018|January 26, 2018]]<br />
* [[GlueX Offline Meeting, January 10, 2018|January 10, 2018]]<br />
|}<br />
<br />
== Offline Meetings in 2017 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 13, 2017|December 13, 2017]]<br />
* [[GlueX Offline Meeting, November 29, 2017|November 29, 2017]]<br />
* [[GlueX Offline Meeting, November 15, 2017|November 15, 2017]]<br />
* [[GlueX Offline Meeting, November 1, 2017|November 1, 2017]]<br />
* [[GlueX Offline Meeting, October 4, 2017|October 4, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, September 20, 2017|September 20, 2017]]<br />
* [[GlueX Offline Meeting, September 6, 2017|September 6, 2017]]<br />
* [[GlueX Offline Meeting, August 23, 2017|August 23, 2017]]<br />
* [[GlueX Offline Meeting, August 9, 2017|August 9, 2017]]<br />
* [[GlueX Offline Meeting, July 26, 2017|July 26, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, July 12, 2017|July 12, 2017]]<br />
* [[GlueX Offline Meeting, June 28, 2017|June 28, 2017]]<br />
* [[GlueX Offline Meeting, June 14, 2017|June 14, 2017]]<br />
* [[GlueX Offline Meeting, May 31, 2017|May 31, 2017]]<br />
* [[GlueX Offline Meeting, April 19, 2017|April 19, 2017]]<br />
|<br />
* [[GlueX Offline Meeting, March 22, 2017|March 22, 2017]]<br />
* [[GlueX Offline Meeting, March 8, 2017|March 8, 2017]]<br />
* [[GlueX Offline Meeting, February 22, 2017|February 22, 2017]]<br />
* [[GlueX Offline Meeting, February 1, 2017|February 1, 2017]]<br />
* [[GlueX Offline Meeting, January 18, 2017|January 18, 2017]]<br />
|}<br />
<br />
== Offline Meetings in 2016 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 21, 2016|December 21, 2016]]<br />
* [[GlueX Offline Meeting, December 7, 2016|December 7, 2016]]<br />
* [[GlueX Offline Meeting, November 9, 2016|November 9, 2016]]<br />
* [[GlueX Offline Meeting, October 26, 2016|October 26, 2016]]<br />
* [[GlueX Offline Meeting, October 12, 2016|October 12, 2016]]<br />
* [[GlueX Offline Meeting, September 28, 2016|September 28, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, September 14, 2016|September 14, 2016]]<br />
* [[GlueX Offline Meeting, August 31, 2016|August 31, 2016]]<br />
* [[GlueX Offline Meeting, August 17, 2016|August 17, 2016]]<br />
* [[GlueX Offline Meeting, August 3, 2016|August 3, 2016]]<br />
* [[GlueX Offline Meeting, July 20, 2016|July 20, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, July 6, 2016|July 6, 2016]]<br />
* [[GlueX Offline Meeting, June 8, 2016|June 8, 2016]]<br />
* [[GlueX Offline Meeting, May 25, 2016|May 25, 2016]]<br />
* [[GlueX Offline Meeting, April 27, 2016|April 27, 2016]]<br />
* [[GlueX Offline Meeting, April 13, 2016|April 13, 2016]]<br />
|<br />
* [[GlueX Offline Meeting, March 30, 2016|March 30, 2016]]<br />
* [[GlueX Offline Meeting, March 2, 2016|March 2, 2016]]<br />
* [[GlueX Offline Meeting, February 3, 2016|February 3, 2016]]<br />
* [[GlueX Offline Meeting, January 20, 2016|January 20, 2016]]<br />
* [[GlueX Offline Meeting, January 6, 2016|January 6, 2016]]<br />
|}<br />
<br />
== Offline Meetings in 2015 ==<br />
<br />
{|<br />
|-<br />
|<br />
* [[GlueX Offline Meeting, December 9, 2015|December 9, 2015]]<br />
* [[GlueX Offline Meeting, November 11, 2015|November 11, 2015]]<br />
* [[GlueX Offline Meeting, October 28, 2015|October 28, 2015]]<br />
* [[GlueX Offline Meeting, October 14, 2015|October 14, 2015]]<br />
* [[GlueX Offline Meeting, September 30, 2015|September 30, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, September 16, 2015|September 16, 2015]]<br />
* [[GlueX Offline Meeting, September 2, 2015|September 2, 2015]]<br />
* [[GlueX Offline Meeting, August 19, 2015|August 19, 2015]]<br />
* [[GlueX Offline Meeting, August 5, 2015|August 5, 2015]]<br />
* [[GlueX Offline Meeting, July 22, 2015|July 22, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, July 8, 2015|July 8, 2015]]<br />
* [[GlueX Offline Meeting, June 24, 2015|June 24, 2015]]<br />
* [[GlueX Offline Meeting, June 10, 2015|June 10, 2015]]<br />
* [[GlueX Offline Meeting, May 27, 2015|May 27, 2015]]<br />
* [[GlueX Offline Meeting, April 29, 2015|April 29, 2015]]<br />
* [[GlueX Offline Meeting, April 15, 2015|April 15, 2015]]<br />
|<br />
* [[GlueX Offline Meeting, April 1, 2015|April 1, 2015]]<br />
* [[GlueX Offline Meeting, March 18, 2015|March 18, 2015]]<br />
* [[GlueX Offline Meeting, March 4, 2015|March 4, 2015]]<br />
* [[GlueX Offline Meeting, February 4, 2015|February 4, 2015]]<br />
* [[GlueX Offline Meeting, January 21, 2015|January 21, 2015]]<br />
* [[GlueX Offline Meeting, January 7, 2015|January 7, 2015]]<br />
|}<br />
<br />
== Offline Meetings in 2014 ==<br />
<br />
<table><tr><td><br />
* [[GlueX Offline Meeting, December 10, 2014|December 10, 2014]]<br />
* [[GlueX Offline Meeting, November 12, 2014|November 12, 2014]]<br />
* [[GlueX Offline Meeting, October 29, 2014|October 29, 2014]]<br />
* [[GlueX Offline Meeting, October 15, 2014|October 15, 2014]]<br />
* [[GlueX Offline Meeting, September 17, 2014|September 17, 2014]]<br />
* [[GlueX Offline Meeting, September 3, 2014|September 3, 2014]]<br />
* [[GlueX Offline Meeting, August 20, 2014|August 20, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, August 6, 2014|August 6, 2014]]<br />
* [[GlueX Offline Meeting, July 23, 2014|July 23, 2014]]<br />
* [[GlueX Offline Meeting, July 9, 2014|July 9, 2014]]<br />
* [[GlueX Offline Meeting, June 25, 2014|June 25, 2014]]<br />
* [[GlueX Offline Meeting, June 11, 2014|June 11, 2014]]<br />
* [[GlueX Offline Meeting, May 28, 2014|May 28, 2014]]<br />
* [[GlueX Offline Meeting, April 30, 2014|April 30, 2014]]<br />
</td><td><br />
* [[GlueX Offline Meeting, April 16, 2014|April 16, 2014]]<br />
* [[GlueX Offline Meeting, April 2, 2014|April 2, 2014]]<br />
* [[GlueX Offline Meeting, March 19, 2014|March 19, 2014]]<br />
* [[GlueX Offline Meeting, March 5, 2014|March 5, 2014]] (canceled, JLab network outage)<br />
* [[GlueX Offline Meeting, February 5, 2014|February 5, 2014]]<br />
* [[GlueX Offline Meeting, January 22, 2014|January 22, 2014]]<br />
* [[GlueX Offline Meeting, January 8, 2014|January 8, 2014]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2013 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 11, 2013|December 11, 2013]]<br />
* [[GlueX Offline Meeting, November 13, 2013|November 13, 2013]]<br />
* [[GlueX Offline Meeting, October 30, 2013|October 30, 2013]]<br />
* [[GlueX Offline Meeting, October 16, 2013|October 16, 2013]]<br />
* [[GlueX Offline Meeting, September 18, 2013|September 18, 2013]]<br />
* [[GlueX Offline Meeting, September 4, 2013|September 4, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 21, 2013|August 21, 2013]]<br />
* [[GlueX Offline Meeting, August 7, 2013|August 7, 2013]]<br />
* [[GlueX Offline Meeting, July 24, 2013|July 24, 2013]]<br />
* [[GlueX Offline Meeting, June 26, 2013|June 26, 2013]]<br />
* [[GlueX Offline Meeting, June 12, 2013|June 12, 2013]]<br />
* [[GlueX Offline Meeting, May 15, 2013|May 15, 2013]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 1, 2013|May 1, 2013]]<br />
* [[GlueX Offline Meeting, April 17, 2013|April 17, 2013]]<br />
* [[GlueX Offline Meeting, April 3, 2013|April 3, 2013]]<br />
* [[GlueX Offline Meeting, March 20, 2013|March 20, 2013]]<br />
* [[GlueX Offline Meeting, February 6, 2013|February 6, 2013]]<br />
* [[GlueX Offline Meeting, January 23, 2013|January 23, 2013]]<br />
* [[GlueX Offline Meeting, January 9, 2013|January 9, 2013]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2012 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 12, 2012|December 12, 2012]]<br />
* [[GlueX Offline Meeting, November 28, 2012|November 28, 2012]] (ARC 428)<br />
* [[GlueX Offline Meeting, November 14, 2012|November 14, 2012]]<br />
* [[GlueX Offline Meeting, October 31, 2012|October 31, 2012]]<br />
* [[GlueX Offline Meeting, October 17, 2012|October 17, 2012]]<br />
* [[GlueX Offline Meeting, October 3, 2012|October 3, 2012]]<br />
* [[GlueX Offline Meeting, September 19, 2012|September 19, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, September 5, 2012|September 5, 2012]]<br />
* [[GlueX Offline Meeting, August 22, 2012|August 22, 2012]]<br />
* [[GlueX Offline Meeting, August 8, 2012|August 8, 2012]]<br />
* [[GlueX Offline Meeting, July 25, 2012|July 25, 2012]]<br />
* [[GlueX Offline Meeting, July 11, 2012|July 11, 2012]]<br />
* [[GlueX Offline Meeting, June 27, 2012|June 27, 2012]]<br />
* [[GlueX Offline Meeting, June 13, 2012|June 13, 2012]]<br />
* [[GlueX Offline Meeting, May 30, 2012|May 30, 2012]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, May 16, 2012|May 16, 2012]]<br />
* [[GlueX Offline Meeting, April 18, 2012|April 18, 2012]]<br />
* [[GlueX Offline Meeting, March 21, 2012|March 21, 2012]]<br />
* [[GlueX Offline Meeting, February 22, 2012|February 22, 2012]]<br />
* [[GlueX Offline Meeting, February 8, 2012|February 8, 2012]]<br />
* [[GlueX Offline Meeting, January 25, 2012|January 25, 2012]]<br />
</td><td width=250><br />
</td></tr></table><br />
<br />
== Offline Meetings in 2011 ==<br />
<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 14, 2011|December 14, 2011]]<br />
* [[GlueX Offline Meeting, November 30, 2011|November 30, 2011]]<br />
* [[GlueX Offline Meeting, November 16, 2011|November 16, 2011]]<br />
* [[GlueX Offline Meeting, November 2, 2011|November 2, 2011]]<br />
* [[GlueX Offline Meeting, October 19, 2011|October 19, 2011]] (canceled: Lehman Review)<br />
* [[GlueX Offline Meeting, September 21, 2011|September 21, 2011]]<br />
* [[GlueX Offline Meeting, September 7, 2011|September 7, 2011]]<br />
* [[GlueX Offline Meeting, August 24, 2011|August 24, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2011|August 10, 2011]]<br />
* [[GlueX Offline Meeting, July 27, 2011|July 27, 2011]]<br />
* [[GlueX Offline Meeting, July 13, 2011|July 13, 2011]]<br />
* [[GlueX Offline Meeting, June 29, 2011|June 29, 2011]]<br />
* [[GlueX Offline Meeting, June 15, 2011|June 15, 2011]]<br />
* [[GlueX Offline Meeting, June 1, 2011|June 1, 2011]]<br />
* [[GlueX Offline Meeting, May 18, 2011|May 18, 2011]]<br />
* [[GlueX Offline Meeting, April 20, 2011|April 20, 2011]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 6, 2011|April 6, 2011]]<br />
* [[GlueX Offline Meeting, March 23, 2011|March 23, 2011]]<br />
* [[GlueX Offline Meeting, March 9, 2011|March 9, 2011]]<br />
* [[GlueX Offline Meeting, February 23, 2011|February 23, 2011]]<br />
* [[GlueX Offline Meeting, February 9, 2011|February 9, 2011]]<br />
* [[GlueX Offline Meeting, January 26, 2011|January 26, 2011]]<br />
* [[GlueX Offline Meeting, January 12, 2011|January 12, 2011]]<br />
</td></tr></table><br />
<br />
== Offline Meetings in 2010 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2010|December 15, 2010]]<br />
* [[GlueX Offline Meeting, December 1, 2010|December 1, 2010]]<br />
* [[GlueX Offline Meeting, November 17, 2010|November 17, 2010]]<br />
* [[GlueX Offline Meeting, November 2, 2010|November 2, 2010]]<br />
* [[GlueX Offline Meeting, October 19, 2010|October 19, 2010]]<br />
* [[GlueX Offline Meeting, October 5, 2010|October 5, 2010]]<br />
* [[GlueX Offline Meeting, September 21, 2010|September 21, 2010]]<br />
* [[GlueX Offline Meeting, August 24, 2010|August 24, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, August 10, 2010|August 10, 2010]]<br />
* [[GlueX Offline Meeting, July 27, 2010|July 27, 2010]]<br />
* [[GlueX Offline Meeting, July 13, 2010|July 13, 2010]]<br />
* [[GlueX Offline Meeting, June 29, 2010|June 29, 2010]]<br />
* [[GlueX Offline Meeting, June 15, 2010|June 15, 2010]]<br />
* [[GlueX Offline Meeting, June 1, 2010|June 1, 2010]]<br />
* [[GlueX Offline Meeting, May 18, 2010|May 18, 2010]]<br />
* [[GlueX Offline Meeting, May 4, 2010|May 4, 2010]]<br />
</td><td width=250><br />
* [[GlueX Offline Meeting, April 20, 2010|April 20, 2010]]<br />
* [[GlueX Offline Meeting, April 6, 2010|April 6, 2010]]<br />
* [[GlueX Offline Meeting, March 23, 2010|March 23, 2010]]<br />
* [[GlueX Offline Meeting, March 9, 2010|March 9, 2010]]<br />
* [[GlueX Offline Meeting, February 23, 2010|February 23, 2010]]<br />
* [[GlueX Offline Meeting, February 9, 2010|February 9, 2010]]<br />
* [[GlueX Offline Meeting, January 12, 2010|January 12, 2010]]<br />
</td></tr></table><br />
== Offline Meetings in 2009 ==<br />
<table><tr><td width=250><br />
* [[GlueX Offline Meeting, December 15, 2009|December 15, 2009]]<br />
* [[GlueX Offline Meeting, December 1, 2009|December 1, 2009]]<br />
* [[GlueX Offline Meeting, November 17, 2009|November 17, 2009]]<br />
* [[GlueX Offline Meeting, November 4, 2009|November 4, 2009]]<br />
* [[GlueX Offline Meeting, October 21, 2009|October 21, 2009]]<br />
</td><td width=250><br />
* [[ October 7, 2009 Software ]]<br />
* [[ September 23, 2009 Software ]]<br />
* [[ August 26, 2009 Software ]]<br />
* [[ August 12, 2009 Software ]]<br />
* [[ July 29, 2009 Software ]]<br />
* [[ July 1, 2009 Software ]]<br />
* [[ June 17, 2009 Software ]]<br />
</td><td width=250><br />
* [[ May 20, 2009 Software ]]<br />
* [[ May 6, 2009 Software ]]<br />
* [[ April 22, 2009 Software ]]<br />
* [[ April 8, 2009 Software ]]<br />
* [[ March 11, 2009 Software ]]<br />
* [[ Feburary 25, 2009 Software | February 25, 2009 Software ]]<br />
* [[ February 11, 2009 Software ]]<br />
* [[ January 14, 2009 Software ]]<br />
</td></tr></table><br />
== Offline Meetings in 2008 ==<br />
<table><tr><td width=250><br />
* [[ December 17, 2008 Software ]]<br />
* [[ December 3, 2008 Software ]]<br />
* [[ November 18, 2008 Software ]]<br />
* [[ October 8, 2008 Software ]]<br />
* <s>[[ September 12, 2008 Software ]]</s><br />
</td><td width=250><br />
* [[ August 29, 2008 Software ]]<br />
* [[ August 15, 2008 Software ]]<br />
* [[ August 1, 2008 Software ]]<br />
* [[ July 18, 2008 Software ]]<br />
* [[ July 3, 2008 Software ]]<br />
* [[ June. 6, 2008 Software | June 6, 2008 Software ]]<br />
</td><td width=250><br />
* [[ May. 23, 2008 Software | May 23, 2008 Software ]]<br />
* [[ Feb. 29, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 22, 2008 Tracking CDC/FDC ]]<br />
* [[ Feb. 15, 2008 Tracking CDC/FDC ]]<br />
* [[February 8, 2008 Software]]<br />
* [[January 25, 2008 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2007 ==<br />
<table><tr><td width=250><br />
* [[December 7, 2007 Software]]<br />
* [[November 30, 2007 Software]]<br />
* [[November 13, 2007 Software]]<br />
* [[October 19, 2007 Software]]<br />
* [[September 21,2007 Software|September 21, 2007 Software]]<br />
* [[September 11,2007 Software|September 11, 2007 Software]]<br />
* [[August 21,2007 Software|August 21, 2007 Software]]<br />
* [[August 14, 2007 Software]]<br />
</td><td width=250><br />
* [[July 31, 2007 Software]]<br />
* [[July 17, 2007 Software]]<br />
* [[June 5, 2007 Software]]<br />
* [[May 22, 2007 Software]]<br />
* [[May 1, 2007 Software]]<br />
* [[April 17, 2007 Software]]<br />
* [[April 10, 2007 Software]]<br />
* [[March 20, 2007 Software]]<br />
</td><td width=250><br />
* [[March 13, 2007 Software]]<br />
* [[February 27, 2007 Software]]<br />
* [[February 20, 2007 Software]]<br />
* [[February 13, 2007 Software]]<br />
* [[February 6, 2007 Software]]<br />
* [[January 30, 2007 Software]]<br />
* [[January 16, 2007 Software]]<br />
* [[January 8, 2007 Software]]<br />
</td></tr></table><br />
== Offline Meetings in 2006 ==<br />
<table><tr><td width=250><br />
* [[December 18, 2006 Software]]<br />
* [[December 11, 2006 Software]]<br />
* [[December 4, 2006 Software]]<br />
</td><td width=250><br />
* [[September 6, 2006 Software]]<br />
* [[August 28, 2006 Software]]<br />
* [[August 14, 2006 Software]]<br />
* [[August 7, 2006 Software]]<br />
</td><td width=250><br />
* [[July 31, 2006 Software]]<br />
* [[July 10, 2006 Software]]<br />
* [[July 5,2006 Software|July 5, 2006 Software]]<br />
* [[May 8, 2006 Software]]<br />
</td></tr></table><br />
<br />
=Special Meetings=<br />
* [[fADC Emulation Meeting, August 26, 2015]]<br />
* [[Data Plan Meeting, February 13, 2014]]<br />
* [[Particle Decay Chain Meeting, September 11, 2013]]<br />
* [[GlueX and the OSG, Meeting on Resource Contribution, March 31, 2017]]</div>
Aaustreg
https://halldweb.jlab.org/wiki/index.php?title=GlueX_Offline_Software&diff=124508
GlueX Offline Software
2024-01-31T22:52:25Z
<p>Aaustreg: /* GlueX and Containers */</p>
<hr />
<div>{| border<br />
|-<br />
|<br />
'''News:'''<br />
* July 19, 2022: [[GlueX_Software_Meeting,_July_18,_2022#Minutes|Minutes of the July 18th Software Meeting]] are available<br />
* July 12, 2022: [https://halldweb.jlab.org/halld_versions/version_5.8.0.xml New version set: 5.8.0].<br />
* June 7, 2022: [[GlueX_Software_Meeting,_June_6,_2022#Minutes|Minutes of the June 6th Software Meeting]] are available<br />
* May 23, 2022: [https://halldweb.jlab.org/wiki/index.php/GlueX_Tutorial_2022 GlueX Software and Analysis Tutorial 2022]<br />
* May 19, 2022: [https://mailman.jlab.org/pipermail/halld-offline/2022-May/008816.html New version set: 5.7.1].<br />
* [[Offline Software News Archive|Previous news items...]]<br />
|<br />
'''Quick Links:'''<br />
* [https://scicomptest.jlab.org/scicomp/tapeFile Listing of files missing from the tape library]<br />
* [[GlueX Offline FAQ|Frequently Asked Questions]]<br />
* [[Offline HOWTO List|HOWTO List]]<br />
* [[GlueX_Offline_FAQ#Where_do_I_find_version_set_files.3F|Version Set Files]]<br />
* [https://github.com/orgs/JeffersonLab/teams/gluex/repositories Repositories on GitHub]<br />
* [[GlueX Offline Software Meetings]]<br />
|}<br />
<br />
The legacy version of this page is [[Offline Software|here]].<br />
<br />
==General Information==<br />
<br />
[https://halldweb.jlab.org/docs/build_scripts_web/ Build Scripts: A Version Management System for GlueX] describes a standard directory structure for GlueX software, how to create complete or partial builds, and how to specify versions of individual packages, both for building and for use. It is also available as [https://halldweb.jlab.org/doc-public/DocDB/ShowDocument?docid=2793 GlueX Note 2793].<br />
<br />
See below for other helpful links.<br />
<br />
===Shell Environment Set-Up===<br />
<br />
* [https://halldweb.jlab.org/docs/build_scripts_web/node6.html#SECTION00062400000000000000 Simple Environment Set Up]<br />
<br />
===Building GlueX Software===<br />
<br />
* [[Building_Private_Versions_of_GlueX_Software:_my_sim-recon.sh|Building private versions of GlueX software at JLab]]<br />
* [https://halldweb.jlab.org/docs/build_scripts_web/node10.html Scripts for Installing GlueX Software]: Do a complete build of all standard components of GlueX software or add new versions to an existing build tree.<br />
* [[Hall D Package Manager|hdpm - Hall D Package Manager]]: A package manager for Hall-D software, which provides an alternative method for installing GlueX software.<br />
* [[Legacy Build Instructions]]: Guides to various aspects of the build process. Some of these may be out of date.<br />
* [[GlueX_Offline_FAQ#Where_do_I_find_version_set_files.3F|Version Set Files]]<br />
<br />
=== GlueX and Containers ===<br />
<br />
* [[GlueX Software on Oasis]]<br />
* [https://github.com/JeffersonLab/hd_singularity Package for Building Singularity Containers]<br />
* [[GlueX Containers Meetings]]<br />
* [[HOWTO use the GlueX Singularity Container]]<br />
<br />
==Software Documentation==<br />
<br />
Packages used in the GlueX software stack with links to package-specific documentation.<br />
<br />
===GlueX Software===<br />
<br />
Documentation for software packages specific to GlueX or to JLab.<br />
* A brief overview of the GlueX-specific software can be found [[ Software | here ]]<br />
* For best practices in writing code for the HallD/GlueX software environment, see the [[Coding Conventions| '''''HallD/GlueX Coding Conventions''''']]<br />
* A convenient way to enhance the impact of comments inside the code is to use [https://halldweb.jlab.org/talks/2021/usedoxy.pdf '''''Doxygen''''']. The hooks and infrastructure are already in place for the packages ''halld_recon'' and ''halld_sim''.<br />
* [[Documentation Initiative]]<br />
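The Doxygen bullet above can be illustrated with a minimal C++ sketch. The function and its parameters are hypothetical (not taken from ''halld_recon'' or ''halld_sim''); the point is only the markup: Doxygen collects the <code>///</code> blocks and the <code>\brief</code>/<code>\param</code>/<code>\return</code> tags to build the generated reference pages.

```cpp
// Hypothetical example of Doxygen-style comments; the function below is
// illustrative only and not part of the GlueX code base.

/// \brief Correct a raw TDC time for the event start time.
///
/// \param rawTime  raw TDC time in ns
/// \param t0       event start time in ns
/// \return the start-time-corrected time in ns
double CorrectTime(double rawTime, double t0)
{
    // Ordinary comments like this one are ignored by Doxygen;
    // only the triple-slash block above is picked up.
    return rawTime - t0;
}
```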
<br />
====Software Packages====<br />
<br />
* [https://github.com/mashephe/AmpTools AmpTools]: Amplitude analysis package (aka partial wave analysis)<br />
* [[Calibration Database|CCDB]]: Calibration Database<br />
** [[Policy on CCDB Variations for Reconstructing Simulated Data]]<br />
* [https://github.com/sdobbs/HDEventStore EventStore]: a package for managing and deploying data files and event lists<br />
* [https://coda.jlab.org/drupal/content/event-io-evio EVIO]: The [https://coda.jlab.org/drupal/ CODA] event format.<br />
* [https://github.com/jeffersonlab/halld_recon halld_recon]<br />
* [https://github.com/jeffersonlab/halld_sim halld_sim]<br />
* [[HDDM Programmer's Interface|HDDM]]: The Hall D Data Model, a compact XML-based format for event-based data.<br />
* HDDS: Detector geometry specification<br />
** [https://halldsvn.jlab.org/repos/trunk/hdds/HDDS-1_1.xsd HDDS Schema]: Includes description of the HDDS mark-up language.<br />
** [https://halldweb.jlab.org/doc-public/DocDB/ShowDocument?docid=64 Geometry Specification for Hall D]: General Description and Philosophy, September 2003<br />
** [https://halldweb.jlab.org/doc-public/DocDB/ShowDocument?docid=654 Detector Models for GlueX Monte Carlo Simulation: A Status Report], June 2006<br />
** [https://halldweb.jlab.org/doc-public/DocDB/ShowDocument?docid=732 Detector Models for GlueX Monte Carlo Simulation: the CD2 Baseline], January 2007<br />
** [[HDDS Tagged Releases]]<br />
* [[HOWTO install and run HDGeant4|HDGeant4]]: Geant4-based simulation of the GlueX Detector and Hall D beamline<br />
* [https://www.jlab.org/JANA/ JANA]<br />
* MCwrapper<br />
**[https://halldweb.jlab.org/gluex_sim/Dashboard.html '''Active Sample Dashboard''']<br />
**[https://halldweb.jlab.org/gluex_sim/SubmitSim.html '''MC Submission Form''']<br />
* [https://pypwa.jlab.org/index.html PyPWA:] [https://pypwa.readthedocs.io/ Python-based Partial Wave Analysis Toolkit]<br />
* [https://github.com/JeffersonLab/rcdb/wiki RCDB]: Run Conditions Database<br />
** [https://halldweb.jlab.org/rcdb/ Web Interface]<br />
** [[GlueX Implementation of the RCDB]]<br />
* sim-recon (deprecated, July 2018)<br />
** [http://www.jlab.org/Hall-D/software/HDSoftware_Documentation/ Doxygen Documentation]: Descriptions of sim-recon C++ classes and their relationships.<br />
** [[Sim-Recon Tagged Releases]]<br />
** [[GlueX_Analysis_Software | Analysis Software in sim-recon]]<br />
** [[SCons Build System]] (SBMS)<br />
<br />
====Special Topics====<br />
<br />
* [[Offline HOWTO List]]: Guides to performing various specific tasks.<br />
* [[Beam Simulations]]: Geant simulations of the Hall D Tagger Hall<br />
* [[Magnetic Field Maps for Solenoid]]<br />
* [[Experimental_Sensitivity_to_Solenoidal_Field|Sensitivity to Magnetic Field Strength]]<br />
* [[How HDGeant defines time-zero for physics events]]<br />
* [[Guide to Monte Carlo event timing and detached vertices in HDGeant/4]]<br />
* [[Guide to roll-your-own python hddm transforms]]<br />
* [[Geometry]]<br />
* [[Reconstruction Software]]<br />
* [[Kinematic Fitting]]<br />
* [[Splitoff_Information | Material on Splitoffs]]<br />
<br />
===Documentation for External Software Packages===<br />
<br />
Documentation on additional software packages used by GlueX. Development and maintenance of these are not directly related to GlueX.<br />
<br />
{|<br />
|-<br />
|<br />
* [http://cernlib.web.cern.ch/cernlib/ CERNLIB]<br />
* [http://proj-clhep.web.cern.ch/proj-clhep/ CLHEP]<br />
* [https://en.wikipedia.org/wiki/C%2B%2B C++]<br />
|<br />
* [http://www.stack.nl/~dimitri/doxygen/ Doxygen]<br />
* [https://halldweb.jlab.org/manuals/geant.pdf GEANT 3 (manual)]<br />
* [http://geant4.cern.ch/ GEANT4]<br />
* [https://git-scm.com/ Git]<br />
|<br />
* [http://www.gnu.org/software/make/ GNU Make]<br />
* [http://www.mysql.com/ MySQL]<br />
* [https://www.python.org/ Python]<br />
|<br />
* [http://root.cern.ch/ ROOT]<br />
* [http://www.scons.org/ SCons]<br />
* [http://www.sqlite.org/ SQLite]<br />
|<br />
* [https://subversion.apache.org/ Subversion]<br />
* [http://xerces.apache.org/ Xerces]<br />
* [http://www.w3.org/TR/xmlschema11-1/ XSD]<br />
|}<br />
<br />
==Data Sets==<br />
<br />
* [[Simulations]]: a guide to simulated data sets.<br />
<br />
==Offline Data Monitoring==<br />
<br />
* [[Data_Monitoring_Procedures]]: Information on offline monitoring of recently-taken data<br />
<br />
==Computing Facilities==<br />
<br />
===JLab===<br />
* [https://halldweb.jlab.org/disk_management/halld_status.html Hall D Offline System Status Plots] [[File:User jobs 0 thumb.png|link=https://halldweb.jlab.org/disk_management/halld_status.html]]<br />
* [[Computing Services and Servers|Hall D and JLab Servers]]: Web servers, database servers, etc.<br />
* [https://cc.jlab.org/ Computer Center]<br />
* [https://scicomptest.jlab.org/scicomp/ Scientific Computing]<br />
** [https://scicomp.jlab.org/scicomp/#/auger/jobs Farm Job Status] Auger job status from SciComp<br />
*** [https://halldweb.jlab.org/cgi-bin/jproj_status.pl jproj Job Status] Status of jobs using the jproj system<br />
** [https://scicomp.jlab.org/scicomp/#/jasmine/jobs Tape Request Status]<br />
* Disk space usage<br />
** Group<br />
*** [https://halldweb.jlab.org/disk_management/group_report.html /group/halld leader board]<br />
** [https://scicomp.jlab.org/scicomp/#/disk/cache-family Cache]<br />
*** [https://halldweb.jlab.org/disk_management/cache_oldest.html oldest files under /cache/halld]<br />
** [https://scicomp.jlab.org/scicomp/#/disk/volatile Volatile]<br />
*** [https://halldweb.jlab.org/disk_management/volatile_oldest.html oldest files under /volatile/halld]<br />
** [https://scicomp.jlab.org/scicomp/#/disk/work Work]<br />
*** [https://halldweb.jlab.org/disk_management/work_report.html /work/halld leader board]<br />
* [[GlueX-related shared accounts on the JLab CUE]]<br />
* [[Hall D MySQL/MariaDB Servers and Replication Relationship]]: a list of database servers<br />
<br />
===Grid===<br />
<br />
* [http://www.opensciencegrid.org/ Open Science Grid]<br />
** [https://mailman.jlab.org/pipermail/halld-offline/2017-June/002827.html Using the Grid with Containers]<br />
** [[Using the Grid]]<br />
** [[Updating Oasis for the GlueX VO]]<br />
** [https://halldweb.jlab.org/talks/2011-2Q/gridmake-6-2011.ppt Gridmake]<br />
<br />
===Off-Site Computing Resources===<br />
<br />
* [[Off-site Computing Resources]]: Status of applications, current and past allocations<br />
<br />
==Software Management==<br />
<br />
===Source Code Management===<br />
<br />
* [https://git-scm.com/ Git]<br />
** [https://github.com/orgs/JeffersonLab/teams/gluex/repositories GlueX Git Repositories]<br />
** [[Instructions for Working with GlueX Git Repositories]]. Workflow description.<br />
** [https://mailman.jlab.org/pipermail/halld-offline/2015-July/002086.html Instructions for joining the GlueX team on GitHub]<br />
** [[Git Help Resources]], July 14<br />
** [[Guide to Using Git]]. General usage tips.<br />
** [[GlueX_Offline_FAQ#Git|Git questions and answers in the Offline FAQ]]<br />
** [[Splitting sim-recon]]<br />
** [[Conversion from Subversion to Git]] (Summer 2015)<br />
* [https://subversion.apache.org/ Subversion]<br />
** [http://halldweb.jlab.org/websvn/prod/ Browse the Hall D Subversion Repository]<br />
*** [http://clasweb.jlab.org/websvn/prod/ Legacy version]<br />
<br />
===Testing and Debugging===<br />
<br />
* [[Automatic Builds of GlueX Software]]: nightly build, pull-request-initiated test build, scan-build<br />
* [[Automatic Tests of GlueX Software]]<br />
* [[Diagnosing segmentation faults in reconstruction software]]<br />
<br />
===Governance, Conventions, Standard Practices===<br />
<br />
* [https://github.com/JeffersonLab/sim-recon/issues Offline Issue Tracking] using GitHub Issue Tracking<br />
* [https://halldweb.jlab.org/mantisbt Offline Issue Tracking] using the Mantis Bug Tracker<br />
* [[Coding Conventions|GlueX Coding Standards]]<br />
* [[Version Management]]<br />
* [[GlueX Cron Jobs]]<br />
* [[Data Challenges]]: Large-scale tests of computing infrastructure<br />
* [https://halldweb.jlab.org/doc-public/DocDB/ShowDocument?docid=2808 Data Management Plan for Hall D]: Long-term preservation of data and data-analysis tools<br />
* [https://data.jlab.org/drupal/?q=system/files/Offline%20Plan%20FY10.pdf JLab Offline Computing Plan, FY2010]<br />
* [[Role of the Software Coordinator]]<br />
<br />
=== Containers ===<br />
<br />
* [[GlueX and Containers]]: an overview of the use of containers in GlueX<br />
<br />
==Meetings and Reviews==<br />
<br />
* [[GlueX Offline Software Meetings]]: Agendas and minutes<br />
* [[HDGeant4 Meetings]]<br />
* [[GlueX Containers Meetings]]<br />
* [[Software and Computing Reviews]]<br />
* [[Geometries for 2008 DC Review]]<br />
<br />
==Communication and Help==<br />
<br />
* Offline Software Email List: announcements and discussion<br />
** [mailto:halld-offline@jlab.org Send email to the list]<br />
** [https://mailman.jlab.org/pipermail/halld-offline/ Message archive]<br />
** [https://www.google.com/cse/publicurl?cx=001547825138043056762:dydinmymrvu Google search of archive]<br />
** [https://mailman.jlab.org/mailman/listinfo/halld-offline Information page] (subscribe, unsubscribe, list members, etc.)<br />
* [https://groups.google.com/forum/#!forum/gluex-software Google Group for GlueX Software]<br />
**[[GlueX-Related Google Groups|Help on GlueX-related Google Groups]] <br />
** [https://mailman.jlab.org/pipermail/halld-offline/2016-April/002297.html Original announcement]<br />
* [https://slack.com/ Slack]<br />
** Chat application using the workspace '''jlab12gev'''<br />
** Click [https://jlab12gev.slack.com/signup here] to join. You must use your JLab email address.<br />
* [[Email Lists#Simple Email Lists|Simple Email Lists]]: get notifications from automated tests<br />
* [[GlueX_Communications|Communication Instructions]]: Guides to communication systems used by GlueX<br />
<br />
===FAQ===<br />
<br />
* [[GlueX Offline FAQ|The GlueX Offline Frequently Asked Questions List]]<br />
<br />
==Legacy Links==<br />
<br />
* [[Offline Computing Project Management]]<br />
* [[Versioning of Calibration Constants]]<br />
* [[Releases of GlueX Software]]<br />
* [[Reconstruction Tasks and Topics for Further Development]]<br />
* [[Running jobs on the grid]]<br />
* [[Calibration Constants, Tagged Versions]]<br />
* [[Action Items Archive (May 4, 2010 and before)]]<br />
* [http://www.jlab.org/Hall-D/offline/Software_tasks.php Software Task List]<br />
* [[Tracking resolution estimator (REZEST)]]<br />
* [[ded]]</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=GlueX_Software_Meeting,_January_29,_2024&diff=124507GlueX Software Meeting, January 29, 20242024-01-31T22:51:34Z<p>Aaustreg: /* Action Items */</p>
<hr />
<div>GlueX Software Meeting<br><br />
Monday, January 29, 2024<br><br />
11:00 am EDT<br><br />
F326/327<br><br />
<br />
<div class="mw-collapsible mw-collapsed"><br />
Zoom Meeting ID: 160 636 9159 Passcode: 888788 [https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09 Join]<br />
<div class="mw-collapsible-content"><br />
Mark Ito is inviting you to a scheduled ZoomGov meeting.<br />
<br />
Topic: GlueX Software<br />
Time: This is a recurring meeting Meet anytime<br />
<br />
Join ZoomGov Meeting<br />
https://jlab-org.zoomgov.com/j/1606369159?pwd=SlBrdStCQzllano1SmVQazMwaFExdz09<br />
<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
One tap mobile<br />
+16692545252,,1618692159# US (San Jose)<br />
+16468287666,,1618692159# US (New York)<br />
<br />
Dial by your location<br />
+1 669 254 5252 US (San Jose)<br />
+1 646 828 7666 US (New York)<br />
+1 669 216 1590 US (San Jose)<br />
+1 551 285 1373 US<br />
833 568 8864 US Toll-free<br />
Meeting ID: 160 636 9159<br />
Find your local number: https://jlab-org.zoomgov.com/u/acAwo1X4w9<br />
<br />
Join by SIP<br />
1618692159@sip.zoomgov.com<br />
<br />
Join by H.323<br />
161.199.138.10 (US West)<br />
161.199.136.10 (US East)<br />
Meeting ID: 160 636 9159<br />
Passcode: 888788<br />
<br />
</div><br />
</div><br />
<br />
==Agenda==<br />
<br />
# Announcements<br />
#* release [https://halldweb.jlab.org/halld_versions/version_5.14.2.xml version_5.14.2.xml]: Successfully deployed on CentOS7 and AlmaLinux9<br />
#** [https://halldweb.jlab.org/halld_versions/version_5.14.0.xml version_5.14.0.xml]: Diracxx did not choose the right compiler on CentOS7<br />
#** [https://halldweb.jlab.org/halld_versions/version_5.14.1.xml version_5.14.1.xml]: CCDB 1.07.00 is very slow, reverted to 1.06.10<br />
#* [[ Software and Computing Review 7 ]]: Feb 1-2, 2024<br />
# Review of [https://halldweb.jlab.org/wiki/index.php/GlueX_Software_Meeting,_December_18,_2023 minutes and action items]<br />
# Container updates<br />
#* [https://hub.docker.com/repository/docker/jeffersonlab/gluex_almalinux_9 gluex_almalinux_9 docker container]<br />
#** both AlmaLinux9 and default CentOS7 containers are linked with gxshell<br />
#* [https://hub.docker.com/repository/docker/rjones30/gluextest rjones30-gluextest (almalinux_9) docker container]<br />
# Discussion of software upgrade projects:<br />
#* JANA2 (Nathan)<br />
#* RCDB/CCDB (Dmitry)<br />
#* Geant4 (Richard):<br />
#** Link to Richard's [https://docs.google.com/document/d/1qZR4IdhVHzCUqDi6Hvi45raQzJd_xLAUvY1zKEmEBkg/edit logbook] for the Alma9 port<br />
#* ROOT<br />
#* RHEL8/Alma9 (Sean)<br />
#* Remove python2 dependency<br />
# Review of recent issues and pull requests:<br />
## halld_recon: [https://github.com/JeffersonLab/halld_recon/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_recon/pulls?q=is%3Apr PRs]<br />
## halld_sim: [https://github.com/JeffersonLab/halld_sim/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/halld_sim/pulls?q=is%3Apr PRs]<br />
## hdgeant4: [https://github.com/JeffersonLab/HDGeant4/issues Issues], [https://github.com/JeffersonLab/HDGeant4/pulls PRs]<br />
## MCwrapper: [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr PRs]<br />
## gluex_root_analysis: [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues], [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr PRs]<br />
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)<br />
<br />
== Questions ==<br />
<br />
* ifarm monitoring:<br />
** will be much improved with Alma9 roll out<br />
* GPU monitoring (Justin): Jupyter notebooks often block GPUs<br />
* Apps through oasis on CVMFS, or JLab's own server? (Richard)<br />
** /cvmfs/oasis.opensciencegrid.org/jlab/scicomp/sw/el9/modulefiles/root<br />
* Tokens for xrootd? (Richard)<br />
<br />
== Action Items ==<br />
# Documentation<br />
#* Add prominent links to singularity containers on work and CVMFS: [https://halldweb.jlab.org/wiki/index.php/HOWTO_use_the_GlueX_Singularity_Container#Get_the_Container check]<br />
# Software Upgrades<br />
#* halld_recon:<br />
#** $HALLD_RECON_HOME/src/BMS is deprecated, remove from the repo?<br />
#** [https://github.com/JeffersonLab/halld_recon/issues/613 Issue #613]: ReactionFilter crashes in OS8/9<br />
#* JANA2 (Nathan): <br />
#** implement JANA2 in build_scripts, provide version.xml for general testing<br />
#** N. will focus on the transition now<br />
#** Use default CentOS7 container<br />
#* CCDB 2.0 (Dmitry):<br />
#** Check alma9 container<br />
#** Implement version check in v1, test with v2<br />
#** Need to test CCDB DB version update - need instructions / command from Dmitry (Sean)<br />
#* Geant4<br />
#** Use newest version that was approved by Richard<br />
#** Upgrade the Alma9 build first, then try to build on Centos7<br />
#* ROOT<br />
#** Upgrade the Alma9 build first, then try to build on Centos7</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=HOWTO_use_the_GlueX_Singularity_Container&diff=124505HOWTO use the GlueX Singularity Container2024-01-31T20:23:07Z<p>Aaustreg: /* Get the Container */</p>
<hr />
<div>== Install Singularity ==<br />
<br />
See the instructions on the [https://apptainer.org/ Apptainer/Singularity site].<br />
<br />
Alternatively, Red Hat Enterprise Linux 7 has an RPM:<br />
<br />
yum install singularity<br />
<br />
Or on Ubuntu 16.04 and earlier:<br />
<br />
go [https://singularity.lbl.gov/install-linux here] and follow the instructions<br />
<br />
Or Ubuntu 16.10 and later:<br />
<br />
sudo apt-get install singularity-container<br />
<br />
The last one does not work on Pop!_OS, which is based on Ubuntu 22.04. Here is what worked for me [J.R.]:<br />
There seems to be an issue with at least some Ubuntu versions: if singularity is not yet installed,<br />
singularity --version<br />
will respond that it can be installed with 'sudo apt install singularity'. DON'T DO THAT! It will install a game called 'Endgame: Singularity'.<br />
Instead do the following:<br />
sudo apt update<br />
wget https://github.com/sylabs/singularity/releases/download/v3.11.0/singularity-ce_3.11.0-jammy_amd64.deb<br />
sudo apt install ./singularity-ce_3.11.0-jammy_amd64.deb <br />
I pieced this together from the Sylabs [https://github.com/sylabs/singularity/releases/tag/v3.11.0 GitHub page] and the instructions [https://www.linuxwave.info/2022/02/installing-singularity-in-ubuntu-2004.html here].<br />
<br />
== Get the Container==<br />
<br />
Download [https://halldweb.jlab.org/dist/gluex_centos-7.9.2009_gxi2.34.sif gluex_centos-7.9.2009_gxi2.34.sif], the default container for CentOS7.<br />
<br />
At JLab there is no need to download. The file is on the group disk:<br />
<br />
/group/halld/www/halldweb/html/dist/gluex_centos-7.9.2009_gxi2.34.sif<br />
<br />
Alternatively, the container can also be accessed via CVMFS ([[HOWTO Install and Use the CVMFS Client]]):<br />
<br />
/cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest<br />
<br />
<br />
JLab currently (Jan 2024) hosts a container built with AlmaLinux9, which is not yet available via CVMFS: [https://halldweb.jlab.org/dist/gluex_almalinux9_gxi2.34.sif gluex_almalinux9_gxi2.34.sif]<br />
<br />
/group/halld/www/halldweb/html/dist/gluex_almalinux9_gxi2.34.sif<br />
<br />
== Get the Software and Support Files ==<br />
<br />
Use one of the following four methods.<br />
<br />
=== 1. tarball ===<br />
<br />
This method is not supported at present (December 13, 2018). If you would like to see it revived, contact the Software Working group.<br />
<br />
# Download the tarball: [https://halldweb.jlab.org/dist/group_halld.tar.gz group_halld.tar.gz]. It's 18 GB.<br />
# cd <directory that will contain "group"><br />
# tar zxvf <directory containing tarball>/group_halld.tar.gz<br />
<br />
=== 2. rsync with direct ssh ===<br />
<br />
rsync -ruvt --delete --links scosg16.jlab.org:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
<br />
=== 3. rsync through ssh tunnel ===<br />
<br />
<ol><br />
<li> Establish the tunnel<br />
<pre><br />
ssh -t -L9001:localhost:9001 login.jlab.org ssh -t -L9001:localhost:22 scosg16<br />
</pre><br />
<br />
<li> In a separate shell instance, do the rsync<br />
<pre><br />
rsync -ruvt --delete --links -e 'ssh -p9001' localhost:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
</pre><br />
</ol><br />
<br />
=== 4. Run CVMFS ===<br />
<br />
There are two options here:<br />
<br />
* [[HOWTO use prebuilt GlueX software from any linux user account using cvmfsexec|Run CVMFS in user space (cvmfsexec)]]<br />
* [[HOWTO Install and Use the CVMFS Client|Run CVMFS as root]]<br />
<br />
Depending on choices made during installation, the directory that contains "group" will be something like<br />
<br />
/path/to/cvmfs/oasis.opensciencegrid.org/gluex<br />
<br />
== Get a Shell Inside the Container ==<br />
<br />
singularity shell --cleanenv --bind <directory that contains "group">/group/halld:/group/halld <directory with container>/gluex_centos-7.7.1908_sng3.8_gxi2.20.sif<br />
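As a concrete illustration, here is a small bash sketch that fills in the two placeholders above with hypothetical local paths (neither is an official location; adjust them to your own layout) and prints the resulting command:<br />

```shell
#!/bin/bash
# Hypothetical example paths -- adjust to where you placed the "group"
# directory and the downloaded container image.
GROUP_PARENT=/home/user/gluex
CONTAINER_DIR=/home/user/containers

# Compose the 'singularity shell' invocation from the template above.
CMD="singularity shell --cleanenv --bind ${GROUP_PARENT}/group/halld:/group/halld ${CONTAINER_DIR}/gluex_centos-7.9.2009_gxi2.34.sif"
echo "$CMD"
```

Run the printed command directly, or execute it in place with eval "$CMD".<br />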
<br />
== Set-up the Developer Toolset package (optional) ==<br />
<br />
The standard GCC version is 4.8.5. Perform the following step if you would like to use the GCC 8.3.1 compiler.<br />
<br />
scl enable devtoolset-8 bash<br />
<br />
or<br />
<br />
scl enable devtoolset-8 tcsh<br />
<br />
== Set-Up the GlueX Environment ==<br />
<br />
For bash:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.sh<br />
<br />
or for tcsh:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.csh<br />
<br />
== gxshell ==<br />
We provide a tool that makes it easier to set up and start a shell session. It requires Singularity and CVMFS to be installed; for instructions, consult [[HOWTO Install and Use the CVMFS Client]].<br />
<br />
Once the software is installed, create an alias like<br />
alias gxshell singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell<br />
where you bind all directories that you want to take "into" the container. Binding /cvmfs/oasis.opensciencegrid.org/gluex/group/halld to /group/halld/ ensures that the prebuilt software will be available in the container.<br />
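Note that the alias line above is csh/tcsh syntax. Under bash, the same alias needs an equals sign and quoting; a sketch with the identical bind paths:<br />

```shell
#!/bin/bash
# bash form of the tcsh alias above (same bind paths).
alias gxshell='singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell'

# Print the recorded definition (does not start the container).
alias gxshell
```

Put the alias line in ~/.bashrc so it is available in every interactive shell.<br />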
<br />
If everything is set up correctly all you have to do is type <br />
gxshell<br />
to start a bash shell within the GlueX container with the software environment set up.</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=HOWTO_use_the_GlueX_Singularity_Container&diff=124504HOWTO use the GlueX Singularity Container2024-01-31T20:21:33Z<p>Aaustreg: /* Get the Container */</p>
<hr />
<div>== Install Singularity ==<br />
<br />
See the instructions on the [https://apptainer.org/ Apptainer/Singularity site].<br />
<br />
Alternatively, Red Hat Enterprise Linux 7 has an RPM:<br />
<br />
yum install singularity<br />
<br />
Or on Ubuntu 16.04 and earlier:<br />
<br />
go [https://singularity.lbl.gov/install-linux here] and follow the instructions<br />
<br />
Or Ubuntu 16.10 and later:<br />
<br />
sudo apt-get install singularity-container<br />
<br />
The last one does not work on Pop!_OS, which is based on Ubuntu 22.04. Here is what worked for me [J.R.]:<br />
There seems to be an issue with at least some Ubuntu versions: if singularity is not yet installed,<br />
singularity --version<br />
will respond that it can be installed with 'sudo apt install singularity'. DON'T DO THAT! It will install a game called 'Endgame: Singularity'.<br />
Instead do the following:<br />
sudo apt update<br />
wget https://github.com/sylabs/singularity/releases/download/v3.11.0/singularity-ce_3.11.0-jammy_amd64.deb<br />
sudo apt install ./singularity-ce_3.11.0-jammy_amd64.deb <br />
I pieced this together from the Sylabs [https://github.com/sylabs/singularity/releases/tag/v3.11.0 GitHub page] and the instructions [https://www.linuxwave.info/2022/02/installing-singularity-in-ubuntu-2004.html here].<br />
<br />
== Get the Container==<br />
<br />
Download [https://halldweb.jlab.org/dist/gluex_centos-7.9.2009_gxi2.34.sif gluex_centos-7.9.2009_gxi2.34.sif], the default container for CentOS7.<br />
<br />
At JLab there is no need to download. The file is on the group disk:<br />
<br />
/group/halld/www/halldweb/html/dist/gluex_centos-7.9.2009_gxi2.34.sif<br />
<br />
Alternatively, the container can also be accessed via CVMFS ([[HOWTO Install and Use the CVMFS Client]]):<br />
<br />
/cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest<br />
<br />
<br />
We currently (Jan 2024) provide a container built with AlmaLinux9 at JLab: [https://halldweb.jlab.org/dist/gluex_almalinux9_gxi2.34.sif gluex_almalinux9_gxi2.34.sif]<br />
<br />
/group/halld/www/halldweb/html/dist/gluex_almalinux9_gxi2.34.sif<br />
<br />
== Get the Software and Support Files ==<br />
<br />
Use one of the following four methods.<br />
<br />
=== 1. tarball ===<br />
<br />
This method is not supported at present (December 13, 2018). If you would like to see it revived, contact the Software Working group.<br />
<br />
# Download the tarball: [https://halldweb.jlab.org/dist/group_halld.tar.gz group_halld.tar.gz]. It's 18 GB.<br />
# cd <directory that will contain "group"><br />
# tar zxvf <directory containing tarball>/group_halld.tar.gz<br />
<br />
=== 2. rsync with direct ssh ===<br />
<br />
rsync -ruvt --delete --links scosg16.jlab.org:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
<br />
=== 3. rsync through ssh tunnel ===<br />
<br />
<ol><br />
<li> Establish the tunnel<br />
<pre><br />
ssh -t -L9001:localhost:9001 login.jlab.org ssh -t -L9001:localhost:22 scosg16<br />
</pre><br />
<br />
<li> In a separate shell instance, do the rsync<br />
<pre><br />
rsync -ruvt --delete --links -e 'ssh -p9001' localhost:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
</pre><br />
</ol><br />
<br />
=== 4. Run CVMFS ===<br />
<br />
There are two options here:<br />
<br />
* [[HOWTO use prebuilt GlueX software from any linux user account using cvmfsexec|Run CVMFS in user space (cvmfsexec)]]<br />
* [[HOWTO Install and Use the CVMFS Client|Run CVMFS as root]]<br />
<br />
Depending on choices made during installation, the directory that contains "group" will be something like<br />
<br />
/path/to/cvmfs/oasis.opensciencegrid.org/gluex<br />
<br />
== Get a Shell Inside the Container ==<br />
<br />
singularity shell --cleanenv --bind <directory that contains "group">/group/halld:/group/halld <directory with container>/gluex_centos-7.7.1908_sng3.8_gxi2.20.sif<br />
<br />
== Set-up the Developer Toolset package (optional) ==<br />
<br />
The standard GCC version is 4.8.5. Perform the following step if you would like to use the GCC 8.3.1 compiler.<br />
<br />
scl enable devtoolset-8 bash<br />
<br />
or<br />
<br />
scl enable devtoolset-8 tcsh<br />
<br />
== Set-Up the GlueX Environment ==<br />
<br />
For bash:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.sh<br />
<br />
or for tcsh:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.csh<br />
<br />
== gxshell ==<br />
We provide a tool that makes it easier to set up and start a shell session. It requires Singularity and CVMFS to be installed; for instructions, consult [[HOWTO Install and Use the CVMFS Client]].<br />
<br />
Once the software is installed, create an alias like<br />
alias gxshell singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell<br />
where you bind all directories that you want to take "into" the container. Binding /cvmfs/oasis.opensciencegrid.org/gluex/group/halld to /group/halld/ ensures that the prebuilt software will be available in the container.<br />
<br />
If everything is set up correctly all you have to do is type <br />
gxshell<br />
to start a bash shell within the GlueX container with the software environment set up.</div>Aaustreghttps://halldweb.jlab.org/wiki/index.php?title=HOWTO_use_the_GlueX_Singularity_Container&diff=124503HOWTO use the GlueX Singularity Container2024-01-31T20:18:07Z<p>Aaustreg: /* Get the Container */</p>
<hr />
<div>== Install Singularity ==<br />
<br />
See the instructions on the [https://apptainer.org/ Apptainer/Singularity site].<br />
<br />
Alternatively, Red Hat Enterprise Linux 7 has an RPM:<br />
<br />
yum install singularity<br />
<br />
Or on Ubuntu 16.04 and earlier:<br />
<br />
go [https://singularity.lbl.gov/install-linux here] and follow the instructions<br />
<br />
Or Ubuntu 16.10 and later:<br />
<br />
sudo apt-get install singularity-container<br />
<br />
The last one does not work on Pop!_OS, which is based on Ubuntu 22.04. Here is what worked for me [J.R.]:<br />
There seems to be an issue with at least some Ubuntu versions: if singularity is not yet installed,<br />
singularity --version<br />
will respond that it can be installed with 'sudo apt install singularity'. DON'T DO THAT! It will install a game called 'Endgame: Singularity'.<br />
Instead do the following:<br />
sudo apt update<br />
wget https://github.com/sylabs/singularity/releases/download/v3.11.0/singularity-ce_3.11.0-jammy_amd64.deb<br />
sudo apt install ./singularity-ce_3.11.0-jammy_amd64.deb <br />
I pieced this together from the Sylabs [https://github.com/sylabs/singularity/releases/tag/v3.11.0 GitHub page] and the instructions [https://www.linuxwave.info/2022/02/installing-singularity-in-ubuntu-2004.html here].<br />
<br />
== Get the Container==<br />
<br />
Download [https://halldweb.jlab.org/dist/gluex_centos-7.9.2009_gxi2.34.sif gluex_centos-7.9.2009_gxi2.34.sif], the default container for CentOS7.<br />
<br />
At JLab there is no need to download. The file is on the group disk:<br />
<br />
/group/halld/www/halldweb/html/dist/gluex_centos-7.9.2009_gxi2.34.sif<br />
<br />
<br />
Alternatively, the container can also be accessed via CVMFS ([[HOWTO Install and Use the CVMFS Client]]):<br />
<br />
/cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest<br />
<br />
== Get the Software and Support Files ==<br />
<br />
Use one of the following four methods.<br />
<br />
=== 1. tarball ===<br />
<br />
This method is not supported at present (December 13, 2018). If you would like to see it revived, contact the Software Working group.<br />
<br />
# Download the tarball: [https://halldweb.jlab.org/dist/group_halld.tar.gz group_halld.tar.gz]. It's 18 GB.<br />
# cd <directory that will contain "group"><br />
# tar zxvf <directory containing tarball>/group_halld.tar.gz<br />
<br />
=== 2. rsync with direct ssh ===<br />
<br />
rsync -ruvt --delete --links scosg16.jlab.org:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
<br />
=== 3. rsync through ssh tunnel ===<br />
<br />
<ol><br />
<li> Establish the tunnel<br />
<pre><br />
ssh -t -L9001:localhost:9001 login.jlab.org ssh -t -L9001:localhost:22 scosg16<br />
</pre><br />
<br />
<li> In a separate shell instance, do the rsync<br />
<pre><br />
rsync -ruvt --delete --links -e 'ssh -p9001' localhost:/cvmfs/oasis.opensciencegrid.org/gluex/group/ <directory that contains "group">/group/<br />
</pre><br />
</ol><br />
<br />
=== 4. Run CVMFS ===<br />
<br />
There are two options here:<br />
<br />
* [[HOWTO use prebuilt GlueX software from any linux user account using cvmfsexec|Run CVMFS in user space (cvmfsexec)]]<br />
* [[HOWTO Install and Use the CVMFS Client|Run CVMFS as root]]<br />
<br />
Depending on choices made during installation, the directory that contains "group" will be something like<br />
<br />
/path/to/cvmfs/oasis.opensciencegrid.org/gluex<br />
<br />
== Get a Shell Inside the Container ==<br />
<br />
singularity shell --cleanenv --bind <directory that contains "group">/group/halld:/group/halld <directory with container>/gluex_centos-7.7.1908_sng3.8_gxi2.20.sif<br />
<br />
== Set-up the Developer Toolset package (optional) ==<br />
<br />
The standard GCC version is 4.8.5. Perform the following step if you would like to use the GCC 8.3.1 compiler.<br />
<br />
scl enable devtoolset-8 bash<br />
<br />
or<br />
<br />
scl enable devtoolset-8 tcsh<br />
<br />
== Set-Up the GlueX Environment ==<br />
<br />
For bash:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.sh<br />
<br />
or for tcsh:<br />
<br />
source /group/halld/Software/build_scripts/gluex_env_jlab.csh<br />
<br />
== gxshell ==<br />
We provide a tool that makes it easier to set up and start a shell session. It requires Singularity and CVMFS to be installed; for instructions, consult [[HOWTO Install and Use the CVMFS Client]].<br />
<br />
Once the software is installed, create an alias like<br />
alias gxshell singularity exec --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/scratch,/home /cvmfs/singularity.opensciencegrid.org/jeffersonlab/gluex_devel:latest gxshell<br />
where you bind all directories that you want to take "into" the container. Binding /cvmfs/oasis.opensciencegrid.org/gluex/group/halld to /group/halld/ ensures that the prebuilt software will be available in the container.<br />
<br />
If everything is set up correctly all you have to do is type <br />
gxshell<br />
to start a bash shell within the GlueX container with the software environment set up.</div>Aaustreg