GlueX Software Meeting, August 31, 2021

GlueX Software Meeting
Tuesday, August 31, 2021
3:00 pm EDT
BlueJeans: 968 592 007

== Agenda ==
 
# Announcements
## [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008616.html New version set (4.45.0) with new versions of Diracxx (2.0.0) and HDGeant4 (2.28.0)] (Mark I.)
##* [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008625.html default version set reverted: 4.45.0 -> 4.44.0] (Mark I.)
## [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008613.html /work/halld is back, /work/halld3 did not move] (Mark I.)
## [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008620.html New required packages: python3-devel and boost-python36-devel] (Mark I.)
## New build: complete GlueX software stack, GCC 5.3.0 via module, RHEL Workstation release 7.6 (gluons), as requested by A. Somov (Mark I.)
## [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008624.html /work/halld3: transition to new server on Thursday morning] (Mark I.)
# Review of [[GlueX Software Meeting, August 17, 2021#Minutes|Minutes from the Last Software Meeting]] (all)
# Review of [[HDGeant4 Meeting, August 24, 2021#Minutes|Minutes from the Last HDGeant4 Meeting]] (all)
# FAQ of the Fortnight: [[GlueX_Offline_FAQ#What_is_the_scratch_disk.3F|What is the scratch disk?]]
# Update: [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=5237 getting started with gluupy] (Jon)
# Review of recent issues and pull requests:
## halld_recon
## halld_sim
## CCDB
## RCDB
## MCwrapper
##* [https://github.com/JeffersonLab/gluex_MCwrapper/issues?q=is%3Aopen+is%3Aissue Issues]
##* [https://github.com/JeffersonLab/gluex_MCwrapper/pulls?q=is%3Apr pull requests]
## gluex_root_analysis
##* [https://github.com/JeffersonLab/gluex_root_analysis/issues?q=is%3Aopen+is%3Aissue Issues]
##* [https://github.com/JeffersonLab/gluex_root_analysis/pulls?q=is%3Apr pull requests]
# Review of [https://groups.google.com/forum/#!forum/gluex-software recent discussion on the GlueX Software Help List] (all)
# Meeting time change? (all)
# Action Item Review (all)

== Minutes ==

Present: Alex Austregesilo, Edmundo Barriga, Nathan Brei, Sergey Furletov, Nathaniel D. Hoffman, Mark Ito (chair), Igal Jaegle, Naomi Jarvis, David Lawrence, Simon Taylor, Jon Zarling

There is a [https://bluejeans.com/s/14ogCmKG9Dq/ recording of this meeting]. Log into the [https://jlab.bluejeans.com BlueJeans site] first to gain access (use your JLab credentials).

=== Announcements ===

# [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008616.html New version set (4.45.0) with new versions of Diracxx (2.0.0) and HDGeant4 (2.28.0)] and [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008625.html default version set reverted: 4.45.0 -> 4.44.0]. The new release from last week, which used a new cmake-enabled version of Diracxx, had to be pulled back due to a non-functioning hdgeant4 binary. See [https://groups.google.com/g/gluex-software/c/p7MfPhJwglM this discussion] on the software help list.
# [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008613.html /work/halld is back, /work/halld3 did not move] and [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008624.html /work/halld3: transition to new server on Thursday morning]. We are about to be fully moved to a new work-disk server. The final step will take place on the morning of September 2.
# [https://mailman.jlab.org/pipermail/halld-offline/2021-August/008620.html New required packages: python3-devel and boost-python36-devel]. The new Diracxx brings these in.
# New build: complete GlueX software stack, GCC 5.3.0 via module, RHEL Workstation release 7.6 (gluons), as requested by A. Somov.

=== Review of Minutes from the Last Software Meeting ===

We went over the [[GlueX Software Meeting, August 17, 2021#Minutes|minutes from the meeting on August 17th]].

* On halld_recon issue #537, [https://github.com/JeffersonLab/halld_recon/issues/537 Problems with photon energies in MC samples], Sean Dobbs has fixed many random trigger files and will be releasing them into the wild soon. He also thinks that we should backport the software fixes related to this issue to previous recon launch versions and is preparing those branches.
* Mark reported that there is more work to be done on the GCC 8 access schemes before they are ready for general use.
* Alex called attention to the item on maintaining the online version of halld_recon. Mark was able to do complete builds (all packages) on the gluons using both GCC 4.8.5 and GCC 5.3.0. The system has not changed yet; there is more work to do, but we are keeping the current system in place for the start of the run.

=== Review of Minutes from the Last HDGeant4 Meeting ===

We went over the [[HDGeant4 Meeting, August 24, 2021#Minutes|minutes from the meeting on August 24th]]. Alex has closed [https://github.com/JeffersonLab/HDGeant4/issues/181 Issue #181]: G3/G4 Difference in FDC wire efficiency at the cell boundary. Thanks to Alex, Richard Jones, and Lubomir Pentchev for all the work that went into new functions for modeling FDC efficiency in mcsmear. If more work needs to be done on this, we will open an issue against halld_sim.

=== FAQ of the Fortnight: What is the scratch disk? ===

Mark reviewed the [[GlueX_Offline_FAQ#What_is_the_scratch_disk.3F|FAQ]]. David asked why we need both a volatile disk and a scratch disk. Mark pointed out that since volatile is on Lustre, it is suitable only for large data files. Also, volatile is only available from the farm, whereas scratch is (or can be) mounted nearly everywhere at JLab.
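
By way of illustration, here is a minimal Python sketch of the division of labor described above: do short-lived intermediate work on scratch and park only the large final output on volatile. The base paths below are hypothetical placeholders, not the actual mount points; consult the FAQ for those.

<pre>
import getpass
import os
import shutil
import tempfile

user = getpass.getuser()

# Hypothetical base paths for illustration; see the FAQ for the real mount points.
SCRATCH_BASE = f"/scratch/{user}"               # short-lived working space
VOLATILE_BASE = f"/volatile/halld/home/{user}"  # large data files, farm-visible

# Do intermediate work in a private directory on scratch.
workdir = tempfile.mkdtemp(prefix="myjob_", dir=SCRATCH_BASE)
result_file = os.path.join(workdir, "result.root")
with open(result_file, "wb") as f:
    f.write(b"stand-in for a large output file")

# Move only the final product to volatile, then clean up scratch.
os.makedirs(VOLATILE_BASE, exist_ok=True)
shutil.move(result_file, os.path.join(VOLATILE_BASE, "result.root"))
shutil.rmtree(workdir)  # neither area is backed up, so tidy up promptly
</pre>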

=== Update: getting started with gluupy ===

Jon described recent work making it easier to adapt [[GlueX_Software_Meeting,_August_17,_2021#Histogramming_Using_Uproot_and_Our_Flat_Tree_Output|gluupy]] to users' needs. He also clarified some requirements and behaviors. Please see [https://halldweb.jlab.org/doc-private/DocDB/ShowDocument?docid=5237 his slides] for the details.
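
For those who have not seen the workflow, the sketch below shows the underlying technique (this is not gluupy itself): opening a flat tree with uproot and histogramming one branch, as presented at the August 17 meeting. The file, tree, and branch names are invented for illustration.

<pre>
import uproot
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical file, tree, and branch names, for illustration only.
with uproot.open("flat_tree_example.root") as rfile:
    tree = rfile["ntuple"]
    # Read a single branch into a NumPy array.
    inv_mass = tree["invariant_mass"].array(library="np")

# Histogram the branch and draw it.
counts, edges = np.histogram(inv_mass, bins=100, range=(0.0, 2.0))
centers = 0.5 * (edges[:-1] + edges[1:])
plt.step(centers, counts, where="mid")
plt.xlabel("invariant mass [GeV]")
plt.ylabel("counts per bin")
plt.savefig("invariant_mass.png")
</pre>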

=== Crashes with minimal DSelector upon writing output trees, probably memory leak ===

This is gluex_root_analysis [https://github.com/JeffersonLab/gluex_root_analysis/issues/156 Issue #156]. Naomi led us through this "long-standing problem with running DSelector jobs" on the CMU cluster. Please see the issue itself for a complete description. She has provided information so that others can try to reproduce the problem.

[Added in press: Alex was able to duplicate the crash on the ifarm at JLab. It seems intermittent there as well.]

=== Meeting Time ===

Mark received no objections to moving the meeting time to 2 pm. Stay tuned.
