GlueX Offline Software Meeting
Wednesday, January 7, 2015
1:30 pm EST
JLab: CEBAF Center F326/327
Agenda
- Review of minutes from December 10 (all)
- DAQ and TTab plugins converted into libraries
- HDDM versions and backward compatibility
- Commissioning Run Review: What tasks are important going forward?
- Offline Monitoring Report (Kei)
- Commissioning branch-to-trunk migration
- REST data
- Data Management (and skims)
- Data Challenge 3
- Software Review - February 2015
- Action Item Review
Communication Information
- The BlueJeans meeting number is 968 592 007.
- Join the Meeting via BlueJeans
Talks can be deposited in the directory
/group/halld/www/halldweb/html/talks/2015 on the JLab CUE. This directory is accessible from the web at https://halldweb.jlab.org/talks/2015/.
Minutes
Present:
- FIU: Mahmoud Kamel
- FSU: Aristeidis Tsaris
- JLab: Mark Ito (chair), David Lawrence, Paul Mattione, Kei Moriya, Nathan Sparks, Simon Taylor
- NU: Sean Dobbs
Review of Minutes from December 10
We looked over the minutes of the last meeting.
- Kei is using the gxproj1 account for offline monitoring jobs and Mark is using gxproj2 for Data Challenge 3 (DC3).
- Mark was able to get the stand-alone version of Dmitry's Run Conditions Database (RCDB) running. He will circulate instructions on how to do that. He will also approach the Computer Center on installing it on halldweb1.
- Paul mentioned that he could not find documentation on the version management system that Mark presented last time; it turns out that the documentation does not exist yet. Paul suggested that, once written, it be featured in the "getting started" section of the wiki.
- Mark presented a new version of the main "Offline Software" wiki page. It is still under development.
DAQ and TTab Plugins Converted into Libraries
David reviewed the email he sent earlier today. Since these plugins have been converted into libraries that are linked in directly, it is no longer necessary to specify them as JANA command-line options.
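As an illustration, the change on the command line might look like the following sketch. The option syntax follows JANA's usual -PPLUGINS configuration parameter; the input file name is hypothetical.

```shell
# Before: the DAQ and TTab plugins had to be requested explicitly
hd_root -PPLUGINS=DAQ,TTab hd_rawdata_001234.evio

# After: the same functionality is linked in as libraries,
# so no plugin option is needed
hd_root hd_rawdata_001234.evio
```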
HDDM Versions and Backward Compatibility
We reviewed principles that Richard proposed for future changes to the REST format. The issue is preservation of backward compatibility, i.e., being able to analyze old REST data with new code. The proposed scheme preserves compatibility, at the cost of more complicated element names: an element that replaces an older one must take a new name. We did not decide whether this approach should be enshrined in policy, but will discuss it further in the future.
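As a purely hypothetical sketch of the naming cost (the element and attribute names below are invented for illustration, not taken from Richard's proposal), an incompatible change to an element would be introduced under a new element name, so that code written against the old format still finds the element it expects in old files:

```xml
<!-- hypothetical original REST element -->
<trackFit chisq="12.3" Ndof="10"/>

<!-- an incompatible revision appears under a new name, e.g. trackFit2;
     readers of old files keep looking for trackFit, while new code
     handles both -->
<trackFit2 chisq="12.3" Ndof="10" CLvalue="0.27"/>
```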
Commissioning Run Review
We reviewed tasks and progress from the recent run.
Offline Monitoring Report
Kei presented issues from the offline monitoring reconstruction jobs during the run. The slides:
- Offline Monitoring Summary
- Disk Usage For Each Week
- 2-track Skim Output
- Number of Files Processed
- EVIO Statistics
- Errors found:
  - DAQ plugin
  - Too Many FDC Hits
  - Insufficient Buffer Space
  - Bad Alloc
  - Mismatch in Trigger Bank
  - Unknown Module Type
  - F1TDC Block Header
- Looking Ahead
Commissioning Branch-to-Trunk Migration
Most of the development that went on during the run was checked into the commissioning branch. A lot (but not all) of this code now has to be moved to the trunk, in particular those changes that improve the reconstruction in general. Simon has been managing the commissioning branch and will look into doing this transfer.
REST Data
There was interest during the run in REST-formatted data for high-level analysis. Kei has already produced these files (see his talk above). They can be found in /volatile/halld/RunPeriod-2014-10/offline_monitoring/danarest/2014-12-19/REST.
A lot of recent analysis has been based on raw data with corrections done on raw or reconstructed quantities to get better results. At present these corrections are not reflected in the REST data. In order to make future production of REST data more useful we need to move corrections and calibrations into the standard reconstruction.
Sean will coordinate production/update of calibration constants and capture of correction algorithms through the Calibration Working Group. The goal is to make a push for some substantial progress and then to re-make the REST data set.
Paul brought up a discussion from the email list on "data reconstruction trains". The idea is to have a single set of jobs do reconstruction on raw data, with several "cars" attached to the jobs for specialized purposes. Each individual project would then avoid having to fetch the data from tape and pay the CPU price of reconstruction. However, such an effort requires significant coordination and management. We did not move to start designing a system right away since (a) the need is only prospective at this point and (b) much of the savings for high-level analysis can be achieved with useful REST-formatted data.
Storage of Data Taken between Runs
David pointed out that there will be cosmic-ray and test running in the coming months before the spring run, and we need to decide where in the tape directory hierarchy these data should be stored. The consensus was to simply create a new run-period directory in parallel with "RunPeriod-2014-10" (used to store data from the just-ended commissioning run), something like "RunPeriod-2015-01".