SoftwareReview October 11

From GlueXWiki

Software Review Planning Meeting Friday October 11, 2013, 14:30-15:30.

2013 Software Review Page



  • +1-866-740-1260 : US+Canada
  • +1-303-248-0285 : International
  • then enter participant code: 3421244# (remember the "#").
  • or connect online (enter the code without the "#")

Attendees: Curtis Meyer (CMU), Mark Ito (JLab), and David Lawrence (JLab).


Agenda

  1. Discussion of our current status relative to the charge.
  2. Discussion of needed documentation.
  3. Discussion of the afternoon session.
    • Afternoon Session
    • Breakout Session with details - 90 minutes total
    • This session feels hard to fill, but the key thing we need to cover is the end-user experience. That suggests we should spend a lot of time on our analysis framework and advanced tools, which have all been developed since the last review.
      • Big picture of how data comes off the detector into PWA?
        • (i) Event building
        • (ii) Level-3 Trigger/Pass through
        • (iii) Data transfer to silo (Raw Data Format)
        • (iv) Reconstruction: REST format files.
        • (v) Simulation: REST format.
        • (vi) Skimming: REST/Root Tree.
        • (vii) Event selection: Root Tree.
        • (viii) Physics/Amplitude analysis.
      • The software workshop in July: discuss activities, participation, and documentation.
      • User experiences – how to do this?
      • Data Challenge Plans?
      • Plans for calibration, with a timeline overlaid on the installation and commissioning timeline.
  4. Discussion of presenters.
  5. Morning Session
    • Overview of progress since last review, 30 minutes total
      • 1-2 Title + Outline
      • 3 Timeline over the last 18 months.
      • 4-18 Walk through major accomplishments. This includes:
        • (i) Defined REST data format.
        • (ii) Development of the analysis framework, initially to facilitate the PID upgrade work and ultimately to develop the proposal in June.
        • (iii) Job management exercises.
        • (iv) December 2012 data challenge, producing about 70% of the data we expect in 2016 at 10^7 running. What did we learn? ...
        • (v) Development of analysis tools and the Root Tree format for physics analysis.
        • (vi) Work on the BCAL simulation and reconstruction.
        • (vii) Implementation of CCDB into framework.
        • (viii) Defined EVIO Data format and tools to convert simulation to this form.
        • (ix) Software Workshop.
        • (x) Conversion to GEANT4.
        • (xi) Start of work on the Level-3 trigger and algorithms.
        • (xii) Online data challenge with the Level-3 trigger.
        • (xiii) Calibrations.
        • (xiv) Next Data Challenges.
      • 19 Summary/Conclusions


Minutes

  1. We discussed all the topics relevant to the review and came to the following conclusions about how to present the material to the reviewers.
    • The morning talk, given by Curtis, will present an overview of progress and will end with calibration and manpower summaries.
    • Afternoon I, given by Mark Ito, will provide a picture of how the data flows from the experiment to the analysis. It will also discuss the completed and planned data challenges.
    • Afternoon II, given by David Lawrence, will focus on roughly two of the major software updates since the last review.
    • Afternoon III, given by Justin Stevens, will focus on how we do physics analysis and describe the workshop; Justin will serve as the software user.
  2. We also realized that we need to write a longer software document: an expansion of the 2012 document that goes into some detail on everything that has happened since June 2012. We will start with the old document as a base, purge unneeded material, incorporate material from other interim reports, and then fill in the details.

Notes (from Mark)

Present: Mark Ito, David Lawrence, Curtis Meyer

  • main focus: structure of the break-out session in the afternoon
  • several versions of the structure were proposed and discussed, including the one in the agenda
  • we re-agreed on the value of having younger collaborators make presentations
  • we ended with something like
    1. big-picture/data challenge (Mark)
    2. reconstruction highlights (David)
      1. BCAL reconstruction improvements
      2. spiral track handling
    3. analysis highlights (Justin)
      1. Paul's analysis tools
      2. boosted decision trees
      3. Analysis Workshop
  • we agreed that Curtis's talk in the morning would likely touch on all these areas, just not at the same level of detail
  • Curtis proposed that we write a document including all topics relevant to the committee, and in particular include items which we do not present in talks because of the time constraint.
  • we agreed to meet again in a week

Action Items