GlueX Offline Meeting, October 17, 2012

From GlueXWiki
Revision as of 08:02, 18 October 2012 by Marki (Talk | contribs)


GlueX Offline Software Meeting
Wednesday, October 17, 2012
1:30 pm EDT
JLab: CEBAF Center, F326/327


  1. Announcements
  2. Review of minutes from the last meeting: all
  3. Report from the last Data Challenge Meeting
  4. Reconstruction sub-group reports
    1. Calorimeters
    2. Tracking
    3. PID
      1. GlueX Analysis Factories
      2. GlueX Kinematic Fitting
  5. Preliminary results from raw data tape simulation
  6. CCDB transition: Dmitry
  7. Action Item Review
  8. Review of recent repository activity: all

Communication Information

Video Conferencing


Talks can be deposited in the directory /group/halld/www/halldweb1/html/talks/2012-4Q on the JLab CUE. This directory is accessible from the web.



Attendees

  • CMU: Will Levine, Paul Mattione, Curtis Meyer
  • FSU: Nathan Sparks
  • JLab: Mark Ito (chair), David Lawrence, Yi Qiang, Dmitry Romanov, Simon Taylor, Elliott Wolin, Beni Zihlmann
  • UConn: Richard Jones

Review of minutes from the last Offline and Data Challenge meetings

Data Challenge and Software Release

We decided to proceed with the data challenge without waiting for the full implementation of the SEST format. If the format becomes available during the challenge, we will discuss switching to it. Since the principal goal of the challenge is to produce output in REST format, and only a small fraction of the HDGeant output will be saved, switching formats mid-challenge is a realistic option.

We will also use the latest version of the BCAL code, which does not use the new digitization scheme by default. This avoids having to re-tune the BCAL reconstruction for the newer scheme.

The release will also include the latest changes to the REST format recently checked in by Paul.

New CentOS 6 Nodes at JLab

David has run some jobs on the new nodes. These provide a 50% increase in the power of the JLab farm and are as yet only very lightly used. He ran HDGeant-mcsmear-DANA jobs, but without full reconstruction turned on in DANA. Some job configuration lessons have been learned already. Mark will start a mini-challenge using the new nodes soon.

Profiling the Code

We raised the issue of profiling our code again. We discussed using oprofile or gprof. Dmitry, Richard, David, and Mark will propose a solution.

Reconstruction sub-group reports


Tracking

Naomi has reported that the pattern of stereo layers in the CDC, as built, does not match the default geometry that we have in HDDS: layers with positive stereo angles have to be swapped with those with negative ones. Simon is working on implementing reality.


PID

Paul called our attention to the documentation for his new analysis tools and kinematic fitter. He encouraged us to check them out, literally and figuratively.

Nathan reported some problems using the new code with SIMD use enabled. Paul will have a look.

Preliminary results from raw data tape simulation

Elliott reported that the raw data format, using EVIO, is almost ready to use. David Abbott of the Data Acquisition group has provided a library to produce CODA data in block mode with a single event per block. The raw event plug-in for JANA was written some time ago, and David has been working on parsing the EVIO data. The FADC-125, the FADC-250, and both the 32-channel and 48-channel versions of the F1-TDC are supported at present.

David produced a file of Pythia data with the level 1 trigger applied. Elliott sees an average of 6700 words per event. If crates with no hits are eliminated, the size goes down to 4800 words per event. There is another level of reduction from eliminating hit-less boards which has not been done yet.
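As a back-of-the-envelope check of these numbers (assuming the standard 32-bit EVIO word size; the sizes in bytes are derived here, not quoted at the meeting):

```python
# Rough event-size arithmetic for the word counts Elliott quoted.
# Assumes 4-byte words, the standard EVIO word size.
WORD_BYTES = 4

full = 6700     # words/event with all crates read out
trimmed = 4800  # words/event after dropping crates with no hits

def kib(words):
    """Convert a word count to kibibytes."""
    return words * WORD_BYTES / 1024

print(f"full event:    {kib(full):.1f} KiB")    # about 26.2 KiB
print(f"trimmed event: {kib(trimmed):.1f} KiB") # about 18.8 KiB
print(f"reduction:     {100 * (1 - trimmed / full):.0f}%")  # about 28%
```

Dropping hit-less crates thus saves roughly a quarter of the event size before any board-level suppression is applied.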

We are close to being able to run this data through the reconstruction.

CCDB Improvements

Dmitry presented a recently implemented scheme that allows a choice among various versions of the calibration constants at run time. See his talk for details.

Mark mentioned that Dmitry has changed the structure of the CCDB to reduce reliance on a single API library in C++. Previously, the Python scripts used this library wrapped using the SWIG package. Now the Python scripts use SQLAlchemy to encapsulate database operations and SWIG is no longer used at all. Write operations using the C++ API are deprecated. This will make the code base easier to maintain and extend as well as simplifying the move to other database engines. The change was motivated by adopting SQLite as a server-less solution for local access to constants.
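The server-less access pattern that motivated the change can be illustrated with Python's built-in sqlite3 module. This is only a sketch: the table and column names below are hypothetical, not the actual CCDB schema, and CCDB itself routes such queries through SQLAlchemy.

```python
import sqlite3

# Minimal sketch of server-less constant lookup from a local SQLite store.
# The schema here is hypothetical and NOT the real CCDB layout.
conn = sqlite3.connect(":memory:")  # a real setup would open a .sqlite file
conn.execute("CREATE TABLE constants (path TEXT, run INTEGER, value REAL)")
conn.execute("INSERT INTO constants VALUES ('/BCAL/gain', 9001, 1.23)")

def get_constant(path, run):
    """Fetch one calibration constant for a given run; None if absent."""
    row = conn.execute(
        "SELECT value FROM constants WHERE path = ? AND run = ?",
        (path, run),
    ).fetchone()
    return row[0] if row else None

print(get_constant("/BCAL/gain", 9001))  # -> 1.23
```

Because SQLite needs no server process, a file of constants can be shipped with a job and read locally, which is what makes it attractive for farm use.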

Action Items

  1. Develop a profiling regime.
  2. Start a mini-challenge on the new farm nodes.