GlueX Calibration Meeting
Wednesday, January 28, 2015
11:00 am, EST
JLab: CEBAF Center, F326
Connection Using Bluejeans
- To join via a Polycom room system, dial the IP address 18.104.22.168 (bjn.vc) and enter the meeting ID: 630804895.
- To join via a web browser, go to https://bluejeans.com/630804895.
- To join via phone, use one of the following numbers and the Conference ID: 630804895
- US or Canada: +1 408 740 7256 or
- US or Canada: +1 888 240 2560
- Upon connection, all microphones are automatically muted. To unmute your mic on a Polycom or equivalent unit, enter *4.
- More information on connecting to BlueJeans is available.
Agenda
- Software Review, February 10-11, 2015
- Collaboration Meeting February 19-21 at Jefferson Lab
- Timing Offset Calibrations (Mike S.)
- REST data status (Justin)
- Calibration status/updates
- Start Counter
- Calibration Dashboard
- Data Monitoring
Attending: Sean (NU); Simon, Mike S., Paul, Mark I., Mark D., Nathan, Adesh, Eric, Alex B., Will M., Lubomir, Eugene, Beni, Kei (JLab); Curtis, Naomi (CMU); Justin (MIT); Mahmoud (FIU); Andrei (Regina)
Timing Offset Calibrations
Mike Staib reported on his work on automatically calculating timing offsets for the whole detector. The procedure leverages tracking information and takes the correlations between detectors into account. He starts with all time offsets set to zero, first does a rough timing alignment on the calibrated hit objects, then does a rough calibration of the TDC times with respect to the ADC times, and finally uses negative tracks to calculate offsets relative to the start counter time and derive a finer calibration (in some cases, per channel). He obtains calibrations that look as good as those currently in the CCDB, and sometimes better. For example, the start counter timing resolution for his example run improved from 1.4 ns to 1.1 ns, and more tracks were reconstructed.
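The staged procedure described above can be sketched as follows. The data layout, field names, and the choice of reference time here are illustrative assumptions, not the actual plugin code:

```python
# Hypothetical sketch of one stage of the timing-offset procedure: estimate
# one additive offset per channel as the mean of (hit time - reference time)
# over many hits. The same machinery applies to each stage with a different
# reference (global average, ADC time, or start-counter time from tracks).
from collections import defaultdict

def channel_offsets(hits, ref_times):
    """hits: list of (channel, hit_time); ref_times: matching reference times.
    Returns {channel: mean time offset}."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for (channel, t_hit), t_ref in zip(hits, ref_times):
        sums[channel] += t_hit - t_ref
        counts[channel] += 1
    return {ch: sums[ch] / counts[ch] for ch in sums}

# Stage 1: rough global alignment (all offsets start at zero).
# Stage 2: align TDC times to ADC times channel by channel.
# Stage 3: refine against the start-counter time from matched negative tracks.
```

In practice each stage subtracts the offsets it finds before the next stage runs, which is why the interdependence question raised below matters.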
There was some discussion about how best to structure this effort. The plugin gives a good starting point for timing calibrations, but the final calibrations should be done by the detector groups, which have the best understanding of their detectors. Either Mike's procedure will need to be modular enough for other people to contribute, or more refined efforts can build on top of these calibrations. A major open question is how interdependent the calibrations are, e.g., whether a small change in one detector necessarily means that all subsystems must be recalibrated. Remaining steps include:
- tools for submission of constants to central CCDB
- tests with other runs for stability and robustness of calibrations
- monitoring of results
- investigate automated running of procedure (once/twice a day? offline v. online farms?)
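On the first item, submitting constants to the central CCDB typically means producing a plain-text table and uploading it with the CCDB tools. A minimal sketch; the single-column table layout and the command shown in the comment are assumptions for illustration, not the actual table definition:

```python
# Sketch: dump per-channel offsets to a plain-text table of the kind the
# CCDB command-line tools accept. One row per channel, one offset column
# (an assumed layout -- the real table definition fixes the columns).
def write_ccdb_table(path, offsets):
    with open(path, "w") as f:
        for channel in sorted(offsets):
            f.write(f"{offsets[channel]:.4f}\n")

# A subsequent upload might then look like (hypothetical table path):
#   ccdb add /START_COUNTER/timing_offsets offsets.txt
```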
REST file status
Justin compared results obtained from processing EVIO and REST files. The REST files were generated over the weekend using the latest calibrations. The REST results were generally similar to the EVIO ones, were produced about 30 times faster, and took up far less disk space (roughly 300-650 times less, i.e., more than two orders of magnitude). Note that the EVIO data were taken in "mode 8", which stores the waveform for each hit along with the pulse integral/time summary data. Some caveats:
- There appear to be timing problems with the tagger microscope
- The default PID algorithms are not ready to be used
- There is a small discrepancy in the results after the kinematic fit
FCAL
Matt and John have made progress in isolating a useful sample of pi0's and are proceeding with the relative gain calibrations.
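A common iterative scheme for pi0-based relative gain calibration (not necessarily the exact method used here) compares, channel by channel, the measured two-photon mass peak to the nominal pi0 mass and rescales the gain:

```python
# Illustrative sketch of one iteration of pi0 gain calibration; the channel
# numbering and peak extraction are assumed, not the actual calibration code.
PI0_MASS = 0.13498  # GeV, nominal pi0 mass

def update_gains(gains, peak_masses):
    """The two-photon mass goes like sqrt(E1*E2), so if only one photon's
    channel gain g changes, the peak shifts like sqrt(g). To move the
    measured peak m_meas onto the nominal mass, scale g by (m_pi0/m_meas)^2.
    gains, peak_masses: {channel: value}."""
    return {ch: g * (PI0_MASS / peak_masses[ch]) ** 2
            for ch, g in gains.items()}
```

In practice the per-channel peak position comes from fitting the invariant-mass histogram, and several iterations are needed because each photon pair shares two channels.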
BCAL
No new results to report, but work continues: Will McGinley is working on relative gain calibration using pi0's; Mark Dalton is working on improving the timing of individual fADC channels; cosmic ray data is still being analyzed at Regina. Sean is helping with implementing the calibrations in the reconstruction software.
CDC
Progress is being made on the per-channel timing calibration and on the overall energy scale (for dE/dx measurements). Naomi has been studying drift time spectra with different gas mixes using the CDC prototype at CMU. She finds a good match to the photon beam data with a gas mix of ~57-58% argon. An outstanding question is the size of the contribution from alcohol in the gas mix.
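One simple observable for comparing drift-time spectra across gas mixes is the spectrum endpoint (the maximum drift time), which shrinks as the drift velocity rises. A minimal sketch; the 98% quantile is an arbitrary illustrative cutoff to suppress noise tails, not the actual analysis choice:

```python
# Sketch: estimate the drift-time spectrum endpoint as a high quantile of
# the measured hit drift times. Comparing endpoints between gas mixes
# (e.g. prototype vs. photon-beam data) tracks the change in drift velocity.
def drift_time_endpoint(times, quantile=0.98):
    """times: iterable of drift times (ns). Returns the endpoint estimate."""
    ts = sorted(times)
    return ts[int(quantile * (len(ts) - 1))]
```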
FDC
Progress has also been made in understanding the gas mix in the FDC, with an argon contribution of ~45% and some contribution from alcohol. Lubomir has successfully aligned the strips and wires in one package and is moving on to align the others. He has obtained resolutions of 200 um for the wires and 250 um for the strips. Vlad's gain calibrations have been given to Simon, who is formatting them for addition to the CCDB.
TOF
Sasha O. has been extracting calibrations and putting them into the TOF hit factories. He has calibrations for timewalks, time offsets, and light velocity. One subtle issue is that he has been working on the sim-recon trunk, while most development over the past six months has been on the commissioning branch, so some work will be needed to synchronize the two efforts.
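Timewalk corrections of the kind mentioned here compensate for the fact that small pulses cross a fixed discriminator threshold later than large ones. A minimal sketch, assuming a power-law parameterization in the ADC amplitude; the actual TOF functional form and constants may differ:

```python
# Sketch of a power-law timewalk correction: the measured time is shifted
# earlier by an amount that grows as the pulse amplitude shrinks. The
# coefficients c0, c1 would come from fitting (t_meas - t_ref) vs. amplitude.
def timewalk_corrected(t_meas, amplitude, c0, c1):
    """t_meas: measured TDC time; amplitude: pulse height (ADC counts);
    c0, c1: fitted timewalk coefficients (illustrative parameterization)."""
    return t_meas - c0 / amplitude**c1
```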
Start Counter
Mahmoud has been working on getting the test stand calibrations into the CCDB with Sean's help. Eric is working on extracting per-counter calibrations from data.
Tagger Hodoscope
Nathan showed plots indicating only a very small timewalk correction. There was discussion of features of the data that could use more study, such as the backgrounds and the apparent double-peaked structure in the main signal peak.
Tagger Microscope
Alex is finishing up a study of the quality of the fibers, which includes information on their gains. His other current priority is replacing electronics boards that were not delivering the proper bias voltages, leading to some lost channels.
Simulations
Simon checked in some geometry updates, including adding the tilt of the Start Counter. He is also working on incorporating updates from the survey results. Sean is checking out these updates in preparation for a larger run and updating his calculations of hit rates.
Data Monitoring
Kei gave an update on the offline monitoring run performed over the past weekend. He ran on the new CentOS 6.5 farm nodes with 6 threads per job. Besides the monitoring, he also generated REST files and various skims. Using a new database and tools from Mark Ito, he was able to report various statistics on his jobs.