October 22, 2014 Calibration

GlueX Calibration Meeting
Wednesday, October 22, 2014
11:00 am, EDT
JLab: CEBAF Center, F326

Connection Using Bluejeans

  1. To join via a Polycom room system, go to the IP address 199.48.152.152 (bjn.vc) and enter the meeting ID: 630804895.
  2. To join via a web browser, go to https://bluejeans.com/630804895.
  3. To join via phone, use one of the following numbers and the Conference ID: 630804895
    • US or Canada: +1 408 740 7256 or
    • US or Canada: +1 888 240 2560
  4. Upon connection, all microphones are automatically muted. To unmute your mic on a Polycom or equivalent unit, enter *4.
  5. More information on connecting to Bluejeans is available.

Agenda

  1. Announcements
  2. Status Update
  3. Commissioning Planning updates
    1. Tracking
    2. Calorimetry
    3. Others
  4. Fall 2014 Commissioning Simulations updates
  5. Data Monitoring
    1. Kei:
      1. Have been developing and running scripts to run plugins over simulated data
      2. David just put in a fix for the DAQ plugin; will test all plugins and macros throughout the next week.
      3. Will have scripts ready to run over real data as it comes in (see Data Monitoring Procedures: https://halldweb.jlab.org/wiki/index.php/Data_Monitoring_Procedures)
      4. For the most recent results (including pedestals in simulated MC), see files in /u/scratch/kmoriya/detcom_01/
      5. The monitoring/ directory under the above will contain files with the ROOT output of the plugins (soon!)
    2. Sean
    3. Justin (version 0.2):
      1. Run Browser (https://halldweb.jlab.org/cgi-bin/data_monitoring/ver0.2/runBrowser.py)
      2. Time Series (temporary static page, https://halldweb.jlab.org/data_monitoring/ver0.2/timeSeriesTest.html)
    4. Commissioning Plan (https://halldweb.jlab.org/hdops/wiki/index.php/Offline_Analysis_Commissioning)
    5. Data Monitoring Database & Histograms
  6. Software Versioning (Mark I.)
  7. AOB
Detector | developed? | tested with data? | notes
---------+------------+-------------------+------------------------------------------------
BCAL     | YES        | YES               |
CDC      | YES        | YES               |
FCAL     | YES        | YES               |
FDC      | YES        | YES               |
PSC, PS  | NO         | NO                | comments from Simon
ST       | YES        | YES               |
TAGH     | YES        | NO?               | Nathan working on plugin; initial macro results: https://halldweb.jlab.org/wiki/images/d/d4/TAGH_monitor.pdf
TAGM     | NO?        | NO?               | Richard
TOF      | YES        | NO?               |


Minutes

Attending: Sean (NU); Simon, Lubomir, Kei, Adesh, Mark I., Mark D., Beni (JLab); Paul, Curtis, Naomi, Will L. (CMU); Tegan (Regina); Justin (MIT); Matt S. (IU)

  1. Status Update
    • Adesh is a new IU postdoc who is based at JLab, and one of the items he is currently working on is per-channel timing offsets for the FCAL.
    • Zisis communicated by email that Andrei and Noemi are working on extracting calibrations for the BCAL.
  2. Fall 2014 Commissioning Simulations updates
    • We have simulated 5 billion beam photons for the no-field and 1200 A magnetic field settings. This corresponds to ~1 second of beam (see the rate sketch after these minutes). The exact normalization was calculated last week by Richard Jones. Sean will update the simulation wiki pages to document this calculation.
    • Sean is analyzing these data and did not have results ready for this meeting.
    • We discussed the issue of BCAL noise simulation at length.
      • At the October Collaboration meeting, it was noticed that the BCAL dark noise simulation generated hits at a substantial rate, but there were very few noise hits in the data taken over the summer. Will Levine brought this to Sean's attention on Friday. To try to get some reasonable numbers from this simulation, Sean and Will decided on a kludge that only adds dark hits to BCAL cells that already contain a hit from a real photon (a schematic of the kludge appears after these minutes). The BCAL reconstruction is tuned to expect these dark hits to be folded in, and a substantial retuning of this code is outside the scope of this exercise. The BCAL results from these simulations should therefore be taken with an appropriately-sized grain of salt.
      • The current BCAL dark noise model depends on several parameters whose values do not reflect the current configuration of the BCAL. This model needs to be reevaluated.
      • There was a side discussion of fADC threshold settings. Beni made the point that if the threshold is set too close to the pedestal, this produces very bad results from the fADC algorithms (see the threshold sketch after these minutes).
  3. Data Monitoring
    • Kei has been developing and running scripts to run plugins over simulated data. The data that is written to the RAID disks in the counting house will show up on tape within ~20 minutes. He has also been putting in a substantial amount of effort in testing the online monitoring plugins and debugging a variety of issues.
      • He also reviewed the current status of the online plugins. An initial version of the tagger hodoscope monitoring was presented.
      • The plan is to have a copy of the online monitoring ROOT file available for the monitoring web pages immediately after the run is done. Right now, the group disk is mounted on the online machines, but it is not terribly large. Saving the monitoring ROOT files on the work disk would be better, and David L. will follow up on this (a sketch of the copy step appears after these minutes).
    • Simon has been busy incorporating the pair spectrometer into sim-recon. He has checked in updates to the data model to load PS/PSC hits. He has code, not yet checked in, that produces hits in HDGeant, along with factories to create raw and calibrated hits. He has also incorporated code from Aristeidis for a TOF display into hdview2.
    • Sean produced a new version of the monitoring DB with a nearly-final set of values for the start of running. The DB has now been moved to the MySQL server on hallddb. He is polishing scripts that will run over the ROOT files and extract the monitoring data (see the extraction sketch after these minutes).
    • Justin has produced a new version of the offline monitoring web pages. He is currently working on incorporating all the new information from the database as well.
  4. Mark I. talked about his scheme for controlling different versions of GlueX software.
    • The base is an XML file that contains the version strings for all of the components of the offline software stack. These values can be used as input to the build scripts, to build a version of the software from scratch, and to define environment variables to load a version of the software for analysis. An alpha version of this system is being readied for release. These files will also be used by the offline data monitoring system to track the different versions of software used for reconstruction (see the version-file sketch after these minutes).
    • Matt S. brought up a couple of points. One is that the capability of the current HALLD_MY area for local development should be maintained, especially now that people will be writing plugins to do analyses. Mark I. said that in the current version of SBMS, it works for plugins but not for libraries. David L. pointed out that plugins will build correctly if one uses the mkplugin script (https://halldweb.jlab.org/wiki/index.php/HOWTO_make_a_plugin) to create a new plugin, but NOT if one copies one of the existing plugins from the sim-recon tree.
    • Matt's other question was about capturing the CCDB version used to reconstruct the data. Sean mentioned that this information was being saved in the data monitoring DB, but it should probably be saved in Mark's XML file as well.
  5. AOB
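
The normalization quoted in item 2 (5 billion simulated beam photons corresponding to ~1 second of beam) implies an average photon rate on the order of 5x10^9 per second. The sketch below only restates that scaling using the numbers quoted in these minutes; it is not Richard Jones's actual normalization calculation, which will be documented on the simulation wiki pages.

  # Implied scaling from the numbers quoted in the minutes (not the exact
  # normalization calculation, which will be documented on the simulation wiki).
  n_photons_simulated = 5e9     # simulated beam photons per field setting
  beam_time_equivalent = 1.0    # ~1 second of beam (approximate)

  implied_rate = n_photons_simulated / beam_time_equivalent   # photons / s
  print("implied average photon rate: %.1e /s" % implied_rate)

  # Photons needed to emulate a longer period of beam, e.g. 5 minutes:
  seconds_wanted = 5 * 60
  print("photons for %d s of beam: %.1e" % (seconds_wanted, implied_rate * seconds_wanted))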
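
The BCAL dark-hit kludge described in item 2 can be summarized as: generate dark hits as before, but keep them only for cells that already contain a hit from a real photon, so the reconstruction still sees the dark-count contribution it was tuned for without flooding empty channels. The following is a schematic of that logic as described in the minutes, assuming a simple hit representation; it is not the actual sim-recon implementation.

  def merge_dark_hits(real_hits, dark_hits):
      """Keep dark hits only for BCAL cells that already have a real-photon hit
      (schematic of the kludge described in the minutes, not sim-recon code)."""
      occupied = set(hit["cell"] for hit in real_hits)
      kept_dark = [hit for hit in dark_hits if hit["cell"] in occupied]
      return real_hits + kept_dark

  # Example with made-up hits: the dark hit in cell (2, 5) is kept because a
  # real hit occupies that cell; the one in (7, 1) is dropped.
  real = [{"cell": (2, 5), "E": 120.0}]
  dark = [{"cell": (2, 5), "E": 2.0}, {"cell": (7, 1), "E": 1.5}]
  print(merge_dark_hits(real, dark))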
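
Beni's point about fADC thresholds (item 2) can be illustrated with a toy calculation: when the readout threshold sits only slightly above the pedestal, ordinary pedestal fluctuations cross it and the firmware processes noise as if it were a pulse. The pedestal value, noise width, and thresholds below are made-up illustrative numbers, not the actual Hall D fADC settings.

  import random

  pedestal = 100.0      # ADC counts (made-up)
  noise_sigma = 3.0     # ADC counts (made-up)
  n_samples = 100000

  def fake_hit_fraction(threshold):
      """Fraction of noise-only samples that cross the readout threshold."""
      crossings = sum(1 for _ in range(n_samples)
                      if random.gauss(pedestal, noise_sigma) > threshold)
      return crossings / float(n_samples)

  # A threshold ~1 sigma above the pedestal fires on noise constantly,
  # while one several sigma away almost never does.
  for threshold in (103.0, 110.0, 120.0):
      print("threshold %.0f: fake-hit fraction %.4f" % (threshold, fake_hit_fraction(threshold)))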
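
For the plan in item 3 to make the online monitoring ROOT file available to the web pages right after a run ends, a minimal sketch of the copy step is shown below. The source and destination directories and the file-naming convention are assumptions for illustration; the actual locations (group disk vs. work disk) are what David L. is following up on.

  import os
  import shutil

  # Hypothetical locations; the real source and destination are still being decided.
  ONLINE_DIR = "/path/to/online/monitoring"    # where the online ROOT file lands
  WEB_DIR = "/path/to/web/visible/monitoring"  # area served to the monitoring web pages

  def publish_run(run_number):
      """Copy the online monitoring ROOT file for one run to the web area."""
      fname = "hdmon_run%06d.root" % run_number   # assumed naming convention
      src = os.path.join(ONLINE_DIR, fname)
      if not os.path.exists(src):
          print("run %d: monitoring file not written yet" % run_number)
          return False
      shutil.copy2(src, os.path.join(WEB_DIR, fname))
      print("run %d: published %s" % (run_number, fname))
      return True

  if __name__ == "__main__":
      publish_run(1234)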
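
The scripts Sean is polishing (item 3) run over the monitoring ROOT files and push summary quantities into the monitoring database on hallddb. Below is a minimal sketch of that kind of extraction, assuming PyROOT and the MySQLdb bindings are available; the histogram path, table, and column names are invented for illustration and are not the actual monitoring DB schema.

  import ROOT
  import MySQLdb   # assumes the MySQL-python bindings are installed

  def extract_and_store(root_file, run_number):
      """Pull a summary quantity from one monitoring histogram and store it.
      The histogram path and the table/column names are placeholders."""
      f = ROOT.TFile.Open(root_file)
      hist = f.Get("some_detector/some_histogram")   # hypothetical name
      if not hist:
          raise RuntimeError("histogram not found in %s" % root_file)
      mean, rms = hist.GetMean(), hist.GetRMS()
      f.Close()

      db = MySQLdb.connect(host="hallddb", db="monitoring_placeholder",
                           user="writer", passwd="...")
      cur = db.cursor()
      cur.execute("INSERT INTO run_summary (run, mean, rms) VALUES (%s, %s, %s)",
                  (run_number, mean, rms))
      db.commit()
      db.close()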
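
To make item 4 concrete, here is a sketch of what a version-set XML file and a small reader that turns it into environment settings might look like. The tag names, package list, version placeholders, and directory layout are assumptions for illustration only; the actual format is defined by Mark I.'s alpha release.

  # Hypothetical version-set file (illustrative tag and attribute names only):
  #
  #   <gversions>
  #     <package name="jana"      version="X.Y.Z"/>
  #     <package name="sim-recon" version="X.Y.Z"/>
  #     <package name="ccdb"      version="X.Y.Z"/>
  #   </gversions>

  import xml.etree.ElementTree as ET

  def versions_from_xml(path):
      """Return {package: version} from a version-set XML file."""
      root = ET.parse(path).getroot()
      return dict((p.get("name"), p.get("version")) for p in root.findall("package"))

  def environment_lines(versions, top="/path/to/builds"):
      """Emit shell 'export' lines pointing each package at a versioned build
      directory (the directory layout here is an assumption)."""
      return ["export %s_HOME=%s/%s/%s-%s"
              % (name.upper().replace("-", "_"), top, name, name, ver)
              for name, ver in sorted(versions.items())]

  if __name__ == "__main__":
      print("\n".join(environment_lines(versions_from_xml("version.xml"))))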