BCAL Calibration
CCDB Tables
base_time_offset
Full path: /BCAL/base_time_offset
Rows: 1   Columns: 2
Location: Used in DBCALHit_factory.cc; it is added to the hit time.
Function: To centre the ADC and TDC times at 0 as an initial calibration.
ADC_timing_offsets
Full path: /BCAL/ADC_timing_offsets
Rows: 1536   Columns: 1
Location: Used in DBCALHit_factory.
Function: To remove the 4 ns timing offset. (No longer needed after the firmware fix for the F250 ADCs.)
TDC_offsets
Full path: /BCAL/TDC_offsets
Rows: 1152   Columns: 1
Location: Used in DBCALTDCHit_factory; it is subtracted from the hit time.
Function: To remove the 32 ns offset.
channel_global_offset
Full path: /BCAL/channel_global_offset
Rows: 768   Columns: 1
Location: Used in DBCALHit_factory and DBCALTDCHit_factory; it is subtracted from the hit time.
Function: Fine timing calibration.
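As a rough guide to how the four timing tables above enter the reconstruction, the sketch below combines them into corrected ADC and TDC hit times. This is only an illustration of the descriptions above: the variable names are not those of DBCALHit_factory, and the sign applied to the ADC_timing_offsets term is an assumption.

# Illustrative only: combines the timing tables above into corrected hit times.
# Signs follow the descriptions above where stated; the sign of the
# ADC_timing_offsets term is an assumption.

def corrected_adc_time(t_raw, base_offset_adc, adc_timing_offset, channel_global_offset):
    # base_time_offset (ADC column) is added to the hit time;
    # the per-channel 4 ns offset and the fine per-cell offset are removed
    return t_raw + base_offset_adc - adc_timing_offset - channel_global_offset

def corrected_tdc_time(t_raw, base_offset_tdc, tdc_offset, channel_global_offset):
    # base_time_offset (TDC column) is added; TDC_offsets and
    # channel_global_offset are subtracted from the hit time
    return t_raw + base_offset_tdc - tdc_offset - channel_global_offset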
effective_velocities
Full path: /BCAL/effective_velocities
Rows: 768   Columns: 1
Location: Used in DBCALCluster_factory::overlap() (cluster with hit).
Location: Used in DBCALPoint_factory to calculate z and throw away points > 60 cm from the BCAL.
Location: Used in DBCALPoint to subtract the fiber propagation time from the time sum.
z_track_parms
Full path: /BCAL/z_track_parms
Rows: 768   Columns: 3
Location: Used in DBCALPoint to get the z position from delta_t.
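A minimal sketch of how z might be derived from the end times, assuming the three z_track_parms columns are quadratic polynomial coefficients in delta_t (the actual convention is defined in DBCALPoint):

# Illustrative only; the polynomial convention for z_track_parms is an assumption.

def z_from_delta_t(delta_t, p0, p1, p2):
    # quadratic parameterisation of z versus delta_t = t_upstream - t_downstream
    return p0 + p1 * delta_t + p2 * delta_t ** 2

def z_from_effective_velocity(delta_t, v_eff):
    # simple first-order estimate of z relative to the module centre
    return 0.5 * v_eff * delta_t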
tdiff_u_d
Full path: /BCAL/tdiff_u_d
Rows: 768   Columns: 1
Location: Used in DBCALHit_factory and DBCALTDCHit_factory, but we now agree that it should not be.
digi_scales
Full path: /BCAL/digi_scales
Rows: 1   Columns: 3
Location: ASCALE is not used, but TSCALE is.
Timing Calibration Order
- Two numbers (ADC, TDC) for each system (BCAL, FCAL, etc.), obtained by lining up the timing peak for all hits at zero (plugin hl_detector_timing).
- Per channel: one number to move the mean of the TDC distribution to coincide with the mean of the ADC distribution. Average these offsets and apply the average to the TDC base time.
- Timewalk correction.
- Make a correlated offset to each end of the BCAL to set z = 0 at the centre: add a time to one end and subtract the same time from the other end.
- Make a correlated offset to each end of the BCAL to set the global time correctly: add the same time to both ends. (A worked sketch of these last two steps follows this list.)
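The last two steps can be summarised with a small worked example, assuming z is set by the end time difference and the global time by the mean of the two end times (numbers are illustrative only):

# Illustrative numbers only; demonstrates why the two corrections are independent.
t_up, t_down = 102.3, 98.7        # end times (ns)

# z correction: add dt to one end and subtract it from the other.
# The difference t_up - t_down (and hence z) changes; the mean time does not.
dt_z = 0.8
t_up, t_down = t_up + dt_z, t_down - dt_z

# Global correction: add the same dt to both ends.
# The mean time changes; the difference (and hence z) does not.
dt_global = -1.2
t_up, t_down = t_up + dt_global, t_down + dt_global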
Concrete Steps in Calibration
The macros are located in /group/halld/Users/dalton/BCAL/monitoring/macros/.
- Remove the 4 ns ADC hardware offset (pass 1)
  - Run reconstruction with the BCAL_TDC_Timing plugin to produce the BCAL_TDC_Offsets/Deltat_raw/ histogram.
  - Run the analysis script over the output (a sketch of this extraction step follows this list).
    - The script for a single run is macros/extract_ADC_Dt_Zcorr.C.
    - macros/monitor.py --adc4ns can be used to run over all runs.
  - Load the constants into the CCDB table /BCAL/ADC_timing_offsets using python macros/commit_CCDB.py -d output/ADCendDt/correction/ --adc
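As an illustration of what the extraction step does (not a substitute for extract_ADC_Dt_Zcorr.C), a PyROOT sketch might look like the following. The histogram path, its layout (channel on x, end time difference on y), and the rounding of the peak to a multiple of 4 ns are all assumptions.

import ROOT

# Assumed histogram path and layout; the real macro is macros/extract_ADC_Dt_Zcorr.C.
f = ROOT.TFile.Open("hd_root.root")
h2 = f.Get("BCAL_TDC_Offsets/Deltat_raw")          # channel vs end time difference (assumed)

offsets = []
for chan in range(1, h2.GetNbinsX() + 1):
    proj = h2.ProjectionY("proj", chan, chan)      # Deltat distribution for this channel
    peak = proj.GetBinCenter(proj.GetMaximumBin()) # crude peak position
    offsets.append(4.0 * round(peak / 4.0))        # assumes the hardware offset is a multiple of 4 ns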
- Remove the 32 ns TDC hardware offset (pass 1)
  - Set the CCDB table /BCAL/TDC_offsets to 0.
  - Run reconstruction with the BCAL_online plugin to produce the /bcal/bcal_Uhit_tdiff_raw_ave histogram (pass 1).
  - Run the analysis script over the output.
    - The script for a single run is macros/extract_Uhit_tdiff.C.
    - macros/monitor.py can be used to run over all runs.
  - Load the constants into the CCDB table /BCAL/TDC_offsets using python macros/commit_CCDB.py -d output/tdiff/TDCcorrection/runs/ --tdc -v
- TDC timewalk correction (pass 2)
  - Prerequisites:
    - Remove the 4 ns ADC hardware offset
    - Remove the 32 ns TDC hardware offset
  - Run reconstruction with the BCAL_TDC_Timing plugin.
  - Run the analysis script over the output (an illustrative timewalk form is sketched after this list).
    - The script for a single run is macros/fit_TDCtimewalk.C.
    - macros/monitor.py --timewalk can be used to run over all runs.
  - Load the constants into CCDB.
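For orientation, a timewalk correction typically shifts the TDC time by an amount that shrinks as the pulse gets larger. The functional form below and the parameter names are assumptions; the form actually fitted should be taken from macros/fit_TDCtimewalk.C.

# Assumed power-law timewalk form; not necessarily what fit_TDCtimewalk.C uses.
def timewalk_corrected_time(t_tdc, pulse_peak, c0, c1, c2):
    # discriminators fire earlier on large pulses, so the correction
    # decreases with increasing pulse peak
    return t_tdc - (c0 + c1 / (pulse_peak ** c2))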
- Position from end time-difference (pass 2)
  - Prerequisites:
    - Remove the 4 ns ADC hardware offset
  - Run reconstruction with the BCAL_TDC_Timing plugin.
  - Run the analysis script over the output.
    - The script for a single run is macros/fit_ZvsDeltaT_p1.C.
    - ./macros/monitor.py --position can be used to run over all runs.
  - Load the constants into CCDB.
- Calibrate the point time offset from tracking (pass 2)
  - Prerequisites:
    - Remove the 4 ns ADC hardware offset
  - Run reconstruction with the BCAL_TDC_Timing plugin.
  - Run the analysis script over the output.
    - Linear fit (needs ~1M events per run)
      - The script for a single run is macros/ExtractTimeOffsets.C.
      - macros/monitor.py can be used to run over all runs.
    - Quadratic fit (needs 5 files per run)
  - Load the constants into CCDB.
- Attenuation length and gain ratio (pass 3)
  - Prerequisites:
    - Position calibration (pass 2)
  - Run reconstruction with the BCAL_attenlength_gainratio plugin.
  - Run the analysis script over the output (a sketch of the underlying relation follows this list).
  - Load the constants into CCDB.
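For context, this fit usually relies on the up/down amplitude ratio being exponential in z, so that its logarithm is linear with a slope set by the attenuation length and an intercept set by the gain ratio. The parameterisation below is a sketch under that single-exponential assumption; the exact convention used by BCAL_attenlength_gainratio may differ.

import numpy as np

# Sketch under a single-exponential attenuation assumption (not necessarily the
# plugin's exact convention): ln(A_up/A_down) is linear in z.
def log_amplitude_ratio(z, atten_length, gain_ratio):
    return np.log(gain_ratio) - 2.0 * z / atten_length

# Fitting a straight line to ln(A_up/A_down) versus z for each channel then gives
# the attenuation length from the slope and the up/down gain ratio from the intercept.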
Calibration Details
Position
- Combine all runs into a single file.
- Run a script to check how well the existing calibration performs.
root -b -q 'macros/z_point_vs_tracking.C+("/home/dalton/work_halld/data_monitoring/RunPeriod-2023-01/mon_ver12/rootfiles/AllFileSum//hd_root_120000.root",120000,"202301_mon_ver12_AllFileSum",0,2)'
- This produces a plot like Fig. 1, which shows how well the BCAL and tracking agree.
- There are also plots like Fig. 2 for each module. Here module 7 has a channel that is different by about 12 cm over the full range.
- If a new calibration is needed, run the following root macro for each root file:
macros/z_point_pol2.C
- which does the fits, produces the parameters, and produces plots of the parameters like Fig. 3.
- It can also produce plots like Fig. 4 for each channel if desired.
- Note that it is possible to automate running over the individual runs using the monitor script, but in practice at least 20 runs have to be combined to get reasonable statistics for the fit. Use the script to do the position fits for the individual files in a directory:
python macros/monitor.py --position
- The histograms are located in:
- BCAL_TDC_Offsets/ZvsDeltat/
- Combine the calibration numbers from all the runs, or from blocks of runs (20 runs hadded together):
macros/plot_output_fit_ZvsDeltaT_p2.C
If there are multiple runs, this is a single-parameter fit, for each parameter and each channel, to the set of results determined for all the runs. If there is only one run, it is essentially a format change.
- Reformat all the calibration data for CCDB, in particular adding the existing delta_t from the CCDB to the change in delta_t from the analysis:
python macros/prepare_CCDB_position.py
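The bookkeeping performed here amounts to adding the fitted change in delta_t to the delta_t currently in the CCDB before writing the new table. A minimal sketch, with invented file names and a one-column text format assumed purely for illustration:

# Invented file names and format; illustrates only the "old + change" bookkeeping.
old_tdiff = [float(line.split()[0]) for line in open("ccdb_current_tdiff.txt")]
fit_change = [float(line.split()[0]) for line in open("fit_change_tdiff.txt")]

with open("new_tdiff_for_ccdb.txt", "w") as out:
    for old, change in zip(old_tdiff, fit_change):
        out.write(f"{old + change:.4f}\n")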