Normalization of Commissioning Beam Simulations

From: Richard Jones <richard.t.jones@uconn.edu>
To: Sean A Dobbs <s-dobbs@northwestern.edu>
CC: HallD Software Group <HallD-Offline@jlab.org>
Subject: MC beam sim rates normalization for commissioning

Sean,

I have matched the beam simulation spectrum to the calculated bremsstrahlung 
spectral rate to compute the beam sim normalization factor shown in the attached 
plot, assuming:

  *   100 nA beam
  *   10 micron aluminum radiator (1.12e-4 rad. len.)
  *   5 mm collimator 75 m from the radiator

You can rescale this result by the ratio of radiator thickness times beam current 
for other settings.
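
For example, here is a minimal sketch of that rescaling; the function and variable 
names are illustrative assumptions, not part of the simulation code:

  import numpy as np  # not required, plain arithmetic

  # Nominal conditions from the list above; linear scaling in both
  # beam current and radiator thickness is the rule stated in the text.
  NOMINAL_CURRENT_NA = 100.0      # 100 nA electron beam
  NOMINAL_RADIATOR_RL = 1.12e-4   # 10 micron Al radiator, in radiation lengths

  def rate_scale_factor(current_nA, radiator_rl):
      """Multiply the nominal rates by this factor for other beam settings."""
      return (current_nA / NOMINAL_CURRENT_NA) * (radiator_rl / NOMINAL_RADIATOR_RL)

  # Example: 200 nA on a 20 micron Al radiator (2.24e-4 r.l.) -> factor of 4
  print(rate_scale_factor(200.0, 2.24e-4))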

See the attached plot. The red curve is the computed rate incident on the face 
of the primary collimator in units of /GeV/s under these conditions. The 
histogram is a simulation of 1 million beam events with a special ntuple 
configured to measure the spectrum at the primary collimator face, converted 
from counts/bin to counts/GeV and scaled up by a factor of 975 so that it 
matches the theoretical rate curve.
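
For reference, the counts/bin to counts/GeV conversion described above amounts to 
dividing each bin content by its width in GeV before applying the overall scale 
factor. A sketch with made-up bin edges and contents (placeholders, not actual 
simulation output):

  import numpy as np

  # Hypothetical histogram of beam-photon energies at the collimator face.
  bin_edges = np.linspace(0.0, 12.0, 121)       # 0-12 GeV in 0.1 GeV bins
  counts = np.random.poisson(100, size=120)     # counts per bin (placeholder)

  bin_widths = np.diff(bin_edges)               # GeV per bin
  counts_per_gev = counts / bin_widths          # counts/bin -> counts/GeV

  scale = 975.0                                 # overall factor quoted above
  rate_per_gev = counts_per_gev * scale         # comparable to the red curve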

The normalization factor is that each beam sim event corresponds to 215 ps of 
beam time. [Corrected value]

Please be careful in how you use this, because in beam sim mode not every event 
is written out to hddm. Only events with hits in the detector actually get 
written to hddm, and many of the beam photons get stopped in the primary 
collimator. So when you normalize, you need to use the number of triggers you 
actually simulated (or the eventNo of the last event), not the number of event 
records that happen to appear in the hddm output stream.
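
As a concrete illustration of that caveat, here is a minimal sketch of how the 
215 ps/event factor might be applied; the numbers are hypothetical, and the key 
point is that the denominator uses the number of generated events, not the number 
of hddm records:

  # Per-event beam time for the beam simulation at the nominal settings above.
  TIME_PER_EVENT_S = 215e-12   # 215 ps of beam per generated event

  # Use the number of events actually generated (e.g. the eventNo of the last
  # event), NOT the number of records written to the hddm output file.
  n_generated = 1_000_000      # hypothetical: total beam events thrown
  n_selected = 2_345           # hypothetical: events passing your selection

  equivalent_beam_time = n_generated * TIME_PER_EVENT_S   # seconds of beam
  rate_hz = n_selected / equivalent_beam_time             # rate at 100 nA, 10 um Al

  print(f"Equivalent beam time: {equivalent_beam_time:.3e} s")
  print(f"Selected-event rate:  {rate_hz:.3e} Hz")

For other radiator/current settings, multiply the result by the rescaling factor 
discussed above.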

-Richard Jones

[Attached figure: Beamnormfact commissioning.gif]