
Calibration_Database

jrquirk edited this page Dec 16, 2014 · 8 revisions

Layout

There are currently two tables in this database. If either table is missing, an exception is thrown early in the program. However, if rootana is invoked with the -c option (calibration), it continues even if the tables are missing, because rootana then expects to be doing some sort of calibration pass on the data (though it doesn't know which). If information is later requested from a table that wasn't found, an exception is thrown.

Updating

To update the database, first run rootana with the appropriate modules files. This produces files named calib.<run number>.<calibration type>.csv for the pedestal/noise/timing calibrations and calib.GeEnergy.csv for the germanium energy calibration. After checking them by eye, you can upload them to the database of your choice:

AlcapDAQ/analyzer/rootana/scripts/merge_calibration_database.py <database> <csv file 1> <csv file 2> ...

It creates the necessary tables and columns in the database.
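As a rough sketch of the kind of operation the merge script performs (this is not the real merge_calibration_database.py; table and column handling here is illustrative, inferring everything from the CSV header row):

```python
# Illustrative sketch of merging a calibration CSV into an SQLite
# database: create the table from the CSV header if needed, then
# insert every data row. The real merge_calibration_database.py may
# differ in details (e.g. updating existing rows).
import csv
import sqlite3

def merge_csv(db_path, csv_path, table):
    with open(csv_path) as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    cols = ", ".join('"%s"' % c for c in header)
    placeholders = ", ".join("?" for _ in header)
    con = sqlite3.connect(db_path)
    con.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table, cols))
    con.executemany('INSERT INTO "%s" (%s) VALUES (%s)'
                    % (table, cols, placeholders), data)
    con.commit()
    con.close()
```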

Pedestals and Noises

This table has four columns: run, channel, pedestal, and noise. The information can be accessed with double SetupNavigator::GetPedestal(const IDs::channel&) and double SetupNavigator::GetNoise(const IDs::channel&). To produce this information, the pedestalsandnoise_calib.cfg modules file can be used. It's a short file:

list_modules
dump_contents

[ MODULES ]
peds = PlotTPI_PedestalAndNoise

[ peds ]
n_samples=5
export_sql=true

This indicates that the first 5 samples (fewer if the pulse is shorter) are used to calculate the average pedestal for each pulse, as well as the standard deviation per pulse. The values are histogrammed. If export_sql=true (default: false), the means of the pedestal and noise calculations are written to a CSV file named calib.run#####.PedestalAndNoise.csv.
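The per-pulse calculation described above can be sketched as follows. This is illustrative code, not the actual PlotTPI_PedestalAndNoise implementation: the pedestal is the mean of the first n_samples samples, the noise their standard deviation, and the stored per-channel values are the means of those quantities over all pulses.

```python
# Sketch of the pedestal/noise estimate (illustrative, not rootana's
# actual code). Each "pulse" is a list of ADC samples.
from statistics import mean, pstdev

def pulse_pedestal_noise(samples, n_samples=5):
    head = samples[:n_samples]  # fewer samples if the pulse is shorter
    return mean(head), pstdev(head)

def channel_pedestal_noise(pulses, n_samples=5):
    # The CSV stores the mean pedestal and mean noise over all pulses.
    vals = [pulse_pedestal_noise(p, n_samples) for p in pulses]
    return mean(v[0] for v in vals), mean(v[1] for v in vals)
```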

Coarse Timing Offsets

This table has at least two columns, run and channel, followed by a variable number of columns: for each generator used to calculate the timing offset, there is a column named after the generator's string. When calculating the offset, though, generators need to be told not to load the timing offset (if it exists) from the database. This is done with no_time_shift=true, which must be put in explicitly for each generator. This part of the generator's configuration string is stripped before reading from the database. For instance, FirstComplete(0.50, true), FirstComplete(0.50, false), and FirstComplete(0.50) all look up the same column in the database: FirstComplete#{constant_fraction=0.50}.
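The normalization from configuration string to database column can be sketched like this. The real logic lives inside rootana; the exact string format here is an assumption based only on the FirstComplete example above.

```python
# Illustrative sketch: strip the no_time_shift flag from a generator
# configuration string and build the database column name. Assumes the
# FirstComplete#{constant_fraction=...} format shown in the wiki text.
import re

def db_column_name(config):
    # Drop a trailing ", true" / ", false" (the no_time_shift flag).
    name = re.sub(r",\s*(true|false)\s*\)$", ")", config)
    # FirstComplete(0.50) -> FirstComplete#{constant_fraction=0.50}
    m = re.match(r"FirstComplete\(([^)]*)\)$", name)
    if m:
        return "FirstComplete#{constant_fraction=%s}" % m.group(1)
    return name
```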

If you later want to use a different timing-related parameter, you need to run the calibration again.

You can retrieve the offset with double SetupNavigator::GetCoarseTimeOffset(const IDs::source&). The modules file timeoffset_calib.cfg can be used, and it is a bit more complicated than the previous one. Below is just the part of the file showing calculation of the offset for the thick right silicon slow channel and the germanium fast channel.

list_modules
dump_contents

[ MODULES ]
analyse_pulses = MakeAnalysedPulses
r2s = PlotTAP_TDiff(SiR2-S, muSc, 3000., 3500., true)
gf = PlotTAP_TDiff(Ge-F, muSc, 29000., 34000., true)

[ analyse_pulses ]
muSc=FirstComplete(0.60,true)
muSc+=FirstComplete(0.90,true)
SiR2-S=FirstComplete(0.60,true)
Ge-F=FirstComplete(0.90,true)

In the PlotTAP_TDiff module, we plot each detector against muSc, binning in the timing window 3000 ns-3500 ns for the silicon and 29 us-34 us for the germanium. The true is the export_sql argument; it forces the output of calib.run#####.CoarseTimeOffset.csv.
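The quantity being accumulated can be sketched as follows. PlotTAP_TDiff histograms time differences against muSc inside the window; how rootana extracts the stored offset from that histogram is not specified here, so taking the mean of the in-window differences is an illustrative assumption, not necessarily rootana's method.

```python
# Sketch (illustrative): collect time differences t_det - t_muSc that
# fall inside the configured window, and summarize them as a single
# coarse offset. Times are in ns.
from statistics import mean

def coarse_time_offset(t_det, t_musc, lo, hi):
    diffs = [td - tm
             for td in t_det
             for tm in t_musc
             if lo <= td - tm <= hi]
    return mean(diffs)
```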

Pitfalls

There are a number of problems here. If the parameter order changes, new parameters are introduced, or you use "0.5" instead of "0.50", the timing offset that was correctly calculated for your generator won't be loaded, and you'll have to copy the data over to a new column in the calibration database.

Adding CSV Files to a database

Use these commands from issue #216:

$ ./rootana -i tree.root -o out.root -m configurations/pedestalsandnoise_calib.cfg -c
...
$ scripts/merge_calibration_database.py ./calibration.db calib.run?????.PedestalAndNoise.csv
$ ./rootana -i tree.root -o out.root -m configurations/timeoffset_calib.cfg -c
...
$ scripts/merge_calibration_database.py ./calibration.db calib.run?????.CoarseTimeOffset.csv

Energy Calibration (currently a bit outdated; work in progress)

Right now only the germanium energy calibration has any sort of automation, motivated by the shift in gain from dataset to dataset. The calibration is a multipoint in situ fit of certain background/activation peaks. The columns are run, channel, gain, gain_err, offset, offset_err, chi2, and ndf. The channel column is included to accommodate other calibrations introduced in the future. The calibration is a line of the form Energy = gain*ADC + offset, with the *_err columns holding the errors on the two parameters returned by the ROOT fit to that line. The chi2 and ndf (number of degrees of freedom) are also from that linear fit.
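In practice the fit is done by ROOT; the stdlib-only least-squares sketch below just shows where each stored column (gain, offset, their errors, chi2, ndf) comes from, under the simplifying assumption of equal uncertainty sigma on every energy point.

```python
# Illustrative least-squares fit of Energy = gain*ADC + offset,
# with the standard analytic parameter errors for equal sigmas.
# Not the ROOT fit used by the actual calibration.
import math

def fit_line(adc, energy, sigma=1.0):
    n = len(adc)
    sx, sy = sum(adc), sum(energy)
    sxx = sum(x * x for x in adc)
    sxy = sum(x * y for x, y in zip(adc, energy))
    d = n * sxx - sx * sx
    gain = (n * sxy - sx * sy) / d
    offset = (sxx * sy - sx * sxy) / d
    gain_err = sigma * math.sqrt(n / d)
    offset_err = sigma * math.sqrt(sxx / d)
    chi2 = sum(((y - (gain * x + offset)) / sigma) ** 2
               for x, y in zip(adc, energy))
    ndf = n - 2  # two fitted parameters
    return gain, offset, gain_err, offset_err, chi2, ndf
```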

Germanium Calibration

The germanium energy calibration is not as simple as the above and is currently a multistep process. Below are the steps.

Step 1

Run rootana with the ge.cfg modules file on as much of the dataset of interest as you can. To get enough statistics, several runs have to be merged together.

$ AlcapDAQ/analyzer/batch/jobscripts/run_production.py --production=rootana --new=geenergycalib --version=3 --database=~/production.db --modules=AlcapDAQ/analyzer/rootana/configurations/ge.cfg --dataset=Al50b

Step 2

Run the merge script, which takes five arguments: group size (how many runs to hadd), production database, calibration database, ODB directory, and rootana output directory:

$ AlcapDAQ/analyzer/rootana/scripts/merge_group.sh 5 ~/production.db ~/calibration.db ~/data/odb ~/data/out/v999

The output is a set of merged files named <dataset>_<group number>.root (e.g. Al50b_1.root), as well as a new database merge.db with information on the merged files in two tables.

PedestalAndNoise

| file | channel | pedestal | noise |
| --- | --- | --- | --- |
| Al50b_1 | muSc | 3174.9 | 0.823 |
| Al50b_1 | SiR2-F | 2682.528 | 1.904542 |
| ... | ... | ... | ... |
| Al50b_2 | muSc | 3174.6 | 0.765 |
| ... | ... | ... | ... |
| Al100_1 | muSc | 3173.2 | 1.001 |
| ... | ... | ... | ... |

Merge

| file | runs | time | runtime |
| --- | --- | --- | --- |
| Al50b_1 | 3563 3564 3565 3566 3567 | 1387604653.92 | 2836.0 |
| Al50b_2 | 3568 3569 3570 3571 3572 | 1387607518.98 | 2834.0 |
| ... | ... | ... | ... |
* *PedestalAndNoise*: For each group of runs, the average pedestal and noise.

* *Merge*: For each group of runs, a text field indicating the runs that were used (SQLite can't store arrays of integers, so this must be processed separately), the weighted average Unix time in seconds of the run, and the total run time of the group (not accounting for any sort of dead time).
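The two time quantities in the Merge table can be sketched as follows. The exact weighting used by the merge script is an assumption here (per-run start times weighted by run length); the total runtime is the plain sum of run lengths, with no dead-time correction, as noted above.

```python
# Illustrative sketch of the Merge-table time columns: a weighted
# average Unix time for the group, and the summed runtime. The
# weighting scheme is an assumption, not taken from the real script.
def merge_times(run_start_times, run_lengths):
    total = sum(run_lengths)
    weighted = sum(t * w for t, w in zip(run_start_times, run_lengths)) / total
    return weighted, total
```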

Step 3

Run the germanium energy run-by-run calibration in the same directory as the merged files:

root[0] .L AlcapDAQ/analyzer/rootana/scripts/ge/ge_energy_runbyrun_calib.c
root[1] ge_energy_runbyrun_calib("merge.db")

Some plots are also produced.

Step 4

Merge the resulting calib.run#####.Energy.csv files into the database like before.

SiR Dataset issue

The Ge calibration does not work on datasets where the germanium fast and slow pulses were not both fed into the UH CAEN, because of the way pulses are paired by timestamp. In the future this will need to be modified to rely on TDetectorPulses instead, but that has not been done yet.
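The timestamp pairing can be sketched as nearest-neighbour matching in time. This is hypothetical code, not rootana's: it shows why pairing only makes sense when both channels share a digitizer clock, since the timestamps must be directly comparable.

```python
# Illustrative sketch of pairing fast and slow pulses by timestamp:
# each fast pulse is matched to the slow pulse nearest in time.
# Only meaningful when both channels were digitized by the same
# device (the UH CAEN), so their timestamps share a clock.
def pair_by_timestamp(fast_times, slow_times):
    return [(tf, min(slow_times, key=lambda ts: abs(ts - tf)))
            for tf in fast_times]
```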
