# Daily Python Scripts (aka DPS)
Python scripts for the daily tasks in particle physics (Run 2).
If you are working on soolin (or anywhere else where dependencies like ROOT/LaTeX are not available), run DPS within CMSSW:
```bash
git clone https://github.com/BristolTopGroup/DailyPythonScripts
cd DailyPythonScripts
source bin/env.sh
# make sure matplotlib is up to date (should return 1.5.1 or above):
python -c 'import matplotlib; print matplotlib.__version__'
# test ROOT and rootpy
root -l -q
time python -c "import ROOT; ROOT.TBrowser()"
time python -c 'import rootpy'
time python -c 'from ROOT import kTRUE; import rootpy'
```
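The steps above assume the environment provides a CMSSW runtime. If `bin/env.sh` does not already set one up for you, here is a minimal sketch of sourcing it by hand first (the release name is illustrative; `cmsenv` is the standard CMSSW alias):
```bash
# assumes a CMSSW release area already exists; adjust the version to yours
cd /path/to/CMSSW_7_6_0/src
cmsenv      # alias for 'eval `scramv1 runtime -sh`'
cd -        # return to the previous directory
```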
To set up a conda environment on your own machine, please have a look at the Conda section below.
Dependencies:
- ROOT ≥ 6.04
- matplotlib ≥ 1.5
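A quick way to confirm your installation meets these minima (a sketch; `root-config` ships with ROOT):
```bash
root-config --version                                         # want 6.04 or above
python -c 'import matplotlib; print matplotlib.__version__'   # want 1.5 or above
```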
All plots/histograms provided by this package are based on either toy MC or simulated events from the CMS experiment. Real data is not included at any point.
Folder structure:
- `config/*` - files to save presets for available modules
- `data/*` - for input/output of ROOT & JSON files (will be created by some scripts)
- `examples/*` - generic examples for available modules
- `plots/*` - default output folder for plots (will be created by some scripts)
- `src/*` - specific use of available modules
- `test/*` - unit tests for available modules
- `tools/*` - available modules
- Run `experimental/BLTUnfold/merge_unfolding_jobs.sh` to merge the BLT unfolding CRAB output files (each sample needs to be merged into one file; it cannot be split over several files).
- Move the merged files to e.g. `/hdfs/TopQuarkGroup/mc/7TeV/v11/NoSkimUnfolding/BLT/` (or the corresponding `8TeV` path).
- Run `produceUnfoldingJobs.py` with the fine-binning option turned on, on the central sample only (run locally on soolin): `python produceUnfoldingJobs.py -c=7 -f --sample=central`
- Move the fine-binned unfolding file to `/hdfs/TopQuarkGroup/results/histogramfiles/AN-14-071_6th_draft/7TeV/unfolding/` (or the corresponding `8TeV` path).
- Run the `src/cross_section_measurement/00_pick_bins.py` script to find the new binning.
- Modify `config/variable_binning` (and `TTbar_plus_X_analyser.cpp` in AnalysisSoftware) with the new binning.
- Run `python experimental/BLTUnfold/runJobsCrab.py` with the last few lines commented out, uncommenting the line `print len(jobs)` to print the number of jobs.
- Update `queue` in `experimental/BLTUnfold/submitBLTUnfold.description` with the printed number of jobs.
- `cd` up to the folder containing DailyPythonScripts and run `tar --exclude='external/vpython' --exclude='any other large/unnecessary folders in DailyPythonScripts' -cf dps.tar DailyPythonScripts` (the tar file should be <= 100 MB).
- Run the `experimental/BLTUnfold/produceUnfoldingHistogram.py` script on the merged files using HTCondor to convert the unfolding files to our binning: `condor_submit submitBLTUnfold.description`. Check progress using `condor_q your_username` (a condensed sketch of these HTCondor steps follows this list).
- Once all jobs have finished, untar the output files. Note that `tar -xf *.tar` only extracts the first archive, so loop instead: `for f in *.tar; do tar -xf "$f"; done`
- The output ROOT files should be in a folder called `unfolding`. Move these new files to `/hdfs/TopQuarkGroup/results/histogramfiles/AN-14-071_6th_draft/7TeV/unfolding/` (or the corresponding `8TeV` path).
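For orientation, a condensed sketch of the HTCondor part of the workflow above; the submit-file path and the username are assumptions, so adjust them to your layout:
```bash
# run from the folder that contains DailyPythonScripts
tar --exclude='external/vpython' -cf dps.tar DailyPythonScripts
condor_submit DailyPythonScripts/experimental/BLTUnfold/submitBLTUnfold.description
condor_q your_username                   # monitor the jobs
for f in *.tar; do tar -xf "$f"; done    # extract every output archive once done
```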
- After running the Analysis Software, move the output files to `/hdfs/TopQuarkGroup/results/histogramfiles/AN-14-071_7th_draft/7TeV` (or `8TeV`) using `python experimental/move_BAT_output_files_to`
- Find out how many merging jobs are needed using `python experimental/merge_samples_onDICE.py -c 7 -n 1 --listJobs` (use `-c 8` for 8 TeV).
- Modify the following lines in `experimental/submitMerge.description` (a `sed` sketch follows this list):
  - centre-of-mass energy: `arguments = $(process) 7` or `arguments = $(process) 8`
  - number of jobs: enter the number of merging jobs for the centre-of-mass energy in question, e.g. `queue 65`
- `cd` up to the folder containing DailyPythonScripts and run `tar --exclude='external/vpython' --exclude='any other large/unnecessary folders in DailyPythonScripts' -cf dps.tar DailyPythonScripts` (the tar file should be <= 100 MB).
- Merge the required BAT output files (e.g. SingleTop, QCD etc.) using `condor_submit DailyPythonScripts/experimental/submitMerge.description`
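As an illustration, the two edits to `submitMerge.description` could be scripted; the 8 TeV energy and the 65-job count below are example values, not defaults:
```bash
# set the centre-of-mass energy passed to each job (example: 8 TeV)
sed -i 's/^arguments = $(process) .*/arguments = $(process) 8/' experimental/submitMerge.description
# set the number of merge jobs reported by --listJobs (example: 65)
sed -i 's/^queue.*/queue 65/' experimental/submitMerge.description
```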
Once the merged files are in place, run the cross-section measurement scripts (a loop sketch follows the list):
- `x_01_all_vars`
- `x_02_all_vars`
- `x_03_all_vars`
- `x_04_all_vars`
- `x_05_all_vars`
- `x_98_all_vars`
- `x_99_QCD_cross_checks`
- `x_make_binning_plots`
- `x_make_control_plots`
- `x_make_fit_variable_plots`

(The `AN-14-071` script runs all of these automatically, if you are confident everything will run smoothly(!))
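To run them one after another by hand, here is a minimal sketch, assuming each `x_*` name is an executable script reachable from the working directory (adjust paths to the actual layout):
```bash
set -e  # stop at the first script that fails
for s in x_01_all_vars x_02_all_vars x_03_all_vars x_04_all_vars x_05_all_vars \
         x_98_all_vars x_99_QCD_cross_checks x_make_binning_plots \
         x_make_control_plots x_make_fit_variable_plots; do
    ./"$s"
done
```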
## Conda
DailyPythonScripts relies on a (mini)conda environment to provide all necessary dependencies. This section describes how to set up this environment from scratch. As a first step, download and install conda (you can skip this step if you are using a shared conda install, e.g. on soolin):
```bash
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
export CONDAINSTALL=<path to miniconda install> # e.g. /software/miniconda
bash miniconda.sh -b -p $CONDAINSTALL # or a different location
```
Next, let us create a new conda environment with some base packages:
```bash
export PATH="$CONDAINSTALL/bin:$PATH"
export ENV_NAME=dps
export CONDA_ENV_PATH=$CONDAINSTALL/envs/${ENV_NAME}
conda config --add channels http://conda.anaconda.org/NLeSC
conda config --set show_channel_urls yes
conda update -y conda
conda install -y psutil
conda create -y -n ${ENV_NAME} python=2.7 root=6 root-numpy numpy matplotlib nose sphinx pytables rootpy
```
Then activate the environment and install the remaining dependencies:
```bash
source activate $ENV_NAME
# install dependencies that have no conda recipe
pip install -U uncertainties tabulate
```
At this point you should have all the necessary dependencies, which you can try out with:
```bash
root -l -q
time python -c "import ROOT; ROOT.TBrowser()"
time python -c 'import rootpy'
time python -c 'from ROOT import kTRUE; import rootpy'
```
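In a fresh shell you will need the conda binaries back on your `PATH` before re-activating; here is a minimal sketch reusing the variables from above (`source activate`/`source deactivate` match the pre-4.4 conda syntax used in this section):
```bash
export PATH="$CONDAINSTALL/bin:$PATH"
source activate $ENV_NAME
# ... work ...
source deactivate
```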
`tdr` gives the full documentation on setting up an AN or any other TDR document. Here are some instructions for retrieving our current AN:
```bash
mkdir AnalysisNotes
svn co -N svn+ssh://svn.cern.ch/reps/tdr2 AnalysisNotes
cd AnalysisNotes
svn update utils
svn update -N notes
svn update notes/AN-17-012
eval `notes/tdr runtime -sh`
cd notes/AN-17-012/trunk
```
You can now edit the analysis note. Make sure to add any new file you create with `svn add FILE`, and commit to the repository with `svn ci` after any changes. To build the AN with LaTeX, run `tdr --draft --style=an b AN-17-012`. Any output PDF produced will be placed in `AnalysisNotes/notes/tmp/`.
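Putting the edit cycle together, a sketch (the file name and commit message are placeholders; the relative `tmp` path assumes you are still in `notes/AN-17-012/trunk`):
```bash
svn add my_new_section.tex           # placeholder name for a newly created file
svn ci -m "describe your change"     # commit with a message
tdr --draft --style=an b AN-17-012   # build the draft PDF
ls ../../tmp/                        # the PDF lands in AnalysisNotes/notes/tmp/
```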