kinit -n [userid]
xhost + cmsuafng.fnal.gov
ssh -t cmsuafng.fnal.gov
source /afs/fnal.gov/files/code/cms/setup/cshrc uaf    [LPC only]
setenv SCRAM_ARCH `scramv1 arch`
mkdir ~/DEV
cd ~/DEV
scramv1 project CMSSW CMSSW_0_9_0_pre3
cd CMSSW_0_9_0_pre3/src
At the LPC only:
cmscvsroot CMSSW
cvs login    [98passwd]
Next, check out the necessary packages:
cvs co -r HEAD DataFormats/CSCDigi
cvs co -r HEAD EventFilter/CSCRawToDigi
cvs co -r HEAD CondFormats/CSCObjects
cvs co -r HEAD RecoLocalMuon/CSCRecHit
(Note that you do not necessarily need all of these packages.)
Get copies of the "private" routines that Michael wrote as a starting point.
(Notice the periods at the end of the cp commands!)
These routines are meant for your analysis, and from them you can get an idea
of how to hook your own routines into the CMSSW framework: besides writing
the routines themselves, you need to list them in the BuildFile. Without that,
the routines will not be "known" to CMSSW.
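For orientation, a scram BuildFile for a test library has roughly the following shape. This is only a hedged sketch - the exact tags vary between releases, so treat the BuildFile you copy below as the authoritative version. (The library name CSCRecHitReader is taken from the build warning quoted later in this page.)

```
# Sketch only: declare the packages your code uses,
# then the library built from your source files.
<use name=FWCore/Framework>
<use name=DataFormats/CSCDigi>
<library file=myRoutines.cc name=CSCRecHitReader>
</library>
```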
cd RecoLocalMuon/CSCRecHit/test/
cp -v ~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/myRoutines.cc .
cp -v ~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/myRoutines.h .
cp -v ~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/myHistograms.h .
cp -v ~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/BuildFile .
cp -v ~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/mtccproc.cfg .
Notice!
Meanwhile, I have copied the current version of all these files to ~schmittm/public/ on lxplus, where you can get at them, I hope. The tar file ALL_THESE_FILES.tar.gz contains all the files in that area, including the /results subdirectory.
cd ~/DEV/CMSSW_0_9_0_pre3/src
eval `scramv1 runtime -csh`
scramv1 b
This will take several minutes. You may see a warning like this:
@@@@ Checking shared library for missing symbols:
tmp/slc3_ia32_gcc323/src/RecoLocalMuon/CSCRecHit/test/libCSCRecHitReader.so: undefined symbols
_ZNK12CSCStripDigi8getStripEv
needed by tmp/slc3_ia32_gcc323/src/RecoLocalMuon/CSCRecHit/test/libCSCRecHitReader.so
If that occurs, then try scramv1 b a second time.
cd ~/DEV/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test
eval `scramv1 runtime -csh`
cmsRun mtccproc.cfg
The program may "sit" there for a while before showing anything on the screen. This is normal.
If you see the error
ConvertFile::ConvertFile() File file:./mtcc.root was not found.
then you need to make a soft link to a real file - see below.
If the job runs successfully, then you should see something like this:
<Runs>
<Run>2181</Run>
</Runs>
<InputSourceClass>PoolSource</InputSourceClass>
<EventsRead>1000</EventsRead>
</InputFile>
TimeReport> Time report complete in 37.4453 seconds
 Time Summary:
 Min: 0.00365001
 Max: 7.43148
 Avg: 0.0374453
monHists: Write
monHists: Write
monHists: Write
monHists: Write
monHistsGlobal: Write
The configuration file is mtccproc.cfg and it looks like this:
process TEST = {
# Specify your input file(s)
source = PoolConvert {
untracked vstring fileNames = {
'file:./mtcc.root'
}
untracked int32 maxEvents = 1000
}
# Specify the output file for the reconstructed events.
module out = PoolOutputModule {
untracked string fileName ="eventsOutput.root"
}
# These are some useful utilities.
service = Timing { }
service = SimpleMemoryCheck { }
# specify the data unpacker, which makes the digis
module cscunpacker = CSCDCCUnpacker {
untracked bool Debug = false
untracked bool PrintEventNumber = false
FileInPath theMappingFile = "CondFormats/CSCObjects/data/csc_slice_test_map.txt"
untracked bool UseExaminer = true
untracked uint32 ErrorMask = 0xDFCFEFFF
untracked uint32 ExaminerMask = 0x7FB7BF6
}
# These are some geometry modules needed for recHits and segments
# (I'm not sure what to do about the magnetic field.)
include "Geometry/MuonCommonData/data/muonIdealGeometryXML.cfi"
include "Geometry/CSCGeometry/data/cscGeometry.cfi"
# include "MagneticField/Engine/data/volumeBasedMagneticField.cfi"
# Specify the recHit reconstruction routine.
# The first file sets default parameters.
include "RecoLocalMuon/CSCRecHit/data/CSCRecHit2DProducer.cfi"
module rechitproducer = CSCRecHit2DProducer{
string CSCStripDigiProducer = "cscunpacker"
string CSCWireDigiProducer = "cscunpacker"
int32 no_of_chamber_types = 9
vstring algo_types = {
"CSCRecHit2DFromStripsAndWires",
"CSCRecHit2DFromORedStrips"
}
VPSet algo_psets = {
{using common_params},
{using common_params}
}
vint32 algo_per_chamber_type = { 2, 1, 1, 1, 1, 1, 1, 1, 1 }
untracked bool debug = true
}
# Specify the segment reconstruction routine.
include "RecoLocalMuon/CSCSegment/data/CSCSegmentAlgorithmSK.cfi"
include "RecoLocalMuon/CSCSegment/data/CSCSegmentAlgorithmTC.cfi"
module segmentproducer = CSCSegmentProducer {
InputTag inputObjects = rechitproducer
# Name of RecHitProducer producer module(s)...
string CSCRecHit2DProducer = "rechitproducer"
# Choice of the building algo: 1 SK, 2 TC...
int32 algo_type = 1
VPSet algo_psets = {
{using CSCSegAlgoSK}, {using CSCSegAlgoTC}
}
}
# Finally, specify my user analysis package !
module mine = myRoutines{
}
# Here is the path of modules for the analysis of a single event:
path p = {cscunpacker, rechitproducer, segmentproducer, mine}
# Specify the output:
# endpath e = {out}
}
If ./mtcc.root does not exist, make a soft link to a real data file. (For example, "ln -s /tmp/schmittm/MTCC/mtcc.00002181.A.testStorageManager_0.0.root mtcc.root".)
By default, the output file eventsOutput.root is not written - you need to uncomment the line
endpath e = {out}
in order to get the file.
It might be wise to direct that output to the /tmp/ disk.
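For example, the PoolOutputModule block in mtccproc.cfg could be changed along these lines ("yourname" is a placeholder for your own username):

```
module out = PoolOutputModule {
    untracked string fileName = "/tmp/yourname/eventsOutput.root"
}
```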
The routines myRoutines and myHistograms create collections of histograms, some of which may be useful for monitoring the MTCC data. All of these histograms are written to the file monHists.root. (This file can get large due to a number of 2D histograms. You may want to edit myRoutines.cc so that the file is written to your /tmp area. Later I will pass a parameter from the configuration file to make this more convenient...)
You can make formatted plots from these histograms using the
routines stored in
~schmittm/PUBLIC/CMSSW_0_9_0_pre3/src/RecoLocalMuon/CSCRecHit/test/results
Most of these make plots for a single chamber - you need to edit
them to get what you want.
I have example MTCC data files you can use, but it is not obvious how long the files will remain in these temporary areas.
You should make soft links to the files you want to analyze as
explained above. You can use the soft link in the mtccproc.cfg file.
In order to get these files yourself, or to get other files, you need to use CASTOR, which is a general file-storage system at CERN. At Fermilab there is a different system.
In order to find what files are available, try:
nsls -l /castor/cern.ch/cms/emuslice/globaldaq/
The latest MTCC data are written to
/castor/cern.ch/cms/MTCC/data/0000NNNN/A/
where NNNN is the run number.
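The directory name zero-pads the run number to eight digits (run 2181 lives under 00002181, matching the file name in the ln -s example above). A small sh sketch to construct the path for a given run (the login shell above is csh, so adapt as needed):

```shell
#!/bin/sh
# Build the CASTOR directory name for a given MTCC run number.
# The run number is zero-padded to 8 digits: run 2181 -> 00002181.
RUN=2181
DIR=/castor/cern.ch/cms/MTCC/data/$(printf '%08d' "$RUN")/A/
echo "$DIR"
# Then list its contents with: nsls -l "$DIR"
```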
To copy a file to your own disk, use
rfcp /castor/cern.ch/cms/emuslice/globaldaq/
Note that this can take several minutes.
At the LPC, some files have been put on a public disk:
/uscmst1b_scratch/lpc1/lpcmuon/MTCC.
See the LPC cosmic data
web page for more information.
We will try to hand-pick some "good" files and make them available to everyone, as well as specify some useful information about them. This will take some time.