We can target only sites that have /cvmfs/lhcbdev.cern.ch mounted by using `j.backend.diracOpts('setTag(["/cvmfs/lhcbdev.cern.ch/"])')` (see LHCBDIRAC-890). MC samples should be replicated to T1 sites; if that is not the case, we should check with lhcb-datamanagement@cern.ch.
@shunan would you be able to check if you can run Ganga from lhcb-master and access the entire MB sample as described above? It would be great if you could also update the tutorial in an MR.
Thanks @rmatev for this solution. I tried to submit a job and promptly got this error:
```
File "JS.py", line 38, in <module>
    create_job('min-bias_down', myApp, bkk_minbias, 1, 1, 'for upgrade BDT')
File "JS.py", line 19, in create_job
    job.backend.diracOpts('setTag(["/cvmfs/lhcbdev.cern.ch/"])')
TypeError: 'str' object is not callable
```
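For reference, the TypeError itself just says that `diracOpts` is a plain string attribute being called as if it were a method. A minimal stand-alone illustration (the `Backend` class here is a hypothetical stand-in, not Ganga's actual class):

```python
class Backend:
    """Hypothetical stand-in for a Ganga backend object."""
    def __init__(self):
        self.diracOpts = ""  # a plain string attribute, not a method


b = Backend()
b.diracOpts = 'setTag(["/cvmfs/lhcbdev.cern.ch/"])'  # assignment is fine
try:
    # Calling the string as if it were a method reproduces the error.
    b.diracOpts('setTag(["/cvmfs/lhcbdev.cern.ch/"])')
except TypeError as exc:
    print(exc)  # 'str' object is not callable
```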
Using `job.backend.diracOpts = 'setTag(["/cvmfs/lhcbdev.cern.ch/"])'` doesn't work either; Ganga gives me:
```
ERROR BackendError: Error submitting job to Dirac: GangaDiracError: Traceback (most recent call last):
  File "<stdin>", line 697, in <module>
  File "<string>", line 684, in <module>
  File "/afs/cern.ch/user/s/shunan/gangadir/workspace/shunan/LocalXML/677/input/dirac-script-0.py", line 31, in <module>
    setTag(["/cvmfs/lhcbdev.cern.ch/"])
NameError: name 'setTag' is not defined (Dirac backend)
```
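The NameError suggests the `diracOpts` string is pasted verbatim into the generated dirac-script, where no `setTag` function is in scope. A minimal stand-alone reproduction of that failure mode:

```python
# Evaluating the snippet in an empty namespace, as the generated
# dirac-script apparently does, raises the same NameError.
try:
    exec('setTag(["/cvmfs/lhcbdev.cern.ch/"])', {})
except NameError as exc:
    print(exc)  # name 'setTag' is not defined
```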
I guess it's just some stupid mistake I made when configuring the job. Could you tell me how to do it correctly?
Thanks @apearce. I submitted a Ganga job processing a few (< 10) min-bias samples and it finished successfully. I will request processing of the full min-bias sample and let you know the outcome, which should (ideally) take one day.
@apearce @rmatev Just an update on this: I've managed to process nearly all min-bias samples; <1% of my jobs failed, probably due to problems at their sites. Judging from the size of the output .dst files, I actually processed slightly more events than before. However, I ran into some trouble processing those DST files with DaVinci to produce ROOT ntuples, and I still need some time to figure out the problem. If I cannot solve it by myself, I'll ask in the Mattermost channel. I think this updated tutorial is good to go anyway; I can update the documentation after the DaVinci problem is solved.
That's great news! Thank you very much for checking. If you are willing to update the docs that would be really helpful.
The ntuple processing problem might be due to !770 (merged). I've been meaning to update the docs but haven't gotten around to it yet. The only change should be that you no longer need to specify the raw event format when calling reading.decoders, i.e. you can just do reading.decoders() rather than reading.decoders(4.3) as the docs say.
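In other words, the docs change boils down to a one-line edit in the options file. A sketch, assuming `reading` is the GaudiConf.reading module already imported in the user's script (the variable name `decoder_algs` is illustrative):

```python
from GaudiConf import reading

# Before !770, the raw event format had to be passed explicitly:
# decoder_algs = reading.decoders(4.3)

# After !770, no argument is needed:
decoder_algs = reading.decoders()
```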
I've aligned my stack to the latest master, and now simply using reading.decoders() works fine; I obtain reasonable output. Variables like IPCHI2_OWNPV for child particles are now correctly filled in the branch.
By the way, I think reading.decoders(0.3) does not work, because I got the following error:
```
File "/afs/cern.ch/work/s/shunan/Upgrade/BDT/tuples/before_selection/Bu2D0K_D02KsPiPi.py", line 101, in read_hlt2
    reading.unpackers() +
File "/afs/cern.ch/work/s/shunan/Upgrade/stack/LHCb/GaudiConf/python/GaudiConf/reading.py", line 62, in decoder
    container_map = prpacking.packedToOutputLocationMap()
File "/afs/cern.ch/work/s/shunan/Upgrade/stack/LHCb/GaudiConf/python/GaudiConf/PersistRecoConf.py", line 216, in packedToOutputLocationMap
    for name, d in self._descriptors.items()
File "/afs/cern.ch/work/s/shunan/Upgrade/stack/LHCb/GaudiConf/python/GaudiConf/PersistRecoConf.py", line 216, in <dictcomp>
    for name, d in self._descriptors.items()
File "/cvmfs/lhcb.cern.ch/lib/lcg/releases/LCG_97a/Python/2.7.16/x86_64-centos7-gcc9-opt/lib/python2.7/posixpath.py", line 70, in join
    elif path == '' or path.endswith('/'):
AttributeError: 'float' object has no attribute 'endswith'
```
I guess we should pass a string '0.3' instead of a float 0.3? I'm not planning to dig further, since my problem is solved. I will try to add something to the documentation later.
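That guess is consistent with the traceback: posixpath.join checks `path.endswith('/')`, and `str` has an `endswith` method while `float` does not. A two-line check:

```python
# posixpath.join calls path.endswith('/'); a str argument has that
# method, a float does not, hence the AttributeError above.
print(hasattr(0.3, "endswith"))    # False
print(hasattr("0.3", "endswith"))  # True
```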