Andrea Villa (dd6174dd) at 28 Mar 21:40
Merge branch 'master' into 'avilla_ctau_renaming'
... and 12 more commits
# (c) Copyright 2000-2024 CERN for the benefit of the LHCb Collaboration #
# Change the alg name to ensure the code is finalized after all the others,
# see discussion at https://gitlab.cern.ch/lhcb/DaVinci/-/merge_requests/1058#note_7795091.
Thank you for this quick fix (even though it is not the final word, as per @clemenci's review comments on the original issue - !1058 (comment 7795091)).
I believe this YAML file is used by several tests, and you may be breaking those now that a new algorithm has been added. If I recall correctly, it is easier not to change the file here but rather to add the extra option (well, override it) in the relevant QMT file. Then this should be good to go. Maybe you are running some checks locally and can confirm ... In any case, let's try to get this merged quickly so the nightlies are all green again
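To illustrate the suggestion above, here is a minimal sketch of a test-local override: instead of editing the shared YAML, the test's own options apply the change on top of the shared configuration. The function and key names below (`override_alg_name`, `"algorithms"`) are purely illustrative, not the actual DaVinci/QMT API.

```python
# Hypothetical sketch: a test-specific override applied on top of shared
# options, so the shared YAML stays untouched and other tests keep passing.

def override_alg_name(options, old_name, new_name):
    """Return a copy of an options dict with one algorithm instance renamed."""
    overridden = dict(options)
    overridden["algorithms"] = [
        new_name if alg == old_name else alg
        for alg in options.get("algorithms", [])
    ]
    return overridden

# Shared configuration (as loaded from the common YAML):
base = {"algorithms": ["GenFSRMerge", "GenFSRLog"]}

# Test-local override (what would live in the relevant QMT file):
test_options = override_alg_name(base, "GenFSRLog", "ZGenFSRLog")
print(test_options)  # only this test sees the renamed instance
```

The point of the pattern is isolation: the override lives next to the one test that needs it, so the shared file keeps a single source of truth.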
Andrea Villa (413a28c6) at 28 Mar 19:18
renamed TAU to CTAU in DaVinciExamples tests
Thank you very much @clemenci, and well spotted! I did not think of checking the Gaudi version in the table header, and I only realised that the logs were no longer accessible.
All you say makes sense and I totally agree with your suggestions.
Davide Fazzini (58042c93) at 28 Mar 17:49
fix typo
Davide Fazzini (a9498b29) at 28 Mar 17:44
fix for the merge of the genfsr
I do not know whether an issue has been opened about the failing test, but since it was discussed here, I will post my report here.
Yesterday I had a chat with @egraveri and @dfazzini and we concluded that the test failure was probably the result of a change in the relative order of finalization of the algorithms GenFSRMerge and GenFSRLog.
Looking a bit more closely at what changed between the latest working build (https://lhcb-nightlies.web.cern.ch/nightly/lhcb-head/3813/) and the first failing one (https://lhcb-nightlies.web.cern.ch/nightly/lhcb-head/3814/), I noticed that the main change was the upgrade of Gaudi from v37r2 to v38r0.
Indeed, Gaudi v38r0 includes a change that alters the order in which algorithms are initialized/finalized: gaudi/Gaudi!1508 (merged)
Since algorithms now happen to be finalized in alphabetical order, to fix the test it is enough to make sure that the name of the GenFSRLog instance sorts lexicographically after the name of the GenFSRMerge instance (@dfazzini should be preparing an MR in that direction).
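A quick way to see why the rename works: with alphabetical finalization, whichever instance name sorts first is finalized first. The replacement name `"ZGenFSRLog"` below is purely illustrative; any name sorting after `"GenFSRMerge"` would do.

```python
# Gaudi v38r0 (gaudi/Gaudi!1508) finalizes algorithms in alphabetical order.
# "GenFSRLog" < "GenFSRMerge", so GenFSRLog is finalized first, which breaks
# the test. Renaming the GenFSRLog *instance* so it sorts after "GenFSRMerge"
# restores the required order.

before = sorted(["GenFSRLog", "GenFSRMerge"])
print(before)  # GenFSRLog would be finalized first

after = sorted(["GenFSRMerge", "ZGenFSRLog"])
print(after)   # GenFSRMerge now finalizes first, as required
```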
As a side note, relying on a specific finalization order of algorithms is not a good idea. A better approach would be to refactor the algorithms that operate on the FSRs into tools, and have a single algorithm invoke those tools in the right order during finalization. OTOH, the old FSR machinery is deprecated and we should migrate to the new FSRs (which, by the way, do not require merging), so it is better to invest the small resources we have in the migration rather than in cleaning up the old system.
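The tool-based refactor suggested above can be sketched as follows. Class and method names here are illustrative stand-ins, not the actual Gaudi tool interfaces: the point is that one algorithm owns the ordering explicitly, so correctness no longer depends on the scheduler's alphabetical ordering.

```python
# Sketch: FSR-handling logic lives in "tools"; a single algorithm drives
# them in a fixed, explicit order during its finalize step.

class FSRMergeTool:
    def run(self, log):
        log.append("merge FSRs")

class FSRLogTool:
    def run(self, log):
        log.append("log FSRs")

class FSRFinalizer:
    """One algorithm owning the ordering of FSR operations."""

    def __init__(self, tools):
        self.tools = tools  # explicit, deterministic order

    def finalize(self, log):
        for tool in self.tools:
            tool.run(log)

log = []
FSRFinalizer([FSRMergeTool(), FSRLogTool()]).finalize(log)
print(log)  # merging always happens before logging
```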
Hi @dfazzini, yes, as Eduardo said, storing this information in the FSRs is possible and is already on my to-do list :)
I don't think we need to open all the files just for the configuration. Ideally, the options stored in the FSRs of the files processed by each AP job should be identical, given that the input files were generated using the same Moore steps.
Therefore, my proposal is to read one file, extract the options needed to configure the AP, and then, during the execution of the job on each input file, have DaVinci verify that the metadata passed to the AP matches the data stored in that file's FSR. If it does not, it can raise an error.
On the other hand, this could lead to a lot of failures, so another idea could be to perform this consistency check on the FSR contents at the end of the production. That way it would be performed only once, rather than every time the AP is run.
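The check proposed above can be sketched in a few lines: configure from the first file's FSR, then verify that every subsequent file carries identical metadata and raise on mismatch. The flat-dict FSR layout and the helper names below are assumptions for illustration only.

```python
# Sketch: configure the AP from one file's FSR, then check each input file
# against that reference, failing loudly on any metadata mismatch.

EXPECTED_KEYS = ("input_type", "data_type", "simulation",
                 "lumi", "dddb_tag", "conddb_tag")

def extract_config(fsr):
    """Pull the AP-relevant options out of one file's FSR."""
    return {k: fsr[k] for k in EXPECTED_KEYS}

def check_consistency(reference, fsr):
    """Raise if a file's FSR disagrees with the reference configuration."""
    mismatches = {k: (reference[k], fsr[k])
                  for k in EXPECTED_KEYS if fsr[k] != reference[k]}
    if mismatches:
        raise ValueError(f"FSR metadata mismatch: {mismatches}")

# Illustrative FSR content (tag values are made up):
fsr0 = {"input_type": "ROOT", "data_type": "Upgrade", "simulation": True,
        "lumi": False, "dddb_tag": "tag-a", "conddb_tag": "tag-b"}
config = extract_config(fsr0)
check_consistency(config, fsr0)  # identical metadata passes silently
```

Running the check once at the end of the production, as suggested, would simply mean calling `check_consistency` over the collected FSRs in a single pass instead of per job.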
Anfeng Li (919a09c9) at 28 Mar 13:01
Remove redundant information in DecayTreeFitterResults
Anfeng Li (fa18c449) at 28 Mar 12:58
Remove redundant information in MCReconstructible
... and 2 more commits
Many thanks! Should be fixed now.
So this will be stored at the file level, in which case, how do you envision this working, given that we'll be operating on bookkeeping paths?
Is the idea to open all of the files during job configuration? What if they're inconsistent?
dddb_tag, conddb_tag
We should probably use the new names rather than the deprecated detdesc ones.
Hi @dfazzini, in fact we recently had a discussion on enhancing what we store as FSRs, see the task https://gitlab.cern.ch/lhcb-dpa/project/-/issues/271. Aside from the config items you list, we have agreed to store the Options configuration. It might be useful, @eleckste, to update https://gitlab.cern.ch/lhcb-dpa/project/-/issues/271 with the conclusions of our meeting :-).
In any case, yes @dfazzini, we are getting there: solving this task will give us self-descriptive data and more consistency checks in our DPA jobs.
Quick update on this topic: I saw that MR LHCb!4404 (merged) has been merged, so the new FSRs are now available. I discussed with @nskidmor the possibility of adding some metadata to the new FSRs at the Sprucing level, in particular [input_type, data_type, simulation, lumi, dddb_tag, conddb_tag]: this is the information needed in the AP to allow automatic input-data configuration. Nicole said that these could be easily added. @eleckste, do you think you can add this information once the work on the encoding keys is finalized?
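For concreteness, the six proposed fields could be gathered in a single record like the one below. This is a sketch only: the dataclass, field types, and the sample tag values are assumptions, and the real schema is whatever LHCb!4404 and the Sprucing code define.

```python
# Illustrative record for the metadata proposed above; not the actual
# FSR schema from LHCb!4404.

from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class SprucingFSRMetadata:
    input_type: str
    data_type: str
    simulation: bool
    lumi: bool
    dddb_tag: str
    conddb_tag: str

# Sample values are made up for illustration:
meta = SprucingFSRMetadata(input_type="ROOT", data_type="Upgrade",
                           simulation=False, lumi=True,
                           dddb_tag="some-dddb-tag",
                           conddb_tag="some-conddb-tag")
print(asdict(meta))  # exactly the dict the AP would need to auto-configure
```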
I just went ahead and added this. Sorted :-).