Commit 1d17ec06 authored by Davide Fazzini

fix failing test

parent 74222bca
Pipeline #4025100 failed with stages in 30 seconds
@@ -5,15 +5,20 @@ ApplicationMgr SUCCESS
ApplicationMgr INFO Application Manager Configured successfully
DetectorPersistencySvc INFO Added successfully Conversion service:XmlCnvSvc
DetectorDataSvc SUCCESS Detector description database: git:/lhcb.xml
FSROutputStreamDstWriter INFO Data source: EventDataSvc output: SVC='Gaudi::RootCnvSvc'
EventClockSvc.FakeEventTime INFO Event times generated from 0 with steps of 0
ApplicationMgr INFO Application Manager Initialized successfully
DeFTDetector INFO Current FT geometry version = 62
ApplicationMgr INFO Application Manager Started successfully
EventSelector INFO Stream:EventSelector.DataStreamTool_1 Def:DATAFILE='mdf:root://eoslhcb.cern.ch//eos/lhcb/wg/rta/WP2/Hlt2Throughput/minbias_filtered_1.mdf' SVC='LHCb::MDFSelector' OPT='READ' IgnoreChecksum='YES'
EventSelector.DataStreamTool_1 INFO Compression:0 Checksum:1
EventSelector SUCCESS Reading Event record 1. Record number within stream 1: 1
ApplicationMgr INFO Application Manager Stopped successfully
FSROutputStreamDstWriter INFO Set up File Summary Record
FSROutputStreamDstWriter INFO Events output: 1
LAZY_AND: DaVinci #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
NONLAZY_OR: FileSummaryRecords #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
LAZY_AND: GenFSR #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
RecordStream/FSROutputStreamDstWriter #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
NONLAZY_OR: UserAnalysis #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
LAZY_AND: UserAlgorithms #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
Gaudi__Examples__VoidConsumer/Gaudi__Examples__VoidConsumer #=200 Sum=200 Eff=|( 100.0000 +- 0.00000 )%|
@@ -467,7 +467,13 @@ def expand_input_files(options):
     for file_name in options.input_files:
         if "root://eoslhcb.cern.ch//" in file_name:
             import XRootD.client as c
-            expanded_files.extend(c.glob(file_name))
+            # Workaround: files with a prefix before 'root://'
+            # (e.g. 'mdf:', 'PFN:') are not expanded correctly by .glob
+            if ":root" in file_name:
+                prefix, file_name = file_name.split(":", 1)
+                expanded_files.extend([f"{prefix}:{f}" for f in c.glob(file_name)])
+            else:
+                expanded_files.extend(c.glob(file_name))
         else:
             import glob
             expanded_files.extend(glob.glob(file_name, recursive=True))
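
For reference, a minimal standalone sketch of the prefix handling introduced above. The helper names (split_prefix, fake_glob) and the wildcard path are made up for illustration; fake_glob only stands in for the XRootD glob call, it is not the real API.

def split_prefix(file_name):
    # Separate an optional access prefix such as 'mdf:' or 'PFN:'
    # from the 'root://...' URL so the URL alone can be globbed.
    if ":root" in file_name:
        prefix, url = file_name.split(":", 1)
        return prefix + ":", url
    return "", file_name

def fake_glob(pattern):
    # Stand-in for the XRootD glob call used in the real code (sketch only).
    return [pattern.replace("*", f"minbias_filtered_{i}.mdf") for i in range(2)]

name = "mdf:root://eoslhcb.cern.ch//eos/lhcb/wg/rta/WP2/Hlt2Throughput/*.mdf"
prefix, url = split_prefix(name)
expanded = [f"{prefix}{f}" for f in fake_glob(url)]
# expanded now keeps the 'mdf:' prefix on every expanded file name,
# which is what the .glob call alone would have dropped.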