
Race condition in allen.py

Description

To run Allen within the Gaudi framework, we use the Python script Dumpers/BinaryDumpers/allen.py. In essence, this script starts two main threads: one running Gaudi, and the other running Allen (allen.cpp::allen).

When Allen is called, it creates and launches two kinds of threads: stream threads and I/O threads. The stream threads are the data-processing threads, i.e. those that apply the sequence of algorithms and selection lines to the input data, while the I/O threads handle input reading and output writing. The input reading thread reads slices from the input provider. All these threads are controlled from the Allen main thread via zmq sockets, so that Allen can send instructions to them and receive replies. When Allen is initialized, these threads are launched but block waiting for an instruction from it.
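The thread layout above can be sketched with plain Python threads; queues stand in for the zmq sockets (a simplification, since the real code uses zmq from C++), and all names here are illustrative:

```python
import queue
import threading

def stream_thread(control: queue.Queue, replies: queue.Queue) -> None:
    # Launched at initialization, then blocks waiting for an
    # instruction from the Allen main thread (like polling a zmq socket).
    while True:
        msg = control.get()
        if msg == "PROCESS":
            replies.put("PROCESSED")  # the real code would run the HLT1 sequence here
        elif msg == "DONE":
            return

control, replies = queue.Queue(), queue.Queue()
worker = threading.Thread(target=stream_thread, args=(control, replies))
worker.start()          # thread is now blocked on control.get()
control.put("PROCESS")  # main thread drives the worker
control.put("DONE")
worker.join()
```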

The Allen thread itself is also controlled via a zmq control socket (from allen.py). For Allen to actually start doing something, allen.py sends it a START instruction. Allen then signals the input reading thread to read slices from the input provider (slice = event data corresponding to a bunch of events).

The input reader thread can send two types of signals back to Allen: RUN or SLICE. The latter (SLICE) simply indicates that a slice is available; Allen forwards it to the stream threads through their corresponding sockets, together with a PROCESS instruction to start the processing. The RUN signal, on the other hand, reports a run change, requesting an update (if needed) of the detector data, which is done through the Allen updater (the non-event-data updater). Note that in this context both the input provider and the updater are Gaudi services.
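A minimal sketch of that dispatch (hypothetical helper and class names; the real logic lives in allen.cpp):

```python
def handle_input_reply(msg, updater, stream_sockets):
    """Dispatch a reply from the input reading thread."""
    if msg["kind"] == "RUN":
        # Run change reported: refresh detector data via the
        # (non-event-data) updater before any slice is processed.
        updater.update(msg["run"])
        return "updated"
    if msg["kind"] == "SLICE":
        # A slice is ready: hand it to a stream thread together
        # with the PROCESS instruction.
        stream_sockets.send(("PROCESS", msg["slice"]))
        return "submitted"
    raise ValueError(f"unexpected message: {msg}")

# Tiny stand-ins for the updater service and the stream sockets.
class FakeUpdater:
    def __init__(self): self.runs = []
    def update(self, run): self.runs.append(run)

class FakeSockets:
    def __init__(self): self.sent = []
    def send(self, item): self.sent.append(item)

updater, sockets = FakeUpdater(), FakeSockets()
handle_input_reply({"kind": "RUN", "run": 307020}, updater, sockets)
handle_input_reply({"kind": "SLICE", "slice": 0}, updater, sockets)
```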

Within allen.py, we first start Gaudi (gaudi.start()), which does not block program execution and starts the input provider. The input provider's loader thread keeps running to fetch data. Right after this call, we instruct the Allen thread to START, which in turn sends a START signal to the input thread, requesting slices from the input provider. The input thread keeps requesting slices until they become available. Meanwhile, allen.py continues execution and sends a STOP instruction to Allen, which does not really affect program execution. However, right after that, Gaudi is stopped (gaudi.stop()), causing its services, including the input provider and the updater, to stop.
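Reduced to its order of operations, the allen.py control sequence looks like this (stub objects with illustrative names; the key point is that gaudi.stop() does not wait for the Allen thread):

```python
calls = []

class StubGaudi:
    # Records the lifecycle calls; the real gaudi.start()/stop()
    # start and stop the Gaudi services (input provider, updater).
    def start(self): calls.append("gaudi.start")
    def stop(self):  calls.append("gaudi.stop")

class StubAllenControl:
    # Stands in for the zmq control socket to the Allen thread.
    def send(self, msg): calls.append(f"allen<-{msg}")

gaudi, allen_control = StubGaudi(), StubAllenControl()
gaudi.start()                # non-blocking: input provider starts loading
allen_control.send("START")  # Allen asks the input thread for slices
allen_control.send("STOP")   # does not really affect execution
gaudi.stop()                 # services stop, whether or not Allen is done
```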

The problem is that we have two competing threads here: the main thread of allen.py, which also manages Gaudi, and the Allen thread. Once a slice is ready for Allen (and, since we are running on a MEP, Allen gets a RUN response signal instructing it to update the detector data), Allen calls the updater, which is a Gaudi service. But it may happen that gaudi.stop() is called before Allen gets to update the detector data, which cannot be done afterwards because the Gaudi services have already been stopped. This typically happens when the input provider is not fast enough loading the data, so slices are not ready before Gaudi is stopped.
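The race can be reproduced deterministically with a small stand-alone model: a worker (the Allen thread) whose slice arrives late because the input provider is slow, and a main thread that stops the "service" immediately, as gaudi.stop() does. All names are illustrative:

```python
import threading
import time

class FakeUpdater:
    """Stand-in for the Gaudi non-event-data updater service."""
    def __init__(self):
        self._running = True
    def stop(self):
        self._running = False
    def update(self, run):
        if not self._running:
            raise RuntimeError("updater called after gaudi.stop()")

errors = []
updater = FakeUpdater()

def allen_thread():
    # Slow input provider: the RUN message (run change) only
    # arrives after the main thread has already stopped Gaudi.
    time.sleep(0.05)
    try:
        updater.update(307020)
    except RuntimeError as exc:
        errors.append(str(exc))

worker = threading.Thread(target=allen_thread)
worker.start()
updater.stop()   # gaudi.stop(): services go down before the update
worker.join()
```

In the real setup the outcome depends on timing: the update may be skipped silently or the process may crash with SIGSEGV, as the two logs below show.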

Setup and running details

I am using a single x86 logical core, the x86_64_v2-el9-gcc13-opt stack, and the following Allen options (I am not running on LHCb online machines, so the absolute paths correspond to my particular machine and are irrelevant here):

. /cvmfs/lhcb.cern.ch/lib/LbEnv
export ALLEN_SCRIPT=$ALLENDIR/Dumpers/BinaryDumpers/options/allen.py
export PARAMFILES=$STACKDIR/PARAM/ParamFiles/
export SEQUENCE=$ALLENBUILDDIR/hlt1_pp_forward_then_matching_1000KHz.json
export MEP=$LHCB_LUSTRE/Allen_data/MEP_dumps_18_09_24/bu_307020_LHCb_VCEB01_BU_0.mep

$STACKDIR/MooreOnline/build.x86_64_v2-el9-gcc13-opt/run python $ALLEN_SCRIPT --params $PARAMFILES --sequence $SEQUENCE --tags "run3/2024.Q1.2-v00.00,master" --mep $MEP --real-data -t 1 -n 100 -m 100 -r 100

Processing prints

These custom prints show the origin of the issue explained above (the [...] represents omitted data-summary prints to avoid attaching long outputs):

Expected processing flow:

allen.cpp: telling allen_control that READY
==> allen.py: allen.py: gaudi.start()
MEPProvider                            INFO Opened /sps/lhcb/gdiazlop/Allen_data/MEP_dumps_18_09_24/bu_307020_LHCb_VCEB01_BU_0.mep
MEPProvider                            INFO Reading 100 events
ApplicationMgr                         INFO Application Manager Started successfully
<== allen.py: allen.py: gaudi.start()
allen.cpp: POOLLIN allen_control: msg=START
allen.cpp: Send slice thread START to start asking for slices for input 0
MEPProvider                            INFO Writing to MEP slice index 0
MEPProvider                            INFO Calling MEP::read_mep
MEPProvider                            INFO Reading 1073741823 bytes
allen.cpp: Processing complete
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
MEPProvider                            INFO Reading 853679413 bytes
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 1 timed_out 0 n_filled 100
allen.cpp: Checking input slices (i=0) msg: RUN
allen.cpp: Requested run change from 0 to 307020
allen.cpp: Run number changing from 0 to 307020
allen.cpp: Update geometry and conditions data
AllenUpdater: Run the fake event loop to produce the new data
TBB Warning: The number of workers is currently limited to 0. The request for 1 workers is ignored. Further requests for more workers will be silently ignored until the limit changes.

HLTControlFlowMgr                      INFO Will measure time between events 0 and 0 (stop might be some events later)
HLTControlFlowMgr                      INFO Starting loop on events
HLTControlFlowMgr                      INFO Timing started at: 17:00:34
HLTControlFlowMgr                      INFO Timing stopped at: 17:00:34
ConditionsMgr                          INFO Created IOV Pool for:run(0):[300238-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[0-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[295402-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[291593-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[307020-307020]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[263836-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[300195-319274]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[303027-319274]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[307020-307023]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[274700-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[254302-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[304425-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[303096-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[276525-9223372036854775807]
ConditionsMgr                          INFO Created IOV Pool for:run(0):[400-9223372036854775807]
Align                                  INFO Alignments:(D:2831,C:25593,M:0,*:127965) Effective IOV:run(0):[307020-307020]  [0.11578 seconds]
DeMagnetConditionCall                  INFO Loading mag field from /cvmfs/lhcb.cern.ch/lib/lhcb/DBASE/FieldMap/v8r1/cdf
MagneticFieldExtension                 INFO Scale factor: 0.999283
MagneticFieldGridReader                INFO Opened magnetic field file:  /cvmfs/lhcb.cern.ch/lib/lhcb/DBASE/FieldMap/v8r1/cdf/field.v6r1.c1.down.cdf
MagneticFieldGridReader                INFO Opened magnetic field file:  /cvmfs/lhcb.cern.ch/lib/lhcb/DBASE/FieldMap/v8r1/cdf/field.v6r1.c2.down.cdf
MagneticFieldGridReader                INFO Opened magnetic field file:  /cvmfs/lhcb.cern.ch/lib/lhcb/DBASE/FieldMap/v8r1/cdf/field.v6r1.c3.down.cdf
MagneticFieldGridReader                INFO Opened magnetic field file:  /cvmfs/lhcb.cern.ch/lib/lhcb/DBASE/FieldMap/v8r1/cdf/field.v6r1.c4.down.cdf
DeviceFTGeometry                       INFO Conditions DB is compatible with FT bank version 7 and 8.
DeviceFTGeometry                       INFO Deactivated 38 links.
DependencyHandler                      INFO Inserted 23 [23] conditions to pool-iov: run(0):[307020-307020]   [0.00001 seconds]
DetectorDataService                    INFO +  Created/Accessed a total of 6231 conditions (S:  3104,L:  3104,C:    23,M:0)  Load:1.17655s/0KB Compute:0.75305s/131072KB
HLTControlFlowMgr                      INFO ---> Loop over 1 Events Finished -  WSS 3914.02, timed 0 Events: 0 ms, Evts/s = 0
allen.cpp: Checking input slices (i=0) msg: SLICE
Starting timer for throughput measurement
allen.cpp: Sending slice 0 to processor
Submitted   100 events in slice  0 to stream  0
allen.cpp: Checking input slices (i=0) msg: DONE
allen.cpp: Input complete for slice i=0
allen.cpp: POOLLIN allen_control: msg=STOP
Processed 100 events
Processed   10000 events at a rate of   196.15 events/s
Output          0 events at a rate of     0.00 events/s
allen.cpp: Sending slice 0 to write thread
allen.cpp: Checking input slices (i=1) msg: WRITTEN
allen.cpp: Processing complete
==> allen.py: gaudi.stop()
Bursts                                 INFO Number of counters : 1
[...]
ApplicationMgr                         INFO Application Manager Stopped successfully
<== allen.py: gaudi.stop()
==> allen.py: gaudi.finalize()
HLTControlFlowMgr                      INFO Memory pool: used 0.00012207 +/- 0 MiB (min: 0, max: 0) in 1 +/- 0 blocks (allocated >once in 0 +/- 0% events). Allocated capacity was 10 +/- 0 MiB (min: 10, max: 10) and 2 +/- 0 (min: 2, max: 2) requests were served
HLTControlFlowMgr                      INFO Timing table:
HLTControlFlowMgr                      INFO
[...]
ApplicationMgr                         INFO Application Manager Finalized successfully
<== gaudi.finalize()
allen.cpp: POOLLIN allen_control: msg=RESET
allen.cpp: exit event loop
2 stores; 2 empty; 1 filled.

rate_validator validation:
[...]
196.152874 events/s
Ran test for 50.980645 seconds
ApplicationMgr                         INFO Application Manager Terminated successfully

Examples of the race condition appearing:

Silent failure:
allen.cpp: telling allen_control that READY
==> allen.py: allen.py: gaudi.start()
MEPProvider                            INFO Opened /sps/lhcb/gdiazlop/Allen_data/MEP_dumps_18_09_24/bu_307020_LHCb_VCEB01_BU_0.mep
MEPProvider                            INFO Reading 100 events
ApplicationMgr                         INFO Application Manager Started successfully
<== allen.py: allen.py: gaudi.start()
allen.cpp: POOLLIN allen_control: msg=START
allen.cpp: Send slice thread START to start asking for slices for input 0
MEPProvider                            INFO Writing to MEP slice index 0
MEPProvider                            INFO Calling MEP::read_mep
MEPProvider                            INFO Reading 1073741823 bytes
allen.cpp: Processing complete
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
allen.cpp: POOLLIN allen_control: msg=STOP
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
allen.cpp: Processing complete
==> allen.py: gaudi.stop()
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
MEPProvider                            INFO Reading 853679413 bytes
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
ApplicationMgr                         INFO Application Manager Stopped successfully
<== allen.py: gaudi.stop()
==> allen.py: gaudi.finalize()
HLTControlFlowMgr                      INFO Memory pool: used 0 +/- 0 MiB (min: 1.75922e+13, max: 0) in 0 +/- 0 blocks (allocated >once in -100 +/- -100% events). Allocated capacity was 0 +/- 0 MiB (min: 1.75922e+13, max: 0) and 0 +/- 0 (min: 18446744073709551615, max: 0) requests were served
HLTControlFlowMgr                      INFO Timing table:
HLTControlFlowMgr                      INFO
[...]
HLTControlFlowMgr                      INFO Histograms converted successfully according to request.
ToolSvc                                INFO Removing all tools created by ToolSvc
ApplicationMgr                         INFO Application Manager Finalized successfully
<== gaudi.finalize()
allen.cpp: POOLLIN allen_control: msg=RESET
allen.cpp: exit event loop
run_slices thread:get_slice(1000) returned: good 1 done 1 timed_out 0 n_filled 100
2 stores; 2 empty; 0 filled.
Timer wasn't started.
ApplicationMgr                         INFO Application Manager Terminated successfully
Crash (SIGSEGV):
allen.cpp: telling allen_control that READY
==> allen.py: allen.py: gaudi.start()
MEPProvider                            INFO Opened /sps/lhcb/gdiazlop/Allen_data/MEP_dumps_18_09_24/bu_307020_LHCb_VCEB01_BU_0.mep
MEPProvider                            INFO Reading 100 events
ApplicationMgr                         INFO Application Manager Started successfully
<== allen.py: allen.py: gaudi.start()
allen.cpp: POOLLIN allen_control: msg=START
allen.cpp: Send slice thread START to start asking for slices for input 0
MEPProvider                            INFO Writing to MEP slice index 0
MEPProvider                            INFO Calling MEP::read_mep
MEPProvider                            INFO Reading 1073741823 bytes
allen.cpp: Processing complete
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
MEPProvider                            INFO Reading 853679413 bytes
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
allen.cpp: POOLLIN allen_control: msg=STOP
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
allen.cpp: Processing complete
==> allen.py: gaudi.stop()
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 0 timed_out 1 n_filled 0
run_slices thread:get_slice(1000) returned: good 1 done 1 timed_out 0 n_filled 100
allen.cpp: Checking input slices (i=0) msg: RUN
allen.cpp: Requested run change from 0 to 307020
allen.cpp: Run number changing from 0 to 307020
Update geometry and conditions data
AllenUpdater: Run the fake event loop to produce the new data
ApplicationMgr                         INFO Application Manager Stopped successfully
<== allen.py: gaudi.stop()
==> allen.py: gaudi.finalize()
HLTControlFlowMgr                      INFO Memory pool: used 0 +/- 0 MiB (min: 1.75922e+13, max: 0) in 0 +/- 0 blocks (allocated >once in -100 +/- -100% events). Allocated capacity was 0 +/- 0 MiB (min: 1.75922e+13, max: 0) and 0 +/- 0 (min: 18446744073709551615, max: 0) requests were served
HLTControlFlowMgr                      INFO Timing table:
HLTControlFlowMgr                      INFO
[...]
HLTControlFlowMgr                      INFO Histograms converted successfully according to request.
ToolSvc                                INFO Removing all tools created by ToolSvc
ApplicationMgr                         INFO Application Manager Finalized successfully
<== gaudi.finalize()
TBB Warning: The number of workers is currently limited to 0. The request for 1 workers is ignored. Further requests for more workers will be silently ignored until the limit changes.

HLTControlFlowMgr                      INFO Will measure time between events 0 and 0 (stop might be some events later)
HLTControlFlowMgr                      INFO Starting loop on events
HLTControlFlowMgr                      INFO Timing started at: 13:10:29
HLTControlFlowMgr                      INFO Timing stopped at: 13:10:29
[FATAL] Process: 'P1179055' (SignalHandler) RTL:Handled signal: 11 [SIGSEGV] Old action:(nil) Mem:0x153e2bb29cf8 Code:00000001

[INFO] Process: 'P1179055' (ExitSignalHandler) ---------------------- Backtrace ----------------------

[INFO] Process: 'P1179055' Number of elements in backtrace: 23
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/Online/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libOnlineBase.so.7.28.0.0(+0x13e580)[0x154092f77580]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/Online/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libOnlineBase.so.7.28.0.0(_ZN3RTL17ExitSignalHandler7handlerEiP9siginfo_tPv+0x224)[0x154092f7bee4]
/lib64/libc.so.6(+0x3ebf0)[0x1540bba3ebf0]
/cvmfs/lhcb.cern.ch/lib/lcg/releases/DD4hep/01.31-727d6/x86_64-el9-gcc13-opt/lib/libDDCore.so.1.30(_ZN6dd4hep16ObjectExtensions12addExtensionEyPNS_14ExtensionEntryE+0xbf)[0x154091930e9f]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/Detector/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libDetectorLib.so(_ZN4LHCb8Detector19DetectorDataServiceC1ERN6dd4hep8DetectorESt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaISB_EE+0x33e)[0x1540924c314e]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libLbDD4hep.so(_ZN4LHCb3Det8LbDD4hep9DD4hepSvcC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEP11ISvcLocator+0x1b5)[0x15408fa2bfd5]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libLbDD4hep.so(_ZNSt17_Function_handlerIFSt10unique_ptrI8IServiceSt14default_deleteIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEP11ISvcLocatorEN5Gaudi13PluginService2v27Details14DefaultFactoryIN4LHCb3Det8LbDD4hep9DD4hepSvcENSI_7FactoryIFPS1_SC_SE_EEEEEE9_M_invokeERKSt9_Any_dataSC_OSE_+0x2e)[0x15408fa2d68e]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiCoreSvc.so(_ZN14ServiceManager13createServiceERKN5Gaudi5Utils14TypeNameStringE+0x1ed)[0x154096ff66dd]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiCoreSvc.so(_ZN14ServiceManager10addServiceERKN5Gaudi5Utils14TypeNameStringEi+0x110)[0x154096ff3c20]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiCoreSvc.so(_ZN14ServiceManager7serviceERKN5Gaudi5Utils14TypeNameStringEb+0x47a)[0x154096ff43aa]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiKernel.so(_ZNK20ServiceLocatorHelper7serviceESt17basic_string_viewIcSt11char_traitsIcEEbb+0xbf)[0x15409c03e88f]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiKernel.so(_ZNK20ServiceLocatorHelper13createServiceESt17basic_string_viewIcSt11char_traitsIcEERK11InterfaceIDPPv+0x42)[0x15409c03ecc2]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/Allen/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libBinaryDumpersModule.so(_ZNK13ServiceHandleIN4LHCb3Det8LbDD4hep10IDD4hepSvcEE10i_retrieveIS3_EE10StatusCodeRPT_+0x151)[0x1540901eedb1]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libLbDD4hep.so(_ZNK4LHCb3Det8LbDD4hep11IOVProducerclERKNS_18ODINImplementation2v74ODINE+0x124)[0x15408fa41144]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libLbDD4hep.so(_ZNK5Gaudi10Functional7details11TransformerIFSt10shared_ptrIN6dd4hep4cond15ConditionsSliceEERKN4LHCb18ODINImplementation2v74ODINEENS0_6Traits11BaseClass_tI10FixTESPathINS_17FSMCallbackHolderINS_9AlgorithmEEEEEELb0EE7executeERK12EventContext+0x52)[0x15408fa42ef2]
/cvmfs/lhcb.cern.ch/lib/lhcb/GAUDI/GAUDI_v39r4/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libGaudiKernel.so(_ZN5Gaudi9Algorithm10sysExecuteERK12EventContext+0x180)[0x15409be5afe0]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libHLTScheduler.so(_ZNK10AlgWrapper7executeER12EventContextN3gsl4spanIN4LHCb10Interfaces23ISchedulerConfiguration5State8AlgStateELm18446744073709551615EEE+0x50)[0x154090d6a220]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libHLTScheduler.so(+0x11ef59)[0x154090d64f59]
/pbs/throng/lhcb/gdiazlop/Software/lhcb-stack/stack/LHCb/InstallArea/x86_64_v2-el9-gcc13-opt/lib/libHLTScheduler.so(+0x11f89a)[0x154090d6589a]
/cvmfs/lhcb.cern.ch/lib/lcg/releases/tbb/2021.10.0-2a247/x86_64-el9-gcc13-opt/lib64/libtbb.so.12(+0x22ea8)[0x15409b5caea8]
/cvmfs/lhcb.cern.ch/lib/lcg/releases/tbb/2021.10.0-2a247/x86_64-el9-gcc13-opt/lib64/libtbb.so.12(+0x250ee)[0x15409b5cd0ee]
/lib64/libc.so.6(+0x8a19a)[0x1540bba8a19a]
/lib64/libc.so.6(+0x10f210)[0x1540bbb0f210]
[INFO] Process: 'P1179055' (SignalHandler) 00 --> 0x154092f77580
[INFO] Process: 'P1179055' (SignalHandler) 01 --> 0x154092f7bee4
[INFO] Process: 'P1179055' (SignalHandler) 02 --> 0x1540bba3ebf0
[INFO] Process: 'P1179055' (SignalHandler) 03 --> 0x154091930e9f
[INFO] Process: 'P1179055' (SignalHandler) 04 --> 0x1540924c314e
[INFO] Process: 'P1179055' (SignalHandler) 05 --> 0x15408fa2bfd5
[INFO] Process: 'P1179055' (SignalHandler) 06 --> 0x15408fa2d68e
[INFO] Process: 'P1179055' (SignalHandler) 07 --> 0x154096ff66dd
[INFO] Process: 'P1179055' (SignalHandler) 08 --> 0x154096ff3c20
[INFO] Process: 'P1179055' (SignalHandler) 09 --> 0x154096ff43aa
[INFO] Process: 'P1179055' (SignalHandler) 10 --> 0x15409c03e88f
[INFO] Process: 'P1179055' (SignalHandler) 11 --> 0x15409c03ecc2
[INFO] Process: 'P1179055' (SignalHandler) 12 --> 0x1540901eedb1
[INFO] Process: 'P1179055' (SignalHandler) 13 --> 0x15408fa41144
[INFO] Process: 'P1179055' (SignalHandler) 14 --> 0x15408fa42ef2
[INFO] Process: 'P1179055' (SignalHandler) 15 --> 0x15409be5afe0
[INFO] Process: 'P1179055' (SignalHandler) 16 --> 0x154090d6a220
[INFO] Process: 'P1179055' (SignalHandler) 17 --> 0x154090d64f59
[INFO] Process: 'P1179055' (SignalHandler) 18 --> 0x154090d6589a
[INFO] Process: 'P1179055' (SignalHandler) 19 --> 0x15409b5caea8
[INFO] Process: 'P1179055' (SignalHandler) 20 --> 0x15409b5cd0ee
[INFO] Process: 'P1179055' (SignalHandler) 21 --> 0x1540bba8a19a
[INFO] Process: 'P1179055' (SignalHandler) 22 --> 0x1540bbb0f210