cmsgemonline / gem-daq / cmsgemos-analysis · Issues · #20

Status: Closed
Created Sep 17, 2020 by Louis Moureaux (@lmoureau, Developer)

Improve the handling of big files (GB scale)

Summary

The existing unpacker can be configured to read only the first N events, but it always copies and byte-swaps the whole input file. Turn it into a "streamer" that reads events in successive batches of a configurable size and maintains state between reads, along the lines of the sketch below.
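The exact interface is open; as a rough illustration only, here is one possible shape for such a streamer, assuming a Python/numpy implementation. All names (`StreamingUnpacker`, `read_batch`, `words_per_batch`) are hypothetical and not the actual unpacker API, and the sketch glosses over events that straddle a batch boundary, which the state kept between reads would also have to cover.

```python
import numpy as np

class StreamingUnpacker:
    """Read a raw data file in fixed-size batches of 64-bit words,
    byte-swapping only the current batch and keeping the file position
    between calls (hypothetical interface, not the real unpacker API)."""

    def __init__(self, path, words_per_batch=1_000_000):
        self._file = open(path, "rb")
        self._words_per_batch = words_per_batch

    def read_batch(self):
        """Return the next batch as host-order uint64 words;
        an empty array signals end of file."""
        data = self._file.read(8 * self._words_per_batch)
        # The raw data is assumed big-endian here; only this batch is
        # converted to host order, instead of byte-swapping the whole file.
        return np.frombuffer(data, dtype=">u8",
                             count=len(data) // 8).astype(np.uint64)

    def __iter__(self):
        while True:
            batch = self.read_batch()
            if batch.size == 0:
                return
            yield batch

    def close(self):
        self._file.close()
```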

What is the expected correct behavior?

Memory usage is no longer linear in the size of the input file; it is bounded by the configured batch size instead.
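For example, a caller iterating over batches would hold only one batch in memory at a time (again using the hypothetical interface sketched above; the file name is made up):

```python
unpacker = StreamingUnpacker("run.raw", words_per_batch=1_000_000)
total_words = 0
for batch in unpacker:          # peak memory ~ one batch (~8 MB here)
    total_words += batch.size
unpacker.close()
```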
