This is a list of packages, at different stages of maturity and support, that cover the areas below.
### Maintained "Team" Tools
- **pymask** (G. Sterbini and G. Iadarola): [GitHub](https://github.com/lhcopt/lhcmask/)
  - Main tool to set up LHC and HL-LHC simulations for dynamic aperture with beam-beam effects and field imperfections
  - Contains:
    - Analysis of deferred expressions (identification of dependencies)
    - MadPoint: obtain absolute coordinates by combining survey and twiss results
  - Note: some of these functionalities are also available in [pyoptics](#pyoptics)
- **pockpy** (D. Gamba): [GitHub](https://github.com/pylhc/pockpy)
  - Used to carry out the orbit corrector budget calculation for the HL-LHC triplet (a generic response-matrix sketch follows this list)
  - Note: similar to CORR in MAD-X, but it computes the matrices from twiss parameters and derives the expected min/max/rms orbit excursions and corrector strengths, using linear algebra and non-linear optimization.
- **xdeps** (R. De Maria): [GitHub](https://github.com/xsuite/xdeps)
  - Generic Python data dependency manager: updates dependent values in containers whenever the values they depend on change. It supports nested structures and multiple inputs/outputs, and works on top of any Python data structure (see the sketch after this list).
  - MAD-X expression parser based on LARK and compatible with xdeps expressions
  - Simple MAD-like environment obtained by dumping data from a cpymad instance
- **tfs-pandas** (F. Soubelet, J. Dilly, OMC Team - "officially supported"): [GitHub](https://github.com/pylhc/tfs) | [PyPI](https://pypi.org/project/tfs-pandas/) | [Conda-Forge](https://anaconda.org/conda-forge/tfs-pandas)
  - Pythonic, robust, tested I/O of *TFS* files into tailored `pandas` dataframes (`TfsDataFrames`); see the sketch after this list
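The response-matrix approach mentioned for pockpy can be illustrated with plain numpy. The sketch below is a generic example of the technique, not pockpy's API: build the closed-orbit response from twiss beta-functions and phase advances, then solve for corrector kicks in a least-squares sense. All numbers are toy values.

```python
# Generic illustration (plain numpy, not pockpy) of an orbit-correction study:
# closed-orbit response matrix from twiss parameters + least-squares solve.
import numpy as np

def response_matrix(beta_bpm, mu_bpm, beta_cor, mu_cor, tune):
    """R[i, j] = orbit shift at BPM i per unit kick at corrector j (mu in 2*pi units)."""
    dmu = np.abs(mu_bpm[:, None] - mu_cor[None, :]) * 2 * np.pi
    return (np.sqrt(beta_bpm[:, None] * beta_cor[None, :])
            * np.cos(dmu - np.pi * tune) / (2 * np.sin(np.pi * tune)))

rng = np.random.default_rng(0)
tune = 62.31
beta_bpm, mu_bpm = rng.uniform(30, 180, 8), np.sort(rng.uniform(0, tune, 8))
beta_cor, mu_cor = rng.uniform(30, 180, 4), np.sort(rng.uniform(0, tune, 4))
R = response_matrix(beta_bpm, mu_bpm, beta_cor, mu_cor, tune)

measured_orbit = rng.normal(0, 1e-3, 8)                      # metres, toy numbers
kicks, *_ = np.linalg.lstsq(R, -measured_orbit, rcond=None)  # corrector kicks (rad)
print(kicks)
```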
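For xdeps, a minimal sketch of the deferred-expression idea; the `Manager`/`ref` pattern is assumed from the examples in the xdeps repository and should be checked against its README.

```python
# Minimal sketch of deferred expressions with xdeps (Manager/ref usage assumed
# from the repository examples; verify against the current README).
import xdeps

data = {"a": 3.0, "b": 0.0}
manager = xdeps.Manager()
ref = manager.ref(data, "data")   # reference proxy on top of the plain dict

ref["b"] = ref["a"] * 2           # deferred expression: b now depends on a
ref["a"] = 5.0                    # changing a propagates to b automatically
print(data["b"])                  # expected: 10.0
```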
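For tfs-pandas, a minimal read/write sketch; `tfs.read` and `tfs.write` are the package's documented entry points, while the file, header and column names used here are placeholders for whatever the TWISS output actually contains.

```python
# Minimal sketch of TFS I/O with tfs-pandas; file, header and column names
# below are placeholders.
import tfs

df = tfs.read("twiss.tfs", index="NAME")       # returns a TfsDataFrame
print(df.headers.get("Q1"), df["BETX"].max())  # header dict + pandas columns
tfs.write("twiss_out.tfs", df, save_index="NAME")
```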
### Frequency Analysis Tools
- **PySUSSIX** (S. Joly): [GitHub](https://github.com/PyCOMPLETE/PySUSSIX)
  - Wrapper of SUSSIX.
- **NAFFlib** (S. Kostouglou, K. Paraschou): [GitHub](https://github.com/PyCOMPLETE/NAFFlib)
  - Tune determination from turn-by-turn data, implemented in C.
- **harpy original** (L. Malina):
  - Harmonic analysis using quadratic interpolation, like SUSSIX.
- **harpy new** (L. Malina):
  - Decomposes modes using SVD and zero padding.
- **harmonic_fit.py** (R. De Maria): [GitHub](https://github.com/rdemaria/pyoptics/blob/master/pyoptics/harmonic_fit.py) as part of pyoptics
  - Decomposition using numpy (zero padding, peak finder, least-squares fitting); supports coupled signals. A generic illustration of this approach follows this list.
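The zero-padding plus quadratic-interpolation approach mentioned for harpy and harmonic_fit.py can be shown with plain numpy. The sketch below is a generic illustration of the technique, not the API of any package listed above; the Hann window is an extra detail added to reduce spectral leakage.

```python
# Generic tune estimation from turn-by-turn data: zero-padded FFT plus
# quadratic (parabolic) interpolation around the spectral peak.
import numpy as np

def estimate_tune(x, pad_factor=8):
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    spec = np.abs(np.fft.rfft(x * np.hanning(n), n=pad_factor * n))
    k = np.argmax(spec[1:-1]) + 1                 # skip the DC and edge bins
    y0, y1, y2 = spec[k - 1], spec[k], spec[k + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # parabolic peak refinement
    return (k + delta) / (pad_factor * n)

turns = np.arange(2048)
signal = np.cos(2 * np.pi * 0.31025 * turns)
print(estimate_tune(signal))   # close to 0.31025
```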
### Data format tools
- **tfsdata** (R. De Maria): [GitHub](https://github.com/rdemaria/pyoptics/blob/master/pyoptics/tfsdata/tfsdata.py) as part of pyoptics
  - Convert TFS files to dictionaries and back
- **sdds** (J. Dilly, F. Soubelet, OMC Team): [GitHub](https://github.com/pylhc/sdds) | [PyPI](https://pypi.org/project/sdds/) | [Conda-Forge](https://anaconda.org/conda-forge/sdds)
  - Python interface for *SDDS* file I/O (see the sketch after this list)
- **sddsdata** (R. De Maria): [GitHub](https://github.com/rdemaria/pyoptics/blob/master/pyoptics/sddsdata.py) as part of pyoptics
  - Convert SDDS files to dictionaries in pyoptics
  - Forked in SPSMeasurement tools (not really used anymore)
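For the sdds package, a minimal read/write sketch following the pattern in the project README; the file names are placeholders.

```python
# Minimal sketch of SDDS I/O with the pylhc sdds package; file names are
# placeholders and the read/write pattern follows the project README.
import sdds

data = sdds.read("measurement.sdds")   # returns an SddsFile object
sdds.write(data, "copy.sdds")
```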
### Job submission tools
- **pylhc-submitter** (M. Hofer, J. Dilly, OMC Team): [GitHub](https://github.com/pylhc/submitter) | [PyPI](https://pypi.org/project/pylhc-submitter/) | [Conda-Forge](https://anaconda.org/conda-forge/pylhc_submitter) | [Demo](https://slides.com/fsoubelet/pylhc-submitter-presentation/fullscreen)
  - Utility HTCondor submitter for parametrized studies (any language / executable) <br> &rArr; simple to use, positive feedback from non-OMC users (see the sketch after this list)
  - Also includes a wrapper for SixDesk (AutoSix)
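A hedged sketch of a parameter scan with pylhc-submitter: the entry point and argument names below are assumptions based on the project's documentation style and should be verified against the repository; the mask file name and parameter values are invented for illustration.

```python
# Hedged sketch of an HTCondor parameter scan with pylhc-submitter.
# Entry point and argument names are assumptions to check against the docs;
# "job.madx.mask" and the parameter values are placeholders.
from pylhc_submitter.job_submitter import main as job_submitter

job_submitter(
    executable="madx",                                   # command run by each job
    mask="job.madx.mask",                                 # template with parameter placeholders
    replace_dict={"QX": [62.28, 62.31], "QY": [60.31]},   # one job per combination
    working_directory="./study",
    jobflavour="workday",                                 # HTCondor job flavour
)
```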
### LHC specific tools
- **PyLHC** (J. Dilly, M. Hofer, F. Soubelet, OMC Team): [GitHub](https://github.com/pylhc/PyLHC) | [PyPI](https://pypi.org/project/pylhc/)
- read/write PTC output
- SixTrack output
- **turn_by_turn** (F. Soubelet, J. Dilly, OMC Team): [GitHub](https://github.com/pylhc/turn_by_turn) | [PyPI](https://pypi.org/project/turn-by-turn/) | [Conda-Forge](https://anaconda.org/conda-forge/turn_by_turn)
  - Pythonic I/O functionality for turn-by-turn BPM measurement data from different particle accelerators (see the sketch after this list)
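For turn_by_turn, a minimal reading sketch; `read_tbt`, the `datatype` argument and the `TbtData` attributes are assumed from the package documentation, and the file name is a placeholder.

```python
# Minimal sketch of reading turn-by-turn BPM data with turn_by_turn; the file
# name is a placeholder and read_tbt/datatype/TbtData fields are assumed from the docs.
import turn_by_turn as tbt

data = tbt.read_tbt("Beam1@BunchTurn@measurement.sdds", datatype="lhc")
print(data.nturns, len(data.matrices))   # number of turns and per-bunch matrices
```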
### Machine agnostic tools
- **optics_functions** (J. Dilly, OMC Team): [GitHub](https://github.com/pylhc/optics_functions) | [PyPI](https://pypi.org/project/optics-functions/) | [Conda-Forge](https://anaconda.org/conda-forge/optics_functions)
  - Calculate various beam optics functions from TWISS outputs (`TfsDataFrames`); see the sketch after this list
  - Note: complementary to the pyoptics optics class
- **IRNL RDT Correction** (J. Dilly, OMC Team): [GitHub](https://github.com/pylhc/irnl_rdt_correction) | [PyPI](https://pypi.org/project/irnl_rdt_correction/)
  - Calculate local corrections in the IR based on minimizing RDTs
  - Like S. Fartoukh's Fortran script, but
  - HINT: currently being restructured (in the [restructuring branch](https://github.com/pylhc/irnl_rdt_correction/tree/restructuring), which will become version 1.0).
    The code restructuring is done, but an additional PDF note with background information is still being written.
- **gmtoolbox** (D. Gamba): [GitLab](https://gitlab.cern.ch/abpcomputing/sandbox/gmtoolbox)
  - Small toolbox *under SLOW development* to study the impact of misalignments/ground motion on the closed orbit
  - Fit of single sources of orbit kicks (e.g. for the 10 Hz oscillation)
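For optics_functions, a minimal coupling example; `coupling_via_cmatrix` is assumed from the package's documented modules, the TWISS file name is a placeholder, and the table must already contain the columns the function expects.

```python
# Minimal sketch of computing coupling RDTs from a TWISS table with
# optics_functions; coupling_via_cmatrix is assumed from the package docs
# and "twiss.tfs" is a placeholder file name.
import tfs
from optics_functions.coupling import coupling_via_cmatrix

df = tfs.read("twiss.tfs", index="NAME")
coupling = coupling_via_cmatrix(df)        # F1001/F1010 per element
print(coupling["F1001"].abs().max())
```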
Try to repackage functionalities in one single package with dependencies by:
- set up https://boa.readthedocs.io/en/latest/
- next steps: examples