Commit 591b381a authored by Alessio Cosenza

Merge branch 'dev' of ssh://gitlab.cern.ch:7999/LHCData/lhc-sm-api into SIGMON-325_versions

parents 31101b7d f8441d62
Pipeline #4129774 failed with stages in 14 minutes and 19 seconds
@@ -14,7 +14,7 @@ stages:
- build
- pages
- deploy
- deploy_commons
- deploy_analysis
- notebooks_exec
doc:
@@ -168,6 +168,7 @@ deploy_development_eos:
delete_venv:
stage: deploy
image: gitlab-registry.cern.ch/ci-tools/ci-worker:cc7
variables:
GIT_STRATEGY: none
# EOS_MGM_URL: eosproject.cern.ch
@@ -221,6 +222,23 @@ deploy_pro_lhcsmnb_eos:
before_script: [ ]
after_script: [ ]
deploy_lhcsmanalysis_eos:
stage: deploy_analysis
variables:
"EOS_PUBLIC_PATH": "/eos/project/l/lhcsm/public/"
"VENV_PATH": "/eos/project/l/lhcsm/venv_${CI_COMMIT_BRANCH}"
"SPARK3_REQUIREMENTS_FILE": swan-nxcals-spark3-requirements.txt
"SCRIPT_FILENAME": "${CI_COMMIT_BRANCH}.sh"
only:
- branches
image: gitlab-registry.cern.ch/ci-tools/ci-web-deployer
script:
- bash CI/commons_deployment_setup.sh
- bash CI/eos_setup.sh
- bash CI/deploy_analysis.sh
before_script: [ ]
after_script: [ ]
notebooks_exec:
only:
- dev
@@ -230,6 +248,7 @@ notebooks_exec:
variables:
API_BRANCH: $CI_COMMIT_BRANCH
NB_BRANCH: $CI_COMMIT_BRANCH
ANALYSIS_BRANCH: $CI_COMMIT_BRANCH
stage: notebooks_exec
trigger:
project: LHCData/lhc-sm-hwc
......
#!/bin/bash
# This script is triggered by CI to clone a chosen branch of lhc-sm-analysis and copy its contents to the venv on EOS.
# The project lhc-sm-analysis consists of several packages with SIGMON analyses.
# The contents of each package are copied to be accessible under the path /eos/project/l/lhcsm/venv_${CI_COMMIT_BRANCH}/lhcsmapi/api/
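# For example (illustrative package name), lhc-sm-analysis/lhc-sm-analysis-busbar/lhcsmapi/api/analysis/
# ends up under /eos/project/l/lhcsm/venv_${CI_COMMIT_BRANCH}/lhcsmapi/api/analysis/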
# default ANALYSIS_BRANCH to master if not set (':-' is the shell default-value expansion)
export ANALYSIS_BRANCH="${ANALYSIS_BRANCH:-master}"
# check if branch exists
git ls-remote --exit-code --heads https://:@gitlab.cern.ch:8443/LHCData/lhc-sm-analysis.git "${ANALYSIS_BRANCH}"
if [ "$?" == "2" ]; then
echo "Branch '${ANALYSIS_BRANCH}' not found in lhc-sm-analysis, using 'master' instead"
export ANALYSIS_BRANCH='master'
fi;
git clone --single-branch --branch $ANALYSIS_BRANCH https://:@gitlab.cern.ch:8443/LHCData/lhc-sm-analysis.git
analysis_dirs=($(find lhc-sm-analysis -maxdepth 1 -mindepth 1 -type d | grep -v 'CI' | grep -v 'git'))
EOS_PATH=/eos/project/l/lhcsm/venv_${CI_COMMIT_BRANCH}/lhcsmapi/api/
rsync="/usr/bin/rsync"
if [ ! -x $rsync ]
then
echo ERROR: $rsync not found
exit 1
fi
# SSH will be used to connect to LXPLUS and there check if the EOS folder exists
ssh="/usr/bin/ssh"
if [ ! -x $ssh ]
then
echo ERROR: $ssh not found
exit 1
fi
# Copy contents of each directory to EOS
for dir in "${analysis_dirs[@]}"; do
$rsync -abvuz -e "ssh -o StrictHostKeyChecking=no -o GSSAPIAuthentication=yes -o GSSAPITrustDNS=yes -o GSSAPIDelegateCredentials=yes" $dir/lhcsmapi/api/analysis $EOS_ACCOUNT_USERNAME@lxplus.cern.ch:$EOS_PATH/
if [ $? -ne 0 ]
then
echo ERROR: rsync to \"$EOS_PATH\" via lxplus.cern.ch failed
exit 1
fi
done
@@ -5,95 +5,26 @@ This is a package with an API for signal access and processing for the LHC Signa
The API documentation is available at <https://sigmon.docs.cern.ch/api>
The User Guide is available at <https://sigmon-docs.web.cern.ch/>
The User Guide is available at <https://sigmon.docs.cern.ch/>
## Installation
There are two ways of using the API in your code:
1. Loading preinstalled packages from an EOS project folder (in SWAN environment)
2. Manual installation (in any environment)
:warning: The project is currently under heavy refactoring and the documentation may be outdated. Please try the example notebooks and don't hesitate to contact us at <lhc-signal-monitoring@cern.ch> if you have any questions.
The first option guarantees the use of the most recent code version without manual installation. The second one is more time-consuming; however, it works in environments with no access to the EOS folder (e.g., the Apache Airflow scheduler). In addition, the second method allows you to install a selected version (`pip install package_name==version`).
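For example, to pin a specific release (the version number below is illustrative; check PyPI for the latest one):
```
$ pip install --user lhcsmapi==1.5.14
```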
### Manual Installation
To use the API, install it with a Python package installer:
## Installation
To install the `lhcsmapi` package with all the dependencies, you will need to set up a Python `venv` with access to the ACC-PY package repository.
```python
pip install --user lhcsmapi
python -m venv ./my_venv
source ./my_venv/bin/activate
python -m pip install git+https://gitlab.cern.ch/acc-co/devops/python/acc-py-pip-config.git
python -m pip install lhcsmapi
```
Check the latest version at <https://pypi.org/project/lhcsmapi/>.
The API relies on several external Python packages which have to be installed in a similar manner. The list of packages is stored in the `requirements.txt` file.
If you use SWAN, the service provides a set of pre-installed Python packages through CVMFS. The LHC-SM notebooks require the installation of several additional packages on top of CVMFS. In order to install a package, please open a SWAN Terminal by clicking the [>_] icon in the top right corner.
![SWAN CLI Button](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-cli-button.png)
Five additional Python packages have to be installed:
- tzlocal - for time zone conversion
- tqdm - for a progress bar to track queries
- influxdb - for communication with an InfluxDB database
- plotly - for interactive plotting of circuit schematics
- lhcsmapi - for the LHC-SM API
In order to install a package, please execute the following command:
```
$ pip install --user package_name
```
The expected output, after installing all packages, is presented in five figures below.
- SWAN Terminal output after successful installation of tzlocal package.
![SWAN pip install tzlocal](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-pip-install-tzlocal.png)
- SWAN Terminal output after successful installation of tqdm package.
![SWAN pip install tqdm](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-pip-install-tqdm.png)
- SWAN Terminal output after successful installation of influxdb package.
![SWAN pip install influxdb](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-pip-install-influxdb.png)
- SWAN Terminal output after successful installation of plotly package.
![SWAN pip install plotly](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-pip-install-plotly.png)
For the specific versions of our dependencies, you can consult the `test-requirements.txt`.
- SWAN Terminal output after successful installation of lhcsmapi package.
![SWAN pip install lhcsmapi](https://gitlab.cern.ch/LHCData/lhc-sm-hwc/-/raw/master/figures/swan-pip-install-lhcsmapi.png)
### Updating lhcsmapi Package
Please note that the first four packages (tzlocal, tqdm, influxdb, plotly) have to be installed only once, while the last one is still in the development phase and subject to frequent updates. Please send us an e-mail request at <lhc-signal-monitoring@cern.ch> if you want to subscribe to updates. In order to update the lhcsmapi package, please execute the following command.
```
$ pip install --user --upgrade lhcsmapi
```
### Known Issues
At times, in order to update the lhcsmapi package, one has to execute the command
```
pip install --user --upgrade lhcsmapi
```
twice while using the SWAN terminal (cf. the error message in the figure below).
![Double reinstallation error](https://gitlab.cern.ch/LHCData/lhc-sm-api/raw/master/figures/double_reinstallation_error.png)
In case this command returns an error, please try to execute it again. Should that operation also fail, please uninstall the package by executing
```
$ pip uninstall lhcsmapi
```
and performing a fresh installation of the package
```
$ pip install --user lhcsmapi
```
:pencil: The library is available on EOS in the form of preinstalled packages, so it can be loaded, e.g., on SWAN. Please consult the `lhc-sm-hwc` README for the instructions.
Should you experience any further issues with installing a package, please contact [SWAN support](https://swan.web.cern.ch) or use the preinstalled package with the environment script.
### NXCALS Access with SWAN
## NXCALS Access
The API allows querying signals from PM and NXCALS. The NXCALS database requires dedicated access rights to be assigned to each user.
If you want to query NXCALS with the API, please follow the procedure below on how to request NXCALS access.
If you want to query NXCALS with the API, please request access following the procedure described at <http://nxcals-docs.web.cern.ch/current/user-guide/data-access/nxcals-access-request/>. You will need the WinCCOA and CMW systems in the PRO environment.
1. Go to <http://nxcals-docs.web.cern.ch/current/user-guide/data-access/nxcals-access-request/> for the most up-to-date procedure
2. Send an e-mail to <acc-logging-support@cern.ch> with the following pieces of information:
- your NICE username
- system: WinCCOA, CMW
- NXCALS environment: PRO
Optionally, one can mention that the NXCALS database will be accessed through SWAN.
Once the access is granted, you can use NXCALS with SWAN.
\ No newline at end of file
Once the access is granted, you can query NXCALS. Please note that you need an active Kerberos ticket.
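As a minimal sketch, a query with the `lhcsmapi` query builder then looks like this (assuming a SWAN/NXCALS session that provides the `spark` object; the circuit, signal, and time window below are illustrative):
```python
from lhcsmapi.api.query_builder import QueryBuilder

# Query the power-converter current of an example RB circuit over a chosen window.
# Requires NXCALS access rights and an active Kerberos ticket.
i_meas_df = QueryBuilder().with_nxcals(spark) \
    .with_duration(t_start='2023-01-01 00:00:00', t_end='2023-01-01 01:00:00') \
    .with_circuit_type('RB') \
    .with_metadata(circuit_name='RB.A12', system='PC', signal='I_MEAS') \
    .signal_query() \
    .get_dataframes()
```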
@@ -8,7 +8,7 @@ import numpy as np
import warnings
from lhcsmapi.analysis.decorators import check_nan_timestamp_signals
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.query_builder import QueryBuilder
from lhcsmapi.Time import Time
@@ -84,7 +84,7 @@ class CircuitQuery(object):
*,
system: str,
signal_names: Union[str, List[str]],
spark) -> List[pd.DataFrame]:
spark) -> Union[List[pd.DataFrame], pd.DataFrame]:
""" Method querying a signal from NXCALS with a given t_start, t_end,duration as well as system and signal names.
Either t_end or duration has to be present. Signal is synchronized to synchronization time `t0`.
@@ -105,7 +105,8 @@ class CircuitQuery(object):
.with_metadata(circuit_name=self.circuit_name, system=system, signal=signal_names) \
.signal_query() \
.synchronize_time(t0) \
.convert_index_to_sec().dfs
.convert_index_to_sec() \
.get_dataframes()
@execution_count()
@check_nan_timestamp_signals(return_type=pd.DataFrame())
@@ -116,7 +117,7 @@ class CircuitQuery(object):
*,
system: str,
signal_names: Union[str, List[str]],
spark) -> List[pd.DataFrame]:
spark) -> Union[List[pd.DataFrame], pd.DataFrame]:
""" Method querying a signal from NXCALS with a given t_start, t_end,duration as well as system and signal names.
Either t_end or duration has to be present. Signal is synchronized to synchronization time `t0`.
@@ -134,4 +135,5 @@ class CircuitQuery(object):
.with_duration(t_start=t_start, t_end=t_end) \
.with_circuit_type(self.circuit_type) \
.with_metadata(circuit_name=self.circuit_name, system=system, signal=signal_names) \
.signal_query().dfs
.signal_query() \
.get_dataframes()
@@ -210,8 +210,12 @@ def get_magnets_visualisation(circuit_type: str, circuit_name: str, results_tabl
lambda row: MappingMetadata.get_crate_name_from_magnet_name(circuit_type, row['magnet']), axis=1)
# join with results_table
results_table['datetime_iqps'] = results_table['timestamp_iqps'].apply(lambda col: Time.to_string_short(col))
sub_results_table = results_table[['Position', 'I_Q_M', 'timestamp_iqps', 'datetime_iqps']]
if 'timestamp_iqps' in results_table.columns:
sub_results_table = results_table[['Position', 'I_Q_M', 'timestamp_iqps']]
sub_results_table['datetime_iqps'] = results_table['timestamp_iqps'].apply(lambda col: Time.to_string_short(col))
else:
sub_results_table = results_table[['Position', 'I_Q_M']]
sub_results_table['datetime_iqps'] = np.nan
pd.options.mode.chained_assignment = None
sub_results_table['q_order'] = sub_results_table.index
@@ -486,4 +490,4 @@ def draw_schematic(circuit_name: str,
fig.update_shapes(dict(xref='x', yref='y'))
fig.layout.update(showlegend=False)
fig.show()
fig.show()
\ No newline at end of file
@@ -5,7 +5,7 @@ import numpy as np
from lhcsmapi.Time import Time
from lhcsmapi.pyedsl.dbsignal.post_mortem.PmDbRequest import PmDbRequest
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.query_builder import QueryBuilder
def get_lhc_context(start_time_date, duration=[(1, 's'), (2, 's')]) -> pd.DataFrame:
......
from tokenize import group
from __future__ import annotations
import warnings
from copy import deepcopy
from enum import IntEnum
from dataclasses import dataclass
from typing import Dict, List, Tuple
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from IPython.display import display
from lhcsmapi.api.processing import SignalProcessing
from lhcsmapi.metadata import signal_metadata
from sklearn.linear_model import LinearRegression
from lhcsmapi.analysis.CircuitAnalysis import CircuitAnalysis
from lhcsmapi.analysis.busbar.LinReg import LinReg
from lhcsmapi.analysis.CircuitAnalysis import CircuitAnalysis
from lhcsmapi.api.processing import FeatureProcessing
from lhcsmapi.api.processing import SignalProcessing
from lhcsmapi.api.query_builder import QueryBuilder
from lhcsmapi.metadata import signal_metadata
from lhcsmapi.metadata.MappingMetadata import MappingMetadata
from lhcsmapi.pyedsl.AssertionBuilder import AssertionBuilder
from lhcsmapi.pyedsl.PlotBuilder import PlotBuilder
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.processing import FeatureProcessing
def generate_busbar_status(beam_mode_with_pattern_series: pd.DataFrame, pattern: Dict[str, float]) -> pd.DataFrame:
@@ -96,56 +96,53 @@ def get_complete_busbar_df(t_start: int, t_end: int, pattern: Dict[str, float],
.signal_query() \
.filter_values(list(pattern.keys())) \
.map_values(pattern) \
.dfs[0]
.get_dataframes()
# generate busbar status
return generate_busbar_status(beam_mode_with_pattern_df["HX:BMODE"], pattern)
def extract_intersecting_plateaus(plateau_starts: List[List[int]],
plateau_ends: List[List[int]]) -> Tuple[List[int], List[int]]:
""" Function extracting intersecting plateaus. It considers lists of list of plateau start and end.
It returns a tuple with a list of start and end times of intersecting plateaus.
def extract_intersecting_plateaus(starts: List[List[int]], ends: List[List[int]]) -> Tuple[List[int], List[int]]:
"""Finds intersections between plateaus for each signal
:param plateau_starts: list of list of plateau starts (outer list per current; inner per plateau)
:param plateau_ends: list of list of plateau ends (outer list per current; inner per plateau)
:return: a tuple with a list of start and end times of intersecting plateaus
"""
class Plateau(object):
def __init__(self, plateau_starts, plateau_ends):
self.plateau_starts = plateau_starts
self.plateau_ends = plateau_ends
def count_overlap(self, t_start, t_end):
count = 0
for plateau_start, plateau_end in zip(self.plateau_starts, self.plateau_ends):
for plateau_start_el, plateau_end_el in zip(plateau_start, plateau_end):
if min(t_end, plateau_end_el) > max(t_start, plateau_start_el):
count += 1
return count
p = Plateau(plateau_starts, plateau_ends)
plateau_starts_shared = []
plateau_ends_shared = []
# Find overlapping intervals
# For each of plateau list find overlapping intervals
# take the intersection of the overlapping intervals (max t_start and min t_end)
# take only an interval that overlaps with all intervals, the overlap count should be equal to the number of lists
for plateau_start, plateau_end in zip(plateau_starts, plateau_ends):
plateau_start_shared = []
plateau_end_shared = []
for plateau_start_el, plateau_end_el in zip(plateau_start, plateau_end):
if p.count_overlap(plateau_start_el, plateau_end_el) == len(plateau_starts):
plateau_start_shared.append(plateau_start_el)
plateau_end_shared.append(plateau_end_el)
plateau_starts_shared.append(plateau_start_shared)
plateau_ends_shared.append(plateau_end_shared)
Args:
starts: list of lists of start points for plateaus - one list for each signal
ends: list of lists of end points for plateaus - one list for each signal
Returns:
a tuple consisting of a list of start and a list of end points for the intersecting plateaus
"""
@dataclass
class Plateau:
start: int
end: int
def intersect(self, other_plateaus: List[Plateau]) -> List[Plateau]:
return [intersection for other in other_plateaus if (intersection := self._get_intersection(other))]
# Take only the latest plateau start and the earliest plateau end
return list(np.array(plateau_starts_shared).max(axis=0)), list(np.array(plateau_ends_shared).min(axis=0))
def _get_intersection(self, other: Plateau) -> Plateau | None:
intersection = Plateau(max(self.start, other.start), min(self.end, other.end))
if intersection.start >= intersection.end:
return None
return intersection
if len(starts) == 1:
return starts[0], ends[0]
plateaus = [[Plateau(s, e) for s, e in zip(start, end)] for start, end in zip(starts, ends)]
first_plateaus = plateaus.pop(0)
first_two_intersections = []
for plateau in first_plateaus:
intersections = plateau.intersect(plateaus[0])
first_two_intersections.extend(intersections)
starts = [[i.start for i in first_two_intersections]] + starts[2:]
ends = [[i.end for i in first_two_intersections]] + ends[2:]
return extract_intersecting_plateaus(starts, ends)
def find_start_end_of_the_longest_ramp_up(plateau_start: List[int], plateau_end: List[int]) -> Tuple[int, int]:
@@ -267,8 +264,9 @@ class BusbarResistanceAnalysis(CircuitAnalysis):
i_meas_df = i_meas_df[i_meas_df[I_MEAS] >= i_meas_threshold]
# calculate time derivative
i_meas_df['dI_MEAS_dt'] = SignalProcessing(i_meas_df.loc[:, [I_MEAS]]).calculate_time_derivative(
'diff', False).get_dataframes()
i_meas_df['dI_MEAS_dt'] = SignalProcessing(i_meas_df.loc[:, [I_MEAS]]) \
.calculate_time_derivative('diff', False) \
.get_dataframes()
i_meas_df = i_meas_df.iloc[1:, :] # the derivative is undefined for the first point
# round to the nearest integer
@@ -305,14 +303,14 @@ class BusbarResistanceAnalysis(CircuitAnalysis):
plateau_start = list(i_meas_df.loc[i_meas_df.index[plateau_start_pattern_idx[1::2]], 'acqStamp'].values)
plateau_end = list(i_meas_df.loc[i_meas_df.index[plateau_end_pattern_idx[1::2]], 'acqStamp'].values)
if len(plateau_start) < len(plateau_end):
plateau_start.insert(0, i_meas_df['acqStamp'].values[0])
if plateau_end[0] < plateau_start[0]:
plateau_start = [i_meas_df['acqStamp'].values[0]] + plateau_start
if len(plateau_start) > len(plateau_end):
plateau_end.append(i_meas_raw_df.index[-1])
if len(plateau_start) < len(plateau_end):
plateau_start = i_meas_raw_df.index[0].values + plateau_start
plateau_end.append(i_meas_df['acqStamp'].values[-1])
plateau_start_duration = []
plateau_end_duration = []
......
@@ -17,8 +17,9 @@ from lhcsmapi.analysis.CircuitQuery import execution_count
from lhcsmapi.analysis.expert_input import check_show_next
from lhcsmapi.api import processing
from lhcsmapi.api import resolver
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.query_builder import QueryBuilder
from lhcsmapi.Time import Time
from lhcsmapi import utils
class BusbarResistanceQuery(CircuitQuery):
@@ -88,15 +89,15 @@ class BusbarResistanceQuery(CircuitQuery):
:return: tuple of start and end plateau timestamps (each as a list)
"""
i_meas_raw_dfs = QueryBuilder().with_nxcals(spark) \
i_meas_raw_df = QueryBuilder().with_nxcals(spark) \
.with_duration(t_start=t_start, t_end=int(t_end)) \
.with_circuit_type(self.circuit_type) \
.with_metadata(circuit_name=self.circuit_name, system='PC', signal='I_MEAS') \
.signal_query() \
.dfs
.get_dataframes()
return BusbarResistanceAnalysis \
.calculate_current_plateau_start_end(i_meas_raw_dfs, i_meas_threshold=i_meas_threshold,
.calculate_current_plateau_start_end(utils.vectorize(i_meas_raw_df), i_meas_threshold=i_meas_threshold,
min_duration_in_sec=min_duration_in_sec,
time_shift_in_sec=time_shift_in_sec)
@@ -198,7 +199,8 @@ class BusbarResistanceQuery(CircuitQuery):
.with_query_parameters(nxcals_system='CMW', signal=row['nxcals_variable_name']) \
.signal_query() \
.synchronize_time() \
.convert_index_to_sec().dfs[0]
.convert_index_to_sec() \
.get_dataframes()
if not u_df.empty:
ax = u_df.plot(figsize=(15, 7), grid=True)
......
@@ -110,7 +110,7 @@ def compare_difference_of_features_to_reference(features_df: pd.DataFrame,
def compare(row):
if np.isnan(row['diff']):
return np.nan
return abs(row['act'] - row['ref']) <= row['diff']
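# rounding to 6 decimal places keeps floating-point noise from flipping the comparison (presumed intent of the change)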
return round(abs(row['act'] - row['ref']), 6) <= row['diff']
comparison_df = comparison_df.T
comparison_df['result'] = comparison_df.apply(compare, axis=1)
......
@@ -91,7 +91,9 @@ def check_nan_timestamp_signals(return_type=pd.DataFrame,
if cond_arg or cond_kwarg:
warnings.warn('In function %s: %s' % (str(func).split(' ')[1], warning))
return [return_type] * len(kwargs[signal_name])
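# a single signal name yields a bare return_type; multiple names yield a list of them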
signals_count = 1 if isinstance(kwargs[signal_name], str) else len(kwargs[signal_name])
return return_type if signals_count == 1 else [return_type] * signals_count
return func(*args, **kwargs)
return wrapper
@@ -109,10 +111,12 @@ def _get_indices_for_arg_names(function, arg_names) -> Union[range, List[int]]:
def check_arguments_not_none():
def inner_function(func):
arg_names = inspect.getfullargspec(func)[0]
@wraps(func)
def wrapper(*args, **kwargs):
if len(args) < len(arg_names):
raise ValueError(f'Not enough values, function expected {len(arg_names)}, but only {len(args)} were provided.')
raise ValueError(
f'Not enough values, function expected {len(arg_names)}, but only {len(args)} were provided.')
for i in range(len(arg_names)):
if not args[i]:
raise ValueError(arg_names[i] + ' cannot be None')
......
@@ -4,7 +4,7 @@ import pandas as pd
from lhcsmapi.analysis.CircuitQuery import CircuitQuery, execution_count
from lhcsmapi.analysis.decorators import check_nan_timestamp_signals, check_nan_timestamp
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.query_builder import QueryBuilder
class DfbQuery(CircuitQuery):
@@ -29,7 +29,8 @@ class DfbQuery(CircuitQuery):
.with_duration(t_start=t_start, duration=duration) \
.with_circuit_type(self.circuit_type) \
.with_metadata(circuit_name=self.circuit_name, system=system) \
.event_query(verbose=self.verbose).df
.event_query(verbose=self.verbose) \
.get_dataframe()
@execution_count()
@check_nan_timestamp_signals(return_type=pd.DataFrame())
@@ -88,7 +89,8 @@ class DfbQuery(CircuitQuery):
.with_metadata(circuit_name=self.circuit_name, system=system, signal=signal_names) \
.signal_query(verbose=self.verbose) \
.synchronize_time(timestamp_fgc) \
.convert_index_to_sec().dfs
.convert_index_to_sec() \
.get_dataframes()
def _query_leads_pm(self, signal_names: List[str], source_timestamp_leads_df: pd.DataFrame, system: str,
timestamp_fgc: int) -> List[pd.DataFrame]:
@@ -114,7 +116,8 @@ class DfbQuery(CircuitQuery):
signal=signal_names, source=source_leads) \
.signal_query(verbose=self.verbose) \
.synchronize_time(timestamp_fgc) \
.convert_index_to_sec().dfs
.convert_index_to_sec() \
.get_dataframes()
def query_dfb_signal_nxcals(self, t_start, t_end, *, system, signal_names, spark) -> List[pd.DataFrame]:
""" Method querying DFB signals with NXCALS
@@ -134,4 +137,4 @@ class DfbQuery(CircuitQuery):
.synchronize_time() \
.convert_index_to_sec() \
.filter_median() \
.dfs
.get_dataframes()
@@ -7,12 +7,11 @@ from tqdm.notebook import tqdm
from lhcsmapi.Time import Time
from lhcsmapi.analysis.CircuitQuery import CircuitQuery, execution_count
from lhcsmapi.metadata.MappingMetadata import MappingMetadata
from lhcsmapi.pyedsl.QueryBuilder import QueryBuilder
from lhcsmapi.api.query_builder import QueryBuilder
class DiodeLeadResistanceQuery(CircuitQuery):
"""Class containing methods for query of events and signals for diode lead resistance calculation."""
def __init__(self, circuit_type, circuit_name, max_executions=None, verbose=True):
super().__init__(circuit_type, circuit_name, max_executions, verbose)
@@ -44,7 +43,8 @@ class DiodeLeadResistanceQuery(CircuitQuery):
.event_query(verbose=self.verbose) \
.filter_source(self.circuit_type, self.circuit_name, self._diode_system) \
.sort_values(by='timestamp') \
.drop_duplicate_source().df
.drop_duplicate_source() \
.get_dataframe()
if warn_on_missing_pm_buffers:
self._warn_if_pm_buffers_are_missing(result, t_start, duration)
@@ -121,9 +121,7 @@ class DiodeLeadResistanceQuery(CircuitQuery):
return i_meas_u_diode_nxcals_dfs
def _warn_if_pm_buffers_are_missing(self,
source_timestamp_nqps_df: pd.DataFrame,
t_start: Union[int, str, float],
def _warn_if_pm_buffers_are_missing(self, source_timestamp_nqps_df: pd.DataFrame, t_start: Union[int, str, float],
duration: List):
""" Displays a warning if any of nQPS PM buffers are missing in the given pd.DataFrame
@@ -148,14 +146,16 @@ class DiodeLeadResistanceQuery(CircuitQuery):
.with_duration(t_start=timestamp_qds, duration=[(600, 's'), (600, 's')]) \
.with_circuit_type(self.circuit_type) \