Commit a98e7aa1 authored by Thibaud Marie Eric Buffet

SIGMON-120: improve README.md with EOS

parent 4f18c881
...@@ -34,7 +34,21 @@ Please follow a procedure below to request the NXCALS access.
Optionally, you can mention that the NXCALS database will be accessed through SWAN.
Once the access is granted, you can use NXCALS with SWAN.
### 2. EOS lhcsm Access
EOS lhcsm access is granted via e-groups.
The following e-groups allow for storage of reports:

- cernbox-project-mp3-writers
- cernbox-project-mp3-readers

The following e-groups allow modifying the package (environment), scripts, and notebooks:

- cernbox-project-lhcsm-writers
- cernbox-project-lhcsm-readers

**Note: it can take some time for e-group membership changes to propagate.**
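A quick way to verify EOS permissions from a SWAN terminal or notebook is to probe the project folders directly. The sketch below is illustrative and assumes the CERNBox project areas are mounted under the usual `/eos/project/...` paths; the exact locations of the mp3 and lhcsm areas may differ.

```python
import os

# Hypothetical CERNBox project paths; adjust to the actual mp3/lhcsm areas.
PROJECT_PATHS = ["/eos/project/m/mp3", "/eos/project/l/lhcsm"]

for path in PROJECT_PATHS:
    readable = os.access(path, os.R_OK)
    writable = os.access(path, os.W_OK)
    print(f"{path}: readable={readable}, writable={writable}")
```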
### 3. Logging in to SWAN
The following steps should be followed in order to log in to SWAN:
1. Go to http://swan.cern.ch
...@@ -43,7 +57,7 @@ The following steps should be followed in order to log-in to SWAN
<img src="https://gitlab.cern.ch/LHCData/lhc-sm-hwc/raw/master/figures/swan-inactive-cernbox-error.png" width=50%>
### 4. Setting an Environment Script
In order to execute the HWC notebooks, one requires the `lhc-sm-api` package and the HWC notebooks. To this end, we created a dedicated environment script to prepare the SWAN project space.
The script sets a path to a virtual environment with the necessary packages (for more details, cf. https://lhc-sm-api.web.cern.ch/lhc-sm-api/user_install.html#preinstalled-packages) and makes a copy of the HWC notebooks.
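A short check in the first notebook cell can confirm that the environment script did its job, i.e., that the pre-installed packages are importable and that the copied notebook folder exists. This is a minimal sketch; the import name `lhcsmapi` and the `~/SWAN_projects/hwc` location are assumptions and depend on the chosen script.

```python
import importlib.util
import os

# Is the lhc-sm-api package importable? (assumed import name: lhcsmapi)
print("lhcsmapi available:", importlib.util.find_spec("lhcsmapi") is not None)

# Was the notebook folder copied? (folder name depends on the chosen environment script)
print("hwc folder present:", os.path.isdir(os.path.expanduser("~/SWAN_projects/hwc")))
```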
...@@ -53,7 +67,7 @@ The SWAN Notebooks can be loaded in 3 different environments:
- DEV
- DEV CONTRIBUTOR
#### 4.1 PRO
This is the PRO and stable version of the notebooks; users and experts should use this script.
At every log-in to SWAN, please provide the following environment script:
...@@ -74,7 +88,7 @@ Note the following settings while configuring environment:
This script creates a new `hwc` folder in SWAN.
#### 4.2 DEV
This is the DEV version of the notebooks; this script is used for new feature validation and verification.
At every log-in to SWAN, please provide the following environment script:
...@@ -85,21 +99,21 @@ This script creates a new `hwc_dev` folder in SWAN.
**Note: in order to ensure compatibility between package and notebook versions, the `hwc_dev` folder is deleted each time the script is executed.**
**Note: notebook changes are not persisted when using this environment script.**
#### 4.3 DEV CONTRIBUTOR
This is a DEV version of the notebooks; this script is used to implement new features and new notebooks.
See [CONTRIBUTING.md](CONTRIBUTING.md).
This script uses the `hwc_working` folder in SWAN. If the folder already exists, then it just loads the DEV environment.
### 5. Running a Notebook
#### 5.1. Open notebook
To do so, simply open the folder and then select a circuit. Afterwards, click the name of a notebook to open it in a new page. The top of the notebook is presented in the figure below.
<img src="https://gitlab.cern.ch/LHCData/lhc-sm-hwc/raw/master/figures/swan-rb-fpa-analysis-intro.png" width=50%>
#### 5.2. Connect to the NXCALS Spark Cluster
Once a notebook is opened, please click the star button, as shown in the figure below, in order to open the Spark cluster configuration in a panel on the right side of the active notebook.
<img src="https://gitlab.cern.ch/LHCData/lhc-sm-hwc/raw/master/figures/swan-open-spark-cluster-configuration.png" width=50%>
...@@ -113,7 +127,7 @@ The last step is a confirmation of a successful connection to the cluster.
<img src="https://gitlab.cern.ch/LHCData/lhc-sm-hwc/raw/master/figures/swan-spark-cluster-connection.png" width=75%>
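After a successful connection, SWAN exposes a ready `SparkSession` in the notebook under the name `spark`; the analysis cells then use it, through the `lhc-sm-api` package, to query NXCALS. A trivial check that the cluster responds:

```python
# `spark` is injected by the SWAN Spark connector once the connection is established.
print("Spark version:", spark.version)

# A small distributed job confirming that the cluster executors respond.
print("Row count:", spark.range(1000).count())
```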
#### 5.3. Analysis Notebook Execution
A notebook is composed of cells. A cell contains either markdown text with a description or Python code to execute. Cells with markdown text have a white background and can contain text, tables, figures, and hyperlinks. Cells with code have a gray background and are executed by clicking the run icon in the top bar highlighted in the figure below. Alternatively, one can put the cursor in a code cell and press the keyboard shortcut Ctrl+Enter.
<img src="https://gitlab.cern.ch/LHCData/lhc-sm-hwc/raw/master/figures/swan-execute-cell.png" width=50%>
...@@ -123,12 +137,12 @@ A state of a cell is indicated by square brackets located on the left to a cell.
1. Select from the top menu `Kernel` -> `Interrupt` and execute the problematic cell again (either with the run button (cf. figure above) or Ctrl+Enter).
2. In case the first option does not help, select from the top menu `Kernel` -> `Restart & Clear Output`. Then all cells prior to the problematic one have to be executed again (multiple cells can be selected by clicking to the left of a cell and then selecting the others while holding Shift). After this operation one needs to reconnect to the NXCALS Spark cluster.
#### 5.4. Analysis Assumptions
1. We consider standard analysis scenarios, i.e., all signals can be queried. Depending on which signal is missing, an analysis can raise a warning and continue, or raise an error and abort the analysis (see the sketch below).
2. It is recommended to execute each cell one after another. However, since the signals are queried prior to an analysis, any order of execution is allowed. In case an analysis cell is aborted, the following ones may not be executed (e.g., I_MEAS not present).
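The warning-versus-error policy from point 1 can be illustrated with a small sketch. This is not the `lhc-sm-api` implementation, only an illustration of the behaviour described above, assuming queried signals arrive as pandas DataFrames.

```python
import warnings

import pandas as pd


def check_signal(name: str, df: pd.DataFrame, mandatory: bool = False) -> bool:
    """Warn and continue for a missing optional signal; abort for a mandatory one (e.g. I_MEAS)."""
    if df is None or df.empty:
        if mandatory:
            raise ValueError(f"Mandatory signal {name} is missing - aborting the analysis.")
        warnings.warn(f"Signal {name} is missing - the related analysis steps are skipped.")
        return False
    return True
```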
#### 5.5. FPA Notebooks
##### Analysis Workflow
...@@ -200,7 +214,7 @@ e.g.,
- CSV file with the MP3 results table containing a subset of the analysis results - [circuit-name]_FPA-[fgc-timestamp]-[analysis-execution-date].csv (see the naming sketch below);
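The naming convention quoted above can be reproduced as follows; the circuit name, FGC timestamp, and date format are illustrative and may not match the actual notebook output exactly.

```python
from datetime import datetime

circuit_name = "RB.A78"              # example circuit name
fgc_timestamp = 1544622149598000000  # example FGC acquisition timestamp (ns)
execution_date = datetime.now().strftime("%Y-%m-%d-%Hh%M")

csv_name = f"{circuit_name}_FPA-{fgc_timestamp}-{execution_date}.csv"
print(csv_name)  # e.g. RB.A78_FPA-1544622149598000000-2024-05-13-10h42.csv
```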
#### 5.6. HWC Notebooks
##### Analysis Workflow
An HWC analysis workflow consists of four steps: (i) finding the start and end time of an HWC test; (ii) executing analysis cells on the cluster; (iii); (iv) storing output files on EOS; see the figure below.
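The last step, storing output files on EOS, amounts to copying the generated report and CSV files into the shared project area. A minimal sketch, with a hypothetical target folder; the actual MP3/lhcsm report location should be used in practice.

```python
import os
import shutil

# Hypothetical EOS target folder; replace with the actual shared report area.
EOS_TARGET = "/eos/project/m/mp3/reports"


def store_on_eos(local_file: str, target_dir: str = EOS_TARGET) -> str:
    """Copy an analysis output file (report or CSV) to the shared EOS area."""
    os.makedirs(target_dir, exist_ok=True)
    destination = os.path.join(target_dir, os.path.basename(local_file))
    shutil.copy(local_file, destination)
    return destination
```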