
tl;dr

Use the Scan Operator together with Yarr's devel branch and labRemote's master. Edit the module config in configs/so_modules.json, and the scan sequence in configs/so_yarr_rd53a/b.json.

Introduction

The Scan Operator (SO) is an integration tool for scan data and DCS data, meant to be used during the QC procedure of RD53A/RD53B modules. It uses multiple localDB scripts as well as some integrated tools to provide the following features:

  • Calls Yarr repeatedly to run a sequence of scans on the chip.
  • Makes it very easy to edit chip-by-chip config parameters (trims, chip name, rx and tx, ...)
  • Registers a module in LocalDB if it's not already there
  • Monitors the Power Supply (PS) powering the ASIC and stores the DCS data in InfluxDB
  • Synchronizes the DCS data and the scan data in LocalDB.

All the DCS-related functionality is optional. The SO provides scripts based on labRemote (LR) to monitor DCS in real time, but any other external tool can also be used as long as it is capable of uploading the DCS data to InfluxDB.

Other independent tools included in the SO

This repository includes the following independent tools:

  • libDCS/iviscan.py: safely performs sensor IV and module VI scans
  • libDCS/psOperator.py: Controls and monitors a power supply. Optionally uploads PS data to influxDB until killed
  • libDCS/parameterOptimizer.py: Performs trim scans, tap scans, etc. on a selected chip to find its best configuration.

More on these tools in the "Useful Independent tools" section below.

Quick Tutorial

Requirements

To run the Scan Operator, you need at least:

  • Yarr
  • the jq command installed. On CentOS: sudo yum install jq.
  • the unbuffer command installed. On CentOS: sudo yum install expect.

1. Get the code

If you want to use any PS-related functionality, start by cloning and building labRemote. Otherwise, you can skip this step.

Make sure to enable the Python bindings using cmake's -DUSE_PYTHON=on flag. For a detailed installation guide, follow labRemote's readme.

git clone https://gitlab.cern.ch/berkeleylab/labRemote.git
cd labRemote
git submodule update --init --recursive

mkdir build && cd build
sudo yum install python3-devel
cmake3 -DUSE_PYTHON=on .. && make -j6

2. Get the Scan Operator

Also clone the Scan Operator somewhere:

git clone --recursive https://gitlab.cern.ch/YARR/utilities/scan-operator.git

3. Run the Scan Operator

The Scan Operator uses many configuration files, all of them located by default under ./configs. Most of them are intuitive and the SO will tell you when you need to modify them.

Start by running ./ScanOperator.sh to see a complete list of options.

List of arguments

(a more detailed explanation of the output of ./ScanOperator.sh)

Required

  • -m: Serial Number of the module. Choose from the ones you have defined in so_modules.json, under ."modules".

Optional

  • -c: Create / overwrite the Yarr config files in localDB (controller, connectivity and chip cfg files). Use this option if you are using a module for the first time or want to regenerate its config from scratch. In summary, "-c" does the following:

    • Retrieves the files from localDB if the module is already registered and the connection is good. Creates them offline otherwise.
    • Edits them based on so_modules.json
  • -R: Reset the Yarr config files (controller and chip cfg files). Use this option if you want to regenerate a module's config from scratch. In summary, "-R" does the following:

    • Creates the chip configs from scratch, ignoring whether they already exist in localDB.
    • Edits them based on so_modules.json.
  • The connectivity file isn't regenerated from scratch, so to go back to the original connectivity you have to use both "-l" and "-c".

The cfg files are created in the Scan Operator's folder, so everything you have under Yarr's folder remains untouched.

  • -s: Doesn't look for changes in the config. Can be useful when running the SO without having modified so_modules.json. Saves some time.

  • -j: Just updates the module configs according to so_modules.json, and exits before interacting with the module or the PS at all.

  • -A: Just sends the module configs to the module according to so_modules.json; this is equivalent to running scanConsole without a scan config.

  • -l: The name of the scan list to use. You can choose the scan flow to execute just by modifying this option. "-l" does the following:

    • Reads the scan list you want to execute from "so_yarr_rd53a/b.json". Whether the rd53a or rd53b file is used depends on the front-end type of the module in "so_modules.json".
    • Executes the scans in that list.
  • -W: Equivalent to the -W option used when calling Yarr's scanConsole.

  • -Q: Equivalent to the -Q option used when calling Yarr's scanConsole.

  • -q: Quiet mode: Reduces the terminal output

DCS related

  • -o: Turns on the output of the channels_to_be_used_by_the_scanoperator specified in lr_powersupply.json before running the sequence of scans. Turns the output off again after running all the scans. Not really recommended, especially when working at low temperatures.

  • -d: Monitor the PS and store real-time DCS data in InfluxDB.

  • -t: After each scan, retrieves the DCS data from InfluxDB and uploads it to LocalDB, synchronized with each scan. Remember to tell the databases how to talk to each other in the InfluxDB-LocalDB connectivity file. If you want instead to transfer DCS data from InfluxDB to localDB once all the scans are done, see the "Monitoring DCS with an external tool" section below.

  • -e: Checks that T and H are within a safe interval (specified in idb_env.json) before running each scan. Requires something uploading these two variables to InfluxDB in real time (e.g. a "Canary Board").

Example calls

Here are some example use cases of the Scan Operator.

Hello world

./ScanOperator.sh -m module_id

You should start from here before going any further. This will just run, in series, the scans you specified in scan_list (in so_yarr_rd53a/b.json). All the DCS / LocalDB / InfluxDB functionality is disabled.

QC procedure

If you are using the Scan Operator during the QC procedure of any of your modules, you should call it with all the QC / LocalDB features enabled (-Q and -W, respectively).

./ScanOperator.sh -m module_id -c -Q -W -l [the name of the scan list you want to execute]

To manually sync DCS in localDB after running this, refer to the following section.

Monitoring and syncing DCS data

If you want to use the SO to monitor DCS (-d) and synchronize the DCS data and the scan data in localDB after each scan (-t), use

./ScanOperator.sh -m module_id -d configs/m_powersupply.json -W -t

Monitoring DCS with an external tool

Say you are using your own tool (e.g. libDCS/psOperator.py) to constantly upload DCS data to InfluxDB, while using the SO to run all the scans.

Then you can omit -d

./ScanOperator.sh -m module_id configs/m_powersupply.json -W -t

If the real-time sync doesn't work, use just -W and you will see this message once the SO finishes each scan:

[ info ][so] If you want to upload to localDB the DCS data associated
             to all the scans you have just run (assuming you have it
             stored in InfluxDB), you can do it in one command:

                 bash /home/mario/work/scan-operator/data/201214_051/moveDCStoLocalDB.sh path/to/idb_to_ldb.json

Refer to the following section if you need any help filling the config files.

The SO configuration files

You most likely won't need to take care of every single one of the following. Fill only the ones you need depending on the options you use.

If you want to move/rename any of the following, just update its path on configs/index.json

Module configuration file (so_modules.json)

This file contains the properties of all of your modules (SCCs, quads, ...) you want to scan with the SO.

Most of the items are self-explanatory. Here are the ones that may need some guidance to fill in.

  • .modules: Write here the configs for as many modules as you want. The file comes with an example for a SCC and an example for a quad. You will choose between the modules you have defined here in the command line. Notes:
    • The name of each module (e.g. moduleid_quad) should (but doesn't have to) match the Serial Number of the module.
    • For each chip:
      • For any key written in "GlobalConfig", the Scan Operator will:
        • Look for it in Yarr's chip config file (in configs/rd53x) under .RD53A.GlobalConfig.
        • If found, change the value to the one specified by the user.
        • If not found, create it and continue.
      • The same goes for the "connectivity" and the "Parameter" sections.
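
To make the above concrete, here is a minimal, illustrative sketch of a .modules entry. Only "GlobalConfig", "connectivity" and "Parameter" behave exactly as described above; the surrounding nesting ("chips"), the module name and the example values are assumptions, so take the SCC and quad entries shipped with the file as the real reference.

{
  "modules": {
    "moduleid_scc": {
      "chips": [
        {
          "connectivity": { "tx": 0, "rx": 0 },
          "Parameter": { "Name": "chip_1" },
          "GlobalConfig": { "DiffVth1": 350 }
        }
      ]
    }
  }
}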

Yarr's general configuration file (so_yarr_rd53a/b.json)

  • There are two files (for rd53a and rd53b) in configs. Which file is used depends on the front-end type of the module in so_modules.json.
  • .scan_list: Place here the default sequence of scans that you want to run. Without the "-l" option, the scans listed here will run.
  • "common_config": This key holds two settings:
    • "max_trial_in_case_of_failure": Sometimes you might get a segfault from Yarr caused by corrupted data. Normally just resuming the scan sequence from where you left off does the job; this option does it for you.
    • "default_option": Arguments passed to scanConsole regardless of the kind of scan.
  • ["std_digitalscan", "-m 1"]: A scan may need to create a mask, tune to the target threshold/ToT or inject a specific charge. These arguments will be passed to scanConsole.

Powersupply configuration file (lr_powersupply.json)

Contains the information to establish communication with the PS as well as the definition of the channels. Amongst all the channels you have defined, select the ones you want to use in "channels_to_be_used_by_the_scanoperator". The SO will ignore any channel not appearing there.

  • .devices: Write here the PS(s) you want to use. A list of supported "hw-model" and "protocol" values is available in labRemote's docs. To find the port your PS is connected to, look at the output of ls -l /dev/serial/by-id/.

  • .channels: You can have here as many channels as you want, but only the ones appearing in channels_to_be_used_by_the_scanoperator will be used. Under .channels, "device" should be one of the devices defined in the section above, and "channel" is the actual output channel number of the PS (e.g. from 1 to 4 for the R&S HMP4040).
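
As an illustration, a minimal lr_powersupply.json could look like the sketch below. Only "hw-model", "protocol", "device", "channel" and "channels_to_be_used_by_the_scanoperator" are documented above; the remaining field names and values ("name", "communication", "port", the hw-model string) are assumptions, so double-check them against labRemote's docs and the template shipped in configs.

{
  "devices": [
    {
      "name": "PS1",
      "hw-model": "HMP4040",
      "protocol": "<see labRemote's docs>",
      "communication": { "port": "/dev/serial/by-id/<your-PS>" }
    }
  ],
  "channels": [
    { "name": "LV", "device": "PS1", "channel": 1 },
    { "name": "HV", "device": "PS1", "channel": 2 }
  ],
  "channels_to_be_used_by_the_scanoperator": ["LV", "HV"]
}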

InfluxDB (DCS) configuration file (idb_dcs.json)

  • .influxdb_cfg.username: Don't worry about this if you don't have authentication enabled.
  • .influxdb_cfg.measurement: The measurement (data table) where to upload data to in InfluxDB.
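
For reference, a minimal idb_dcs.json sketch might look as follows. Only "username" and "measurement" are described above; the other fields (host, port, database) are assumptions about typical InfluxDB connection settings.

{
  "influxdb_cfg": {
    "host": "127.0.0.1",
    "port": 8086,
    "database": "<your-dcs-database>",
    "username": "",
    "measurement": "<your-dcs-measurement>"
  }
}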

InfluxDB (ENV) configuration file (idb_env.json)

  • Used only with -e. Contains the information the SO needs to retrieve T and H from InfluxDB.
  • The SO will proceed with the scans only if T and H are within setting_value ± allowed_half_width.
  • Otherwise, it will keep monitoring them until they stabilize.
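
As an illustration of the interval check, a hypothetical idb_env.json could look like the sketch below. Only setting_value and allowed_half_width are named above; every other key (the names used for T and H, the InfluxDB connection block) is an assumption. With these numbers, scans would start only while the temperature stays within -15 ± 2 °C and the humidity within 5 ± 3 %.

{
  "influxdb_cfg": { "host": "127.0.0.1", "port": 8086, "database": "<env-database>", "measurement": "<env-measurement>" },
  "environments": [
    { "key": "temperature", "setting_value": -15.0, "allowed_half_width": 2.0 },
    { "key": "humidity", "setting_value": 5.0, "allowed_half_width": 3.0 }
  ]
}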

InfluxDB to LocalDB configuration file (idb_to_ldb.json)

Used when associating any DCS/ENV data to the scans in LocalDB

  • .influxdb_cfg: Which DB you want to take the data from.

  • .environment: List here the title of every set of data you want to upload to localDB, as you want it to appear in localDB (e.g. vdda, temperature, HV, ...)

  • .environments.measurement: The measurement where to read data from in InfluxDB

  • .environments.dcslist[i].key: Which key you want your data to be associated to (in localDB); has to match one of the ones declared in .environment

  • .environments.dcslist[i].data_name: The title of the column in InfluxDB where to read data from, to be associated with .environments.dcslist[i].key in localDB. If you are using the SO (and not your own DCS tool) to monitor your PS, data_name is not a free parameter. It is named as AX_C, where:

    • A: meas for the measured values and sett for the setting values
    • X: V or I
    • C: PS Channel, as you named it in the main config file. E.g. measV_LV will contain the sense values of V read on the channel named "LV".
  • .environments.dcslist[i].setting_column_name: Column name in influxDB storing the setting value of each "key". It is not used by the Scan Operator, but it's uploaded to localDB for reference.
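
To tie these keys together, here is an illustrative idb_to_ldb.json sketch. It assumes the SO's AX_C column naming and PS channels named "LV" and "HV"; the exact nesting of .environments is also an assumption, so verify it against the template shipped in configs.

{
  "influxdb_cfg": { "host": "127.0.0.1", "port": 8086, "database": "<your-dcs-database>" },
  "environment": ["vdda", "HV"],
  "environments": [
    {
      "measurement": "<your-dcs-measurement>",
      "dcslist": [
        { "key": "vdda", "data_name": "measV_LV", "setting_column_name": "settV_LV" },
        { "key": "HV", "data_name": "measV_HV", "setting_column_name": "settV_HV" }
      ]
    }
  ]
}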

LocalDB "site" and "user" configuration files (ldb_site.json and ldb_user.json)

Refer to the FAQs if you don't know the code of your institution.

LocalDB Module registration configuration file (ldb_module_registration.json)

This one is autogenerated. Don't worry about it.

4. View the results offline

The SO puts all the scan data under the data/ folder. There, the data is organized in folders named YYMMDD_(RUN_COUNTER). Each of them contains Yarr's output for all the scans performed under the same call to the SO.

Useful Independent tools

IV / VI scan script (./libDCS/iviscan.py)

  • Run python3 libDCS/iviscan.py -h for a list of options
  • Edit all the scan parameters in configs/lr_iviscan.json

You can specify the number of measurements per step and the sleep time between measurements. Moving to the initial point, and going back to the initial state after the scan finishes, is done by ramping smoothly. The measured values per step, as well as their standard deviation, are displayed in the terminal in real time.
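
A typical run, assuming the defaults in configs/lr_iviscan.json are picked up without extra arguments, is simply:

$EDITOR configs/lr_iviscan.json    # set the scan range, measurements per step and sleep time here
python3 libDCS/iviscan.py          # assumes the default config is picked up; see -h for the options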

PS Controller / monitor tool

  • python3 libDCS/psOperator.py -h for a complete set of options.

This script can be used to remotely control a power supply (set/get voltage, power on/off, etc.). It can also be left running in the background (e.g. in a screen), constantly monitoring a single channel and uploading the data to InfluxDB.
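
For long-term monitoring, one common pattern (not prescribed by the SO itself) is to keep it alive in a detached screen session; the bracketed placeholder stands for whatever options you pick from -h:

screen -dmS ps-monitor python3 libDCS/psOperator.py [options from -h]
screen -r ps-monitor    # re-attach later to check the output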

Parameter Optimizer

  • python3 libDCS/parameterOptimizer.py -h for usage information

Used to quickly find the best config values for a single chip. Can be used to scan trim values, tap values or any other config value. Builds a 2D matrix with all the value combinations and fills it in real time on the terminal. A sample output for a trim scan made with this tool is the following:

haxis: CmlTapBias1; vaxis: CmlTapBias0

  #      0     100  200   300      400
 300  76800.0  0.0  8.0   -1.0     -1.0
 400   258.0   0.0  0.0  1971.0    -1.0
 500    0.0    0.0  0.0   0.0    54638.0
 600    0.0    0.0  0.0   0.0    27142.0
 700    0.0    0.0  0.0   0.0     3513.0
 800    0.0    0.0  0.0   0.0      0.0
 900    0.0    0.0  0.0   0.0      0.0

The numbers in the matrix indicate the number of failing pixels (0's in the enMask) from a digital scan.

The default config file for this script is so_optimizer.json. There, the target chip number starts from 1: 1 is the first chip, 2 is the second, and so on, in the order they appear in your SO config. To check the order you can also look at the connectivity file displayed on the terminal when calling ./ScanOperator.sh -m your_module -j.

The script tries to find the best region by itself. Once found, you can wait for the whole matrix to be filled, or just kill the script at any stage. After using this script, run the SO once to reset the configuration to its original state.
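
A typical session, assuming the defaults described above (so_optimizer.json is picked up automatically) and using -j to rewrite the configs afterwards, could look like this:

python3 libDCS/parameterOptimizer.py    # scan the parameter space; kill it whenever you are satisfied
./ScanOperator.sh -m your_module -j     # regenerate the module configs from so_modules.json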

FAQs

These are a few issues you might encounter when using this software.

1. How do I know the code of my institution?

You might ask yourself this question if you run into the following error message:

[ error  ][   Local DB    ]: Not found site data {'institution': 'localhost.localdomain'} registered in Local DB.
[ error  ][   Local DB    ]: Please set your institution correctly in
[ error  ][   Local DB    ]: { "code": "xxx" } or { "institution": "xxx" } in /home/you/.yarr/localdb/localhost.localdomain_site.json
[ error  ][   Local DB    ]: Invalid configs for uploading data, aborting...

So, how do I know the code of my institution?

  1. Go to your Yarr folder
  2. Run ./localdb/bin/localdbtool-retrieve list site

Note: You might need to switch to the devel-localdb branch to run the above command.

2. My module is registered in iTk PD, but it doesn't appear in my localDB

The following error will terminate the program if running in -Q mode:

[ error  ][   Local DB    ]: Not found component data in Local DB: 20UPGM20000002

To solve it, you need to have localdb-tools installed somewhere on your computer. Then you can retrieve the module from iTkPD to localDB, following this:

  1. cd localdb-tools/viewer/itkpd-interface/
  2. source authenticate.sh
  3. ./bin/downloader.py --option Module

See ./bin/downloader.py -h for more information about this retriever tool.


Anything else? Feel free to mail me or open an issue in this repository! I'm looking forward to hearing your feedback.