Commit 5c4269fc authored by Alexander Froch

Merge branch birk-placeholder-url-support with refs/heads/master into refs/merge-requests/511/train

parents d3e9404a d6e455a8
Pipeline #3835609 passed with stages in 30 minutes and 48 seconds
"""Script to replace placeholders in .md files with the content of the file that is
specified in the placeholder."""
import argparse
import os
import re
from collections import Counter
from glob import glob
from shutil import copyfile
from subprocess import run
def GetParser():
@@ -166,6 +169,21 @@ def replace_placeholder_with_file_content(
        # has to be specified like §§§<filename>:<start>:<end>§§§
        placeholder = original_line.split("§§§")[1]
        # Check if a url was specified. If yes, download the file
        if placeholder.startswith("url="):
            url_search = re.search('url="(.*)"', placeholder)
            url = url_search.group(1)
            os.makedirs("downloads", exist_ok=True)
            tmp_filename = f"downloads/{url.split('/')[-1]}"
            print(f"Downloading file {url} -> {tmp_filename}")
            run(
                f"wget {url} -O {tmp_filename}",
                shell=True,
                check=True,
            )
            # replacement_file = tmp_filename
            placeholder = placeholder.replace(f'url="{url}"', tmp_filename)
        # Check how many colons are in the placeholder
        # Translate to python index + convert to start=0, end=-1 in case
        # where no number is specified
......
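For readers who want the full picture of what this hunk plugs into, here is a minimal, self-contained sketch of the placeholder resolution. It is not the actual umami script: the function name `resolve_placeholder` is hypothetical, `urllib.request.urlretrieve` stands in for the `wget` subprocess call above, and the `<start>`/`<end>` bounds are assumed to be 1-based and inclusive, with empty bounds meaning the whole file.

```python
"""Minimal sketch (not the actual umami script) of resolving a
§§§url="<url>":<start>:<end>§§§ placeholder."""
import os
import re
from urllib.request import urlretrieve  # stand-in for the wget call above


def resolve_placeholder(original_line: str) -> str:
    """Return the (sliced) file content referenced by the placeholder."""
    # Everything between the first pair of §§§ markers is the placeholder.
    placeholder = original_line.split("§§§")[1]

    # If a URL is given, download the file into a local 'downloads' directory
    # and point the placeholder at the downloaded copy instead.
    url_search = re.search(r'url="(.*)"', placeholder)
    if url_search:
        url = url_search.group(1)
        os.makedirs("downloads", exist_ok=True)
        tmp_filename = f"downloads/{url.split('/')[-1]}"
        urlretrieve(url, tmp_filename)
        placeholder = placeholder.replace(f'url="{url}"', tmp_filename)

    # Assumed format <filename>:<start>:<end>; empty bounds mean "whole file",
    # and <start>/<end> are taken as 1-based, inclusive line numbers.
    filename, start, end = placeholder.split(":")
    start_idx = int(start) - 1 if start else 0
    end_idx = int(end) if end else None

    with open(filename) as infile:
        return "".join(infile.readlines()[start_idx:end_idx])
```

For example, `resolve_placeholder('§§§url="https://example.com/raw/config.json"::§§§')` (a purely illustrative URL) would download `config.json` into `downloads/` and return its entire content.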
@@ -64,6 +64,18 @@ file `<filename>` from line `<start>` to line `<end>`.
The file in the repository will not be changed, but before building the
docs, a script will create a processed copy of the corresponding markdown file.
**Using a URL instead of a file from the repository**
If you want to include the content of a file that is not present in the umami repo, but you have a URL
to that exact file, you can use the following syntax:
```md
§§§url="<url>":<start>:<end>§§§
```
*Note that if you want to include the content of a file living in another GitLab
repository, you have to use a link pointing to the **raw** file content.*
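For example, the following placeholder (the line bounds `1` and `20` are purely illustrative) would insert lines 1 to 20 of the raw `EMPFlow.json` config from the training-dataset-dumper repository:
```md
§§§url="https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper/-/raw/r22/configs/single-b-tag/EMPFlow.json":1:20§§§
```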
**Further examples**
Below you can find different versions for inserting different parts of the file
......
@@ -72,51 +72,7 @@ python lwtnn/converters/kerasfunc2json.py architecture-lwtnn_model.json weights-
To test if the created model is working properly, you can use the [training-dataset-dumper](https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper) and add the created model to a config (e.g. [EMPFlow.json](https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper/-/blob/r22/configs/single-b-tag/EMPFlow.json)). For example, this can look like:
```json
{
  "jet_collection": "AntiKt4EMPFlowJets_BTagging201903",
  "jet_calibration_collection": "AntiKt4EMPFlow",
  "jet_calib_file": "JES_data2017_2016_2015_Consolidated_PFlow_2018_Rel21.config",
  "cal_seq": "JetArea_Residual_EtaJES_GSC_Smear",
  "cal_area": "00-04-82",
  "do_calibration": "true",
  "run_augmenters": "false",
  "vr_cuts": "false",
  "jvt_cut": 0.5,
  "pt_cut": 20000,
  "n_tracks_to_save": 40,
  "track_sort_order": "d0_significance",
  "track_selection": {
    "pt_minimum": 1000,
    "d0_maximum": 1.0,
    "z0_maximum": 1.5,
    "si_hits_minimum": 7,
    "si_holes_maximum": 2,
    "pix_holes_maximum": 1
  },
  "dl2_configs": [
    {
      "nn_file_path": "DIPS-model.json",
      "output_remapping": {
        "DIPS_pu": "dips_pu",
        "DIPS_pc": "dips_pc",
        "DIPS_pb": "dips_pb"
      }
    }
  ],
  "variables": {
    "btag": {
      "file": "single-btag-variables.json",
      "doubles": [
        "dips_pu",
        "dips_pc",
        "dips_pb"
      ]
    },
    "track": {
      "file": "single-btag-track-variables.json"
    }
  }
}
§§§url="https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper/-/raw/r22/configs/single-b-tag/EMPFlow.json"::§§§
```
To run the taggers within the dumper, we need the [r22 branch](https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper/-/tree/r22), or we need to change the AnalysisBase version in [setup.sh](https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper/-/blob/master/setup.sh#L21) to `asetup AnalysisBase,22.2.12,latest`.
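As a rough sketch of that setup (the clone command and the `source setup.sh` step are assumptions for illustration; only the `asetup AnalysisBase,22.2.12,latest` line comes from the sentence above):
```bash
# Option 1 (sketch): use the r22 branch of the dumper directly
git clone --branch r22 https://gitlab.cern.ch/atlas-flavor-tagging-tools/training-dataset-dumper.git
cd training-dataset-dumper

# Option 2 (sketch): on master, change the asetup line in setup.sh to
#   asetup AnalysisBase,22.2.12,latest
# before setting up the environment
source setup.sh
```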
......