Compare revisions — Wataru Otsubo / PSBoardDataBase

78 commits on source, 29 files changed: +8994 −167
Files

+22 −5
+variables:
+  CI_JULIA_CACHE_DIR: ${CI_PROJECT_DIR}/julia_pkg
+  JULIA_DEPOT_PATH: ${CI_JULIA_CACHE_DIR}
+cache:
+  key:
+    files:
+      - Project.toml
+      - docs/Project.toml
+    prefix: ${CI_JOB_NAME}
+  paths:
+    - ${CI_JULIA_CACHE_DIR}
 .script:
   script:
     - |
@@ -21,28 +32,34 @@ Julia 1.10:
   extends:
     - .script
     - .coverage
+Julia 1.11:
+  image: julia:1.11
+  extends:
+    - .script
+    - .coverage
 pages:
-  image: julia:1.10
+  image: julia:1.11
   stage: deploy
   script:
     - |
       julia --project=docs -e '
         using Pkg
+        @info "Pkg status before dev" Pkg.status() pwd()
         Pkg.develop(PackageSpec(path=pwd()))
         Pkg.instantiate()
-        using Documenter: doctest
+        @info "Pkg status after dev" Pkg.status()
+        using PSBoardDataBase
+        doctest(PSBoardDataBase)
         include("docs/make.jl")'
     - mkdir -p public
     - mv docs/build public/dev
+    - ls docs/src -R
+    - mv docs/src/assets/*.html public
   artifacts:
     paths:
       - public
   only:
     - main
 CompatHelper:
-  image: julia:1.10 # Set to the Julia version you want to use
+  image: julia:1.11 # Set to the Julia version you want to use
   stage: test # You can place this in any stage that makes sense for your setup
   only:
     - schedules
+32 −1
@@ -7,6 +7,36 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## [Unreleased]

+## [0.2.0] - 2024-10-23
+
+- Julia v1.11 was released and v1.10 is the new LTS
+- This package will support both of them
+
+### Added
+
+- Add `versions` table to store version information of the converter (this software)
+- Add `skew` column to the `qaqc_positions` table
+- Add `DownloadCSVs` module and functions which download the latest CSVs from Google Sheets
+- Add `ClockParser` module and `get_skew` function
+- Add `lvds_tx_skew` column to the `qaqc_single_run_result` table and related functions in `import_data.jl`
+- Add example Pluto notebook which plots a clock-skew histogram
+- Add `count_riseup` to count rising edges in clock results
+- Add example app using Pluto
+- Add `DispatchChecker` module
+- Add `is_dispatchable`, which checks whether a given PSBoard is ready to dispatch
+- Add `interactive_dispatch_checker`, which provides an interactive session for QAQC
+
+### Changed
+
+- Set the download functions in `DownloadCSVs` as the default CSV sources in `create_database_from_exported_csvs`
+- Replaced the CSV files used in tests with the newly added `DownloadCSVs` functions
+- `create_database_from_exported_csvs` now requires `slavelog_dir` to get skew from slave logs
+- CI runs on v1.10 and v1.11
+
+### Deleted
+
+- CSV files manually exported from Google Sheets
+
 ## [0.1.0]

 ### Added
@@ -15,5 +45,6 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Docs on the database and example of querying the database.
 - Docs on the internal (Julia implementation)

-[unreleased]: https://gitlab.cern.ch/wotsubo/PSBoardDataBase/-/compare/v0.1.0...main
+[unreleased]: https://gitlab.cern.ch/wotsubo/PSBoardDataBase/-/compare/v0.2.0...main
+[0.2.0]: https://gitlab.cern.ch/wotsubo/PSBoardDataBase/-/compare/v0.1.0...v0.2.0
 [0.1.0]: https://gitlab.cern.ch/wotsubo/PSBoardDataBase/-/tags/v0.1.0
+4 −1
 name = "PSBoardDataBase"
 uuid = "779f6a9c-59fa-41f1-8ed1-e9a91eccb2f5"
 authors = ["Wataru Otsubo <wotsubo@icepp.s.u-tokyo.ac.jp>"]
-version = "0.1.0"
+version = "0.2.0"

 [deps]
 CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
@@ -9,6 +9,8 @@ DBInterface = "a10d1c49-ce27-4219-8d33-6db1a4562965"
 DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
 Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+Downloads = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
+Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
 SQLite = "0aa819cd-b072-5ff4-a722-6bc24af294d9"
 Tables = "bd369af6-aec1-5ad0-b16a-f7cc5008161c"

@@ -23,6 +25,7 @@ CSV = "0.10"
 DBInterface = "2"
 DataFrames = "1"
 Documenter = "1"
+Downloads = "1"
 SQLite = "1"
 Tables = "1"

+4 −0
@@ -8,3 +8,7 @@ A database about PS Boards (especially about QAQC results).
 [Documentation (mainly about the package handling the database)](http://psboard-database.docs.cern.ch/dev/)

 For the PS Board QAQC, see [here](https://gitlab.cern.ch/dhashimo/PS_Board_QAQC)
+
+Example analysis results:
+- Simple application for browsing the results: https://psboard-database.docs.cern.ch/get_results.html
+- Statistics on boards whose skew was measured multiple times: https://psboard-database.docs.cern.ch/skew_stats.html
+14 −14
@@ -2,7 +2,7 @@

 julia_version = "1.10.5"
 manifest_format = "2.0"
-project_hash = "ee8312616d887d85f460636f9710488be4490a26"
+project_hash = "7a44705ce6faa370eeecb49dce927460b0d7fd20"

 [[deps.ANSIColoredPrinters]]
 git-tree-sha1 = "574baf8110975760d391c710b6341da1afa48d8c"
@@ -67,10 +67,10 @@ uuid = "9a962f9c-6df0-11e9-0e5d-c546b8b5ee8a"
 version = "1.16.0"

 [[deps.DataFrames]]
-deps = ["Compat", "DataAPI", "DataStructures", "Future", "InlineStrings", "InvertedIndices", "IteratorInterfaceExtensions", "LinearAlgebra", "Markdown", "Missings", "PooledArrays", "PrecompileTools", "PrettyTables", "Printf", "REPL", "Random", "Reexport", "SentinelArrays", "SortingAlgorithms", "Statistics", "TableTraits", "Tables", "Unicode"]
-git-tree-sha1 = "04c738083f29f86e62c8afc341f0967d8717bdb8"
+deps = ["Compat", "DataAPI", "DataStructures", "Future", "InlineStrings", "InvertedIndices", "IteratorInterfaceExtensions", "LinearAlgebra", "Markdown", "Missings", "PooledArrays", "PrecompileTools", "PrettyTables", "Printf", "Random", "Reexport", "SentinelArrays", "SortingAlgorithms", "Statistics", "TableTraits", "Tables", "Unicode"]
+git-tree-sha1 = "fb61b4812c49343d7ef0b533ba982c46021938a6"
 uuid = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
-version = "1.6.1"
+version = "1.7.0"

 [[deps.DataStructures]]
 deps = ["Compat", "InteractiveUtils", "OrderedCollections"]
@@ -154,9 +154,9 @@ version = "1.3.1"

 [[deps.Git_jll]]
 deps = ["Artifacts", "Expat_jll", "JLLWrappers", "LibCURL_jll", "Libdl", "Libiconv_jll", "OpenSSL_jll", "PCRE2_jll", "Zlib_jll"]
-git-tree-sha1 = "d18fb8a1f3609361ebda9bf029b60fd0f120c809"
+git-tree-sha1 = "ea372033d09e4552a04fd38361cd019f9003f4f4"
 uuid = "f8c6e375-362e-5223-8a59-34ff63f689eb"
-version = "2.44.0+2"
+version = "2.46.2+0"

 [[deps.IOCapture]]
 deps = ["Logging", "Random"]
@@ -292,9 +292,9 @@ version = "0.3.23+4"

 [[deps.OpenSSL_jll]]
 deps = ["Artifacts", "JLLWrappers", "Libdl"]
-git-tree-sha1 = "1b35263570443fdd9e76c76b7062116e2f374ab8"
+git-tree-sha1 = "7493f61f55a6cce7325f197443aa80d32554ba10"
 uuid = "458c3c95-2e84-50aa-8efc-19380b2a3a95"
-version = "3.0.15+0"
+version = "3.0.15+1"

 [[deps.OrderedCollections]]
 git-tree-sha1 = "dfdf5519f235516220579f949664f1bf44e741c5"
@@ -307,8 +307,8 @@ uuid = "efcefdf7-47ab-520b-bdef-62a2eaa19f15"
 version = "10.42.0+1"

 [[deps.PSBoardDataBase]]
-deps = ["CSV", "DBInterface", "DataFrames", "Dates", "Documenter", "SQLite", "Tables"]
-path = ".."
+deps = ["CSV", "DBInterface", "DataFrames", "Dates", "Documenter", "Downloads", "SQLite", "Tables"]
+path = "/home/qwjyh/Documents/school/lab/PSBoard_QAQC/PSBoardDataBase"
 uuid = "779f6a9c-59fa-41f1-8ed1-e9a91eccb2f5"
 version = "0.1.0"
 weakdeps = ["InteractiveUtils"]
@@ -347,9 +347,9 @@ version = "1.4.3"

 [[deps.PrettyTables]]
 deps = ["Crayons", "LaTeXStrings", "Markdown", "PrecompileTools", "Printf", "Reexport", "StringManipulation", "Tables"]
-git-tree-sha1 = "66b20dd35966a748321d3b2537c4584cf40387c7"
+git-tree-sha1 = "1101cd475833706e4d0e7b122218257178f48f34"
 uuid = "08abe8d2-0d0c-5749-adfa-8a2ac140af0d"
-version = "2.3.2"
+version = "2.4.0"

 [[deps.Printf]]
 deps = ["Unicode"]
@@ -420,9 +420,9 @@ version = "1.10.0"

 [[deps.StringManipulation]]
 deps = ["PrecompileTools"]
-git-tree-sha1 = "a04cabe79c5f01f4d723cc6704070ada0b9d46d5"
+git-tree-sha1 = "a6b1675a536c5ad1a60e5a5153e1fee12eb146e3"
 uuid = "892a3eda-7b42-436c-8928-eab12a02cf0e"
-version = "0.3.4"
+version = "0.4.0"

 [[deps.SuiteSparse_jll]]
 deps = ["Artifacts", "Libdl", "libblastrampoline_jll"]
+2 −0
 [deps]
+CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
+DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
 DocInventories = "43dc2714-ed3b-44b5-b226-857eda1aa7de"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 DocumenterInterLinks = "d12716ef-a0f6-4df4-a9f1-a5a34e75c656"
+94 −1
 ```@meta
 CurrentModule = PSBoardDataBase
 ```

+# Index
+
+```@contents
+Pages = ["about_software.md"]
+Depth = 4
+```
+
 # About the software in this repository

 This repository contains the code to create the database from the JATHub master log files and the CSV files exported from Google Sheets.
 The main function is [`create_database_from_exported_csvs`](@ref).

+!!! info
+    **TL;DR**:
+    when you want the database, prepare the _master logs_ and the _slave logs_, then run [`create_database_from_exported_csvs`](@ref):
+    ```julia
+    create_database_from_exported_csvs(
+        "database_name.db";
+        masterlog_dir = "dir/to/master/logs",
+        slavelog_dir = "dir/to/slave/logs"
+    )
+    ```
+
 ## How to run
 Install [Julia](https://julialang.org) with [juliaup](https://github.com/JuliaLang/juliaup).
 In the repository root (where `Project.toml` is),
@@ -25,10 +44,66 @@ return to julian mode with backspace (the prompt shows `julia>`), then `using
 # About the tests

 The tests actually create a database.
-By default not everything is run, but by placing the master log files and setting the environment variable `LOCAL_TEST`, the steps that need the master log files are also run.[^1]
+By default not everything is run, but by placing the master and slave logs and setting the environment variable `LOCAL_TEST`, all steps are run.[^1]
+What to prepare is detailed below.
+
+Typing `test` in Pkg mode (press `]`) runs the tests.
+To set the environment variable, write e.g. `ENV["LOCAL_TEST"] = "1"`.
+The tests are configured to open a sqlite browser automatically.

 [^1]: The master log files should not go into git, hence this setup.
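The full local test run described above can be sketched as a short Julia session (a hedged sketch; the actual run needs the log files described in the following sections):

```julia
# Opt in to the steps that need the master/slave logs under test/input/.
ENV["LOCAL_TEST"] = "1"

# Then run the package tests, either in Pkg mode (`]` then `test`) or:
# using Pkg; Pkg.test()   # left commented here: it needs the package environment
```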


+## Preparing the master logs
+
+Place the log files generated by [`start-shiftwork`](https://gitlab.cern.ch/dhashimo/PS_Board_QAQC/-/blob/master/Software_Test/ShiftWorks/script/start-shiftwork?ref_type=heads) (the `../log` directory in the same repository, normally on the JATHub master) under `test/input/`, as follows:
+
+```sh
+$ tree -L 2 test/input/
+test/input/
+└── log
+    ├── 100.log
+    ├── 101_long.log
+    ├── 102_long.log
+    ├── 103.log
+    ├── 104_long.log
+    ├── 105.log
+    ├── 106_long.log
+    ├── 107.log
+    ├── 108_long.log
+    ├── 109_long.log
+    ├── 110.log
+...
+```

+## Preparing the slave logs
+
+Similarly, prepare the logs produced on the JATHub slaves.
+They are used to extract the skew.
+They can be absent at worst; in that case, pass an empty directory to the function.
+
+```sh
+$ tree -L 2 test/input/slavelogs/ | head -n 10
+test/input/slavelogs/
+└── main
+    ├── 101_28_longrun.txt
+    ├── 101_29.txt
+    ├── 101_29_clk.txt
+    ├── 103_28_longrun.txt
+    ├── 103_29.txt
+    ├── 103_29_clk.txt
+    ├── 103_89.txt
+    ├── 103_89_clk.txt
+```
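The clock-log naming scheme above (`<psbid>_<runid>_clk.txt`, e.g. `103_29_clk.txt`) can be parsed with a small helper. This is an illustrative sketch, not a function from the package:

```julia
# Hypothetical helper: extract the PSBoard id and run id from a slave
# clock-log filename such as "103_29_clk.txt".
# Returns `nothing` for files that do not match (e.g. "_longrun" logs).
function parse_slave_clk_name(name::AbstractString)
    m = match(r"^(\d+)_(\d+)_clk\.txt$", name)
    isnothing(m) && return nothing
    (psbid = parse(Int, m[1]), runid = parse(Int, m[2]))
end
```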

+## Producing a coverage report
+
+```julia
+using LocalCoverage
+html_coverage(generate_coverage(), open = true)
+```
+
+The full test naturally yields higher coverage.

 # What to update for a new QAQC campaign

 - [`PSBoardDataBase.insert_qaqc_campaign_id`](@ref): the campaign dates
@@ -48,3 +123,21 @@ Modules = [PSBoardDataBase]
 ```@autodocs
 Modules = [QaqcMasterLog]
 ```

+## `DownloadCSVs`
+
+```@autodocs
+Modules = [DownloadCSVs]
+```
+
+## `ClockParser`
+
+```@autodocs
+Modules = [ClockParser]
+```
+
+## `DispatchChecker`
+
+```@autodocs
+Modules = [DispatchChecker]
+```
Five files added (previews collapsed): +17 −0, +17 −0, +1348 −0, +767 −0, +1814 −0

examples/skew_stats.jl (new file, mode 100644): +777 −0, preview collapsed

+27 −11
@@ -8,47 +8,62 @@ using DataFrames
 using Dates

 include("parse_qaqc_master_log.jl")
+include("parse_clock.jl")

 include("create_table.jl")
+
+include("download_csv.jl")
 include("import_data.jl")

+include("dispatch_checker.jl")
+using .DispatchChecker
+
 """
     create_database_from_exported_csvs(
         dbpath::AbstractString;
-        single_run_csv::AbstractString,
-        runlist_csv::AbstractString,
-        dispatch_csv::AbstractString,
-        hundred_csv::AbstractString,
+        single_run_csv::AbstractString = DownloadCSVs.download_single_run_csv(),
+        runlist_csv::AbstractString = DownloadCSVs.download_runlist_csv(),
+        dispatch_csv::AbstractString = DownloadCSVs.download_dispatch_csv(),
+        hundred_csv::AbstractString = DownloadCSVs.download_hundred_run_csv(),
+        jathubs_csv::AbstractString = DownloadCSVs.download_jathub_csv(),
         masterlog_dir::AbstractString,
+        slavelog_dir::AbstractString,
     )

 Create database at `dbpath` and import data from CSV and master log files.

 # Arguments
+
+## Required
 - `dbpath`: where the database will be created
+- `masterlog_dir`: path to the directory (`log`) where all JATHub master logs are stored
+- `slavelog_dir`: path to the directory where all JATHub slave logs are stored
+
+## Optional
 - `single_run_csv`: CSV of single run results exported from the Google sheets database
 - `runlist_csv`: CSV of run lists exported from the Google sheets database
 - `dispatch_csv`: CSV of dispatch lists exported from the Google sheets database
 - `hundred_csv`: CSV of 100 tests results exported from the Google sheets database
-- `masterlog_dir`: path to the directory (`log`) where all JATHub master logs are stored
+- `jathubs_csv`: CSV of the jathub list used in QAQC. Used to add skew.
 """
 function create_database_from_exported_csvs(
     dbpath::AbstractString;
-    single_run_csv::AbstractString,
-    runlist_csv::AbstractString,
-    dispatch_csv::AbstractString,
-    hundred_csv::AbstractString,
+    single_run_csv::AbstractString = DownloadCSVs.download_single_run_csv(),
+    runlist_csv::AbstractString = DownloadCSVs.download_runlist_csv(),
+    dispatch_csv::AbstractString = DownloadCSVs.download_dispatch_csv(),
+    hundred_csv::AbstractString = DownloadCSVs.download_hundred_run_csv(),
+    jathubs_csv::AbstractString = DownloadCSVs.download_jathub_csv(),
     masterlog_dir::AbstractString,
+    slavelog_dir::AbstractString,
 )
     db = create_database(dbpath)
     single_result_df = CSV.read(single_run_csv, DataFrame)
     runlist_table = CSV.read(runlist_csv, DataFrame)
     dispatch_table = CSV.read(dispatch_csv, DataFrame)
     extra_100test_result_df = CSV.read(hundred_csv, DataFrame)
+    jathubs_table = CSV.read(jathubs_csv, DataFrame)

     insert_qaqc_campaign_id(db)
-    insert_qaqc_positions(db)
+    insert_qaqc_positions(db, jathubs_table)

     add_psboard_ids(db, single_result_df)
     add_qaqc_runlist_from_runlist(db, runlist_table)
@@ -56,6 +71,7 @@ function create_database_from_exported_csvs(
     add_qaqc_dispatch(db, dispatch_table)
     add_qaqc_runlist_from_masterlogs(db, masterlog_dir)
     add_qaqc_100test_result(db, extra_100test_result_df)
+    add_skew_from_slave_clk_logs(db, slavelog_dir)

     db
 end
+190 −0
"""
Module to check whether a PSBoard is dispatchable.

Use [`interactive_dispatch_checker`](@ref) for interactive use in QAQC.
"""
module DispatchChecker

using SQLite
using DBInterface
using DataFrames
using Tables # needed for Tables.rowtable below
using Printf

export DbConnection
export is_dispatchable

"""
Stores a connection to the database.

    DbConnection(db::SQLite.DB)

Constructor.
"""
mutable struct DbConnection
    db::SQLite.DB
    df_single_result::DataFrame
    df_extra_results::DataFrame
    function DbConnection(db::SQLite.DB)
        df_single_results =
            DBInterface.execute(db, sql"select * from qaqc_single_run_results") |> DataFrame
        df_extra_results =
            DBInterface.execute(db, sql"select * from qaqc_extra_run_results") |> DataFrame
        new(db, df_single_results, df_extra_results)
    end
end

const THRESHOLD_INSUFFICIENT_RESET_WITH_10 = 0.1
const THRESHOLD_RESET_FAILED_THOUGH_RECONFIG_DONE = 0.1
const THRESHOLD_ALWAYS_HIT_FLAG_TRUE = 0.1
const THRESHOLD_BCID_FAIL = 0.1

"""
    is_dispatchable(conn::DbConnection, psbid::Int64)

Test whether the PS Board with `psbid` is dispatchable from QAQC results in `conn`.
`conn` is type of [`DbConnection`](@ref).
Since the current implemented logic is somewhat simple, it returns `missing` if it cannot be decided.
"""
function is_dispatchable(conn::DbConnection, psbid::Int64)
    single_results = filter(:psboard_id => ==(psbid), conn.df_single_result)
    extra_results = filter(:psboard_id => ==(psbid), conn.df_extra_results)

    is_single_passed::Bool =
        nrow(single_results) == 1 && let
            single_result = Tables.rowtable(single_results) |> first

            # Clock test update was wrong
            # manually assign to psboards whose clock test failed
            if 221 <= single_result.runid < 234
                if single_result.psboard_id == 915
                    false
                elseif single_result.psboard_id in [860, 889, 876, 892]
                    # though 860 has 2+ rows
                    true
                end
            else
                single_result.resistance_test_passed == 1 &&
                    single_result.qspip == 1 &&
                    single_result.recov == 1 &&
                    single_result.power == 1 &&
                    single_result.clock == 1
            end
        end
    @debug "" is_single_passed single_results single_results.note
    is_extra_passed::Bool =
        nrow(extra_results) == 1 && let
            extra_result = Tables.rowtable(extra_results) |> first
            f1 =
                !ismissing(extra_result.insufficient_reset_with_10) &&
                extra_result.insufficient_reset_with_10 >=
                extra_result.num_tests * THRESHOLD_INSUFFICIENT_RESET_WITH_10
            f2 =
                !ismissing(extra_result.reset_failed_though_reconfig_done) &&
                extra_result.reset_failed_though_reconfig_done >=
                extra_result.num_tests * THRESHOLD_RESET_FAILED_THOUGH_RECONFIG_DONE
            f3 =
                !ismissing(extra_result.always_hit_flag_true) &&
                extra_result.always_hit_flag_true >=
                extra_result.num_tests * THRESHOLD_ALWAYS_HIT_FLAG_TRUE
            f4 =
                !ismissing(extra_result.bcid_fail) &&
                extra_result.bcid_fail >= extra_result.num_tests * THRESHOLD_BCID_FAIL
            @debug "" extra_result extra_result.note f1 f2 f3 f4
            !(f1 || f2 || f3 || f4)
        end
    @debug "" is_extra_passed extra_results

    if is_single_passed & is_extra_passed
        return true
    end

    # TODO: not yet implemented
    @info "results" single_results select(
        extra_results,
        Not(:id, :num_tests, :dac_is_0, :bcid_fail_111, :bcid_fail_000, :low_efficiency),
    )
    @debug "results(full)" extra_results
    return missing
end

"""
Interactive session for QAQC to check PSBoard is ready for dispatch.
"""
function interactive_dispatch_checker end

"""
    interactive_dispatch_checker(conn::DbConnection)
"""
function interactive_dispatch_checker(conn::DbConnection)
    dispatch_list = Int64[]

    println("Type \"quit\" to exit")
    for _ in 1:1000
        printstyled("PSBoard ID: ", bold = true)
        psbid = let
            rawin = readline()
            if lowercase(rawin) == "quit"
                printstyled("Quit\n", italic = true)
                println()
                break
            end
            m = match(r"^PS(\d+)", rawin)
            if isnothing(m)
                printstyled("Invalid input\n", color = :red)
                continue
            end
            parse(Int64, m[1])
        end
        isdispatchable = is_dispatchable(conn, psbid)
        if ismissing(isdispatchable)
            printstyled("Please determine [y/n]: ", underline = true, color = :cyan)
            isdispatchable = let
                rawin = readline()
                @info "" rawin
                if rawin == "y" || rawin == "Y"
                    true
                elseif rawin == "n" || rawin == "N"
                    false
                else
                    @warn "Invalid input falling back to \"no\""
                    false
                end
            end
        end
        if isdispatchable
            printstyled("Ok\n", bold = true, color = :green)
            if psbid in dispatch_list
                println("PSBoard ID $(psbid) is already in dispatch list")
            else
                push!(dispatch_list, psbid)
                println("Added to dispatch list")
            end
        else
            printstyled("No\n", bold = true, color = :red)
        end
    end

    printstyled("Finished\n")

    map(dispatch_list) do psbid
        @sprintf "PS%06d" psbid
    end |> (v -> join(v, "\n")) |> print
    println()

    printstyled("Paste the result to google sheets\n", underline = true)
    @info "Tips: You can use `join(ans, \"\\n\") |> clipboard` in REPL to copy the result to the clipboard"

    return dispatch_list
end

"""
    interactive_dispatch_checker(database_file::AbstractString)

Interactive session for QAQC to check whether the provided PSBoard is ready to dispatch.
"""
function interactive_dispatch_checker(database_file::AbstractString)
    conn = DbConnection(SQLite.DB(database_file))
    interactive_dispatch_checker(conn)
end

end # module DispatchChecker
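The four `f1`–`f4` checks in `is_dispatchable` above share one shape: a failure counter "trips" when it reaches a fixed fraction (10%) of `num_tests`, and `missing` counters never trip. A standalone sketch of that rule (the helper names are illustrative, not part of the package):

```julia
# A counter trips when it is not missing and reaches threshold × num_tests.
trips(count, num_tests; threshold = 0.1) =
    !ismissing(count) && count >= num_tests * threshold

# A board passes the extra run when none of its failure counters trip.
extra_passed(counters, num_tests) = !any(c -> trips(c, num_tests), counters)
```

For example, with `num_tests = 100`, a counter of 10 trips the 10% threshold, while a `missing` counter does not.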

src/download_csv.jl

0 → 100644
+61 −0
"""
Functions to download result CSVs from Google Sheets.
All functions return the filename as a `String`.
"""
module DownloadCSVs

using Downloads

"""
    download_single_run_csv(outfile::AbstractString = tempname()) -> filename

# Example
```jldoctest
julia> file = PSBoardDataBase.DownloadCSVs.download_single_run_csv();

julia> using CSV

julia> using DataFrames

julia> CSV.read(file, DataFrame) isa DataFrame
true
```
"""
function download_single_run_csv(outfile::AbstractString = tempname())
    URL_SINGLE_RUN_CSV::String = "https://docs.google.com/spreadsheets/u/1/d/128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU/export?format=csv&id=128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU&gid=408695746"
    Downloads.download(URL_SINGLE_RUN_CSV, outfile)
end

"""
    download_runlist_csv(outfile::AbstractString = tempname()) -> filename
"""
function download_runlist_csv(outfile::AbstractString = tempname())
    URL_RUNLIST_CSV::String = "https://docs.google.com/spreadsheets/u/1/d/128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU/export?format=csv&id=128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU&gid=252134084"
    Downloads.download(URL_RUNLIST_CSV, outfile)
end

"""
    download_dispatch_csv(outfile::AbstractString = tempname()) -> filename
"""
function download_dispatch_csv(outfile::AbstractString = tempname())
    URL_DISPATCH_CSV::String = "https://docs.google.com/spreadsheets/u/1/d/128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU/export?format=csv&id=128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU&gid=707598141"
    Downloads.download(URL_DISPATCH_CSV, outfile)
end

"""
    download_hundred_run_csv(outfile::AbstractString = tempname()) -> filename
"""
function download_hundred_run_csv(outfile::AbstractString = tempname())
    URL_HUNDRED_RUN_CSV::String = "https://docs.google.com/spreadsheets/u/1/d/128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU/export?format=csv&id=128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU&gid=615256061"
    Downloads.download(URL_HUNDRED_RUN_CSV, outfile)
end

"""
    download_jathub_csv(outfile::AbstractString = tempname()) -> filename
"""
function download_jathub_csv(outfile::AbstractString = tempname())
    URL_JATHUB_CSV::String = "https://docs.google.com/spreadsheets/u/1/d/128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU/export?format=csv&id=128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU&gid=303843601"
    Downloads.download(URL_JATHUB_CSV, outfile)
end

end # module DownloadCSVs
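The five download helpers above differ only in the sheet `gid`; a hypothetical refactor could derive each URL from one template (the sheet id is copied from the module; the helper name is not part of the package):

```julia
# Build the Google Sheets CSV-export URL used by the helpers above
# for a given sheet gid.
const SHEET_ID = "128qOseOy4QDotehYe4Wf2jj88tnwiXGVdR3NHrjcDYU"
sheet_csv_url(gid::Integer; id::String = SHEET_ID) =
    "https://docs.google.com/spreadsheets/u/1/d/$id/export?format=csv&id=$id&gid=$gid"
```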
+86 −6
Original line number Original line Diff line number Diff line
"""
    insert_version_info(db::SQLite.DB)

Insert version information of this software as string.
"""
function insert_version_info(db::SQLite.DB)
    stmt = DBInterface.prepare(
        db,
        sql"""
        INSERT INTO versions VALUES (:converter)
        """,
    )
    DBInterface.execute(stmt, (; converter = pkgversion(@__MODULE__) |> string))

    nothing
end

"""
"""
    insert_qaqc_campaign_id(db::SQLite.DB)
    insert_qaqc_campaign_id(db::SQLite.DB)


Fill qaqc_campaigns table in `db`.
Fill qaqc_campaigns table in `db`.
"""
"""
function insert_qaqc_campaign_id(db::SQLite.DB)
function insert_qaqc_campaign_id(db::SQLite.DB)
    campaigns = [1, 2, 3]
    campaigns = [1, 2, 3, 4]
    dates = [
    dates = [
        (DateTime(2024, 7, 22), DateTime(2024, 7, 24)),
        (DateTime(2024, 7, 22), DateTime(2024, 7, 24)),
        (DateTime(2024, 8, 6), DateTime(2024, 8, 9)),
        (DateTime(2024, 8, 6), DateTime(2024, 8, 9)),
        (DateTime(2024, 9, 10), DateTime(2024, 9, 12)),
        (DateTime(2024, 9, 10), DateTime(2024, 9, 12)),
        (DateTime(2024, 9, 30), DateTime(2024, 10, 4)),
    ]
    ]
    stmt = DBInterface.prepare(
    stmt = DBInterface.prepare(
        db,
        db,
@@ -28,11 +46,18 @@ function insert_qaqc_campaign_id(db::SQLite.DB)
end
end


"""
"""
    insert_qaqc_positions(db::SQLite.DB)
    insert_qaqc_positions(db::SQLite.DB, jathub_db_table::DataFrame)


Fill qaqc_positions table in `db`.
Fill qaqc_positions table in `db`.
Argument `jathub_db_table` is for skew for each positions.
"""
"""
function insert_qaqc_positions(db::SQLite.DB)
function insert_qaqc_positions(db::SQLite.DB, jathub_db_table::DataFrame)
    dropmissing!(jathub_db_table, :psb_position)
    transform!(
        jathub_db_table,
        Symbol("立ち上がり [ns]") => ByRow(Float64) => Symbol("立ち上がり [ns]"),
    )

    stmt = DBInterface.prepare(
    stmt = DBInterface.prepare(
        db,
        db,
        sql"""
        sql"""
@@ -41,7 +66,8 @@ function insert_qaqc_positions(db::SQLite.DB)
            :id,
            :id,
            :name,
            :name,
            :station,
            :station,
            :position
            :position,
            :rising_ns
        )
        )
        """,
        """,
    )
    )
@@ -52,6 +78,12 @@ function insert_qaqc_positions(db::SQLite.DB)
            name = ["B-$i-$j" for i in 0:1 for j in 1:9],
            name = ["B-$i-$j" for i in 0:1 for j in 1:9],
            station = [fill(0, 9); fill(1, 9)],
            station = [fill(0, 9); fill(1, 9)],
            position = [collect(1:9); collect(1:9)],
            position = [collect(1:9); collect(1:9)],
            rising_ns = [
                filter(
                    :psb_position => (s -> !ismissing(s) && s == "B-$i-$j"),
                    jathub_db_table,
                ).var"立ち上がり [ns]" |> first for i in 0:1 for j in 1:9
            ],
        ),
        ),
    )
    )
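The comprehensions in this hunk generate the 18 QAQC positions, two stations of nine slots each. In isolation:

```julia
# Stations 0–1, positions 1–9 → names "B-0-1" … "B-1-9"
position_names = ["B-$i-$j" for i in 0:1 for j in 1:9]
station = [fill(0, 9); fill(1, 9)]       # 9 zeros, then 9 ones
position = [collect(1:9); collect(1:9)]  # 1..9 twice
```

The three vectors line up index-by-index, which is what lets them be passed as parallel columns to the prepared `INSERT`.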


@@ -163,8 +195,10 @@ function get_campaign_id_from_run_id(runid::Integer)
        1
    elseif runid < 98
        2
-    elseif runid < 169
+    elseif runid < 188
        3
+    elseif runid < 242
+        4
    else
        @error "Fix this function"
        DomainError("runid $(runid) is not registered to the software")
@@ -179,7 +213,7 @@ end
    ) -> nothing

Fill `qaqc_single_run_results` in `db` from single result table DataFrame.
-Additionaly, it
+Additionally, it
1. automatically add `runid` if it's not in `qaqc_runs` table in `db`.
2. automatically update fields in `qaqc_runs` table.
"""
@@ -671,3 +705,49 @@ function add_qaqc_100test_result(db::SQLite.DB, table::DataFrame)

    nothing
end

"""
    add_skew_from_slave_clk_logs(db::SQLite.DB, logs_dir::AbstractString)

Record skew measurement results in the `lvds_tx_skew` column of `qaqc_single_run_results`,
parsed from slave clock logs named `<psbid>_<runid>_clk.txt` under `logs_dir/main`.
See [`ClockParser.get_skew`](@ref) for parsing details.

# Abnormal logs:
- `48_nagoya_irradition_...`: skipped
- `630_190`: broken file
"""
function add_skew_from_slave_clk_logs(db::SQLite.DB, logs_dir::AbstractString)
    stmt_insrt = DBInterface.prepare(
        db,
        sql"""
        UPDATE qaqc_single_run_results
        SET lvds_tx_skew = :skew
        WHERE runid = :runid AND psboard_id = :psbid
        """,
    )
    clk_files =
        readdir("$logs_dir/main", join = true) |>
        filter(endswith("_clk.txt")) |>
        filter(!contains("nagoya"))

    DBInterface.transaction(db) do
        for file in clk_files
            m = match(r"^(?<psbid>\d+)_(?<runid>\d+)_clk.txt$", splitdir(file) |> last)
            if isnothing(m)
                error("Invalid filename $(file)")
            end

            if m[:psbid] == "630" && m[:runid] == "190"
                @debug "skipping... (psbid=630 runid=190 is broken)"
                continue
            end

            DBInterface.execute(
                stmt_insrt,
                (skew = ClockParser.get_skew(file), runid = m[:runid], psbid = m[:psbid]),
            )
        end
    end

    nothing
end
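The filename convention the function above relies on can be exercised with its regex directly; `630_190_clk.txt` is one of the entries mentioned in the docstring:

```julia
# Capture psbid and runid from a `<psbid>_<runid>_clk.txt` filename
# using the same named-group regex as add_skew_from_slave_clk_logs.
m = match(r"^(?<psbid>\d+)_(?<runid>\d+)_clk.txt$", "630_190_clk.txt")
psbid, runid = m[:psbid], m[:runid]
```

Non-matching names yield `nothing` from `match`, which is why the function raises an error for them before executing the `UPDATE`.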

src/parse_clock.jl

0 → 100644
+131 −0
Original line number Original line Diff line number Diff line
module ClockParser

export get_skew

function _parse_line(line::AbstractString)
    time, high, _ = split(line)
    parse(Float64, time), parse(Float64, high)
end

"""
    get_skew(file::T) where {T <: AbstractString} -> Union{Float64, Missing}

Get skew from clock result file `file`.
It returns `missing` for invalid files.
To see the detailed reason, increase log level to `DEBUG`.
Invalid cases are:

- first line has >0 counts => "Unexpected first line"
- no measurement has >500 counts => "Clock skew out of range"
"""
function get_skew(file::T) where {T <: AbstractString}
    @debug "file: $(file)"
    lines = Iterators.Stateful(eachline(file))

    was_0_before = false
    # criteria is changed from QAQC for skew parsing
    let
        first_line = popfirst!(lines)
        time, high = _parse_line(first_line)
        if high == 0
            was_0_before = true
        end
    end

    time_and_counts = Iterators.map(_parse_line, lines)
    for (time, high) in time_and_counts
        if high == 0
            was_0_before = true
        elseif was_0_before && high >= 500
            return time
        end
    end
    @debug "Clock skew out of range"
    return missing
end
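A self-contained sketch of the criterion `get_skew` implements, run over hypothetical measurement lines of the form `<time> <count> …`: the skew is the first time whose count is at least 500 seen after at least one zero-count line.

```julia
# Hypothetical samples; a trailing column is present but ignored, as in the logs.
sample_lines = [
    "0.0 1000 x",  # high from the start: ignored until a zero has been seen
    "0.5 0 x",
    "1.0 250 x",
    "1.5 800 x",   # first count >= 500 after a zero
]

function skew_from_lines(lines)
    was_0_before = false
    for line in lines
        time, high = parse.(Float64, split(line)[1:2])
        if high == 0
            was_0_before = true
        elseif was_0_before && high >= 500
            return time
        end
    end
    return missing  # "Clock skew out of range"
end
```

Note the first branch never fires for the leading high sample, which mirrors the relaxed criterion noted in the comment above (`# criteria is changed from QAQC for skew parsing`).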

"""
    search_oscillation(file::T) where {T <: AbstractString}

Search oscillation (two or more rise up) for clock measurement file.
"""
function count_riseup(file::T) where {T <: AbstractString}
    lines = Iterators.Stateful(eachline(file))

    rising_count = 0
    first_line = popfirst!(lines)
    is_high = let
        time, high = _parse_line(first_line)
        high >= 500
    end

    time_and_counts = Iterators.map(_parse_line, lines)
    for (time, high) in time_and_counts
        if !is_high && high >= 500
            is_high = true
            rising_count += 1
        elseif is_high && high < 500
            is_high = false
        end
    end

    return rising_count

    # lines = eachline(file)
    # time_and_counts = Iterators.map(_parse_line, lines)
    # is_high = Iterators.map(time_and_counts) do (time, counts)
    #     counts >= 500
    # end
    # edges = Iterators.map(accumulate((p, n) -> (n, !p[1] & n), is_high, init = (false, false))) do x
    #     _prev, edge = x
    #     edge
    # end
    #
    # return sum(edges)
end
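The edge-counting logic above can be restated over a plain count vector (threshold 500 as in the file): each low-to-high transition increments the counter.

```julia
# Count low→high transitions; two or more indicate oscillation.
function riseups(counts::AbstractVector{<:Real})
    is_high = first(counts) >= 500
    n = 0
    for c in Iterators.drop(counts, 1)
        if !is_high && c >= 500
            is_high = true
            n += 1
        elseif is_high && c < 500
            is_high = false
        end
    end
    return n
end
```

A signal that starts high contributes no rise-up until it first falls below the threshold, matching the initialization from the first line in `count_riseup`.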

"""
Return Tuple of
- skew (first time >500)
- rise up full (last 0 to first 1000)

If clock is abnormal (i.e., which returns `missing` when [`get_skew`](@ref) is applied),
this returns Tuple of 2 `missing`s.
"""
function get_skew_and_riseup(file::T) where {T <: AbstractString}
    lines = Iterators.Stateful(eachline(file))

    last_low_time = missing
    first_high_time = missing
    skew = missing
    is_rised = false

    let
        time, high = _parse_line(popfirst!(lines))
        if high == 0
            last_low_time = time
        end
    end

    for line in lines
        time, high = _parse_line(line)
        if high == 0
            last_low_time = time
        elseif !ismissing(last_low_time) && !is_rised && high >= 500
            skew = time
            is_rised = true
        elseif !ismissing(skew) && high == 1000
            first_high_time = time
            break
        end
    end

    if first_high_time === missing
        @debug "Clock skew out of range"
        return (missing, missing)
    end

    return (skew, first_high_time - last_low_time)
end

end # module ClockParser
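The combined extraction in `get_skew_and_riseup` can be sketched over `(time, count)` pairs: track the last zero-count time, record the skew at the first count of 500 or more, and stop at the first full-scale count of 1000. This is a simplified sketch, not the exact function above (it folds the first-line handling into the main loop).

```julia
function skew_and_riseup(samples::Vector{Tuple{Float64, Int}})
    last_low = missing
    skew = missing
    for (t, c) in samples
        if c == 0
            last_low = t                 # remember the last fully-low time
        elseif !ismissing(last_low) && ismissing(skew) && c >= 500
            skew = t                     # first crossing of the 500 threshold
        elseif !ismissing(skew) && c == 1000
            return (skew, t - last_low)  # (skew, full rise-up time)
        end
    end
    return (missing, missing)            # abnormal clock
end
```
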
+8 −2
+CREATE TABLE versions (
+  converter TEXT
+);
+
CREATE TABLE ps_boards (
  id INTEGER NOT NULL PRIMARY KEY,
  daughterboard_id INTEGER
@@ -17,6 +21,7 @@ CREATE TABLE qaqc_single_run_results (
  asdtp INTEGER,
  reset INTEGER,
  qaqc_result INTEGER,
+  lvds_tx_skew REAL,
  note TEXT,
  FOREIGN KEY("runid") REFERENCES "qaqc_runs"("id"),
  FOREIGN KEY("psboard_id") REFERENCES "ps_boards"("id"),
@@ -89,7 +94,8 @@ CREATE TABLE qaqc_positions (
  id INTEGER NOT NULL PRIMARY KEY,
  name TEXT NOT NULL UNIQUE,
  station INTEGER NOT NULL,
-  position INTEGER NOT NULL
+  position INTEGER NOT NULL,
+  rising_ns NUMERIC NOT NULL
);

CREATE VIEW qaqc_single_run_table
@@ -109,8 +115,8 @@ AS
    qaqc_single_run_results.asdtp,
    qaqc_single_run_results.reset,
    qaqc_single_run_results.qaqc_result,
+    qaqc_single_run_results.lvds_tx_skew - qaqc_positions.rising_ns AS lvds_tx_skew,
    qaqc_runs.shiftscript_ver,
-    qaqc_runs.shifter,
    qaqc_single_run_results.note AS result_note
  FROM
    ps_boards,
+5 −0
log/*
+slavelogs/main/*
!log/57_long.log
+!slavelogs/main/230_51_clk.txt
+!slavelogs/main/448_103_clk.txt
+!slavelogs/main/444_103_clk.txt
+!slavelogs/main/209_51_clk.txt
Remaining files (previews collapsed by the diff size limit):

+0 −0      File deleted
+0 −118    File deleted
+0 −0      File deleted
+0 −0      File deleted
+1401 −0   File added
+1401 −0   File added
+337 −0    File added
+337 −0    File added
+98 −8     File changed