```yaml
persist-tag-artifacts:
  # Use this job to specify artifacts that should never expire for tag pipelines.
  stage: post-deploy
  rules:
    - if: '$CI_COMMIT_TAG != null'
  script:
    - echo "Persisting artifacts for tag pipelines"
  artifacts:
    # Specify artifacts from previous jobs to be persisted indefinitely
    paths:
      - myArtifactsPathFromJob1
      - myArtifactsPathFromJob2
      - ...
    expire_in: never
```
OK, I created another stage and added this new job in this commit: 8519f358
The first test is fine: the post-deploy stage is not triggered at all by a standard commit, that is, a commit without a tag:
But creating a new tag triggers the post-deploy stage correctly; see the additional stage in the latest pipeline, which is a tag pipeline, compared to the previous one on the same commit:
I now created a dummy commit, which triggered a standard pipeline, that is without the additional post-deploy stage:
When the latest pipeline succeeds (6185279), the artifacts of the pipeline before the tag (6185138) should be deleted tomorrow, based on our current expiry policy. But, if the new post-deploy job works correctly, the tag pipeline (6185215) should keep its artifacts.
I'll update the issue tomorrow, after the 24 hours of our expiry policy.
OK, after more than 24 hours, the artifacts are now available only for the latest standard pipeline and for the tag pipeline. The other pipelines have no artifacts to download.
After a first run of the script with a read_api (read-only) access token, I ran it with a full api token on a subset of jobs, and it successfully erased those jobs, that is, it deleted all artifacts and logs related to each job.
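The core of such a cleanup script can be sketched as below. This is a minimal illustration, not the actual script: the instance URL and project ID are placeholders, and the `dry_run` switch is my own addition to mirror the read-only first pass. The endpoint itself (`POST /projects/:id/jobs/:job_id/erase`) is the standard GitLab API call that removes a job's artifacts and log.

```python
import json
import urllib.request

GITLAB_URL = "https://gitlab.example.com"  # placeholder instance URL
PROJECT_ID = 1234                          # placeholder project ID

def erase_job(job_id: int, token: str, dry_run: bool = True):
    """Erase one job (deletes its artifacts and log) via the GitLab API.

    With dry_run=True the function only builds and returns the target URL,
    so the script can first be exercised safely with a read-only token.
    """
    url = f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/jobs/{job_id}/erase"
    if dry_run:
        return url
    req = urllib.request.Request(
        url, method="POST", headers={"PRIVATE-TOKEN": token}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A real run would loop over the job IDs to be erased, skipping those belonging to the "latest" and tag pipelines, and call `erase_job(job_id, token, dry_run=False)`.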
This pipeline was the latest one before setting the expiry policy and had all artifacts. Now, it shows "no artifacts":
Also, it successfully keeps the artifacts of the jobs belonging to the "latest" pipeline:
and the ones belonging to a "tag" pipeline:
So, I can now run the script on all jobs. I will let it run overnight and update this page once it is done.
UPDATE: I investigated the increase in size observed by @tsulaia (summarized here)
Actually, each job produces a bundle of artifacts ranging from about 1 MB to about 1.3 GB. The difference lies in the dependency on Geant4: when a job depends on Geant4, the /install/ folder from the Geant4 build job is picked up, used, and bundled, which results in a huge final artifact bundle.
I summed up the artifacts of all jobs from a single pipeline: each pipeline produces about 14.3 GB of artifacts in total:
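The summing step can be sketched as follows. This is an illustrative snippet, assuming the job dicts come back in the shape the GitLab jobs API reports (an `artifacts` list per job, each entry with a `size` in bytes); the instance URL and project ID are placeholders.

```python
import json
import urllib.request

GITLAB_URL = "https://gitlab.example.com"  # placeholder instance URL
PROJECT_ID = 1234                          # placeholder project ID

def total_artifact_bytes(jobs):
    """Sum the sizes (bytes) of all artifact files over a list of job dicts,
    where each job carries an "artifacts" list with per-file "size" fields."""
    return sum(
        artifact.get("size", 0)
        for job in jobs
        for artifact in job.get("artifacts", [])
    )

def pipeline_jobs(pipeline_id: int, token: str):
    """Fetch the jobs of one pipeline (first page only, for brevity)."""
    url = (f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}"
           f"/pipelines/{pipeline_id}/jobs?per_page=100")
    req = urllib.request.Request(url, headers={"PRIVATE-TOKEN": token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `total_artifact_bytes(pipeline_jobs(pipeline_id, token)) / 1e9` gives the per-pipeline total in GB.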
In the last 10 days, we had two pipelines whose artifacts were correctly preserved because of our new rules/policy:
So, the total for the preserved pipelines sums up to about 14.3 GB x 4 = 57.2 GB, which is compatible with the size increase that @tsulaia observed in the last 10 days.
==> So, that is understood.
However...
While the preservation of the first two pipelines is correct with respect to our rules, I think the second two pipelines have been preserved because they are flagged as the "latest" builds of open MRs.
So, in principle, they should be deleted automatically after the MRs are closed. But we should keep an eye on that and verify that the deletion actually happens.