I am trying to enable Salsa CI for the qt6-webengine package.

The package takes a long time to build, longer than the default Salsa
CI runner timeout of 3 hours. I learned how to work around this with
the pyinstaller package by creating my own GitLab Runner running on my
hardware, which allowed me to specify any length of job timeout.

But with qt6-webengine, I have run into another problem. Qt6-webengine
has a massive code base (a 500 MB tarball, 3 GB extracted). This leads
to 4.3 GB of artifacts that try to upload to Salsa after extracting
the source, which fails because it is larger than Salsa's 750 MiB
limit.

https://salsa.debian.org/qt-kde-team/qt6/qt6-webengine/-/jobs/7400053

My motivation in enabling Salsa CI for qt6-webengine relates to
efforts to provide better security support, which would benefit from
Salsa CI. Information can be found at:

https://salsa.debian.org/qt-kde-team/qt6/qt6-webengine/-/merge_requests/8

I understand why Salsa needs a limit on the uploading of artifacts.
My question is whether there is any way to work around this for
qt6-webengine. Is there some way I can store the artifacts on my
system, similar to how I can implement my own runner? Can I disable
the artifacts, or cache them locally if other tests need them? I don't
think we generally need the artifacts, just the output displayed for
each test.
On 09/04/2025 01:27, Soren Stoutner wrote:
> I am trying to enable Salsa CI for the qt6-webengine package.
>
> The package takes a long time to build, longer than the default
> Salsa CI runner timeout of 3 hours. I learned how to work around
> this with the pyinstaller package by creating my own GitLab Runner
> running on my hardware, which allowed me to specify any length of
> job timeout.
>
> But with qt6-webengine, I have run into another problem.
> Qt6-webengine has a massive code base (a 500 MB tarball, 3 GB
> extracted). This leads to 4.3 GB of artifacts that try to upload to
> Salsa after extracting the source, which fails because it is larger
> than Salsa's 750 MiB limit.
>
> I understand why Salsa needs a limit on the uploading of artifacts.
> My question is whether there is any way to work around this for
> qt6-webengine. Is there some way I can store the artifacts on my
> system, similar to how I can implement my own runner? Can I disable
> the artifacts, or cache them locally if other tests need them? I
> don't think we generally need the artifacts, just the output
> displayed for each test.
The 750 MiB figure is a warning from the extract-source [1] recipe
itself [2]. The actual failure is a timeout that comes later, likely
because the working dir is 4.3 GB.

Setting the variable 'SALSA_CI_MAX_ARTIFACTS_SIZE' may remove the
warning for you, but the upload will probably still time out.
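For what it's worth, such a variable override would normally go in the
package's debian/salsa-ci.yml. The sketch below is untested; the value
is a placeholder, and the units/format 'SALSA_CI_MAX_ARTIFACTS_SIZE'
expects should be checked against the pipeline source:

  # Sketch only: raise the artifact-size threshold in debian/salsa-ci.yml.
  # The value is a placeholder; check the Salsa CI pipeline docs for the
  # expected units/format of this variable.
  variables:
    SALSA_CI_MAX_ARTIFACTS_SIZE: "5G"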
At this massive size, you probably want to exclude the extracted
source from being uploaded as artifacts, so perhaps override the
recipe and exclude all paths except the log file? [3]

I don't know how extending works in the recipes well enough, so I
would play around with a new recipe 'qt6we-extract-source', put all of
'provisioning-extract-source' in it, and remove the 'artifacts' being
uploaded via this bit...
.provisioning-extract-source: &provisioning-extract-source
[...]
extends:
- .artifacts-default-expire <---- REMOVE ME
[...]
If that works, then I'd read up on extending and try to override 'artifacts-default-expire' on all recipes.
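As a rough, untested sketch of that idea: in the project-local
debian/salsa-ci.yml one could try redefining the extract-source job so
that only small log output survives as an artifact. The include URL
and job name follow the usual Salsa CI layout, but the artifact path
is a guess; check what the job actually writes before relying on it:

  # Untested sketch for debian/salsa-ci.yml: stop the multi-gigabyte
  # unpacked tree from being uploaded as an artifact.
  include:
    - https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/recipes/debian.yml

  extract-source:
    artifacts:
      # Keep only log output instead of the whole working directory.
      # The glob below is hypothetical; adjust to the job's real output.
      paths:
        - ${WORKING_DIR}/*.log
      expire_in: 1 day

In GitLab CI, keys redefined in the project file for a same-named job
take precedence over the included definition, so redefining 'artifacts'
here should supersede the recipe's artifact list; whether the timeout
goes away depends on the upload actually shrinking.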