File README-constraints.txt of Package ceph-test
2023-06-13 - Tim Serong <tserong@suse.com>
Ceph needs plenty of disk space and RAM in order to build. To set
minimum requirements for these, we're using #!BuildConstraint directives
in ceph.spec and ceph-test.spec. We were previously using a _constraints
file, but that did not always work correctly with _multibuild.
For more information about #!BuildConstraint directives see
https://github.com/openSUSE/obs-docu/pull/285, and in particular Darix's
comment that _constraints and #!BuildConstraint should not be mixed.
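For illustration, the directives in the spec files should look roughly
like this (syntax as documented in the obs-docu pull request above; the
values shown here are the ceph minimums described later in this file):

```
#!BuildConstraint: hardware:disk:size unit=G 50
#!BuildConstraint: hardware:memory:size unit=G 8
```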
The #!BuildConstraint directives are added to the spec files automatically
by the pre_checkin.sh script. If the disk and memory constraints need to
be changed in future, adjust the variables in the pre_checkin.env file and
re-run pre_checkin.sh.
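A sketch of what that adjustment looks like (the variable names below are
hypothetical illustrations, NOT the actual contents of pre_checkin.env;
check that file for the real names - the values are the current minimums):

```
# Hypothetical pre_checkin.env contents - real variable names may differ
ceph_disk_size_gb=50
ceph_mem_size_gb=8
ceph_test_disk_size_gb=60
ceph_test_mem_size_gb=10
```

After editing, re-running pre_checkin.sh regenerates the #!BuildConstraint
directives in the spec files from these values.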
The current constraints are based on builds of ceph 16.2.7 on build.suse.de,
which showed the following resource usage (in MB):

ceph      aarch64  max disk: 41568  max mem: 13698  (on ibs-centriq-6:3  disk: 65536   mem: 18432)
ceph      x86_64   max disk: 41621  max mem: 9852   (on sheep74:2        disk: 51200   mem: 12500)
ceph      ppc64le  max disk: 42005  max mem: 8754   (on ibs-power9-10:1  disk: 61440   mem: 20480)
ceph      s390x    max disk: 40698  max mem: 8875   (on s390zl36:1       disk: 51200   mem: 10240)
ceph-test x86_64   max disk: 51760  max mem: 16835  (on sheep94:2        disk: 112640  mem: 16384)

Based on the above, and to hopefully provide a little wiggle room for
the future while at the same time not being too demanding of workers,
the minimum disk size is 50GB for ceph and 60GB for ceph-test. Memory
requirements remain at 8GB and 10GB respectively as they were before I
did the above tests - despite the memory usage shown above, AFAIK we
haven't run out of memory during builds, and this keeps the pool of
possible workers noticeably larger than it would be if we required 16GB.
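As a sanity check on the wiggle room claim, a small sketch of the disk
headroom the chosen minimums leave over the observed worst-case usage
(assuming the constraints are interpreted as GiB multiples, i.e.
1 GB = 1024 MB, which matches the worker disk figures above):

```python
# Disk headroom check: chosen minimum vs worst observed max across arches.
# Figures are taken from the table above (MB); 1 GB assumed = 1024 MB.
observed_max_disk_mb = {"ceph": 42005, "ceph-test": 51760}
minimum_disk_gb = {"ceph": 50, "ceph-test": 60}

for pkg, used in observed_max_disk_mb.items():
    limit = minimum_disk_gb[pkg] * 1024
    spare = limit - used
    print(f"{pkg}: {spare} MB spare ({(limit / used - 1) * 100:.0f}% headroom)")
# ceph: 9195 MB spare (22% headroom)
# ceph-test: 9680 MB spare (19% headroom)
```

So both packages end up with roughly a fifth of their disk allowance spare.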