[09:39] -queuebot:#ubuntu-release- New: accepted photoqt [sync] (jammy-proposed) [2.5-4]
[09:39] -queuebot:#ubuntu-release- New: accepted python-b2sdk [amd64] (jammy-proposed) [1.3.0-2]
[09:43] -queuebot:#ubuntu-release- New binary: photoqt [amd64] (jammy-proposed/none) [2.5-4] (no packageset)
[09:44] -queuebot:#ubuntu-release- New binary: photoqt [ppc64el] (jammy-proposed/none) [2.5-4] (no packageset)
[09:49] -queuebot:#ubuntu-release- New binary: photoqt [armhf] (jammy-proposed/none) [2.5-4] (no packageset)
[09:49] -queuebot:#ubuntu-release- New binary: photoqt [s390x] (jammy-proposed/none) [2.5-4] (no packageset)
[09:51] -queuebot:#ubuntu-release- New binary: photoqt [arm64] (jammy-proposed/none) [2.5-4] (no packageset)
[12:45] oh, somebody just promoted llvm-14 to main
[12:52] hi, if someone from the release team is around and has some time, please take a look at the email I just sent to ubuntu-devel about the default llvm change from 13 to 14 and my postgresql upload, which was intended to fix the FTBFS that resulted from the llvm change
[15:31] jbicha: thanks :)
[17:04] Hi all! Is it OK to land Qt 5.15.3 today? https://bileto.ubuntu.com/#/ticket/4803
[17:50] mitya57: +1 from me. the longer we wait, the further back in the test queues it ends up
[18:58] mitya57: +1, clock is ticking.
[18:58] Ok, doing it now
[18:59] please do not launch PPA autopkgtests, they don't work
[19:01] well on amd64 they might be fine
[19:04] (network times out)
[19:12] I don't understand why arm64, armhf, s390x are not moving
[19:14] I didn't launch them.
[19:20] mitya57: this was a general PSA :)
[19:24] Ok :)
[19:26] did people run like 1000 tests on arm64, armhf, s390x but not others?
[19:27] because this doesn't make sense, there were 1000 passing tests in a day, but the queue size increased
[19:28] but why only those and not ppc64el?
[19:28] or amd64, but to be fair that has a ton of workers
[19:47] -queuebot:#ubuntu-release- New: accepted photoqt [amd64] (jammy-proposed) [2.5-4]
[19:47] -queuebot:#ubuntu-release- New: accepted photoqt [armhf] (jammy-proposed) [2.5-4]
[19:47] -queuebot:#ubuntu-release- New: accepted photoqt [s390x] (jammy-proposed) [2.5-4]
[19:47] -queuebot:#ubuntu-release- New: accepted photoqt [arm64] (jammy-proposed) [2.5-4]
[19:47] -queuebot:#ubuntu-release- New: accepted photoqt [ppc64el] (jammy-proposed) [2.5-4]
[19:48] this does not look right https://paste.ubuntu.com/p/38rWcxS7sm/
[19:51] juliank: Ew, how did that happen?
[19:52] it's not clear
[19:52] according to grafana, 1000 requests passed successfully
[19:52] * Eickmeyer blinks in confusion
[19:52] but it also shows the queue did not get smaller
[19:53] -queuebot:#ubuntu-release- New sync: gnome-shell-pomodoro (jammy-proposed/primary) [0.20.0-3]
[19:57] we must assume it's all upstream or PPA queue things
[20:01] really should stop accepting any test requests
[20:09] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep s390x | grep ppa | wc -l
[20:09] 777
[20:10] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep s390x | wc -l
[20:10] 1072
[20:10] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep ppc64el | grep ppa | wc -l
[20:10] 604
[20:10] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep ppc64el | grep -v ppa | wc -l
[20:10] 1719
[20:10] so yeah um
[20:10] 295 non-PPA runs vs 777 for s390x
[20:11] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep arm64 | grep -v ppa | wc -l
[20:11] 146
[20:11] ubuntu@juju-4d1272-prod-proposed-migration-3:~$ journalctl --since=today -u download-results.service | grep arm64 | grep ppa | wc -l
[20:11] 609
[20:11] same for arm64
[20:11] if you look at ppc64el this was much more fairly distributed
[20:13] So do we kick something or just wait it out?
[20:16] There were too many tests requested for PPA stuff on those archs, they should catch up now
[20:16] But this is all a bit problematic
[20:17] timing wise with the beta that is
[20:17] Right.
[20:18] we can do ~500-1k tests per day I think so the queues need 7-14 days to empty
[20:19] Yep, and with the beta approaching on Thursday that makes it a bit tricky for those packages to make it.
[20:19] juliank: yes, and DDOS'ing the test architecture with ppc64el rebuilds 3 days before beta freeze may not have been the best option
[20:19] RikMills: not my call
[20:20] not blaming you. it just seems a bit delf defeating
[20:20] *self
[20:21] can't do it much earlier either though, as you want to wait for stuff to be rebuilt naturally vs wasting time on a rebuild
[20:21] in the end, all I really care about migrating before the beta is ubiquity 22.04.9
[20:21] the question is whether some of those rebuilds ended up with unchanged binaries
[20:21] like -defaults packages
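
(For reference: a minimal shell sketch of the per-architecture counting juliank did by hand above, looped over all five architectures. The unit name download-results.service and the --since=today filter come straight from the commands quoted in the log; the architecture list is assumed from the same discussion, and this is only an illustration, not part of the autopkgtest-cloud tooling.)

    #!/bin/sh
    # Count today's autopkgtest result downloads per architecture, split into
    # PPA-triggered and non-PPA runs, as in the journalctl one-liners above.
    log=$(journalctl --since=today -u download-results.service)
    for arch in amd64 arm64 armhf ppc64el s390x; do
        total=$(printf '%s\n' "$log" | grep -c "$arch")
        ppa=$(printf '%s\n' "$log" | grep "$arch" | grep -c ppa)
        echo "$arch: $ppa ppa-triggered / $((total - ppa)) non-ppa"
    done
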
[20:21] I'm just hoping my MR for bug 1966523 makes it in on time per mine and vorlon's conversation.
[20:21] Bug 1966523 in Ubuntu CD Images "Ubuntu Studio ISOs are reaching hard ISO 9660 limit" [Undecided, New] https://launchpad.net/bugs/1966523
[20:21] perhaps I can request a skiptest on that in the morning
[20:22] some were rebuilt because they produce per-arch binaries, but they're really just metapackages
[20:22] If you skiptest please also open an auto-package-testing issue to remove tests
[20:22] (if it's many tests)
[20:23] juliank: it is literally about 4 tests, just well down in the queue
[20:23] we lost ~ a day or so getting the cloud back up to where we are now
[20:23] as that took 2 days IIRC for 1 out of 2 clouds
[20:24] https://people.canonical.com/~ubuntu-archive/proposed-migration/update_excuses.html#ubiquity
[20:24] unfortunately networking is not fully back, firewall rules are broken or routing or whatever
[20:24] so if you have PPAs, or upstream systemd ones, requests to launchpad or salsa.debian.org just time out
[20:24] with 50/50 chance
[20:25] I think we might want to move the tests in the normal queue to the huge queue if they don't have a requester; then people can request urgent tests
[20:25] and they'll run sorta immediately
[20:26] we should also be using priority queues, and move stuff higher up if triggered by main packages vs universe ones
[20:26] juliank: that would be a nice move
[20:27] sorta plug into launchpad's build scoring and then transfer the build score to the triggers
[20:27] so more important packages run earlier
[20:28] I can also increase throughput a bit for short tests I suppose
[20:28] currently we start one test per cloud every 30s
[20:28] when we are looking for installer fixes etc to migrate, we definitely don't want those queued behind trivial things
[20:29] s/trivial/less_critical
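
(For reference: a rough shell sketch of the prioritisation idea floated above -- unrequested runs parked in a "huge" queue, the rest ordered so main-triggered requests outrank universe ones. The numeric scores, the input format, and the prioritise helper are invented for illustration only; the actual suggestion was to reuse Launchpad's build scores for the trigger packages, which this merely approximates.)

    #!/bin/sh
    # Illustrative sketch, not autopkgtest-cloud code.
    # Input lines: "<package> <trigger-component> <requester-or-->"
    prioritise() {
        while read -r pkg component requester; do
            case "$component" in
                main|restricted) score=2500 ;;   # stand-in for a build score
                universe)        score=1000 ;;
                *)               score=500  ;;
            esac
            if [ "$requester" = "-" ]; then
                queue=huge          # automatic, unrequested runs wait here
            else
                queue=normal        # human-requested runs go out first
            fi
            printf '%s %s %s\n' "$queue" "$score" "$pkg"
        done | sort -k1,1r -k2,2rn  # "normal" before "huge", highest score first
    }

    prioritise <<'EOF'
    photoqt universe -
    ubiquity main some-developer
    postgresql-14 main -
    EOF
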