[00:10] <ItzSwirlz> trying to figure out the cause of this: https://ubuntu-archive-team.ubuntu.com/cd-build-logs/ubuntucinnamon/mantic/daily-live-20230518.log
[00:14] <vorlon> ItzSwirlz: doesn't that show a successful build?
[00:15] <vorlon> ah you're just looking at the message in the log; well, the code currently tries both bzr and git
[00:15] <vorlon> because it doesn't have enough information to know which to use
[00:16] <vorlon> also it looks for platform.mantic under ~ubuntucinnamon-dev first, then falls back to ~ubuntu-core-dev (where it actually lives)
[00:17] <vorlon> this code could perhaps be improved to have better knowledge of where the seeds are and not have to try every combination with failures; but better would be to get all the flavors to migrate to git and drop the bzr code
[00:21] <ItzSwirlz> where am I using bzr atm?
[00:22] <ItzSwirlz> -meta uses regular vcs: https://github.com/ubuntucinnamon/ubuntucinnamon-meta/blob/aab8dd7ec216ad683b4cb8e37210597ea196be02/update.cfg#L16
[01:33] <vorlon> ItzSwirlz: this is the livecd-rootfs code, it's nothing specific to ubuntucinnamon
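The lookup order vorlon describes (both VCSes, flavor owner first, then ~ubuntu-core-dev as fallback) can be sketched roughly as below. This is a hypothetical illustration, not the actual livecd-rootfs code; it only prints the candidate locations in the order they would be tried, which is why a successful build can still log failed attempts.

```shell
#!/bin/sh
# Rough sketch of the seed-lookup fallback described above. NOT the
# real livecd-rootfs logic: it just enumerates the owner/VCS
# combinations in the order they would be probed.
find_seed_branch() {
    seed="$1"   # e.g. platform.mantic
    for owner in ubuntucinnamon-dev ubuntu-core-dev; do
        for vcs in bzr git; do
            echo "trying $vcs branch of $seed under ~$owner"
        done
    done
}

find_seed_branch platform.mantic
```

Each combination that is not the real location shows up as a failure in the log, which is the noise ItzSwirlz saw even though the build succeeded.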
[10:33] <slyon> jbicha: FYI: https://bugs.launchpad.net/ubuntu/+source/glib2.0/+bug/2026826
[10:33] -ubottu:#ubuntu-devel- Launchpad bug 2026826 in glib2.0 (Ubuntu) "glib2.0 (2.77.0 ) breaks Netplan build" [Undecided, New]
[10:36] <seb128> slyon, could you report the issue to glib upstream if you think it's a bug there?
[10:37] <seb128> slyon, and thanks for the report/tag. It wouldn't migrate anyway because of autopkgtests regressions but still better to be safe
[10:38] <slyon> seb128: I'm not sure if it's an upstream issue or something about the way our glib package is compiled... I didn't do any deep investigation yet (and don't have time for it right now). I just confirmed that everything works fine with GLib from mantic-release and wanted to escalate to the desktop team for further investigation
[10:39] <seb128> slyon, ack, do you have a specific example for the keyfile line break issue described?
[10:40] <seb128> slyon, also https://autopkgtest.ubuntu.com/packages/n/netplan.io/mantic/amd64 ... any idea why the netplan autopkgtests aren't seeing the issue?
[10:40] <slyon> seeing we're moving ahead of even Debian experimental, I thought it was worth bringing up with Jeremy.
[10:40] <seb128> ack
[10:40] <slyon> seb128: Yes, the broken keyfiles can be seen from netplan's build log. I will copy it to the bug report.
[10:41] <seb128> slyon, the issue is that the pkg-gnome team in Debian decided to keep unstable for bugfix updates only until the incoming point release
[10:41] <slyon> seb128: the autopkgtests are not using mantic-proposed, but rather mantic-release, I think?
[10:42] <seb128> slyon, the first entry on that page is 0.106.1-2 	glib2.0/2.77.0-0ubuntu1 	2023-07-10 23:13:20 UTC
[10:42] <slyon> Interestingly, it seems to pass on riscv64, though, but fail on any other arch
[10:43] <seb128> slyon, riscv is building with notests
[10:43] <slyon> oh, that explains the riscv situation!
[10:44] <seb128> slyon, so netplan -2 autopkgtests are green with glib 2.77.0, are the runtests not catching the issue or is only -3 failing?
[10:44] <seb128> let me trigger with -3 to see
[10:44] <slyon> Right... I think 0.106.1-2 was still compiled with the older glib, while 0.106.1-3 FTBFS with the new glib. GLib is a shared library, so I'm not sure this really explains the autopkgtest pass... :-/
[10:45] <seb128> well -3 fails to build, ignore that :p
[10:45] <slyon> seb128: right.
[10:45] <seb128> which glib it was compiled with shouldn't make a difference though
[10:45] <slyon> it's a build-time failure (dh_auto_test)
[10:45] <seb128> it's loading glib dynamically
[10:45] <slyon> when I try to compile Netplan -2 with new glib, it fails, too.
[10:46] <seb128> ack, I would just expect that if glib changed in a way that breaks keyfiles then installed tests would also fail somehow
[10:46] <seb128> if that glib version is installed
[10:47] <slyon> Yes, that's strange... Maybe something else in that build environment is messing things up. But OTOH, just downgrading the GLib packages fixes the build
[10:49] <seb128> slyon, ack, we will investigate, thanks for the report
[10:50] <slyon> thank you seb128
[10:50] <slyon> I've updated the bug report with an example keyfile
[10:51] <seb128> slyon, thanks!
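For readers following along: a GKeyFile is an INI-style format, so the kind of line-break breakage discussed above is easy to spot structurally. This is a hypothetical helper, not part of netplan or the bug report; it flags any line that is not blank, a comment, a group header, or a key=value pair, which is what a badly escaped embedded newline in a value would produce.

```shell
# Hypothetical structural check for an INI-style keyfile (illustration
# only, not from the bug report): report lines that are neither blank,
# comments, [group] headers, nor key=value pairs.
check_keyfile() {
    awk '
        /^[[:space:]]*$/       { next }   # blank line
        /^[#;]/                { next }   # comment
        /^\[[^]]+\]$/          { next }   # [group] header
        /^[^=[:space:]][^=]*=/ { next }   # key=value pair
        { print "malformed line " NR ": " $0; bad = 1 }
        END { exit bad }
    ' "$1"
}
```

A raw (unescaped) newline inside a value splits it into a continuation line with no `=`, which this check reports.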
[11:12] <ginggs> there was a late (after FF?) glib2.0 update in lunar that also broke some builds
[11:12] <ginggs> e.g. LP: #2019852
[11:12] -ubottu:#ubuntu-devel- Launchpad bug 2019852 in nbd (Ubuntu Lunar) "nbd-server hangs after fork" [High, Fix Released] https://launchpad.net/bugs/2019852
[11:38] <seb128> ginggs, right, that was a different issue though (also that's one of the reasons we are landing the new series earlier this cycle, avoiding issues right around feature freeze)
[11:39] <ginggs> seb128: oh sure, not the same issue.  but a test rebuild of reverse-build-deps before bumping glib2.0 would be nice
[11:52] <seb128> ginggs, that's something we could organize if you/the release team think there is value; I'm unsure we need a test rebuild specific to it so early in the cycle?
[11:53] <seb128> the build issues will get flagged with one of the regular test rebuilds no?
[11:55] <ginggs> seb128: yes, they will
[12:04] <ginggs> the FTBFS report only says the package FTBFS, it doesn't say glib2.0 broke this build
[12:06] <ginggs> the 2nd lunar test rebuild was after the glib2.0 upload, and two failures that I know of (nbd and thrift) appear in that report
[12:07] <ginggs> https://people.canonical.com/~ginggs/ftbfs-report/test-rebuild-20230324-lunar-lunar.html
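The attribution gap ginggs mentions (the report says a package FTBFS, not that glib2.0 broke it) could in principle be narrowed by cross-referencing build dates against the suspect upload. A tiny hypothetical helper, with a made-up tab-separated input format rather than the real report's:

```shell
# Hypothetical helper: given lines of "package<TAB>build-date" on
# stdin (made-up format, not the actual FTBFS report), flag builds
# that ran on or after a given upload date as candidates for
# "possibly broken by that upload".
flag_after() {
    upload_date="$1"   # e.g. 2023-03-20
    awk -F'\t' -v cutoff="$upload_date" \
        '$2 >= cutoff { print $1 " (built " $2 ", after " cutoff ")" }'
}
```

ISO dates compare correctly as strings, so a plain lexicographic comparison is enough here; the result is only a candidate list, not proof of causation.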
[15:19] <dbungert> @pilot in
[17:47] <wouter> ginggs: whoa, nbd was stuck for 18 hours?! Can I get the build log somewhere?
[17:49] <wouter> oh, no, nvm -- that's the pre-fix version
[17:49] <wouter> it's expected to do that, you just need the update which you helped me with a while back
[17:50] <ginggs> wouter: exactly, yes
[17:53] <wouter> ginggs: sorry for the noise -- I have a highlight configured on NBD, so I noticed ;)
[17:53] <ginggs> wouter: no worries!
[17:54] <ginggs> wouter: of course an autopkgtest in nbd would have caught that issue early ;)
[17:56] <wouter> ginggs: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1036918 ;-)
[17:56] -ubottu:#ubuntu-devel- Debian bug 1036918 in debvm "debvm: manual mounting of root image" [Wishlist, Open]
[17:58] <ginggs> wouter: for a start, couldn't an autopkgtest just run the same tests that are run at build-time?
[17:59] <wouter> "it's complicated". In theory yes, in practice that would require a wholesale refactoring of the test suite
[17:59] <wouter> it's a good idea, for sure, but not as easy as it seems
[17:59] <wouter> and yes, that refactoring is planned, but still
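For reference, the "just rerun the build-time tests as an autopkgtest" idea usually amounts to a stanza along these lines. This is a sketch only, not nbd's actual packaging, and as wouter notes the test suite would need refactoring before something like it could work:

```
# debian/tests/control (sketch; not nbd's actual packaging)
Tests: upstream-test-suite
Depends: @, @builddeps@
Restrictions: allow-stderr
```

`@` pulls in the package's own binaries and `@builddeps@` its build dependencies, which is typically what a rerun of the build-time suite needs.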
[19:19] <dbungert> @pilot out
[20:59] <ahasenack> dbungert: ocaml is quite a circular mess, did you conclude anything?
[21:00] <dbungert> ahasenack: yep, making progress.  2 things needed retest and moved on, the next problem is LP: #2027333
[21:00] -ubottu:#ubuntu-devel- Launchpad bug 2027333 in ocaml-dune (Debian) "ocaml-dune: Please add ocaml:Provides to d/control" [Undecided, New] https://launchpad.net/bugs/2027333
[21:01] <dbungert> I sent a patch to Debian on the last part, we'll see if that gets picked up
[21:01] <ahasenack> ok
[21:44] <Unit193> ahasenack: As promised, if it's not merged by the time a new upstream comes along I'll merge it, but I was looking over https://salsa.debian.org/debian/wireguard/-/merge_requests/6/diffs#diff-content-867caf2861d36d708e7ffe26f3783111183154cd again and it'd be more useful I think to echo those hash and "must be equal"/"must be different" lines to STDERR and only emit them on error, to make it more
[21:44] <Unit193> obvious.  I care less about the other autopkgtests since they won't run in Debian, but at the same time having the errors in STDERR makes sense there too.
[21:44] -ubottu:#ubuntu-devel- Merge 6 in debian/wireguard "New DEP8 and build-time tests" [Opened]
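Unit193's suggestion boils down to keeping the comparison quiet on success and sending the diagnostic lines to stderr only on failure. A hypothetical sketch (the function and argument names are made up, not the merge request's actual code):

```shell
# Sketch of the "diagnostics to stderr, only on error" suggestion.
# Hypothetical stand-in for the test's real hash comparison.
compare_hashes() {
    hash_a="$1"; hash_b="$2"; relation="$3"   # relation: equal|different
    case "$relation" in
        equal)     [ "$hash_a" = "$hash_b" ]  && return 0 ;;
        different) [ "$hash_a" != "$hash_b" ] && return 0 ;;
    esac
    # Only reached on mismatch: diagnostics go to stderr, not stdout.
    echo "hash A: $hash_a" >&2
    echo "hash B: $hash_b" >&2
    echo "hashes must be $relation" >&2
    return 1
}
```

Passing runs then produce no output at all, and a failing run's stderr contains everything needed to debug it, which reads better in autopkgtest logs.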
[21:45] <ahasenack> lemme switch context
[21:45] <Unit193> I was just trying to find a time when you were online, it's not time sensitive at all. :P
[21:45] <ahasenack> oh, I had totally forgotten about that PR
[21:46] <Unit193> I hadn't!
[21:46] <ahasenack> can you add that as a comment? Then I'll get an email and won't forget to act on it
[21:46] <Unit193> dkg seems very inactive there, otherwise I'd follow up about renaming the source package, dropping the meta, and dropping -modules and -dkms from recommends.
[21:47] <Unit193> Sure.
[21:47] <ahasenack> thx
[21:47] <ahasenack> I'm on +1 maintenance this week, so I will probably act on it next week