[02:07] <pieq> Morning!
[06:08] <cpaelzer> doko: I doubt that you had a chance to look at bug 1843394 while travelling
[06:08] <cpaelzer> doko: but if you have some time for me to visit after you've fully arrived here, let me know
[06:09] <cpaelzer> I'd appreciate having at least someone (who is somewhat happy with assembly) to talk about this issue
[06:11] <cpaelzer> I have realized we have a team sync later today; we might sit together a bit after the actual session is done if we had no chance before that
[07:13] <Laney> ddstreet: yes, it's a long standing problem but we've never managed to understand it or catch it happening :(
[07:14] <Laney> a while ago with Stéphane we fixed up all the settings we could find but it didn't solve the problem
[08:07] <robert_ancell> RAOF, I have a bunch of gnome-software SRUs - could you help get them out of the queue?
[08:14] <RAOF> Sure, after my current thing
[08:21] <robert_ancell> RAOF, thanks
[10:06] <RAOF> cjwatson: So, curious findings in the "why does mir now fail to build on 16.04" hunt. It doesn't seem to be a kernel bug (the xenial release kernel fails in the same way)
[10:06] <RAOF> cjwatson: The most recent successful build was 2019-08-30, so quite recently.
[10:07] <cjwatson> Have you tried diffing successful/failing build logs to look for build-dep differences?
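A sketch of that diffing approach (the log files below are tiny stand-ins; in practice you would fetch the last successful and the failing build logs from the Launchpad build pages):

```shell
# Stand-in build logs; real ones come from Launchpad's build pages.
printf 'Setting up gcc\nSetting up libfoo\n' > buildlog-good.txt
printf 'Setting up gcc\nSetting up libfoo\nSetting up makedev\n' > buildlog-bad.txt

# sbuild/Launchpad logs include apt's "Setting up <pkg>" line for every
# build-dependency installed; diffing those lines surfaces any package
# present in only one of the two builds (here, makedev).
grep '^Setting up ' buildlog-good.txt | sort > good-pkgs.txt
grep '^Setting up ' buildlog-bad.txt | sort > bad-pkgs.txt
diff -u good-pkgs.txt bad-pkgs.txt || true
```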
[10:07] <RAOF> The initial part of the build shows some differences, the most suspicious of which is makedev.
[10:07] <RAOF> (The changelog for which is "don't populate /dev in a container")
[10:08] <RAOF> However: That was uploaded in 2017!?
[10:08] <cjwatson> Hm, we did get rid of makedev from the chroots IIRC?  Or maybe add it
[10:08] <cjwatson> I remember noting that but thinking nothing could possibly care
[10:08] <cjwatson> It could be that
[10:08] <RAOF> It doesn't appear in the successful build from 2019-08-30, does in the failing builds. Has the chroot lost an apt pin or something?
[10:09] <cjwatson> Can't look right now but I think that could be a lead
[10:09] <cjwatson> We upgraded chroots in that window, switching from artisanally-crafted chroots to ones built by livecd-rootfs
[10:09] <RAOF> Ooooh.
[10:09] <RAOF> I'll try to run a build with makedev pinned.
[10:10] <cjwatson> Pinning is a red herring
[10:10] <cjwatson> It's not about version, it'll be about whether it's present or not
[10:10] <cjwatson> I forget which right now, in the middle of a retro
[10:10] <RAOF> Ok.
[10:28]  * RAOF wonders how Launchpad has generated a diff from gnome-software 3.31.2-0ubuntu1, which as far as I can tell exists nowhere?
[11:30] <cjwatson> RAOF: sure does, was in disco-proposed
[11:30] <cjwatson> I guess it's maybe still the highest version
[11:30] <cjwatson> (see https://launchpad.net/ubuntu/+source/gnome-software/+publishinghistory)
[11:31] <RAOF> cjwatson: Aha! I was looking at the source page, which doesn't list that (because it was deleted)
[11:31] <RAOF> It is unhelpful for launchpad to generate diffs against deleted uploads 😬
[11:35] <cjwatson> Right, so the xenial chroot change did in fact add makedev, on the grounds that it was Priority: required in xenial.
[11:35] <cjwatson> Hmm.
[11:36] <cjwatson> Apparently it was manually purged from the old chroots (I can see because they had /root/.bash_history in them ...)
[11:40] <cjwatson> So uh I guess https://paste.ubuntu.com/p/QKGNWTNPqH/ ?
[11:43] <cjwatson> https://code.launchpad.net/~cjwatson/livecd-rootfs/+git/livecd-rootfs/+merge/372869
[12:33] <ddstreet> Laney re: autopkgtest-cloud mojo-juju-2 branch, was that the right branch you suggested i use?  it seems a bit better but still doesn't work
[12:33] <Laney> doesn't work in what way?
[12:33] <ddstreet> latest failure for me is because yaml in the unit can't find yaml.CSafeLoader
[12:34] <ddstreet> seems the venv uses a different yaml than the packaged system one
[12:34] <Laney> is that a failure from mojo or juju?
[12:35] <ddstreet> the juju unit
[12:35] <Laney> good news is I'm going to try to deploy that on the staging cloud this week
[12:35] <Laney> so maybe I'll see that too and fix it
[12:35] <ddstreet> well not juju unit, one of the deployed units
[12:36] <Laney> fwiw the dependencies are in charms/bionic/autopkgtest-cloud-worker/layer.yaml
[12:41] <ddstreet> yep i updated that already to include python3-pygit2 which was missing, and python3-yaml, but even with python3-yaml installed it doesn't have the CSafeLoader as it seems it's not built correctly with cython and/or libyaml
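For reference, an assumed sketch of what that dependency change would look like (the actual file contents are not quoted in the log; this follows the standard charm-tools `basic` layer convention of listing apt packages under `options.basic.packages`):

```yaml
# Hypothetical excerpt of charms/bionic/autopkgtest-cloud-worker/layer.yaml;
# the "basic" layer installs the packages listed here at deploy time.
includes: ['layer:basic']
options:
  basic:
    packages:
      - python3-pygit2
      - python3-yaml
```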
[12:41] <ddstreet> anyway
[12:41] <ddstreet> Laney so there is no branch that currently is known to work to deploy it?
[12:43] <Laney> I mean the one I gave you is known to work to deploy for me and juliank
[12:43] <Laney> sorry that it's not working for you, but it is not known that it doesn't as far as I'm concerned
[12:43] <ddstreet> well, for production you mean?
[12:43] <ddstreet> or it works for you for devel?
[12:43] <Laney> like I say hopefully I'll get to it this week
[12:44] <Laney> the goal is to re-deploy autopkgtest.ubuntu.com with this new spec
[12:44] <juliank> Laney: I was wondering if we wanted to schedule something / get together for doing the staging deploy
[12:44] <Laney> currently it is using what's in master, but I wouldn't say that is very advisable for you to try to deploy
[12:44] <Laney> juliank: yeah, later on in the week sounds good
[12:45] <Laney> we should go sit near an IS person to do that 😈
[12:45] <ddstreet> that's strange that it works for both of you but not when i try :(  i suppose it doesn't make much sense for me to open an MR for anything i fix, since it works for you already?
[12:45] <juliank> Laney: +1
[12:45] <Laney> missing dependencies would be good to add, sure
[12:46] <Laney> perhaps something moved on in bionic in the meantime and we'd see if we tried again now
[12:46] <ddstreet> ok
[12:46] <juliank> I did deploy from an eoan host fwiw
[12:46] <Laney> shouldn't matter if it's a hook error from inside one of the units
[12:47] <juliank> yaml.CSafeLoader exists in both eoan and bionic
[12:47] <ddstreet> juliank yeah, but not with python3-yaml inside the unit's venv :(
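One way to probe this (a sketch, not from the log) is to check for PyYAML's libyaml C extension inside the venv, and to fall back to the pure-Python loader when the C one is missing:

```python
import importlib.util

# "_yaml" is the extension module PyYAML installs when it was built
# against libyaml; without it there is no CSafeLoader.
have_libyaml = importlib.util.find_spec("_yaml") is not None

try:
    import yaml
    # Fall back to the pure-Python SafeLoader in environments (like a
    # charm venv) whose PyYAML was built without libyaml support.
    Loader = getattr(yaml, "CSafeLoader", yaml.SafeLoader)
except ImportError:
    # No PyYAML at all in this environment.
    yaml = None
    Loader = None

print("libyaml available:", have_libyaml)
```

Running this inside the unit's venv versus against the system python3-yaml would show whether the two really differ.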
[12:47] <Laney> what venv?
[12:47] <Laney> wait, is it one that juju itself is using?
[12:47] <ddstreet> charm venv
[12:48] <Laney> none of those on our side I don't think
[12:48] <Laney> welllllll, probably more efficient for us to try this ourselves
[12:48] <Laney> thanks for raising it
[12:48] <ddstreet> yeah
[12:48] <ddstreet> sure
[12:48] <ddstreet> np
[12:48]  * juliank certainly hasn't tried it this month :D
[12:49] <ddstreet> btw not sure if you're using canonistack (i assume not) but it's having trouble on some archs currently
[12:50] <juliank> Laney: _now_ I just got the notification from your first sentence where you highlighted me
[12:50] <juliank> super odd
[12:50] <Laney> O_O
[12:50] <juliank> It's being pushed from weechat to pushbullet to google to chrome
[12:50] <juliank> and well, the phone
[14:15] <RAOF> robert_ancell: Enjoy your gnome-software.
[14:16] <robert_ancell> om nom. So tasty.
[15:13] <rbasak> ahasenack: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940582
[15:13] <rbasak> Wrong bug, sorry
[15:13] <rbasak> ahasenack: https://bugs.launchpad.net/ubuntu/+source/apt/+bug/1429285