[08:01] <tomwardill> Going to start landing the yarn/sass upgrades and testing on dogfood
[08:01] <tomwardill> object now if this is a bad plan :)
[08:48] <SpecialK|Canon> SGTM!
[09:40] <tomwardill> okay, last buildbot run failed with a bunch of JS timeouts, going to retry
[10:07] <tomwardill> bah, failed again
[10:07] <tomwardill> cjwatson: is there anything special I need to do to ensure the nodeJS version is upgraded in the buildbot base images?
[10:08] <cjwatson> tomwardill: https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot#Updating_the_LXC_environment
[10:08] <cjwatson> tomwardill: You'll need to ask IS to do that
[10:08] <tomwardill> aha, that would probably be the cause
[10:08] <cjwatson> (On both workers)
[10:40] <tomwardill> looking more promising now
[10:57] <tomwardill> \o/ success
[10:57]  * tomwardill lands more things
[12:42] <tomwardill> okay, still getting timeouts
[12:42] <tomwardill> got an actual problem then
[13:28] <tomwardill> cjwatson: how do I run these tests via firefox?
[13:28] <cjwatson> find the absolute path to the associated .html, paste that in
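For anyone reading along: the JS suites are plain HTML harnesses, so they can be run in a desktop browser by opening the test page directly. A minimal sketch, with a hypothetical test path:

    # Find the HTML harness for the suite of interest (path below is illustrative)
    find lib -name 'test_*.html'
    # Open the absolute path in firefox; X11 forwarding works for remote containers
    firefox "file://$PWD/lib/lp/app/javascript/tests/test_example.html"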
[13:29] <cjwatson> [4/4] Building fresh packages...
[13:29] <cjwatson> [1/1] ⠁ node-sass: g++ '-DNODE_GYP_MODULE_NAME=libsass' '-DUSING_UV_SHARED=1' '-DUSING_V8_SHARED=1' '-DV8_DEPRECATION_WARNINGS=1' '-D_LARGEFILE_SOURCE' '-D_FILE_OFFSET_BITS=64' '-DLIBSASS_VERSION="3.5.5"' -I/home/cjwatson/.node-g
[13:29] <cjwatson> tomwardill: ^- is node-sass meant to be doing that?  I thought you'd squashed that
[13:33] <tomwardill> yes, I had/have
[13:34] <tomwardill> cjwatson: have you got the latest source-dependencies?
[13:34] <cjwatson> Yes
[13:34] <cjwatson> After an initial false start anyway
[13:34] <cjwatson> And I upgraded my base container
[13:35] <cjwatson> Maybe a cache that defeats your attempts to tell whether your fix worked?
[13:36] <tomwardill> cjwatson: can you add --verbose to the end of the yarn line in the css_combine Makefile
[13:36] <tomwardill> it should output whether it's found the .node binding before it starts compilation
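A sketch of the change being asked for here, assuming the css_combine target invokes yarn directly (the exact Makefile contents are an assumption):

    # In the Makefile, extend the css_combine recipe from something like
    #     yarn install
    # to
    #     yarn install --verbose
    # then filter the (extremely long) output for the binding check:
    make css_combine 2>&1 | grep -i 'binding'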
[13:55] <cjwatson> Oh.  That overflowed my terminal's scrollback
[13:55]  * cjwatson tries again with |less
[13:56] <tomwardill> oh, yes
[13:56] <tomwardill> it is very verbose
[13:56] <tomwardill> but somewhere in the middle will be the output about the binding
[13:58] <tomwardill> my local lxd has decided it's going to drop all of its internet connection
[13:58] <tomwardill> which is not ideal
[14:00] <cjwatson> verbose 3.23 Copying "/home/cjwatson/.cache/yarn/v6/npm-node-sass-4.14.1-99c87ec2efb7047ed638fb4c9db7f3a42e2217b5-integrity/node_modules/node-sass/binding.gyp" to "/home/cjwatson/src/canonical/launchpad/git/launchpad/yarn/node_modules/node-sass/binding.gyp".
[14:00] <cjwatson> looks possibly relevant
[14:03] <cjwatson> Hm, maybe, not sure
[14:04] <cjwatson> [4/4] Building fresh packages...
[14:04] <cjwatson> verbose 6.952 node-sass build Binary found at /home/cjwatson/src/canonical/launchpad/git/launchpad/download-cache/yarn/node-sass-4.14.4-linux-x64-57_binding.node
[14:04] <cjwatson> verbose 7.674 Binary found at /home/cjwatson/src/canonical/launchpad/git/launchpad/download-cache/yarn/node-sass-4.14.4-linux-x64-57_binding.node
[14:04] <cjwatson> That file shows as modified in git
[14:05] <tomwardill> ah, so if it ever compiles it, it will replace your existing one
[14:05]  * cjwatson copies it aside and restores it
[14:05] <tomwardill> because input and output are synonymous, apparently
[14:05] <cjwatson> Just going to try nuking ~/.cache/yarn/
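Clearing the cache can be done by hand or via yarn itself; a minimal sketch:

    # Remove yarn's global package cache so the next install re-fetches everything
    rm -rf ~/.cache/yarn
    # or, equivalently:
    yarn cache clean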
[14:11] <cjwatson> Ah of course
[14:11] <cjwatson> My container is 32-bit
[14:11] <tomwardill> ah
[14:11] <tomwardill> yes, that would do it
[14:12] <cjwatson> But it helpfully writes it back to the path you gave anyway, so replaces *-x64-* with a 32-bit build
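A quick way to catch this class of problem: the filename claims an architecture, but file(1) reports what the ELF object actually is.

    # The name says x64; the contents may disagree
    file download-cache/yarn/node-sass-*-linux-x64-*_binding.node
    # Output such as "ELF 32-bit LSB shared object, Intel 80386, ..." would
    # confirm a 32-bit build has overwritten the 64-bit binding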
[14:13] <cjwatson> tomwardill: So um where did the 4.14.4 bit come from?
[14:14] <cjwatson> https://www.npmjs.com/package/node-sass <- newest version is 4.14.1
[14:15] <tomwardill> yes, I just looked at that and realised my mistake there
[14:15]  * tomwardill gives up on today, it's clearly faulty and needs to be returned
[14:16] <tomwardill> cjwatson: okay, now I've finally managed to test it (via X11 forwarding firefox), the JS tests still pass locally
[14:18] <cjwatson> tomwardill: I'd suggest building two worktrees, one from before it started failing, one from current master
[14:18] <cjwatson> tomwardill: And then diff the two after running 'make' in each
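A sketch of that comparison using git worktrees (the known-good commit id is a placeholder):

    # One tree at the last known-good revision, one at current master
    git worktree add ../lp-good <known-good-commit>
    git worktree add ../lp-master master
    # Build both, then compare the generated output
    make -C ../lp-good
    make -C ../lp-master
    diff -r --exclude=.git ../lp-good ../lp-master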
[14:18] <cjwatson> I'll fix up the versions and arches and stuff
[14:37] <cjwatson> https://code.launchpad.net/~cjwatson/lp-source-dependencies/+git/lp-source-dependencies/+merge/384913 and https://code.launchpad.net/~cjwatson/launchpad/+git/launchpad/+merge/384914
[15:00] <tomwardill> cjwatson: diff doesn't show anything immediately obvious and I can't make the JS tests fail locally
[15:09] <tomwardill> cjwatson: I'm out of ideas for how to reproduce this
[15:09] <cjwatson> Trying to see if I can find anything
[15:11]  * tomwardill fetches more biscuits
[15:23] <cjwatson> tomwardill: Ah look
[15:23] <cjwatson> tomwardill: http://lpbuildbot.canonical.com/builders/lp-devel-xenial/builds/1295/steps/shell_8/logs/stdio
[15:23] <cjwatson> From the build step before the test step
[15:23] <tomwardill> hah
[15:23] <cjwatson> tomwardill: So same problem as in my setup before the MPs I posted above
[15:23] <tomwardill> LOOK A GIANT BLOCK OF RED TEXT
[15:23] <cjwatson> Except it doesn't have the bits necessary to build it
[15:23] <tomwardill> serves me right for just ctrl-f'ing for 'Failure'
[15:24] <cjwatson> Or rather, it tries to connect to the network and network says no
[15:24] <tomwardill> right, so it's a 32-bit container
[15:25] <cjwatson> It's not often you need to look at the output of the build step, but when you've just frobbed the build system and things are being weird ...
[15:25] <tomwardill> indeed
[15:25] <tomwardill> I've just +1'd your MPs
[15:26] <tomwardill> are we actually deployed in a 32-bit world then?
[15:26] <cjwatson> No
[15:26] <cjwatson> I suspect buildbot does this because it's doing something like 20 tests in parallel and running them in 32-bit mode uses less memory
[15:26] <tomwardill> ah
[15:27] <tomwardill> yeah, that would make sense
[15:27] <cjwatson> I admit I hadn't actually realised buildbot was 32-bit (or maybe I knew once and forgot)
[15:29]  * cjwatson adds a note about that to https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot
[15:30] <cjwatson> Also I kind of hate that the test proceeds even though the build fails.  If somebody wants to figure out how to fix that, it might be a good intro to lpbuildbot
[15:30] <tomwardill> cjwatson: I've just mentally added it to my 'work out buildbot and move it to bionic/focal' task
[15:30] <tomwardill> which I might do next; I think it's annoyed me enough that I'll have a look at it now :)
[15:30] <cjwatson> It is *meant* to halt the build on failure
[15:31] <cjwatson> Look at bzrbuildbot.shell.ShellCommand (yes, bzrbuildbot is now horribly misnamed but ...)
[15:31] <tomwardill> can you point me at the code/setup/docs for buildbot?
[15:31] <cjwatson> Public branch in lp:lpbuildbot, some private customisations in lp:~canonical-launchpad-branches/lpbuildbot/production (make changes in the former if you can - it gets merged into the latter)
[15:32] <cjwatson> I think all other documentation that exists is probably in https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot
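For context on the halt-on-failure question: upstream buildbot's ShellCommand step accepts a haltOnFailure flag, so a reasonable first check is how the subclass mentioned above handles it. A rough starting point, assuming the branch is still bzr-hosted:

    bzr branch lp:lpbuildbot
    cd lpbuildbot
    # See whether the build step's ShellCommand subclass sets haltOnFailure
    grep -rn 'haltOnFailure' .
    grep -rn 'class ShellCommand' bzrbuildbot/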
[15:33] <cjwatson> Oh. Actually the bug is probably in lp-setup-lxc-build
[15:33] <cjwatson> Which is in the most horribly confusing bit of the whole setup
[15:33] <tomwardill> righto :)
[15:34] <cjwatson> When parallel buildbot tests were being developed, there was an effort to get all the code needed for that sort of thing out of puppet and into an "lpsetup" package
[15:34] <cjwatson> And this was sort of 80% done
[15:35] <cjwatson> It's in lp:lpsetup and deployed via a recipe build into ppa:launchpad/ubuntu/ppa
[15:36] <cjwatson> Hmm
[15:36] <SpecialK|Canon> Can this end up on a wiki page if it isn't already please?
[15:36] <cjwatson> Or maybe those scripts aren't
[15:36] <cjwatson> If I understood it well enough to not be misleading I would have documented it a long time ago :-/
[15:37] <cjwatson> I think possibly lp-setup-lxc-* never got fully extracted from puppet
[15:37] <cjwatson> Which is bad because we kind of need to fix those, partly for this but also to use lxd
[15:37] <SpecialK|Canon> "this is a thing probably worth looking at" / "here is possibly what happened" are still valuable documentation
[15:37] <SpecialK|Canon> Er, sentence fail, but hopefully you get the idea
[15:38] <SpecialK|Canon> I'd rather have your vague half-understandings on a page (with appropriate notes as to confidence in the quality) than not at all!
[15:38] <cjwatson> OK, I'll see what I can do
[15:39] <SpecialK|Canon> Thank you!
[15:41] <cjwatson> tomwardill: I'd suggest getting an SRE to give you the current contents of /usr/local/bin/lp-setup-* in the appropriate container on e.g. sluagh, and then comparing that with lp:lpsetup and lp:canonical-is-puppet to work out which one of those it comes from
[15:41] <cjwatson> Should be able to tell from the comments at the top of lp-setup-lxc-build, say
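A sketch of that comparison; the saved-script filename and the in-tree path are assumptions:

    # Fetch the public candidate source (lp:canonical-is-puppet is private,
    # so an SRE has to supply that side of the diff)
    bzr branch lp:lpsetup
    # Compare the copy retrieved from the container against the lpsetup version
    diff -u lp-setup-lxc-build.from-container lpsetup/path/to/lp-setup-lxc-build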
[15:41] <tomwardill> that makes sense
[15:41]  * tomwardill does that now
[15:42] <cjwatson> And if they aren't the lpsetup versions, we should check the effective diffs between them, and probably make them be the lpsetup version
[15:42] <cjwatson> Because while lpsetup is kinda weird it's better than a defunct-for-us config management system
[15:58] <tomwardill> poked buildbot
[15:58] <cjwatson> Thanks
[16:02] <cjwatson> Build step worked now, so that should hopefully work better
[16:05] <tomwardill> excellent :)
[16:05] <tomwardill> wonder why it worked that once
[16:06] <cjwatson> It's interesting, since it failed before the sass move
[16:06] <cjwatson> http://lpbuildbot.canonical.com/builders/lp-devel-xenial/builds/1291/steps/shell_8/logs/stdio is a somewhat different failure
[16:06] <cjwatson> Oh right
[16:06] <cjwatson> So two different failures
[16:06] <cjwatson> It failed because you landed the yarn upgrade before buildbot workers had a newer nodejs
[16:07] <cjwatson> You got that fixed, and then nodejs on the worker was able to parse yarn and the build passed
[16:07] <cjwatson> Then you landed the node-sass upgrade, which failed due to the 32-bit thing
[16:07] <tomwardill> aah, right
[16:07] <tomwardill> of course
[16:07] <cjwatson> All appropriately deterministic :)
[16:07] <tomwardill> but because I never looked at the build step, they had the same apparent failure mode to me
[16:07] <cjwatson> Right
[16:11] <tomwardill> wonder what new and exciting failure I can come up with for the next landing!
[16:11] <tomwardill> stay tuned to find out!
[16:13] <cjwatson> I'd have let you at least fix it had I realised it was the buildbot failure rather than just a weird thing on my machine :)
[16:16] <tomwardill> heh, no worries :)
[16:17]  * tomwardill has also broken his glasses today, so it's not like it's been the best day all round
[16:19] <cjwatson> Ugh
[16:41] <tomwardill> buildbot success!
[16:41] <tomwardill> landing the next in the series
[16:46] <SpecialK|Canon> \o/
[16:55] <tomwardill> anyone got time for this horrible diff? https://code.launchpad.net/~twom/launchpad/+git/launchpad/+merge/384834
[17:19] <pappacena> I can check that
[17:20]  * pappacena immediately regrets this decision
[17:20] <pappacena> https://usercontent.irccloud-cdn.com/file/S535aaXN/image.png
[17:20] <pappacena> hahaha