=== igitoor_ is now known as igitoor
[08:01] Going to start landing the yarn/sass upgrades and testing on dogfood
[08:01] object now if this is a bad plan :)
[08:48] SGTM!
[09:40] okay, last buildbot run failed with a bunch of JS timeouts, going to retry
[10:07] bah, failed again
[10:07] cjwatson: is there anything special I need to do to ensure the nodeJS version is upgraded in the buildbot base images?
[10:08] tomwardill: https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot#Updating_the_LXC_environment
[10:08] tomwardill: You'll need to ask IS to do that
[10:08] aha, that would probably be the cause
[10:08] (On both workers)
[10:40] looking more promising now
[10:57] \o/ success
[10:57] * tomwardill lands more things
[12:42] okay, still getting timeouts
[12:42] got an actual problem then
[13:28] cjwatson: how do I run these tests via firefox?
[13:28] find the absolute path to the associated .html, paste that in
[13:29] [4/4] Building fresh packages...
[13:29] [1/1] ⠁ node-sass: g++ '-DNODE_GYP_MODULE_NAME=libsass' '-DUSING_UV_SHARED=1' '-DUSING_V8_SHARED=1' '-DV8_DEPRECATION_WARNINGS=1' '-D_LARGEFILE_SOURCE' '-D_FILE_OFFSET_BITS=64' '-DLIBSASS_VERSION="3.5.5"' -I/home/cjwatson/.node-g
[13:29] tomwardill: ^- is node-sass meant to be doing that? I thought you'd squashed that
[13:33] yes, I had/have
[13:34] cjwatson: have you got the latest source-dependencies?
[13:34] Yes
[13:34] After an initial false start anyway
[13:34] And I upgraded my base container
[13:35] Maybe a cache that defeats your attempts to tell whether your fix worked?
[13:36] cjwatson: can you add --verbose to the end of the yarn line in the css_combine Makefile
[13:36] it should output whether it's found the .node binding before it starts compilation
[13:55] Oh. That overflowed my terminal's scrollback
[13:55] * cjwatson tries again with |less
[13:56] oh, yes
[13:56] it is very verbose
[13:56] but somewhere in the middle will be the output about the binding
[13:58] my local lxd has decided it's going to drop all of its internet connection
[13:58] which is not ideal
[14:00] verbose 3.23 Copying "/home/cjwatson/.cache/yarn/v6/npm-node-sass-4.14.1-99c87ec2efb7047ed638fb4c9db7f3a42e2217b5-integrity/node_modules/node-sass/binding.gyp" to "/home/cjwatson/src/canonical/launchpad/git/launchpad/yarn/node_modules/node-sass/binding.gyp".
[14:00] looks possibly relevant
[14:03] Hm, maybe, not sure
[14:04] [4/4] Building fresh packages...
[14:04] verbose 6.952 node-sass build Binary found at /home/cjwatson/src/canonical/launchpad/git/launchpad/download-cache/yarn/node-sass-4.14.4-linux-x64-57_binding.node
[14:04] verbose 7.674 Binary found at /home/cjwatson/src/canonical/launchpad/git/launchpad/download-cache/yarn/node-sass-4.14.4-linux-x64-57_binding.node
[14:04] That file shows as modified in git
[14:05] ah, so if it ever compiles it, it will replace your existing one
[14:05] * cjwatson copies it aside and restores it
[14:05] because input and output are synonymous, apparently
[14:05] Just going to try nuking ~/.cache/yarn/
[14:11] Ah of course
[14:11] My container is 32-bit
[14:11] ah
[14:11] yes, that would do it
[14:12] But it helpfully writes it back to the path you gave anyway, so replaces *-x64-* with a 32-bit build
[14:13] tomwardill: So um where did the 4.14.4 bit come from?
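For context on the x64-vs-32-bit confusion above: node-sass derives the binding filename it looks for from the platform, CPU architecture, and Node module ABI, so a 32-bit container wants a different file than the x64 one shipped in download-cache. A minimal sketch for checking this, assuming node-sass 4.x behaviour; the download-cache path used here is purely illustrative:

    # What binding will node-sass look for on this machine?  On a 64-bit
    # container with Node 8 this prints "linux-x64-57"; on a 32-bit one it
    # prints "linux-ia32-57", which is why the checked-in x64 binding gets
    # ignored and then overwritten by a freshly compiled one.
    node -e 'console.log(process.platform + "-" + process.arch + "-" + process.versions.modules)'

    # node-sass also honours a SASS_BINARY_PATH override (in the versions
    # that support it), so a cached binding can be pinned explicitly rather
    # than discovered:
    SASS_BINARY_PATH=$PWD/download-cache/yarn/node-sass-4.14.1-linux-x64-57_binding.node yarn install --verbose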
[14:14] https://www.npmjs.com/package/node-sass <- newest version is 4.14.1
[14:15] yes, I just looked at that and realised my mistake there
[14:15] * tomwardill gives up on today, it's clearly faulty and needs to be returned
[14:16] cjwatson: okay, now I've finally managed to test it (via X11 forwarding firefox), the JS tests still pass locally
[14:18] tomwardill: I'd suggest building two worktrees, one from before it started failing, one from current master
[14:18] tomwardill: And then diff the two after running 'make' in each
[14:18] I'll fix up the versions and arches and stuff
[14:37] https://code.launchpad.net/~cjwatson/lp-source-dependencies/+git/lp-source-dependencies/+merge/384913 and https://code.launchpad.net/~cjwatson/launchpad/+git/launchpad/+merge/384914
[15:00] cjwatson: diff doesn't show anything immediately obvious and I can't make the JS tests fail locally
[15:09] cjwatson: I'm out of ideas for how to reproduce this
[15:09] Trying to see if I can find anything
[15:11] * tomwardill fetches more biscuits
[15:23] tomwardill: Ah look
[15:23] tomwardill: http://lpbuildbot.canonical.com/builders/lp-devel-xenial/builds/1295/steps/shell_8/logs/stdio
[15:23] From the build step before the test step
[15:23] hah
[15:23] tomwardill: So same problem as in my setup before the MPs I posted above
[15:23] LOOK A GIANT BLOCK OF RED TEST
[15:23] *TEXT
[15:23] Except it doesn't have the bits necessary to build it
[15:23] serves me right for just ctrl-f for 'Failure'
[15:24] Or rather, it tries to connect to the network and network says no
[15:24] right, so it's a 32-bit container
[15:25] It's not often you need to look at the output of the build step, but when you've just frobbed the build system and things are being weird ...
[15:25] indeed
[15:25] I've just +1'd your MPs
[15:26] are we actually deployed in a 32-bit world then?
[15:26] No
[15:26] I suspect buildbot does this because it's doing something like 20 tests in parallel and running them in 32-bit mode uses less memory
[15:26] ah
[15:27] yeah, that would make sense
[15:27] I admit I hadn't actually realised buildbot was 32-bit (or maybe I knew once and forgot)
[15:29] * cjwatson adds a note about that to https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot
[15:30] Also I kind of hate that the test proceeds even though the build fails. If somebody wants to figure out how to fix that, it might be a good intro to lpbuildbot
[15:30] cjwatson: I've just mentally added it to my 'work out buildbot and move it to bionic/focal' task
[15:30] which I might do next, I think it's sufficiently annoyed me for me to have a look at it now :)
[15:30] It is *meant* to halt the build on failure
[15:31] Look at bzrbuildbot.shell.ShellCommand (yes, bzrbuildbot is now horribly misnamed but ...)
[15:31] can you point me at the code/setup/docs for buildbot?
[15:31] Public branch in lp:lpbuildbot, some private customisations in lp:~canonical-launchpad-branches/lpbuildbot/production (make changes in the former if you can - it gets merged into the latter)
[15:32] I think all other documentation that exists is probably in https://wiki.canonical.com/InformationInfrastructure/OSA/LPHowTo/Buildbot
[15:33] oh.
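A rough sketch of the two-worktree comparison suggested at 14:18, assuming a git checkout; the known-good revision placeholder and the diffed directory are illustrative and depend on where 'make' actually puts the generated assets in the Launchpad tree:

    # Build the last-known-good revision and current master side by side.
    git worktree add ../lp-good <last-good-commit>
    git worktree add ../lp-master master
    (cd ../lp-good && make)
    (cd ../lp-master && make)
    # Diff whatever the build generated (adjust the directory to wherever
    # the combined JS/CSS ends up).
    diff -ru ../lp-good/build ../lp-master/build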
[15:33] Actually the bug is probably in lp-setup-lxd-build
[15:33] lp-setup-lxc-build, rather
[15:33] Which is in the most horribly confusing bit of the whole setup
[15:33] righto :)
[15:34] When parallel buildbot tests were being developed, there was an effort to get all the code needed for that sort of thing out of puppet and into an "lpsetup" package
[15:34] And this was sort of 80% done
[15:35] It's in lp:lpsetup and deployed via a recipe build into ppa:launchpad/ubuntu/ppa
[15:36] Hmm
[15:36] Can this end up on a wiki page if it isn't already please?
[15:36] Or maybe those scripts aren't
[15:36] If I understood it well enough to not be misleading I would have documented it a long time ago :-/
[15:37] I think possibly lp-setup-lxc-* never got fully extracted from puppet
[15:37] Which is bad because we kind of need to fix those, partly for this but also to use lxd
[15:37] "this is a thing probably worth looking at" / "here is possibly what happened" are still valuable documentation
[15:37] Er, sentence fail, but hopefully you get the idea
[15:38] I'd rather have your vague half-understandings on a page (with appropriate notes as to confidence in the quality) than not at all!
[15:38] OK, I'll see what I can do
[15:39] Thank you!
[15:41] tomwardill: I'd suggest getting an SRE to give you the current contents of /usr/local/bin/lp-setup-* in the appropriate container on e.g. sluagh, and then comparing that with lp:lpsetup and lp:canonical-is-puppet to work out which one of those it comes from
[15:41] Should be able to tell from the comments at the top of lp-setup-lxc-build, say
[15:41] that makes sense
[15:41] * tomwardill does that now
[15:42] And if they aren't the lpsetup versions, we should check the effective diffs between them, and probably make them be the lpsetup version
[15:42] Because while lpsetup is kinda weird it's better than a defunct-for-us config management system
[15:58] poked buildbot
[15:58] Thanks
[16:02] Build step worked now, so that should hopefully work better
[16:05] excellent :)
[16:05] wonder why it worked that once
[16:06] It's interesting, since it failed before the sass move
[16:06] http://lpbuildbot.canonical.com/builders/lp-devel-xenial/builds/1291/steps/shell_8/logs/stdio is a somewhat different failure
[16:06] Oh right
[16:06] So two different failures
[16:06] It failed because you landed the yarn upgrade before buildbot workers had a newer nodejs
[16:07] You got that fixed, and then nodejs on the worker was able to parse yarn and the build passed
[16:07] Then you landed the node-sass upgrade, which failed due to the 32-bit thing
[16:07] aah, right
[16:07] of course
[16:07] All appropriately deterministic :)
[16:07] but because I never looked at the build step, they had the same apparent failure mode to me
[16:07] Right
[16:11] wonder what new and exciting failure I can come up with for the next landing!
[16:11] stay tuned to find out!
[16:13] I'd have let you at least fix it had I realised that it was the buildbot failure rather than just a weird thing on my machine :)
[16:16] heh, no worries :)
[16:17] * tomwardill has also broken his glasses today, so it's not like it's been the best day all round
[16:19] Ugh
[16:41] buildbot success!
[16:41] landing the next in the series
[16:46] \o/
[16:55] anyone got time for this horrible diff? https://code.launchpad.net/~twom/launchpad/+git/launchpad/+merge/384834
[17:19] I can check that
[17:20] * pappacena immediately regrets this decision
[17:20] https://usercontent.irccloud-cdn.com/file/S535aaXN/image.png
[17:20] hahaha
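A sketch of the script comparison suggested at 15:41, assuming the SRE-supplied copies have been saved under ./deployed/ (a made-up directory name) and that lp:lpsetup is still a Bazaar branch; the path inside the branch is a placeholder to be located first:

    # Fetch the candidate upstream source of the deployed scripts.
    bzr branch lp:lpsetup
    # The header comments usually say which tree generated the script.
    head -n 20 deployed/lp-setup-lxc-build
    # Compare against the lpsetup copy; anything that only matches the
    # puppet tree instead points at lp:canonical-is-puppet.
    diff -u lpsetup/<path-to>/lp-setup-lxc-build deployed/lp-setup-lxc-build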