=== blahdeblah_ is now known as blahdeblah
[07:30] xnox, both our claims were wrong...
[07:30] nodejs still reproducible in debian, and the bootstrap I did failed badly
[10:08] philroche: o/ Could you take a look at cornfeedhobo's issue above, please?
=== ricab is now known as ricab|lunch
[13:24] TIL: less relies on file extensions instead of the file magic. Read a text file that happens to end in ".deb" and it fails invisibly :-/
=== ricab|lunch is now known as ricab
=== M_hc is now known as _hc
[16:15] xnox, upload with testsuite disabled? do you have any news?
[16:16] :/ report upstream?
[16:16] I'm out of ideas
[16:36] LocutusOfBorg, i did open a bug in debian.
[16:37] LocutusOfBorg, i cannot rebuild nodejs successfully, in debian sid/unstable, on an ubuntu host.
[16:37] LocutusOfBorg, i wonder if it is an ubuntu kernel bug; or like a new kernel issue.
[16:37] xnox: what's your build host? And are you just doing a bog-standard rebuild of an existing package? I can test on my 18.04 (and 16.04 older builder) infra.
[16:38] (happy to loan cycles to test-build things)
[16:38] s/existing package/nodejs
[16:39] (in the off chance you need another kernel/testbed i have those cycles available)
[16:41] teward, this fails for me on a disco host: $ sbuild -A -d sid nodejs_10.15.0~dfsg-8
[16:42] on amd64 & ppc64el
[16:42] teward, it would be nice if you could check whether this works for you, or also fails: $ sbuild -A -d sid nodejs_10.15.0~dfsg-8
[16:42] xnox: i don't have a disco host but I do have an 18.04 host. standby while I finish populating the sid chroot
[16:43] teward, yeah, 18.04 as a host is also interesting.
[16:43] xnox: running now, standby
[16:44] (god I love having 32GB of RAM heh)
[16:46] I'm building nodejs with pbuilder and a sid environment on ubuntu amd64 18.04
[16:46] xnox, ^^
[16:46] and also a disco build on the same host
[16:47] i'm running a sid sbuild chroot on an amd64 18.04 host, but also spinning up a ppc64el qemu-static sbuild chroot as well.
[16:47] xnox: looks like it's in the compilation steps now, will provide a copy of the logs once it's complete.
[16:47] i am seeing warnings and such going by at blazing speeds though
[16:48] but so far I haven't hit any critical build failures yet as far as I can tell
[16:51] teward, it's in the test-suite where it fails... it builds 'ok'
[16:51] teward, can take like 2h =)
[16:51] xnox: i'd expect it to. you are aware Debian triggers a lot of autotest regressions as well, right?
[16:51] even on their infra?
[16:51] except on *every* arch
[16:52] teward, sure, but i'm not sure if they use e.g. a stable kernel.
[16:52] oops my bad, I mean on multiple other node autopkgtests on amd64
[16:52] right
[16:52] xnox: considering i'm stuck here until 17:00 at work (and it's 11:52 right now) 2 hours of a background task is nothing :P
[16:54] though in theory 12 threads of CPU are probably going to increase the performance lol
[16:57] looks like it's starting some of its tests now
[16:57] and there's a few failures that i'm seeing in the parallel/* tests
[16:58] xnox: i'm guessing the testsuite is what takes 2 hours :p
[16:58] or do you want me to run the autopkgtests as well?
[16:59] (which won't happen probably if it fails like this heh)
[16:59] xnox: do we have examples of the current failing tests?
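(For anyone reproducing this: a minimal sketch of the rebuild xnox describes, assuming an amd64 Ubuntu host with sbuild installed; the chroot path, mirror, and the deb-src note are assumptions, not details confirmed in the discussion.)
$ sudo sbuild-createchroot sid /srv/chroot/sid-amd64-sbuild http://deb.debian.org/debian   # one-time: bootstrap a Debian sid build chroot
$ sbuild -A -d sid nodejs_10.15.0~dfsg-8   # the rebuild quoted above: -A also builds arch:all packages, -d sid targets the sid chroot
# (if the chroot's sources.list has no deb-src entry, point sbuild at a locally downloaded nodejs_10.15.0~dfsg-8 .dsc instead)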
[16:59] 'cause i'm getting http2 test errors at the moment :|
[17:07] testsuite is fine
[17:08] teward, grep for "not ok"
[17:08] https://launchpad.net/ubuntu/+source/nodejs/10.15.0~dfsg-8/+build/16274528
[17:08] 3 tests
[17:08] LocutusOfBorg: still running but there's a lot of STREAM_CANCEL errors on my tests here
[17:08] so i'm getting a lot more failures
[17:08] might be because of the env.
[17:12] teward, or because of $parallel
[17:12] possibly.
[17:12] xnox, the bug you opened in debian has test-assert that is good
[17:12] so, something not kernel related, since I assume the sbuild ran on the same machine?
[17:13] LocutusOfBorg: let me rerun this with a higher parallel value.
[17:13] wonder if that broke it.
[17:15] (it had parallel=2, which was a holdover from my older sbuildrc when I didn't have the power this system has)
[17:28] LocutusOfBorg: E:NOREPRO on those errors, but getting *other* errors than you are in the test cases. parallel/test-assert passes OK here. any tests that're pulling localhost are failing getaddrinfo in my sbuild though, not sure why that's the case. parallel/test-crypto-verify-failure also fails on my system.
[17:28] it's still running tests, only up to test 400 right now
[17:29] on some errors so far*
[17:29] odd that i'm getting different failures.
[17:29] getaddrinfo EAI_AGAIN localhost <-- these're the lookup failures, wonder if the sbuild chroots are being stupid.
[17:31] xnox: ^ for your awareness as well, not sure why i'm getting those specific errors :| but that throws a lot of failed tests where it probably shouldn't
[17:31] (on my env)
[17:34] though I think i know why those were erroring, hang on
[18:04] teward, but debian chroot or ubuntu?
[18:04] LocutusOfBorg: sid chroot
[18:04] exactly
[18:04] the same with a disco chroot fails
[18:04] LocutusOfBorg: on which, the localhost failures?
[18:04] or the other ones?
[18:04] no, the three ones
[18:04] LocutusOfBorg: oh.
[18:04] so far NOREPRO on the first one
[18:05] still waiting for my system to catch up, and i wanted to fix the localhost failure problems which would've tainted the build
[18:05] (third run, this time to try and fix the localhost resolve problems in the chroot)
[18:06] LocutusOfBorg: the first failure you linked to was parallel/test-assert - that succeeded on my end
[18:06] no errors there.
[18:06] in the amd64 chroot of sid
[18:06] the sid one is what xnox asked me to run :P
[18:06] so :P
[18:07] haven't run the disco one yet
[18:07] same with pbuilder-dist
[18:07] it never got to the next ones, and the 'results' would have been dirtied with the localhost lookup failures.
[18:07] pbuilder-dist sid amd64 build nodejs test-assert OK
[18:07] pbuilder-dist disco amd64 build nodejs test-assert NOT OK
[18:08] with pbuilder I think I have exactly the same failures as the archive
[18:08] interesting. i'm still letting the tests complete in sid right now to see if i can get the others
[18:08] i'll run the disco one after this amd64 sid one runs
[18:08] sure, I'm still running the two builds
[18:08] if you want to check it out
[18:09] pbuilder-dist sid amd64 create && pbuilder-dist sid amd64 build nodejs*.dsc
[18:09] s/sid/disco/g if you want the other one ^^
[18:09] right. i've already got the sid chroots built (I use sbuild)
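(A rough sketch of pulling the failing test names out of a finished build, following LocutusOfBorg's "grep for 'not ok'" hint; the log filename pattern shown is sbuild's default and an assumption here, pbuilder writes its log wherever you told it to.)
$ grep "not ok" nodejs_10.15.0~dfsg-8_amd64-*.build      # nodejs emits TAP-style output, failures appear as: not ok NNN parallel/test-...
$ grep -c "not ok" nodejs_10.15.0~dfsg-8_amd64-*.build   # quick count, to compare against the 3 failures in the Launchpad build linked above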
[18:09] i'm building the disco schroots now
[18:09] yay, my changes to the sid pristine tarball fixed the localhost crap
[18:10] *makes a note that he probably should add a chroot build step to populate /etc/hosts with '127.0.0.1 localhost' in the roots*
[18:10] this is what pbuilder-dist does automagically :D
[18:10] *shrugs*
[18:11] oh that's odd, test 507 parallel/test-fs-error-messages fails here o.O
[18:11] missing expected exception (validateError) o.O
[18:12] so far all the other tests are succeeding on a sid chroot
[18:15] xnox: do you have sbuild logs for this yourself? I'm getting some errors which don't match the disco logs that LocutusOfBorg shared
[18:23] Give my tests more RAM, please.
[18:48] LocutusOfBorg: xnox: should I be concerned these tests're still using Py2?
[18:52] LocutusOfBorg: xnox: I have a 'failed' set of tests, but different issues it seems. IPv6 related possibly. But the tests that i was shown failing didn't match up with the ones failing on Disco, in the sid chroot. my failures: https://paste.ubuntu.com/p/NVrbpNqvbj/ build logs: http://paste.ubuntu.com/p/tS7JZ6ByZg/
[18:53] i'll run the disco chroot shortly
[18:54] i'm wondering how much of the build failures are v6 related or other 'weird issues', because this system doesn't have v6 currently configured
[18:55] LocutusOfBorg: xnox: oh and there is one 'crashed' test - a datagram v6 issue it seems. Not sure why it died (maybe the system killed it?) but it did.
[18:58] LocutusOfBorg: obvious stupid evil question, has this been tested targeting disco on Debomatic? Just a thought, because that might provide yet another testbed. (And it seems they have 'disco' as an env on there)
[18:58] or do you not have a build account there?
[19:30] LocutusOfBorg: xnox: confirmed my computer does something weird with regards to udp6 datagrams in the env, I'll look into it later with some test packages to see why it explodes that way and whether it's local to nodejs or something environment-related, my guess is env-related but eh.
[19:30] other tests seem to be 'running' still though, almost done with the disco test and I'll compare
[19:33] LocutusOfBorg: xnox: not sure what you're seeing, but I'm seeing a lot of repl failures too in mine, as well as some net failures. I have to assume those're local to me, but as for the tests failing in the ubuntu builders' logs, I can confirm that the same tests that fail in your log data match some of the test failures I'm seeing.
[19:33] but i have a lot more, so i'm wondering what it is about my sbuild setup the system doesn't like.
[19:34] udp6 and IPv6 stuff aside.
[19:36] LocutusOfBorg: xnox: disco failures: https://paste.ubuntu.com/p/qYcm8SFWyx/ full logs: http://paste.ubuntu.com/p/bP7cfcTbw4/
[19:37] i obviously have some oddness in my envs, not sure what that's about
[19:37] unless you guys have any ideas why i'm failing on repl stuff
[20:05] LocutusOfBorg: xnox: from what I can tell the errors I'm getting with repl indicate something busted in my binary builds, wonder if something FTBFS but didn't cause a total failure. Or if those warnings broke it.
[20:51] teward, I'm administrator of that machine :)
[20:52] strange results, yours
[20:52] I still need to finish my pbuilder session
[20:52] I had my laptop on suspend :)
[20:56] LocutusOfBorg: oh you are. then you need to check your emails :P
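(For plain sbuild/schroot setups, a minimal sketch of the /etc/hosts workaround teward describes; the chroot path /srv/chroot/sid-amd64-sbuild is only an assumed default, and as noted above pbuilder-dist already seeds this entry for you.)
$ echo "127.0.0.1 localhost" | sudo tee -a /srv/chroot/sid-amd64-sbuild/etc/hosts   # make "localhost" resolvable inside the chroot so getaddrinfo EAI_AGAIN stops tainting network tests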
[20:57] LocutusOfBorg: i traced it back to some really oddball behavior in the nodejs binary
[20:57] I have to presume that something in my system hates the REPL parts of the binary, but i didn't dig.
[20:57] but yes, strange results indeed.
[21:34] LocutusOfBorg: fun fact, you seem to be getting the same REPL errors on DebOMatic
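(If anyone wants to chase the REPL failures outside the test harness, a quick sanity check along these lines might help; it assumes the freshly built debs are installed in a throwaway VM or container, which is not something done in the discussion above.)
$ echo "1 + 1" | node -i                     # drive the REPL non-interactively; it should evaluate and print 2
$ node -e 'console.log(process.versions)'    # confirm which node binary and V8 version are actually being exercised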