[00:52] vino: did you have a minute, i just looked at juju run pr, quick hangout?
[00:57] yes
[00:58] wallyworld: yes
[00:59] vino: see you in standup
[01:00] anastasiamac: small PR if you have a chance - needed for 2.4 beta 1 https://github.com/juju/juju/pull/8613
[01:03] wallyworld: nws, m about to pr up the comment u've asked and will need a review too :D
[01:18] babbageclunk: veebers: could u please TAL https://github.com/juju/juju/pull/8614 does it make sense to u coming in cold into the context?
[01:21] anastasiamac: will take a look soon
[01:22] anastasiamac: will do just let me know the PR
[01:27] wallyworld: https://github.com/juju/juju/pull/8614 :)
[01:27] looking
[01:28] anastasiamac: lgtm, ty
[01:37] * anastasiamac does a fast-landing dance \o/
[01:45] oops, sorry anastasiamac
[01:45] Oh, should have looked first before holding off!
[01:46] babbageclunk: u r sorry i dance? :D
[01:47] No, I just should have paused what I was doing!
[01:48] babbageclunk: nws, i just needed someone to tell me that i made sense :D wallyworld is good at keeping my english straight :D PR is heading for landing now
[01:49] babbageclunk: no no, it was not worth interrupting anyone... was just a review ;D
[01:53] Well I'll do better next time.
[02:06] sorry anastasiamac was at lunch :-\
[02:06] veebers: all good!
[02:13] * thumper goes to make a coffee
[02:23] wallyworld: when you have a spare moment if you could ack the test addition I made since your approval: https://github.com/juju/juju/pull/8599
[02:23] sure, sorry i totally forgot
[02:23] nw
[02:25] veebers: just a minor tweak
[02:27] gah, my illusion of perfection is broken ;-) I'll fix it up and merge
[02:35] anastasiamac: thanks for review, i missed those other errors, i've updated PR with a new commit
[02:46] wallyworld: remind me the command that shows charm hooks run?
[02:46] hah, that's almost not english :-)
[02:47] juju show-status-log
[02:48] awesome thanks
[02:49] wallyworld: is install hook retried by default perchance?
[02:49] veebers: yah, all hooks are retried i think 3 times
[02:49] wallyworld: ah right cheers, I'll update the test charm to take that into account :-)
[02:50] yeah sorry should have mentioned that
[02:50] nw, no biggie
[02:53] anastasiamac: i just noticed you already approved but given the changes i made for errors are non-trivial, would appreciate another look; they are all in the 2nd commit
[02:53] wallyworld: k, looking now :D
[02:53] awesome, ty
[02:59] yay, thanks for review
=== mup_ is now known as mup
[03:00] babbageclunk: got a minute, I'm a little confused
[03:00] thumper: sure
[03:00] babbageclunk: jumping in the HO
[03:01] wallyworld: that was really good !!! loved how u consolidated all these errors. once i saw it, it totally made sense
[03:02] yeah, we do tend to scatter shite everywhere
[03:02] was good to clean it up
[03:02] thanks for noticing in the PR
[03:02] was late when i did it :-)
[03:02] \o/
[03:31] It appears that juju is attempting to re-try the install hook at least 8 times. Am I doing something odd in my charm to trigger this?
[03:31] all my charm is doing is in the install hook increment the attempt count (file stored in tmp) and setting: "status-set blocked "Install hook failed on purpose." then exit 1
[03:35] wallyworld: oh you're back :-) I was just complaining: It appears that juju is attempting to re-try the install hook at least 8 times. Am I doing something odd in my charm to trigger this?
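A minimal sketch of the kind of deliberately failing install hook veebers describes at [03:31] above, assuming a classic hooks-based charm; the counter path and message are illustrative, not the actual test charm:

    #!/bin/bash
    # hooks/install -- fail on purpose so that hook retry behaviour can be observed.
    # Illustrative sketch only: counter path and message are made up.
    set -eu

    COUNTER=/tmp/install-attempts
    count=$(cat "$COUNTER" 2>/dev/null || echo 0)
    echo $((count + 1)) > "$COUNTER"   # record how many times this hook has run

    status-set blocked "Install hook failed on purpose."
    exit 1                             # non-zero exit leaves the unit in error state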
[03:35] all my charm is doing is in the install hook increment the attempt count (file stored in tmp) and setting: "status-set blocked "Install hook failed on purpose." then exit 1
[03:35] yeah, using vpn drops irc :-(
[03:36] i don't think you're doing anything amiss
[03:36] i think juju retries 3 times
[03:36] maybe just exit 1 and see what happens though that shouldn't make a difference
[03:37] wallyworld: it retries more than 3 times
[03:37] really? :-(
[03:37] i wonder why
[03:37] even 3 seems too many
[03:37] i had recalled that we settled on 3 but obviously i'm wrong
[03:38] it's tried 10 times now ^_^
[03:38] i'm sure in my testing it didn't do that
[03:38] it retried what i could have sworn was 3 times and then gave up
[03:38] and left the unit in error state
[03:39] wallyworld: if your testing was just exit 1 always you might not have seen it, but because I'm waiting to make sure it stops trying to ensure my resolve is the thing that fixed it I see it happen
[03:39] but could be wrong there too
[03:39] if you always exit 1 then resolve will be the thing that moves it on
[03:40] * thumper thrashes his computer somewhat
[03:42] wallyworld: right, I can resolve --no-retry to move it on, but I can't ensure that resolve will re-run the hook when requested (although it seems perhaps that's not needed as it's *always* retrying)
[03:43] babbageclunk: got 5 minutes?
[03:44] babbageclunk: I'm invoking the live demo demons
[03:44] babbageclunk: demonstrating as yet untried code
[03:44] sure
[03:44] veebers: yeah, the code will need to be read to see what's going on. i can't recall off hand
[03:44] --no-retry would be a good start
[03:45] wallyworld: ack. I'll proceed with a functional test for no-retry. If I get nosey / have time I might try figure out what the code is doing there too
[03:45] sgtm
[03:46] * thumper jumps back in HO
=== mup_ is now known as mup
[03:56] wallyworld: I'm confused, my juju 2.3.5 resolve works as expected, passing --no-retry skips the install hook and goes onto the start hook
=== kelvinliu__ is now known as kelvinliu
=== meetingology` is now known as meetingology
=== mup_ is now known as mup
[04:15] babbageclunk: I realise there was one more place where I haven't hooked it up :)
[04:15] thumper: I mean, I guess that had to have been the problem
[04:16] :)
[04:18] thumper: so it's working now with quick status changes?
[04:18] shtill threading
[04:34] babbageclunk: got it using the new code, but all coming back as unknown not alive
[04:34] so all machines showing down
[04:34] * thumper sighs
[04:34] doh
[04:35] added more logging
[04:35] ah FFS
[04:36] * thumper knows
=== mup_ is now known as mup
[04:46] babbageclunk: I think I have it this time...
[04:46] by George, I think he's got it!
[04:49] nope
[04:49] got logic backwards
[04:49] trying again
[04:49] this is the culmination of around two years of effort
[04:49] I'll be celebrating when this is in
[04:50] thumper: sent email about upgrade test for 2.3.6 - short answer NFI, can't reproduce
[04:51] BOOM!!!
[04:51] working
[04:51] wallyworld: ok
[04:51] babbageclunk: wanna see it before I run to get Maia
[04:51] ?
[04:53] too late
[04:53] gotta go
[04:54] I'll be back on later tonight to polish and put up for review
[04:54] wallyworld: this is new presence hooked up in status
[04:54] oh joy
[04:54] with user updatable controller feature flags
[04:54] faaark, nice
[04:54] wallyworld: jump on a HO
[04:54] I want to show someone
[04:54] righto
[04:54] lol
[04:54] vino_: go fmt is still sad. you'll need to do a quick fix for run_test.go
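A quick sketch of how such formatting failures are usually found and fixed locally; the file path is illustrative and gofmt's -l, -d and -w flags are the standard ones:

    gofmt -l .            # list files whose formatting differs from gofmt's output
    gofmt -d run_test.go  # show the diff gofmt would apply
    gofmt -w run_test.go  # rewrite the file in place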
[04:54] that same line 95 ?
[04:55] wallyworld: in our 1:1
[04:55] ok let me look
[05:01] vino_: there's about 4 errors
[05:02] thumper: oops, sorry
=== mup_ is now known as mup
[05:06] i cudnt get this githook fix in my setup.
[05:06] huh..
[05:08] thumper: wallyworld: PTAL licensing for 2.3 - https://github.com/juju/juju/pull/8615
[05:09] ok
[05:09] jeez, 31 files
[05:09] wallyworld: but mostly tests :D
[05:10] 31? m seeing 25...
[05:14] maybe i can't count
[05:21] wallyworld: your 'resolve --retry' fix, did that go in for 2.3.5 or 2.3.6?
[05:21] 2.3.6
[05:23] wallyworld: so with 2.3.5 we should see that 'resolve --no-retry' does in fact try the errored hook? I'm not seeing that, but I'll double down now and confirm that
[05:24] yeah, i think 2.3.5 has the issue but hmmm, not sure now
[05:30] wallyworld: run_test.go - 4 errors correct ?
[05:31] sounds right yeah, that's what my go fmt said i think
[05:31] line: 105, 110, 119
[05:31] something like that, don't have the output anymore
[05:31] no worries.
[05:38] wallyworld: done. plz verify
[05:40] looking
[05:42] vino_: i thought there were four errors, https://pastebin.ubuntu.com/p/b3DM8qm4m3/
[05:47] wallyworld: i still got the same problem. so after I run `juju deploy ./mysql`, I got this message immediately, after unit status changed to `allocating`.
[05:48] can u see that ?
[05:48] which message?
[05:48] next to @line104
[05:49] lookin again at pr
[05:50] vino_: ah, looks ok. the pr showed 2 changes together
[05:50] sorry, looks good, will approve
[05:50] yea.
[05:50] but i shd get this githook and few other things fixed sooner..
[05:50] no sorry required :)
[05:50] yeah, see if you can get it working
[05:51] also, our landing bot should not accept prs with these errors
[05:51] we need to fix that
[05:51] kelvinliu_: which message are you seeing?
[05:51] wallyworld: some logs are here https://pastebin.ubuntu.com/p/D9sjnXvY4K/
[05:55] kelvinliu_: i can't see anything obvious. does juju status show an incremented version number to indicate the upgrade went ok? ie 2.4-beta1.2 or something like that
[05:55] you could try creating a new model and trying again
[05:57] wallyworld: the controller upgrade went ok, and tried it a few times, now I got 4 testcaas models
[05:58] you should see in the operator logs that the charm reactive code is calling set-pod-spec
[05:58] this triggers juju to create the unit
[05:58] which then creates the mysql pod
[05:59] wallyworld: no, it's not there
[05:59] ok, let me run up a k8s cluster and see what my logging says
[06:01] wallyworld: so after `juju-unit executing running start hook` then `workload unknown`, then `juju-unit idle`
[06:03] i can't recall the exact order off hand, i'll see when my k8s cluster starts
[06:03] wallyworld: thx
[06:04] wallyworld: or do u have a couple of minutes for a screen share session?
[06:05] kelvinliu_: i won't be much help until i run up my k8s bundle to compare
[06:05] wallyworld: ok, sure, we will wait for the cluster up and running first
[06:26] kelvinliu_: i just started a k8s cluster and deployed mysql without any problem
[06:26] did you want to do a hangout?
[06:27] wallyworld: yes, plz
[06:27] see you in standup
[06:27] yup
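A rough sketch of the kind of checks wallyworld and kelvinliu_ are working through above, reusing the model name from the conversation (testcaas) and an illustrative unit name (mysql/0); whether the operator/charm output reaches debug-log will depend on the deployment:

    juju status -m testcaas                              # has the mysql unit moved past 'allocating'?
    juju show-status-log -m testcaas mysql/0             # hook and workload status history for the unit
    juju debug-log -m testcaas --include unit-mysql-0    # look for the charm calling set-pod-spec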
[06:48] have a quick question.. only snap install, will install the service files in /var/lib/juju/init ?
[07:01] i don't see it in /var/lib/juju/tool
[07:23] vino_: snap install of juju is just the client
[07:23] when you bootstrap, it pulls down the agent binary and creates the systemd files on the vm
[07:24] so you need to bootstrap and ssh into the controller machine to see what was done
[07:25] wallyworld: just raised a bug for the remove-relation failure on apps running k8s -> https://bugs.launchpad.net/juju/+bug/1764649 and also created a card on caas trello board plz let me know if any questions. thx
[07:25] Bug #1764649: juju caas remove-relation does not work
[07:32] great ty
[07:40] wallyworld: sorry i was a bit away for my tea.
[07:41] yes i did that.
[07:41] and i am there
[07:43] vino_: great ok, so the service files - you should see symlinks from /etc/systemd/system to their actual location under /var/lib/juju/init
[07:43] :) yes.
[07:45] i did ssh to instance but was looking in another terminal didnt realize :p
[07:45] ah no worries
[08:12] wallyworld: still around?
[08:13] thumper: sorta
[08:13] soccer soon
[08:13] wallyworld: how soon?
[08:13] just getting PR ready
[08:13] 15
[08:14] maybe 0
[08:14] 20
[08:15] long enough
[08:15] ok
[08:19] poo, need to remove some critical logging...
[08:21] wallyworld: https://github.com/juju/juju/pull/8617
[08:21] wallyworld: did you want to get on a hangout to talk through?
[08:21] smaller than the penultimate branch
[08:22] thumper: can do
[08:22] * thumper jumps in HO
[08:53] phew...
[08:53] * thumper waits for the merge
[08:58] manadart: ping
[09:15] thumper: Pong. On a call with jam.
[09:15] manadart: hey, was just wondering if you had an ETA on removing the horrible lxd patches we have
[09:16] thumper: Not yet, but I can peel off this intermittent failure I am working on and round up such an estimate.
[09:17] heh... no, we're good
[09:17] what is the intermittent failure?
[09:20] https://bugs.launchpad.net/juju/+bug/1753418
[09:20] Bug #1753418: intermittent failure in kvmProvisionerSuite.TestKVMProvisionerObservesConfigChanges
[09:41] night all
[09:42] G'night.
[13:48] g’morning o/
[13:49] morning hml
=== frankban is now known as frankban|afk
[20:27] morning
[20:30] morning thumper
[20:38] Morning o/
[21:31] wallyworld: so... no power nor internet at home now
[21:32] my laptop battery has several hours I guess
[21:32] and tethered to my phone
[21:32] they are replacing a damaged pole outside my house, so have turned off power, but we never got notified
[21:32] from 9 till 3
[21:33] thumper: oh that's what happened
[21:33] yeah, I was a little surprised, but you should see the kids
[21:34] they are like, I can't charge my phone?
[21:34] can't watch netflix?
[21:34] what am I going to do?
[21:35] ha!
[21:36] lol
[21:36] thumper: so no release call :-) i thought it might be good to chat about that upgrade issue even though nix is away
[21:38] wallyworld: let's do a voice only hangout
[21:39] to save my data
[21:39] ok
[22:07] wallyworld: I'm scratching my head on this one, re: resolve command: every build I've tried it works as expected, --no-retry skips the failing install hook and continues.
[22:08] I tried 2.3.5 (snap installed), I tried the commit before your one that fixed the issue
[22:08] ha, if you're benchmarking something where you're appending to logs, make sure part of the thing isn't copying all of the logs so your benchmark gets progressively slower.
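A contrived Go sketch of that pitfall, with made-up names: if the code under benchmark snapshots the whole log on every append, each iteration does more work than the last and the numbers drift.

    // logbench_test.go -- illustrative only; the entry type and both helpers are made up.
    package logbench

    import "testing"

    type entry struct{ msg string }

    // appendOnly just appends; the cost per call stays roughly constant.
    func appendOnly(logs []entry, e entry) []entry {
        return append(logs, e)
    }

    // appendAndCopy also snapshots everything so far; the cost per call grows
    // with len(logs), so later benchmark iterations are slower than earlier ones.
    func appendAndCopy(logs []entry, e entry) []entry {
        snapshot := make([]entry, len(logs))
        copy(snapshot, logs) // this copy is what makes the benchmark progressively slower
        return append(logs, e)
    }

    func BenchmarkAppendOnly(b *testing.B) {
        var logs []entry
        for i := 0; i < b.N; i++ {
            logs = appendOnly(logs, entry{msg: "x"})
        }
    }

    func BenchmarkAppendAndCopy(b *testing.B) {
        var logs []entry
        for i := 0; i < b.N; i++ {
            logs = appendAndCopy(logs, entry{msg: "x"})
        }
    }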
[22:09] veebers: i don't have an explanation for that off hand, will need to investigate
[22:19] thumper: we can fill out much of the 2.4 release notes but the enable-ha/remove-machine and ha space sections are todo (john/joe) as are model owner changes (you). i think we may need a day to get these sorted
[22:19] yeah
[22:20] so we maybe should take a view tomorrow after forcing folks to fill out notes today :-)
[22:20] we can do 2.4 and 2.3.6 a day apart
[22:20] different people do each
[22:35] hey Wallyworld: morning! fixed the format issues u have mentioned.
[22:35] great ty
[22:35] looking at pr
[22:36] lgtm
[23:47] babbageclunk: ready!
[23:48] still in same hangout
[23:49] ok