[00:31] <fatgoose> anyone know how to find k8s logs from the juju charm canonical distribution of kubernetes? specifically the controller-manager log. where is it on the master node?
[00:31] <fatgoose> (assuming this is the right place to ask such a question)
[00:36] <anastasiamac> fatgoose: best place for this kind of question would be #juju rather than #juju-dev
[00:36] <fatgoose> thanks.
[04:17] <thumper> axw: I noticed an intermittent failure in develop featuretests, to do with upgrade testing: upgrade_test.go:140: c.Assert(waitForUpgradeToStart(upgradeCh), jc.IsTrue), ... obtained bool = false
[04:17] <thumper> axw: I know you love intermittent failures :)
[04:21] <axw> thumper: I love them so much. I'll take a look after lunch
[04:21] <thumper> axw: awesome, thanks
[05:43] <wgrant> axw: Do you have a moment to talk about statuseshistory?
[05:43] <axw> wgrant: sure
[05:43] <wgrant> Jan 10 05:43:21 juju-189c77-controller-0 mongod.37017[1098]: [conn96] remove juju.statuseshistory query: { _id: ObjectId('5a53ff76f582e9a9395ed438') } ndeleted:1 keyUpdates:0 writeConflicts:0 numYields:1 locks:{ Global: { acquireCount: { r: 762, w: 762 } }, Database: { acquireCount: { w: 762 } }, Collection: { acquireCount: { w: 393 } }, Metadata: { acquireCount: { w: 369 } }, oplog: { acquireCount: { w: 369 } } } 120ms
[05:43] <wgrant> axw: Does that mean it deleted only one row at a time?
[05:44] <wgrant> I just ran remove-application on 2.2.9 on a unit with lots of statuseshistory, and mongo and juju are now each eating a core and there's a LOT of that in the logs
[05:44] <wgrant> Ah, got some of this toward the end, which looks more reasonable:
[05:44] <wgrant> Jan 10 05:43:54 juju-189c77-controller-0 mongod.37017[1098]: [conn49] command juju.$cmd command: delete { delete: "statuseshistory", deletes: 1000, writeConcern: { getLastError: 1, j: true }, ordered: true } keyUpdates:0 writeConflicts:0 numYields:0 reslen:100 locks:{ Global: { acquireCount: { r: 2002, w: 2002 } }, Database: { acquireCount: { w: 2002 } }, Collection: { acquireCount: { w: 1002 } }, Metadata: { acquireCount: { w: 1000 } }, oplog: { acquireCount: { w: 1000 } } } protocol:op_query 1051ms
[05:45] <axw> wgrant: the first one does look like that, yes
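The two log lines show the difference: the first is one `remove` per document, the second is a single `delete` command with 1000 deletes batched. A minimal Python sketch of that batching pattern (a hypothetical `chunk` helper for illustration; Juju's actual pruning code is in Go):

```python
def chunk(ids, size=1000):
    """Yield successive batches of at most `size` ids."""
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

# One round-trip per batch instead of one per document, e.g. with pymongo:
#   for batch in chunk(doc_ids):
#       coll.delete_many({"_id": {"$in": batch}})
batches = list(chunk(list(range(2500))))
print([len(b) for b in batches])  # → [1000, 1000, 500]
```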
[05:52] <wgrant> axw: Interestingly, the batching does not totally fix the problem. I've just accidentally reproduced the saslStart storm outside the big controller for the first time...
[05:52] <wgrant> I wonder what information I should gather.
[05:53] <axw> wgrant: syslog and jujud log for a start, goroutine profile and cpu profile of jujud might also be helpful
[05:54] <wgrant> axw: "perf top" shows that almost all CPU time is spent calculating SHA-1s.
[05:54] <wgrant> For SASL
[05:55] <axw> wgrant: the profile might still give a clue as to why
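The SHA-1 hotspot is consistent with SCRAM-SHA-1, which MongoDB uses for SASL authentication: each fresh handshake derives a key via PBKDF2 over thousands of SHA-1-based HMAC rounds, so a saslStart storm burns CPU on hashing. A rough stdlib illustration of the per-handshake cost (password and salt are made up; 10000 is MongoDB's default SCRAM-SHA-1 iteration count):

```python
import hashlib

# SCRAM-SHA-1 salts the password and runs PBKDF2-HMAC-SHA-1.
# With MongoDB's default of 10000 iterations, every fresh saslStart
# costs roughly 10k HMAC-SHA-1 rounds on each side of the connection.
salted = hashlib.pbkdf2_hmac("sha1", b"secret-password", b"per-user-salt", 10000)
print(len(salted))  # → 20 (a SHA-1-sized derived key)
```

This is why connection churn (rather than any single expensive query) can peg a core: the cost is paid per authentication, not per operation.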
[05:55] <wgrant> axw: How do I do the goroutine thingy?
[05:55] <axw> 1 sec, need to refresh my own memory
[06:00] <axw> wgrant: on the controller, "juju-goroutines"
[06:00] <axw> and "juju-cpu-profile" will give you a cpu profile
[06:00] <wgrant> axw: Wow that's easier than I expected
[06:07] <wgrant> axw: https://private-fileshare.canonical.com/~wgrant/juju-prod-ols-snap-store-20180110-saslStart-storm.tar.gz
[06:07] <wgrant> I'll add it to the bug as well, just wanting to gather as much as possible before I restart it.
[06:10] <axw> wgrant: thanks
[06:24] <axw> wgrant: no smoking gun in that data AFAICS :(
[06:30] <wgrant> axw: :(
[13:22] <tasdomas> hi
[13:22] <tasdomas> how do I run juju acceptance tests on my machine?
[14:19] <balloons> tasdomas, you should be able to just run most of them
[14:19] <tasdomas> balloons, I thought so too
[14:19] <balloons> Some may require arguments, but the tests try to be smart in most cases and pick default values
[14:34] <balloons> tasdomas, I assume you are looking at the assess_budget tests?
[14:34] <tasdomas> balloons, yeah
[14:35] <tasdomas> balloons, however, I'm getting this when I try to run tests: https://pastebin.ubuntu.com/26360414/
[14:35] <tasdomas> balloons, that's with a virtualenv, with requirements.txt installed
[14:36] <balloons> tasdomas, well, that's simple enough. Looks like you are missing python-fixtures
[14:36] <balloons> tasdomas, the requirements.txt is a bit of a lie atm, as you also need the packages in juju-ci-common
[14:37] <balloons> tasdomas, these packages: https://github.com/juju/juju/blob/develop/acceptancetests/juju-ci-tools-common
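A quick way to check a virtualenv for the extra dependencies before running the tests (the module list here is illustrative; the authoritative list is in the juju-ci-tools-common file linked above):

```python
import importlib.util

# 'fixtures' is the module provided by the python-fixtures package
# mentioned above; 'yaml' is a stand-in for the other common deps.
for mod in ["fixtures", "yaml"]:
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'ok' if found else 'MISSING'}")
```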
[14:39] <tasdomas> balloons, thanks - I'll keep looking into this
[14:40] <balloons> tasdomas, sorry for the trouble. But thanks for updating the tests as well. If you continue to have trouble let me know
[14:40] <tasdomas> balloons, I will