[01:24] <jbicha> why are there still raring images on http://cdimage.ubuntu.com/ubuntu-gnome/daily-live/current/ ?
[01:26] <infinity> jbicha: A curious oops in how publishing scripts work, probably.  Can fix.
[01:28] <infinity> Ahh, could be because 'current' is a directory instead of a symlink.  Double oops.
[01:29] <jbicha> check the other flavors too :)
[01:43] <infinity> jbicha: That might be a bit tidier.
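The "current is a directory instead of a symlink" oops above can be sketched as follows. This is a minimal illustration, not the real cdimage.ubuntu.com publisher tree: the paths and ISO name are hypothetical. The point is that if `current` is a real directory, stale images linger inside it, whereas a symlink just repoints to the newest dated build on each publish.

```shell
#!/bin/sh
# Sketch only: hypothetical layout, not the actual cdimage publisher scripts.
set -e
rm -rf cdimage-demo
mkdir -p cdimage-demo/daily-live/20130516         # newest dated build
mkdir -p cdimage-demo/daily-live/current          # bug: 'current' is a real directory
touch cdimage-demo/daily-live/current/raring.iso  # stale image trapped inside it

# Fix: replace the directory with a symlink to the newest dated build,
# so future publishes only repoint one link instead of copying files.
rm -rf cdimage-demo/daily-live/current
ln -s 20130516 cdimage-demo/daily-live/current
ls -ld cdimage-demo/daily-live/current
```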
[02:20] <RAOF> Gah!
[02:20] <RAOF> “We've tested this and haven't found any regressions” is *not* useful in the [Regression Potential] section of an SRU bug.
[02:21] <StevenK> You're lagging, that was on G+ a whole two minutes ago. :-P
[08:01] <seb128> stokachu, stgraber, RAOF: seems like that gnome-keyring precise SRU that just went in proposed has buggy breaks/replaces version
[08:01] <seb128> SRU team: ^ maybe it should be removed from proposed?
[08:01] <seb128> (cf comments on the SRU bug)
[08:40] <infinity> seb128: Removed.
[08:40] <seb128> infinity, thanks
[08:40] <infinity> seb128: If you fix it (or get someone else to), feel free to nudge me to get it to the front of the review line, since it should be a few-character fix.
[08:41] <seb128> infinity, k, thank you ;-)
[09:09]  * xnox ponders how big the deb-src repository for raring main/ and restricted/ would be => converted into PDFs
[09:12] <StevenK> xnox: You want to work it out?
[09:13] <cjwatson> *converted into PDFs*?
[09:14] <xnox> cjwatson: double-checking, yes correct. "Would it be possible to also have an overall source code report with just the first and last 30 pages of the code for 13.04 as a whole?"
[09:14] <Laney> ?!
[09:15] <xnox> I think it's an unreasonable request, I'll just wait for steve to wake up.
[09:15] <infinity> xnox: I don't even know what that means.
[09:15] <infinity> "First and last 30 pages..."
[09:16] <Laney> someone asked you for that?
[09:16] <infinity> xnox: So, I guess you just need 30 pages from aalib and 30 pages from zsync (or whatever).
[09:16] <xnox> well I generated a few reports for selected binaries & selected source packages, yesterday. that's a followup request.
[09:16] <Laney> what is this person's goal?
[09:17] <infinity> xnox: I don't think it's unreasonable so much as it highlights that the person doesn't know what they're asking for.
[09:24] <brendand> xnox, that's hilarious :)
[09:24] <brendand> xnox, they must think 'Ubuntu' is one big program
[09:25] <infinity> brendand: It's not?
[09:25] <cjwatson> xnox: Er.  Is this a request you actually need to pay attention to?
[09:26] <brendand> infinity, depends on how you say it :P
[12:34] <ScottK> Is there a chance the fix to provide a rejection rationale broke sending the reject email itself?  The uploader of oxygen-gtk3 that I just rejected said he didn't get mail.
[12:34] <ScottK> Where "just" means about 20 minutes ago.
[12:37] <infinity> ScottK: It's been working for me...
[12:37] <infinity> StevenK: ^
[12:39] <infinity> ScottK: Of course, the last time I blamed soyuz for my not getting mail (like, around 12 hours ago), it turned out that my MTA was busted. :P
[12:40] <cjwatson> infinity: It might also matter that you're in ~launchpad, conceivably
[12:40] <infinity> cjwatson: Potentially, though I'm the only AA that is, so easy enough to test with someone else rejecting an upload.
[12:41] <infinity> (And if it wasn't sending mails at all, you'd think someone else would have noticed?)
[12:41] <cjwatson> Rejections are rare-ish
[12:48] <StevenK> infinity: You are not, we both are.
[12:49] <StevenK> ScottK: No, my changes didn't drill that far down.
[12:49] <infinity> StevenK: Oh, right, but I should remove you from that group, shouldn't I?
[12:50] <StevenK> The string that was passed down the stack changed; the bottom layer that actually sends the mail did not. Perhaps wgrant's changes broke it.
[12:50] <infinity> StevenK: (archive, not launchpad)
[12:50] <StevenK> infinity: I hope not.
[12:52] <wgrant> StevenK, infinity: My changes only touched copies.
[12:52] <StevenK> wgrant: Your ZCML mail changes, perhaps
[12:53] <wgrant> Oh, that, possibly.
[12:53] <wgrant> But let me see.
[12:53] <wgrant> Nope, works fine
[12:54] <StevenK> We don't have logs for rejection sending, since that happens in-request.
[12:54] <wgrant> Right, the only resource would be MTA logs.
[13:14] <ScottK> Thanks.
[13:14] <ScottK> shadeslayer: ^^^
[13:15] <shadeslayer> huh
[13:15] <shadeslayer> well, I have 2 mails missing
[13:15] <shadeslayer> one from a kde4libs upload to raring in ~kubuntu-ninjas
[13:15] <shadeslayer> and the rejection email from ScottK
[13:16] <ScottK> So you got the rejection mail?
[13:16] <ScottK> That came from LP, not me.
[13:16] <shadeslayer> nope, I'm missing both of them
[13:16] <infinity> A missing reject and accept?
[13:16] <ScottK> Ah.
[13:16] <shadeslayer> a missing reject and an upload I did to a PPA
[13:16] <infinity> Have you gotten any other mail since?
[13:16] <shadeslayer> both of those are missing
[13:17] <shadeslayer> infinity: any other mail from LP? nope, mail from mailing lists? YES
[13:17] <shadeslayer> whoops @ Caps
[13:17] <infinity> To the same address LP would be mailing you at?
[13:18] <shadeslayer> yep
[13:18] <shadeslayer> I use rohangarg@kubuntu.org and that's working fine
[13:19] <shadeslayer> oh
[13:19] <shadeslayer> funsies
[13:19] <shadeslayer> infinity: http://paste.kde.org/759116/
[13:20] <shadeslayer> so somehow I got deleted from the virtual alias table :D
[13:20] <shadeslayer> probably explains mail loss
[13:20] <infinity> Neat.
[13:22] <infinity> shadeslayer: I'll ask around and see if anyone can sort out why.
[13:22] <shadeslayer> thx :)
[13:28] <infinity> shadeslayer: You still exist, but you might have a sliiiight mail loop.  They're looking into it.
[13:28] <infinity>  /etc/postfix/kubuntu.org:rohangarg@kubuntu.org rohangarg@ubuntu.com
[13:28] <infinity>  /etc/postfix/ubuntu.com:rohangarg@ubuntu.com rohangarg@kubuntu.org
[13:28] <infinity> Why postfix returns "does not exist" instead of "exists excessively", I dunno.  I blame ScottK and lamont.
[13:29] <shadeslayer> O_O
[13:29] <infinity> shadeslayer: Did you recently change your primary email address on launchpad, by any chance?
[13:29] <shadeslayer> nope
[13:29] <shadeslayer> haven't touched that in years
[13:29] <infinity> shadeslayer: (This is me shooting in the dark while IS goes hunting)
[13:29] <infinity> shadeslayer: Kay.  I'll let them sort it out, then.
[13:30] <shadeslayer> thanks a lot for the update
[13:30] <ScottK> Fun.
[13:31] <lamont> infinity: it's going to notice the loop during lookup, and reject the address
[13:32] <infinity> lamont: Sure, I understand that it's probably pulling it from the lookup tables due to the loop, so it really "doesn't exist".  Just a shame it can't be more helpful on the topic.
[13:33] <lamont> good point.  not sure the api between the two daemons involved lets it be that granular.  either way, sounds like an enhancement-requesting bug is in order
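lamont's point about noticing the loop during lookup can be illustrated with a toy alias resolver over a flat file. This is a simplification for illustration only: real postfix virtual alias resolution is more involved, and `aliases.txt` plus the `delivers to`/`loop at` output format are invented here. It shows why the two entries pasted above can never resolve to a final destination.

```shell
#!/bin/sh
# Toy model of the kubuntu.org <-> ubuntu.com alias loop shown above.
cat > aliases.txt <<'EOF'
rohangarg@kubuntu.org rohangarg@ubuntu.com
rohangarg@ubuntu.com rohangarg@kubuntu.org
EOF

# Follow aliases until we reach an address with no mapping (deliverable)
# or revisit an address we have already seen (a loop).
resolve() {
    addr=$1
    seen=""
    while :; do
        case " $seen " in
            *" $addr "*) echo "loop at $addr"; return 0 ;;
        esac
        seen="$seen $addr"
        next=$(awk -v a="$addr" '$1 == a { print $2 }' aliases.txt)
        if [ -z "$next" ]; then
            echo "delivers to $addr"
            return 0
        fi
        addr=$next
    done
}

resolve rohangarg@kubuntu.org   # prints "loop at rohangarg@kubuntu.org"
```

Because the walk can only end in "loop", the resolver has no deliverable address to hand back, which is consistent with postfix reporting the user as nonexistent rather than "exists excessively".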
[13:54] <stgraber> seb128: thanks for spotting. I'll let stokachu/dobey get me (or some other sponsor) a fixed debdiff... Sorry for that, I did test build it locally but took stokachu's word that this was all tested; clearly it wasn't, or not properly...
[13:54] <seb128> stgraber, no worry and yw ;-)
[16:45] <adam_g_> Anyone in the SRU team able to take a look at releasing the verified OpenStack packages in raring and quantal? (nova, glance, cinder, horizon, quantum) They are all part of an MRE, and some will be getting stomped on by another security update next week (3rd time that has happened with this batch)
[16:48] <infinity> adam_g_: If the whole lot is ready, sure...
[16:52] <infinity> adam_g_: Bug #1150720 still seems a bit sketchy on the paramiko situation.
[16:52] <ubot2`> Launchpad bug 1150720 in Cinder "[SRU] There is now a dependency on paramiko v1.8.0" [High,In progress] https://launchpad.net/bugs/1150720
[16:55] <infinity> adam_g_: I'm a bit confused by the claim that something needs to be backported for paramiko, but that only appears to have happened in precise and raring, not quantal where the rest of this is being updated...
[17:01] <adam_g> jeez. freenode.
[17:01] <adam_g> infinity, well, we've backported the "known fix" to our paramiko 0.7.x package to work around the issue, and i'm not sure what i was reproducing was the same bug. in any case, it's a paramiko bug and not a cinder bug
[17:02] <infinity> adam_g: Erm, but you didn't backport the fix to quantal.
[17:02] <adam_g> infinity, patching out the paramiko version req. in cinder's pip-requires file would be needed regardless of the fix we SRU to paramiko
[17:04] <infinity> adam_g: If updating cinder without fixing paramiko is going to cause this bug to surface, that seems suboptimal.  Pretty please, can we get paramiko SRUed too?
[17:10] <adam_g> infinity, hmm yea. i can prepare the paramiko SRU but was having no luck reproducing and testing the issue.
[17:11] <infinity> adam_g: And yet, there are claims that the issue exists.  Can anyone reproduce it (upstream, perhaps?)
[17:11] <adam_g> zul, ^
[17:12] <zul> adam_g: i haven't been able to either
[17:12] <infinity> adam_g: Anyhow, please prep the Q SRU with the proper backported patch and I'll review that it's the same as precise's.
[17:12] <infinity> adam_g: And if you can hunt down someone who can reproduce the bug, awesome.  If you can't, then a bit of regression testing will suffice.
[17:14] <adam_g> infinity, i'll hit them up today. is there any chance we can track this as a separate paramiko issue that isn't blocking cinder? the paramiko bug gets triggered in a proprietary storage driver that we have no way of testing.
[17:15] <infinity> adam_g: I'd rather release the lot together, but there's no reason this needs a 7-day wait or anything.  If someone can put it through some quick testing, that's cool.
[17:15] <infinity> adam_g: The fact that the patch is already in raring and saucy is helpful here.
[17:17] <adam_g> infinity, cool. i'll get something in queue for -proposed after i go visit the iced coffee man
[17:19] <infinity> adam_g: Stellar plan.  I might go find some breakfast.
[18:35] <adam_g> infinity, ^ paramiko
[18:53] <infinity> adam_g: Thanks, accepted.
[23:45] <adam_g> infinity, commented on bug #1150720 and flipped the tag. hopefully that paves the way for the two openstack batches (quantal + raring) to go out before the weekend.
[23:45] <ubot2`> Launchpad bug 1150720 in Cinder "[SRU] There is now a dependency on paramiko v1.8.0" [High,In progress] https://launchpad.net/bugs/1150720