[01:02] <Severity1> hi i need help. :)
[01:03] <CardinalFang> hi.  we need descriptions.
[01:03] <Severity1> https://bugs.launchpad.net/bugs/515153
[01:05] <Severity1> and here https://bugs.launchpad.net/bugs/516596
[01:05] <Severity1> i just wanna make sure that im doing the right thing here
[01:08] <dobey> i don't know enough about which bug it is, but i don't think fix released is right (proposed is not released), and it's probably a dup
[01:09] <Severity1> dobey:  yes i think so too
[01:11] <Severity1> i checked the changelog of proposed packages and it seems nothing fits the description
[01:13] <dobey> user descriptions usually are either not descriptive enough, or way too chatty :)
[01:13] <Severity1> lol
[01:14] <Severity1> okay new update emmet hickory just marked it as New => Triaged
[01:14] <dobey> oh it's probably a dup of the NoAccessToken bug
[01:14] <dobey> based on syncdaemon-exceptions.log anyway
[01:14] <dobey> for karmic, yes
[01:16] <Severity1> normally u1sync --authorize fixes client to server sync issues
[02:09] <habi> can anyone tell me how to synchronise a complete folder?
[03:05] <statik> hey jamesh, how is it going?
[03:06] <statik> i was wondering whether it's ok to do a 0.2 release of django-openid-auth now, and whether you want to do it or if I should
[03:12] <jamesh> statik: bugger.  I got tied up with other stuff earlier.  I'll do the release now: the trunk seems to be fairly stable given the testing on edge
[03:13] <statik> jamesh: ah cool, thanks
[03:14] <statik> that will give a good week or two for me to get it into lucid before feature freeze
[03:14] <statik> i want to try and convert to the new 3.0 sourcepackage
[03:14] <statik> the new format, that is
[03:15] <jamesh> as opposed to the Python version :)
[03:22] <statik> :) indeed
[03:24] <jamesh> statik: here it is: https://launchpad.net/django-openid-auth/trunk/0.2
[03:26] <statik> jamesh: awesome, thanks!
[03:51] <duanedesign> +5555555555555555555555555555555555555555555555555555555555555555
[03:52] <duanedesign> uh, oh
[03:53] <duanedesign> looks like my cats got on my keyboard while i was away. Sorry for any channel flood.
[07:40] <Severity1> ping duanedesign
[11:50] <vtech> Hello, have a question
[11:50] <vtech> If I have a couchdb database installed
[11:50] <vtech> and I put data into that couchdb
[11:50] <vtech> will it be replicated via ubuntuone service?
[11:50] <vtech> I know that there is sth like desktopcouch to synchronize application data
[11:50] <vtech> but If I put data into couchdb on standard couchdb port
[11:50] <vtech> will it be somehow collected by desktopcouch client, and then synchronized by ubuntuone
[11:51] <beuno> Chipaca, would you know?  ^
[11:51] <Chipaca> vtech: no, it wouldn't
[11:51] <vtech> hmm
[11:52] <vtech> chipaca, I have to use desktopcouch
[11:52] <Chipaca> right
[11:52] <vtech> chipaca, Is there c++ client for desktopcouch?
[11:55] <vtech> chipaca, I know that it was developed in python, but I do not know if there is a c++ port for that client ?
[11:55] <Chipaca> vtech: desktopcouch?
[11:56] <vtech> chipaca, yhy
[11:56] <vtech> Chipaca, yhy -means yes :P
[11:56] <Chipaca> nope, no python
[11:58] <vtech> Chipaca, desktopcouch is not written in Python ?
[11:59] <vtech> Chipaca, but I saw sourcecodes of desktopcouch in Python
[11:59] <Chipaca> vtech: sorry, I meant, just python
[12:00] <vtech> Chipaca, So I have to make my own wrapper :(
[14:04] <statik> hello hello
[14:04] <__lucio__> hello
[14:04] <rmcbride> hi statik
[14:04] <nessita> hi statik
[14:06] <thisfred> yo
[14:06] <statik> so i'm supposed to talk about packaging
[14:06] <statik> i'm sorry i didn't get lernid working
[14:06] <statik> i had trouble with the iCal part
[14:06] <facundobatista> Hola statik
[14:06] <statik> so we have 5 people! thats great
[14:07] <statik> did anyone make it through that incredibly aggressive list I sent out?
[14:07]  * statik was being optimistic
[14:07] <pfibiger> i'm here!
[14:07] <statik> ah great, 6 :)
[14:07] <rmcbride> well I did, apart from putting desktop couchdb on my PPA, but I've uploaded it several times already, so ;)
[14:07] <statik> rmcbride, thats fine :)
[14:08] <thisfred> statik: putting the finishing touches to pbuilder
[14:08] <statik> facundobatista, you mentioned that my list of assumptions you were missing a couple things, maybe I can help explain those? how far did you get on the list?
[14:08] <statik> thisfred: great
[14:08] <nessita> statik: I have a question! what's "run through setup-packaging-environment"?
[14:09] <thisfred> for me at least, putting stuff in ~/bin doesn't put it in my path?
[14:09] <nessita> thisfred: you have to add $HOME/bin to your PATH
[14:09] <statik> nessita, there is a command named setup-packaging-environment, it will help with configuring some things
[14:09] <thisfred> nessita: run that command, and it'll ask you a series of questions
[14:09] <nessita> thisfred: it doesn't happen automatically
[14:09] <__lucio__> pbuilder is taking its time
[14:09] <statik> thisfred, you can also use /usr/local/bin instead, i just like keeping stuff in my ~
[14:09] <nessita> statik: I don't have that command, at least not in my PATH
[14:10] <thisfred> nessita: which is good (I think it used to, and I got bitten by that once)
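The ~/bin advice above can be sketched as a shell snippet; which rc file to persist it in depends on your shell (bash shown here as an assumption):

```shell
# Prepend ~/bin to PATH for the current shell session
export PATH="$HOME/bin:$PATH"

# To make it permanent, add that same export line to your
# shell's rc file (e.g. ~/.bashrc for bash)
```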
[14:10]  * jblount just saw the list and is making his way through it
[14:10] <facundobatista> statik, I did a setup.py like 8 years ago, really don't have a clue what's needed nowadays
[14:10] <thisfred> nessita: then you missed the previous step: install ubuntu-dev-tools
[14:10] <facundobatista> statik, and pbuilder is still doing its stuff (and messing with my home dir)
[14:10] <statik> thats totally ok
[14:10] <statik> that stuff can run in parallel while we talk
[14:11] <nessita> thisfred: I did
[14:11] <statik> so, I thought it would be best to start with a practical example of a packaging task, one that I was working on last night
[14:11] <statik> nessita, i wonder if the command is only added in the lucid version of ubuntu-dev-tools
[14:11] <thisfred> nessita: weird, then I'm out of clues
[14:12] <alecu> nessita, apt-get install ubuntu-dev-tools
[14:12] <statik> so, a very common task that you would do during the dev part of the ubuntu cycle (before feature freeze) is to update an existing package to a new upstream version
[14:12] <nessita> alecu: I did -.-
[14:12] <rmcbride> statik: I think that's the case. it's there on my laptop (lucid) but not on the new machine (still karmic)
[14:12] <statik> last night jamesh cut the 0.2 release of python-django-openid-auth
[14:12] <alecu> nessita, oh, wrong scrollback :P
[14:12] <nessita> alecu: ubuntu-dev-tools is already the newest version. :)
[14:12] <statik> so, we will go through updating that package, using the brand new ubuntu distributed development systems based on bzr
[14:13] <statik> I don't remember if I asked everyone to install bzr-builddeb, but you will need that package also
[14:13] <statik> I think everyone is familiar with how launchpad organizes branches by lp:~<owner>/<project>/<name>
[14:14] <statik> for using bzr with ubuntu packaging, a new parallel namespace has been set up
[14:14] <statik> every single package in ubuntu and debian has been imported into bzr
[14:14] <statik> and every time someone uploads a new version of a package, it's imported into bzr automatically, creating some nice history
[14:14] <statik> this means that we can also work on new versions of packages directly in bzr, and post merge proposals rather than attaching a debdiff to a bug
[14:15] <statik> I'll start with how a release is usually created for a python project. so, everyone go to a temp directory and do 'bzr get lp:django-openid-auth'
[14:16]  * nessita is bzr-getting
[14:16] <statik> let me know once that has completed, and i'll explain how upstream would create a release tarball from this using setup.py
[14:16] <__lucio__> done
[14:16] <facundobatista> done
[14:16] <thisfred> done
[14:16] <rmcbride> done
[14:17] <pfibiger> done
[14:17] <nessita> done
[14:17] <__lucio__> pbuilder is still on "d", so it will take some time :)
[14:17] <jblount> done
[14:18] <__lucio__> hey, theres no "debian" directory here
[14:18] <nessita> __lucio__: "here"?
[14:18] <thisfred> yeah, I've noticed the archives have been super slow lately
[14:18] <alecu> done
[14:18] <__lucio__> nessita, inside django-openid-auth
[14:18] <nessita> __lucio__: oh, right
[14:18] <statik> exactly! current standard practice is that in the upstream project, you don't include the debian directory, and that debian directory is not included in the tarball that the upstream project releases
[14:19] <alecu> btw: I'm having the same problem nessita does. I'm on karmic, and setup-packaging-environment does not show up.
[14:19] <statik> the work of the packager is to take the original upstream tarball, and add the debian directory
[14:19]  * facundobatista doesn't have setup-packaging-environment either, but pbuilder didn't finish yet... is it supposed to bring it?
[14:19] <statik> alecu, nessita: you can probably bzr get lp:ubuntu-dev-tools to get the upstream source and get setup-packaging-environment script right out of the bzr branch
[14:20] <alecu> statik, ok.
[14:20] <statik> so, back to django-openid-auth - take a look at setup.py, line 50
[14:20] <nessita> statik: ack
[14:20] <statik> this is actually more complicated than most setup.py files are
[14:20] <statik> but you can see some simple things like author, license info, and version number
[14:21] <statik> a few lines down, packages= explains what python modules this will install
[14:21] <statik> and package_data= explains that there are some extra data files that need to be included as well
[14:21] <statik> if you were working on a brand new project, you might need to create a similar (or simpler) setup.py and propose it to upstream
[14:22] <statik> now, close out that file and run 'python setup.py sdist'
[14:22] <__lucio__> statik, is installing just "install dependencies, put files in correct places, run custom shell commands", or is there more magic to it?
[14:22] <statik> and then look at the contents of the dist/ directory
[14:22] <statik> __lucio__, thats it. the only magic is a set of rules or policy about how packages are supposed to work
[14:23] <statik> setup.py is nothing to do with ubuntu/debian, thats all pure python, but the ubuntu packaging system knows how to get all the info out of setup.py so you don't have to repeat it
[14:23] <facundobatista> nice
[14:23] <statik> you should be seeing a tarball in the dist/ directory
[14:24] <nessita> yes
[14:24] <nessita> django-openid-auth-0.2.tar.gz
[14:24] <statik> congrats, this is a tarball release! now, if you were the upstream release manager, you would upload that tarball to pypi and launchpad
[14:24] <statik> now, we switch hats and look from the perspective of an ubuntu developer rather than a django-openid-auth developer
[14:25] <facundobatista> statik, question
[14:25] <statik> earlier, I talked about the namespace for bzr branches on launchpad
[14:25] <statik> facundobatista, sure
[14:25] <facundobatista> statik, the upload to LP is to the project home page (or something about releases under it), or to the PPA?
[14:25] <statik> facundobatista: for the tarball, that would be to the releases area of the project home page, and to the pypi package index on python.org
[14:26] <facundobatista> statik, ok
[14:26] <statik> once we turn this into an ubuntu sourcepackage, then we will upload to the PPA
[14:26] <facundobatista> great!
[14:26] <__lucio__> statik, where do i upload tarballs to lp?
[14:26] <__lucio__> ah
[14:27] <statik> __lucio__, check out the lp-project-upload command in ubuntu-dev-tools as well
[14:27] <statik> ubuntu developers are lazy, so if you are doing manual work for this kind of stuff you are making a mistake :)
[14:28] <thisfred> I am going to practice this all on my pet project :)
[14:28] <thisfred> I also brought an apple for statik
[14:28] <statik> the namespace for ubuntu distributed development bzr branches (UDD) is lp:~<owner>/<distro>/<series>/<packagename>/<branchname>/
[14:29] <facundobatista> thisfred, me too, I want to have a PPA for lalita
[14:29] <__lucio__> ppa for cocos2d!
[14:29] <statik> this is a crazy set of namespace things to remember, but it gets easier
[14:29] <statik> there are some good shortcuts available though
[14:29] <facundobatista> statik, one example of that namespace?
[14:29] <statik> one thing to remember is that the ubuntu package name may be different from the upstream package name
[14:29] <statik> upstream project name I mean
[14:30] <thisfred> right, the python- prefix for one
[14:30]  * rtgz just caught up :)
[14:30] <thisfred> hi rtgz!
[14:30] <statik> one example is lp:~ubuntu-branches/ubuntu/lucid/python-django-openid-auth/lucid
[14:30] <statik> you can subst karmic for lucid
[14:30] <statik> you can also swap ubuntu for debian, and lucid for sid
[14:31] <statik> the shortcut is 'bzr get lp:ubuntu/python-django-openid-auth', but wait one moment before running that
[14:31] <statik> when you use that bzr shortcut, the 'ubuntu' prefix says "this is a package, ok? look it up in the current dev version of ubuntu"
[14:32] <statik> I keep my branches organized a certain way so I don't get confused when comparing between ubuntu hardy, karmic, lucid, debian sid, etc.
[14:32] <statik> so, i recommend this layout (you can change it of course): mkdir -p ~/udd/ubuntu/lucid
[14:32] <statik> cd ~/udd/ubuntu/lucid
[14:32] <statik> bzr get lp:ubuntu/python-django-openid-auth
[14:32] <statik> this will give you the current version of python-django-openid-auth that is in lucid
[14:33] <statik> you'll notice that branch is different, it has a debian/ directory in it :) this branch is the result of some special bzr import stuff that unpacks a sourcepackage and saves the whole thing in bzr
[14:33] <statik> let me know when everyone has that branch, and is inside the python-django-openid-auth directory
[14:33] <rmcbride> done
[14:33] <thisfred> done
[14:33] <__lucio__> done
[14:34] <facundobatista> statik, why ~/udd/ubuntu/lucid, because you're in lucid, or the bzr get will bring lucid stuff
[14:34] <facundobatista> ?
[14:34] <statik> once there, take a peek at debian/changelog, to confirm that the last changelog is what you expect
[14:34] <statik> facundobatista, because bzr get will bring lucid stuff
[14:34] <rtgz> Branched 2 revision(s).
[14:34] <statik> you don't have to be on lucid to do this, you could be running sid
[14:34] <nessita> done
[14:34] <facundobatista> statik, ok
[14:34] <statik> changelogs are kind of special and important in packages
[14:35] <alecu> facundobatista, I understand that because of the shortcut, that gets the most recent version.
[14:35] <facundobatista> alecu, and how did you know the most recent version was a lucid one?
[14:35] <statik> the version number and series (lucid) that you see at the top of this changelog file controls the version number of the package that will be built
[14:35] <rtgz> python-django-openid-auth (0.1-0ubuntu1) karmic; urgency=low ?
[14:36] <alecu> facundobatista, I mean, the most recent series of a distribution
[14:36] <statik> rtgz, exactly. thanks for pasting! i want to explain the different parts of this version string
[14:36] <statik> the first part is the sourcepackage name, that has to be perfect
[14:36] <statik> then, inside the ()
[14:36] <statik> 0.1 is the upstream release number
[14:36] <statik> this is from the tarball that we would have downloaded from pypi
[14:36] <__lucio__> where did all of this come from?
[14:36] <__lucio__> was it manually written?
[14:37] <__lucio__> its just lp magic?
[14:37] <__lucio__> what can i change?
[14:37] <statik> __lucio__, most of this was manually written because i didn't know the lazy shortcuts yet
[14:37] <statik> we will change it in just a moment
[14:37] <__lucio__> but it is just a branch, right?
[14:37] <statik> yes
[14:38] <statik> after the 0.1, the -0 means that this package was never in debian
[14:38] <statik> after that, the ubuntu1 means this is the first version of this package in ubuntu
[14:38] <statik> if I patched something in this package, I would change the version number to 0.1-0ubuntu2
[14:39] <statik> if I was working in my PPA, preparing something that would eventually get uploaded to ubuntu, I would add a magic suffix
[14:39] <statik> so, 0.1-0ubuntu2~karmic1 would be what i would use for making a version for karmic in my ppa
[14:39] <statik> the next version I upload to my ppa would be 0.1-0ubuntu2-karmic2
[14:39] <rmcbride> statik: check that string
[14:39] <verterok> statik: hi, question about versions
[14:39] <thisfred> ~ rather than -, right?
[14:39] <statik> if I wanted to backport that set of changes to hardy, I would change the version number to 0.1-0ubuntu2-hardy1
[14:40] <statik> oh right
[14:40] <statik> if I wanted to backport that set of changes to hardy, I would change the version number to 0.1-0ubuntu2~hardy1
[14:40] <statik> and 0.1-0ubuntu2~karmic2
[14:40] <statik> the ~ is special
[14:40] <rmcbride> Dashes where tildes should go cause heartache and woe
[14:41] <statik> what it means is that if dpkg is going to compare packages to see which one is newer (and should be installed as an upgrade), the ~ allows you to say one package is older
[14:41] <statik> so, 0.1-0ubuntu2 is ALWAYS newer than 0.1-0ubuntu2~anything
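dpkg exposes this comparison directly on the command line, so the tilde rule above can be checked (this assumes a Debian/Ubuntu system where dpkg is installed):

```shell
# ~ sorts before the empty string, so a ~suffix version is always older
dpkg --compare-versions "0.1-0ubuntu2~karmic1" lt "0.1-0ubuntu2" \
  && echo "tilde version is older"

# a plain revision bump compares newer as expected
dpkg --compare-versions "0.1-0ubuntu2" gt "0.1-0ubuntu1" \
  && echo "ubuntu2 is newer"
```

`dpkg --compare-versions A lt B` exits 0 when A is older than B, which makes it handy for sanity-checking a PPA versioning scheme before uploading.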
[14:41] <__lucio__> what demons would i be invoking if i put a % in that string?
[14:42] <thisfred> aren't &s demons?
[14:42] <statik> there are a bunch of crazy complicated rules around version numbers
[14:42] <rmcbride> thisfred: good point
[14:42] <statik> it's the easiest thing to screw up, and will make your life miserable
[14:42] <__lucio__> statik, aren't they just strings that get sorted to figure out what's the latest, with the rest just conventions? or are there non-humans parsing them?
[14:42] <statik> so i'll stop talking about version numbers now, you can read in the man page for dpkg to get lots more info
[14:42] <statik> __lucio__, these get parsed by all kinds of tools
[14:42] <__lucio__> yuck
[14:43] <statik> they must be *perfect*
[14:43] <statik> they are also critical for upgrades working correctly between versions of ubuntu
[14:43] <statik> so now we have heard there is a new upstream release of django-openid-auth
[14:43] <statik> so, lets get the new tarball!
[14:43] <statik> there is a tool to help with this
[14:43] <statik> look at debian/watch
[14:44] <nessita> yes
[14:44] <statik> this is a simple pattern that the uscan tool will use to go look at a website and see if there is a new version
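A debian/watch file is just a version marker plus a URL pattern with a capture group for the version. The pattern below is a hypothetical example for a Launchpad-hosted project; the real watch file for this package may differ:

```shell
# Write a sample debian/watch (illustrative pattern only)
cat > watch.sample <<'EOF'
version=3
https://launchpad.net/django-openid-auth/+download .*/django-openid-auth-([0-9.]+)\.tar\.gz
EOF
cat watch.sample
```

uscan fetches the page, matches the regex against the download links, and compares the captured group against the version in debian/changelog to decide whether a newer upstream release exists.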
[14:44] <statik> lets try it out. run 'uscan --verbose --rename'
[14:45] <facundobatista> uscan: you must have the libcrypt-ssleay-perl package installed
[14:45] <thisfred> needs libcrypt-ssleay-perl
[14:45] <nessita> -- Scanning for watchfiles in .
[14:45] <nessita> uscan: No debian directories found
[14:45] <__lucio__> nessita, go inside the branch
[14:45] <rmcbride> Successfully downloaded updated package django-openid-auth-0.2.tar.gz
[14:45] <rmcbride>     and renamed it as python-django-openid-auth_0.2.orig.tar.gz
[14:45] <nessita> __lucio__: yes, I noticed :-)
[14:46] <nessita> statik: shouldn't libcrypt-ssleay-perl be a dependency of ubuntu-dev-tools?
[14:46] <rtgz> done, got the same output as rmcbride
[14:46] <statik> nessita, maybe it is an optional dependency for uscan
[14:46] <statik> many places publish their tarballs on http or ftp
[14:46] <statik> launchpad uses https, so thats why the ssl dependency
[14:47] <statik> we can look at the uscan package later, maybe thats a great bug for you to fix :)
[14:47] <nessita> statik: yey!
[14:47] <statik> you will notice that the tarball was renamed, this has to happen in exactly this pattern so that the packaging tools can find the tarball
[14:47] <__lucio__> statik, so, libssleay should be shown in apt-cache show ubuntu-dev-tools somewhere?
[14:48] <statik> perhaps
[14:49] <statik> i'm not sure which package holds uscan
[14:49] <rtgz> devscripts
[14:49] <facundobatista> statik, where the tarball is renamed?
[14:49] <statik> facundobatista, it got renamed by uscan when it was downloaded
[14:49] <statik> so you see it is now .orig.tar.gz
[14:49] <__lucio__> facundobatista, ls ..
[14:49] <statik> and the - was changed to a _ before the version number
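The rename uscan performs follows a fixed pattern: `<sourcepackage>_<upstreamversion>.orig.tar.gz`, with an underscore (not a dash) before the version. A manual equivalent, using a dummy file to stand in for the downloaded tarball:

```shell
# simulate the upstream tarball uscan would have downloaded
touch django-openid-auth-0.2.tar.gz

# rename to the exact form the packaging tools look for:
# <sourcepackage>_<upstreamversion>.orig.tar.gz
mv django-openid-auth-0.2.tar.gz python-django-openid-auth_0.2.orig.tar.gz
ls python-django-openid-auth_0.2.orig.tar.gz
```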
[14:49] <facundobatista> __lucio__, oh
[14:50] <statik> more tools parsing version strings :/
[14:50] <statik> but, the tools make it easy to get it right most of the time
[14:50] <__lucio__> facundobatista, yes, having tools that touch ".." is like having frame hacks in python. ugly :)
[14:50] <statik> now we use a brand new bzr command
[14:50] <facundobatista> __lucio__, indeed
[14:51] <statik> this is going to import the upstream tarball into our packaging branch, including saving some pristine-tar info so that the exact tarball can be recreated from the branch later
[14:51] <statik> it will also do some smart things with merging
[14:51]  * alecu feels hunting version strings that are off by one character must be a huge timesink
[14:51] <statik> bzr merge-upstream --version=0.2 ../python-django-openid-auth_0.2.orig.tar.gz
[14:52] <facundobatista> bzr: ERROR: unknown command "merge-upstream"
[14:52] <verterok> facundobatista: I think you need bzr-builddeb installed
[14:52] <rtgz> Committed revision 2. All changes applied successfully.
[14:52] <thisfred> facundobatista: you have bzr-builddeb?
[14:52] <nessita> facundobatista: it worked for me
[14:52] <nessita> facundobatista: so it may be a missing package, not karmic's fault :-)
[14:52] <rmcbride> worked here
[14:52] <statik> now if you do bzr status, you will see a bunch of changes
[14:53] <statik> don't commit yet
[14:53] <facundobatista> ok, done
[14:53] <statik> lets go back and look at debian/changelog
[14:53] <rtgz> wow
[14:53] <statik> there should be a new entry, automatically set up
[14:53] <rmcbride> neat
[14:53] <__lucio__> statik, all this magic parsing and i still have to tell him what version it is?
[14:53] <statik> __lucio__, i know, it kills me :)
[14:54] <statik> write a patch for bzr-builddeb :)
[14:54] <__lucio__> ok, i see where this is going :)
[14:54] <statik> now i like to write extra stuff in the changelog entry
[14:54] <statik> usually if upstream is nice they have written a NEWS file
[14:55] <nessita> statik: NEWS where?
[14:55] <statik> if upstream is lazy like me and jamesh, there is no NEWS file for django-openid-auth
[14:55] <statik> so, I went and looked at the upstream changes
[14:55] <facundobatista> statik, shame of you
[14:55] <statik> :D
[14:56] <statik> the other thing that is very interesting about changelogs is you can put bug numbers in them
[14:56] <nessita> statik: where that file would be? next to changelog or next to debian?
[14:56] <statik> and, bugs are used to track sponsoring uploads of packages
[14:56] <statik> nessita, if upstream provided a NEWS file it would probably be in root
[14:57] <statik> whats cool about putting a bug number (or several) in the changelog is that when the package is uploaded to ubuntu, the bug is automatically marked as fix released
[14:57] <statik> so, I wrote a bug number about this upgrade
[14:57] <facundobatista> oh, more magic
[14:57] <statik> and my changelog entry looks like this: * New upstream release. (LP: #517400)
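The `(LP: #NNNNN)` token is the exact thing the archive's upload processing scans the changelog for; a rough sketch of how the bug number gets picked out of a changelog line (the extraction here is illustrative, not Launchpad's actual code):

```shell
line='  * New upstream release. (LP: #517400)'

# match the "LP: #<digits>" token, then strip it down to the number
bug=$(printf '%s\n' "$line" | grep -oE 'LP: #[0-9]+' | grep -oE '[0-9]+')
echo "$bug"   # → 517400
```

On upload to ubuntu, each bug number found this way is automatically marked Fix Released.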
[14:57] <__lucio__> statik, why does it say karmic everywhere?
[14:58] <alecu> nessita, NEWS is a file required by autotools (as well as README, AUTHORS and Changelog)
[14:58] <statik> __lucio__, if you are running on karmic the tool that helps write changelogs (dch) will default to that. you can change it to lucid
[14:58] <__lucio__> statik, im not
[14:58] <statik> __lucio__, i'm not sure then
[14:58] <facundobatista> statik, question
[14:58] <statik> facundobatista, go ahead
[14:59] <rtgz> statik, did you simply write the bug number in the file?
[14:59] <facundobatista> statik, I fix 20 bugs, then release, all those bug numbers are in the NEWS file
[14:59] <facundobatista> statik, should I put the 20 bug numbers in the changelog in one line?
[14:59] <statik> facundobatista: no, write it to be as nice as you can
[14:59] <statik> the changelog entry is shown by the package manager GUI tools on the desktop
[15:00] <rodrigo__> facundobatista, I do sometimes that, listing the NEWS for the release in debian/changelog, with each entry with its bug #
[15:00] <facundobatista> statik, but I want them to be marked automatically by LP as released!
[15:00] <statik> facundobatista, you can have many lines
[15:00] <rodrigo__> facundobatista, of course, for too many bugs, it might be too much :-)
[15:00] <facundobatista> ok
[15:00] <statik> facundobatista: mine is such a short line but it describes perfectly what the bug is about - new upstream release
[15:00] <statik> if you were fixing a bug you would decide how much to write based on the bug
[15:00] <statik> you do not have to list every bug
[15:00] <statik> or every change
[15:01] <statik> this is going to be very subjective, but it's a good spot to not be lazy and spend 10 minutes to try and come up with a good brief description of whats going on with the package
[15:01] <statik> other questions so far?
[15:02] <nessita> statik: me! question, you mentioned a GUI
[15:02] <statik> nessita: yes, like synaptic or software-center
[15:02] <nessita> statik: which one will be that GUI?
[15:02] <nessita> ah... I understand now
[15:03] <statik> oh, maybe we need to take a quick break to let the desktop+ team do their standup meeting. coffee! I'll be back in 5 minutes
[15:03] <rtgz> statik, the file needs to be edited manually, right? I thought it might be useful to fetch the description field of all the updates and have it updated automatically with the description line of the bzr commit
[15:03] <rtgz> not that it is actually being written by devs, but it is recommended :)
[15:03] <statik> rtgz, yes you almost always need to write things manually in the changelog
[15:03] <rtgz> statik, ok, thanks
[15:04] <thisfred> rtgz: all the commits can be way too much information, or not very informative at all, depending on the quality/quantity of commit messages
[15:04] <alecu> statik, uploading a package for one ubuntu release that marks a bug as fixed, would mark it as fixed for all releases?
[15:04] <rmcbride> it is possible to retrieve that info from LP, put it in a file and paste from there (that's how I do the changelog notes for the test ubuntuone-client packages)
[15:05] <rmcbride> provided the upstream is in LP of course :)
[15:05] <alecu> statik, is there a way I can mark it on launchpad as not yet backported to an older release?
[15:07] <statik> i'm back
[15:07] <statik> i don't see a desktop+ meeting going, so i'll keep on hogging the channel :)
[15:07] <statik> alecu, yes. the top line of the changelog says which ubuntu release the package is for
[15:07] <statik> a bug in launchpad can have many also-affects lines
[15:07] <statik> these are sometimes called 'bug tasks'
[15:08] <statik> so you can have 1 bug, with a bugtask in python-configglue, a bugtask in ubuntuone-client, and a bugtask on the ubuntu karmic package of ubuntuone-client
[15:08] <statik> and each of those may have a different status
[15:08] <alecu> oh, right! great!
[15:09] <statik> so everyone should have a changelog entry that looks something like this:
[15:09] <statik> python-django-openid-auth (0.2-0ubuntu1) lucid; urgency=low
[15:09] <statik>   * New upstream release.
[15:09] <statik>  -- Elliot Murphy <elliot@ubuntu.com>  Fri, 05 Feb 2010 09:51:41 -0500
[15:09] <statik> now we want to turn this into a sourcepackage that can be built by pbuilder or a PPA
[15:10] <statik> so run 'bzr builddeb -S'
[15:10] <statik> the -S means build a sourcepackage
[15:10] <statik> this is wrapping up a lot of smaller commands that have an infinity of options
[15:10] <statik> so if you need to do something special, it's absolutely possible, this is just handling the common case
[15:10] <__lucio__> statik, python-django-openid-auth (0.2-0ubuntu1) UNRELEASED; urgency=low ??? unreleased?
[15:11] <statik> ah, the UNRELEASED series is a special token so that you can work on this in version control and do testing, and not get mixed up about whether it has been uploaded already
[15:11] <nessita> statik: hum, something is not right in my env.
[15:11] <facundobatista> statik, should we leave it in UNRELEASED?
[15:11] <statik> when you are touching 25 different packages across debian and several versions of ubuntu and private PPAs, it is easy to get mixed up
[15:11] <statik> facundobatista, you can run dch -r to flip it from UNRELEASED to lucid
[15:11] <nessita> statik: https://pastebin.canonical.com/27500/
[15:12] <statik> when I ran bzr bd -S, I was prompted to gpg sign two files
[15:12] <thisfred> statik: it seems the options (name + email address) from the setup aren't respected
[15:12] <statik> ok
[15:12] <alecu> nessita, same problem here.
[15:12] <statik> this is a fragile part of the process it seems
[15:12] <statik> you can specify the key ID to use for signing
[15:12] <rodrigo__> nessita, I have always to add -k'rodrigo.moya@canonical.com' so that it gets the correct gpgp signature
[15:12] <statik> so, gpg --list-secret-keys
[15:13] <statik> or yes, what rodrigo said
[15:13] <rodrigo__> bzr  bd -S -k'rodrigo.moya@canonical.com'
[15:13] <rtgz> statik, gpg tells me that there's no secret key and I have exactly one gpg key which was picked up at setup-dev-thing stage
[15:13] <facundobatista> statik, hold on, please
[15:13] <__lucio__> bzr: ERROR: no such option: -k
[15:13] <statik> bzr bd -S -- -k'blah'
[15:13] <rodrigo__> yeah, -- -k...
[15:14] <facundobatista> statik, I did "dch -r", it opened a file that looked like the changelog, but modified, I closed it without saving
[15:14] <facundobatista> the changelog is untouched
[15:14] <__lucio__> yay! now i have a really dirty parent directory!
[15:14] <statik> great
[15:14] <facundobatista> statik, now dch -r opens an empty file
[15:14] <statik> facundobatista, dch -r will make changes then open debian/changelog in your $EDITOR to review and save I think
[15:14] <alecu> rodrigo__, it worked now, thanks.
[15:14] <facundobatista> oh, now dch -r opens a file with content again!
[15:14] <verterok> statik: isn't it easier to export DEBFULLNAME and DEBEMAIL?
[15:15]  * rtgz has signed the deb file for the first time...
[15:15] <statik> verterok, yes I have those in my environment, I also have some custom settings in ~/.devscripts
[15:15] <facundobatista> statik, I saved the file, but the changelog is still untouched
[15:15] <statik> huh
[15:15] <statik> facundobatista, you can just edit debian/changelog manually and change from UNRELEASED to lucid
[15:15] <facundobatista> oh, I have a debian/changelog.dch now
[15:16] <statik> so the files that were created in your parent dir
[15:16] <statik> there is a .changes, a .dsc, and a .diff.gz
[15:16] <nessita> yes
[15:16] <statik> there should also be the .orig.tar.gz
[15:16]  * rtgz notices that there is no deb file
[15:16] <statik> the .changes and .dsc are used by the system somehow
[15:17] <statik> the .diff.gz should contain a diff that is only the contents of the debian/ directory
[15:17] <statik> and the orig.tar.gz, is, of course, the orig tarball from upstream
[15:17] <statik> these are the components of a sourcepackage
[15:17] <statik> it has to get built into a binary package before it can be installed
[15:17] <statik> so, lets do that!
[15:18] <statik> pbuilder-lucid build ../python-django-openid-auth_0.2-0ubuntu1.dsc
[15:18] <__lucio__> mmh, pbuilder is still on "p"
[15:18] <thisfred> here as well
[15:18] <thisfred> well, on 'l' actually
[15:18] <statik> __lucio__, you can cheat and build a binary package directly on your dev system instead of using pbuilder. this won't help you catch missing dependencies, but it will make you feel happy that you got a package that will install
[15:18] <rtgz> "pbuilder-lucid" ?
[15:19] <statik> rtgz, sorry I think you missed that
[15:19] <rtgz> statik, ok, checking...
[15:19] <statik> I recommend making a symlink named pbuilder-lucid that points to pbuilder-dist command
[15:19] <__lucio__> statik, sure, how?
[15:19] <statik> pbuilder-dist will automatically format a pbuilder for the ubuntu dist based on the basename of the script
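The symlink setup statik recommends can be sketched like this (a sketch, assuming pbuilder-dist from the ubuntu-dev-tools package is installed and ~/bin is on $PATH):

```shell
# pbuilder-dist reads the target Ubuntu release out of the name it is
# invoked as, so a symlink called pbuilder-lucid behaves like
# "pbuilder-dist lucid".
mkdir -p ~/bin
if command -v pbuilder-dist >/dev/null; then
    ln -sf "$(command -v pbuilder-dist)" ~/bin/pbuilder-lucid
fi

# then, once per release:  ~/bin/pbuilder-lucid create
# and for each build:      ~/bin/pbuilder-lucid build ../python-django-openid-auth_0.2-0ubuntu1.dsc
```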
[15:20] <statik> __lucio__, bzr builddeb with no -S
[15:20] <rmcbride> pbuilder-dist == Awesome sauce with internet frosting
[15:20] <statik> we could have just uploaded this source package directly to a PPA, but it's not so easy to iterate and fix mistakes that way
[15:20] <thisfred> W: python-django-openid-auth source: out-of-date-standards-version 3.8.0 (current is 3.8.3)
[15:21] <nessita> wow, it failed because of python-central and now it's bringing a lot of packages
[15:21] <rmcbride> thisfred: good eye :) you'll see that a lot (W:)
[15:21] <statik> you can only upload a version number to a PPA once, but in a local pbuilder you can rebuild and rebuild while you fix the warnings
[15:21] <statik> thisfred, great catch!
[15:21] <thisfred> rmcbride yeah, I usually ignore them :)
[15:21] <rmcbride> thisfred: most of the time you CAN ignore it, but it's pointing out things that could be better (sometimes if you need to backport it can be tricky)
[15:21] <statik> we should fix that warning (and I did in the real version of this I was working on last night)
[15:22] <rmcbride> and if you want it accepted for upload you should fix :)
[15:22] <statik> so to fix the warning, edit debian/control, change Standards-Version to 3.8.3, and rebuild the sourcepackage (bzr bd -S -- -k'blah'), and rebuild in the pbuilder
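The fix-and-rebuild loop from that last line can be sketched as a short shell sequence (the sed edit is an assumption based on the warning above; the 'blah' key id is from the session):

```shell
# bump the Standards-Version field that lintian complained about
sed -i 's/^Standards-Version: 3\.8\.0$/Standards-Version: 3.8.3/' debian/control

# rebuild the source package, then rebuild it in the pristine chroot
bzr bd -S -- -k'blah'
pbuilder-lucid build ../python-django-openid-auth_0.2-0ubuntu1.dsc
```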
[15:22] <statik> hows everyone doing? questions?
[15:23] <rtgz> aha! pbuilder-dist lucid create... hm, did not see it here :(, though
[15:23] <facundobatista> statik, downloading stuff
[15:23] <statik> facundobatista, nessita: the downloading is normal, and it is cached so next time will be faster
[15:23] <nessita> statik: I'm still installing deps
[15:23] <statik> pbuilder runs a build system in a totally pristine chroot
[15:23] <rmcbride> all working well here
[15:23] <facundobatista> statik, ok, but hold on a couple of minutes :)
[15:23] <__lucio__> statik, when do we learn what all those files under debian are?
[15:24] <nessita> thisfred: where did you see that warning?
[15:24] <thisfred> nessita: when running bzr builddeb
[15:24] <rmcbride> nessita: that happens when doing the bzr bd
[15:24] <thisfred> without the -S
[15:24] <rtgz> aha --mirror should be set to local mirror (i have a 2Mb/s connection to local mirror and 40Kb/s to some remote one)
[15:24] <rmcbride> either way
[15:24] <statik> __lucio__: debian/rules is the build script. in this case, it's a very simple passthrough to use all the defaults. somewhere there is a picture of all the stuff that is run through there, but for python packages with a setup.py, you usually don't have to care
[15:24] <statik> debian/changelog we already covered
[15:25] <statik> debian/control is where all the dependencies and the package description are listed. you normally don't have to edit the debian/control file very often
[15:25] <statik> debian/copyright is self-explanatory
[15:25] <statik> I already covered debian/watch
[15:26] <statik> debian/pycompat is useless, i have deleted it in the next version of the package
[15:26] <statik> debian/compat specifies the level of compatibility with debian packaging tools
[15:26] <statik> you should not change it unless you know what you are doing
[15:26] <__lucio__> statik, so, for every package i do, i just put a 6 there?
[15:26] <statik> i'll do a followup class to this one where we package something brand new
[15:27] <statik> __lucio__, for a new package, put a 7
[15:27] <statik> (i think)
[15:27] <__lucio__> grrr
[15:28] <statik> __lucio__, we covered a lot of new stuff here so i didn't go over creating a brand new package. there is an easy helper tool for that, and I think it won't be too bad when we work on that next time
[15:28] <rtgz> Erm... E: pbuilder-satisfydepends failed. Is there anything else required to set up pbuilder?
[15:28] <statik> rtgz, is it still running? pbuilder-lucid create should have been enough
[15:28] <__lucio__> statik, sure, im not really mad :)
[15:28] <rmcbride> rtgz: there should be another error indicating what it could not install
[15:28] <rtgz> statik,  pbuilder-satisfydepends-dummy depends on python (>= 2.5); however:
[15:28] <rtgz>   Package python is not installed.
[15:28] <statik> if you cheated and built the binary package directly, you should have a .deb in your parent dir
[15:29] <facundobatista> statik, so, my pbuilder-karmic build finished...
[15:29] <statik> rtgz, did it then continue on and install python into the pbuilder?
[15:29] <rtgz> statik, i have been cheating all the time, now I wanted to try to do it "properly"
[15:29] <__lucio__> statik, QUESTION: so, if i build binary extensions with setup.py, will that also work by magic? (i dont even know if that can be done with setup.py)
[15:29] <statik> __lucio__, yes it will work
[15:30] <statik> __lucio__, for binary extensions you need to specify that the package has some arch-specific components that need to be compiled for each platform
[15:30] <__lucio__> statik, so i never have to worry about where stuff ends up? what if i want to put stuff in /srv? is this a topic for the next class?
[15:30] <statik> __lucio__, this is in debian/control, look at the Architecture: field
[15:30] <__lucio__> all
[15:30] <rtgz> statik, nope... i have to remind that i am on karmic machine, creating package for lucid: http://paste.ubuntu.com/369574/
[15:31] <statik> __lucio__, for putting stuff in /srv we have to break through the nice setup.py abstractions and go right into the guts of the packaging tools, so I'll save the ugly hacks for later :)
[15:32] <thisfred> man, this is really exciting, I've gotten it wrong so many times, without the help of these tools. It looks very doable now
[15:32] <statik> if your pbuilder finished, you should have a deb file in ~/pbuilder/lucid_result/
[15:32] <statik> so, install it and try it out! sudo dpkg -i ~/pbuilder/lucid_result/python-django-openid-auth_0.2-0ubuntu1_all.deb
[15:33] <statik> __lucio__, about where files get installed: this package is automatically being installed for both python2.5 and python2.6
[15:33] <statik> and byte-compiled for each
[15:33] <rodrigo__> statik, is there a way to get pbuilder to use a different dir than ~/pbuilder?
[15:33] <statik> you can see this in action: dpkg -L python-django-openid-auth shows some python files installed
[15:33] <rtgz> This  can  be changed by setting the $PBUILDFOLDER global variable
[15:33] <facundobatista> statik, it told me I don't have python-django installed... installing
[15:34] <rtgz> rodrigo__, ^
[15:34] <rodrigo__> rtgz, ah, cool
[15:34] <statik> but python2.6 -c "import django_openid_auth;print django_openid_auth.__file__" will show a different location of files
[15:34] <statik> this is magically handled by the python build tools
[15:35] <statik> rodrigo__, there are TONs of customizations available with pbuilder, way more than i know about
[15:35] <rodrigo__> ok, good to know, I didn't like it much because it filled my $HOME
[15:35] <thisfred> installed!
[15:36] <statik> you can register hook scripts to inspect inside the build system at critical points, make it use fancy volume snapshots to run much faster, etc.
[15:36] <rtgz> hm... I wonder whether it is because i don't have deb lines for lucid, only deb-src...
[15:36] <facundobatista> rodrigo__, yes, it wasn't very polite!
[15:36] <__lucio__> statik, suppose we wanted to change something in the code, would we just patch it there, commit and repeat the builddeb step?
[15:36] <statik> __lucio__, mostly. you would look and see what patch system is being used already
[15:37] <statik> the what-patch command can do this
[15:37] <__lucio__> patchless
[15:37] <statik> you don't patch the source directly, instead you store a series of patches in debian/patches/
[15:37] <statik> so for this package, it doesn't have any patch system already
[15:38] <__lucio__> what patch system options does it support?
[15:38] <statik> so I would use the new standard quilt patch system, by converting it to a 3.0 format sourcepackage
[15:38] <statik> mkdir debian/source ; echo '3.0 (quilt)' > debian/source/format ; dch 'Switch to dpkg-source 3.0 (quilt) format'
[15:38] <statik> bzr add debian/source
[15:38] <statik> and, some instruction on how to create and edit patches using quilt: http://pkg-perl.alioth.debian.org/howto/quilt.html
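Once the package is in 3.0 (quilt) format, the patch cycle from that page looks roughly like this (the patch and file names are made up for illustration):

```shell
export QUILT_PATCHES=debian/patches    # 3.0 (quilt) keeps patches here

quilt new 01-example-fix.patch         # start a patch at the top of the series
quilt add django_openid_auth/views.py  # register the file before touching it
# ... edit the file ...
quilt refresh                          # write the diff into debian/patches/
quilt pop -a                           # unapply the series when finished
```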
[15:38] <statik> this is really great to have a standard finally (launchpad just got support for this last month)
[15:39] <statik> quilt works well with git,  bzr, svn, everything
[15:39] <statik> and we're using the same standard as debian
[15:39] <joshuahoover> rmcbride, rtgz: are one of you testing bug #457147? i can do it but only have one laptop available to test on at the moment so i'd likely wait until this weekend to test
[15:39] <statik> 99% of the packages in the archive have not yet been converted to the new format, but you'll see it happening over the next year
[15:39] <rmcbride> joshuahoover: I can test it today, but not until after this class is over
[15:39] <statik> the old patchsystems were simple-patchsys, dpatch, and probably others
[15:40] <joshuahoover> rmcbride: k, thanks...sorry to interrupt the class statik and company :)
[15:40] <rtgz> joshuahoover, just got second laptop upgraded to lucid, need to wait until the interesting part here is over :)
[15:40] <statik> __lucio__, you can see an example of a patch that I did recently in the python-django package, if you do bzr get lp:ubuntu/python-django, and take a look at debian/patches/07*
[15:41] <statik> during package build, those patches are applied on top of the unpacked orig.tar.gz
[15:41] <__lucio__> statik, so, i branch the source tree, edit, get a quilt patch from there and somehow add it to the debian/source dir, right?
[15:41] <statik> yep
[15:41] <__lucio__> ok, not impossible :)
[15:41] <statik> when we were upgrading this package to a new upstream release, one of the things to do is look and see if there are any existing patches that have now been included in the new release
[15:42] <statik> so a common task is reviewing the list of patches, and seeing what can be dropped, and if anything that was forwarded upstream has been rejected in favor of a different solution
[15:42] <statik> we try to foster good relationships with upstream, and always forward patches
[15:43] <statik> so it's common to see an ubuntu developer refusing to sponsor an upload until you can point to where the patch has been sent to upstream (and preferably acknowledged and committed)
[15:43] <statik> we don't let upstreams hold us hostage though, the bottom line is we fix stuff for our ubuntu users whether upstream helps us or not
[15:44] <statik> so now that you've test installed the package locally and are happy that it builds and works ok, you can publish to your PPA
[15:44] <statik> I recommend changing the version number before uploading to the PPA
[15:44] <statik> my version number looks like this for the ppa: python-django-openid-auth (0.2-0ubuntu1~lucid1) lucid; urgency=low
[15:45] <statik> (thats in debian/changelog)
[15:45] <nessita> statik: what's the best way to do that? editing by hand?
[15:45] <statik> nessita, yes
[15:45] <statik> I always use ~<series>N
[15:45] <statik> this is because if you are working with 0.2-0ubuntu1
[15:45] <statik> and you want to put it in your ppa for hardy, jaunty, karmic, lucid
[15:45] <facundobatista> statik, we just modify the line there, or create a new "paragraph"?
[15:45] <__lucio__> statik, how do i apply the list of patches to my branch of the source tree? in the same way that the tools would do it, so i can run tests against it and stuff
[15:46] <statik> facundobatista, just modify the line
[15:46] <statik> __lucio__, quilt push I think
[15:46] <statik> that page about quilt has all the details. i'm still learning quilt
[15:46] <__lucio__> ok, so its just learning to use quilt, ok
[15:46] <statik> yep, quilt is now built into the tools themselves
[15:46] <nessita> statik: which line? older or newer?
[15:47] <statik> nessita: 0.2-0ubuntu1~karmic1 will be seen as a newer version than 0.2-0ubuntu1~hardy1
[15:47] <thisfred> and then debuild -S again and dput?
[15:47] <__lucio__> so, after i pushed to my ppa, suppose i want to release, should i remove the ~.* ?
[15:47] <statik> thisfred, exactly
[15:48] <statik> nessita, so using ~seriesN means that dist-upgrades between versions works ok
[15:48] <statik> __lucio__, no you should always have the ~ in your PPA versions
[15:48] <statik> you would only remove that if preparing an upload for ubuntu itself
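The reason the ~ scheme works: dpkg sorts a tilde lower than anything else, including the end of the string, so every ~seriesN build counts as older than the plain archive version. The ordering can be checked directly (assuming dpkg is available):

```shell
# a PPA build with ~ always loses to the eventual archive upload
dpkg --compare-versions '0.2-0ubuntu1~lucid1' lt '0.2-0ubuntu1' && echo 'archive version wins'

# and the series names sort so dist-upgrades pick up the newer build
dpkg --compare-versions '0.2-0ubuntu1~hardy1' lt '0.2-0ubuntu1~karmic1' && echo 'karmic replaces hardy'
```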
[15:48] <facundobatista> thisfred, dput?
[15:49] <__lucio__> statik, so i end up with changes to no ~ when i upload, ~ back when i want to use my ppa, and so on and on?
[15:49] <thisfred> facundobatista: that uploads to your PPA, I'm sure statik's getting to it
[15:49] <rtgz> dpkg: dependency problems prevent configuration of pbuilder-satisfydepends-dummy
[15:49] <rtgz> still
[15:50] <nessita> statik: changelog updated
[15:50] <statik> once you have a sourcepackage that you are ready to upload to your ppa, do: 'dput ppa:username/ubuntu <packagename>.changes'
[15:50] <statik> hum
[15:50]  * statik doublechecks that command
[15:51] <statik> yep, i think thats correct
[15:51] <statik> launchpad will use the GPG signature on the sourcepackage to find your launchpad account, and decide if you have permissions to upload to that particular PPA
[15:52] <statik> you should get an email when launchpad has accepted the package, and other emails if the build fails
[15:52] <statik> questions?
[15:52] <rmcbride> statik: I have my ppa configured in locations.conf and use 'dput -f rmcbride-ppa <package>.changes'
[15:53] <rodrigo__> rmcbride: -f?
[15:53]  * rodrigo__ looks what -f does
[15:53] <rmcbride> not sure if the -f is needed.
[15:53] <nessita> statik: hum
[15:53] <statik> rmcbride: that works fine also. there are a bunch of default configs in /etc/dput.cf that make the ppa:foo stuff work
[15:53] <alecu> statik: <packagename>.changes ?
[15:53] <thisfred> I just dput without specifiying a ppa. I trust I have no rights to actually fnork ubuntu ;)
[15:53] <__lucio__> statik, <package>_source.changes?
[15:53] <nessita> statik: I ran dput ppa:nataliabidart/ubuntu-python-django-openid-auth.changes, and I've got Can't open ppa:nataliabidart/ubuntu-python-django-openid-auth.changes
[15:53] <statik> __lucio__, dput ppa:statik/ubuntu ../python-django-openid-auth_0.2-0ubuntu1_source.changes
[15:53] <rodrigo__> rmcbride, ah --force, I guess that's ok when you want to upload an existing package
[15:54] <rmcbride> rodrigo__: it totally is probably not necessary, the -f.
[15:54] <nessita> statik: ah, the blank character was *intended*
[15:54] <statik> rodrigo__, --force won't let you upload a package on top of something in the PPA, but if I'm uploading the same package to several different PPAs the --force tells dput: 'shut up and let me upload this again, i know what i'm doing'
[15:54] <rodrigo__> rmcbride, I've run into dput refusing to upload a package with an existing version on my ppa several times, so good to know you can force it :-)
[15:54] <alecu> ok, it seems to work :-)
[15:54] <facundobatista> it worked
[15:55] <rodrigo__> statik, ah, it doesn't work on the same ppa?
[15:55] <statik> \o/
[15:55] <statik> rodrigo__, nope. you can never re-use a version number in a ppa
[15:55] <rodrigo__> ah, ok
[15:55] <facundobatista> statik, to which PPA was this uploaded?
[15:55] <rmcbride> that's why getting the version string right the first time is important
[15:55] <__lucio__> successfully uploaded
[15:55] <statik> so, you increment the last number: ~lucid1, ~lucid2, etc.
[15:55] <rmcbride> or one reason anyhow
[15:56] <statik> facundobatista, i hope you uploaded it to yours :)
[15:56] <statik> once the package is in your ppa, you can ask your colleagues and community to test it out
[15:56] <facundobatista> statik, I can have only one PPA in LP? or several?
[15:56] <statik> facundobatista, you can have several
[15:57] <thisfred> as many as you like, limited by disk space only
[15:57] <statik> facundobatista, for a project i recommend having a developer team for the project and setting up a ppa for that team
[15:57] <nessita> statik: how can I check I uploaded to *my* ppa? other than looking at the command history :-)
[15:57] <facundobatista> statik, I created only one PPA, test-learning-ppa, but I don't see the change in it
[15:57] <__lucio__> statik, i just did lucio.torre/ubuntu instead of the name of my ppa, where did i upload that?
[15:57] <rmcbride> nessita: your PPA is linked from your LP homepage
[15:58] <facundobatista> __lucio__, ah, "ubuntu" was the name of the PPA there?
[15:58]  * statik looks at lucio and facundo ppa pages
[15:58] <nessita> rmcbride: yes, but I don't have what I've just uploaded :-)
[15:58] <__lucio__> my ppa page says: You can upload packages to this PPA using:
[15:58] <__lucio__> dput ppa:lucio.torre/test-ppa <source.changes>
[15:58] <statik> ah perfect
[15:58] <statik> i can never remember how to format that
[15:58] <statik> I have some special config in my ~/.dput.cf
[15:59] <statik> [my-ppa]
[15:59] <statik> #fqdn = upload.dogfood.launchpad.net
[15:59] <statik> fqdn = ppa.launchpad.net
[15:59] <statik> method = ftp
[15:59] <statik> incoming = ~statik/ppa/ubuntu/
[15:59] <statik> login = anonymous
[15:59] <statik> allow_unsigned_uploads = 0
[15:59] <facundobatista> statik, ok, but I put "ubuntu", that is a PPA that I do not have, "dput" tells me that "Successfully uploaded packages.", and I don't know to where they were uploaded...
[15:59] <statik> so I always upload using 'dput my-ppa <source.changes>'
[15:59] <rmcbride> nessita: you're right it doesnt appear to be there
[15:59] <__lucio__> where would i find this thing i uploaded? i changed the line to lucio.torre/test-ppa and it says:
[15:59] <nessita> statik: so, I uploaded to a wrong PPA before, and now, when trying to use the correct one (ppa:nataliabidart/packaging-class) I've got Already uploaded to ppa on ppa.launchpad.net
[15:59] <__lucio__> Already uploaded to ppa on ppa.launchpad.net
[16:00] <statik> facundobatista, what is the exact command you typed? It probably uploaded to the main ubuntu archive, and will get rejected
[16:00] <nessita> heh
[16:00] <verterok> facundobatista, nessita, __lucio__: you will get an email soon
[16:00] <__lucio__> but i cant find "changes" on the ppa page
[16:00] <rmcbride> nessita: what dput line did you use?
[16:00] <statik> nessita, use the --force option
[16:00] <rmcbride> heh "use the --force"
[16:00] <nessita> rmcbride: before I used dput ppa:nataliabidart/ubuntu ../python-django-openid-auth_0.2-0ubuntu1~karmic1_source.changes
[16:00] <nessita> but ppa:nataliabidart/ubuntu is not a PPA of mine :-D
[16:00] <alecu> facundobatista, I got an email saying my ppa did not exist.
[16:00] <nessita> --force works
[16:01] <statik> alecu, enabling a ppa has to be done manually, because you have to agree to some terms of service or code of conduct or something
[16:01] <alecu> statik, sure, I was wondering about that.
[16:01] <statik> there is a bunch of info here: https://help.launchpad.net/Packaging/PPA
[16:02] <statik> one last thing
[16:02] <statik> when I was doing this work for real last night, my end result was a merge proposal into ubuntu
[16:02] <mandel> thisfred, ping
[16:02] <thisfred> mandel: pong
[16:03] <facundobatista> alecu, an email from who?
[16:03] <statik> so as a last step I ran debcommit, pushed my branch to launchpad, and then proposed a merge. you can see my merge proposal here: https://code.edge.launchpad.net/~statik/ubuntu/lucid/python-django-openid-auth/new-upstream-version/
[16:03] <statik> and thats everything I prepared!
[16:03] <mandel> thisfred, question for you, why is it not the new rev number returned when we do put_record in desktopcouch???
[16:03] <thisfred> statik: awesome!
[16:03] <statik> i hope this was useful, and I will answer questions as long as you want
[16:03] <alecu> facundobatista, from "launchpad ppa"
[16:03] <rmcbride> Thanks statik!
[16:03] <thisfred> statik: thanks for this, I think I'm a step closer to the training wheels coming  off!
[16:04] <rmcbride> or the wheels anyhow ;)
[16:04] <thisfred> mandel: let me have a look
[16:04] <__lucio__> statik, what does debcommit do? why not just bzr commit?
[16:04] <nessita> statik: this was very interesting and fun, thank you!!!
[16:04] <statik> welcome :)
[16:04] <facundobatista> alecu, I don't have a rejection email :|
[16:05] <statik> __lucio__, I think debcommit pulls all the content of the commit message out of the changelog, and sets some extra metadata like parsing bug numbers and tying them to the branch
[16:05] <alecu> facundobatista, I've fwd you mine, so you can see how they look :-)
[16:05] <facundobatista> statik, now I try to upload it again to *my* ppa, and...
[16:05] <statik> debcommit is a standard tool to use whether you are packaging with svn, git, or bzr. it has a bunch of hooks to do the right thing
[16:05] <facundobatista> $ dput ppa:facundo/test-learning-ppa ../python-django-openid-auth_0.2-0ubuntu1_source.changes
[16:05] <facundobatista> Already uploaded to ppa on ppa.launchpad.net
[16:05] <statik> facundobatista, --force
[16:05] <mandel> thisfred, it's stupid to get the record again and not return the rev... I mean I cannot do something like put_record(record) twice with no conflict... kinda lame
[16:06] <statik> all dput is doing is noticing this file: python-django-openid-auth_0.2-0ubuntu1_source.ppa.upload
[16:06] <statik> you can delete that file, or use the --force option to ignore it
[16:06] <mandel> thisfred, it would be great to get id and rev, we already have it... and another look to the db would eb a waste
[16:07] <thisfred> mandel: I would ask CardinalFang when he's around, but I tend to agree. Actually I think put should return the whole record
[16:07] <facundobatista> statik, it tells me that everything ok, but if I go to my PPA's page, I don't see anything: https://edge.launchpad.net/~facundo/+archive/test-learning-ppa
[16:07] <thisfred> if anything
[16:07] <verterok> facundobatista: give it some time :)
[16:07] <facundobatista> verterok, oh, ok
[16:07] <statik> facundobatista, there is a delay of a few minutes, and then depending on how much traffic backlog there is, it can take a while for the build to complete
[16:08] <facundobatista> perfect
[16:08] <thisfred> mandel: I don't like the way python-couchdb solves this: it manipulates the argument to the put. I don't want us to do that
[16:08] <statik> there is a farm of build daemons running the PPAs for the entire ubuntu and all developers and PPA users
[16:08] <nessita> statik: how can I remove a ppa of my own?
[16:08] <statik> nessita: you want to delete the whole ppa? or just a package out of the ppa?
[16:08] <nessita> statik: a whole PPA
[16:08] <mandel> thisfred, I was going to mention that as an other option
[16:09] <nessita> statik: I created two, one was by mistake
[16:09] <facundobatista> statik, thanks for all this... where can I learn how to do a similar process to this, but for a project that never had a package created?
[16:09] <__lucio__> mmh.. looks like i never signed the ubuntu code of conduct.. is that bad?
[16:09] <statik> nessita, i'm not sure. if you don't see an option on your launchpad page to delete it, then go to launchpad.net/launchpad and file a 'question' asking for it to be removed, and the launchpad admins will take care of it
[16:09] <mandel> thisfred, but certainly not returning the _rev is a pain, I was going to show some examples during the weekend and I know it's going to be mentioned :(
[16:10] <statik> facundobatista, I'll do another class covering making a totally new package for a python module
[16:10] <nessita> statik: thank you
[16:10] <facundobatista> statik, great! thanks
[16:10] <statik> facundobatista, how about next friday? i was thinking to use python-whisper as an example, it's a package i'm working on right now for lucio
[16:10] <__lucio__> facundobatista, yes, and also, statik will send an email with more than 5 hours notice so we can prepare our environments. right? :)
[16:10] <statik> your environments are already prepared :)
[16:10] <__lucio__> statik, next friday sounds great
[16:11] <facundobatista> statik, +1 to next friday
[16:11] <thisfred> mandel: if you want to propose a merge that returns the whole record, I will approve it. If you don't have time, I may get to it, but maybe not before FOSDEM
[16:12] <statik> i hope today introduced enough of the tools that next week when looking at a new package we'll be able to focus more on the files we are writing in debian/, rather than having to learn a bunch of new tools for the first time
[16:12] <thisfred> mandel: in either case filing a bug would be greatly appreciated
[16:12] <alecu> statik, thanks a lot. It still feels like debian packaging is a very complicated bureaucracy. :-)
[16:12] <rodrigo__> mandel, fosdem is tomorrow, right?
[16:12] <facundobatista> statik, please, tell us in advance what we need to do in *the project* to attend the class
[16:12] <mandel> thisfred, I'll do the patch and file the bug
[16:13] <mandel> rodrigo__, yes, are u here??
[16:13] <statik> alecu: it is. this is good because it enforces quality, it is bad because many people find it frustrating to learn so many details. There is a project getting started called cambria which wants to make packaging easier for upstream developers or casual contributors: https://launchpad.net/cambria. also, many of the core tools are slowly getting better, what I have shown you today is LOADs better than how it was 6 months ago
[16:13] <thisfred> mandel: awesome! ping me anytime, and I'll do a review, and blackmail/beer someone else into doing the second one
[16:13] <mandel> anyone going to FOSDEM late me know and we will go for drinks :D
[16:13] <mandel> thisfred, superb, on it right now
[16:13] <thisfred> I wish :) Belgian beer, hmmm
[16:13] <rodrigo__> mandel, I'm here, in my house, yes :-)
[16:14] <statik> facundobatista, you don't need to do anything in the project at all. upstream has already released a tarball on pypi, and it has a simple setup.py already. we'll cover turning it into a debian package
[16:14] <facundobatista> statik, ok
[16:14] <alecu> statik, yes, I know it was worse before :-)
[16:14] <mandel> rodrigo__ next time, you're invited to my house :P
[16:14] <rodrigo__> mandel, yes, let's see about next year :-)
[16:14] <statik> facundobatista, usually when i am packaging something the first thing i have to tell to upstream is "dude! please take 5 minutes and do a release tarball"
[16:15] <thisfred> Actually I can buy quite a number of belgians here. Including to my surprise my favorite, Poperingse Hommel!
[16:15] <statik> it's amazing how many people write great software but don't bother to cut a release
[16:15] <facundobatista> statik, do you know a tutorial for "your first release tarball"?
[16:15] <mandel> thisfred, is that in nl?
[16:15] <thisfred> it's like writing documentation: it doesn't scratch your own itches
[16:16] <thisfred> mandel: no in Baltimore :)
[16:16] <statik> facundobatista: i will have to ask someone with commit rights to the python project who really should know how distutils works for that class :)
[16:16] <thisfred> mandel: in NL I could get everything
[16:16] <rodrigo__> thisfred, judas (is it Belgian, right?) was my favorite, although a bit strong :-)
[16:16] <mandel> thisfred, hehe I was not expecting that answer
[16:16] <statik> facundobatista, it's basically just setup.py, then run setup.py sdist
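A first release tarball for a pure-Python project really is just that; a minimal sketch (the module name `mymodule` is made up, and modern projects would add more metadata):

```shell
# a minimal setup.py is all a release tarball needs
cat > setup.py <<'EOF'
from setuptools import setup

setup(name='mymodule', version='0.1', py_modules=['mymodule'])
EOF
touch mymodule.py

# build the tarball: it lands in dist/mymodule-0.1.tar.gz
python3 setup.py sdist
```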
[16:16] <mandel> rodrigo__, yes, judas is belgian
[16:16] <thisfred> rodrigo__:  I think that's Belgian yes, I don't think I've had it
[16:17] <rodrigo__> it's a bit too strong, but tastes very nice
[16:17] <thisfred> rodrigo__: sounds like it's one of those devil beers, like Duvel and Satan
[16:17] <__lucio__> statik, nessita asked about who writes the summary  :)
[16:17] <statik> sure, i don't mind if nessita writes a summary
[16:17] <rodrigo__> thisfred, yeah, at least the names are similar, yes :-)
[16:17] <statik> we also have IRC logs captured on this channel i think
[16:17] <thisfred> rodrigo__: there's a local brewery which has a beer inspired by those called Ozzy :)
[16:17] <rodrigo__> :)
[16:18] <facundobatista> statik, ok :)
[16:18] <nessita> __lucio__: -.-
[16:18] <__lucio__> :D
[16:18] <statik> any other questions about what we did today?
[16:18] <nessita> __lucio__: you'll be doing my reviews? :-D
[16:18] <__lucio__> nessita, dont be lazy
[16:18] <thisfred> A lot of the american beer is *very* good actually. Just not the big brand lagers, mostly
[16:18] <mandel> thisfred, returning the record will take me longer than I thought, it breaks 15 tests...
[16:19] <thisfred> ah..
[16:19] <statik> __lucio__, nessita: more seriously; i didn't plan to write a summary because it's a lot of work to turn an interactive session into something that is generic and complete enough to be useful
[16:19] <facundobatista> thisfred, a lot of american beer is good, actually, most of them are not from US
[16:19] <statik> we assumed a lot of knowledge, and answered very specific questions
[16:19] <thisfred> facundobatista: sorry, yes I meant US
[16:19] <facundobatista> thisfred, :)
[16:19] <nessita> statik, __lucio__: but we can select parts of this chat with examples and publish them on the public wiki
[16:20] <nessita> so anyone can follow those
[16:20] <thisfred> facundobatista: I have not sampled the rest of america's beer yet, but I'm sure I'll get to it ;)
[16:20] <nessita> and set aside the questions, with their answers
[16:20] <statik> sure
[16:20] <facundobatista> thisfred, it's a dirty job, but somebody needs to do it
[16:20] <thisfred> hehe
[16:20] <nessita> ok, I'll do it :-)
[16:21] <__lucio__> habemus package in ppa
[16:21] <__lucio__> AWESOME
[16:21] <thisfred> mandel: but the tests expect an id, so it should be as easy as substituting result with result['_id'] or something right?
[16:22] <mandel> thisfred, yes, I'll do result.record_id which is nicer
[16:22] <thisfred> mandel: agreed
[16:23] <rodrigo__> thisfred, in Spain  the best beer, IMO, Mahou, is just sold in a few places, while the worst (San Miguel) is sold all over the world, so I guess the big brand lagers are just like MS, very good marketing :-)
[16:24] <rtgz> it turned out that my system does not suspend/hibernate due to my workaround made @ 2009-06-28 to rmmod ath_pci module w/o checking that it is loaded :)
[16:24] <joshuahoover> rtgz: heh
[16:25] <mandel> thisfred, good thing that Chad used record_id as the name of the id everywhere, so much easier to fix ;)
[16:25] <rtgz> so, syncdaemon reconnects perfectly
[16:25] <joshuahoover> rtgz: i'm trying to get a hold of dobey (probably traveling or too early yet on the west coast) but maybe you can give some insight...i'm confused by bug #492100 which appears to fix something that breaks when the patch for bug #491777 is applied, but 491777 is not targeted for an sru...should it be? if not, should 492100 go in the sru?
[16:26] <rtgz> joshuahoover, hm... i can try to suspend while file is syncing... hmmm.
[16:30] <rodrigo__> statik, btw, did you upload couchdb-glib/evo-couchdb for lucid?
[16:37] <rtgz> yes, reproduced the bug with clicking on applet and apport will collect the error
[16:38] <rtgz> when applet tries to receive the info from syncdaemon about current transfers and syncdaemon is not ready to answer, the timeout exception is raised and apport starts to collect info
[16:39] <rtgz> no strange errors in syncdaemon.log and oauth-login.log
[16:41]  * rtgz had to reboot his laptop because second suspend left him w/o keyboard support for more than 1 second of operation in all x terminal emulators...
[16:44] <rtgz> hm, but the file does not get uploaded after the resume... the state is STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ
[16:47] <rtgz_> joshuahoover, 2010-02-05 18:45:42,044 - ubuntuone.SyncDaemon.Main - NOTE - ---- MARK (state: STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ; queues: metadata: 1; content: 1; hash: 0, fsm-cache: hit=3911 miss=375) ----
[16:47] <rtgz_> joshuahoover, what does this state mean?
[16:47] <mandel> thisfred, done
[16:48] <thisfred> mandel: awesome!
[16:48] <mandel> thisfred, but report https://bugs.launchpad.net/desktopcouch/+bug/517676
[16:48] <rtgz_> joshuahoover, i was testing bug #457147
[16:48] <joshuahoover> rtgz_: not sure, maybe verterok, nessita, or facundobatista can help us? ^^
[16:48]  * verterok looks
[16:48] <rtgz_> joshuahoover, it does not crash anymore
[16:48]  * thisfred reviews https://launchpad.net/~mandel/desktopcouch/fix_bug_517676/+merge/18708
[16:48] <verterok> jamesh: wasup?
[16:48] <joshuahoover> verterok: thank you
[16:49] <verterok> ups
[16:49] <verterok> joshuahoover: whats up?
[16:49] <rtgz_> verterok, STR:
[16:49] <joshuahoover> verterok: rtgz_ is testing bug #457147
[16:49] <pygi> statik: poke
[16:49] <verterok> joshuahoover, rtgz_: hmm, states
[16:50] <verterok> joshuahoover, rtgz_: facundobatista and nessita are going to work on states, should ask them about it ;)
[16:50] <joshuahoover> verterok: heh, fair enough :)
[16:50] <verterok> facundobatista, nessita: ^ states issue
[16:51] <rtgz_> facundobatista, nessita, verterok, 1. start ubuntuone, verify that it is working and ready for file uploads; 2. put a file in the directory, wait until it is picked up for upload; 3. verify that the upload has started via u1sdtool --current-transfers; 4. before it completes, suspend the system; 5. drink some tea; 6. unsuspend the system, wait until nm connects. Observe 0 downloads/uploads and the client stuck in the STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ state
[16:52] <nessita> rtgz_: we're aware of this issue, but what system are you running?
[16:52] <rtgz_> nessita, karmic host, karmic-proposed version of ubuntuone
[16:54] <rtgz_> and that's it, new files don't get uploaded, the client is just WAITING for something...
[16:54] <joshuahoover> rtgz_, rmcbride: fyi...bug #457564 has steps to reproduce/test and it's not passing :( the "never" display icon pref works fine w/ the fix but then we switch back to "always" display and it doesn't come back until the client is restarted...adding comment to note this
[16:55] <joshuahoover> nessita: rtgz_, rmcbride and i are trying to verify sru fixes (proposed updates)...just to give you some context about why you're getting these questions :)
[16:55] <nessita> joshuahoover: thank you, I'm a bit lost indeed
[16:55] <rtgz_> joshuahoover, hm... it does work for me, the icon hides and shows itself according to the preference set
[16:56] <nessita> joshuahoover: so, could you please start from the beginning? :-)
[16:56] <joshuahoover> rtgz_: care to take nessita through the beginning? the test you're running, the results you're getting, and what you expect to get
[16:56] <rtgz_> nessita, where should i mark the beginning as?
[16:57] <nessita> rtgz_: to the first thing, of course :-)
[16:57] <nessita> where do we come from?
[16:58] <nessita> rtgz_: I understand you're doing some QA on SRU fixes, is that so?
[16:59] <rtgz_> nessita, ok, so. I was testing bug 457147. Since it did not want to fail, I decided to stress it a little further. I connected to Ubuntu One, then put a 4Mb file in my directory. When the file started to upload I suspended the system and unsuspended it after a minute. When the system returned from suspend and nm restarted the network connection, syncdaemon became stuck in the STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ state. It does not upload the old file, and new files do not get uploaded either
[17:00] <rtgz_> joshuahoover, the client no longer crashes after suspend, so the bug looks fixed.
[17:01] <nessita> rtgz_: ok, that problem is a bit different than the original error reported
[17:01] <joshuahoover> rtgz_: ok, so we need a new bug (or find an existing one) for the fact that it doesn't continue to upload after resume but can pass 457147?
[17:01] <rtgz_> nessita, yes, it is just I found another problem while testing the original issue
[17:02] <nessita> rtgz_: I'd say that we need a new bug report for that, and we will be working on the fix right after UDF. This issue is a consequence of a very complicated state machine that we're gonna re-do starting next week
[17:02] <rtgz_> nessita, ok, i will file a bug
[17:02] <nessita> rtgz_: thank you. Please assign it to me, as confirmed. Please add those steps you described
[17:02] <dobey> hmm
[17:03] <thisfred> mandel: 1st review done, and second one promised after lunch
[17:03] <nessita> rtgz_: we looove those details - put in the logs and all :-)
[17:03] <mandel> thisfred, superb
[17:03] <rtgz_> nessita, but what exactly STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ means? :)
[17:03] <joshuahoover> rtgz_: it's a secret ;)
[17:04] <mandel> thisfred, got another complaint... I'd love to be able to do put_record(record), remove_record(record.record_id) and later put_record() again
[17:04] <verterok> rtgz_: I "think" it's "I have network, have items in both queues (metadata and content), but can't connect and waiting for a retry"
[17:04] <mandel> thisfred, sounds stupid, but when apps use an id that is not autogenerated, that trace will raise a conflict exception since we do not delete records, we just flag them
[17:05] <nessita> rtgz_: the exact meaning is only known by Chipaca, but it includes that the syncdaemon has network, has metadata and content to upload, and that it is waiting for some connection cleanup to reconnect
[17:05] <nessita> heh, at least verterok and I said *almost* the same!
[17:06] <nessita> verterok: ;-)
[17:06] <verterok> nessita: :)
[17:06] <rtgz_> nessita, ok, so it needs something that will poke it :)
[17:06] <verterok> nessita, rtgz_: from the code: "wait for SYS_CONNECTION_LOST, then go on to connect"
[17:06] <verterok> so, it's waiting to get the connectionLost event from twisted
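verterok's reading of the code ("wait for SYS_CONNECTION_LOST, then go on to connect") boils down to a single transition; a toy sketch, with the event and state names taken from the chat and everything else invented:

```python
# Hypothetical sketch of the reconnect hand-off described above: the
# state machine sits in STANDOFF_WAITING until twisted delivers the
# connection-lost event, then moves on to connecting.  The real
# syncdaemon state machine is far more involved than this table.

TRANSITIONS = {
    ("STANDOFF_WAITING_WITH_NETWORK_WITH_BOTHQ", "SYS_CONNECTION_LOST"):
        "CONNECTING_WITH_NETWORK_WITH_BOTHQ",
}


def next_state(state, event):
    # Unknown (state, event) pairs leave the state unchanged -- which is
    # exactly how the daemon gets stuck if connectionLost never arrives.
    return TRANSITIONS.get((state, event), state)
```

If twisted never delivers connectionLost after a resume, the lookup falls through and the state never advances, matching the stuck STANDOFF state in rtgz_'s log.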
[17:07] <nessita> verterok: it will not receive it... poor thing
[17:07] <verterok> but Chipaca will know for sure what that means :)
[17:07] <rtgz_> ok, here's the log http://paste.ubuntu.com/369612/ :)
[17:08] <thisfred> mandel: hmm, yeah I hadn't thought about that
[17:08] <mandel> thisfred, this keeps bothering me a lot https://bugs.launchpad.net/desktopcouch/+bug/462245
[17:09] <mandel> thisfred, I reported it but didn't defend my position correctly hehe, but now that I think about non-autogenerated ids it makes sense
[17:10] <Chipaca> the exact meaning is known only to me?
[17:10] <Chipaca> oh, come on
[17:10] <Chipaca> rtgz_: nessita was right on the money
[17:11] <Chipaca> the "waiting for some connection cleanup" is more a "waiting for connections to finish dying"
[17:11] <nessita> Chipaca: the "STANDOFF" confuses me a lot
[17:11] <Chipaca> nessita: BACKOFF might've been a happier choice of words
[17:11] <thisfred> FRAKOFF
[17:13] <thisfred> mandel: I think that one you're going to have to fight out with chad ;)
[17:14] <thisfred> mandel: a solution could be to do r = delete_record(id), but I don't like that, as the fact that we only mark records as deleted is an implementation detail
[17:14] <mandel> thisfred, I will whenever I see him; it's more a philosophical argument than anything else
[17:15] <thisfred> mandel: perhaps we should rename deleted records: id += '-deleted'
[17:15] <thisfred> mandel: right
[17:15] <mandel> thisfred, the problem I see is that the implementation should hide that detail: if I remove a record I do not care what you do, but I should be able to use the same id since it does not "exist"
[17:16] <thisfred> mandel: yep
[17:16] <thisfred> I agree, but I don't see an easy solution
[17:16] <thisfred> I hope we can move away from the marked instead of deleted soon
[17:16] <joshuahoover> rtgz_, rmcbride: sigh...another one failed, this time bug #465030 (bandwidth throttling prefs related)
[17:16] <mandel> thisfred, I know it is for "back_up" purposes; to mark it as deleted we should just create a new revision, but then if the db is compacted you lose the data
[17:19] <rtgz_> joshuahoover, erm... it behaves weird here, download speed is set to 0, upload speed might not be saved, etc...
[17:19] <joshuahoover> rtgz_: yeah, i filed a bug about this separate from this one...let me find it
[17:22] <rtgz_> joshuahoover, hm... if the preference window is left for some time then the changes are saved...
[17:22] <joshuahoover> rtgz_: ummm...that's strange...that might explain the behavior i was seeing in the beta ppa...it was acting very funny
[17:22] <rtgz_> joshuahoover, no
[17:22] <rtgz_> ha
[17:22] <rtgz_> got it
[17:24] <rtgz_> the save is performed only when a field loses focus
[17:24] <joshuahoover> rtgz_: ah, good catch!
[17:25] <mandel> CardinalFang, ping
[17:25] <CardinalFang> mandel, hi
[17:26] <mandel> CardinalFang, hello! how are things, can I try and convince you of something??
[17:27] <joshuahoover> rtgz_: that would explain the strange behavior i was seeing then...couldn't figure out why it didn't appear to behave consistently
[17:27] <CardinalFang> mandel, It is early, but I can perhaps be persuaded to alter my cognitive state by a smidgen.  Please, proceed.
[17:29] <mandel> CardinalFang, let's say I have an app that uses ids given by the user... the user gives an id and we add the doc to the db; later he deletes it. After a while he forgets and decides to create a new doc with the same id... and gets a conflict error. Can we fix that? ;)
[17:29] <joshuahoover> rtgz_: if i take focus off each field then they save properly, just as you said :) i'll note this in a comment so that it helps get the problem fixed
[17:29] <mandel> CardinalFang, ah, I nearly forgot, take a look at this: https://bugs.launchpad.net/desktopcouch/+bug/517676
[17:31] <rtgz_> joshuahoover, so now the bug is "Bandwidth limit preference requires strange human behavior to be saved" :)
[17:31] <joshuahoover> rtgz_: i always take focus off my input fields when i want them to save...what are you saying about me? ;)
[17:32] <CardinalFang> mandel, Hrm.  What do you propose for the first?   try, send_record(new), except ConflictError: old = get_record(); new._rev = old._rev; send_record(new) ?
[17:32] <dobey> huh?
[17:32] <CardinalFang> mandel do that IFF the old record is deleted?
[17:32] <rtgz_> joshuahoover, i imagine that for, say, tomboy note... write, click other window and only then the note is saved :)
[17:32] <CardinalFang> so, except ConflictError, if deleted: get old and update new and send?
[17:32] <joshuahoover> rtgz_: heh, right
[17:33] <mandel> CardinalFang, yes, I was thinking about that
[17:33] <CardinalFang> aquarius, ^ ?
[17:34]  * aquarius reads
[17:34] <mandel> it's a possible scenario, since the fact that it is flagged is just an implementation detail
[17:34] <aquarius> the problem is our stupid delete thing.
[17:35] <aquarius> if we actually deleted "deleted" records, there wouldn't be anything to conflict with :(
[17:35] <aquarius> am starting to think...maybe we should actually delete things.
[17:35] <CardinalFang> Ah.  try, save, except conflict:  really delete; save again;
[17:36] <CardinalFang> Ah.  try, save, except conflict:  get old, and if deleted then really delete; save again;
[17:36] <CardinalFang> ...commit transaction.  ha
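CardinalFang's recipe (try to save; on conflict, fetch the old record; only if it is a deleted tombstone, adopt its _rev and save again) can be made runnable against an in-memory stand-in. `FakeDB`, `ConflictError`, and the `deleted` flag are all invented here; the real desktopcouch internals differ:

```python
class ConflictError(Exception):
    pass


class FakeDB:
    """In-memory stand-in for the CouchDB access desktopcouch wraps."""

    def __init__(self):
        self.docs = {}

    def get(self, doc_id):
        return self.docs.get(doc_id)

    def save(self, doc):
        old = self.docs.get(doc["_id"])
        # CouchDB-style optimistic concurrency: the incoming _rev must
        # match the stored one, otherwise the write conflicts.
        if old is not None and old.get("_rev") != doc.get("_rev"):
            raise ConflictError(doc["_id"])
        doc["_rev"] = (old["_rev"] if old else 0) + 1
        self.docs[doc["_id"]] = dict(doc)


def put_record(db, doc):
    """Save doc; if it conflicts with a deleted tombstone, adopt the
    tombstone's _rev and retry ("deleted" is a stand-in for however
    desktopcouch actually flags removed records)."""
    try:
        db.save(doc)
    except ConflictError:
        old = db.get(doc["_id"])
        if old.get("deleted"):
            doc["_rev"] = old["_rev"]  # reuse the tombstone's revision
            db.save(doc)
        else:
            raise  # genuine conflict with a live record
```

This keeps the flag-instead-of-delete behaviour an implementation detail, which is mandel's complaint: the client app reusing an id of a "deleted" record never sees the conflict.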
[17:37] <mandel> I really do not know why we keep the deleted ones... anyway, is it not more efficient to do: if deleted, update; else, put?
[17:38] <mandel> waiting for the conflict adds more requests and more chances to get it wrong if there is more than one app looking at the db; also, how will this be notified to other apps?
[17:41] <aquarius> mandel, originally we kept "deleted" records because there are sync problems if you don't; you can't tell the difference between "I used to have this and now it's gone" and "I've never had this", which matters in certain situations
[17:41] <aquarius> but...I'm not sure those situations apply
[17:41] <CardinalFang> I am trying to cope with aquarius' suggestion that we could actually delete.  I think we should give him some time to consider this.
[17:42] <mandel> aquarius, I think the best option right now is to perform the check and catch the conflict before it gets to the client app
[17:43] <rtgz_> verterok, does bug 487257 originate from the same problem I created when I resumed the system after suspend? http://paste.ubuntu.com/369612/ line 2376 clearly shows that twisted detected the disconnect.
[17:43]  * verterok looks
[17:44] <thisfred> aquarius: the mark as deleted is not because of replication, which AFAIK is able to handle deletions. It's just a hackish placeholder for versioning, that's never actually used
[17:44] <rtgz_> verterok, i don't want to create duplicate reports :)
[17:44] <thisfred> aquarius: I would be +100 on getting rid of the hack.
[17:44] <thisfred> let the people delete their data!
[17:45] <verterok> rtgz_: yeap, that's the issue
[17:45] <rtgz_> verterok, ok, will add my logs and STR there, thanks
[17:45] <verterok> rtgz_: that's the same bug triggered by a different condition :)
[17:45] <joshuahoover> rtgz_, rmcbride: bug #492100 passed
[17:45] <verterok> rtgz_: ok
[17:49] <mandel> CardinalFang, thisfred, I got another bug for you ;) I'll send the patch in a second: https://bugs.launchpad.net/desktopcouch/+bug/517706
[17:49] <mandel> CardinalFang, thisfred, let me know if I'm right or I just reported a stupid bug...
[17:49] <thisfred> mandel: that's not a bug
[17:49] <thisfred> I think
[17:50] <mandel> thisfred, so how does an app add annotations then??
[17:50] <thisfred> mandel: you should not use record['application_annotations']
[17:50] <thisfred> it's not part of the normal record fields
[17:50] <thisfred> you use record.application_annotations['my_app']
[17:50] <thisfred> which will be created for you if it isn't there
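A sketch of the auto-created annotation slot thisfred describes; this `Record` is a hypothetical stand-in for the desktopcouch one, showing only the `application_annotations` behaviour:

```python
from collections import defaultdict


class Record:
    """Hypothetical stand-in for a desktopcouch record: per-application
    annotations live outside the normal record fields, and an app's
    slot is created on first access rather than raising KeyError."""

    def __init__(self):
        self._data = {"application_annotations": defaultdict(dict)}

    @property
    def application_annotations(self):
        return self._data["application_annotations"]


rec = Record()
# No KeyError here: the "my_app" slot is created on demand.
rec.application_annotations["my_app"]["last_seen"] = "2010-02-05"
```

Going through `record['application_annotations']` instead (mandel's original attempt) bypasses this convenience, which is why the direct access only works via the `._data` escape hatch.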
[17:51] <mandel> thisfred, .... so I'm stupid, cool
[17:51] <thisfred> mandel: no: our documentation is lacking
[17:54] <mandel> thisfred, I don't think so... I was just trying to be smart and access it directly since I know it is there :P
[17:54] <thisfred> mandel: you can do that by going through ._data if you absolutely want to ;)
[17:54] <thisfred> I explained on the bug as well, so that it's googleable
[17:56] <mandel> thisfred, I know the _data trick... especially when I make my own MergeableLists
[17:56] <mandel> CardinalFang, any luck with the remove method for those ^ want me to do that?
[17:56] <rtgz_> joshuahoover, ok, i have performed verification for bug #457147, should I adjust tags?
[17:57] <aquarius> I have mused on the idea of actually deleting records rather than marking them deleted, and I can't think of any incredibly good reasons why to not do it, these days. I'd like to hear comments from all of thisfred, teknico, urbanape, and CardinalFang, though...
[17:58] <thisfred> aquarius: I think the marking as deleted was a big fat YAGNI
[17:58] <CardinalFang> mandel, you do that.  I'm on a getPort problem today.
[17:58] <CardinalFang> thisfred++
[17:59] <thisfred> aquarius: the *only* downside is that if people delete stuff, it gets deleted, and there's no getting it back. I think we can justify that
[17:59] <aquarius> thisfred, can you remember why we were insistent on not actually deleting? My memory has failed :(
[17:59] <thisfred> aquarius: although we have to be slightly cautious
[18:00] <thisfred> aquarius: phonesync removing all people's contacts, like mobileme did for statik :)
[18:00] <thisfred> aquarius: buggy apps happen, and if all apps talk to the same db, having an undo is nice
[18:00] <aquarius> thisfred, indeed. that wasn't the only reason we didn't delete things, though
[18:00] <thisfred> aquarius: but we don't actually, other than mucking about in futon
[18:00] <thisfred> aquarius: I think it was
[18:01] <aquarius> there was something around syncing and contacts that required keeping old ones around, I'm sure
[18:01] <aquarius> but I don't think it applies any more
[18:02] <thisfred> aquarius: I'm pretty sure _changes will solve such concerns if they were there
[18:03] <aquarius> thisfred, yeah, that was the conclusion I came to -- if you care about the difference between "never there" and "not there now", you should watch _changes
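The distinction aquarius draws can be read straight off CouchDB-style `_changes` rows; the helper below is hypothetical, though the row shape (`seq`/`id`/`deleted`) mirrors the real feed:

```python
def history_of(doc_id, changes):
    """Classify doc_id from a list of CouchDB-style _changes rows (each
    row has at least "seq" and "id"; deletions carry "deleted": True)."""
    seen = [row for row in changes if row["id"] == doc_id]
    if not seen:
        return "never there"
    return "deleted" if seen[-1].get("deleted") else "present"


changes = [
    {"seq": 1, "id": "contact-1"},                   # created
    {"seq": 2, "id": "contact-1", "deleted": True},  # really deleted
]
```

With this, really deleting records stops being a loss of information: "I used to have this and now it's gone" versus "I've never had this" is still answerable from the feed.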
[18:20] <mandel> need to go, beer is calling, have a nice weekend!
[18:21] <rmcbride> joshuahoover: rtgz_ bug #457147 passes
[18:22] <rmcbride> ah rtgz_ beat me to it
[18:22] <rtgz_> rmcbride, verified one bug, found another...
[18:22] <rmcbride> rtgz_: yea I see that. Can't duplicate the STANDOFF state
[18:22] <rmcbride> but I've seen it in the past
[18:23] <rtgz_> rmcbride, i have reproduced it twice and one more system lock-up, but i don't think that's related :)
[18:25] <chewit> hi, i am having problems syncing my files?
[18:25] <chewit> ubuntu one thinks it is up-to-date, but its far from it
[18:32] <rtgz_> chewit, hello. Could you please run this script (in the terminal) to see what files are still not done: http://launchpadlibrarian.net/36063440/u1sdstatus.py
[18:33] <chewit> one minute, i decided to start it again, so i will run that script in a sec
[18:33] <rtgz_> chewit, and it would be nice if you could post the contents of ~/.cache/ubuntuone/log/syncdaemon.log to http://paste.ubuntu.com for us to see what actually happens.
[18:33] <chewit> ok
[18:34] <chewit> there is quite a lot though
[18:35] <chewit> when is 1.0.3 of Ubuntu one coming out? that may solve my problems
[18:35] <rtgz_> chewit, additionally, you can check whether there is any download/upload going on by executing u1sdtool --current-transfers
[18:36] <chewit> brb
[18:36] <duanedesign> hello rtgz_ . HOw did the testing/documenting of ubuntuone-client and ubuntuone-storage-protocol in karmic-proposed go?
[18:37] <duanedesign> been busy last 24 hours so i am just now reading the scrollback :)
[18:38] <rtgz_> duanedesign, the battle was long and exhausting
[18:39] <duanedesign> ha ha. I noticed there had been a lot of activity when i finally was able to sit down at my computer a couple  hours ago
[18:40] <rtgz_> great
[18:40] <rtgz_> The applet says "Updating 6 of 5 files..."
[18:40] <rtgz_> it looks like the applet lives its own life
[18:41] <duanedesign> :)
[18:44] <rtgz_> duanedesign, i am creating a note containing the fixes that were mentioned (ubuntuone-client, did not trace storage-protocol so much, sorry)
[18:44] <CardinalFang> It needs to un-update 1 file, and then it will be finished.
[18:46] <chewit> do you know when Ubuntu One client 1.0.3 will be out of ubuntu proposed?
[18:49] <rtgz_> rmcbride, bug 455527 - was the result OK or not?
[18:50] <rtgz_> chewit, we are currently evaluating the fixes that were committed to 1.0.3, i mean at this very moment.
[18:50] <rtgz_> rmcbride, joshuahoover how about creating a Wiki Page for the results of the check so that it becomes a proper document?
[18:50] <chewit> great, cause ubuntu one has been fairly problematic for a few weeks now. I moved back to dropbox for about a week while some stuff on the servers was sorted
[18:52] <rmcbride> rtgz_: looking again. I think I did my last entry on that right before EOD for me
[18:53] <rmcbride> rtgz_: yea it's definitely fixed. I'll make a more clear entry
[18:54] <rmcbride> rtgz_: I'll leave the wiki decision up to joshuahoover
[18:56] <chewit> also, just out of interest: is work being done on the web interface (multi file uploading, fixes to the layout)?
[18:58] <dobey> chewit: if the proposed update works for you, please comment on the bugs you are experiencing, saying the update fixes the issue for you
[18:58] <dobey> chewit: this will help get it through the system faster :)
[19:02] <joshuahoover> rtgz_, rmcbride: catching up...was eating lunch w/ some friends
[19:02] <chewit> ok
[19:03] <chewit> btw, the sync seems to worked this time, thanks for you help
[19:03] <joshuahoover> rtgz_: so, you're thinking we should have a wiki page for capturing the results of the tests or something else?
[19:05] <rtgz_> joshuahoover, i have a tomboy gnote that lists all the bug reports
[19:07] <joshuahoover> rtgz_: right, so a list of all the bugs for this sru and a status on the testing (pass/fail plus notes if failed) sort of thing?
[19:07] <rtgz_> joshuahoover, yup
[19:07] <joshuahoover> rtgz_: sure, i'll throw that together right now so we can use it to track progress
[19:07] <rtgz_> joshuahoover, just take the release note and add PASSED - why not
[19:08] <duanedesign> is the SRU list for the package created from the changelog?
[19:08] <rtgz_> joshuahoover, or, we might use lp tags to assign e.g. 'release-1.0.1' + 'verification-done' + 'verification-failed' etc.
[19:09] <rtgz_> joshuahoover, just need to standardize on the tags :)
[19:09] <joshuahoover> rtgz_: we can do that, but it's probably not as convenient as being able to look at all the bugs in one spot and see pass y/n and a reason if no
[19:10] <rtgz_> joshuahoover, but we could build a script that will turn them into a wiki page in the end :)
[19:10] <joshuahoover> rtgz_: true
[19:10] <joshuahoover> rtgz_: maybe for now we do the wiki and then we come up with a better way after this round? i already know i need to write up a bit about how we handle some of this stuff...would like to work with you on it
[19:16] <rtgz_> joshuahoover, ok, it's just i haven't written anything useful with lp api yet :)
[19:16] <statik> hello pygi
[19:16] <statik> you were looking for me?
[19:16] <joshuahoover> rtgz_: you'll get your chance :)
[19:54] <chewit> this is strange, got my desktop to sync fine, cant get my laptop to sync, however tomboy sync works fine
[19:57]  * joshuahoover going back home as internet is back there
[19:57] <duanedesign> chewit: you can find instructions for updating to 1.0.3 here: https://answers.edge.launchpad.net/ubuntuone-client/+faq/930
[19:58] <duanedesign> joshuahoover: isnt the saying 'Home is where the internet is' :)
[19:58] <chewit> ah thanks :D
[19:59] <joshuahoover> duanedesign: heh
[19:59] <rtgz_> duanedesign, you are soooo right :)
[20:02] <duanedesign> rtgz_: if you do end up writing something with the Launchpad API i would like to look at it. I have been wanting to use that myself.
[20:04] <dobey> hmm
[20:04] <dobey> write what with the lp api?
[20:05] <rmcbride> rtgz_: Bug #459175  is also fixed. I verified a few things last night and had firefox die on me before I saved the LP pages apparently
[20:06] <qense> Are the GLib problems from the Jaunty PPA that caused Nautilus crashes solved now?
[20:07] <duanedesign> dobey: joshua_h and rtg_z were discussing the possobilities of using the API in the SRU process
[20:07] <rmcbride> rtgz_: likewise Bug #491573  (I'm going through my browser session and double checking a few  things and updating the bugs)
[20:10] <rtgz_> rmcbride, heh, we need a wiki page, both joshuahoover and you have tested bug 459175
[20:10] <dobey> qense: you need to downgrade
[20:10] <dobey> qense: there are instructions on the users list
[20:10] <rmcbride> rtgz_: yea, I had said something in channel yesterday about planning to hit those. a wiki would have been helpful.
[20:10] <qense> dobey: But is the PPA fixed already? What can I tell to the bug reporters, if they report anything new?
[20:11] <rmcbride> rtgz_: but this is our first SRU verification, so what we learn from this will make the next one better
[20:11] <dobey> qense: sudo aptitude install libglib2.0-0=2.20.1-0ubuntu2.1
[20:11] <dobey> libsoup2.4-1=2.26.0-0ubuntu3 libwebkit-1.0-1=1.0.1-4ubuntu0.1
[20:11] <dobey> libsoup-gnome2.4-1=2.26.0-0ubuntu3 libglib2.0-data=2.20.1-0ubuntu2.1
[20:11] <dobey> qense: the broken package was deleted from the ppa
[20:11] <dobey> qense: they need to downgrade the packages
[20:11] <qense> good
[20:11] <qense> dobey: thanks! I'll keep that in mind.
[20:11] <rtgz_> rmcbride, i vote for tags + external script to format it for wiki. I need more python experience :)
[20:11] <rmcbride> rtgz_: also Bug #451670  is definitely fixed (and not really an issue on karmic in the first place)
[20:12] <rmcbride> rtgz_: I like that idea
[20:12] <dobey> there are already some tags defined for SRU processing
[20:12] <dobey> i think they are "official" tags for Ubuntu (but not ubuntuone-client)
[20:15] <rtgz_> dobey, found verification-needed, verification-done, verification-failed (https://wiki.ubuntu.com/StableReleaseUpdates)
[20:16] <dobey> yeah
[20:16] <rtgz_> we might need additional tags so that we can signal the actual client version being verified - or if it can be done via other headers, then that is good.
[20:25] <dobey> not sure
[20:49] <rtgz_> duanedesign, http://paste.ubuntu.com/369747/
[20:49] <rtgz_> duanedesign, it will just fetch the bugs with verification-needed tag
[20:49] <rtgz_> duanedesign, erm.. /home/rtg is hardcoded there :)
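The formatting half of that script idea - turning bug results (however they were harvested, e.g. with launchpadlib from bugs tagged verification-needed) into MoinMoin table markup for the Testing wiki page - might look like this; the column layout is invented:

```python
def wiki_table(results):
    """Render (bug_number, status, note) tuples as MoinMoin table rows.

    The three columns are an assumption about what the SRU testing page
    would want; adjust to match the actual wiki layout.
    """
    header = "|| '''Bug''' || '''Result''' || '''Notes''' ||"
    rows = [
        "|| bug %d || %s || %s ||" % (num, status, note)
        for num, status, note in results
    ]
    return "\n".join([header] + rows)


print(wiki_table([(457147, "PASSED", ""),
                  (457564, "FAILED", "icon pref does not come back")]))
```

Feeding it from the Launchpad API would keep the pass/fail history near the bugs themselves while still producing the one-spot overview joshuahoover wants.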
[20:56] <duanedesign> rtgz_: nice.
[20:56] <joshuahoover> rtgz_, rmcbride: sorry, taken me waaaay too long to put a simple wiki page together...today is a day of distractions for me...i apologize...https://wiki.ubuntu.com/UbuntuOne/Testing ...maybe duanedesign and rtgz_ have a script for something better? :)
[20:57] <rtgz_> joshuahoover, you know... we need some tag to a) record who has verified the bug, b) record what release it was verified/failed against.
[20:58] <joshuahoover> rtgz_: yeah...i think we need to give it a little thought or some proposed ideas because i could see it getting very messy, very quickly in terms of the number of tags and keeping it all straight
[20:59] <rtgz_> joshuahoover, yes, i don't even feel that a tag is a good location for such info...
[21:00] <joshuahoover> rtgz_: yeah, doesn't "feel right" to me either but could be ok, just need to think about it a bit...i'm just trying to capture what we've done at this point right now and will likely give some thought as to what we do going forward on monday :)
[21:01] <rtgz_> joshuahoover, ok, i am filling the info i've got to Testing
[21:01] <joshuahoover> rtgz_: cool
[21:01] <rmcbride> rtgz_: was about to do the same. Either let me know when you're done or input what I've passed you, please :)
[21:03] <rtgz_> rmcbride, done with mine 2
[21:03] <rmcbride> rtgz_: thanks
[21:07] <rtgz_> done again, added the hibernation info as well
[21:08] <rtgz_> hm... wiki seems to be slow during writes...
[21:08] <joshuahoover> rtgz_, rmcbride: frustrating, bug #457564 wasn't passing this morning for me and now it is...i've changed nothing on the vm instance i'm testing on...hmmm...
[21:09] <rtgz_> joshuahoover, my test was ok. The icon was working fine and the bandwidth settings applied immediately - twisted breaks when 0 is set (bw settings can be saved, knowing the "Great Secret" of taking focus off the field)
[21:11] <rtgz_> joshuahoover, hm, we can use a special format for bug comments to store the values. This way the history will be useful and we will be able to avoid such extra tags.
[21:13] <rtgz_> need to write that to wiki while i am in context...
[21:39] <rtgz_> joshuahoover, https://wiki.ubuntu.com/RomanYepishev/UbuntuOne/StableReleaseUpdateProcedure
[21:41] <joshuahoover> rtgz_: very good...good idea to capture this while it's fresh in your head :)
[21:44] <duanedesign> are there a set of tags used inside the U1 project?
[21:45] <rtgz_> joshuahoover, still, this may not be that efficient, but it is better to have the SRU info near the original bug report...
[21:45] <duanedesign> ...for bug reports
[21:46] <duanedesign> should be the rest of that sentence. :)
[21:46] <joshuahoover> rtgz_: right, we need something...just not sure what that something should be at the moment :) my brain is fried today so any "something" i come up with right now will likely be garbage ;)
[21:46] <joshuahoover> duanedesign: yes, there are a set of tags we use
[21:48] <rtgz_> duanedesign, mmm.. yes, there is a 'standard' set that helps to tie bug reports together. I invented the christmas-bug tag and (i guess) urbanape came up with the farfignugen-share-dialog tag for web-ui related stuff. It is now called simply web-ui :)
[21:48] <duanedesign> i noticed rtgz_  used verification-needed in his python code. Was curious if there were any, that would be relevant to me.
[21:48] <joshuahoover> duanedesign: the most prominent ones are detailed on this page (and also setup as of official tags on each lp project): https://wiki.ubuntu.com/UbuntuOne/Bugs/WorkFlow
[21:49] <duanedesign> ok thanks
[21:49] <joshuahoover> duanedesign: under the "assignment" section...desktop+, foundations+, ops+ ...and we're always open to making changes to help improve things so please don't hesitate to make suggestions!
[21:49] <duanedesign> lol, i was just on that page
[21:49]  * duanedesign slaps forehead
[21:50] <duanedesign> joshuahoover: ok, great
[21:51] <duanedesign> as part of my work with the launchpad Focus Group in the Beginners Team I show community members how to use Launchpad.
[21:51] <duanedesign> i used Ubuntu One the other day in my demo for triaging bugs
[21:52] <duanedesign> I noticed one of the attendees in here this morning helping to mark duplicates
[21:54] <duanedesign> that was a nice unintentional consequence. I was just using U1 because it was what I had been working on lately.
[22:01] <rtgz_> erm
[22:02] <rtgz_> guys, why did bug 455544 get the launchpad bugtracker message "This bug was fixed in the package ubuntuone-client - 1.0.3-0ubuntu1"? - it is not fixed
[22:02] <dobey> rtgz_: huh?
[22:03] <rtgz_> dobey, bug 455544 - there is a message from LP bug tracker that "This bug is fixed" - was that performed automatically?
[22:03] <dobey> rtgz_: the package must have been uploaded to updates
[22:03] <dobey> rtgz_: but that bug was fixed. what you're seeing is a different bug, no?
[22:04] <rtgz_> dobey, erm
[22:04] <rtgz_> dobey, https://bugs.launchpad.net/ubuntuone-client/+bug/455544/comments/34
[22:04] <dobey> rtgz_: don't base your idea of whether or not a bug is fixed on the description
[22:04] <rtgz_> 0 is the default, if 0 is set then it says Protocol version error
[22:05] <dobey> rtgz_: 0 is not the default
[22:06] <rtgz_> dobey, what is default then? -1 does not work and it will set the bw preferences to 0
[22:07] <dobey> the default is -1
[22:07] <dobey> it getting set to 0 is a different bug
[22:08] <dobey> well, in fact, i think there are 2 bugs
[22:08] <dobey> that -1 gets turned into 0
[22:09] <dobey> and that bw throttling also affects messages that aren't upload/download
[22:10] <dobey> throttling the auth commands and such is silly
[22:11] <rtgz_> dobey, yes, but. The bug was originally related to the fact that if a person enables throttling and does not change anything then syncdaemon is unusable. This condition remains.
[22:13] <rtgz_> dobey, so be it on=True read_limit=-1 or on=True read_limit=0, the fix cannot be said to be complete, as we have a bug in the applet as well
[22:15] <rtgz_> the user will open Preferences, enable throttling, read_limit is immediately reset to 0, syncdaemon divides by zero, a Protocol error occurs, and the user is unhappy because he was told that syncdaemon would work with default values.
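The failure chain rtgz_ describes suggests guarding the throttle arithmetic; a sketch in which any limit <= 0 simply disables throttling. Note dobey argues below that 0 should instead mean "block transfers", so this is one possible reading, not syncdaemon's actual code:

```python
def throttle_delay(bytes_pending, limit_bytes_per_sec):
    """Seconds to wait before sending bytes_pending under the limit.

    Guards against the division by zero in the chat: both the -1
    "default" and the 0 the applet writes are treated as "do not
    throttle" here (an assumption -- whether 0 should block transfers
    instead was itself under discussion).
    """
    if limit_bytes_per_sec is None or limit_bytes_per_sec <= 0:
        return 0.0
    return bytes_pending / float(limit_bytes_per_sec)
```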
[22:17] <rtgz_> and, IMHO, the cure for 462003 is worse than the disease
[22:17] <rtgz_> bug 462003
[22:17] <dobey> huh?
[22:18] <dobey> you're probably hitting a different bug i guess, that's visible now because of that fix
[22:18] <dobey> i don't know
[22:18]  * dobey wishes people would have tested this stuff 3 months ago when the branches to fix them actually landed
[22:19] <rtgz_> true
[22:20] <dobey> and reusing the same bug for that isn't a great idea
[22:24] <rtgz_> dobey, ok, the first bug was that storing -1 as read_limit/write_limit caused syncdaemon to fail, and this was patched. Now storing 0 as read_limit/write_limit causes the same problem.
[22:24] <dobey> rtgz_: it's a different bug that apparently gives a similar result
[22:25] <dobey> different bugs can have the same symptoms unfortunately
[22:25] <rtgz_> dobey, yes, but the original reported problem is not fixed
[22:25] <dobey> well it is, because -1 is handled correctly now
[22:26] <dobey> another bug is just giving you the same symptom
[22:26] <rtgz_> dobey, i understand that, but given the users perspective, having Bandwidth throttling clicked immediately halts syncdaemon as it starts using 0 which is written to config file by the applet
[22:26] <dobey> rtgz_: presumably you might also get the same symptom by blocking that traffic with a firewall
[22:27] <dobey> rtgz_: if the original reporter was actively involved in testing the fix and saying the same thing, then maybe. but like a lot of bugs we get, it looks like it was a file and forget
[22:28] <rtgz_> dobey, yes, I could but the bug does not mention the firewall, and the problem arises from the _intended_ usage of application preferences. In this case it does not fail after next syncdaemon start, it fails immediately.
[22:28] <dobey> hah, it was filed by jdo
[22:29]  * dobey makes a note to smack him
[22:29] <rtgz_> :)
[22:29] <dobey> rtgz_: the intended usage of the preferences is that setting stuff to 0 blocks file transfers, not authentication and such
[22:30] <dobey> different bug, same symptom
[22:31] <dobey> rtgz_: not to mention the several people saying "i upgraded, and my problem is fixed now"
[22:31] <rtgz_> dobey, i guess you will need to make a note to smack everybody to actually test the prepared SRU before it hits the shelves. And build in some kind of timer that stops it from shipping if no response is given about this version :)
[22:32] <dobey> rtgz_: it's been in proposed for > 6 weeks, and people have been poked multiple times to test this stuff, with no real response :(
[22:33] <rtgz_> dobey, ok, we will see what can be done about that. That's just sad that this all got such an attention 1 day before it is accepted to karmic-updates :(
[22:34] <dobey> rtgz_: we can do more SRUs if we need to
[22:34] <rtgz_> dobey, and I was running it for 2 weeks w/o touching these knobs so I was happy...
[22:34] <dobey> rtgz_: but we shouldn't block having it work for 50K users, because one or two were able to get a similar symptom even with the fix
[22:34] <rtgz_> dobey, true
[22:35] <dobey> rtgz_: and clearly we need to write more tests, that test exactly these conditions
[22:36] <rtgz_> dobey, okay, I guess the relevant info from that bug report should be copied to new one describing the final problem and probably give a link to that bug report from the original one so that it would be possible to find it
[22:38] <rtgz_> it looks like this: 1. bug with applet setting default to 0. 2. bug in syncdaemon that applies bw prefs to control messages as well as the content. 3. Syncdaemon should work with 0 values for read_limit and write_limit
[22:40] <dobey> speaking of bugs, Delta has plenty of them :(
[22:40] <dobey> rtgz_: i think multiple other bug reports need to be filed
[22:40] <rtgz_> dobey, Delta?
[22:41] <dobey> yeah, the airline
[22:41] <dobey> what use is on-line check-in, if you can't select any seats!
[22:42] <rtgz_> dobey, what use of online banking when the button to perform the payment failed to load? :)
[22:43] <duanedesign> dobey: that is annoying
[22:43] <rtgz_> bugs are everywhere... It is just 1) nobody cares for some 2) people get used to them 3) people switch elsewhere
[22:43] <duanedesign> i experienced that for the first time last month
[22:46] <dobey> nah, Delta is just made of fail
[22:46] <dobey> it's like how they say "Thank you for choosing Blah." when you land somewhere or take off
[22:47] <dobey> i didn't *choose*
[22:47] <rtgz_> dobey, ok, bug 465030 also has the same symptom for different code, i.e. preferences are saved, but only when focus is moved somewhere
[22:47] <dobey> you're one of the 3 airlines at my airport, and the others don't fly to where i'm going
[22:47] <dobey> not really choice
[22:52] <duanedesign> dobey: has anyone done any work on putting together a list of testcases for nightlies and releases
[22:52] <dobey> most of this stuff should be in our unit tests
[23:03] <joshuahoover> rmcbride: not sure how to test bug #476777 ...i know how i can get the same results...set read_limit and write_limit to None in syncdaemon.conf but not sure how that would happen or if that's how it happened originally
[23:09] <verterok> joshuahoover: hi :)
[23:09] <duanedesign> dobey: ahh, so you guys do automated unit-testing?
[23:10] <joshuahoover> verterok: hi
[23:10] <dobey> duanedesign: yes
[23:10] <verterok> joshuahoover: 476777 was caused by an error in the configglue parser
[23:10] <dobey> duanedesign: ideally everything will be automated, but there are some things we can't test like that right now
[23:10] <rtgz_> joshuahoover, the syncdaemon bug with default settings needs to be split into 3 bug reports, the package that we were testing is now in karmic-updates
[23:12] <duanedesign> dobey: i guess i was wondering if a checklist of test cases like the ones on the QA site would be useful
[23:13] <dobey> duanedesign: rmcbride has a set of things he tests all the time, talk with him about that :)
[23:13] <verterok> joshuahoover: and I think it was triggered with the  -1 value
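verterok attributes bug #476777 to a configglue parsing error that left read_limit/write_limit as None, which matches joshuahoover's repro of setting them to None in syncdaemon.conf. A defensive fallback for that symptom might look like this hypothetical helper (not the real configglue or syncdaemon code):

```python
def get_limit(parsed_value, default=-1):
    """Return an int bandwidth limit, falling back to the default
    (-1 = unthrottled) when the parser hands back None or an
    unparseable value instead of a number."""
    try:
        return int(parsed_value)
    except (TypeError, ValueError):
        return default
```

A guard like this would mask the parser bug rather than fix it, but it keeps a broken config value from taking syncdaemon down.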
[23:13] <joshuahoover> rtgz_: 3 bug reports?
[23:13] <duanedesign> dobey: ok. Thank you
[23:13] <rtgz_> joshuahoover, : 1. bug with applet setting default to 0. 2. bug in syncdaemon that applies bw prefs to control messages as well as the content. 3. Syncdaemon should work with 0 values for read_limit and write_limit.
[23:15] <joshuahoover> rtgz_: ok, that makes sense, though i don't like allowing the setting to 0...i see the use for it maybe (i only want to upload, but not download files) but i think it causes more confusion than it's worth
[23:16] <statik> duanedesign, I would very much like to have a set of acceptance tests that get run (and perhaps later partially automated) against release candidates/nightlies
[23:16] <rtgz_> joshuahoover, hm, true, we might need a specific setting that says i want to download files only, no upload. I would think of 0 as infinity, though...
[23:16] <statik> there is a bunch that can be done inside unit tests, but a bunch more that is just more sane when run against an installed client talking to a server
[23:16] <duanedesign> statik: yes i thought coming up with a manual list would be beneficial as it could be used to later produce an automated system
[23:17] <dobey> well yes, integration tests shouldn't be in unit tests
[23:17] <joshuahoover> duanedesign: yes, we should discuss with rmcbride as he does have a set of automated acceptance tests already from what i recall...it would be good to start there and then see what else needs to be accounted for
[23:18] <duanedesign> joshuahoover: ok, ill make a note and touch base with him
[23:20] <rtgz_> joshuahoover, re: zero in preference: bug 509742
[23:21] <joshuahoover> rtgz_: right :)
[23:25] <joshuahoover> rtgz_: but i'm not sure all agree that bug is the right way to go...i'm arguing for not allowing users to turn off read and/or writes with the client, not just working around the issue that setting the limits to 0 doesn't work right now
[23:27] <rtgz_> joshuahoover, i will leave item 3 for Monday, since it requires some more thinking. It is 1:26 AM here and I am now operating in "Only report bugs that are definitely bugs" mode only :)
[23:29] <joshuahoover> rtgz_: get some sleep!
[23:30] <rtgz_> joshuahoover, hey, i am not fixing bugs, i am only reporting them :)
[23:30] <joshuahoover> rtgz_: heh
[23:38] <rtgz_> ok, my bug report consisted of 2 lines (which is too short), definitely need to go to bed :)
[23:39] <rtgz_> okay, see you all on Monday! Have a nice weekend :)