[16:49] <shadeslayer> has it started?
[16:49] <dholbach> not yet
[16:49] <dholbach> 11 more minutes
[16:49] <dholbach> https://wiki.ubuntu.com/UbuntuDeveloperWeek
[16:49] <shadeslayer> oh...
[16:50] <shadeslayer> Laney: can you point me to some basic packaging links.... im like a absolute 101 at this
[16:51] <dholbach> shadeslayer: https://wiki.ubuntu.com/PackagingGuide and more generally: https://wiki.ubuntu.com/MOTU/GettingStarted
[16:52] <shadeslayer> dholbach: thanks
[16:59] <dholbach> HELLO EVERYBODY! WELCOME TO THE LAST DAY OF UBUNTU DEVELOPER WEEK!
[16:59] <pitti> \o/
[17:00] <Kmos> hi :)
[17:00] <dholbach> First up we have three heroes, dpm, danilos and pitti, who are going to talk about "Translations for developers"!
[17:00] <c_korn> hey ho
[17:00] <dpm> hi! \o/
[17:00] <dholbach> as always please keep the chat in #ubuntu-classroom-chat and ask your questions there too
[17:00] <danilos> heya
[17:00] <dholbach> please make sure you prefix them with QUESTION:
[17:00] <dholbach> also... if you're not comfortable with English and need to ask questions in your language, try one of these channels:
[17:00] <dholbach>  * Catalan: #ubuntu-cat
[17:00] <dholbach>  * Danish: #ubuntu-nordic-dev
[17:00] <dholbach>  * Finnish: #ubuntu-fi-devel
[17:00] <dholbach>  * German: #ubuntu-classroom-chat-de
[17:00] <dholbach>  * Spanish: #ubuntu-classroom-chat-es
[17:00] <dholbach>  * French: #u-classroom
[17:00] <dholbach> (this fits quite well with the topic of Translations, hm? :))
[17:01] <dholbach> Enjoy the sessions and take the offer to get involved seriously! :-)
[17:01] <dholbach> dpm, danilos, pitti: the floor is yours!
[17:01] <pitti> Hello all!
[17:01] <danilos> dholbach: thanks
[17:01] <pitti> I'm Martin Pitt from the Ubuntu Desktop Team, and more or less the creator of the "language pack" system we have used in Ubuntu since 2005.
[17:01] <danilos> Hi all, I am Danilo and I lead the Launchpad Translations development team: Launchpad is an open source foundation for Ubuntu i18n and l10n
[17:01] <dpm> Hi everyone, my name is David Planella, I'm the Ubuntu Translations Coordinator and as such my job is to keep the Ubuntu translation community rocking
[17:02] <pitti> In Ubuntu we spend quite some effort on translation of software and move translations around a lot, so that we can clearly separate the actual packaged software from the translations which belong to it, for the following main reasons:
[17:02] <pitti>  * Make it as easy as possible for non-technical contributors to help translating software.
[17:02] <pitti>  * Deliver translation updates to stable Ubuntu releases without having to touch the actual software packages, and thus jeopardizing their stability.
[17:02] <pitti>  * Have good control over which translations land on the release CD images, to mitigate space constraints.
[17:02] <pitti> == What are language packs? ==
[17:02] <pitti> Langpacks are packages which contain translations for a particular language for software that we ship in Ubuntu main. Universe and multiverse are not currently handled by this system.
[17:03] <pitti> The basic idea is that the actual programs are packaged without any translations, and if you are using e. g. a Portuguese desktop, you need to install the Portuguese language pack to have Ubuntu talk Portuguese instead of English to you.
[17:03] <pitti> As a user, you don't usually need to worry about this too much, since the installer takes care of installing what you need. There's also the "Language selector" in the System menu which allows you to install more.
[17:03] <pitti> In order to avoid unnecessary downloads, wasted CD space, and wasted installation hard disk space, there is not just one langpack for a particular language, but they are split into "categories" (GNOME, KDE, and common), so that a pure Ubuntu installation does not need to carry KDE translations.
[17:04] <pitti> E. g. the "language-pack-gnome-pt" package ships Portuguese translations for all GNOME related packages in main.
[17:04] <pitti> To further complicate the issue, there is another split between "-base" and "update" packages. The idea is that the bulk of translations is usually ready and done by the final release of Ubuntu, and we want users to not have to download the same things over and over again. So the "-base" packages are big and contain the state of translations as it was at release time, while the "update" packages
[17:04] <pitti> are small and only contain the delta since the release.
[17:04] <pitti> That's why there is not just "language-pack-gnome-pt" (the update package), but also "language-pack-gnome-pt-base".
[17:05] <pitti> Thus for a single language you usually have a set of six related language-pack-* packages. This makes things a bit convoluted, but makes the system reasonably efficient.
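A tiny sketch of that naming scheme may help; the package names follow the pattern described above, while the function itself is purely illustrative:

```python
# Three categories (common, GNOME, KDE) times the -base/update split
# give the six binary packages per language mentioned above.
def langpack_names(lang):
    names = []
    for category in ("", "gnome-", "kde-"):  # "" is the common category
        update_pkg = "language-pack-%s%s" % (category, lang)
        names.append(update_pkg)             # small delta since release
        names.append(update_pkg + "-base")   # bulk of translations at release
    return names

print(langpack_names("pt"))
```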
[17:05] <pitti> Questions so far about this split?
[17:06] <pitti> seems not
[17:06] <pitti> == Translation formats ==
[17:06] <pitti> By far the most known and used method of translating Linux software is "gettext".
[17:06] <pitti> It
[17:06] <pitti>  * wraps the translatable strings in the software into a special function _("Hello")
[17:06] <pitti>  * extracts those strings into a template file which contains all the translatable strings (called the "PO template")
[17:07] <pitti>  * compiles human-readable and editable translation files (*.po) to binary "*.mo" files which provide super-fast access at runtime
[17:07] <pitti>  * uses the installed *.mo files at runtime to map an English string to a translated string.
[17:07] <pitti> A typical record in a gettext PO file looks like this:
[17:07] <pitti>    msgid "Good morning"
[17:07] <pitti>    msgstr "Доброе утро"
[17:07] <pitti> (this would be in the ru.po file for Russian)
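To make the PO-to-MO-to-runtime steps concrete, here is a self-contained Python sketch that hand-assembles a minimal binary MO catalog (the same format msgfmt compiles PO files into) and looks a string up through the stdlib gettext machinery; the strings are illustrative, and real catalogs would of course be built with msgfmt:

```python
import gettext
import io
import struct

entries = [
    # The empty msgid carries the catalog metadata, including the charset.
    (b"", b"Content-Type: text/plain; charset=UTF-8\n"),
    ("Good morning".encode("utf-8"), "Доброе утро".encode("utf-8")),
]

ids = b""
strs = b""
offsets = []
for msgid, msgstr in entries:
    offsets.append((len(ids), len(msgid), len(strs), len(msgstr)))
    ids += msgid + b"\x00"
    strs += msgstr + b"\x00"

n = len(entries)
keystart = 7 * 4 + 16 * n        # header + two (length, offset) tables
valuestart = keystart + len(ids)
koffsets, voffsets = [], []
for id_off, id_len, str_off, str_len in offsets:
    koffsets += [id_len, id_off + keystart]
    voffsets += [str_len, str_off + valuestart]

# Header: magic, revision, string count, msgid table offset, msgstr table
# offset, hash table size and offset (unused here).
mo = struct.pack("<7I", 0x950412DE, 0, n, 7 * 4, 7 * 4 + 8 * n, 0, 0)
mo += struct.pack("<%dI" % (2 * n), *koffsets)
mo += struct.pack("<%dI" % (2 * n), *voffsets)
mo += ids + strs

# Load it exactly like an installed *.mo file would be used at runtime.
catalog = gettext.GNUTranslations(io.BytesIO(mo))
print(catalog.gettext("Good morning"))   # -> Доброе утро
print(catalog.gettext("Good evening"))   # untranslated: falls back to msgid
```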
[17:08] <pitti> Launchpad and the Ubuntu langpack system have fully supported gettext from day one.
[17:08] <pitti> Unfortunately there is not just gettext in the Linux world, but also other vendor-specific systems, mainly due to the fact that these applications did not originate in the Unix world.
[17:08] <pitti> The prime examples here are Mozilla software (Firefox et al) which use "XPI", and OpenOffice.org which uses a system called "SDF".
[17:08] <pitti> Launchpad and langpacks grew support for XPI about a year ago, so that Launchpad can be used to translate Mozilla software now. SDF is not yet handled by Launchpad or langpacks.
[17:09] <pitti> For about a week now in karmic, we have also started handling GNOME help file translations.
[17:09] <pitti> While they use gettext in principle, the translated files are assembled at build time, and packages ship the readily translated XML files and translated screenshots directly.
[17:09] <pitti> They take a lot of space, so we now strip them from the actual packages, temporarily park them in Launchpad, and put them into the language packs. But they are really just copied verbatim right now, there is no Launchpad support for updating the help translations yet.
[17:09] <pitti> questions about translation format?
[17:09] <pitti> please also yell in #chat if I'm too fast/slow
[17:10] <pitti> ok, so let's talk a little how translations make their way from the translator to the user's desktop
[17:10] <pitti> == Flow for gettext translations ==
[17:11] <pitti> Since gettext is pretty much the only system which you should need to know, I would like to concentrate on that from now on.
[17:11] <pitti> I like to explain how translations make their way through this system, to allow developers to be aware of the needs of translators, and how translations make it to the final desktop.
[17:11] <pitti> The 1000 m perspective looks like this:
[17:11] <pitti> (details will follow, don't worry)
[17:11] <pitti> translations in upstream tarball → extract at package build time → import into Launchpad
[17:11] <pitti> translation community → add/change strings on Launchpad
[17:12] <pitti> Launchpad translation export → sort them by language and category → generate language-pack-*, and upload them
[17:12] <pitti> Danilo and David will talk in detail about the Launchpad part later on, so I'll give some details on the packaging related bits.
[17:12] <pitti> == build time extraction ==
[17:12] <pitti> The majority of translations come from the PO files already shipped in the upstream tarballs. These need to be extracted and imported into Launchpad, and the compiled MO files removed from the built application packages.
[17:13] <pitti> This is done by a script "pkgstriptranslations" from the "pkgbinarymangler" package. That package is installed in the Ubuntu build servers, but of course you can also install it locally to see what it does.
[17:13] <pitti> For the import to actually succeed and work well, packages must ship or build an up-to-date PO template, i. e. the template must be an accurate list of all translatable strings in the application.
[17:13] <pitti> It is greatly preferred to have this generated at build time (usually with "intltool-update -p"), to ensure that it isn't stale, and also contains the added/changed strings we do in Ubuntu patches.
[17:14] <pitti> If the package uses cdbs and includes gnome.mk or translations.mk, this will be taken care of automatically. All other packages need to be fixed to build a PO template. (This should be the case for almost all packages in Ubuntu main nowadays.)
[17:14]  * pitti hands mike to danilos
[17:14] <danilos> thanks pitti; so, let me go on a bit with this
[17:14] <danilos> = Package structure =
[17:14] <danilos> As Martin mentioned, POT and PO files are produced as part of binary builds: for Launchpad to import translations correctly, make sure your builds do produce POT files, or translations will not be up to date (or not imported at all with a new package).  Also, note that this process happens only for source packages which are in Ubuntu 'main'.
[17:15] <danilos> Note that you can have multiple translation templates (POT) for different purposes.  Eg. a library POT and main UI POT: but make sure that you keep relevant translation PO files in the same subdirectory as their respective POTs.
[17:15] <danilos> Also, don't worry about merging PO files with the latest POT files: Launchpad does that for you with a very smart algorithm that doesn't lose any contributions and takes care of conflicts.
[17:15] <danilos> KDE is special in the way POT files are built and where translations are pulled out of: https://wiki.ubuntu.com/Translations/Upstream/KDE/KubuntuTranslationsLifecycle
[17:16] <danilos> After all translation files are stripped off, they end up in Launchpad translations import queue: https://translations.launchpad.net/ubuntu/+imports
[17:16] <danilos> = Import queue =
[17:16] <danilos> When files first enter the import queue, they are put into the 'Needs review' state.  For templates, if the base path and filename match a template Launchpad has previously imported in that source package, it is considered an update of that template, attached to it and marked as 'Approved'.
[17:16] <danilos> For translations, Launchpad tries to match them against existing templates and existing language codes.  Launchpad purposely recognizes only "the shortest possible" language codes: use "es.po" and "de.po", not "es_ES.po" or "de_DE.po".
[17:17] <danilos> Anything that can't be automatically approved stays in the queue for someone to look at.  If you wonder who that someone might be, I introduce you to...
[17:17] <danilos> dpm: tell us more about what UTC stands for :)
[17:17]  * dpm gets the mike
[17:18] <dpm> UTC stands for Ubuntu Translations Coordinator or the
[17:18] <dpm> Ubuntu Translations Coordinators Team
[17:18] <dpm> The Ubuntu Translations Coordinators team is a group of wonderful people who take care of all the manual adjustments, report the more technical issues and, in short, care for Ubuntu translations.
[17:18] <dpm> Here you can see them sporting their good looks: https://launchpad.net/~ubuntu-translations-coordinators/+mugshots
[17:19] <dpm> The team was born from the intention of making the technical and behind-the-scenes work more open to the community. As such, Launchpad Translations has progressively gained more fine-grained permissions at different levels, which are granted to these trusted community members.
[17:19] <dpm> So they can participate in the process
[17:20] <dpm> One of the main tasks of the UTC team is to manage the imports queue and manually approve, tweak or block translation templates which the auto-approver script cannot automatically handle. They can also decide which translations must be included in language packs.
[17:21] <dpm> And here's a link to the Karmic imports queue, for those interested in horribly long links: https://translations.edge.launchpad.net/ubuntu/karmic/+imports?field.filter_extension=pot&field.filter_status=NEEDS_REVIEW&start=0&batch=150
[17:22] <danilos> dpm: thanks!
[17:22]  * danilos fights with dpm over mike
[17:22]  * pitti stumbles over the cable
[17:22] <danilos> Also, if you want your package's translations (in 'main') exported into language packs, you can have the UTC team set it up: if they don't, you'll have to manually download translation tarballs from Launchpad and use that export when building an updated version of the package.
[17:22] <danilos> Does anyone have any quick questions so far about how stuff gets into the queue and how it gets approved?
[17:23] <dpm> QUESTION: What is the official way to contact UTC?
[17:24] <dpm> The UTC has got a mailing list, and you can also contact them by filing a request to the Answers system on the ubuntu-translations project
[17:24] <dpm> here are the relevant links:
[17:24] <dpm> https://wiki.ubuntu.com/Translations/Contact/#Ubuntu%20Translations%20Coordination
[17:25] <dpm> https://answers.edge.launchpad.net/ubuntu-translations/+addquestion
[17:25] <dpm> following the first link you'll also be able to consult the public mailing list archives
[17:25] <dpm> Any other questions on UTC? Or anything else so far?
[17:26] <danilos> I guess not :)
[17:26] <danilos> so, let's see what happens next
[17:26] <danilos> = Translation =
[17:26] <danilos> After files have been put into 'Approved' state (either automatically or manually), they are imported into Launchpad: usually very quickly, but some uploads can take longer than others (like KDE-l10n and OpenOffice.org with their 20k files each can take around a day).
[17:27] <danilos> After POT and PO files have been imported, it's possible to use the Launchpad web UI to translate Ubuntu: it provides an easy-to-use interface with some advanced features on top.  The easy way is: go to a web page, look at the English string, fill in a text box with your suggested translation, and save the page.
[17:27] <danilos> The more advanced way to translate is to download a PO file, work on it offline, and then upload it back.  Launchpad will worry about any conflicts, and will do it on a per-message basis: if you translated the same string that someone translated online at the same time, it will make your translation a suggestion and let you know about it.
[17:27] <danilos> One of the cool new things is that translators only need to work on one version of a project (i.e. the trunk series for a project, or Jaunty for Ubuntu), and where relevant, their work will be reflected in all the other versions as well.
[17:28] <danilos> So, you do a translation of "Open file" in Jaunty.  You don't have to go to Karmic to do the translation there as well, it will be automatically propagated.  We call this feature "message sharing".
[17:28] <danilos> To control access to translation, Launchpad offers translation groups: they are a list of translation teams matched by language.  Only people who are part of those translation teams can 'approve' others' translation suggestions: without approval, their translations are never made active.
[17:28] <danilos> All this is made possible by the Launchpad development team consisting of henninge, jtv, Ursinha and me: find them in #launchpad and say hi!  Also, you can become part of the team as well, remember Launchpad is open source now!
[17:29] <danilos> Ubuntu has a vibrant translation community as part of 'Ubuntu translators' group.  But, I'll let David tell you more about it.
[17:29] <danilos> dpm: I heard to get an Ubuntu translation it doesn't just take a nice platform, you need some people as well? :)
[17:29] <dpm> sure, nice and exciting people
[17:30] <dpm> I'll take it from where Danilo was mentioning translation groups...
[17:30] <dpm> First of all, the permissions for translating projects (or distros in the case of Ubuntu) are organised around _translation groups_, to which project maintainers can assign their translations.
[17:31] <dpm> The biggest translation group in Launchpad is the Ubuntu Translators group. There (https://translations.edge.launchpad.net/+groups/ubuntu-translators) you can see that there is a second level that translation communities use to organise themselves: _translation groups_ are containers for _translation teams_
[17:31] <dpm> Translation teams are where the exciting stuff takes place
[17:32] <dpm> Communities get organised in teams around Launchpad and use it to translate Ubuntu
[17:33] <dpm> At the risk of repeating what Danilo has already said, I'll restate it: although everyone with a Launchpad account can provide translation suggestions, only those translators in an Ubuntu translation team will be able to approve them or to submit translations themselves.
[17:34] <dpm> this means that everyone can get introduced to the world of translations easily
[17:35] <dpm> but at the same time, only experienced translators will be able to accept suggestions, maintain a level of translation quality and guide newcomers into the process of becoming full-fledged translators
[17:36] <dpm> so let's talk about upstream/downstream relationships
[17:36] <dpm> danilo, do you want to take it from there?
[17:36] <danilos> sure, thanks dpm
[17:36] <danilos> = Upstream and downstream =
[17:36] <danilos> Ubuntu makes a lot of use of upstream software.  Some of it is Ubuntu's own, like upstart or jockey.  Others are completely independent, like Evolution or Firefox.
[17:37] <danilos> With package builds for 'external upstream' applications, you usually get translations from upstream integrated by including the upstream PO files in the build. For 'internal upstream', they are usually hosted in Launchpad as separate projects, and they require some care to make sure people are not confused about where to do their translation.
[17:37] <danilos> Note that upstreams usually do not update translations for 'older' versions: that's what Launchpad allows Ubuntu to do.  You can still update Hardy translations and they will be reflected in the next language pack update.
[17:37] <danilos> now, how do we get to language packs?
[17:38] <danilos> = Exporting language pack tarballs =
[17:38] <danilos> After translations have been done in Launchpad, Launchpad aggregates all the translations for a single Ubuntu release and puts them in a tarball.  Launchpad calls these "language packs", but they are just the base tarballs used to construct the final language packs you get installed on your system.
[17:38] <danilos> Launchpad does weekly exports of language pack tarballs, with the following schedule: http://dev.launchpad.net/Translations/LanguagePackSchedule
[17:38] <danilos> After they are produced, they are listed on the distribution release's language packs page, eg. https://translations.launchpad.net/ubuntu/jaunty/+language-packs
[17:39] <danilos> There are two slightly different types of tarballs Launchpad can produce: either a full tarball (what pitti called "-base"), containing all translations for templates marked as included in language packs (don't forget about that bit), or one with only those translations which have been updated since the last full language pack was released (called a "delta language pack" in Launchpad, and an "update" package in Ubuntu).
[17:39] <danilos> QUESTION: how does Launchpad handle returning translations to their original maintainer?
[17:40] <danilos> always an interesting matter: Launchpad provides two things: a translation platform for Ubuntu and for projects who use Launchpad as their base translation portal
[17:40] <danilos> in case of Ubuntu, translations done in Launchpad are mostly updates to existing upstream translations
[17:41] <danilos> With a wide variety of upstreams that Ubuntu uses, there is simply no way Launchpad can know all the ways to send updated translations back in good manners (i.e. not considered spam or aggressive)
[17:42] <danilos> so, Launchpad provides a facility to help translators submit their work upstream themselves: when they go to export a translation from Launchpad, they can choose to export only those translations which have been changed
[17:42] <danilos> by ticking the 'Export only changed translations' box on PO file export page
[17:42] <danilos> that file can then be sent to the original maintainer for inspection and merging with the upstream translation
[17:43] <danilos> I hope this answers this question
[17:44] <danilos> I want us to get back to language pack production: I'll let pitti tell you what happens next after Launchpad produces tarballs with translation files
[17:44] <pitti> == langpack-o-matic ==
[17:44] <pitti> The upstream and Ubuntu community translations get merged together in Launchpad, and then regularly get exported as a huge tarball which contains all translations for all applications.
[17:44] <pitti> The job of dissecting this 500 MB monster and producing installable debs is done by a set of scripts that I called "langpack-o-matic".
[17:45] <pitti> It has a set of package skeletons for language-pack-* and language-pack-*-base, and instantiates a group (base/update and gnome/kde/common) of them for each language that is present in the export.
[17:45] <pitti> For deciding what is a GNOMEish or a KDEish package it currently uses some heuristics, looking at the package description, dependencies, and so on.
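Such a heuristic could look roughly like this; this is a toy illustration, not langpack-o-matic's actual code, and the field names and keywords are made up:

```python
def categorize(description, dependencies):
    """Toy GNOME/KDE/common classifier in the spirit described above."""
    blob = (description + " " + " ".join(dependencies)).lower()
    if "kde" in blob or any(d.startswith("libqt") for d in dependencies):
        return "kde"
    if "gnome" in blob or any(d.startswith("libgtk") for d in dependencies):
        return "gnome"
    return "common"

print(categorize("Evince document viewer for GNOME", ["libgtk2.0-0"]))  # -> gnome
```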
[17:45] <pitti> Based on the categorization and language, it sorts the files into the generated language-pack-* source packages. It also adds some extra data, such as converting Mozilla related gettext translations into XPI files, or shipping flag images for KDE.
[17:45] <pitti> == Testing ==
[17:46] <pitti> For the current Ubuntu development release, langpack-o-matic uploads the generated langpacks straight to the archive, i. e. Karmic at the moment.
[17:46] <pitti> That way, they get maximum testing, and we aren't concerned about small regressions within the development release
[17:46] <pitti> For stable releases we need to apply more care; since translations have the potential to break software, or just regress in quality, they need to get thorough testing before they get uploaded to e. g. jaunty-updates.
[17:46] <pitti> For this, we have a personal package archive where langpack-o-matic uploads updates for stable Ubuntu releases on a weekly basis. If you are translating Ubuntu software on Launchpad, or just would like to help testing, please enable this PPA to always get the latest translations, and report problems immediately.
[17:47] <pitti> Usually, the PPA packages are uploaded to -proposed once a month, then dpm sends out a call for testing on the translators mailing list, and once we can be reasonably sure to not have broken much, they get to -updates for general consumption.
[17:47] <pitti> this answers: fran_dieguez_| QUESTION: How often the language-pack-update are updated at stable releases?
[17:47] <pitti> == Links ==
[17:47] <pitti> Details about all the involved processes: https://wiki.ubuntu.com/Translations/TranslationLifecycle
[17:48] <pitti> pkgbinarymangler package: https://launchpad.net/ubuntu/+source/pkgbinarymangler
[17:48] <pitti> langpack-o-matic project, bugs, code: https://launchpad.net/langpack-o-matic
[17:48] <pitti> Weekly langpack PPA: https://launchpad.net/~ubuntu-langpack/+archive
[17:48] <pitti> == Q & A ==
[17:48] <pitti> I propose we go in order of #-chat now
[17:48] <danilos> pitti: yeah, let's do that
[17:49] <pitti> qense| QUESTION: A bit late, but does the _() function also works in Python? What module do you need to import?
[17:49] <pitti> I think I'll take that
[17:49] <pitti> Python has a gettext module for that
[17:49] <pitti> it doesn't export _() by itself, but it's easy enough to do it with
[17:49] <pitti> from gettext import gettext as _
[17:50] <danilos> == More links ==
[17:50] <danilos> General i18n info for developers (packaging and coding): https://wiki.ubuntu.com/UbuntuDevelopment/Internationalisation (I'll try to have the page in a readable state by tomorrow)
[17:50] <danilos> Ubuntu import queue: https://translations.launchpad.net/ubuntu/+imports
[17:50] <danilos> Current language pack tarball schedule: http://dev.launchpad.net/Translations/LanguagePackSchedule
[17:50] <danilos> Language pack tarballs: https://translations.launchpad.net/ubuntu/karmic/+language-packs
[17:50] <danilos> Launchpad documentation: https://help.launchpad.net/Translations
[17:50] <danilos> and back to...
[17:50] <danilos> = Q & A =
[17:50] <pitti> fran_dieguez_| and related: QUESTION: if a newbie translator does work in Launchpad and the original translator of that app also works outside of Launchpad, how does Launchpad handle the collisions?
[17:50] <pitti> danilos: ^ ?
[17:51] <danilos> yeah, let me take that
[17:51] <danilos> so, Launchpad has a "smart" algorithm for deciding what takes precedence
[17:52] <danilos> if Launchpad imports an upstream translation, it updates it with every change coming from upstream
[17:52] <danilos> Launchpad basically "tracks" the upstream translation
[17:52] <danilos> However, if someone modified that translation in Launchpad *on purpose*, we keep it, and mark the newly imported upstream one as "needing review"
[17:53] <danilos> If someone did a translation in Launchpad which didn't exist upstream, but one is later introduced there, we give preference to the upstream translation
[17:54] <danilos> basically (re QUESTION), the rule is: the Launchpad translation takes precedence only if it was modified on purpose in Launchpad; otherwise the upstream translation takes precedence
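In pseudocode form, that precedence rule might look like the following; this is a sketch, not Launchpad's actual implementation, and the example strings are made up:

```python
def pick_active(upstream_msg, launchpad_msg, changed_on_purpose_in_launchpad):
    """Sketch of the upstream-vs-Launchpad precedence rule (illustrative)."""
    if launchpad_msg is not None and changed_on_purpose_in_launchpad:
        # A deliberate Ubuntu change wins; the newly imported upstream
        # translation is kept around as a suggestion "needing review".
        return launchpad_msg
    # Otherwise Launchpad simply tracks the upstream translation.
    return upstream_msg

print(pick_active("Abrir", "Abrir ficheiro", True))   # -> Abrir ficheiro
print(pick_active("Abrir", "Abrir ficheiro", False))  # -> Abrir
```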
[17:54] <danilos> I'd like to go back to other earlier question now:
[17:54] <danilos> QUESTION: What about projects that have their upstream on Launchpad?
[17:55] <danilos> Projects like these do not have to worry about any integration because everything happens in Launchpad; if they ship translations in Ubuntu, those might be for a different release, so it might take some manual merge effort for now
[17:56] <danilos> QUESTION: is there a page in launchpad where I can see all untranslated strings for a specific language so I can just start translating ? or do I have to choose a source package first ?
[17:57] <danilos> There is no such page in Launchpad, though you will get a list of recommendations of what could use some help in translating on your personal page with our 3.0 release coming in ~3 weeks
[17:57] <danilos> Note that Launchpad is not only about Ubuntu, though Ubuntu is the big part of it
[17:57] <danilos> There *is* such a page for Ubuntu, eg. look at
[17:58] <danilos> https://translations.launchpad.net/ubuntu/karmic/+lang/sr
[17:58] <danilos> it's a long list, though :)
[17:58] <pitti> ok, time for one more q
[17:59] <pitti> ah, no, sorry
[17:59] <pitti> thanks all for your attention!
[17:59] <pitti> more questions -> #chat, please
[17:59] <danilos> thanks all, sorry for taking longer than expected :)
[17:59] <dpm> thanks you all for cming along
[17:59] <dpm> (and sorry for the spelling)
[18:00] <pitti> *drumroll* liw!
[18:00] <liw> ka-ching! it's time!
[18:01] <liw> This is a tutorial on the "Getting Things Done" system.
[18:01] <liw> Impatient summary: externalize memory, review external memory regularly, pick the next possible thing to do and do just that.
[18:01] <liw> I will now spend the rest of this hour expanding on this.
[18:01] <liw> "Getting Things Done" is described in the book by the same name, written by David Allen.
[18:01] <liw> It is often shortened GTD, and that's what I'll be using.
[18:01] <liw> I've been using various parts of GTD since the summer of 2006.
[18:01] <liw> I am by no means an expert, but we can learn together.
[18:02] <liw> as usual, if you have questions, write them to #ubuntu-classroom-chat
[18:02] <liw> I will attempt to monitor that channel, too
[18:02] <liw> questions are OK at any time
[18:02] <liw> GTD is a system for personal productivity: for achieving things while avoiding stress.
[18:02] <liw> It's a system for keeping track of everything you need to do, so you can concentrate on the task at hand, without your subconscious distracting you with all the other things you might be doing at the same time.
[18:02] <liw> Alternatively, it lets you decide to not do anything, since you know there is nothing you need to do right now.
[18:02] <liw> (and that's important!)
[18:02] <liw> The goal of GTD is to get into a state where you know at any point all the things you could do next, and where you can easily deal with new inputs.
[18:02] <liw> GTD is divided into five phases: capture, process, organize, do, review.
[18:03] <liw> am I going too fast?
[18:03] <liw> During capture, you write down everything you need to remember.
[18:03] <liw> It is all about making notes for later processing, not at all about processing things immediately.
[18:03] <liw> If you are cooking and run out of milk, you write down that you need to buy more milk.
[18:03] <liw> If you're out walking and see an advertisement with a URL you want to check out later, you write down the URL.
[18:03] <liw> Or you take a photograph of the ad; any note-taking method is fine, except trying to keep it in your brain.
[18:03] <liw> If someone says something in a meeting that you need to deal with afterwards, you write it down (unless you're recording the meeting).
[18:03] <liw> It's important to write things down as you think of them, or encounter them.
[18:04] <liw> Since the brain remembers things mainly by association, it's hard for it to remember random things unless you're reminded of them again.
[18:04] <liw> Not impossible, just hard.
[18:04] <liw> Because of this, you should have note-taking equipment with you everywhere.
[18:04] <liw> A notebook and pen in your backpack, for example.
[18:04] <liw> And in your kitchen.
[18:04] <liw> Maybe in your bathroom.
[18:04] <liw> If you want to go extreme, there are waterproof notebooks you can use in the shower.
[18:04] <liw> (I am not that extreme. Honestly.)
[18:04] <liw> I use a notebook in my backpack, plus my mobile phone, plus a text file on my laptop.
[18:04] <liw> When you've written something down, it should go in your inbox.
[18:05] <liw> An inbox might be physical or electronic, and you might have many of them.
[18:05] <liw> My notebook and mobile phone count as inboxes.
[18:05] <liw> I have a single physical inbox for things like snail mail.
[18:05] <liw> I have lots of electronic inboxes: e-mail, RSS feeds, my home directories on various hosts, etc.
[18:05] <liw> the phone's sms messages are also an inbox, btw
[18:05] <liw> The point of the inboxes is that there is a limited number of places to look during the process phase.
[18:05] <liw> that means it's easier to find all the things you write down
[18:06] <liw> any questions so far?
[18:07] <liw> then I'll continue with the processing phase
[18:07] <liw> In the process phase, you go through everything in the inboxes, and decide what to do about them.
[18:07] <liw> The algorithm is basically this: http://paste.ubuntu.com/263929/
[18:07] <liw> For each item in the inboxes, you decide whether you need to act on it at all, or whether it can be thrown away, or filed away where you'll find it when you do need it.
[18:07] <liw> If it does need action, can it be done immediately, in less than two minutes? Then do it at once.
[18:07] <liw> Otherwise, can you delegate it to someone else?
[18:07] <liw> When you've decided the fate of the item, you're either done, or you need to write down what needs to be done by you or someone else.
[18:07] <liw> This involves two lists: one for next actions for you, and one for things you're waiting for someone else to do.
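The processing phase boils down to a small decision tree; here is a minimal sketch of it in shell (my own paraphrase of the pasted algorithm, not its exact wording — the wording of the three outcomes is illustrative):

```shell
# classify an inbox item by whether it needs action ("yes"/"no")
# and how many minutes it would take to do
process() {
  actionable=$1; minutes=$2
  if [ "$actionable" = "no" ]; then
    echo "trash it, or file it for reference"
  elif [ "$minutes" -le 2 ]; then
    echo "do it now"
  else
    echo "delegate it (waiting-for list) or defer it (next-actions list)"
  fi
}
process no 0
process yes 1
process yes 30
```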
[18:08] <liw> any questions? is anyone keeping up?
[18:09] <liw> ok, let's continue on the organize phase
 QUESTION: what do you think about such special apps and services, like tasque, tomboy, gtg, rememberthemilk? do you use them and do they help you?
[18:10] <liw> I have used a few a little bit, but for me, I find that simple tools are the most versatile and least in my way; however, everyone needs to find the tools that fit them the best
[18:10] <liw> so, about organizing stuff...
[18:11] <liw> You need a place for everything, and you need to keep things more or less in their place.
[18:11] <liw> Otherwise you waste a lot of time finding things.
[18:11] <liw> The GTD system suggests several ways to organize things.
[18:11] <liw> At the core there are four lists: next actions for you to do, projects you are committed to, things you are waiting for someone else to do, and things you might do someday.
[18:11] <liw> (in short: next.txt, projects.txt, waiting.txt, and someday.txt for me)
[18:11] <liw> The difference between a next action and a project is that a project is anything that takes more than one step, but an action is just one.
[18:11] <liw> I keep these things in plain text files, other people prefer more sophisticated applications.
[18:11] <liw> I found sophisticated apps to be too limiting.
[18:11] <liw> You need a calendar for things that need to happen at specific times.
[18:11] <liw> You should only keep those things in there, and other notes and stuff elsewhere.
[18:12] <liw> I use Evolution's calendar.
[18:12] <liw> other people like google's calendar, or a paper calendar, or other solutions; again, whatever works for you is good
[18:12] <liw> You need a filing system. I have two: one for paper, one for bits.
[18:12] <liw> I use manila folders for paper, and ~/Archive/ for bits.
[18:12] <liw> (actually, I have ~/Arkisto, which is Finnish for archive)
[18:13] <liw> Both have a folder for each topic. A new folder is very cheap, so I keep them highly specific and name them descriptively. This makes it easy to find things quickly.
[18:13] <liw> I also have a "Read and review" system, or several, for texts that don't require doing, but require reading.
[18:13] <liw> I have a shelf in my bookcase for unread books.
[18:13] <liw> I have a folder in my browser for bookmarks I haven't read yet.
[18:13] <liw> I have a ~/Read_and_review folder for downloaded files such as PDFs I need to read.
[18:13] <liw> that is a summary of my organizational system; any questions?
[18:15] <liw> no? in that case I'll continue on, to the "do" phase
[18:15] <liw> this is the best phase of all, this is where _useful_ stuff happens, all the rest exists only to make this phase be as good as possible
[18:15] <liw> Doing is simple. You look at your list of next actions, and pick whatever seems best to do next, and then you do it.
[18:15] <liw> GTD has no priorities: it trusts you to pick the best action at any one time.
[18:16] <liw> GTD does have contexts, but I'm going to skip those, in the interest of brevity. I can come back to them at the end if there's time (do ask).
[18:16] <liw> If your GTD system is kept up to date, you and your subconscious both trust it has everything important in it, and so you'll be able to concentrate on the chosen task and not have to worry about everything else.
[18:17] <liw> it's a bit contradictory, but since doing is so simple, there's really not much to say about it, even though it's the most important part of GTD
[18:17] <liw> so, unless there's questions, I'll continue to review
[18:17] <liw> A car needs an oil change and other attention from time to time.
[18:18] <liw> A GTD system needs regular review.
[18:18] <liw> During a review you make sure all your inboxes get emptied, that your lists are up to date, and that anything lingering in your brain gets dumped into the external system.
[18:18] <liw> you might also spend time during the review to empty all pockets in all your trousers, jackets, backpacks, and so on
[18:18] <liw> While you review the list of next actions, you remove anything that is already done, or that no longer needs doing, and make sure that everything that remains really is just a single, physical next action.
[18:18] <liw> Likewise, for the projects list. Make sure every project has at least one next action. Projects that don't have a next action can be removed from the list, although perhaps only temporarily.
[18:19] <liw> for most people, a weekly review seems to be a good idea; some people like monday mornings, to start off the work week with a clear picture of how things are
[18:19] <liw> others like Friday afternoon, to end it with a clear picture
[18:19] <liw> others like random times
[18:19] <liw> anything that works for you is good :)
[18:19] <liw> ok, that covers the very basics of the GTD system
[18:20] <liw> now does anyone have any questions?
[18:20] <liw> has anyone listening to this used GTD or some other productivity system?
[18:22] <liw> ok, a few people have :)
[18:22] <liw> the system I use is not a pure GTD system, but it's fairly close
[18:23] <liw> one thing I've added is that in addition to a calendar I use a couple of other things to remind me of time-based things
[18:23] <liw> one is cron: I have my crontab e-mail me things that I need to do regularly
[18:24] <liw> the other is a nagger application that doesn't just remind me, it nags at me every morning until I tell it I've done it, and then it shuts up for a while until it's time to do the recurring thing again
[18:24] <liw> occasionally I also use at, but that's rare
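A crontab entry for such a recurring reminder might look like this (a hedged, illustrative sketch, not liw's actual crontab; cron mails the command's stdout to the MAILTO address):

```shell
# write an example crontab to a file; the address and wording are made up
cat > mycron <<'EOF'
MAILTO=you@example.com
0 9 * * 1  echo "weekly GTD review: empty inboxes, prune next-actions list"
EOF
cat mycron
# to actually install it: crontab mycron
```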
[18:25] <liw> I wrote the nagger for myself, but 'bzr get http://code.liw.fi/nagger/bzr/trunk' should get you a copy, if you want to play with it; freshmeat is probably full of similar tools though
[18:26] <liw> those of you who have used productivity systems: what's your best tip? what's the worst thing you can warn people to avoid?
 liw, i guess it's the often wild inbox-todo-list-of-doom
[18:28] <liw> that is a very good point, and applies especially to e-mail handling
[18:28] <liw> I'll explain briefly how I manage e-mail
[18:28] <liw> all my e-mail comes to one inbox (in Evolution); I do not use per-mailing-list folders (even smart folders, since they broke for me)
[18:29] <liw> all incoming e-mail also gets automatically copied to an archive folder
[18:29] <liw> when I process e-mail in the inbox, if it does not require any action, I just delete it
[18:29] <liw> if I ever need to go back to it, to check something, I find it in the archive folder
[18:30] <liw> if I need to save an e-mail because it does need some action, I move it manually to a "pending & support" folder, and add the action to my next actions list
[18:30] <liw> thus, only e-mail that is unread or unprocessed stays in the inbox
[18:30] <liw> the goal is to empty the inbox completely every day (not necessarily every time I read e-mail)
[18:31] <liw> I don't always reach the goal, but I rarely have more than a few e-mails that linger in the inbox; sometimes there are discussions that are just hard to read (difficult technical content, tough emotional content, or something)
 IMHO when you start, keep your tools simple and you will probably have more success committing to the system
[18:32] <liw> that's also a good point, I feel similarly; however, some people get more energy from nifty technical toys, and more power to them
[18:33] <liw> A word about next actions and their list.
[18:33] <liw> An action should be a concrete physical action that can be done immediately, if you are in the right context.
[18:33] <liw> It should not require something else to be done first.
[18:33] <liw> It should be doable in one sitting, ideally in less than fifteen minutes, but that varies a lot, depending on the task and your familiarity with it.
[18:33] <liw> For example, "write weekly activity report and send it to boss" is a good next action.
[18:33] <liw> It is very concrete, does not depend on anything else, and doable quickly.
[18:33] <liw> On the other hand, "save the whales" is a bad next action.
[18:34] <liw> It is unclear what the actual action is.
[18:34] <liw> (if you really meant, "drag the whales from the beach back into the sea", you should write that instead)
[18:34] <liw> It might work as a project, but even then it should probably be expanded with some description of what it means for whales to have been saved: what the success criteria for the project are.
[18:35] <liw> "Make a new Ubuntu derivative for jugglers" is also a bad next action.
[18:35] <liw> It seems very concrete, but it's too long a task.
[18:35] <liw> It might be a project, and the first action for the project might be "write list of six reasons why jugglers need their own distro".
[18:35] <liw> also, a couple of links
[18:35] <liw> http://en.wikipedia.org/wiki/Getting_Things_Done is the wikipedia page on the GTD book
[18:36] <liw> if you're serious about trying out GTD, borrow or buy the book, it's a pretty quick read, and not too badly written
[18:36] <liw> http://www.43folders.com/ is a website/blog about productivity stuff; the early archives are full of all sorts of tips and tricks and ideas, which may be inspiring
[18:37] <liw> (though, after a few years the reader might get as tired of them as the author, but the archives are great)
[18:37] <liw> QUESTION: With all those pdf, articles, blogs... ¿do you know a centralized system to organize all that kind of information and be able to easily find where did you read what?
[18:38] <liw> I don't have a system for that. I save stuff I may want to get back to in my link list (http://liw.fi/links/), and for the rest, I use my memory and/or a search engine ending with ogle
[18:39] <liw> ok, that finished off all my prepared notes
[18:39] <liw> we have 20 minutes for further questions
[18:42] <liw> the silence is overwhelming :) no worries, I'll stick around until the end, in case anyone comes up with something
[18:42] <liw>  QUESTION: estimating how long an action will take, and then recording how long it actually took is advocated by some time management people.  Do you see any value in doing this?
[18:43] <liw> I don't do that, but if it's easy for you to do, it can be reasonable to do at least some of the time, so you know what the correction factor is between your estimates and reality
[18:43] <liw> (I have a correction factor of about 10 at times...)
[18:43] <liw> more sophisticated systems than plain text files would make this easier to do, I'm sure
[18:44] <liw> hm, I skipped an explanation of contexts earlier, I could do that now
[18:44] <liw> a "context" in the context of the next actions list, is some kind of constraint on the task, such as the availability of a phone
[18:45] <liw> or availability of the Internet, or some particular person, or being in a physical location, or whatever
[18:45] <liw> if a next actions list is shortish, say less than 20 or 30 items, it doesn't need to be divided into sections, but longer lists typically do, and GTD suggests contexts for them
[18:46] <liw> so the list might have a section for things you need to do over the phone: setting up a doctor's appointment, or something
[18:46] <liw> the exact list of suitable contexts depends on the life you lead
[18:47] <liw> the GTD book is from 2000, so it is a bit quaint and suggests things like "at computer", as if people didn't spend 16+ hours at their computers
[18:48] <liw> my contexts are: errand (stuff I need to leave my home for), phone, online banking (it is an effort to log in securely, so I try to do everything with one login), work time at computer, free time at computer, at home not using a computer, and availability of a car (I share a car with two friends)
[18:48] <liw> once again, any set of contexts that works for you is good
[18:52] <liw> QUESTION: Could you elaborate a bit more about why you prefer simple text lists over more specialized applications for GTD?  What didn't you like about applications?
 liw: It seems as though you use a huge variety of tools to manage everything.  Have you considered consolidating everything, or at least as much as possible, into one central application?
[18:53] <liw> these two questions are related, I think
[18:53] <liw> for a while I had everything in one system, then I wrote my own custom app and moved everything into that system
[18:53] <liw> the problem was, one app wasn't flexible enough for me
[18:54] <liw> but that's me, I'm not saying they aren't good for others
[18:54] <liw> for example, a centralized app might take a lot of work to change the list of contexts, or be resistant to adding a new category of list
[18:55] <liw> or the app might be a web app, which I just find awkward to use
[18:55] <liw> also, I am an old-fashioned luddite
[18:56] <liw> anything else? we are about to run out of time
[19:00] <liw> james_w, please have the podium
[19:00] <james_w> thanks liw
[19:00] <james_w> hi everyone
[19:00] <james_w> I'm going to be talking about fixing an Ubuntu bug using bzr
[19:01] <james_w> who's here to learn about that?
[19:01] <james_w> excellent
[19:02] <james_w> so, first things first, install the "bzr-builddeb" package if you don't have it installed yet
[19:02] <james_w> we'll need it in a little while
[19:03] <james_w> if you head on over to https://launchpad.net/ubuntu
[19:03] <james_w> you'll see you are now able to click on the "Code" tab at the top
[19:03] <james_w> which wasn't something you could do until recently
[19:04] <james_w> what does that tab show you?
[19:04] <james_w> it shows you bzr branches for every package in Ubuntu
[19:05] <james_w> you can now get the source of (nearly) every package as a bzr branch
[19:05] <james_w> this means you can more easily look at the history of the package
[19:05] <james_w> more easily version control your changes
[19:05] <james_w> and more easily merge changes from others
[19:05] <james_w> I think this is wicked cool
[19:06] <james_w> qense> QUESTION: How do the branches relate to the apt-get source command?
[19:07] <james_w> good question
[19:07] <james_w> branching lp:ubuntu/<packagename> will get you the same as "apt-get source <packagename>" (if you have karmic deb-src lines in your sources.list)
[19:07] <james_w> it just gets it as a bzr branch rather than a tarball
[19:08] <james_w> we keep the branches up to date with changes in the archive
[19:08] <james_w> so that when there is a new upload it appears there very quickly
[19:09] <james_w> e.g. Chuck uploaded net-snmp 43 minutes ago, and the change was available in bzr 5 minutes later
 QUESTION: same as mruiz: are all ubuntu packages there ?
[19:09] <james_w> almost
[19:09] <james_w> the intent is to have them all there
[19:09] <james_w> we are still working on getting the last 10% there
[19:10] <james_w> so, you don't have to use that long list of branches to navigate
[19:11] <james_w> if you look at https://launchpad.net/ubuntu/+source/net-snmp
[19:11] <james_w> which is the package page for net-snmp that I just mentioned
[19:11] <james_w> you will see that the "Code" tab is again active
[19:11] <james_w> clicking on that gives this page:  https://code.launchpad.net/ubuntu/+source/net-snmp
[19:11] <james_w> which is an overview of the branches available
[19:12] <james_w> each branch is attached to a release of Ubuntu
[19:12] <james_w> so you can see the karmic branches separate from the jaunty ones
[19:12] <james_w> so at the top is lp:ubuntu/net-snmp
[19:12] <james_w> which can also be written lp:ubuntu/karmic/net-snmp
[19:12] <james_w> omitting the release gets you the current development one
[19:13] <james_w> under that is lp:ubuntu/jaunty/net-snmp which is obviously the jaunty branch
[19:13] <james_w> then there is intrepid, which is a bit more interesting
[19:14] <james_w> lp:ubuntu/intrepid/net-snmp
[19:14] <james_w> that's the source that was released with intrepid
[19:14] <james_w> then lp:ubuntu/intrepid-security/net-snmp
[19:14] <james_w> which is the source that is in intrepid-security
[19:14] <james_w> so it contains one or more security updates
 QUESTION: What if you want to get the latest version in e.g. jaunty, but don't know if it's been published in backports, security and/or updates?
[19:15] <james_w> good question
[19:15] <james_w> there's no good answer for that currently
[19:15] <james_w> it's partly that "latest" isn't exactly well defined
[19:15] <james_w> I'm keen to provide that somehow
[19:16] <james_w> but it may be implemented on top of what we have using the LP API or something
[19:16] <james_w> so, that's the branches that are available, what do they contain?
[19:16] <james_w> check out https://code.launchpad.net/~ubuntu-branches/ubuntu/karmic/net-snmp/karmic
[19:16] <james_w> which is the page that corresponds to the karmic branch
[19:17] <james_w> gives you some information on the branch, the latest revisions, and the bugs that have been fixed
[19:17] <james_w> it also allows you to subscribe to the branch
[19:17] <james_w> this would allow you to get an email every time there was an upload of a package
[19:18] <james_w> which I don't think you can currently do without some procmail/rss2email type solution
[19:19] <james_w> if you click on the "Source Code" link then you can see the contents of the branch
[19:19] <james_w> https://bazaar.launchpad.net/~ubuntu-branches/ubuntu/karmic/net-snmp/karmic/changes
[19:20] <james_w> and https://bazaar.launchpad.net/~ubuntu-branches/ubuntu/karmic/net-snmp/karmic/files
[19:20] <james_w> so you can see that all the source is there as you would expect
[19:20] <james_w> and you can also see the revision corresponding to each upload
 QUESTION: will it eventually be possible, with the correct upload rights, to push to one of these branches and have a package built & uploaded out of it? (or does this already happen? :) )
[19:20] <james_w> yes and yes
[19:20] <james_w> that is currently in planning
[19:21] <james_w> you will be able to push soon if you can upload
[19:21] <james_w> and then you will be able to request a build from the branch
[19:21] <james_w> (which will work from any packaging branch to PPAs as well)
 QUESTION: how are the default upload rights per branch ?
[19:21] <james_w> this is still being discussed
[19:22] <james_w> one rule will be that if you can upload the package then you will be able to push to these "official branches"
[19:22] <james_w> I should have mentioned that the lp:ubuntu/net-snmp etc. branches are special
[19:22] <james_w> they have been nominated to be "official" and correspond to what is in the archive
[19:22] <james_w> you can push any branch you like to ~LP-ID/ubuntu/karmic/net-snmp/some-name
[19:23] <james_w> if you want to work on this package
[19:23] <james_w> which will work well for PPAs at some point
[19:24] <james_w> in addition to all of this check out https://code.launchpad.net/debian
[19:24] <james_w> we have exactly the same thing there for Debian
[19:25] <james_w> so if you see an upload in Debian with a change you want in Ubuntu then you can merge the Debian branch from there to the Ubuntu one
 QUESTION: are they imported from git.debian.org ?
[19:25] <james_w> no
[19:25] <james_w> not every package is on there
[19:25] <james_w> we would like to do that when it makes sense
[19:26] <james_w> but we need some improvements in bzr first
[19:26] <james_w> so we are working on that
[19:26] <james_w> so, what else can you do with these branches?
[19:26] <james_w> well, I hope you can use them to fix bugs
[19:27] <james_w> otherwise I picked a bad title for this session
[19:27] <james_w> so, I recorded a screencast that shows some of this
[19:27] <james_w> unfortunately it has no audio, but it might help follow along or jog your memory
[19:27] <james_w> http://people.canonical.com/~jamesw/dd.ogv
 QUESTION: Is it possible to checkout the source of package, apply custom patches and publish to personal PPA for testing
[19:28] <james_w> yes, that will be possible one day
[19:28] <james_w> you can do the first part now
[19:28] <james_w> and you can upload the result to your PPA as normal with dput
[19:28] <james_w> the branch -> PPA step will be a future addition
 QUESTION: Who controls the Ubuntu Branches team?
[19:28] <james_w> <evil laugh>
[19:28] <james_w> I do
[19:29] <james_w> it's kind of an implementation detail
[19:29] <james_w> for all these branches there isn't really an owner, but we can't have no owner, so we just made a new team
 Not completely ontopic: I'd like to propose an update for the guake package with help of the branch system, but the diff comes from Git. How do I convert it to a usable patch I can add to the branch?
[19:29] <james_w> check out "bzr patch" from bzrtools
[19:29] <james_w> should be able to apply git diffs
 QUESTION: will bzr-builddeb be used on the launchpad side for building?
[19:31] <james_w> dunno
[19:31] <james_w> or a more precise answer:
[19:31] <james_w> yes, but no
[19:31] <james_w> there will be code reuse
[19:32] <james_w> but we might want to reduce the amount of trusted code
[19:32] <james_w> and it won't need all the features of bzr-builddeb
[19:35] <james_w> sorry, just checking the cricket score
[19:35] <james_w> right, so let's fix a "bug"
[19:35] <james_w> we can carry on working on this net-snmp package
[19:36] <james_w> oh, staging is down
[19:36] <james_w> that will make this tricky
[19:36] <james_w> we don't really want to create lots of useless merge proposals
[19:36] <james_w> how about I commentate on the video instead?
[19:36] <james_w> would that work?
[19:37] <james_w> not good for the logs though :-/
[19:38] <james_w> we can at least grab a branch and look around, so let's do that
[19:39] <james_w> bzr branch lp:ubuntu/net-snmp
[19:39] <james_w> that will create a "net-snmp" directory that contains the bzr branch
 error- > bzr: ERROR: exceptions.KeyError: 'Bazaar repository format 2a (needs bzr 1.16 or later)\n
[19:42] <james_w> so, what's going on here?
[19:42] <james_w> bzr is just about to release 2.0 with a new default format
[19:43] <james_w> this format is a lot better than its previous ones in many ways
[19:43] <james_w> most notably here in disk space
[19:44] <james_w> as there are a *lot* of branches here it would have used loads of disk space in the old format
[19:45] <james_w> so we used the new one a little before it was available to most people, so that we could fit all these branches on a sensible number of disk drives
[19:45] <james_w> this is unfortunate in that it makes it harder to use an old release to work on the branches
[19:45] <james_w> there is https://launchpad.net/~bzr/+archive/ppa
[19:45] <james_w> and we will go through the backport process once 2.0 is out
[19:46] <james_w> plus, it's not long until karmic is released :-)
[19:49] <james_w> so, we have the branch now
[19:49] <james_w> you can look around and see that it looks just like a normal package
[19:49] <james_w> how to build it?
[19:50] <james_w> "bzr builddeb -S"
[19:50] <james_w> that will build a source package
[19:50] <james_w> "bzr builddeb" to build a binary one
[19:50] <james_w> "bzr bd"
[19:50] <james_w> you can use that alias for less typing
 QUESTION: Is there a mechanism for proposing something based on lp:ubuntu/foo/bar to become lp:ubuntu/foo-backports/bar ?
[19:50] <james_w> that would be the normal backport process
[19:51] <james_w> but no, we don't have nominations or anything for that
[19:51] <james_w> so, feel free to fix any bugs you find in this package :-)
[19:52] <james_w> if you do fix something then you can "bzr commit", or use "debcommit" after adding a changelog entry with "dch"
[19:52] <james_w> then you should "bzr push" this to LP
[19:52] <james_w> to something like "bzr push lp:~LP-ID/ubuntu/karmic/net-snmp/fix-bug"
[19:54] <james_w> then you can open that branch in your web browser and "Propose for merging in to another branch"
[19:54] <james_w> and that would create a "merge proposal" that allows us to review and comment on the changes
[19:54] <james_w> you can see this in the screencast
[19:54] <james_w> we're out of time, any last questions?
[19:57] <james_w> ok, I'll make way for Laney
[19:57] <james_w> thanks everyone
[19:57] <james_w> I'm always up for discussing this, so grab me another time if you want to know more
[19:59] <Laney> Hi everyone
[19:59] <Laney> just getting sorted, let's start in a couple of minutes
[20:00] <Laney> Do we have a questions channel? I've been out of it for a week
[20:02] <Laney> you can paste them in here
[20:02] <AntoineLeclair> good
[20:02] <Laney> Alright everyone, let's get started!
[20:02] <Laney> Who's here? Say hi in #ubuntu-classroom-chat
[20:04] <Laney> Yay, looks healthy
[20:04] <Laney> (sorry if I go silent for a bit... connectivity problems)
[20:04] <Laney> So... we're here to learn how to package from scratch
[20:04] <Laney> take the upstream source tarball and end up with a .deb that users can install on their systems
[20:05] <Laney> ...and if you persevere enough, install using apt
[20:05] <Laney> Let's get started, as time is already ticking away
[20:06] <Laney> Earlier in the week I perused the needs-packaging bugs that have been filed on Launchpad looking for something fun for us to work on in this session
[20:06] <Laney> I've decided that we should package a little tool for working with PDF files called pdfchain.
[20:06] <Laney> You can read more about it here: http://pdfchain.sourceforge.net/
[20:07] <Laney> This is a nice and simple application to package, but one which has a couple of fun twists along the way
[20:07] <Laney> So, without further ado, let's download the tarball
[20:07] <Laney> please run:
[20:07] <Laney>   wget
[20:07] <Laney> http://downloads.sourceforge.net/project/pdfchain/pdfchain-0.123/PDF%20Chain%20version%200.123/pdfchain-0.123.tar.gz
[20:07] <Laney> (on one line)
[20:07] <AntoineLeclair> shadeslayer: QUESTION: Who decides what to package?
[20:08] <Laney> shadeslayer: Good question
[20:08] <Laney> shadeslayer: Individuals do. We have a procedure for requesting packages on Launchpad, but nobody can force you to do the work
[20:08] <Laney> basically if you want to package an application that nobody else is working on, go ahead and do it :)
[20:09] <Laney> all got the tarball? We need to move it to the name that the packaging system expects
[20:09] <Laney> the format is UPSTREAMNAME_VERSION.orig.tar.gz
[20:10] <Laney> so please mv pdfchain-0.123.tar.gz pdfchain_0.123.orig.tar.gz
[20:10] <Laney> and then unpack it: tar xzvf pdfchain_0.123.orig.tar.gz
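The rename-and-unpack steps can be tried end to end on a dummy tarball without downloading anything (the file contents below are illustrative; the directory layout mirrors what an upstream tarball normally ships):

```shell
set -e
cd "$(mktemp -d)"
# fake an upstream tarball with the usual NAME-VERSION/ top-level directory
mkdir pdfchain-0.123
echo "demo" > pdfchain-0.123/README
tar czf pdfchain-0.123.tar.gz pdfchain-0.123
# the packaging tools expect NAME_VERSION.orig.tar.gz (underscore, not hyphen)
mv pdfchain-0.123.tar.gz pdfchain_0.123.orig.tar.gz
tar xzf pdfchain_0.123.orig.tar.gz
ls pdfchain-0.123/README
```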
[20:10] <AntoineLeclair> shadeslayer: QUESTION: if you package an app and it gets uploaded to ubuntu repos, do you have to manage it or does MOTU take care of it?
[20:12] <Laney> shadeslayer: It will be team maintained in the usual case. Anyone can work on it but that can often mean nobody works on it, so it is expected that once you get a package uploaded you continue to care for it
[20:12] <Laney> that means managing bugs and keeping track of upstream
[20:12] <Laney> we don't want unmaintained packages in the archive
[20:12] <Laney> have we got the tarball unpacked?
[20:12] <Laney> please change into the directory
[20:12] <Laney> cd pdfchain-0.123
[20:13] <Laney> now we need to make a directory to hold all of our packaging data
[20:13] <Laney> mkdir debian
[20:13] <Laney> this is where all of the information used to build the package is going to go
[20:14] <Laney> Let's make empty copies of some of the files we are going to work with
[20:14] <Laney> I'll explain what these are as we go along
[20:14] <Laney> touch debian/copyright debian/compat debian/control debian/rules
[20:14] <Laney> (there is a tool called dh_make to make templates for these files but we won't use it here)
[20:15] <Laney> The first file we'll work with is the changelog
[20:15] <Laney> this is used by various pieces of archive software, and is the log of your package's history
[20:15] <Laney> so please run:
[20:15] <Laney> dch --create --newversion 0.123-0ubuntu1 --package pdfchain --distribution karmic
[20:16] <AntoineLeclair> mac_v: <Question> any reason why the template is not used?
[20:16] <Laney> "dch" is a tool for managing debian changelog files
[20:16] <Laney> mac_v: Partly for educational purposes, partly because I don't think it's really necessary
[20:16] <Laney> dh_make creates a lot of files we don't need here
[20:17] <Laney> part of what I want to teach you is that packaging is quite easy
[20:17] <Laney> back to dch -- with this command we've told it to create a new debian changelog file, with the version/package/distribution given
[20:17] <AntoineLeclair> funkyHat: Question: why are we giving the version an 0ubuntu1 suffix?
[20:17] <Laney> funkyHat: coming to that )
[20:17] <Laney> :)
[20:18] <Laney> A note on the version convention - 0.123 is the upstream version number, which I hope is obvious
[20:18] <Laney> the - is a separator between the upstream and "debian" (/ubuntu) revision
[20:18] <Laney> 0 is the revision of the package in Debian itself
[20:19] <Laney> 0 as it hasn't been uploaded there (I hope)
[20:19] <Laney> if we were packaging for Debian we would use the version 0.123-1
[20:19] <Laney> "ubuntu1" means that this is the first revision of the package in Ubuntu
[20:19] <Laney> so dch should have opened a text editor for you
[20:19] <Laney> is that right?
[20:20] <Laney> So now we need to make one small change to the file
[20:21] <Laney> we need to ensure that when the package is uploaded, the bug that was filed to request the packaging is closed
[20:22] <Laney> you can see that bug here: https://bugs.launchpad.net/ubuntu/+bug/407982
[20:22] <Laney> So please change the Closes: #xxxx to LP: #xxxx
[20:22] <Laney> This instructs the launchpad archive management software to set this bug to "fix released" when the package is uploaded
[20:22] <Laney> please save and quit the file now
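After that edit, the first changelog entry looks roughly like this (a sketch, not dch's exact output; the maintainer name, address, and date are illustrative, while the LP number is the real needs-packaging bug linked above):

```shell
set -e
cd "$(mktemp -d)"
mkdir debian
cat > debian/changelog <<'EOF'
pdfchain (0.123-0ubuntu1) karmic; urgency=low

  * Initial release (LP: #407982)

 -- Your Name <you@example.com>  Fri, 04 Sep 2009 20:22:00 +0100
EOF
head -n1 debian/changelog
```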
[20:22] <AntoineLeclair> elopio: Question: if there was no open bug for the package, should we open one before?
[20:23] <Laney> elopio: It's a good idea to prevent two people doing the same work
[20:23] <Laney> but it's not mandatory
[20:23] <Laney> OK the next file we're going to fill in is debian/compat
[20:23] <Laney> we will be working with Debhelper version 7 so please:
[20:23] <Laney> echo 7 > debian/compat
[20:23] <AntoineLeclair> ScottTesterman: QUESTION: If the "Closes" stays, but "LP" is still added before the bug number, will Launchpad still close the bug, or does the word Closes throw it off?
[20:23] <Laney> this instructs debhelper to use compatibility level 7
[20:24] <Laney> see man debhelper for what the various choices are
[20:24] <Laney> and for more information on what this means
[20:24] <Laney> ScottTesterman: It will probably break the parser, use LP: #407982
[20:25] <Laney> OK that file was easy
[20:25] <Laney> so now we'll move on to the rules file
[20:25] <Laney> this is the file which describes how to build your package
[20:26] <Laney> for technical details see http://www.debian.org/doc/debian-policy/ch-source.html
[20:26] <Laney> for now, please:
[20:26] <Laney> cp /usr/share/doc/debhelper/examples/rules.tiny debian/rules
[20:26] <Laney> now open this up in your favourite editor
[20:27] <Laney> See how simple this is? :)
[20:27] <Laney> Not so long ago such a short rules file wouldn't have been possible
[20:27] <Laney> but the magical Joey Hess recently released Debhelper version 7 which allows us to use such short files
[20:28] <Laney> we now only need to express situations in which the packaging differs from the default behaviour
[20:28] <Laney> For now we don't know what's going to differ so please quit your editor
[20:29] <Laney> (we will return to rules later when the package doesn't quite build as expected)
[20:29] <Laney> now we'll move onto debian/control
[20:30] <Laney> This is a file which expresses a lot of important metadata about your package
[20:30] <Laney> Please visit http://pastebin.com/f59a78dd and copy the contents to debian/control
[20:30] <Laney> I'll quickly explain what the fields mean
[20:30] <Laney> (speeding up, time is ticking away)
[20:30] <Laney> Source: name of the source package
[20:31] <Laney> Section: name of the archive section - allows users to navigate packages by category
[20:31] <Laney> eg on http://packages.ubuntu.com
[20:31] <AntoineLeclair> mruiz: QUESTION: What is the default behaviour
[20:31] <Laney> mruiz: I don't understand, please clarify
[20:31] <AntoineLeclair> mruiz: QUESTION: What is the default behavior with debhelper 7 ?
[20:32] <Laney> Priority: how important it is that the user installs the package
[20:32] <Laney> mruiz: I don't have time to explain, but basically ./configure && make && make install (or the appropriate steps based on the build system in use)
[20:32] <AntoineLeclair> mruiz: QUESTION: What is the default behavior with debhelper 7 (because debian/rules seems to be black magic)? ;-)
[20:33] <Laney> Maintainer: Who maintains the package, for us it's usually the development team
[20:33] <Laney> XSBC-... - for packages created in Ubuntu first, the initial packager
[20:33] <Laney> for packages which come from Debian, the Debian maintainer
[20:33] <Laney> Build-Depends: packages which must be installed for this one to build
[20:34] <Laney> Standards-Version: version of debian policy which this package conforms to
[20:34] <Laney> Homepage: upstream homepage for software
[20:34] <AntoineLeclair> shadeslayer: QUESTION : How does one determine dependencies?
[20:34] <Laney> shadeslayer: We'll come to this
[20:34] <Laney> after the blank line, the next lines refer to the *binary* package
[20:35] <Laney> we are working with a source package currently; the binary package is what we build at the end
[20:35] <Laney> (.deb)
[20:35] <Laney> there can be multiple binary stanzas
[20:35] <Laney> Package: name of the binary package
[20:35] <Laney> Architecture: CPU architectures for which this package works (can be - and usually is - 'all')
[20:36] <Laney> Depends: packages which must be installed for this one to work
[20:36] <Laney> Description: self explanatory - displayed in various pieces of software
[20:36] <Laney> there are other fields, but these are the ones we need here
[20:36] <Laney> See http://www.debian.org/doc/debian-policy/ch-controlfields.html for more
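Putting those fields together, a minimal debian/control for this package might look roughly like the sketch below. The section, versions, original-maintainer name, and description are illustrative; the real file used in the session is in the pastebin above:

```
Source: pdfchain
Section: utils
Priority: optional
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
XSBC-Original-Maintainer: Jane Packager <jane@example.com>
Build-Depends: debhelper (>= 7), libgtkmm-2.4-dev (>= 2.8), intltool (>= 0.35.0)
Standards-Version: 3.8.3

Package: pdfchain
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}, pdftk
Description: graphical front-end for the PDF Toolkit (pdftk)
 A placeholder description; writing a proper one is covered
 at the end of the session.
```

Note Architecture is `any` here because pdfchain is compiled C++ and produces per-architecture binaries; `all` is for architecture-independent packages such as scripts or data.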
[20:37] <Laney> Usually I copy the control file from another package and edit it to suit
[20:37] <Laney> so now please open debian/control in your editor
[20:38] <Laney> to speed this up I've filled in most of the info
[20:38] <Laney> paste these contents in http://pastebin.com/m58074002
[20:39] <Laney> The most important thing to figure out are the build dependencies
[20:39] <Laney> the first thing we definitely need is debhelper
[20:39] <Laney> as this is what we are using to build the package
[20:39] <Laney> so please change line 6 to Build-Depends: debhelper (>= 7)
[20:39] <Laney> this says that we need debhelper of at least version 7 installed to build
[20:40] <AntoineLeclair> mruiz: QUESTION: Why Ubuntu Developers as Maintainers? What about MOTU Developers ?
[20:40] <Laney> mruiz: in anticipation of the archive reorganisation when MOTU will be going away
[20:40] <Laney> to figure out the rest of the build-deps, we would usually look in the README
[20:40] <Laney> or INSTALL file, but for this software they are not so useful
[20:41] <Laney> so we will open up configure.ac
[20:41] <AntoineLeclair> AlanBell: QUESTION: is it important/desirable to use an @ubuntu.com email address?
[20:41] <Laney> AlanBell: For the maintainer or original maintainer?
[20:41] <Laney> configure.ac is used by the GNU autotools to build the configure script
[20:42] <Laney> part of running ./configure is checking that the system has the necessary prerequisites installed
[20:42] <Laney> it requires some skill to understand this file
[20:43] <Laney> but what we can understand from it is that we need gtkmm-2.4 greater than or equal to 2.4 installed and intltool greater than or equal to 0.35
[20:43] <Laney> this is one of the skills that you will develop when you maintain packages
[20:44] <Laney> so, please add ", libgtkmm-2.4 (>= 2.8), intltool (>= 0.35.0)" to your build-dep line
[20:44] <Laney> save and quit this file
[20:44] <Laney> er, wait, please don't do that ;)
[20:44] <Laney> reopen debian/control
[20:44] <Laney> we need to sort out the dependencies for the binary package
[20:45] <Laney> this is the Depends: line
[20:45] <Laney> this should be Depends: ${shlibs:Depends}, ${misc:Depends}, pdftk
[20:45] <Laney> what this says is to insert the dependencies for the shared libraries (resolved by dh_shlibdeps - see man page for more)
[20:45] <Laney> other parts of the build process can insert their own dependencies, which will be added to misc:Depends
[20:46] <Laney> these are called substvars, short for substitution variables
[20:46] <Laney> and pdftk is the application which pdfchain uses to transform pdfs, but won't be detected by shlibs because it is not a linked library
[20:46] <Laney> it is called as a system binary
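To make the substvars idea concrete: at build time dh_shlibdeps writes a debian/pdfchain.substvars file, and the `${...}` placeholders in Depends are replaced with its contents when the .deb is assembled. The library names and versions below are invented for illustration only:

```
shlibs:Depends=libc6 (>= 2.4), libgtkmm-2.4-1c2a (>= 2.8.0), libstdc++6 (>= 4.1.1)
misc:Depends=
```

So the final binary package ends up depending on the exact library versions it was actually linked against, without the packager listing them by hand.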
[20:46] <Laney> *now* you can save and quit
[20:46] <Laney> invoke debuild -S -us -uc to build the source package
[20:47] <Laney> cd ..
[20:47] <Laney> ls *.dsc
[20:47] <Laney> you should see the source package we just made
[20:47] <Laney> which you can build with pbuilder
[20:47] <Laney> pbuilder-karmic build *.dsc
[20:49] <Laney> pbuilder-dist karmic build *.dsc if you did not build your pbuilder with symlinks
[20:50] <Laney> if you do not have pbuilder installed
[20:50] <Laney> cd pdfchain-0.123
[20:50] <Laney> sudo apt-get install libgtkmm-2.4-dev intltool
[20:50] <Laney> (and debhelper if you don't have this)
[20:50] <Laney> fakeroot debian/rules binary
[20:51] <Laney> We're rapidly running out of time, so I'm going to speed this up a lot
[20:51] <Laney> the build will fail in the documentation step
[20:51] <Laney> test step, sorry
[20:51] <Laney> this is because the upstream make check rule is broken
[20:52] <Laney> this should be reported as a bug to upstream - someone please feel free to do this
[20:52] <Laney> we are going to disable the test for now
[20:53] <Laney> Make your debian/rules look like: http://pastebin.com/f4d339347
[20:53] <Laney> this says to run the following commands instead of dh_auto_test, which is the debhelper command that runs the tests
[20:53] <Laney> the following commands are nothing ;)
[20:54] <Laney> overrides were a feature introduced in debhelper version 7.0.50, so we need to change ">= 7" to ">= 7.0.50" in the build-deps line
[20:54] <Laney> now the package will build successfully :)
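In case the pastebin is unavailable, the debian/rules being described is roughly the following: defining an empty override target makes dh run it in place of dh_auto_test, so the broken test suite is simply skipped:

```makefile
#!/usr/bin/make -f
%:
	dh $@

# Upstream's "make check" is broken, so run nothing instead of dh_auto_test
override_dh_auto_test:
```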
[20:55] <Laney> However, there is still an upstream bug where the documentation is placed in /usr/doc instead of /usr/share/doc
[20:55] <AntoineLeclair> randomaction: shouldn't we use libgtkmm-2.4-dev for build-depnds?
[20:55] <Laney> randomaction: yes
[20:55] <Laney> that's what I meant, sorry
[20:55] <Laney> See http://pastebin.com/f5a2622fe
[20:56] <Laney> dh_install is the debhelper command which deals with installing files into packages
[20:56] <Laney> we will patch this rule to move the documentation to the right place
[20:56] <Laney> http://pastebin.com/f547f4598
[20:56] <Laney> this says to move the files from debian/pdfchain/usr/doc to debian/pdfchain/usr/share/doc
[20:57] <Laney> which is the correct location for them
[20:57] <Laney> debian/pdfchain is where the files are built to before being placed into your .deb file
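The dh_install override being described would look something like this addition to debian/rules - a sketch of the idea, not necessarily the exact paste:

```makefile
override_dh_install:
	dh_install
	# Upstream installs docs to /usr/doc; move them to the policy location
	mv debian/pdfchain/usr/doc debian/pdfchain/usr/share/doc
```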
[20:57] <Laney> OK we're pretty much out of time so I'm going to give you links to read the stuff I didn't get to
[20:57] <Laney> sorry for having to go so fast, I hope you managed to get something
[20:58] <Laney> We still need to:
[20:58] <Laney>   - Fill out a package description. See http://www.debian.org/doc/debian-policy/ch-controlfields.html#s-f-Description for more on this
[20:59] <Laney>   - Fill out the debian/copyright file. See http://www.debian.org/doc/debian-policy/ch-docs.html#s-copyrightfile and http://dep.debian.net/deps/dep5/
[20:59] <Laney> For reference, http://pastebin.com/f75b86ac8 is the file I came up with
[21:00] <Laney> and you can get the final version of the pdfchain package from my PPA at https://edge.launchpad.net/~laney/+archive/ppa
[21:00] <Laney> Please ask any questions you have on this in #ubuntu-motu, and sorry again for having to rush
[21:00] <Laney> as you can see, there is a lot to know when packaging from scratch :)
[21:00] <Laney> Now I'll hand over to noodles775, cprov and wgrant who are going to talk to you about hacking soyuz
[21:01] <Laney> take it away lads
[21:01] <noodles775> Thanks Laney !
[21:01] <cprov> Laney: thanks.
[21:01] <noodles775> My name's Michael Nelson and I've been working on Launchpad and Soyuz for around 9 months now.
[21:01] <noodles775> Here's an overview of what's coming up over the next 40mins or so:
[21:02] <noodles775> 1. Grill a new soyuz hacker with questions.
[21:02] <noodles775> 2. A guided tour through the Soyuz code-base
[21:02] <noodles775> 3. Setting up a Soyuz test scenario
[21:02] <noodles775> So - up first is our chance to grill the latest Soyuz hacker: wgrant! Since the open-sourcing of Soyuz with Launchpad, wgrant has - in his own time - pushed 20 (!) launchpad branches.
[21:03] <noodles775> 7 of these are soyuz-related branches (afaics).
[21:03] <noodles775> wgrant: wanna introduce yourself?
[21:03] <wgrant> I didn't think I'd done quite that many, but OK!
[21:03] <wgrant> So, I've been an Ubuntu developer for a few years now.
[21:03] <noodles775> (that's how many I'd counted that have been pushed - not necessarily merged :) ).
[21:04] <wgrant> And at some point became interested in the infrastructure behind it all.
[21:04] <noodles775> Cool!
[21:04] <noodles775> So this is our chance to find out how William got started working on Launchpad and Soyuz, what the issues were, and what he'd recommend to others who want to hack on Soyuz.
[21:04] <wgrant> So when it was sneakily open-sourced a month ago, I jumped straight in.
[21:05] <wgrant> There are lots of bits and pieces I'd like to see fixed or improved, so it was really great to see the source released.
[21:05] <wgrant> Hacking Soyuz (and Launchpad in general) is probably going to be a little intimidating at first.
[21:05] <wgrant> But Launchpad developers are very helpful to new contributors, so they can give you a lot of guidance if you get lost.
[21:06] <noodles775> BTW everyone: please think up some good questions that will help you get started and post them in #ubuntu-classroom-chat
[21:06] <wgrant> To get oriented for development, I'd start off by setting up a local development environment (https://dev.launchpad.net/Getting).
[21:06] <wgrant> Then have a look at how the codebase is organised. Maybe poke around in the model a bit with 'make harness'.
[21:07] <wgrant> See how things work using the SoyuzTestPublisher, which I believe noodles775 will explain later.
[21:07] <wgrant> Once you've found your way around a bit, identify a little bug or feature on which you want to work.
[21:08] <wgrant> The next step is to ask a Launchpad developer (in #launchpad-dev) about it. They'll advise you whether you're attempting the impossible, or otherwise tell you where to start.
[21:08] <wgrant> That bit is quite important, as it can stop you from hitting dead-ends or attempting something that's just too difficult for a first-time hacker (as quite a few things are).
[21:11] <noodles775> cprov: have you (or anyone else) ever tried to explain what soyuz does by analogy?
[21:11] <wgrant> An analogy... a good question. It's a pretty complex creature, so I'm not sure where to start.
[21:12] <noodles775> I've sometimes tried to think of what soyuz does as a blogging engine... something familiar, with some similarities...
[21:13] <noodles775> But there are quite a few differences too.
[21:13] <noodles775> OK, any other questions for wgrant ?
[21:13] <cprov> well, Soyuz encompasses a lot of subcomponents that take debian source packages as input and produce debian repositories, but there are a lot of details in the middle.
[21:13] <noodles775> Which is a great lead-in to 2. A guided tour through the Soyuz code-base - take it away cprov :)
[21:14] <cprov> Hi, my name is Celso Providelo and I've been working on Soyuz for the last 5 years (!)
[21:15] <cprov> so, I would like to point you to some piece of documentation I've created to guide users to the Soyuz code base.
[21:15] <cprov> https://wiki.ubuntu.com/CelsoProvidelo/SoyuzInfrastructureOverview
[21:16] <cprov> it has a decent (but not pretty) diagram; it illustrates what I meant by 'lots of details in the middle' before.
[21:17] <cprov> Soyuz is in reality a set of integrated tools/components for 'controlling' software packages.
[21:19] <cprov> It starts with the 'upload server', an FTP daemon that receives source packages uploaded by users with `dput/dupload`.
[21:20] <cprov> Sources are then passed to the 'upload processor', which verifies their consistency (packaging metadata) and stores their information in the Launchpad database.
[21:21] <cprov> the publication of the source automatically creates a build request, which is dealt with by the 'build dispatching' component.
[21:22] <cprov> it passes the source to a 'builder', an isolated environment for running `debuild`.
[21:23] <cprov> Binaries resulting from the build process come back to the upload processor and are checked before getting stored in Launchpad.
[21:23] <cprov> QUESTION: c_korn: cprov: those are the builders ? https://launchpad.net/builders
[21:24] <cprov> c_korn: exactly, those are the current build machines.
[21:24] <cprov> In the wiki page I've mentioned above, there are pointers to the corresponding modules for each part of the systems
[21:26] <cprov> knowing the topology described in that diagram and where to look in the codebase will help you find out what needs to be changed.
[21:28] <cprov> That document is very short on details, but I expect them to be added as we get more community contributors. Don't hesitate to add questions or suggestions directly to it.
[21:28] <cprov> noodles775: the stage is yours, I guess.
[21:28] <noodles775> cprov: I had a few questions myself... QUESTION: What's an example of an inconsistency that the upload processor will find and reject?
[21:29] <cprov> noodles775: basically anything that doesn't pass a `lintian` check.
[21:30] <noodles775> ah ok, so it's really just to check for things that people should do themselves before uploading.
[21:30] <cprov> exactly
[21:30] <noodles775> OK, on to: 3. Setting up a Soyuz test scenario
[21:30] <cprov> the upload processor also checks consistency against the packages previously uploaded
[21:31] <noodles775> When hacking on Launchpad’s soyuz application – and creating tests to verify that your new functionality works, you’ll often need sources or binaries published in very specific scenarios.
[21:31] <noodles775> We're going to use a special test feature - the SoyuzTestPublisher - to publish sources and binaries to a PPA in our development environment - and watch the statuses update live in the browser.
[21:32] <noodles775> The SoyuzTestPublisher – as the name suggests – was created for this exact reason (by cprov) :)
[21:32] <noodles775> So for this hands-on - you don't need any previous LP development experience... but you do need a Launchpad development setup.
[21:33] <noodles775> If you've set up the Launchpad development environment properly according to http://dev.launchpad.net/Getting, you should be able to run the following command:
[21:33] <noodles775> $ rocketfuel-branch soyuz-test-scenario
[21:34] <noodles775> While that's going - can I get an idea of how many (if any) people are following along?
[21:35] <noodles775> Great! As long as there's at least one person, it's worth doing :)
[21:35] <noodles775> When that's finished, change into the soyuz-test-scenario directory.
[21:35] <noodles775> We will be watching the new publications at:
[21:35] <noodles775> https://launchpad.dev/~cprov/+archive/ppa
[21:36] <noodles775> This page updates the build status column every 60 seconds by default, so instead of tapping your fingers while you wait I'd recommend specifying an update interval of 5 seconds for the dynamic updates
[21:36] <noodles775> as shown in the patch: http://pastebin.ubuntu.com/264831/
[21:36] <noodles775> You can apply that patch to your current branch with:
[21:36] <noodles775> $ wget http://pastebin.ubuntu.com/264831/plain/ -O update_every_five.diff
[21:36] <noodles775> $ bzr patch update_every_five.diff
[21:37] <noodles775> Just let me know if I go too fast...
[21:37] <noodles775> With that patch applied, run 'make run' in your branch directory in one terminal and 'make harness' to get a python console in another.
[21:38] <noodles775> Now, using the python console, we'll first just grab a sample-data user who has a PPA.
[21:38] <noodles775> >>> cprov = getUtility(IPersonSet).getByName('cprov')
[21:38] <noodles775> A few imports that we need
[21:39] <noodles775> >>> from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
[21:39] <noodles775> >>> from lp.soyuz.interfaces.publishing import PackagePublishingStatus
[21:40] <noodles775> and then we create our Soyuz test publisher instance:
[21:41] <noodles775> >>> publisher = SoyuzTestPublisher()
[21:41] <noodles775> Next, we just ensure that the publisher has the default distroseries etc. set up:
[21:42] <noodles775> >>> publisher.prepareBreezyAutotest()
[21:42] <noodles775> And now for the fun, we'll create a new published source package:
[21:42] <noodles775> >>> testsrc = publisher.getPubSource(sourcename='testsrc', archive=cprov.archive, status=PackagePublishingStatus.PUBLISHED)
[21:43] <noodles775> Finally, we'll create the missing builds for this new source package, and commit it all to the db:
[21:43] <noodles775> >>> builds = testsrc.createMissingBuilds()
[21:43] <noodles775> >>> import transaction; transaction.commit()
[21:43] <noodles775> Now open a browser at https://launchpad.dev/~cprov/+archive/ppa (or re-load) and you'll see the new 'testsrc' package with its pending builds.
[21:44] <noodles775> We'll now update the build manually watching the status update itself in the browser window.
[21:44] <noodles775> >>> from lp.soyuz.interfaces.build import BuildStatus
[21:44] <noodles775> >>> build = builds[0]
[21:44] <noodles775> >>> build.buildstate = BuildStatus.BUILDING
[21:44] <noodles775> Just watch your browser window without refreshing... after you commit the transaction, you'll see the build status for your package update within 5 seconds:
[21:44] <noodles775> >>> transaction.commit()
[21:45] <noodles775> Did it work?
[21:46] <noodles775> Now we update it to fully-built:
[21:46] <noodles775> >>> build.buildstate = BuildStatus.FULLYBUILT
[21:47] <noodles775> >>> transaction.commit()
[21:48] <noodles775> Now we've got a successful build, but its binary has not been published.
[21:48] <noodles775> Mouse-over the build icon to see a description of the current state.
[21:49] <noodles775> So we'll fake the successful publication of the binary with the SoyuzTestPublisher...
[21:49] <noodles775> >>> binary_pkg_release = publisher.uploadBinaryForBuild(build, 'testbin')
[21:49] <noodles775> >>> binary_pub = publisher.publishBinaryInArchive(binary_pkg_release, cprov.archive, status=PackagePublishingStatus.PUBLISHED)
[21:49] <noodles775> Again, be ready to watch it update:
[21:49] <noodles775> >>> transaction.commit()
[21:50] <noodles775> There you go! A brief intro to the SoyuzTestPublisher for testing soyuz publications.
[21:50] <noodles775> I've created a screencast and paste of the script at:
[21:50] <noodles775> http://micknelson.wordpress.com/2009/09/04/testing-launchpad-soyuz-features/
[21:51] <noodles775> So, that's all we had... does anyone have any questions?
[21:53] <noodles775> I guess not :) Well, hope it was useful! Remember, if you've got any questions later, you can always ask them on #launchpad-dev.
[21:56] <noodles775> jcastro: any final words to end the Ubuntu Developer Week?
[21:59] <jcastro> not really
[21:59] <jcastro> see you guys next cycle? :)
[21:59] <jcastro> \o/
[21:59] <noodles775> :)
[22:00] <jcastro> we'll have an open week coming up soon so there will be more tutorials, etc.
[22:02] <jcastro> please feel free to send your feedback to myself or daniel holbach