[16:51] <akgraner> Welcome to Ubuntu Developer Week Day 2
[16:51] <akgraner> shadeslayer, if you are ready the floor is yours!
[16:51] <shadeslayer> yayyyy
[16:52] <shadeslayer> Hi! Everyone and welcome to Packaging with the Ninjas
[16:52] <dholbach> shadeslayer: can I interrupt for an organisational note?
[16:52] <shadeslayer> yes sure
[16:53] <dholbach> Welcome everybody to Ubuntu Developer Week - for those of you who are joining for the first time today, please also make sure you join #ubuntu-classroom-chat (yes, Lernid does that automatically for you)
[16:53] <dholbach> if you have questions, please ask in #ubuntu-classroom-chat and prefix them with "QUESTION: "
[16:54] <dholbach> for those of you who have not reviewed https://wiki.ubuntu.com/UbuntuDeveloperWeek/Sessions yet: some of the sessions happening today and in the next few days require a bit of preparation
[16:54] <dholbach> for example, Riddell's session requires you to have qt4-qmlviewer installed
[16:54] <dholbach> Rohan Garg (shadeslayer) will introduce you to Ninja packaging skills now, so I hope you'll enjoy the session!
[16:54] <dholbach> shadeslayer: the floor is yours :)
[16:55] <shadeslayer> :D
[16:55] <shadeslayer> OK.. so first up, the stuff that takes some time
[16:55] <shadeslayer> Build Env : https://wiki.kubuntu.org/Kubuntu/Ninjas/BuildEnvironment
[16:55] <shadeslayer> Read that once and have the build environment ready :D
[16:55] <shadeslayer> Every few weeks KDE makes a new release of their software compilation
[16:56] <shadeslayer> and our crack team of packaging ninjas jumps into action to package this
[16:57] <shadeslayer> Please branch bzr branch lp:kubuntu-dev-tools
[16:57] <shadeslayer> those are the latest tools
[16:58] <shadeslayer> mhall119: yes
[16:59] <shadeslayer> First we build all the packages for maverick and these are backported to lucid
[17:00] <shadeslayer> tech2077: currently the kubuntu-dev-tools are broken
[17:00] <shadeslayer> the branch I just gave you doesn't seem to be right... let me search for another one
[17:01] <shadeslayer> bzr branch lp:~kubuntu-members/kubuntu-dev-tools/trunk
[17:01] <shadeslayer> the right dev tools
[17:02] <shadeslayer> General Dep Graph: https://wiki.kubuntu.org/Kubuntu/Ninjas/DependencyGraph : this is the general dependency graph the ninjas follow in their quest to get the latest KDE releases building
[17:02] <shadeslayer> so first up we have to upload kdelibs and then go upwards onto kdebase and other such stuff
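The build order that graph implies is just a topological sort of the dependencies. A rough sketch in Python (the three package names below are only the ones mentioned in this session, not the full Ninja graph):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# A tiny slice of the Ninja dependency graph: each package maps to
# the packages that must already be built and uploaded before it.
deps = {
    "kdelibs": [],
    "kdebase": ["kdelibs"],
    "kdetoys": ["kdebase"],
}

build_order = list(TopologicalSorter(deps).static_order())
print(build_order)  # kdelibs comes first, kdetoys last
```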
[17:03] <shadeslayer> KDE released 4.4.92 sources a few days back, so I will be teaching you how to package kdetoys 4.4.92
[17:04] <shadeslayer> We usually get the tarballs a few days prior to the release so that we get time to package them and test for any build failures
[17:04] <shadeslayer> http://people.ubuntu.com/~rohangarg/kdetoys-4.4.92.tar.bz2 << Not so secret tar
[17:04] <shadeslayer> ;)
[17:05] <shadeslayer> download that tar into a separate folder
[17:05] <shadeslayer> let's name it tmp/
[17:05] <shadeslayer> so: mkdir tmp ; cd tmp ; wget http://people.ubuntu.com/~rohangarg/kdetoys-4.4.92.tar.bz2
[17:06] <shadeslayer> everyone done?
[17:06] <shadeslayer> tar -xjvf kdetoys-4.4.92.tar.bz2 comes next
QUESTION: Assuming there are no major build-related changes, couldn't the entire build-on-new-release process be automated? Do you really need to manually package it for an initial upstream build?
[17:08] <shadeslayer> joaopinto: yes, but there are huge issues with Beta 1 releases of KDE: loads of missing stuff in install files, and deps that need to be added
[17:08] <shadeslayer> these have to be checked manually and cannot be automated
[17:09] <shadeslayer> ok now most of our packaging is hosted on bzr
[17:09] <shadeslayer> so : bzr branch lp:~kubuntu-members/kdetoys/ubuntu -r 60
[17:09] <shadeslayer> that checks out revision 60
[17:10] <shadeslayer> done?
[17:10] <shadeslayer> Also, if you're not a k/ubuntu member, like I was when I joined the ninjas, you can push the packaging to your own bzr branch and ask for a merge from that branch
[17:11] <ClassBot> Mindflyer91 asked: We have to checkout in the temp folder?
[17:11] <shadeslayer> Mindflyer91: yes
[17:11] <shadeslayer> that will create 2 folders, one containing the kdetoys sources and the other the ubuntu/debian dir
[17:11] <shadeslayer> which has most of our packaging
[17:12] <shadeslayer> The good thing is, most of the dirty work ends with the .90 releases
[17:12] <shadeslayer> that's when no more install files need to be edited to make the package work
[17:12] <shadeslayer> only small alterations are required
[17:13] <shadeslayer> so, after you have the packaging branched, just copy the debian/ folder over to the extracted kdetoys sources
[17:13] <shadeslayer> done?
[17:14] <shadeslayer> abhi_nav: just cp -r ubuntu/debian kdetoys-4.4.92/
[17:14] <shadeslayer> copy the debian folder from the ubuntu folder into the kdetoys-4.4.92 folder
[17:15] <shadeslayer> everyone done?
[17:15] <shadeslayer> ok so far so good :D
[17:15] <shadeslayer> now, cd kdetoys-4.4.92
[17:15] <shadeslayer> dch -i
[17:16] <shadeslayer> at the top, edit the version to 4:4.4.92-0ubuntu1~ppa1 if you're on maverick
[17:16] <shadeslayer> and to 4:4.4.92-0ubuntu1~lucid1~ppa1 if you're on lucid
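Those version strings follow the Debian epoch:upstream-revision pattern. A hedged sketch of how the ~ppa1 string above decomposes (the regex is simplified and assumes the upstream version contains no hyphen, which holds for KDE releases):

```python
import re

# Simplified Debian version split: [epoch ':'] upstream '-' revision.
# Real upstream versions may contain '-'; this sketch assumes not.
VERSION_RE = re.compile(r"^(?:(?P<epoch>\d+):)?(?P<upstream>[^-]+)-(?P<revision>.+)$")

def split_version(version):
    match = VERSION_RE.match(version)
    return match.groupdict() if match else None

parts = split_version("4:4.4.92-0ubuntu1~ppa1")
print(parts["epoch"], parts["upstream"], parts["revision"])
# 4 4.4.92 0ubuntu1~ppa1
```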
[17:17] <shadeslayer> abhi_nav: debchange <<
[17:18] <shadeslayer> then after the * add : New upstream release
[17:18] <shadeslayer> this marks the changelog
[17:18] <shadeslayer> that a new upstream KDE version was released
[17:19] <shadeslayer> done?
[17:19] <shadeslayer> should i move forward? :D
[17:20] <shadeslayer> ok,so moving forward
[17:20] <shadeslayer> now open : nano debian/control
[17:21] <shadeslayer> apart from our rules file, this is the most important file for packaging
[17:21] <shadeslayer> as you can see, it describes each package, what it does, and what needs to be pulled in to make it build :D
[17:22] <shadeslayer> in line 7 you will see something like kde-sc-dev-latest (>= 4:4.4.90)
[17:22] <shadeslayer> kde-sc-dev-latest is a meta package that pulls in other dependencies to make the package work
[17:23] <shadeslayer> now change that 4:4.4.90 to 4:4.4.92
[17:23] <shadeslayer> since kde released a new version and new deps are needed ;)
[17:23] <shadeslayer> done?
[17:23]  * shadeslayer hands orange ninja belt to tech2077 chilicuil and abhi_nav_
[17:24] <shadeslayer> congrats new ninjas :D
[17:24] <shadeslayer> ok, now that's all we need to edit in that file :D
[17:24] <shadeslayer> close with Ctrl+X and y
[17:25] <shadeslayer> ok, now that's all that needs to be done in that package
[17:25] <shadeslayer> :D
[17:25] <shadeslayer> just edit debian/changelog and add "- Bump kde-sc-dev-latest to 4.4.92" below the *
[17:25] <shadeslayer> something like http://bazaar.launchpad.net/~kubuntu-members/kdetoys/ubuntu/annotate/head:/debian/changelog
[17:26] <shadeslayer> everything done?
[17:27] <shadeslayer> good!
[17:27] <shadeslayer> now, let's start building!!!!!!!
[17:27] <shadeslayer> in the kdetoys dir, just run: pdebuild
[17:27] <shadeslayer> that's it!
[17:27] <shadeslayer> ( supposing you have pbuilder and some ninja hooks ;) )
[17:28] <shadeslayer> my favourite hooks are B10list-missing  C10shell
[17:28] <shadeslayer> the B10list-missing hook prints out a list of missing files at the end of the build
[17:29] <shadeslayer> pretty useful when you're packaging the first beta release of KDE
[17:29] <shadeslayer> the C10shell hook is another one which drops to a shell, where I install vim to inspect the problem
[17:30] <ClassBot> simar asked: How is this different from MOTU in Ubuntu? I'm looking for the analogous names, as I'm more familiar with Ubuntu
[17:30] <shadeslayer> MOTU == Masters of the Universe
[17:30] <shadeslayer> these people are responsible for the universe section of the archives
[17:30] <shadeslayer> !motu
[17:30]  * shadeslayer looks for ubottu
[17:31] <shadeslayer> KDE packages are in the main section of the archives ( most of them that is ;) )
[17:31] <shadeslayer> so packaging KDE and MOTU are 2 different things
[17:32] <ClassBot> joaopinto asked: regarding kde-sc-dev-latest, how do you know that the new version is required to build this particular package ?
[17:32] <shadeslayer> joaopinto: good question
[17:32] <shadeslayer> kde-sc-dev-latest is a meta package
[17:32] <shadeslayer> it depends on other packages, but does not install anything by itself
[17:33] <shadeslayer> also, in order to build kdetoys 4.4.92 you will need kdebase 4.4.92; this is hard-coded in the cmake files
[17:33] <ClassBot> simar asked: I learnt some packaging skills yesterday from Daniel, but since then I don't know what to do or where to start. How do I get into the team? Could you please tell me a bit about that here?
[17:34] <shadeslayer> simar: we idle in #kubuntu-devel, poke us there and we will hand you work :D
[17:34] <ClassBot> tech2077 asked: I can't build this myself, i seem to not have a lot of the dependencies
[17:34] <shadeslayer> tech2077: can you pastebin this error?
[17:34] <shadeslayer> also which version are you on?
[17:35]  * shadeslayer checks time
[17:35] <shadeslayer> oh.. 30 mins more :D
[17:36] <shadeslayer> tech2077: ok, exit pbuilder if it hasn't exited already
[17:37] <shadeslayer> tech2077: now run : sudo pbuilder --login  --save-after-login
[17:37] <shadeslayer> everyone welcome Quintasan
[17:37] <Quintasan> \o
[17:37] <shadeslayer> another ninja
[17:37] <shadeslayer> dpkg-checkbuilddeps: Unmet build dependencies: kde-sc-dev-latest (>= 4:4.4.92) cmake pkg-kde-tools (>= 0.6.4) kdebase-workspace-dev (>= 4:4.4) libphonon-dev (>> 4:4.7.0really) libstreamanalyzer-dev (>= 0.6.3) libqimageblitz-dev
[17:38] <shadeslayer> that says that you do not have the proper build deps
[17:38] <shadeslayer> Quintasan: this will be over in 10 mins, we can take on neon then :D
[17:38] <shadeslayer> so everyone run sudo pbuilder --login  --save-after-login
[17:39] <shadeslayer> then : add-apt-repository ppa:kubuntu-ppa/beta
[17:39] <shadeslayer> done?
[17:40]  * shadeslayer pokes all orange belt ninjas with pointy sword
[17:40] <shadeslayer> ok then : apt-get install nano
[17:41] <shadeslayer> pbuilder doesn't have an editor by default
[17:41] <shadeslayer> then : nano /etc/apt/sources.list
[17:41] <shadeslayer> and add : deb http://ppa.launchpad.net/kubuntu-ppa/beta/ubuntu lucid main
[17:41] <shadeslayer> at the very end
[17:42] <shadeslayer> or you can do as tech2077 did.. install python-software-properties
[17:42] <shadeslayer> done?
[17:43] <shadeslayer> the basic thing here is to add the kubuntu beta ppa
[17:43] <shadeslayer> well.. after that just : apt-get update
[17:43] <shadeslayer> then log out using ctrl+D
[17:43] <shadeslayer> then pdebuild again
[17:44] <shadeslayer> and this time , it should work :D
[17:45] <shadeslayer> Quintasan: ready to fire away?
[17:45] <Quintasan> hmm I think yes
[17:45] <ClassBot> penguin42 asked: So once you've built all of this stuff as Ninjas do you have a set of tests?
[17:45] <shadeslayer> penguin42: yes!
[17:46] <shadeslayer> penguin42: lintian takes care of most of the errors
[17:46] <shadeslayer> it checks the packaging for defects that the ninjas might have ignored
[17:47] <Quintasan> So, first of all, Project Neon was/is a very Kool thing that provides users with nightly (unstable) builds of KDE SC. Thus new users who are up for testing can work with bleeding-edge changes in KDE without compiling the whole source themselves.
[17:47] <shadeslayer> ok, after the deps are downloaded and unpacked, pbuilder builds the package and you get .debs in your pbuilder result dir
[17:47] <shadeslayer> which is basically all about ninja packaging :D
[17:48] <shadeslayer> just as an example, kdelibs takes 2-3 hours to build
[17:48] <Quintasan> shadeslayer: you forgot about uploading to hyper secret ppa
[17:48] <shadeslayer> ^^
[17:48] <shadeslayer> ah yes.. the hyper secret ppa
[17:49] <shadeslayer> if you want to get your name spoken in the elite circles of ninjas, grab one of the ninjas and ask them to review your work
[17:49] <shadeslayer> after a review, you get access to the super secret ppa
[17:49] <shadeslayer> where you can test your builds!
[17:49] <shadeslayer> s/hyper/ultimate
[17:49] <shadeslayer> as apachelogger pointed out
[17:50] <ClassBot> There are 10 minutes remaining in the current session.
[17:51] <shadeslayer> Quintasan: ^^
[17:51] <Quintasan> okay
[17:51] <shadeslayer> quickly neon :D
[17:52] <Quintasan> as I wrote earlier, Project Neon will provide you with the latest builds of KDE SC. Some time ago apachelogger wrote some magic code in Ruby, but it won't work now and we have decided to port it to Launchpad Recipes.
[17:52] <shadeslayer> Which is in a beta stage as well
[17:54] <Quintasan> This is where packaging comes in handy. I do believe that the latest builds should be provided with every possible feature.
[17:55] <Quintasan> That's why we (me, shadeslayer, apachelogger) need to check for every additional dependency, add them, and rewrite rules and install files.
[17:55] <ClassBot> There are 5 minutes remaining in the current session.
[17:57] <Quintasan> Well, enough of the boring stuff. Come to #project-neon and help us with our uber 1337 mission of providing the best nightly builds of KDE SC
[17:58] <Quintasan> simulacrum: Well, there is a page in Kubuntu Wiki but it is currently empty, I have been thinking about specs for the past few days
[17:59] <Quintasan> simulacrum: feel free to ask us anything in #project-neon, we currently need hands to sort out additional dependencies for our packages.
[17:59] <Quintasan> Thank shadeslayer for teaching you all those things; you can help us right away
[18:00] <Quintasan> :)
[18:01] <dpm> hi everyone
[18:02] <dpm> and thanks shadeslayer for a great session
[18:02] <dpm> so, welcome everyone to this session on translations
[18:03] <dpm> In the next hour we'll be learning about some basic concepts concerning natural language support, how translations work in Ubuntu at the technical level and how they work for other projects hosted in Launchpad.
[18:03] <dpm> This is a very broad subject and there are lots of resources to learn from on the net. My intention on this session is just to give you an overview of the basic concepts and concentrate on the main technologies and tools for making Ubuntu translatable.
[18:04] <dpm> I'll leave some additional time for questions, but feel free to ask your questions in between as well.
[18:04] <dpm> So without further ado, let's get the ball rolling.
[18:04] <dpm> == Why Translations ==
[18:04] <dpm> While this might seem obvious to some people, I'd like to start by highlighting once more the importance of translations - or natural language support, to be more precise.
[18:05] <dpm> One of the principles that unite the Ubuntu community is providing an Operating System for human beings.
[18:05] <dpm> Some of these human beings might understand and speak English, which is the original language in which the OS is developed.
[18:05] <dpm> However, there is still a large number of users who need Ubuntu to be available in their own language to be able to use it at all.
[18:06] <dpm> If you are an English speaker, you can think about it the other way round to get an idea:
[18:06] <dpm> imagine your operating system were developed in a language you don't know - let's take Japanese.
[18:06] <dpm> Would you be able to choose the right menus in a foreign language, or even understand the messages the OS is showing you?
[18:07] <dpm> If you provide internationalization support to your applications, more people will be able to translate them and to actually use them
[18:07] <dpm> Setting up an application for internationalization is easier than you might think.
[18:07] <dpm> It is generally a one-off process and it's best done from the moment you start creating your application.
[18:08] <dpm> The rest is simply maintenance - exposing translatable strings to translators and fetching translations.
[18:08] <dpm> What prompted me to run such a session was precisely the many times I've heard the session's title from developers:
[18:08] <dpm>     «I don't know anything about translations»
[18:08] <dpm> So let's try to cast some light on that and hopefully change the statement so that next time someone brings the subject we hear something more along the lines of
[18:08] <dpm>     «Translations are awesome»
[18:09] <dpm> Yeah, that'll do :)
[18:09] <dpm> == Basic concepts ==
[18:09] <dpm> Let's continue with some basic concepts
[18:09] <dpm> I'll quickly run through them, so I won't go into details, but please, feel free to interrupt if you've got any questions.
[18:10] <dpm> * Internationalization (i18n): is basically the process of making your application multilingual. This is something you as a developer will be doing while hacking at your app. It is mostly a one-off process, and in most cases it simply involves initializing the technologies used for this purpose.
[18:10] <dpm> * Localization (l10n): that's what translators will be doing, which is adapting internationalized software to a specific region and language. Most of the work here goes into actually translating the applications
[18:11] <dpm> * Gettext: that's the underlying framework to make your applications translatable. It provides the foundations and it is the most widely used technology to enable translations of Open Source projects. In addition, it defines a standard file format for translators to do their work and for the application to load translations, as well as providing tools to work with these.
[18:12] <dpm> Related to gettext, we've also got:
[18:12] <dpm> * PO files: these are text files with a defined format basically consisting of message pairs - the first one the original string in English and the next one the translation. E.g:
[18:12] <dpm> msgid "Holy cow, is that a truck coming towards me?"
[18:12] <dpm> msgstr "Blimey, is that a lorry coming towards me?"
[18:13] <dpm> they are often simply referred to as "translations", and are what translators work with, either with a text editor, a dedicated PO file editor, or with an online interface such as Launchpad Translations. They are named after language codes (e.g. en_GB.po, ca.po, hu.po) and are kept in the code as the source files to generate MO files.
[18:14] <dpm> * MO files: binary files created at build time from PO files and installed in a particular system location (e.g. /usr/share/locale). These are where the applications will actually load translations from.
[18:14] <dpm> * POT files: also called templates, have the same format as PO files, but the messages containing the translations are empty. Developers provide the templates with the latest messages from the applications, and the PO files are based on them. There is generally one template (POT file) and many translations (PO files), and the template usually carries the name of the application (mycoolapp.pot)
[18:15] <dpm> so assuming you've got all your translations-related files under a 'po' directory, it would look like:
[18:15] <dpm> po/mycoolapp.pot
[18:16] <dpm> po/ca.po
[18:16] <dpm> po/es.po
[18:16] <dpm> po/it.po
[18:16] <dpm> ...
[18:16] <dpm> so you can see how from a single POT file translators (or Launchpad) create the PO files for their particular language
[18:17] <dpm> oh, and in a POT file the message pairs will look like this:
[18:17] <dpm> msgid "Holy cow, is that a truck coming towards me?"
[18:17] <dpm> msgstr ""
[18:18] <dpm> You can see a real one here to get an idea: http://l10n.gnome.org/POT/evolution.master/evolution.master.pot
[18:19] <dpm> And a particular translation: http://l10n.gnome.org/POT/evolution.master/evolution.master.ca.po
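To make the msgid/msgstr pairing concrete, here is a deliberately naive PO reader (real tools such as msgfmt also handle comments, plural forms and multi-line strings; this sketch only covers the simple single-line pairs shown above):

```python
import re

def parse_po(text):
    """Collect simple single-line msgid/msgstr pairs into a dict."""
    return dict(re.findall(r'msgid "(.*)"\nmsgstr "(.*)"', text))

po = (
    'msgid "Holy cow, is that a truck coming towards me?"\n'
    'msgstr "Blimey, is that a lorry coming towards me?"\n'
)
catalogue = parse_po(po)
print(catalogue["Holy cow, is that a truck coming towards me?"])
```

An untranslated entry in a POT template parses the same way, just with an empty msgstr.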
[18:19] <dpm> * Translation domain: this is a name which will be used to build a unique path from which to fetch the translations, e.g. /usr/share/locale/<langcode>/LC_MESSAGES/<domain>. It will be set in the code or as a build system variable, and is generally the name of the application in lowercase. The POT template will also generally be named after the domain.
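That path assembly can be sketched in a couple of lines (the components come from the description above; the domain 'mycoolapp' is just the running example):

```python
import os

def mo_path(domain, langcode, locale_dir="/usr/share/locale"):
    """Build the path where gettext looks up a compiled catalogue."""
    return os.path.join(locale_dir, langcode, "LC_MESSAGES", domain + ".mo")

print(mo_path("mycoolapp", "ca"))
# /usr/share/locale/ca/LC_MESSAGES/mycoolapp.mo
```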
[18:20] <ClassBot> umang asked: gettext is a gnu software. Can I use it in a PyQt application, say?
[18:20] <dpm> yes, you'll be able to use it
[18:20] <dpm> but it might be tricky to set up
[18:21] <dpm> since all the makefile rules related to gettext are geared towards autotools
[18:21] <dpm> but it is definitely possible
[18:22] <ClassBot> devcando85 asked: What .PO and .MO stands for?
[18:22] <dpm> PO stands for Portable Object
[18:22] <dpm> MO ... err...
[18:22] <dpm> I'd have to look it up :)
[18:22] <dpm> Message Object perhaps
[18:23] <ClassBot> csigusz asked: If I didn't translate one message, than what will be appear in the application?
[18:23] <dpm> the application will show the original English message if there is no translation
[18:23] <dpm> that will always be the fallback
[18:24] <ClassBot> zyga asked: will there be a section specific to working with web applications? such as django-based web applications? Often distributing and installing such applications is done differently and gettext with its strict rules as to where to find translations is annoying to work with
[18:24] <dpm> I haven't planned this for this session, but that'd be a great idea for another (full one)
[18:24] <dpm> some web apps use either gettext or their own implementation (full or partial) of the gettext api
[18:25] <dpm> so many of the concepts (po files, mo files, domain, etc) still apply
[18:26] <dpm> ok, let's continue
[18:26] <dpm> I'll try to answer the rest of the questions later on
[18:27] <dpm> ah, Rhonda tells me that MO stands for Machine Object. There you go, thanks :)
[18:27] <dpm> Let's go on with a final couple of basic concepts/tools:
[18:27] <dpm> * Intltool: a tool that provides a higher-level interface to gettext and allows it to handle file formats otherwise not supported (.desktop files, .policy files, etc.)
[18:28] <dpm> * Launchpad Translations: collaborative online translation tool for Open Source projects, part of Launchpad and available at https://translations.launchpad.net. It allows translating operating systems such as Ubuntu, as well as single projects. For translators, it hides the technical complexity associated with file formats and tools, and allows them to easily translate applications online without prior technical knowledge. For developers, it provides code hosting integration, which greatly facilitates the development workflow
[18:28] <dpm> There are more technologies associated with other i18n aspects - font rendering, to mention an important one - but we'll not be looking at them today.
[18:29] <dpm> From those concepts, technologies and tools, the main ones to retain for this session are gettext and Launchpad Translations
[18:29] <dpm> Another important concept is the translation workflow. Traditionally, this has been as follows:
[18:30] <dpm> 1. Some time before release (e.g. 2 weeks), the developer announces a string freeze and release date, and produces a POT template with all translatable messages. This allows translators to start doing their work with stable messages
[18:30] <dpm> 2. Translators do the actual translations and send them back to the project (either committing them, sending them by e-mail or simply saving them in Launchpad)
[18:30] <dpm> 3. Before release, the developer makes sure the latest translations (the PO files) are in the source tree and releases the tarball
[18:30] <dpm> Launchpad makes some of those steps less rigid and easier both for translators and developers - the online interface and automatic translation commits ensure that translations get to the project automatically and nearly immediately. Automatic template generation allows the templates to be always up to date. More on that later on.
[18:30] <dpm> == Ubuntu Translations ==
[18:31] <dpm> Ubuntu is translated in Launchpad at https://translations.launchpad.net/ubuntu by the Ubuntu translators, who work in per-language translation teams that constitute the heart of the Ubuntu Translations community.
[18:31] <dpm> You can see all teams here:
[18:31] <dpm>   https://translations.launchpad.net/+groups/ubuntu-translators
[18:31] <dpm> Each team has their own communication method, and they coordinate globally through the ubuntu-translators mailing list.
[18:31] <dpm> So if as a developer you need to announce anything to translators, or ask a question, the mailing list at https://lists.ubuntu.com/mailman/listinfo/ubuntu-translators is the place to go to.
[18:32] <dpm> All Ubuntu applications -and Ubuntu-specific documentation- can thus be translated from a central location and with an easy-to-use online interface that greatly lowers the barrier to contribution.
[18:32] <dpm> Let's get a bit more technical and talk about the workflow of Ubuntu translations
[18:32] <dpm> A couple of important points first:
[18:33] <dpm> * Ubuntu is translated in Launchpad at https://translations.launchpad.net/ubuntu
[18:33] <dpm> * This only applies to Ubuntu packages in the main and restricted repositories
[18:33] <dpm> * Translations are shipped independently from the applications in dedicated packages called language packs. There is a set of language packs for each language.
[18:33] <dpm> * Language packs allow separation between application and translations and shipping separate updates without the need to release new package versions.
[18:34] <dpm> Ok, let's have a look at the Ubuntu translations lifecycle:
[18:34] <dpm> It all starts with an upstream project being packaged and uploaded to the archive
[18:34] <dpm> If that package is either in main or restricted, it will be translatable in Ubuntu and will go through this whole process
[18:34] <dpm> Upon upload, the package will be built and its translations (the PO files from the source package plus the POT template) will be extracted and put into a tarball
[18:35] <dpm> The pkgbinarymangler package takes care of doing this
[18:35] <dpm> This tarball will then be imported into Launchpad, entering the translations import queue for some sanity checking before approval. It is important at this point that the tarball contains a POT template, otherwise it will not be imported.
[18:35] <dpm> here's what the imports queue looks like
[18:36] <dpm>   https://translations.launchpad.net/ubuntu/lucid/+imports?field.filter_status=NEEDS_REVIEW&field.filter_extension=pot&batch=90
[18:36] <dpm> (for Lucid)
[18:36] <dpm> After approval, both the template and the translations will be imported and exposed in Launchpad, making them available at a URL such as:
[18:36] <dpm> https://translations.launchpad.net/ubuntu/<distrocodename>/+source/<sourcepackage>/+pots/<templatename>
[18:36] <dpm> Here is, for example, how it looks for the evolution source package:
[18:37] <dpm>   https://translations.launchpad.net/ubuntu/lucid/+source/evolution/+pots/evolution
[18:37] <dpm> From this point onwards, after translations have been exposed, translators can do their work.
[18:37] <dpm> While they are doing this, and on a periodical basis, translations are exported from Launchpad in a big tarball containing all languages and fed to a program called langpack-o-matic
[18:38] <dpm> Langpack-o-matic takes the translations exported as sources and creates the language packs, one set for each language. These are the packages which contain the translations in binary form and will ultimately be shipped to users, finally closing the translation loop.
[18:38] <dpm> So that was it. Basically, for an application to be translatable in Ubuntu:
[18:38] <dpm> * It must have internationalization support
[18:38] <dpm> * It must be either in main or restricted
[18:38] <dpm> * Its package must create a POT template during build (here's how: https://wiki.ubuntu.com/UbuntuDevelopment/Internationalisation/Packaging#TranslationTemplates)
[18:39] <dpm> If you want to learn more about this, you'll find more info here as well:
[18:39] <dpm>   https://wiki.ubuntu.com/Translations/TranslationLifecycle
[18:39] <dpm>   https://wiki.ubuntu.com/Translations/Upstream
[18:39] <dpm>   https://wiki.ubuntu.com/UbuntuDevelopment/Internationalisation/Packaging
[18:39] <dpm>   https://wiki.ubuntu.com/MaverickReleaseSchedule
[18:40] <dpm> == Translation of Projects ==
[18:40] <dpm> So we've seen how an Operating System such as Ubuntu can be translated in Launchpad
[18:40] <dpm> But what about individual projects? How can they be internationalized and localized?
[18:40] <dpm> There are many programming languages, build systems and possible configurations, so let's try to see a general overview on the steps for adding i18n support to an app and getting it translated.
[18:41] <dpm> * Gettext initialization - the code will have to add a call to the gettext initialization function and set the translation domain. This generally means adding a few lines of code to the main function of the program. Here's a simple example in Python:
[18:41] <dpm>   import gettext
[18:41] <dpm>   gettext.bindtextdomain('myappdomain', '/usr/share/locale')
[18:41] <dpm>   gettext.textdomain('myappdomain')
[18:41] <dpm>   _ = gettext.gettext
[18:41] <dpm> This is a very basic setup. Depending on your build system -if you are using one-, you might have to modify some other files as well
[18:42] <dpm> * Marking translatable strings - you'll then need to mark strings to be translated. This will be as simple as enclosing the strings with _(), which is simply a wrapper for the gettext function
[18:42] <dpm> * Create a 'po' folder to contain translations (po files) and a template (pot file)
[18:42] <dpm> (remember the layout I was mentioning earlier on)
[18:43] <dpm> Roughly, up to here the package will have internationalization support. Let's now see how we can make it translatable for translators to do their work
[18:43] <dpm> * Updating the .pot template - the translatable strings will need to be extracted from the code and put into the POT template to be given to translators. There are several ways to do this:
[18:43] <dpm> a) you can use the gettext tools directly (calling the xgettext program)
[18:44] <dpm> b) you can invoke intltool directly -if you are using it- with 'intltool-update -p -g mycoolapp'
[18:44] <dpm> c) using a make rule to do this for you: with autotools you can use 'make $(DOMAIN).pot-update' or 'make dist'; with python-distutils-extra you can use ./setup.py -n build_i18n
[18:44] <dpm> I'd recommend the last one, as having a build system will greatly simplify maintenance
[18:44] <dpm> If you are using intltool in a standard layout, you can even let Launchpad do the work for you and build the templates automatically
[18:45] <dpm> check out this awesome feature here: http://blog.launchpad.net/translations/automatic-template-generation
[18:45] <dpm> The best integration and workflow is achieved when your project's code is hosted in Launchpad and using bzr, as either committing a new template or letting Launchpad generate it for you will automatically expose it to translators
[18:46] <dpm> in a location such as https://translations.launchpad.net/<project>/<series>/+pots/<templatename>
[18:46] <dpm> see the Getting Things GNOME translations for a real example:
[18:46] <dpm> https://translations.launchpad.net/gtg/trunk/+pots/gtg
[18:48] <dpm> Setting up a project for translations in Launchpad involves enabling translations, activating the template you (or Launchpad) have created and optionally enabling the bzr integration features
[18:48] <dpm> These are fairly easy steps
[18:49] <dpm> so I'll just direct you to https://help.launchpad.net/Translations/YourProject/BestPractices and leave the last few mins for questions
[18:49] <ClassBot> arjunaraoc asked: what is the minimum translation required to include the language in boot options for Ubuntu?
[18:50] <dpm> I believe there is not a minimum for the bootloader package. The minimum is the translation coverage of the debian-installer package
[18:50] <ClassBot> There are 10 minutes remaining in the current session.
[18:50] <dpm> I'd recommend you check https://wiki.ubuntu.com/Translations/KnowledgeBase/DebianInstaller or ask on the ubuntu-translators mailing list
[18:51] <dpm> Moomoc: Is documentation in Ubuntu always translated with gettext? Isn't this a bit arduous?
[18:51] <dpm> actually, translating with gettext isn't arduous, but rather more comfortable for translators. The tricky part of translating documentation
[18:52] <dpm> is converting from the documentation format to the gettext format, which is the one translators are used to
[18:52] <dpm> fortunately, there are several tools to make this easier:
[18:52] <dpm> xml2po or po4a are two good ones
[18:52] <ClassBot> inquata asked: Are there plans to support http://open-tran.eu/ by providing Ubuntu strings?
[18:53] <dpm> There aren't right now, but if you've got an idea on how this could be implemented, a blueprint would be most welcome
[18:53] <dpm> Remember that Launchpad is Open Source: https://dev.launchpad.net/
[18:54] <dpm> and any contributions are really welcome
[18:54] <ClassBot> umang asked: I seem to have missed something about gettext. Are the .po /.mo files accessed at runtime depending on the user's language or are they integrated into separate builds of the same program? If I've understood correctly it's the former.
[18:54] <dpm> .po files are source files, so they aren't used at run time
[18:55] <dpm> the .mo files are generated at build time from the .po files
[18:55] <dpm> then installed in the system (generally at /usr/share/locale ...)
[18:55] <ClassBot> There are 5 minutes remaining in the current session.
[18:55] <dpm> and applications using gettext pick them up at runtime to load the translations from them
[18:56] <dpm> Rhonda also tells me: One important thing to note about MO: Even though it's byte encoded and can be big-endian and little-endian, gettext is sane enough to be able to use _both_ no matter what system it runs on. So the MO format actually is still architecture independent even though the data isn't really.
[18:57] <dpm> We've got time for one or two questions still, anyone?
[18:58] <ClassBot> arjunaraoc asked: debian-installer has not been setup in Launchpad so far for Telugu. Is it better to do translation outside?
[18:59] <dpm> it is in Launchpad, but yes, I'd recommend doing translations upstream in Debian for that particular one
[18:59] <dpm> it is a complex package and does not use a conventional layout
[19:00] <dpm> A final note Rhonda mentioned to me as well: The package python-polib helps with respect to using gettext catalogues in python
[19:00] <dpm> ok, so that was it!
[19:00] <dpm> Thanks a lot for listening and for the interesting questions
[19:01] <Riddell> afternoon all, I'll be starting in one minute
[19:02] <Riddell> Hi, anyone here for a session on Qt Quick?
[19:02] <Riddell> chat in #ubuntu-classroom-chat I believe
[19:02] <Riddell> for this session please install qt4-qmlviewer
[19:03] <Riddell> lucid users need to   sudo apt-add-repository ppa:kubuntu-ppa/beta; sudo apt-get update; sudo apt-get install qt4-qmlviewer
[19:03] <Riddell> I'm Jonathan Riddell, a Kubuntu developer
[19:04] <Riddell> as you know Qt is used in KDE.  it's also spreading out a lot now that Nokia have invested in it
[19:04] <Riddell> it's going to be on pretty much all Nokia phones soon, which will make it the most used UI toolkit on the planet
[19:05] <Riddell> up until now Qt has normally been programmed in the traditional way of making widgets and putting them places
[19:05] <Riddell> which is useful for coders but isn't how designers tend to think
[19:06] <Riddell> designers tend to start with items in places which move around and out of the way depending on what's happening
[19:06] <Riddell> so Qt has come up with Qt Quick, a new way to make UIs
[19:06] <Riddell> it's declarative, so you say what you want it to look like and it'll sort out the bits in between
[19:07] <Riddell> this is very new stuff
[19:07] <Riddell> it hasn't been released yet
[19:07] <Riddell> it will be with Qt 4.7 which is due in August
[19:07] <Riddell> it's still in flux, the language has been changing and more changes might happen
[19:07] <Riddell> but already it's being used in interesting places like KDE PIM mobile http://dot.kde.org/2010/06/10/kde-pim-goes-mobile
[19:08] <Riddell> I'm not too familiar with other toolkits but I think flash and Mac already have declarative UI coding, I don't know of any free toolkits which support it
[19:08] <Riddell> so if you want to create bling interfaces then this will be the way to go
[19:09] <Riddell> Qt Quick is made up of a few things
[19:09] <Riddell> QtDeclarative library, Qt Creator, qmlviewer app, the QML language and some plugins
[19:09] <Riddell> much of this talk was given by a Qt Quick developer last week at Kubuntu Tutorials day, you can see the logs here if I don't explain things too well (it's new to me too) https://wiki.kubuntu.org/KubuntuTutorialsDay
[19:10] <Riddell> you can use Qt Creator for this but the version in the archive doesn't support it
[19:10] <Riddell> you'd need to download the daily build ftp://ftp.qt.nokia.com/qtcreator/snapshots/latest
[19:10] <Riddell> but for now you can just use qmlviewer
[19:10] <Riddell> the QML language integrates well with c++ and signal/slots
[19:10] <Riddell> so if you like your old style of programming, it's not going anywhere
[19:11] <Riddell> right, let's see some code
[19:11] <Riddell> http://people.canonical.com/~jriddell/qml-tutorial/tutorial1.qml
[19:11] <Riddell> that's a hello world example
[19:11] <Riddell> it shows some text in a rectangle
[19:11] <Riddell> you can run it with   >qmlviewer tutorial1.qml
[19:12] <Riddell> 19:12 < maco> looks like CSS to me
[19:12] <Riddell> yes it's a similar syntax but there are plenty of differences
[19:12] <Riddell> here you declare the objects, not just add styles to them
[19:12] <Riddell> also it's not cascading
[19:13] <Riddell> the first line, import Qt 4.7, imports all the types in Qt 4.7
[19:13] <Riddell> so when we start using types later, like 'Rectangle', you now know where they are from
[19:13] <Riddell> http://doc.qt.nokia.com/4.7-snapshot/qdeclarativeelements.html  lists all the current types
[19:13] <Riddell> the Rectangle { line actually creates a Rectangle element
[19:13] <Riddell> between {} you can set the properties and children of the element
[19:14] <Riddell> next line sets an id so we can refer to this rectangle by that id "page" elsewhere
[19:14] <Riddell> some properties are set
[19:14] <Riddell> the next line, Text{, creates another element
[19:15] <Riddell> this element, as it's inside the Rectangle{}, will be a child of the Rectangle element
[19:15] <Riddell> and we set its properties over the next few lines
[19:15] <Riddell> the anchor properties are a way to position elements
[19:15] <Riddell> and that line binds the horizontal centre anchor of the Text to the horizontal center of the element called 'page'
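For readers without the link handy, the file just walked through is essentially the first Qt 4.7 declarative tutorial; this is a reconstruction from that tutorial rather than a copy of the linked file, so exact sizes and colours may differ:

```qml
import Qt 4.7

Rectangle {
    id: page                 // referred to below by this id
    width: 500; height: 200
    color: "lightgray"

    Text {
        id: helloText        // a child of the Rectangle, as it's inside {}
        text: "Hello world!"
        y: 30
        // bind the Text's horizontal centre to that of "page"
        anchors.horizontalCenter: page.horizontalCenter
        font.pointSize: 24; font.bold: true
    }
}
```

Saved as tutorial1.qml, it runs with `qmlviewer tutorial1.qml`.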
[19:15] <Riddell> anyone got it running?
[19:16] <Riddell> 19:15 < ean5533> QUESTION: Do QML elements have required attributes?
[19:16] <Riddell> ean5533: not as far as I know, they will default to sensible values
[19:16] <Riddell> although for text that will be a blank string so it's not much use unless you want to use the item later
[19:16] <Riddell> 19:16 < maco> QUESTION: how do we execute it?
[19:17] <Riddell> with  >qmlviewer tutorial1.qml
[19:17] <Riddell> you need to install qt4-qmlviewer
[19:17] <Riddell> 19:15 < simar> QUESTION: Is this work on GNOME also ie on ubuntu?
[19:17] <Riddell> of course, it's all X so it'll run on any desktop environment
[19:18] <Riddell> just because you use Gnome doesn't mean you can't use non-Gnome apps
[19:18] <Riddell> 19:18 < Neo---> no problems with running it
[19:18] <Riddell> success!
[19:19] <Riddell> resizing the window will move the text so it stays centred, that's the anchor in use
[19:19] <Riddell> so onto tutorial2.qml, which uses multiple files
[19:19] <Riddell> http://people.canonical.com/~jriddell/qml-tutorial/tutorial2.qml
[19:19] <Riddell> The change here is a grid containing a lot of cells that are all very similar
[19:19] <Riddell> so we want to write the code for the Cell once, and reuse it
[19:19] <Riddell> it will load the file Cell.qml to create the Cell type
[19:19] <Riddell> http://people.canonical.com/~jriddell/qml-tutorial/Cell.qml
[19:20] <Riddell> so looking at Cell.qml
[19:20] <Riddell> Item is just a simple type in QML, which is pretty much nothing but a bounding box.
[19:20] <Riddell> the id is set, we'll call it container
[19:20] <Riddell> next some new stuff
[19:20] <Riddell> the line 'property alias cellColor: rectangle.color' creates a new property on this item, and calls it cellColor
[19:21] <Riddell> 'property' starts the property declaration, 'alias' is the type of property, and 'cellColor' is the name
[19:21] <Riddell> because it is an alias type, its value is another property. And it just forwards everything to that property
[19:22] <Riddell> 19:22 < maco> QUESTION: so cellColor is like a variable name then?
[19:23] <Riddell> a property can be a variable, in this case it's an alias so it's just the same as another variable
[19:23] <Riddell> in this case rectangle.color
[19:24] <Riddell> back in tutorial2.qml we only have a 'Cell'. And the interface for that is whatever is declared in the root item of Cell.qml (we can't access the inner Rectangle item)
[19:24] <Riddell> "rectangle" is the item declaired next in Cell.qml, to expose rectangle.color, we add an alias property
[19:25] <Riddell> the 'signal clicked(color cellColor)' line is similar. We add a signal to the item so that it can be used in the tutorial2.qml file
[19:25] <Riddell> Another new element in this file is 'MouseArea'. This is a user input primitive
[19:25] <Riddell> despite the name, it works equally well for touch
[19:25] <Riddell> QML can be the entire UI layer, including user interaction
[19:26] <Riddell> And MouseArea is a separate element so that you can place it wherever you want. You can make it bigger than the buttons for finger touch interfaces, for example
[19:26] <Riddell> to make it the exact size of the Item, we use 'anchors.fill: parent'
[19:26] <Riddell> which anchors it to fill its parent
[19:26] <Riddell> less obvious is the 'onClicked' line after that
[19:27] <Riddell> MouseArea has a signal called 'clicked', and that gives us a signal handler called 'onClicked'
[19:27] <Riddell> you can put a script (QtScript) snippet in 'onClicked', like in Cell.qml, and that snippet is executed when the signal is emitted
[19:27] <Riddell> so when you click on the MouseArea, the clicked signal is emitted, and the script snippet is executed
[19:27] <Riddell> and the script snippet says to emit the clicked signal of the parent item, with container.cellColor as the argument
[19:27] <Riddell> 19:24 < sladen> can these QML files have (semi-)executable code
[19:28] <Riddell> so that answers Paul's question, you can include QtScript (which is Javascript inside Qt)
[19:28] <Riddell> to create UIs that do something
[19:29] <Riddell> so Cell is a rectangle which lets us set the colour and emits a signal with that colour when it's clicked on
[19:29] <Riddell> Back to tutorial2.qml, we can see this interface in use
[19:29] <Riddell> In each Cell instance, we set the cellColor property
[19:29] <Riddell> and use the onClicked handler
[19:29] <Riddell> The Grid element positions the Cell elements in a grid
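Pulling the two linked files together, here is a reconstruction from the Qt 4.7 tutorial (minor property values may differ from the linked files). First Cell.qml, whose root Item is the component's public interface:

```qml
// Cell.qml -- loaded automatically; the element name comes from the file name
import Qt 4.7

Item {
    id: container
    property alias cellColor: rectangle.color   // exposes the inner colour
    signal clicked(color cellColor)             // usable from tutorial2.qml

    width: 40; height: 25

    Rectangle {
        id: rectangle
        border.color: "white"
        anchors.fill: parent
    }

    MouseArea {
        anchors.fill: parent                    // exactly the size of the Item
        // forward the click as the component's own signal, with the colour
        onClicked: container.clicked(container.cellColor)
    }
}
```

And the interface in use in tutorial2.qml, inside the same Rectangle as before:

```qml
// excerpt from tutorial2.qml -- Grid positions the Cell instances
Grid {
    id: colorPicker
    x: 4; anchors.bottom: page.bottom; anchors.bottomMargin: 4
    rows: 2; columns: 3; spacing: 3

    Cell { cellColor: "red"; onClicked: helloText.color = cellColor }
    Cell { cellColor: "green"; onClicked: helloText.color = cellColor }
    Cell { cellColor: "blue"; onClicked: helloText.color = cellColor }
}
```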
[19:29] <Riddell> who's got it working?
[19:32] <Riddell> 19:30 < sladen> Riddell: how does tutorial2.qml know that 'Cell{}' refers to the Cell.qml file?
[19:32] <Riddell> 19:31 <+Riddell> sladen: any .qml files in the same directory as the one being run are loaded automatically
[19:32] <Riddell> 19:31 <+Riddell> and the item name comes from the file name
[19:32] <Riddell> if you rename the Cell.qml file then it stops working
[19:33] <Riddell> ok, who wants some animation?
[19:33] <Riddell> tutorial3.qml does animations using states and transitions
[19:33] <Riddell> http://people.canonical.com/~jriddell/qml-tutorial/tutorial3.qml
[19:34] <Riddell> a State is just a set of property changes from the base state (called "")
[19:34] <Riddell> and a Transition is just telling it how to animate those property changes
[19:34] <Riddell> in this file, in the Text element, we add a MouseArea, states, and transitions
[19:34] <Riddell> compared to tutorial2.qml that is
[19:34] <Riddell> We have a State, which we name "down", and the way we are entering it is through the when property.
[19:35] <Riddell> When 'mouseArea.pressed' changes, that property binding gets re-evaluated
[19:35] <Riddell> when mouseArea.pressed changes to true, it makes 'when' true. And so the state activates itself
[19:35] <Riddell> and this applies the property changes in the PropertyChanges element
[19:36] <Riddell> PropertyChanges has a similar syntax to the rest of QML. Once you set the target, it is just like you are in that item
[19:36] <Riddell> So the 'y: 160' and 'rotation: 180' will be applied as if they were written inside the Text item
[19:37] <Riddell> the transition adds the animation, without it the text just jumps between the two states
[19:37] <Riddell> the from and to properties on the element say which state you are going from and to
[19:37] <Riddell> The ParallelAnimation element just groups animations
[19:37] <Riddell> and when it runs, the animations in it are run in Parallel
[19:37] <Riddell> The first animation in it is a NumberAnimation, which animates numbers
[19:38] <Riddell> 'properties: "y, rotation"' means that it will animate the y and rotation properties
[19:38] <Riddell> so if these properties changed in this state, on any items, they will be animated in this way
[19:38] <Riddell> the rest of the properties in the NumberAnimation define exactly how
[19:38] <Riddell> duration: 500 means the animation will take 500ms
[19:38] <Riddell> easing.type: Easing.InOutQuad means that it will use an interpolation function that has quadratics on both the in and out parts
[19:38] <Riddell> or something like that. The documentation has pretty pictures
[19:39] <Riddell> who's got it running?
[19:39] <Riddell> if you click on the text the transition will start until it reaches the new state
[19:40] <Riddell> the position, rotation and colour all change, unless you let go of your mouse button in which case they change back
[19:40] <Riddell> if you comment out the    transitions: Transition { ... }   block then the animation doesn't happen
[19:40] <Riddell> it just jumps between the two states
[19:40] <Riddell>  /*  */  and // comments work
[19:41] <Riddell> and that is how you get pretty animations without having to code them
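Pieced together from the talk and the Qt 4.7 tutorial, the additions to the Text element in tutorial3.qml look roughly like this; the property values are the ones quoted above, while the ColorAnimation is an assumption on my part to match the colour change mentioned:

```qml
// sketch, not the verbatim file -- added inside the Text element
MouseArea { id: mouseArea; anchors.fill: parent }

states: State {
    name: "down"                       // the base state is named ""
    when: mouseArea.pressed == true    // state activates while pressed
    PropertyChanges { target: helloText; y: 160; rotation: 180; color: "red" }
}

transitions: Transition {
    from: ""; to: "down"; reversible: true
    ParallelAnimation {                // run both animations at once
        NumberAnimation {
            properties: "y,rotation"   // animate these two properties
            duration: 500              // over 500ms
            easing.type: Easing.InOutQuad
        }
        ColorAnimation { duration: 500 }
    }
}
```

Commenting out the transitions block makes the text jump between the two states with no animation, as described above.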
[19:41] <Riddell> since animations are going to be important in applications in future, this is an important new tool to make sure free software remains in the lead and the world isn't dominated by iPhone software
[19:42] <Riddell> 19:41 < mgamal> QUESTION: Can you integrate qml code within normal Qt C++ code?
[19:42] <Riddell> yes, and actually you will have to with Qt 4.7
[19:42] <Riddell> qmlviewer won't be installed by distros by default in general
[19:43] <Riddell> you need to load it with QDeclarativeView in your C++ (or Python or whatever) code
[19:43] <Riddell> as I said before, this is so new it hasn't been released yet, so consider this a heads up for the future
[19:44] <Riddell> also this code still isn't much use for most designers, so Qt Creator integration is planned
[19:44] <Riddell> then you can just lay your elements out like in Qt Designer and enter the properties through the UI and pick your preferred transition style
[19:44] <Riddell> with any luck it'll get rid of the need for anyone to use Flash
[19:45] <Riddell> I think there's even been experimental browser plugins with it
[19:46] <Riddell> 19:44 < Neo---> QUESTION: are interfaces supposed to be written by coding or is there a graphical editor (possibly in-the-making)?
[19:46] <Riddell> the graphical editor is actually Qt Designer, the IDE you can use is Qt Creator and it integrates Designer very well
[19:47] <Riddell> any questions?  comments?  heckles?
[19:48] <Riddell> missed your chance sladen :)
[19:48] <Riddell> thanks for coming all
[19:49] <Riddell> as ever #kubuntu-devel is open to anyone wanting to help with Kubuntu
[19:49] <Riddell> #kde-devel is a good place to get into KDE development, #qt for Qt development and #qt-qml for Qt Quick
[19:50] <Riddell> 19:48 < sladen> Riddell: what's the latest terminiology re: "signals" "slots" and the like (eg. you've used "signal handler")
[19:50] <ClassBot> There are 10 minutes remaining in the current session.
[19:51] <Riddell> a signal handler is a property in a QML item which can handle a signal
[19:51] <Riddell> unlike a slot it's in the same item, a slot can be in any object
[19:51] <Riddell> and you don't need to connect it, it's connected by just having something in the property
[19:53] <Riddell> 19:53 < sladen> Riddell: "something" ?
[19:53] <Riddell> yes, where the something is probably a QtScript snippet
[19:53] <Riddell> e.g.          Cell { cellColor: "red"; onClicked: helloText.color = cellColor }
[19:54] <Riddell> signal is "clicked", signal handler is the property "onClicked", which we set to the QtScript snippet "helloText.color = cellColor"
[19:55] <Riddell> 19:54 < James147> Riddell: how does scope work with QML? Can child elements see their parent's properties?
[19:55] <Riddell> I believe there's a "parent" keyword yes
[19:55] <ClassBot> There are 5 minutes remaining in the current session.
[19:55] <Riddell> generally you just use the id for each object within that file
[19:56] <Riddell> and files are used for items with interfaces so you can't (easily) access items within the top level item in a file
[19:57] <Riddell> but you make sure that top level item has the interface you need, like we did with Cell
[19:57] <Riddell> 19:55 < sladen> Riddell: so the relationship between 'clicked' and 'onClick' is implicit
[19:57] <Riddell> correct
[19:58] <Riddell> thanks for coming all, hope it was interesting
[19:59] <Riddell> I believe Laney and Rhonda will be taking us through "How to work with Debian" in a couple of minutes
[19:59]  * Laney waves
[20:00] <Laney> We'll get started in a couple of minutes
[20:00]  * Rhonda greets, too.
[20:01] <Laney> Right, let's do this
[20:01] <Laney> do we have people in #-chat? :)
[20:02] <Laney> Ah yes, good good!
[20:02]  * Laney spots a troublemaker in sebner already :)
[20:02] <Laney> So I'll quickly introduce myself... I'm Iain and I'm going to talk to you today about why you want to work with Debian
[20:03] <Laney> I'm a MOTU developer and contribute to Debian packaging too
[20:03] <Laney> Half or so of the session will also be given by Rhonda who is a long-time DD and recent MOTU, but she can introduce herself later :)
[20:04] <Laney> So what is Debian? Simply put, it is a hugely important Linux distribution which maintains some tens of thousands of pieces of free software
[20:04] <Laney> And almost uniquely in our Linuxy world, it is completely volunteer driven and not controlled by any commercial entity
[20:05] <Laney> The vast vast *vast* majority of software packages that are present in Ubuntu come from Debian packages in one way or another
[20:05] <Laney> Here's a graph that I found which shows how the packages in Universe are formed: https://merges.ubuntu.com/universe-now.png
[20:06] <Laney> Everything apart from the cyan wedge is packages which came from Debian; the dark blue wedge is those which haven't been modified in Ubuntu at all
[20:06] <Laney> so you can see how important this project is to us :)
 QUESTION: is part of getting ubuntu-specific packages in debian part of the talk?
[20:07] <Laney> Not specifically, but I will try and get to this at the end
 QUESTION: Yesterday I attended Danial's class. It was great but after learning packaging I don't know what to do, where to get started and how to help ubuntu. I hope you  will take care of this :)
[20:08] <Laney> This talk is more about the why rather than the how: It is intended to encourage people to contribute to Debian directly
[20:08] <Rhonda> It is also meant to cover how, but for a very specific area. :)
[20:09] <Laney> ...ok then, so I'm here to convince you to do your Ubuntu work in Debian directly rather than trying to get fixes into Ubuntu. Having the string "ubuntu" in a package version is what we're intending to minimise
[20:09] <Laney> Rhonda: Right, my half :P
[20:10] <Laney> The Ubuntu MOTU team has approximately an order of magnitude fewer developers than there are DDs (people who can upload to Debian), so we have really got no hope of keeping on top of that many packages without help
[20:10] <Laney> By working together with our biggest partner, we can make the effort of keeping Ubuntu in good shape go as smoothly as possible
[20:11] <Laney> Also there is another perspective to this partnership: getting fixes pushed upstream (to the source of the software) will help more people benefit from your work, which can't be a bad thing
[20:11] <Laney> I'm not going to spend time in this session talking about the technical process of how to send your work to Debian, but here are some pointers for that:
[20:12] <Laney>   - look at the script "submittodebian" in the ubuntu-dev-tools package
[20:12] <Laney>   - look at the program "reportbug" in the reportbug package
[20:12] <Laney> https://wiki.ubuntu.com/Debian/Bugs#Using%20submittodebian%20to%20forward%20patches%20to%20Debian this is also a good resource on how technically to go about forwarding bugs
[20:12] <Laney> and as always #ubuntu-motu on freenode is an excellent place to ask any question
[20:14] <ClassBot> tech2077 asked: When we contribute to debian, and we want a package we made for debian in ubuntu, what should we do
[20:14] <Rhonda> Packages that are in Debian will get imported automatically into Ubuntu - there usually is nothing needed at all, as long as the next release isn't pending like currently (Debian Import Freeze)
[20:15] <ClassBot> ari-tczew asked: do I need to be a DD to doing NMU?
[20:15] <Rhonda> No, you don't need to - though you'll need a DD to sponsor the actual upload. The NMU itself can be prepared by anyone.
[20:16] <Laney> contributing RC bug fixes by NMU is a great way to get involved in Debian btw: see various RCBW posts on Planet Debian :)
[20:16] <Laney> thanks Rhonda
[20:16] <Laney> Right, so imagine you've found a bug and have managed to identify the fix
[20:16] <Rhonda> Sidenote: NMU stands for Non-Maintainer Upload - many packages within Debian have a pretty tight maintainer concept of who is responsible for the package.
[20:17] <Laney> It can be difficult for you to identify whether the fix is appropriate to Debian or to Ubuntu, and additionally whether the fix is serious enough to require Ubuntu uploading right away or whether you can wait for the Debian maintainer to upload and then do a sync from there
[20:18] <Laney> Part of your fixing process should be investigating how far upstream to push the fix:
[20:18] <Laney> If Debian is also seeing the bug, or if the fix is not specific to Ubuntu in any way — forward to the Debian bug tracking system (using submittodebian or reportbug)
[20:19] <Laney> If the fix is in the upstream code (not part of the Debian/Ubuntu packaging or any patch therein), then it might be nice to try and reproduce the problem on the vanilla upstream version and then send the patch directly there
[20:19] <Laney> If it is somehow Ubuntu specific then just submit to Launchpad for sponsoring
[20:20] <Laney> if you are unsure then ask in #ubuntu-motu as always
[20:20] <Laney> I've dug up a couple of bugs to use as case studies so that we can decide what to do with some real patches
[20:21] <Laney> The first one is bug #604565 — https://bugs.launchpad.net/ubuntu/+source/motion/+bug/604565
[20:21] <Laney> have a look at the proposed diff — http://launchpadlibrarian.net/51774081/debian-ubuntu.debdiff — do you guys think this should be forwarded?
 QUESTION: what is vanilla upstream version. Please relate it to bazaar
[20:22] <Laney> It's useful to download the upstream software from their website and compile it yourself, that way you can tell whether the bug you are seeing is a problem with their software or the way Debian/Ubuntu have set it up
[20:23] <Laney> that's what I mean by the vanilla upstream version
[20:24] <Laney> in bzr terms, perhaps a tagged commit? i.e. one which is an official upstream release
[20:24] <Laney> right, let's move on with the example
[20:25] <Laney> This is a good case of something which requires a little bit of investigation: the real change is a build-depends change from libmysqlclient15-dev → libmysqlclient-dev
[20:25] <Laney> What you'd want to do here is to investigate whether this fix applies to Debian too, to see whether we can forward the patch there
[20:25] <Laney> so we hop over to the PTS page of mysql-5.1 to have a look
[20:26] <Laney> that's this page: http://packages.qa.debian.org/m/mysql-5.1.html
[20:26] <Laney> we can see a package by the name libmysqlclient-dev in the "binaries" list, so that's a good start
[20:26] <Laney> we'll double check the changelog to be sure
[20:26] <Laney> that's here: http://packages.debian.org/changelogs/pool/main/m/mysql-5.1/current/changelog
[20:27] <Laney> In the entry for 5.1.37-1 we see the message "Drop empty transitional package libmysqlclient15-dev, and provide/replace it with libmysqlclient-dev"
[20:27] <Laney> that is a very strong hint that we can indeed apply this fix to Debian too
[20:27] <Laney> so at this point I'd check the bugs list of the original package to see if the change is there: http://bugs.debian.org/cgi-bin/pkgreport.cgi?repeatmerged=no&src=motion
[20:28] <Laney> it's not, so now I'd do a test build in a Debian environment (e.g. one created with 'pbuilder-dist sid create', with ubuntu-dev-tools installed)
[20:28] <Laney> and if this all works, then forward the patch
[20:28] <Laney> and hopefully at the next upload the Debian maintainer picks it up and we can just sync the package again :)
[20:29] <Laney> I got this question in PM:
[20:29] <Laney> 13/07 20:16:39 <Moomoc> On the Debian site: Am I a Debian Developer when I'm just maintaining one or more packages in Debian, or am I just a Debian maintainer then?
[20:30] <Rhonda> The terms Debian Developer and Debian Maintainer come with specific permissions.
[20:30] <Laney> Both of those terms have technical meanings in Debian: a Debian Developer is someone who has passed through the New Maintainer process and has an @debian.org address. They have almost unrestricted upload access to the Debian archive
[20:30] <Laney> Debian Maintainer is a more limited set of permissions that is correspondingly not as stringent to achieve. These people can upload to packages they are maintainer or co-maintainer of and have a specific control file field set
[20:31] <Laney> however you need neither of these statuses to contribute perfectly well to Debian; you will just have to have your uploads sponsored by someone with access
[20:31] <Laney> Rhonda: did you have some questions you wanted to answer now?
[20:31] <ClassBot> ari-tczew asked: mentors debian page is for new packages?
[20:32] <Rhonda> The mentors.debian.net effort is a site for conveniently uploading packages when seeking sponsors.
[20:32] <Rhonda> There are some people monitoring uploads to there, but it is usually advised to also ask on the debian-mentors@lists.debian.org mailing list or in #debian-mentors on OFTC (irc.debian.org)
[20:33] <ClassBot> porthose asked: If you are unable to make it to UDS or DebConf, how is one to get their key signed to become a DM? What alternatives are there?
[20:34] <Rhonda> There is a special page in the Debian wiki where you can look for people in your area. Debian Developers are usually found at most bigger conferences and events besides UDS and DebConf, too.
[20:35] <Rhonda> There are alternatives too, but those became extremely discouraged in recent times and only apply to people living really off the track.
[20:36] <Rhonda> Thanks to nthykier, who dug up the keysigning URL: http://wiki.debian.org/Keysigning/Offers
[20:36] <Rhonda> You can also ask any DD to look up location information in the debian ldap to find additional people that might live in your area.
[20:37] <Laney> OK, I had another case study which was Ubuntu specific but I'll leave that for now :)
[20:37] <Laney> it was bug 582253 if you're interested
[20:38] <Laney> To wrap it up, I just want to quickly talk about actually maintaining packages in Debian
[20:38] <Laney> your contributions need not be limited to forwarding individual patches from Ubuntu
[20:39] <Laney> If you get more involved in Ubuntu development, you will probably find at some point that there is a particular package or group of packages that hold your interest more than the rest
[20:39] <Laney> when this happens to you, it's a good idea to have a look at how they are maintained in Debian and try and get involved directly there
[20:40] <Laney> for example I am a member of the Haskell packaging team and CLI (~ mono) teams
[20:41] <Laney> all of the advantages of patch forwarding apply here too — reducing deltas, getting your fixes out to more people etc. with the additional bonus that *you* get to decide (or at least help to decide) what gets into the Debian packages
[20:41] <Laney> even if your fave package isn't maintained in a team, it might be a nice idea to approach the maintainer and offer your time - often people are pressed for time themselves and would appreciate your help
[20:42] <Laney> and by talking to Debian maintainers directly you will be speaking to people who have direct knowledge of the software they are dealing with — this is not guaranteed to be the case in the MOTU team :)
[20:42] <Laney> right, that's all I had to say for now
[20:42] <Laney> I believe Rhonda has a few words for you, if I didn't take up too much time… :)
[20:43] <Rhonda> Hi. I've sneaked in a few lines already, let me introduce myself properly. I've been a Debian Developer for a very long time and have piled up a fair amount of packages to look after.
[20:44] <Rhonda> Over time I found out that some of the packages I maintain carried diffs within Ubuntu that make sense for Debian, too.
[20:45] <Rhonda> So I wrote to the last person I found in the changelog making a change to one of the packages, to ask why the changes weren't forwarded to me.
[20:45] <Rhonda> Response was mostly "I didn't do the changes, just the last sync."
[20:46] <Rhonda> Over time, though, I was able to convince some of the people that it is good for them to forward patches.
[20:46] <Rhonda> There are many good reasons, including that it's covered even in the Code of Conduct ;)
[20:47] <Rhonda> The most appealing reason though should be: It means less work! No more merges needed, no checking whether the patch is still required, whether the patch might even produce a conflict, and so on.
[20:48] <Rhonda> When a package can easily get synced without requiring a merge, it is a win-win situation for everyone: The fix is available to a broader audience and there is no work anymore for you!
[20:48] <ClassBot> NMinker asked: Is there an IRC channel that we can talk to Debian Maintainers, much like we can with the Ubuntu MOTDs?
[20:49] <Rhonda> There is a very old channel that was pretty dead for a while but got reactivated recently for this purpose: #debian-ubuntu, which lives on OFTC (irc.debian.org)
[20:50] <ClassBot> There are 10 minutes remaining in the current session.
[20:51] <Rhonda> I'd like to mention two pages in the Ubuntu wiki that are closely related to this topic:
[20:52] <Rhonda> https://wiki.ubuntu.com/Debian/Bugs mentions how the Debian Bug Tracking System (BTS) works and how one can interface with it. The approach is a fair bit different from Launchpad, mostly in that it's email-centric and doesn't have a web interface for changing bug status and comments.
[20:54] <Rhonda> Secondly, https://wiki.ubuntu.com/Debian/Usertagging is also extremely interesting. It helps keeping track of bugs that were submitted to Debian to get an overview of where one might need a little more prodding to get it applied, or find candidates of packages that might require NMUs to get them fixed.
[20:54] <ClassBot> Ram_Yash asked: Are there any code review tools used?
[20:55] <Rhonda> Usually people are happier to receive ready-made patches to apply to the packages. When they are sent to the BTS they can easily be reviewed by the package maintainer or anyone else interested in the packages.
[20:55] <ClassBot> There are 5 minutes remaining in the current session.
[20:55] <Rhonda> A very important question that I still want to address:
and what about DDs or DMs that are hostile? Is there a way to work around that hostility?
[20:56] <Rhonda> Unfortunately this is part of the reason why some Ubuntu people rather refrain from forwarding patches or bugs to Debian.
[20:56] <Rhonda> I'd like to stress that those might be very vocal but on the other hand are a rather small group.
[20:58] <Rhonda> The best advice is to try to keep your temper and stay calm. Feel invited to come over to #debian-ubuntu (OFTC) and seek advice on how to move on in those areas. Usually there are DDs around who have had their own issues with those people already and have found a way to work around them, one way or another.
[20:59] <Rhonda> I think our time is almost up - the next session should start in a few minutes.
[20:59] <Laney> Thanks for coming!
[20:59] <Laney> The take home message is: always think about Debian when fixing stuff in Ubuntu
[20:59]  * Laney waves
[20:59] <Rhonda> Thanks for listening, and also for the good questions. :)
[21:03] <zyga> test
[21:03] <zyga> great
[21:03] <zyga> first of all thanks for joining, I don't know how many people are with me today
[21:04] <zyga> I prepared some rough notes and a bzr branch for those of you who will find this topic interesting
[21:04] <zyga> also I'm not sure how classbot and questions work so if anyone could ask me a QUESTION in the #ubuntu-classroom-chat channel I would appreciate that
[21:04] <zyga> if not I'll just start talking...
[21:05] <zyga> == About ==
[21:05] <zyga> Validation dashboard is a tool for both application developers and system integrators.
[21:05] <zyga> At system scope (which is probably not that interesting to most people here) it
[21:05] <zyga> aids development of a distribution or a distribution spin-off. More specifically it aids in seeing how low-level changes affect the whole system.
[21:05] <zyga> At application developer scope it aids in visualizing performance across time
[21:05] <zyga> (different source revisions), operating system versions and target hardware.
[21:06] <zyga> it all sounds nice but it's worth pointing out that dashboard is still under development and very little exists today
[21:07] <zyga> still it's on schedule to be usable and useful for maverick
[21:07] <zyga> I have a branch with some code that is worth using today, I will talk about it later during this session
[21:07] <zyga> == How it works ==
[21:08] <zyga> Validation dashboard is based on tapping into _existing_ tests and benchmarks
[21:08] <zyga> that provide data interesting to the target audience (you, developers). Most
[21:08] <zyga> projects have some sort of tests and already use them for CI (continuous
[21:08] <zyga> integration), some have test suites but no CI system as some are difficult to
[21:08] <zyga> set up and require effort to maintain.
[21:08] <zyga> Dashboard takes CI a step further. First by allowing you to extend a CI system
[21:08] <zyga> into a user-driven test system. When users can submit test results you get much
[21:08] <zyga> more data from a wide variety of hardware. Second you can test your unchanging
[21:08] <zyga> software (a stable release branch) against a changing environment. This will
[21:08] <zyga> allow you to catch regressions caused by third party software updates that you
[21:08] <zyga> depend on or that affect your runtime behaviour by being active in the system.
[21:08] <zyga> Finally the biggest part of launch control is the user interface. While I'm
[21:08] <zyga> just giving promises here the biggest effort will go into making the data easy
[21:08] <zyga> to understand and easy to work with. Depending on your project you will have
[21:08] <zyga> different requirements.
[21:09] <zyga> The dashboard will allow you to show several kinds of pre-defined visualizations, depending on the kind of data your tests generate
[21:10] <ClassBot> porthose asked: is it working
[21:10] <zyga> it works, thanks
[21:11] <zyga> and will also allow you to make custom queries (a variation of the pre-defined really) that will show some specific aspect of the data such as comparing one software version to another or comparing results from different hardware, etc
[21:12] <zyga> so that's the good part, next I'll talk about how the dashboard operates internally and what is required to set one up (once it's ready to be used)
[21:12] <zyga> so the bad part is you need to put in some effort to use the dashboard, an initial investment of sorts
[21:13] <zyga> you have to do some work to translate your test results into a format the dashboard will understand
[21:13] <zyga> Dashboard understands a custom data format that encapsulates software profile, hardware profile and test results (both tests and benchmarks). You get half of the information for free but you have to invest in writing a translator or other glue code from whatever your test suite generates into dashboard-friendly format.
[21:14] <zyga> A python library (that already exists) has been created to support this. Anyone can get it from my bzr branch by executing this command: bzr get lp:~zkrynicki/launch-control/udw launch-control.udw
[21:15] <zyga> you can get the branch now, I'll use it for one example later on
[21:16] <zyga> so a little back story now, dashboard is a project created for the arm world, arm hardware is really cool because it is so diverse and can scale from tiny low power microcontrollers all the way up to the multicore systems that have lots of performance
[21:17] <zyga> one of the thing that is not good about such diversity is validating your software stack on new hardware configuration, you really need some tools to make it efficient and worth your effort in the first place
[21:17] <zyga> so a couple of people in the linaro group are working on a set of tools that will make it easier, dashboard (aka launch-control) is one of them
[21:18] <zyga> so enough with the back story
[21:18] <ClassBot> ktenney asked: url to look at while waiting?
[21:20] <zyga> ktenney, I made a presentation about the initial assumptions of what the dashboard is about, I'm not sure if that is what you asked for. The presentation is here: http://ubuntuone.com/p/6fE/
[21:21] <zyga> okay so at the really low level the dashboard is about putting lots and lots of samples (measurements of something) into context
[21:21] <zyga> you can think of samples as simple test results
[21:22] <zyga> samples come in two forms: one for plain tests and one for benchmark-like tests
[21:22] <zyga> most of the work you have to do to start using this is to translate your test results into this sample thing, fortunately it's quite easy
[21:23] <zyga> if you check out the branch I posted earlier (bzr get lp:~zkrynicki/launch-control/udw launch-control.udw)
[21:24] <zyga> if you look around you'll see the examples directory
[21:24] <zyga> inside I wrote a simple script that takes the default output of python's unittest module
[21:24] <zyga> and converts that into dashboard samples
[21:25] <zyga> and packages all the samples into something I call a bundle, which you will be able to upload to a dashboard instance later on
[21:25] <zyga> so let's have a look at that code now
[21:25] <zyga> note: this is developed on maverick so if you have lucid and hit a bug, just let me know and I'll try to help you out
[21:26] <zyga> I'm sorry for saying this but I'm on holiday and I'm away from my workstation where I have a much better infrastructure
[21:26] <zyga> the client side code will run on a wide variety of linux distributions and will require little more than python2.5
[21:27] <zyga> this branch might fail but it's just a snapshot of work in progress developed on maverick
[21:27] <zyga> okay
[21:27] <zyga> so first thing is to get some test output
[21:28] <zyga> if you just run the test case (test.py) it will hopefully pass and print a summary
[21:29] <zyga> if you run it with -v (./test.py -v) it will produce a much more verbose format
[21:29] <zyga> that's the format we'll be using, redirect it to a file and store it somewhere
[21:30] <zyga> (by default unittest prints on sys.stderr so to capture that using a bash-like shell you must redirect the second file descriptor: ./test.py -v 2>test-result.txt )
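For readers without the branch handy, the same kind of verbose output can be reproduced with stock unittest. This stand-in test file is illustrative, not the session's actual test.py:

```python
import io
import unittest

# A stand-in for the session's test.py (illustrative, not the actual file).
class MathTests(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

# verbosity=2 is what "./test.py -v" turns on: one " ... ok" line per test,
# printed to stderr by default (captured into a StringIO here instead).
stream = io.StringIO()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(MathTests)
unittest.TextTestRunner(stream=stream, verbosity=2).run(suite)
print(stream.getvalue().splitlines()[0])
```

Running this prints a line of the form `test_addition (...) ... ok`, the format the session works with below.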
[21:30] <ClassBot> penguin42 asked: What's the flow? Is the idea the dashboard runs somewhere central (like launchpad) or that each developer might have his own copy for his own test runs?
[21:30] <zyga> penguin42, great question thanks
[21:30] <zyga> penguin42, so the flow is kind of special
[21:31] <zyga> penguin42, we have decided NOT to run the centralized dashboard instance ourselves as it would defeat the linaro-specific requirements
[21:31] <zyga> penguin42, so to cut to the chase, you host your own dashboard,
[21:31] <zyga> it's going to be trivial to set one up
[21:31] <zyga> on a workstation
[21:31] <zyga> or a virtual machine
[21:31] <zyga> or some server you have
[21:32] <zyga> we'll make the deployment story as easy and good as possible as we expect (we == linaro) to get this inside corporations that develop software for the arm world and we want them to have a good experience
[21:33] <zyga> that said it's still possible that in the future launchpad or other supporting service will grow a dashboard derived system, there is a lot of interest for having some sort of tool like this for regular ubuntu QA tasks
[21:33] <zyga> but the answer is: currently you run your own
[21:33] <ClassBot> penguin42 asked: Is it possible to aggregate them - i.e. if there are a bunch of guys each doing this, or if a bunch of organisations are each doing it?
[21:34] <zyga> sorry for losing context, could you specify what to aggregate?
[21:36] <zyga> currently I see this being used (during the M+1 cycle) by linaro and some early adopters that will want to evaluate it for inclusion into their tool set, so I expect project-centric deployments
[21:36] <ClassBot> penguin42 asked: n developers each working on an overlapping set of packages, each running their set of tests; is there a way multiple dashboards can aggregate test results to form an overview of all of their tests?
[21:37] <zyga> penguin42, yes multiple developers can use a single instance to host unrelated projects and share some data (possibly)
[21:37] <zyga> penguin42, so to extend on your example, you can have a couple of developers working on some packages in some distribution (one for simplicity but this is not required)
[21:38] <zyga> penguin42, and while each developer sets up something that will upload test results (daily tests are our primary target)
[21:38] <zyga> penguin42, you can go to the dashboard and see a project wide overview of how your system is doing
[21:38] <zyga> penguin42, if there are any performance regressions
[21:39] <zyga> penguin42, new test failures
[21:39] <zyga> penguin42, overall test failures grouped by test collection (my term for "bunch of tests")
[21:39] <zyga> penguin42, our current targets are big existing test projects such as LTP or Phoronix
[21:39] <zyga> they have lots of tests that look at the whole distribution
[21:40] <zyga> so many people can upload results of running those tests on their software/hardware combination
[21:40] <zyga> and you can look at that on one single page
[21:41] <zyga> on the opposite spectrum you can have multiple projects (such as "my app 1" and "my app 2")
[21:41] <zyga> that for some reason share a dashboard instance
[21:41] <zyga> and have totally unrelated data inside the system
[21:42] <ClassBot> tech2077 asked: Will this be available stable before maverick? I heard it would be stable by then, but what is the stable release time frame?
[21:42] <zyga> sorry I'm on GSM internet here and I have some lags
[21:42] <zyga> we have to speed up a little
[21:43] <zyga> tech2077, it will be available by the time maverick ships in a PPA
[21:43] <zyga> tech2077, our target is inclusion in N
[21:43] <zyga> I'll get back to the session now
[21:43] <zyga> so we ran the test suite I have written for the client side code
[21:44] <zyga> and placed the results in a test-result.txt file
[21:44] <zyga> the results themselves are in a simple, (more or less) line-oriented format
[21:44] <zyga> the interesting bits are lines that end with " ... ok" and " ... FAIL"
[21:44] <zyga> parsing that should be easy
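A minimal sketch of such a parser; the sample dictionaries here are illustrative, the real launch_control library structures them differently:

```python
import re

# Matches unittest's verbose result lines, e.g.
#   test_addition (test.MathTests) ... ok
#   test_division (test.MathTests) ... FAIL
LINE_RE = re.compile(r"^\S+ \([^)]+\) \.\.\. (?P<status>ok|FAIL|ERROR)$")

def parse_pyunit(lines):
    """Turn verbose unittest output into simple pass/fail samples."""
    samples = []
    for line in lines:
        match = LINE_RE.match(line.strip())
        if match:
            status = "pass" if match.group("status") == "ok" else "fail"
            samples.append({"test_result": status})
    return samples

output = [
    "test_addition (test.MathTests) ... ok",
    "test_division (test.MathTests) ... FAIL",
]
print(parse_pyunit(output))  # → [{'test_result': 'pass'}, {'test_result': 'fail'}]
```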
[21:45] <zyga> If you run ./examples/parse-pyunit -i test-result.txt -o test-result.json
[21:45] <zyga> you'll get a test-result.json file, go ahead and inspect it
[21:46] <zyga> there is some support structure but the majority of the data is inside the "samples" collection
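As a rough Python sketch of what such a file contains (the layout is a guess based on the session's description, not the actual bundle format):

```python
import json

# Illustrative bundle: the "samples" collection holds the converted
# test results; real bundles carry additional support structure.
bundle = {
    "samples": [
        {"test_result": "pass"},
        {"test_result": "fail", "message": "AssertionError: 1 != 2"},
    ],
}

text = json.dumps(bundle, indent=2)
print(len(json.loads(text)["samples"]))  # → 2
```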
[21:46] <zyga> so this is the easiest way of translating test results
[21:46] <zyga> tests have no individual identity
[21:47] <zyga> and all you get is a simple pass/fail status
[21:47] <zyga> everything else is just optional data, like the message we harvested in this case
[21:47] <zyga> this is very weak as we cannot, for example, see a history of a particular test case
[21:47] <zyga> but it was very easy to set up
[21:48] <zyga> the next thing we'll do is turn on a feature I commented out
[21:48] <zyga> in examples/parse-pyunit find the line that says bundle.inspect_system() and remove the comment # sign
[21:49] <zyga> if you run the parser again you'll get lots of extra information
[21:49] <zyga> this is the actual data you'd submit to a dashboard instance
[21:49] <zyga> your test results (samples)
[21:50] <zyga> software profile (mostly all the software the user had installed)
[21:50] <zyga> hardware profile (basic hardware information, cpu, memory and some miscellaneous bits like usb)
[21:50] <ClassBot> There are 10 minutes remaining in the current session.
[21:51] <zyga> the profiles will make it possible to construct specialized queries and to filter data inside the system
[21:51] <zyga> okay so I have 10 minutes
[21:51] <zyga> I'd like to talk a tiny bit about samples again to let you know what is supported during this cycle
[21:51] <zyga> and spend the rest on questions
[21:51] <zyga> so there are qualitative samples (pass/fail) tests
[21:52] <zyga> they have test_result (mainly pass and several types of fail)
[21:52] <zyga> and test_id - the identity
[21:53] <zyga> if you start tracking test identity you need to make sure your tests have a unique identity that will not change as you develop your software
[21:53] <zyga> the primary use case for this is specialized test cases and benchmarks
[21:53] <zyga> a test that checks if the system boots is pretty important
[21:53] <zyga> a benchmark that measures rendering performance needs identity to compare one run to all the previous runs you already stored in the system
[21:54] <zyga> identity is anything you like but it's advised to keep it to a reverse domain name scheme
[21:54] <zyga> the only thing the system enforces is a limited (domain-name-like) set of available characters
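A possible validator for such identifiers; the exact character rules aren't spelled out in the session, so the regex below (dot-separated labels of lowercase letters, digits, and dashes) is an assumption:

```python
import re

# Assumed rule: dot-separated, domain-name-like labels. The session only
# says "domain name like", so this pattern is a guess, not the real check.
TEST_ID_RE = re.compile(r"^[a-z0-9-]+(\.[a-z0-9-]+)*$")

def is_valid_test_id(test_id):
    """Check that an identity uses only domain-name-like characters."""
    return bool(TEST_ID_RE.match(test_id))

print(is_valid_test_id("org.example.render.startup-time"))  # → True
print(is_valid_test_id("has spaces!"))                      # → False
```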
[21:55] <zyga> if you look at the pydoc documentation for launch_control.sample.QualitativeSample you can learn about additional properties
[21:55] <zyga> the next important thing is QuantitativeSample - this is for all kinds of benchmarks
[21:55] <ClassBot> There are 5 minutes remaining in the current session.
[21:55] <zyga> and differs by having a measurement property that you can use to store a number
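To illustrate the difference between the two sample kinds, here are simplified stand-ins; these are not the real launch_control.sample classes, just sketches of the properties described above:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative stand-ins for the two sample kinds described in the session;
# the real launch_control.sample classes may differ.
@dataclass
class QualitativeSample:
    test_result: str               # mainly "pass", plus several fail types
    test_id: Optional[str] = None  # optional reverse-domain identity

@dataclass
class QuantitativeSample(QualitativeSample):
    measurement: Optional[float] = None  # the benchmark number

bench = QuantitativeSample(
    test_result="pass",
    test_id="org.example.render.fps",
    measurement=59.7,
)
print(bench.measurement)  # → 59.7
```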
[21:56] <zyga> if you have benchmarks or want to experiment with adapting your test results so that they can be pushed to the dashboard please contact me, I'd love to hear your comments
[21:57] <ClassBot> tech2077 asked: is python-coverage available for lucid? it seems to depend on it
[21:57] <zyga> tech2077, python-coverage is not strictly required, it's just for test coverage of the library
[21:58] <ClassBot> dupondje asked: when running it I get ImportError: No module named launch_control.json_utils, what's the right package I need to install?
[21:58] <zyga> dupondje, none, just make sure to run this from the top-level package directory, or set PYTHONPATH accordingly
[21:59] <zyga> in general if you want to make sure you have all the dependencies see debian/control
[21:59] <ClassBot> penguin42 asked: Has it got any relation to autotest (autotest.kernel.org)
[21:59] <zyga> penguin42, no
[22:00] <zyga> penguin42, actual test frameworks are not really related to the dashboard
[22:00] <zyga> penguin42, dashboard is just for visualizing the data and for having a common upload "document" (here a .json file)
[22:00] <zyga> time's up
[22:01] <dupondje> you have some visual example ?
[22:01] <zyga> dupondje, just mockups, I'm working on the visual parts really
[22:02] <zyga> dupondje, (actually I will next week, I'm still on holiday)
[22:02] <dupondje> héhé ok :)
[22:02]  * zyga keeps getting bitten by mosquitoes just to get internet here
[22:03] <zyga> dupondje, what I can tell you today is that I'll probably use open flash charts for the on-screen rendering
[22:03] <penguin42> a whole new meaning to bugs with your net connection
[22:03] <zyga> hehe
[22:03] <zyga> the hard part with visualizing is to make it really easy to make custom graphs that you want to show (as in asking for the right data)
[22:04] <zyga> some of this is really skewed to linaro and arm world but much I hope will apply to general PCs and upstreams that develop software
[22:05] <zyga> if upstreams start producing more tests and start to look at feedback from various runtime environments (during their daily development process) then the dashboard project will succeed
[22:05] <zyga> that's all from me, if you want please contact me at zygmunt.krynicki@linaro.org