=== someone is now known as Guest80499 [01:56] is something happening now? === jonas42 is now known as Jonas42 [01:57] no, classes start again in about 14 hours [01:59] ok, thanks [01:59] what are you doing here? [02:00] I'm sorry, I'm new here [02:06] hi everybody [03:19] hello [03:47] hi === wraith is now known as Guest94790 === ikonia_ is now known as ikonia === aksyahba is now known as acklee === blaze is now known as Guest51455 === __2hack3r is now known as sayan [15:51] Alright my friends - welcome everybody to the second day of Ubuntu Developer Week! [15:51] I'll just do a very quick introduction, before I vanish off the stage [15:51] Most of you know the organisational bits by now: [15:52] - please make sure you join #ubuntu-classroom-chat as well, so you can ask questions and chat in there [15:52] - if you want to ask questions, please prefix them with QUESTION: [15:52] i.e.: QUESTION: Which instrument does Barry play? [15:52] - also: if you can't make it to a session or missed something: there'll be logs up later on at https://wiki.ubuntu.com/UbuntuDeveloperWeek (logs of yesterday are linked already) [15:54] dholbach, there is still time remaining I guess before I start [15:54] Today we're going to kick off with "Getting started with merging packages from debian" - Bhavani "coolbhavi" Shankar will lead the session and you all have 6 minutes left to relax, grab another coffee/tea/water [15:54] exit [15:54] There is :) [15:54] and please let your friends know about the event in case they're interested :-) [15:54] Enjoy another great day of Ubuntu Developer Week! [15:58] Hey all, before I start the session: I am Bhavani Shankar, an Ubuntu contributor and MOTU, and I'll be showing a simple example using merge-o-matic [16:00] Hey all [16:00] so let's get kicking on this [16:00] We are going to learn how to merge packages [16:00] But, first of all we need to understand what merging is [16:00] Today I'm going to show you how to do a merge using merge-o-matic (MoM), Ubuntu's semi-automatic merging system === ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: Getting started with merging packages from debian - Instructors: coolbhavi [16:01] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session. [16:01] Before we start off, just a basic roundup of the concept(s) involved [16:01] In the first stage of our development cycle we import packages from Debian unstable (sid). (In the case of an LTS we import packages from Debian testing) [16:02] There are two ways to import a package from Debian (one is a sync and the other is a merge) [16:02] Sync is importing the Debian package "as is" without any further changes [16:02] whereas merging is importing the Debian package and introducing/including all the Ubuntu changes.
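As a minimal sketch of my own (not part of the MoM workflow, and assuming python-apt is installed), the reason a new upload is needed at all is that the incoming Debian revision sorts above the previous Ubuntu merge version; the version strings below are the ldaptor ones used later in this session:

    #!/usr/bin/python
    # Minimal sketch: Debian/Ubuntu version ordering with python-apt.
    import apt_pkg

    apt_pkg.init()

    debian = "0.0.43+debian1-5"          # new Debian revision
    ubuntu = "0.0.43+debian1-4ubuntu1"   # previous Ubuntu merge

    if apt_pkg.version_compare(debian, ubuntu) > 0:
        # Sync if the Ubuntu delta is no longer needed, merge otherwise.
        print "Debian %s supersedes Ubuntu %s" % (debian, ubuntu)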
[16:02] But we need to cross-verify whether the Ubuntu changes are still applicable for the present version, or whether the Ubuntu changes have been superseded by Debian, in which case it's a sync [16:03] Now that we know a bit of background, let's move on [16:04] Please enable the universe and main repositories and pull in all the packages essential for an Ubuntu development environment (as mentioned in the Ubuntu packaging guide), which I'm pasting here for your kind reference [16:05] sudo apt-get install --no-install-recommends bzr-builddeb ubuntu-dev-tools fakeroot build-essential gnupg pbuilder debhelper [16:05] (PS: btw, this page provides an overview of the merging workflow https://wiki.ubuntu.com/UbuntuDevelopment/Merging and we are going to see a simple example of it now) [16:06] So assuming you have understood everything so far, let's move on [16:06] First of all we need to check which packages need to be merged. That's where MoM comes into the picture [16:06] MoM is available here: http://merges.ubuntu.com [16:06] First of all we need to create a work directory, I use ~/development, you could use a directory of your own [16:07] From now on we are calling this $WORK_DIR, so please create a working directory of your own. [16:07] now store the path in the WORK_DIR variable by running "export WORK_DIR=/path/to/work/directory" [16:07] for convenience [16:08] which in my case will be "export WORK_DIR=~/development" [16:08] With MoM we use a script called grab-merge.sh which is available in ubuntu-dev-tools, but for convenience I'll download the script here [16:08] Execute this in a terminal [16:08] cd $WORK_DIR ; wget -c http://merges.ubuntu.com/grab-merge.sh; chmod +x grab-merge.sh [16:09] I'll go a bit slow from now on for others to catch up if I'm going fast [16:10] So once that's done we can start the merging process [16:10] Since we are new contributors I'll take a simple example of a universe package merge, which is ldaptor, a pure Python-based LDAP client in short [16:11] See: https://launchpad.net/ubuntu/+source/ldaptor [16:11] btw the complete list of universe package merges is here: [16:11] https://merges.ubuntu.com/universe.html [16:12] Here we find a large list of packages, with their Ubuntu, Debian and base versions; we also see the last uploader, who is the last person who worked on the package, and in some cases the [16:12] uploader of the package (sponsor of the package) [16:13] So if all is fine we are going to work on the ldaptor package now [16:14] so we are going to create an empty directory to work on it and get into it: [16:14] mkdir $WORK_DIR/ldaptor ; cd $WORK_DIR/ldaptor [16:14] now we need to download the Debian and Ubuntu packages to work on them, that's easily done with the script we downloaded earlier: [16:16] Now, assuming everyone has created a directory named ldaptor, we'll execute the grab-merge.sh script, which on my system will be india@ubuntu11:~/development/ldaptor$ ../grab-merge.sh ldaptor [16:17] Now I'll leave some time for the packages to download [16:19] most of the work has already been done by MoM, we only need to work on some fine tuning and the tasks [16:19] which need human intervention [16:20] ok, if everything is already downloaded we can see a file called REPORT, this is the first thing we [16:20] need to look at === Shock is now known as Guest98355 [16:20] In this case there are no conflicts, indicating that it's a pretty simple merge but quite interesting :) [16:21] now we just need to look at the Debian changelog and determine whether the Ubuntu changes are still applicable or not [16:22] for that
do cd ldaptor-0.0.43+debian1-5ubuntu1/debian in my system [16:23] india@ubuntu11:~/development/ldaptor$ cd ldaptor-0.0.43+debian1-5ubuntu1/debian/ === vivek is now known as Guest37421 [16:23] now once you are in the /debian directory [16:24] type in this command dch -e to edit the debian/changelog in your favourite editor [16:24] I use good old nano :) [16:26] you'll find lines like these at the start ldaptor (0.0.43+debian1-5ubuntu1) oneiric; urgency=low [16:26] * Merge from debian unstable. Remaining changes: [16:26] - SUMMARISE HERE [16:27] with the Debian package changelog and the previous Ubuntu package changelog merged together :) [16:28] Now if you take a look at the previous Ubuntu-specific changelog you'll find this: [16:28] ldaptor (0.0.43+debian1-4ubuntu1) oneiric; urgency=low [16:28] * Merge from debian unstable. Remaining change: [16:28] - Remove empty POT files. Fixes FTBFS caused by pkgstriptranslations. [16:31] which is pretty interesting, as the package fails to build on the official Ubuntu buildds due to a package which strips translations, and if the po or pot files are empty it causes a build failure [16:32] without this change the package would have been imported as-is, giving rise to a sync :) [16:33] so now we need to update the changelog for the latest Ubuntu version of the package we are working on, which on my system now will be [16:33] ldaptor (0.0.43+debian1-5ubuntu1) oneiric; urgency=low [16:33] * Merge from debian unstable. Remaining changes: [16:33] - Remove empty POT files. Fixes FTBFS caused by pkgstriptranslations. [16:35] Please take note of the spacing and the format of the changelog, as it has to be machine-parseable :) [16:35] Now save the changes in your favourite editor [16:36] and run the following command debuild -S [16:36] So that builds the .dsc file and generates the .changes file. Now you should test-build the .dsc in a pbuilder or sbuild to check whether the package builds correctly and generates the .deb file [16:37] (Note: this step is very important for ensuring quality work and quick sponsoring :) ) [16:39] so after building the .dsc please test it in a pbuilder; issue the following command: sudo pbuilder build ldaptor_0.0.43+debian1-5ubuntu1.dsc [16:40] QUESTION: make: dh: Command not found make: *** [clean] Error 127 [16:41] saimanoj79, make sure you have correctly installed all the packages required for Ubuntu development as mentioned above [16:41] debsign: gpg error occurred! Aborting....debuild: fatal error at line 1256:running debsign failed is it work? [16:43] vanderson, please create a GPG key too, because it's required to sign the built .dsc and .changes with your GPG key https://wiki.ubuntu.com/GnuPrivacyGuardHowto should help you [16:43] Once the package builds correctly, generate the debdiff between the current debian_version.dsc and the current ubuntu_version.dsc and attach it to a bug opened as defined in the merging workflow in this link: https://wiki.ubuntu.com/UbuntuDevelopment/Merging [16:45] so in my system I would do: india@ubuntu11:~/development/ldaptor$ debdiff ldaptor_0.0.43+debian1-5.dsc ldaptor_0.0.43+debian1-5ubuntu1.dsc > ldaptor.diff [16:45] and attach ldaptor.diff as a patch to the bug I created as per the merge workflow [16:47] and last but not least, subscribe the ubuntu-sponsors team to your merge request bug for feedback and uploading of your change :) [16:50] so this was a session on how to get started merging packages from Debian.
This, I believe, can get you started with merging packages. The above example was a simple one, and different packages have different sorts of conflicts which need to be handled diligently :) [16:51] There are 10 minutes remaining in the current session. [16:51] and we need to look at the importance of the Debian packaging changes too while merging a package from Debian [16:51] coolbhavi, QUESTION: as a side note, would you mind explaining what a fake sync is and why do we need that? [16:53] and`, a fake sync arises due to mismatched orig tarballs in Debian and Ubuntu, so the package can't be synced directly from Debian, in short [16:55] and if you get stuck anywhere in the Ubuntu development sphere please feel free to ping us in #ubuntu-motu or #ubuntu-devel [16:55] we are always there to help you :) [16:55] There are 5 minutes remaining in the current session. [16:56] QUESTION: in which directory did you issue the pbuilder command? I'm getting ... is not a valid .dsc file name [16:56] mjaga, it's in the ldaptor directory that we created [16:57] QUESTION: how to apply that patch (.diff) to the ubuntu package [16:58] takdir, we apply the generated diff to the Debian package in the case of a merge [16:59] so if anyone is on facebook you can catch up with me at facebook.com/bshankar :) [17:00] that's all from my side now :) [17:00] thanks all for turning up for this session === ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: Porting from pygtk to gobject introspection - Instructors: pitti [17:01] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session. [17:01] thanks coolbhavi! [17:01] Hello everyone! I am Martin Pitt from the Canonical Ubuntu Desktop Team. [17:02] just a note, if you were in this talk at the last UDW half a year ago, it'll be pretty much the same [17:02] just to get an impression of how many folks are listening, can you raise hands (o/) or say hello in #chat? [17:03] nice :) === Guest24161 is now known as vistaero [17:03] so, let's python! [17:03] Python is a very important and popular language in Ubuntu, we have a lot of applications written in Python for GTK and Qt/KDE. Most prominent examples are our installer Ubiquity, Software Center, our driver installer "Jockey", and our bug/crash reporting system "Apport" (shameless plug!). [17:04] By way of Quickly we also encourage application developers to use Python and GTK, as these allow you to write GUI applications conveniently and fast, and still rather robustly. [17:04] Until recently, the package of choice for that has been PyGTK, a manually maintained Python binding for GTK, ATK, Pango, Glade, and a few other things. However, a few months ago, with the advent of GTK3, PyGTK was declared dead, so it's time to bring the banner of the great new world of its successor -- gobject-introspection -- to the world! [17:04] I'll concentrate on the app developer side, i. e. how to use GI typelibs in Python, but will nevertheless give a quick overview of gobject-introspection. [17:05] Porting existing PyGTK2 code is a topic that has kept, and will still keep, many of us busy for some months, so I'll explain the process and common pitfalls with that. [17:05] Finally I'll give some pointers to documentation, and will be available for some Q&A.
[17:05] Everyone ready to dive in? Please let me know (here or in #-chat) when I become too fast. If I am being totally unclear, please yell and I'll handle that immediately. If you just have a followup question, let's handle these at the end. [17:06] == Quick recap: What is GI? == [17:06] So far a lot of energy was spent to write so-called "bindings", i. e. glue code which exposes an existing API such as GTK for a target language: PyGTK, libnotify-cil, or Vala's .vapi files. [17:06] This leads both to a combinatorial explosion (libraries times languages) and to many bindings which don't exist at all or are of low quality. In addition it is also an almost insurmountable barrier for introducing new languages, as they would need a lot of bindings before they become useful. [17:07] GI is a set of tools to generate a machine-parseable and complete API/ABI description of a library, and a library (libgirepository) which can then be used by a language compiler/interpreter to automatically provide a binding for this library. [17:07] With GI you can then use every library which ships a typelib in every language which has GI support. [17:07] GI ABI/API descriptions come in two forms: [17:07] * An XML file, called the "GIR" (GI repository). These are mainly interesting for developers if they need to look up a particular detail of e. g. a method argument or an enum value. These are not actually used at runtime (as XML would be too costly to interpret every time), and thus they are shipped in the library's -dev package in Ubuntu and Debian. For example, libgtk2.0-dev ships [17:07] /usr/share/gir-1.0/Gdk-2.0.gir. [17:07] * A compiled binary form for efficient access, called the "typelib". These are the files that language bindings actually use. Ubuntu/Debian ship them in separate packages named gir<format>-<library>-<version>; for example, gir1.2-gtk-2.0 ships /usr/lib/girepository-1.0/Gdk-2.0.typelib. [17:08] (Yes, it's confusing that the gir1.2-* package does _not_ actually ship the .gir file; don't ask me why they were named "gir-*", not "typelib-*"). [17:08] == How does it work in Python? == [17:08] pygobject is the piece of software which provides Python access to GI (amongst other things, like providing the glib and GObject bindings). The package name in Ubuntu/Debian is "python-gobject", and it should already be installed on all but the most manually trimmed down installations. [17:09] Initial GI support was added to pygobject in version 2.19.0 (August 2009), but the entire GI/pygobject/annotations stack really only stabilized in the last couple of months, so that in practice you will need at least pygobject 2.28 and the corresponding latest upstream releases of GTK and other libraries you want to use. [17:09] This means that you can only really use this with the latest release of distros, i. e. Ubuntu 11.04 (Natty) or Debian testing. [17:10] (some time to catch up, will slow down a bit as per #chat) [17:11] pygobject provides a "gi.repository" module namespace which generates virtual Python modules from installed typelibs on the fly. [17:11] For example, if you install gir1.2-gtk-2.0 (it's already installed by default in Ubuntu 11.04), you can do: [17:12] $ python -c 'from gi.repository import Gtk; print Gtk' [17:12] [17:12] and use it just like any other Python module.
[17:12] I bet that this first example comes as an absolutely unexpected surprise to you: [17:12] $ python -c 'from gi.repository import Gtk; Gtk.MessageDialog(None, 0, Gtk.MessageType.INFO, Gtk.ButtonsType.CLOSE, "Hello World").run()' [17:12] * pitti gives everyone a couple of seconds to copy&paste&run that and be shocked in awe [17:14] working? [17:14] Let's look at the corresponding C code: [17:14] GtkWidget* gtk_message_dialog_new (GtkWindow *parent, GtkDialogFlags flags, GtkMessageType type, GtkButtonsType buttons, const gchar *message_format, ...); === Shock is now known as Guest57437 [17:15] and the C call: [17:15] GtkWidget* msg = gtk_message_dialog_new (NULL, 0, GTK_MESSAGE_INFO, GTK_BUTTONS_CLOSE, "Hello World"); [17:15] gtk_dialog_run (GTK_DIALOG (msg)); [17:15] So what do we see here? [17:15] (1) The C API by and large remains valid in Python (and other languages using the GI bindings), in particular the structure, order, and data types of arguments. There are a few exceptions which are mostly due to the different way Python works, and in some cases to make it easier to write code in Python. [17:15] I'll speak about details below. But this means that you can (and should) use the normal API documentation for the C API of the library. devhelp is your friend! [17:16] (2) As Python is a proper object oriented language, pygobject (and in fact the GI typelib already) exposes a GObject API as proper classes, objects, methods, and attributes. I. e. in Python you write [17:16] b = Gtk.Button(...) [17:16] b.set_label("foo") [17:16] instead of the C gobject syntax [17:16] GtkWidget* b = gtk_button_new(...); [17:16] gtk_button_set_label(b, "foo"); [17:17] The class names in the typelib (and thus in Python) are derived from the actual class names stated in the C library (like "GtkButton"), except that the common namespace prefix ("Gtk" here) is stripped, as it becomes the name of the module. [17:17] (3) Global constants would be a heavy namespace clutter in Python, and thus pygobject exposes them in a namespaced fashion as well. [17:17] I. e. if the MessageDialog constructor expects a constant of type "GtkMessageType", then by the above namespace split this becomes a Python class "Gtk.MessageType" with the individual constants as attributes, e. g. Gtk.MessageType.INFO. [17:18] (4) Data types are converted in a rather obvious fashion. E. g. when the C API expects an int* array pointer, you can supply a normal Python array [0, 1, 2]. A Python string "foo" will match a gchar*, Python's None matches NULL, etc. [17:18] So the GObject API actually translates quite naturally into a real OO language like Python, and after some time of getting used to the above transformation rules, you should have no trouble translating the C API documentation into its Python equivalents. [17:18] When in doubt, you can always look for the precise names, data types, etc. in the .gir instead, which shows the API broken down by class, method, enum, etc., with the exact names and namespaces as they are exposed in Python. [17:19] There is also some effort to turn .girs into actual HTML documentation/devhelp, which will make development a lot nicer [17:19] but I'm afraid it's not there yet, so for now you need to use the C API documentation and the .gir files [17:20] As I mentioned above, this is in no way restricted to GTK, GNOME, or UI. For example, if you handle any kind of hardware and hotplugging, you almost certainly want to query udev, which provides a nice glib integration (with signals) through the gudev library.
[17:20] This example lists all block devices (i. e. hard drives, USB sticks, etc.): [17:21] (You need to install the gir1.2-gudev-1.0 package for this) [17:21] $ python [17:21] >>> from gi.repository import GUdev [17:21] >>> c = GUdev.Client() [17:21] >>> for dev in c.query_by_subsystem("block"): [17:21] ... print dev.get_device_file() [17:21] ... [17:21] /dev/sda [17:21] /dev/sda1 [17:21] /dev/sda2 [17:21] [...] [17:21] See http://www.kernel.org/pub/linux/utils/kernel/hotplug/gudev/GUdevClient.html#g-udev-client-query-by-subsystem for the corresponding C API. [17:21] or /usr/share/gir-1.0/GUdev-1.0.gir for the proper class/method OO API [17:22] GI is not even restricted to GObject, you can annotate any non-OO function based API with it. E. g. there is already a /usr/share/gir-1.0/xlib-2.0.gir (although it's horribly incomplete). These will behave as normal functions in Python (or other languages) as well. [17:22] == Other API differences == [17:22] I said above in (1) that the structure of method arguments is by and large the same in C and in GI/Python. There are some notable exceptions which you must be aware of. [17:23] === Constructors === [17:23] The biggest one is constructors. There is actually two ways of calling one: [17:23] * Use the real constructor implementation from the library. Unlike in normal Python you need to explicitly specify the constructor name: [17:23] Gtk.Button.new() [17:23] Gtk.Button.new_with_label("foo") [17:23] * Use the standard GObject constructor and pass in the initial property values as named arguments: [17:23] Gtk.Button(label="foo", use_underline=True) [17:23] The second is actually the recommended one, as it makes the meaning of the arguments more explicit, and also underlines the GObject best practice that a constructor should do nothing more than to initialize properties. But otherwise it's pretty much a matter of taste which one you use. [17:24] === Passing arrays === [17:24] Unlike C, higher level languages know how long an array is, while in the C API you need to specify that explicitly, either by terminating them with NULL or explicitly giving the length of the array in a separate argument. [17:25] Which one is used is already specified in the annotations and thus in the typelib, so Python can automatically provide the right format without the developer needing to append an extra "None" or a separate len(my_array) argument. [17:25] For example, in C you have [17:25] gtk_icon_theme_set_search_path (GtkIconTheme *icon_theme, const gchar *path[], gint n_elements) [17:25] (where you pass an array and an explicit length) [17:25] In Python you can just call this as [17:25] my_icon_theme.set_search_path(['/foo', '/bar']) [17:26] and don't need to worry about the array size. [17:26] === Output arguments === [17:26] C functions can't return more than one argument, so they often use pointers which the function then fills out. [17:26] Conversely, Python doesn't know about pointers, but can easily return more than one value as a tuple. [17:27] The annotations already describe which arguments are "out" arguments, so in Python they become part of the return tuple: [17:27] first one is the "real" return value, and then all out arguments in the same order as they appear in the declaration. 
[17:27] For example: [17:27] GdkWindow* gdk_window_get_pointer (GdkWindow *window, gint *x, gint *y, GdkModifierType *mask) [17:27] In C you declare variables for x, y, mask, and pass pointers to them as arguments [17:27] In Python you would call this like [17:28] (ptr_window, x, y, mask) = mywindow.get_pointer() [17:28] === Non-introspectable functions/methods === [17:28] When you work with PyGI for a longer time, you'll inevitably stumble over a method that simply doesn't exist in the bindings. [17:28] These usually are marked with introspectable="0" in the GIR. [17:29] In the best case this is because there are some missing annotations in the library which don't have a safe default, so GI disables these to prevent crashes. They usually come along with a corresponding warning message from g-ir-scanner, and it's usually quite easy to fix these. [17:29] in popular libraries like GTK 3, pretty much all of them are fixed now, but in less common libraries there's probably still a ton of them [17:30] Another common case is functions which take a variable number of arguments, such as gtk_cell_area_add_with_properties(). [17:30] Varargs cannot be handled safely by libgirepository. [17:31] In these cases there are often alternatives available (such as gtk_cell_area_cell_set_property()). For other cases libraries now often have a ..._v() counterpart which takes a list instead of variable arguments. [17:31] == Migrating pygtk2 code == [17:32] (there are two more common differences: overrides and GDestroyNotify, but they are documented on a wiki page, no need to bore you with them right now) [17:32] A big task that we in Ubuntu already started in the Natty cycle, and which will continue to keep us and all other PyGTK app developers busy for a while, is to port PyGTK2 applications to GTK3 and PyGI. [17:33] Note that this is really two migrations in one step, but doing both at once is recommended, as GTK2 still has a lot of breakage with PyGI, although I did a fair amount of work to backport fixes from GTK3 (the six applications that we ported in Natty run with PyGI and GTK2, after all). [17:33] The GTK2 → GTK3 specifics are documented at http://developer.gnome.org/gtk3/stable/gtk-migrating-2-to-3.html and I don't want to cover them here. [17:33] === Step 1: The Great Renaming === [17:34] The biggest part in terms of volume of code changed is basically just a renaming exercise. [17:34] E. g. "gtk.*" now becomes "Gtk.*", and "gtk.MESSAGE_INFO" becomes "Gtk.MessageType.INFO". [17:34] Likewise, the imports need to be updated: "import gtk" becomes "from gi.repository import Gtk". [17:34] Fortunately this is a mechanical task which can be automated. [17:34] The pygobject git tree has a script "pygi-convert.sh" which is a long list of perl -pe 's/old/new/' string replacements. You can get it from http://git.gnome.org/browse/pygobject/tree/pygi-convert.sh. [17:35] It's really blunt, but surprisingly effective, and for small applications chances are that it will already produce something which actually runs. [17:35] Note that this script is in no way finished, and should be considered a collaborative effort amongst porters. So if you have something which should be added there, please don't hesitate to open a bug or ping me or someone else on IRC (see below). We pygobject devs will be happy to improve the script. === Ursinha is now known as Ursinha-lunch [17:35] When you just run pygi-convert.sh in your project tree, it will work on all *.py files.
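To make the renaming concrete, here is a hypothetical before/after sketch of the kind of change pygi-convert.sh automates, reusing the MessageDialog call from earlier. The two variants have to live in separate programs; mixing the old and new imports in one process is a problem, as explained below.

    # before.py -- PyGTK2 (static bindings)
    import gtk

    dialog = gtk.MessageDialog(None, 0, gtk.MESSAGE_INFO,
                               gtk.BUTTONS_CLOSE, "Hello World")
    dialog.run()

    # after.py -- PyGI (gobject-introspection)
    from gi.repository import Gtk

    dialog = Gtk.MessageDialog(None, 0, Gtk.MessageType.INFO,
                               Gtk.ButtonsType.CLOSE, "Hello World")
    dialog.run()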
If you have other Python code there which is named differently (such as bin/myprogram), you should run the script once more with all these file names as arguments. [17:36] === Step 2: Wash, rinse, repeat === [17:36] Once the mechanical renamings are out of the way, the tedious and laborious part starts. [17:36] As Python does not have a concept of "compile-time check" and can't even check that called methods exist or that you pass the right number of parameters, you now have to enter a loop of "start your program", "click around until it breaks", "fix it", "goto 1". [17:36] The necessary changes here are really hard to generalize, as they highly depend on what your program actually does, and this will also involve the GTK 2 → 3 parts. [17:37] One thing that comes up a lot is pack_start()/pack_end() calls. In PyGTK they have default values for "expand", "fill", and "padding", but as GTK does not have them, you won't have them in PyGI either. [17:37] There even was a patch once for providing an override for them, but it was rejected as it would cement the API incompatibility. [17:38] and upstream decided (rightfully IMHO) that staying close to the original API is better than staying compatible with pygtk's quirks [17:38] One thing you need to be aware of is that you can't do a migration halfway: [17:38] If you try to import both "gtk" and "gi.repository.Gtk", hell will break loose and you'll get nothing but program hangs and crashes, as you are trying to work with the same library in two different ways. [17:38] you have to be especially careful if you import other libraries which import gtk by themselves, so it might not actually be immediately obvious that this happens [17:39] You can mix static and GI bindings of _different_ libraries, such as using dbus-python and GTK-GI. [17:40] === Step 3: Packaging changes === [17:40] After you have your code running with PyGI and committed it to your branch and released it, you need to update the dependencies of your distro package for PyGI. [17:40] You should grep your code for "gi.repository" and collect a list of all imported typelibs, and then translate them into the appropriate package names. [17:40] For example, if you import "Gtk, Notify, GUdev" you need to add package dependencies on gir1.2-gtk-3.0, gir1.2-notify-0.7, and gir1.2-gudev-1.0 on Debian/Ubuntu [17:40] I have no idea about other distros, so the package names will differ, but the concept is the same [17:41] At the same time you should drop dependencies on the old static bindings, like python-gtk2, python-notify, etc. [17:41] Finally you should also bump the version of the python-gobject dependency to (>= 2.28) to ensure that you run with a reasonably bug-free PyGI. [17:41] == RTFM & Links == [17:41] I'd like to give a list of useful links for this topic here.
[17:41] This has a good general overview about GI's architecture, annotations, etc: [17:41] https://live.gnome.org/GObjectIntrospection [17:42] By and large the contents of this talk from previous UDW, massaged to be a proper wiki page: [17:42] https://live.gnome.org/PyGObject/IntrospectionPorting [17:42] The interview with Jon Palmieri and Tomeu Vizoso is also an interesting read about its state: [17:42] http://www.gnomejournal.org/article/118/pygtk-gobject-and-gnome-3 [17:42] The GI/PyGI developers hang out on IRC here: [17:42] #introspection / #python on irc.gnome.org [17:42] pygobject's git tree has a very comprehensive demo showing off pretty much all available GTK widgets in PyGI: [17:42] http://git.gnome.org/browse/pygobject/tree/demos/gtk-demo [17:42] Description of the Python overrides for much easier GVariant and GDBus support [17:42] http://www.piware.de/2011/01/na-zdravi-pygi/ [17:42] Examples of previously done pygtk → pyGI ports: [17:42] Apport: http://bazaar.launchpad.net/~apport-hackers/apport/trunk/revision/1801 [17:42] Jockey: http://bazaar.launchpad.net/~jockey-hackers/jockey/trunk/revision/679 [17:42] gtimelog: http://bazaar.launchpad.net/~pitti/gtimelog/pygi/revision/181 [17:42] system-config-printer (work in progress): http://git.fedorahosted.org/git/?p=system-config-printer.git;a=shortlog;h=refs/heads/pygi [17:43] The gtimelog one is interesting because it makes the code work with *both* PyGTK and PyGI, whichever is available. [17:43] == Q & A == [17:43] Thanks everyone for your attention! I'm happy to answer questions now. [17:44] num asked: Im sorry if I missed something but what are those gir files? [17:44] num: so, the .gir file is an XML text format which describes the API of a library [17:45] it contains everything which a C header (*.h) file contains, but goes way beyond that [17:45] for example, it also documents the lifetime, parameter direction, the position of array length parameters, or who owns the object that a method returns [17:45] this (well, in its binary typelib incarnation) is what the language bindings use to use the library [17:46] just open usr/share/gir-1.0/Gtk-2.0.gir and have a look [17:46] john_g asked: Can you say more about the window sizing changes? [17:47] this is actually on the side of gtk 2 -> 3, which indeed changed this [17:47] there is no difference at all if you move from pygtk2 to PyGI with GTK2 [17:48] most prominent change here is the different expand/fill default, which often makes GTK3 apps look very huge until they get fixed [17:48] http://developer.gnome.org/gtk3/stable/gtk-migrating-2-to-3.html has more details about this [17:48] bj0 asked: is there an example or howto for adding GI/PyGI support to a relatively simple library? Is writing a .gir all that is needed? [17:48] ah, I didn't cover that part, only from the POV of the "user" (python developer) [17:48] it's actually easier [17:49] you don't write the .gir, it's generated from the GI tools [17:49] it scans the header and .C files and gets all the classes, methods, docstrings, parameter names etc. from it [17:49] what you need to do in addition is to add extra magic docstring comments to do the "annotations" [17:49] i. e. if you have a method [17:50] GtkButton* foo(GtkWindow *window) [17:50] you need to say who will own the returned button -- the caller (you) or the foo method [17:50] this will tell Python whether it needs to free the object, etc. 
[17:50] https://live.gnome.org/GObjectIntrospection/Annotations explains that [17:50] let me dig out gudev, as this is much smaller than GTK [17:51] There are 10 minutes remaining in the current session. [17:51] but the nice thing is that most of these are already defined in gtk-doc, too [17:52] i. e. the things that python needs to know are also things you as a programmer need to know :) [17:53] http://git.kernel.org/?p=linux/hotplug/udev.git;a=blob;f=extras/gudev/gudevclient.c;h=97b951adcd421e559c4a2d7b3b822eb95dd01f1d;hb=HEAD#l336 [17:53] check this out [17:53] /** [17:53] * g_udev_client_query_by_subsystem: [17:53] standard docstring [17:53] * @subsystem: (allow-none): The subsystem to get devices for or %NULL to get all devices. [17:53] the "(allow-none)" is an annotation [17:54] and tells python (or you) that you can pass "NULL" for this [17:54] * Returns: (element-type GUdevDevice) (transfer full): A list of #GUdevDevice objects. The caller should free the result by using g_object_unref() on each element in the list and then g_list_free() on the list. [17:54] the element-type tells the bindings about the type of the elements in the returned GList* [17:54] and the (transfer full) says that the object will be owned by the caller [17:54] and so on [17:55] so in summary, all you need to do is to annotate parameters properly, then the GI tools will produce a working gir/typelib [17:55] time for one more question [17:55] seems not; then thanks again everyone! [17:55] There are 5 minutes remaining in the current session. === ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: Working with bugs reported by apport - Instructors: bdmurray [18:01] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session. === someone is now known as Guest30183 [18:02] Hi, I'm Brian Murray and I work for Canonical as a defect analyst. Part of being a defect analyst is working with bug reports in Launchpad. [18:02] One of the ways, and actually the preferred way, that bugs are reported to Launchpad is via apport. [18:03] Apport is an automated system for reporting problems regarding Ubuntu packages. [18:03] These problems include crashes and package installation failures. [18:04] As these reports are consistently formatted, it is also possible to work with them using automated tools. This is what I am going to talk about today. [18:04] Let's look at a bug reported by apport together - http://launchpad.net/bugs/807715. [18:05] We can tell this bug was one reported by apport because of the keys and values in the description, for example ProblemType: Crash, and the tag apport-crash. [18:05] Both of these indicate that we are looking at a crash reported by apport. The actual crash appears in Traceback.txt. [18:06] The Traceback.txt indicates it is an error importing a Python module, and since update-manager is installed on a lot of systems this bug is likely to receive a lot of duplicates [18:06] We can see the bug already has 10 of them. [18:06] Because we already know the cause of the bug and how to fix it, but a fix isn't available yet, it'd be a good idea to prevent this bug from being reported any more. [18:07] Apport provides us with a system for doing just that - bug patterns.
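As an aside, because the reports are consistently formatted they are also easy to read programmatically. A minimal sketch of my own (not something shown in the session), assuming apport is installed and a crash file exists locally; the path and keys other than ProblemType are illustrative examples:

    import apport

    report = apport.Report()
    # Load a locally captured crash report (hypothetical example path).
    with open("/var/crash/_usr_bin_update-manager.1000.crash", "rb") as f:
        report.load(f)

    print report["ProblemType"]              # e.g. "Crash"
    print report.get("Package")              # package name and version
    print report.get("Traceback", "")[:200]  # start of the Python traceback, if any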
[18:08] Are there any questions so far? [18:09] So the branch containing bug patterns can be found at https://code.launchpad.net/~ubuntu-bugcontrol/apport/ubuntu-bugpatterns/. [18:09] When apport prepares to report a bug about a package it first checks to see if the bug it will report matches a bug pattern. [18:10] question from andyrock: how does LP detect bug duplicates? [18:11] Duplicate detection is actually done by the apport retracer: it builds a crash signature based on the traceback or stack trace and then marks bugs as duplicates of one another. For an example look at one of the duplicates of bug 807715. [18:12] If the bug matches a pattern the reporter will be directed to an existing bug report or a web page instead of going through the bug reporting process. [18:12] The pattern is an XML file that contains pattern matches, using regular expressions, for apport bug keys. [18:12] To write one for bug 807715 I used (http://bazaar.launchpad.net/~ubuntu-bugcontrol/apport/ubuntu-bugpatterns/revision/244) the following: [18:12] [18:13] This is the url future bug reporters will be directed to. [18:13] You could even send them to a wiki page like https://wiki.ubuntu.com/Bugs/InitramfsLiveMedia. [18:13] At this page I've documented the issue people have encountered and provided steps for resolving the issue, rather than directing them to a bug report, as that may be harder to parse. [18:14] Carrying on with the example for bug 807715. We want the pattern to match a specific package: [18:14] ^update-manager [18:15] Then we have the unique error that we've encountered. [18:15] ImportError: cannot import name GConf [18:15] Patterns can even be used to search attachments to the bug report like Traceback.txt, as I've done, or log files like DpkgTerminalLog.txt. [18:15] Anybody from the Ubuntu Bug Control team can commit a bug pattern and I'll happily review any merge proposals. [18:17] The bugpatterns.xml file (http://bazaar.launchpad.net/~ubuntu-bugcontrol/apport/ubuntu-bugpatterns/view/head:/bugpatterns.xml) contains lots of examples to help you get started. [18:18] Are there any questions regarding the format of bugpatterns or how they work? [18:18] < andyrock> Question: does `re` mean `regular expression`? [18:18] yes, that is correct [18:19] for example here http://bazaar.launchpad.net/~ubuntu-bugcontrol/apport/ubuntu-bugpatterns/view/head:/bugpatterns.xml#L67 we can see we are matching either of two specific versions of apport [18:19] 1.20.1-0ubuntu[4|5] [18:19] Any more questions? [18:20] Included in the bugpatterns bzr branch are some tools for testing and working with bugpatterns [18:21] Once you've written a bug pattern you can test it with the test-local script e.g. ./test-local 807715. [18:21] It will download the details from the bug report and reconstruct an apport crash file. [18:21] Then the pattern matching function will be run against that crash file. I usually do this with one or two bug reports to make sure they match my pattern before I commit it. [18:22] The bug patterns are auto-synced to http://people.canonical.com/~ubuntu-archive/bugpatterns/ which is where apport looks for them. [18:23] In addition to the test-local script there is a utility called search-bugs which takes a package name and tags as arguments e.g. (search-bugs --package update-manager --tags apport-crash). [18:23] This is a great way to make sure your pattern isn't catching the wrong bug reports.
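As a small aside of my own (not part of Brian's walkthrough): the expressions in bugpatterns.xml are ordinary Python regular expressions, so you can poke at one directly in the interpreter, for example the version pattern quoted above:

    import re

    # The apport version pattern quoted above from bugpatterns.xml.
    pattern = re.compile("1.20.1-0ubuntu[4|5]")

    for version in ("1.20.1-0ubuntu4", "1.20.1-0ubuntu5", "1.20.1-0ubuntu6"):
        print version, bool(pattern.search(version))
    # -> True, True, False

An over-broad expression would match reports you never intended to catch, which is exactly what the search-bugs check above guards against.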
[18:23] Such testing is really important, as a poorly written bug pattern could end up blocking all apport bug reports about a package, or even all of Ubuntu! [18:24] Additionally, while the apport retracer will automatically mark crash reports as duplicates, it does not currently do this with package installation failures. [18:24] So if you write a bug pattern regarding a package installation failure you can use search-bugs with the '-C' switch to consolidate all the existing bug reports into the one you've identified as the master bug report. [18:25] By the way, package installation failures are identifiable by "ProblemType: Package" and the tag apport-package. [18:25] search-bugs will add a comment to each bug matching the pattern and mark it as a duplicate of the bug identified in the pattern url. [18:26] An overview of writing bug patterns can be found at https://wiki.ubuntu.com/Apport/DeveloperHowTo#Bug_patterns. Are there any questions regarding how to write bug patterns or use the tools in the bug patterns branch? [18:28] Another thing worth noting is that the apport retracer also helps identify bug reports that would benefit from a bug pattern by tagging them 'bugpattern-needed'. [18:28] It does this for crash reports with more than 10 duplicates. [18:30] So we've talked about how we can deal with bugs reported by apport once they've arrived in Launchpad, but what can we do to add information to bugs reported via apport? [18:31] Apport supports package hooks - meaning that it looks in the directory /usr/share/apport/package-hooks/ for a file matching the name of the binary or the source package and adds the information in the package hook to the bug report. [18:32] For example if we look at /usr/share/apport/package-hooks/source_update-manager.py which is also viewable at http://bazaar.launchpad.net/~ubuntu-core-dev/update-manager/main/view/head:/debian/source_update-manager.py. [18:32] We can see that we add the file "/var/log/apt/history.log" to the bug report and name the attachment "DpkgHistoryLog.txt". [18:33] So now if update-manager crashes or a user reports a bug via 'ubuntu-bug update-manager' we will have an attachment named DpkgHistoryLog.txt added to the bug report. [18:34] This is helpful in debugging the issue with update-manager as it provides us with the context for the operation being performed. [18:35] The Stable Release Updates team is also happy to approve SRUs that add apport package hooks to a package. [18:36] Back to the update-manager package hook - it is utilizing some convenience functions provided by apport. [18:36] For example attach_gconf adds the non-default gconf values for update-manager. [18:37] We can review the convenience functions at http://bazaar.launchpad.net/~apport-hackers/apport/trunk/view/head:/apport/hookutils.py. [18:39] Useful ones include attach_hardware, command_output, recent_syslog and attach_network. [18:39] It is also possible to communicate with the bug reporter in a source package hook by asking questions or displaying messages. [18:40] It is even possible to prevent the reporting of bugs - before a bug pattern is ever searched for. [18:40] I've done both of these things, asking the reporter a question and preventing certain bugs from being reported, in the ubiquity package hook.
[18:41] You can see it at http://bazaar.launchpad.net/~ubuntu-core-dev/ubuntu/oneiric/apport/ubuntu/view/head:/data/package-hooks/source_ubiquity.py [18:42] http://bazaar.launchpad.net/~ubuntu-core-dev/ubuntu/oneiric/apport/ubuntu/view/head:/data/package-hooks/source_ubiquity.py#L26 [18:42] Here we examine the syslog file from when the installation was run for SQUASHFS errors [18:43] In the event that any are found an information dialog is presented to the reporter explaining the issue and informing them about things they can do to resolve the issue. [18:45] http://bazaar.launchpad.net/~ubuntu-core-dev/ubuntu/oneiric/apport/ubuntu/view/head:/data/package-hooks/source_ubiquity.py#L31 [18:45] Here a Yes / No dialog box is presented to the bug reporter asking them if they want to include their debug log file in the report. [18:47] Most ubuntu systems will have lots of package hooks installed on them in /usr/share/apport/package-hooks and there are some great examples in there of things you can do. [18:47] The xorg package hook is pretty complicated but gathers lots of information. [18:49] The grub2 package hook actually reviews a configuration file for errors which is quite handy. [18:50] Are there any questions regarding the material covered so far? [18:51] There are 10 minutes remaining in the current session. [18:52] Okay well that's all I have [18:53] I hope you understand how you can better work with bugs reported by apport about your package by using bug patterns and how to make those bugs reported by apport even more informative. [18:53] Thanks for your time! [18:55] There are 5 minutes remaining in the current session. === ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: Fixing obvious bugs in Launchpad - Instructors: deryck [19:02] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session. [19:02] Ok, so I guess that's me. Hi, all. :) [19:02] My name is Deryck Hodge. I'm a team lead for one of the dev teams working on Launchpad. [19:02] This session will be about fixing obvious bugs in Launchpad. [19:02] Feel free to ask questions as we go. [19:02] Launchpad is a large code base with a complex deployment story, so one way into hacking on Launchpad is to start by fixing obvious or easy bugs. [19:02] From there you can decide if you want to go deeper or spend more time working on Launchpad. [19:03] Working on lp is great by itself, but also a great way to support Ubuntu development. [19:03] The goals for this session then are to: [19:03] * show you how to setup a Launchpad dev environment [19:03] * give you a tour of the Launchpad codebase [19:03] * teach you how to find easy bugs [19:03] * demonstrate how to approach fixing a bug [19:04] You can try to follow along if you like, but I'm not assuming that, since certain steps like branching can take time.... [19:04] I'll just outline and demo here. [19:04] So, let's start with getting a Launchpad dev environment setup. [19:05] Launchpad uses a set of scripts we refer to as "rocketfuel" to manage our dev environment. These live in the Launchpad devel tree and all begin with "rocketfuel-" names. [19:05] So rocketfuel-setup would be the script you would run to build a dev environment. [19:05] A word of warning.... 
[19:06] This script appends to your /etc/hosts file, adds ppa sources for launchpad development, and adds local apache configs. [19:06] If that scares you, don't run the script. If it doesn't, then you can get a working environment by downloading the script and running it, like: [19:06] * bzr --no-plugins cat http://bazaar.launchpad.net/~launchpad-pqm/launchpad/devel/utilities/rocketfuel-setup > rocketfuel-setup [19:06] * ./rocketfuel-setup [19:07] (I'm pasting from notes, so the "*" has no meaning, it's just my bullet points I copied by accident.) [19:08] The above will install into $HOME/launchpad by default. But it does look for some env variables if you want to install in non-standard places. [19:08] I change things myself, so my $HOME/.rocketfuel-env.sh looks like: [19:08] http://pastebin.ubuntu.com/642768/ [19:09] If you create this file before running rocketfuel-setup, you'll get things in the places you want them. Or just go with the defaults. :) [19:10] If you want to do each step of rocketfuel-setup by hand rather than run the script, see the full instructions at: [19:10] https://dev.launchpad.net/Getting [19:10] Under the "Do it yourself" section [19:11] If you want to run rocketfuel-setup for convenience but want it isolated a bit, you can run launchpad in a vm, or use an LXC container or a chroot. [19:11] For more on that, see: [19:11] https://dev.launchpad.net/Running/LXC or https://dev.launchpad.net/Running/Schroot [19:12] The point of all of the above is to get you familiar with getting setup and point you to sources of more info. [19:12] You don't have to do this now to continue following along... but feel free, of course. [19:12] Okay, so after you do all that you should be able to browse the local tree. If you're using the default location you can see it at $HOME/launchpad/lp-branches/devel. [19:13] Now, you need a working database setup. [19:13] This is as simple as changing into the lp tree and running: [19:14] ./utilities/launchpad-database-setup $USER && make schema [19:14] where $USER is whatever local user you want to access the db with. [19:14] A warning again..... [19:14] If you already have a working Postgres setup, this will destroy any existing DBs. So run isolated per the suggestions above if you need to. [19:15] Also "make schema" is your friend if you get your DB in a weird state. It always resets the DB for development. [19:15] Now we can run the local Launchpad with: make run [19:16] This takes a minute to start up but you should be able to go to https://launchpad.dev/ in your browser and connect. [19:16] You can also run tests here, by killing the running lp and trying something like: [19:16] ./bin/test -cvvt test_bugheat [19:16] Testing becomes important as we do changes later. We'll come back to that. [19:17] Any questions so far on getting setup and running lp locally? [19:18] Now that we're setup (or at least know how to get setup), let's look at the code itself! [19:18] You can ls the top of the tree to see what's there: ls -l ~/launchpad/lp-branches/devel [19:18] But the interesting bits are in lib, especially lib/lp and lib/canonical. [19:19] lib/canonical is a part of the tree that we can't seem to get rid of. Eventually everything should end up moved to lib/lp... [19:19] ...but if you can't find something in lib/lp, then look in lib/canonical. [19:19] Everything under the lib directory is part of the Python packages we'll deal with. So lib/lp/bugs/model/bug.py can be referenced in Python by lp.bugs.model.bug.
[19:20] Here's a paste to make this clear: [19:20] http://pastebin.ubuntu.com/642785/ [19:20] Notice in the paste that there is a bin/py in the tree that allows you to use Python with the correct path set for lp development. [19:21] You can use `make harness` at the top of the lp tree to get a Python shell with several objects available already. [19:21] This is nice for interactive debugging or figuring things out. [19:22] So to understand the tree structure, let's focus on lib/lp since that's where most everything lives these days. [19:22] Each component of launchpad.net has its own directory in the lp tree. "bugs" and "code" and "translations" and so on. [19:22] Each of these has roughly the same structure.... an interfaces, model, browser, and templates directory.... among other directories that are common. [19:23] so lp.bugs.model and lp.code.model and so on. [19:23] This is roughly our MVC division in launchpad, for those who know MVC-style development from other web app frameworks like Django. [19:24] Interfaces are declared in the "interfaces" dir, "model" then contains the classes that implement those interfaces. [19:24] "browser" holds the view stuff and templates are the html portion. Well, TAL versions of the html. [19:24] Generally, if you want to figure out what's happening with the Python objects or the database layer, look at stuff in the interfaces and model directories.... [19:25] (Read up on the Zope Component Architecture to make better sense of those files.) [19:25] ...but we said we want to focus on easy or obvious bugs, which are likely something to do with the web page itself. [19:25] So let's focus on the stuff in templates or browser code. [19:26] Again, this is the stuff that has to do with display on launchpad.net. (Or launchpad.dev if you're working locally.) [19:26] That concludes the tour of the code. Any questions on that part? [19:27] Now let's try to understand how we organize bugs on the launchpad project to find something to fix. [19:28] Tagging can help us here. We use "trivial" or "easy" to mark bugs that are pretty shallow. [19:28] Here is a list of 162 Triaged launchpad bugs tagged "trivial" -- [19:28] http://tinyurl.com/6l572gg [19:29] But if you look at those bugs, you can see we often use "trivial" to mean trivial to an *experienced* Launchpad dev. [19:29] So that might be useful, but I like to narrow further, when looking for truly easy stuff. [19:30] Let's search for Triaged bugs with the tags "trivial" and "ui" since I know "ui" is used to mean anything in the web page itself. [19:30] http://tinyurl.com/5snjlva [19:30] Now we're down to 69 bugs. :-) [19:31] FWIW, "ui" and "css" and "javascript" are all tags we use for front end work that, combined with "easy" or "trivial" tags, can help you find easier bugs to fix. [19:32] trivial means (for lp devs) something that can be fixed in an hour. "easy" is a bit longer but still short work. maybe a 2-3 hour fix all told. [19:32] You can look through these bugs above if you like, but I've spent some time with them already this morning.... [19:32] ....so I've found a bug that will be a nice one to demo how to approach fixing bugs. [19:33] Let's look at bug 470430 and start working on how to fix Launchpad bugs now. [19:33] https://bugs.launchpad.net/launchpad/+bug/470430 [19:34] This is an older bug that outlines that the icon for the link "Copy packages" is bad. [19:34] See the bug report for a link to a page that has the bad link on it.
[19:35] We currently use the edit icon that is used too much on Launchpad, and mpt recommends a new icon or an expander icon. [19:35] But we can also just remove the icon to fix the issue. [19:36] The first thing I would do is simply search the soyuz templates to find the one that has the link for "Copy packages." [19:36] (I know to look in soyuz because I know that's the part of lp that deals with packaging on Launchpad.) [19:37] (If you want to work on an easy bug like this but don't even know where to start, ask in #launchpad-dev here on Freenode.) [19:37] feel free to ping me if no one responds :) [19:37] So back to the bug in question.... [19:37] To find this link, I would change to the devel tree and run: [19:37] grep -rI "Copy packages" lib/lp/soyuz/templates/ [19:38] If you do that, you'll find that it returns nothing. This is a clue that the link is created in Python code rather than a template. === yofel_ is now known as yofel [19:38] So I need to look in the browser code I told you about earlier: [19:39] grep -rI "Copy packages" lib/lp/soyuz/browser/ [19:39] This gives me: http://pastebin.ubuntu.com/642798/ [19:40] The "text" and "label" bits there look promising. [19:40] So I now want to open lib/lp/soyuz/browser/archive.py in an editor and see what's happening there. [19:40] We need to search the file for the phrase "Copy packages". [19:41] We can find a function that is called "copy" which creates a link from a class called "Link". This looks like it! [19:41] See the code pasted here: http://pastebin.ubuntu.com/642799/ [19:42] That line also sets the icon to "edit." And this is the cause of our bug. [19:42] So the easy fix is to just remove the icon line and make it like: http://pastebin.ubuntu.com/642801/ [19:42] The fix is at line 5 in the paste. [19:43] And now we've fixed a trivial bug! :-) [19:44] The next steps would be to branch from devel, create your own branch with this fix in it, and push it up to Launchpad. [19:44] Then propose it for merging into lp:launchpad. [19:44] A Launchpad dev should then step in and help you get your changes landed. [19:44] Any questions about all that? [19:46] It really is just that easy to fix easy bugs. :) [19:47] Today we've been through getting setup with lp dev, finding our way around the code base, finding bugs, and learning how to approach fixing easy bugs. [19:47] Please ask around on #launchpad-dev if you'd like to get more involved with launchpad development and try your hand at fixing these kinds of bugs. [19:48] Thanks for attending the session everyone! That's all I have. [19:48] I'll hang around until the next session if any lingering questions arise. [19:51] There are 10 minutes remaining in the current session. [19:55] There are 5 minutes remaining in the current session. === ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: DEX - how cross-community collaboration works - Instructors: nhandler [20:01] Hello everyone. My name is Nathan Handler. I am an Ubuntu Developer and a member of the DEX team. [20:01] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session. 
[19:44] The next steps would be to branch from devel, create your own branch with this fix in it, and push it up to Launchpad.
[19:44] Then propose it for merging into lp:launchpad.
[19:44] A Launchpad dev should then step in and help you get your changes landed.
[19:44] Any questions about all that?
[19:46] It really is just that easy to fix easy bugs. :)
[19:47] Today we've been through getting set up with lp dev, finding our way around the code base, finding bugs, and learning how to approach fixing easy bugs.
[19:47] Please ask around on #launchpad-dev if you'd like to get more involved with Launchpad development and try your hand at fixing these kinds of bugs.
[19:48] Thanks for attending the session everyone! That's all I have.
[19:48] I'll hang around until the next session if any lingering questions arise.
[19:51] There are 10 minutes remaining in the current session.
[19:55] There are 5 minutes remaining in the current session.
=== ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat || Event: Ubuntu Developer Week - Current Session: DEX - how cross-community collaboration works - Instructors: nhandler
[20:01] Hello everyone. My name is Nathan Handler. I am an Ubuntu Developer and a member of the DEX team.
[20:01] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html following the conclusion of the session.
[20:02] I am also spending the summer participating in Google's Summer of Code with Debian, where I am working with Matt Zimmerman and Stefano Zacchiroli on creating some tools for DEX.
[20:03] This session will probably be on the shorter side, so please feel free to ask questions at any time in #ubuntu-classroom-chat. Please be sure to prefix them with QUESTION:
[20:03] The first thing I am sure some of you are wondering is, "What is DEX?"
[20:04] DEX is the Debian dErivatives eXchange. Normally, Debian-based derivatives pull packages from Debian and then merge in changes that they have made.
[20:04] The goal of DEX is to get these changes applied in Debian, to make things easier for the derivatives and to allow all of the derivatives to benefit from the changes.
[20:04] The DEX homepage is available at http://dex.alioth.debian.org/
[20:04] Currently, the only derivative that is actively participating in DEX is Ubuntu.
[20:05] We organized our first project, ancient-patches, several months ago.
[20:06] Details about the project are available here: http://dex.alioth.debian.org/ubuntu/ancient-patches/
[20:07] One issue that we had with that project was that it took too much time to create a new project, and all changes had to be committed to the VCS on alioth (which required membership in the alioth team).
[20:07] That is why I am spending the summer creating a new dashboard and some other tools to make DEX easier to use. You can see what the dashboard currently looks like here: http://dex.alioth.debian.org/gsoc2011/projects/dex.html
[20:08] Keep in mind, there are still many bugs and other issues that need to be fixed before the dashboard can be deemed stable.
[20:09] One thing you will notice is that there are now two projects showing up on the dashboard. There is the old ancient-patches project, but there is also a python2.7 project. This python2.7 project is being organized by Allison Randal.
[20:10] There is a brief FAQ available for using the dashboard: http://dex.alioth.debian.org/gsoc2011/docs/FAQ
[20:11] It explains how projects can either be created by applying special usertags to a set of bugs in the Debian BTS, or specified in a plain text file by connecting via ssh to wagner.debian.org.
[20:13] The table (while currently not functioning) will eventually support adding and modifying tasks via the web.
[20:14] This means that there will be no need for every participant in a DEX project to have membership in the alioth team, like there was for ancient-patches.
[20:14] rww asked: Those dashboard graphs look pretty. What did you use to generate them?
[20:15] rww is referring to the graph that is displayed at the bottom of each project to track the number of open tasks versus time. These graphs are currently updated once each day via cron using matplotlib. They still have some issues that need to be sorted out, but they should be functional.
[20:16] Eventually, there will be a second graph on each project page. This will be a bar graph that shows the most active people in a DEX project. The idea is to allow new contributors to get instant visual recognition for their contributions.
[20:16] pleia2 asked: So everything is handled through Debian BTS? So other derivatives can participate without DEX specifically having to take into consideration their BTS (launchpad, bugzilla, etc)?
[20:17] Eventually, I might add some support to the dashboard for downstream bug trackers. But for now, the goal of DEX is to get those downstream changes applied in Debian, which will involve a bug getting filed in the BTS.
[20:18] If a derivative has a project whose tasks are downstream bugs, DEX would allow them to do this, but it would not pull in any additional information about those downstream bugs (e.g. status, owner, package).
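As a rough sketch of the usertag mechanism mentioned above: applying usertags to bugs in the Debian BTS is done by mailing "user"/"usertags" commands to the BTS control bot (control@bugs.debian.org). The helper below only builds that command text, and the user address, tag name, and bug numbers are invented examples rather than the ones DEX actually uses.

    # Sketch only: the user address, tag name, and bug numbers are invented examples.
    def usertag_commands(user, tag, bug_numbers):
        """Build control@bugs.debian.org commands that usertag a set of bugs."""
        lines = ["user %s" % user]
        lines += ["usertags %d + %s" % (bug, tag) for bug in bug_numbers]
        lines.append("thanks")
        return "\n".join(lines)

    print(usertag_commands("debian-derivatives@lists.debian.org",  # assumed address
                           "dex-ubuntu-example",                   # assumed tag name
                           [123456, 234567]))                      # placeholder bug numbers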
[20:18] pleia2 asked: Even though Ubuntu is the only one actively participating, have many other derivatives shown serious interest?
[20:20] I have not seen that many posts from other derivatives on the mailing list. However, the Debian derivatives front desk put together a census shortly before DEX started up. They got a fairly nice response: http://wiki.debian.org/Derivatives/Census . I have a feeling some of the larger derivatives will get involved in DEX once it gets more stable and organized.
[20:21] At this point, some of you are hopefully wondering how you might go about getting involved with DEX.
[20:22] If you are interested in helping out with a project, you could help out with the python2.7 project. We will also soon be starting a large merges project that you might be interested in.
[20:23] Most of these projects will be discussed on the debian-derivatives mailing list (http://lists.debian.org/debian-derivatives/).
[20:23] You do not need to be an Ubuntu or Debian developer to help out. For the ancient-patches project, most of the people involved were not Debian Developers.
[20:24] A lot of the work tends to be triage-related. We need to figure out whether the change is needed in Debian, whether it has already been applied, search for and report bugs on the BTS, and talk to the package maintainers to decide on the best approach. While packaging knowledge might help, being an official developer is not needed to perform those tasks.
[20:25] We would also appreciate help testing the dashboard, as well as any suggestions for tools and other improvements that would make DEX easier to get involved with.
[20:26] Finally, if you are interested in starting a DEX project of your own, either for Ubuntu or another Debian-based derivative, simply send an email to the debian-derivatives mailing list or stop by #debian-derivatives on OFTC, and we would be more than glad to help you get started.
[20:26] Any questions at this point?
[20:28] In that case, I'll take a few minutes to go back and talk about why DEX is doing what it does.
[20:28] In the first part of the Ubuntu release cycle, we spend time pulling updated packages from Debian. If we have not modified them in Ubuntu before, we can use tools to do this automatically (sync).
[20:29] If we have made changes, a developer needs to manually update the package (merge).
[20:29] Similar tasks occur in other Debian-based derivatives.
[20:30] If we take some time to get the changes made in Ubuntu back into Debian, it means less work for us, as we get to sync the package in the future.
[20:31] It also means that Debian and all other Debian-based derivatives get to benefit from our changes. If other derivatives do the same thing, we get to benefit from their changes as well.
[20:31] So everybody benefits from this work, not just Ubuntu or Debian.
[20:31] Any questions on that?
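As a tiny illustration of the sync/merge distinction just described: whether a package carries Ubuntu changes is visible in its version string, so a first-pass decision can be as simple as the toy heuristic below (not one of the actual archive tools).

    # Toy heuristic, not a real tool: an "ubuntuN" suffix in the version string
    # means local Ubuntu changes exist, so the package needs a merge; otherwise
    # it can simply be synced from Debian.
    def sync_or_merge(ubuntu_version):
        return "merge" if "ubuntu" in ubuntu_version else "sync"

    assert sync_or_merge("0.0.43-2") == "sync"          # unmodified: sync
    assert sync_or_merge("0.0.43-2ubuntu1") == "merge"  # has an Ubuntu delta: merge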
[20:33] Finally, I'll talk a bit about what will be happening with DEX in the future.
[20:34] First, the dashboard, as a GSoC project, should be done in about a month. This means that it will be available for all DEX projects to use.
[20:34] We are also working on ensuring that we have plenty of documentation about how to get involved with DEX and the individual DEX projects. This should make it trivial for people of any skill level to get involved.
[20:35] As you can see on http://dex.alioth.debian.org/gsoc2011/projects/dex-ubuntu-python2.7/graph.svg , the python2.7 project is slowly but steadily progressing. That project should finish up soon.
[20:36] Once it is done, we will be starting a large merges project.
[20:37] This project will find the Ubuntu packages that differ the most from their Debian counterparts and attempt to send as many of our changes upstream to Debian as possible.
[20:37] That will probably be the first project to rely entirely on the DEX dashboard.
[20:39] Once that project is underway, I hope to talk to some of the people involved with the census that I linked to earlier about getting some other derivatives involved with DEX. It will be great to be able to see a long list of projects being worked on.
[20:40] Any questions about any of the future plans?
[20:40] rww asked: (sorry, I went afk so this is about something from earlier) For people looking to get into DEX, what sort of skillset are you looking for? Programming? Packaging? etc.
[20:41] The specific skills will depend on the project. For the large merges project, packaging knowledge will definitely prove useful. For the ancient-patches project, it was mainly triage work, so anyone able to navigate LP, the BTS, and changelogs was able to help out.
[20:42] However, due to the nature of DEX, most of the tasks are fairly similar and repetitive. That is why we are going to spend a lot of time ensuring that projects are properly documented.
[20:43] This means that you should be able to work on a task, follow the documentation, ask a few questions, and get it sorted out. After doing that a few times, you will probably be able to handle most of the non-special tasks in a project.
[20:44] Finally, before I conclude here, I want to make sure everyone is aware of some links and resources that might prove useful if you choose to get more involved.
[20:45] First, there is #debian-ubuntu and #debian-derivatives on OFTC (OFTC is the IRC network that most of the Debian channels live on). #debian-ubuntu is for the Ubuntu-specific DEX stuff, and #debian-derivatives is more about DEX in general.
[20:46] You should be able to ask most of your questions there and get pointed in the right direction.
[20:47] http://dex.alioth.debian.org/ is the main DEX website. http://dex.alioth.debian.org/ubuntu/ is the Ubuntu DEX Team website. They are slightly outdated right now, but still have some useful information.
[20:48] http://dex.alioth.debian.org/gsoc2011/projects/dex.html is the current location of the DEX Dashboard. It is still a work in progress, and the URL will probably change once it is stable.
[20:49] http://lists.debian.org/debian-derivatives/ is the debian-derivatives mailing list (this is @lists.debian.org, not @lists.ubuntu.com). That is where the projects will be discussed and announced. It is relatively low-volume, so I would suggest subscribing.
[20:50] Finally, you can always email me or PM me on IRC (the same is true of most members of the DEX team) with any questions/comments you might have.
[20:50] That is all that I have. Does anyone have any last questions?
[20:50] There are 10 minutes remaining in the current session.
[20:52] In that case, thank you everyone who attended the session. This concludes my DEX session and the second day of Ubuntu Developer Week. I will stick around until the end of the hour in case anyone thinks of any more questions.
[20:55] There are 5 minutes remaining in the current session.
[21:00] Logs for this session will be available at http://irclogs.ubuntu.com/2011/07/12/%23ubuntu-classroom.html
=== ChanServ changed the topic of #ubuntu-classroom to: Welcome to the Ubuntu Classroom - https://wiki.ubuntu.com/Classroom || Support in #ubuntu || Upcoming Schedule: http://is.gd/8rtIi || Questions in #ubuntu-classroom-chat ||
=== skaet is now known as skaet_afk
=== skaet_afk is now known as skaet
=== datastream is now known as datastream_
=== Ursinha-lunch is now known as Ursinha