[12:56] MOTU Q&A session in 4 minutes [13:00] hello everybody - how are you doing? welcome everybody to another MOTU Q&A session [13:00] Let's start with our usual round of introductions - who do we have here today? [13:00] * Hobbsee dies, so is not here. [13:00] very well thanks [13:00] * sourcercito waves [13:00] o/ [13:01] * maiatoday learning new things daily [13:01] hi [13:01] * dholbach is Daniel Holbach, working in the MOTU team for quite a while and trying to help out where he can [13:01] * Hobbsee is the late Dent, Arthur Dent. [13:01] haha... Hobbsee is not :) [13:02] * ian_brasil high fives [13:02] who's completely new here and hasn't started on their road to MOTU yet? [13:02] o/ [13:02] \o [13:02] hey MLP - how are you doing? [13:02] * maiatoday is very close to being completely new [13:02] dholbach: great, thanks :) and you? [13:02] I'm fine - thanks :) [13:03] do you have any questions already? [13:03] not really, I haven't looked into it enough to have any questions yet [13:03] ok [13:03] what I can recommend is http://wiki.ubuntu.com/MOTU/GettingStarted [13:04] cool [13:04] because it links to all the documents and howtos necessary to get you started [13:04] if anything in the session is unclear - just ask [13:04] yup :) [13:04] did somebody bring questions today? [13:05] dholbach: can i have a pony? [13:05] we were bound to hear that question today [13:05] lol [13:05] Hobbsee wants to have the "MOTU Q&A clown" tag :) [13:05] i want to ask about how to make a package for a specific architecture (lpia) [13:05] hahaha [13:06] ian_brasil: good question [13:06] so a key thing about packages in ubuntu and debian is that we all deal with the source package; what users install are binary packages [13:07] we only do source changes and only do source uploads to the build daemons, which (hopefully, if we do everything right) turn our source packages into binary packages [13:07] let's all grab a random source package to look at [13:07] please all run apt-get source xicc [13:10] ok [13:12] excuse me [13:12] in the case of a python package (which just contains python code that is interpreted), you'd change 'any' to 'all' [13:13] which means it's only built on one build daemon, but will be installable on all architectures, because it's the same binary package everywhere [13:13] ian_brasil: to answer your question, you could enter just 'lpia' there [13:13] or a list of architectures [13:13] sorry..i dropped out...enter lpia where? [13:14] Architecture: i386 amd64 would be perfectly valid too [13:14] in the architecture field [13:14] ah ok [13:15] ok great [13:15] any questions about that? [13:15] we had a timely net-split [13:15] ok hang on [13:16] check out http://daniel.holba.ch/temp/log [13:17] ok, thanks. [13:17] great [13:17] any other questions?
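To make the Architecture discussion above concrete, here is a minimal sketch of a binary package stanza in debian/control - the package name "foo" is made up, and the comments are editorial:

  # hypothetical stanza; "foo" is not a real package
  Package: foo
  # built only on the lpia build daemon:
  Architecture: lpia
  # alternatives: a space-separated list like "i386 amd64",
  # "any" (built on every architecture), or
  # "all" (built once, installable everywhere, e.g. pure python)
  Depends: ${shlibs:Depends}, ${misc:Depends}
  Description: example of restricting a package to one architecture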
[13:17] dholbach, about pbuilder... [13:18] for everybody who doesn't know a lot about pbuilder yet, be sure to check out http://wiki.ubuntu.com/PbuilderHowto [13:19] effie_jayx: what is your question? [13:19] I have recently had my first experience testing packages... what is a good chroot environment for testing packages for other releases ... packages that contain a gui [13:20] good question, to be honest, I don't know - I know that some people use vmware, some people use separate partitions [13:20] virtualbox, being free, would make virtual machines a lot more appealing for that purpose imho [13:20] what are these variables -> Depends: ${shlibs:Depends}, ${misc:Depends} [13:20] and what others exist that we could use [13:20] dholbach, great... [13:21] a simple chroot won't do in most cases, because if you want to test it properly you will also want to have hal, X, the kernel and other stuff of the very same release [13:21] * dholbach is not that experienced when it comes to virtualisation [13:21] dholbach, for a manpage bug I tested first, it was cool [13:21] ok super [13:21] I thought virtualization gives the tester an estimate [13:22] hmm, what do you mean? [13:22] but since sometimes hardware is not using its drivers ... [13:22] it depends on what kind of software you are testing [13:23] in most cases it's OK to test it just on the release you're uploading it to [13:23] only if you do uploads to -updates or -backports do you want to test it on that very release [13:23] coming to ian_brasil's question, it's a good one [13:24] ${shlibs:Depends} is a variable that will be substituted with all the library packages the binary files in the binary package link to [13:24] if you look at xicc again [13:24] (debian/control) [13:24] it just lists the two that ian_brasil mentioned [13:24] if you check out apt-cache show xicc [13:25] it lists the following: libc6 (>= 2.3.4-1), libglib2.0-0 (>= 2.8.0), libice6, libsm6, libx11-6 [13:25] because those are the libraries the binary in the package links to [13:26] ${misc:Depends} will be expanded by packaging tools that run during the build process [13:26] for example dh_gconf (in the debhelper package) will notice if you have gconf schema files in the package [13:27] and automatically add a postinst script that will rebuild the gconf database after the installation on a machine [13:27] also, it will add gconf as a dependency to ${misc:Depends} [13:27] was that clear enough? [13:27] there are not very many dependencies added through ${misc:Depends}, but it's good to have in the packaging - it does no harm
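As a before/after sketch of that substitution, using the xicc values quoted above (the expansion happens at build time, via dpkg-shlibdeps and related debhelper tools):

  # what the maintainer writes in debian/control:
  Depends: ${shlibs:Depends}, ${misc:Depends}

  # what ends up in the built .deb, as shown by apt-cache show xicc:
  Depends: libc6 (>= 2.3.4-1), libglib2.0-0 (>= 2.8.0), libice6, libsm6, libx11-6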
[13:27] so it is good to have this ${misc:Depends} in every package?...what do you mean by 'the libraries the binary in the package links to'? [13:28] ok [13:28] let's download the xicc binary package and check it out [13:28] aptitude download xicc [13:28] now we'll unpack the package: dpkg -x xicc_0.2-2_i386.deb test [13:29] check out the output of ldd test/usr/bin/xicc [13:29] those are all the libraries that the built xicc binary links to [13:29] so ${python:Depends} will do the same for a package containing python [13:29] and ${perl:Depends} on the same note [13:29] maiatoda1: unfortunately not, it will just add the python dependency for the python versions you build that package for [13:30] ok good thing I asked then [13:30] kelmo: I know that perl:Depends will add a basic perl dependency, but I do not know if it adds all perl modules you might have to install [13:30] and that answers another question ian_brasil had: it doesn't make sense to have shlibs:Depends everywhere [13:30] if I just package a few python scripts of mine, shlibs:Depends will be empty [13:30] dholbach: i believe it only adds the perl major version (actually packaged a perl module today, and iirc no extra modules were added) [13:30] as there are no built binary files to inspect [13:31] kelmo: ok, thanks for checking [13:31] so how do i know that linux-gate.so.1 from ldd for example is part of, say, libc6 (>= 2.3.4-1)? [13:31] there were a lot of discussions about adding that feature, but as far as I know it never was very reliable, that's why we have to specify them on our own [13:31] ian_brasil: that will be automatically done for you :) [13:32] well i guess interpreted languages are harder to parse for dependencies like that? [13:32] ah...ok...i understand now [13:32] ian_brasil: during the build process the ${shlibs:Depends} will get magically substituted with all the depends [13:32] kelmo: exactly [13:33] we would have to dive into library packaging to understand more about shlibs and dh_makeshlibs and dh_shlibdeps [13:33] but there might be some other questions first? [13:34] dholbach: what other variable magic exists then [13:34] packaging libraries can be a bit more complicated and I'd suggest leaving them until you're a bit more comfortable with normal packages [13:34] ian_brasil: for dependencies I just know of perl and python for sure [13:34] there are some for mono too and I believe for php too, but I can't remember their names [13:35] how about the ${source:Version} and friends that are used in certain cases? [13:35] it's always useful to just apt-get source and see how they solved problems [13:35] kelmo: good question [13:35] sometimes you decide to split a package into several binary packages [13:36] dholbach, what bugs are usually recommended for us starting out, desktop bugs? and when is it safe for us to start merges and syncs? [13:36] check out the output of apt-cache showsrc pidgin | head -n 3 [13:36] pidgin is split up into 10 binary packages [13:37] so if somebody installs pidgin, you want to make sure that the installed version of pidgin-data is the same [13:37] else you can't rely on all the images and sound files being in the right place and so on [13:37] that's why in pidgin's Depends you write something like [13:37] Depends: pidgin-data (= ${source:Version}), ... [13:38] which will be automatically replaced with the current source version of that upload [13:38] does that make sense? [13:38] yep. it does
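A simplified sketch of how that looks in pidgin's debian/control - not the literal file, which has ten binary stanzas:

  Package: pidgin
  Architecture: any
  # lockstep dependency: require pidgin-data from the very same source upload
  Depends: pidgin-data (= ${source:Version}), ${shlibs:Depends}, ${misc:Depends}

  Package: pidgin-data
  # images, sounds etc. are the same on every architecture
  Architecture: all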
[13:38] there's a whole section in the debian policy just about that [13:38] ok great [13:38] coming to effie_jayx's question [13:39] I recommend bitesize bugs (linked from MOTU/TODO) and upgrade bugs [13:39] the bitesize bugs are almost gone :( [13:39] if you look at http://wiki.ubuntu.com/PackagingGuide/Recipes you will find two recipes which will help you with that [13:39] one describes how to upgrade a package [13:39] the other one describes how to generate a debdiff [13:40] effie_jayx: I added it to my todo list to do another round of bug triage and tag a few bugs as bitesize [13:40] in the meantime you could join #ubuntu-desktop and look at http://wiki.ubuntu.com/DesktopTeam/TODO [13:40] dholbach, great [13:40] there's always a bunch of desktop updates to work on in GNOME land === MLP_ is now known as MLP [13:41] in my latest Ask MOTU blog entry I added a URL to all TODO pages of various ubuntu teams [13:41] when you mention debdiffs and package upgrades, do you relate that to merges and syncs? [13:41] no, not specifically [13:41] ohh ok [13:41] you can submit debdiffs for simple typos too [13:41] library transitions, etc etc [13:41] coming back to your other question: it's always safe to do merges and syncs [13:42] we prefer to get the bulk of them done early in the release cycle [13:42] http://wiki.ubuntu.com/UbuntuDevelopment/Merging might be interesting to you [13:42] any other questions? [13:43] dholbach, thanks [13:43] I applied for a MOTU blog i think it was...i would like to do a mobile MOTU series...i heard nothing yet, maybe there is a delay? [13:43] ian_brasil: who did you ask? [13:43] i filled the online form in [13:44] the reason i asked about ${source:Version} was the recent changes with respect to binNMUs. I just found a good reference that explains what has changed in that regard on the debian wiki [13:44] ian_brasil: I don't know how long it takes to get on ubuntuweblogs [13:44] if there's further delay let me know and I'll ping the guys who do it [13:44] which is confusing, because it was ${Source-Version}, but now ${source:Version} seems to be preferred [13:45] you can only get on planet ubuntu if you're an ubuntu member [13:45] and they look so similar... http://wiki.debian.org/binNMU [13:45] dholbach: thx [13:45] kelmo: yes, they changed the syntax, so we have source:Version and binary:Version [13:45] that's not really relevant, because we don't have the concept of binNMUs [13:45] in Ubuntu [13:45] we only do source uploads [13:46] so if you want to rebuild a package you will have to upload a no-change upload (just a new changelog entry, see the changelog sketch below) to the buildds [13:46] ah, k. understood [13:46] ok great [13:46] any other questions? [13:47] i have cut a dpatch to enable hildon in liferea...in configure.ac i check for the hildon libraries with PKG_CHECK_MODULES([HILDON], hildon-1 >= 1.0.5,enable_hildon=yes,enable_hildon=no) so if it's there hildon is compiled in ...can i just add lpia to the architecture field and check it in an lpia chroot? [13:48] ian_brasil: liferea will get built for any architecture, so there's no need to add 'lpia' [13:48] when i tested in the chroot i got an lpia error [13:48] I know of other packages that have a check in debian/rules on which architecture is being built [13:48] and in the case of lpia, add --enable-hildon [13:49] cool...do you remember which? [13:49] ian_brasil: best to ask adilson, I can't remember which packages he patched to do that [13:49] ok, will do
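A minimal sketch of such a check in debian/rules (a makefile). The CONFIGURE_FLAGS variable is hypothetical and depends on how the package's rules file is structured, but dpkg-architecture is the standard way to ask which architecture is being built for:

  DEB_HOST_ARCH ?= $(shell dpkg-architecture -qDEB_HOST_ARCH)

  ifeq ($(DEB_HOST_ARCH),lpia)
      # only on lpia, compile in the hildon interface
      CONFIGURE_FLAGS += --enable-hildon
  endif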
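And the no-change rebuild mentioned a bit earlier is nothing but a fresh debian/changelog entry; a hypothetical sketch (package, version, and names invented, though a "buildN" version suffix is the usual Ubuntu convention for rebuilds):

  hello (2.2-1build1) gutsy; urgency=low

    * No-change rebuild against the new libfoo2.

   -- Some Developer <dev@example.com>  Thu, 01 Nov 2007 12:00:00 +0100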
[13:49] great [13:50] any more questions? :) [13:50] all the new people in here: how do you like the MOTU Q&A session? does it help with understanding things or does it all still sound very complicated? [13:52] seems it was quite overwhelming :) [13:52] i think it's a great service. quite helpful to get some hints [13:52] dholbach, it is needed [13:52] * dholbach hopes that these sessions prove to be useful :) [13:52] ok good :) [13:52] are there topics you'd like to see a dedicated session about? [13:53] one of the things that I find important is getting comfortable with the tools [13:53] maybe policy-specific packaging sessions, like python, library, perl etc., and where the resources for these are? [13:53] kelmo: sounds good, making notes [13:53] * ian_brasil thinks dholbach rocks [13:54] ian_brasil: gracias :) [13:54] ian_brasil, obrigado? [13:54] * kelmo has not much idea about the python packaging best practices [13:54] a mobile packaging session [13:54] kelmo, sounds interesting [13:54] ian_brasil: ok, I'll talk to folks about it and see what we can do [13:55] there was already a "patch system" workshop, wasn't there? [13:55] is that worth discussing again? [13:55] yeah [13:55] kelmo: http://wiki.ubuntu.com/PackagingGuide/PatchSystems [13:55] but I agree, it'd be nice to have such a session again soon [13:56] if you have any other ideas: things we should improve, things we should talk about, etc etc - please do mail me [13:56] i find patch systems conceptually hard to grasp at first, but nice and easy once a couple of dry runs are done. maybe others feel the same [13:56] I'll put effort and time into making your life and that of the people who join after you easier :) [13:56] dholbach, thanks ... [13:57] :D [13:57] kelmo: I've worked for a while in the Desktop Team, which does a lot of package updates every two weeks - that's where I learned to like patch systems [13:57] it's so much easier to just drop a patch by removing a file, easy to update to a new upstream version, etc [13:57] another idea maybe: managing different upstream version control systems for various upstreams, debian package teams etc [13:57] so yeah, we should have a session about that again [13:58] good ideas [13:58] taking notes [13:58] effie_jayx: you said it's important to learn the tools: anything we should document better? [13:58] dholbach, pbuilder... for testing packages [13:59] PbuilderHowto is not good enough yet? [13:59] dholbach: the difference between depends and build-depends and how to work them out [13:59] devscripts tools, $vcs-buildpackage (or when _not_ to use $vcs-buildpackage) [13:59] I really didn't know how to test it... and the PbuilderHowto didn't cover that [13:59] ian_brasil: did you check out http://wiki.ubuntu.com/PackagingGuide/Basic ? [13:59] effie_jayx: ok, taking notes [14:00] dholbach, it was no biggie either [14:00] no, i used the packaging guide...cool link [14:00] thanks kelmo, thanks effie_jayx, thanks ian_brasil [14:00] rock and roll [14:00] ok, turning into a pumpkin [14:00] thanks all for joining the session, I had a great time [14:00] gn8, thanks for your time [14:00] me too [14:00] so did I [14:01] definitely having fun [14:01] be sure to blog about your MOTU experience and add it to ubuntuweblogs.org :) [14:01] dholbach: thx for doing these sessions [14:01] see you guys around! and let me know if you have trouble [14:01] :-) [14:01] dholbach, most certainly [14:02] *wave* [14:44] Thanks for helping once again, LjL [14:45] soundray, gamma is something simple and yet complicated.
basically, your scanner gives values for each pixel, from 0 to 255 (for each color, and maybe from 0 to 65535 if it's 16-bit, but anyway -- values from a minimum to a maximum) [14:45] soundray: however, nothing says that those values actually scale "correctly" (that is, linearly) with the actual brightness on the paper [14:45] soundray: same with the monitor - nobody says that a 128 is half as bright as a 255. it depends on the monitor's characteristics [14:46] soundray: so, a "gamma curve" is a function that maps those numbers into other numbers that are hopefully more representative of the actual colors you're looking for [14:46] That makes sense [14:47] Now I get to set gamma twice: once in the main xsane window, and once in a "Standard options" dialog under Enhancement [14:47] soundray: you could define the whole curve point by point (try it in the gimp or something, there is a "curves" tool, perhaps not called exactly that), but what is usually done is to use a standardized curve (i don't know its analytic expression, but it's just called a "gamma curve"), with a single parameter that changes the curve's shape. that's what's usually called the "gamma value" [14:48] soundray, it's possible that one is an option that goes directly to the scanner hardware (some of them, if not most, are capable of being told a gamma curve to use), while the other would be post-processing done in software by xsane [14:49] LjL: I see. I guess the Standard options dialog sets the scanner-internal one and influences the "raw" values that come in over the USB. [14:50] Is there some sort of "standard" gamma that defines a linear relationship? [14:50] soundray: but here's the awkward part. in a scanner, gamma correction is usually *unnecessary*, because CCDs are pretty good at scaling brightness values linearly (not perfect, but almost) [14:50] soundray, yes, gamma 1.0 means "don't change anything". however, the standard for 8-bit pictures is usually *not* gamma 1.0 [14:51] soundray: that's because *monitors* aren't linear at all, and traditionally, instead of changing it just before displaying, a different gamma is set in the picture itself [14:51] soundray: pretty standard values are 1.8 and 2.something, 1.8, i think, being the standard on a Mac [14:52] soundray: so if you set gamma 1.0 in the picture, your screen will show it as *very* dark, unless you've also adjusted your screen to work correctly for gamma 1.0 [14:52] but most people don't do that, and most images on the internet are not made for gamma 1.0 screens, but rather for screens between gamma 1.8 and gamma 2.5 [14:53] soundray: i suggest you start by trying gamma 1.8 in the "raw" settings, and leaving 1.0 in the "post-processing" settings [14:54] soundray: are you on GNOME? i suppose so... on KDE i have a Gamma tool in the system settings, which is nice to experiment with and kind of helps you actually understand how gamma works [14:55] LjL: I'm getting the hang of it now. When I set both to 1, I get black blacks for the first time. [14:55] LjL: also, the histogram is making sense now. [14:55] soundray, however *black* itself shouldn't be changed by gamma. if the input byte was 0, it will remain 0 no matter what the gamma value, because the standard curve always maps 0 to 0 [14:56] same for white, i.e. 1 or 255 or whatever the maximum value is
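To pin down the curve shape being described: a common form of the gamma curve (an assumption on my part - xsane's exact formula may differ) maps an 8-bit input value v to

  output = 255 * (v / 255)^(1/gamma)

A quick worked example:

  gamma 1.0: v = 128  ->  255 * 0.502^1        ~ 128  (nothing changes)
  gamma 2.2: v = 128  ->  255 * 0.502^(1/2.2)  ~ 186  (midtones brightened)
  any gamma: v = 0 -> 0  and  v = 255 -> 255    (black and white stay fixed, as LjL says)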
[14:56] though if "black" is really a pixel with value, say, 16, that *will* be affected by gamma [14:56] but then it's a gray ;) [14:57] LjL: what I want most importantly is good results from scanning black-on-white text. I guess the originals aren't really black in the 0 sense. [14:59] soundray, that shouldn't be a problem with most scanners though... you're scanning in grayscale? [15:01] LjL: yes. With grayscale I get a bit of an antialiasing-like effect. Lineart doesn't look so good. [15:01] soundray: do one thing - screenshot the whole thing, settings windows and all, and post it on imageshack or somewhere [15:02] LjL: not necessary. You've helped me a great deal already. I'm getting much better results now. [15:03] I guess the secret when using the Lineart setting is to increase the resolution sufficiently. [15:04] soundray: consider scanning in black&white mode rather than grayscale. gamma will not be involved there, you'll just have to find an acceptable white/black threshold. if you scan at a high resolution (which is made memory-feasible by the image being b/w rather than 8-bit), you can then rescale it and apply your own antialiasing [15:04] b/w is called "binary" in xsane by the way [15:05] Lineart, actually. [15:05] binary on my xsane... [15:06] Anyway, you've made me a happier man :) [15:06] I have to do the school run now. [15:06] soundray: see you later [15:06] Thank you! [15:09] soundray: (anyway, yes, antialiasing is really a way to "approximate" high resolution by spreading information into multiple pixels at a lower resolution. if you start with a high resolution to begin with, you have the whole information, and can then *later* decide to apply *some* form of antialiasing, which is almost certainly better than what's applied by the scanner itself, if you need to move to a lower resolution)
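As a hedged sketch of the workflow LjL describes - scan at high resolution in binary/lineart mode, then rescale with your own antialiasing later - ImageMagick's convert can do the downscale step; the filenames and resolutions here are hypothetical:

  # downscale a 600 dpi 1-bit scan to half size; the resize filter
  # spreads edge information into gray levels (antialiasing)
  convert scan-600dpi.png -resize 50% page-300dpi.png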