[12:13] <astraljava> balloons: Hi, sadly I cannot participate in the roundtable today. I asked knome if he could instead, but he wasn't sure either.
[16:01] <balloons> astraljava,  I missed your message earlier :-( Well it's happening now
[16:05] <balloons> phillw, jriddell, hggdh, jibel, Effenberg0x0, superm1 you guys about?
[16:05] <Effenberg0x0> o/ here
[16:05] <hggdh> ~ô~
[16:05] <balloons> I don't know some of the IRC names of folks
[16:05]  * balloons goes off to look
[16:06] <superm1> howdy balloons
[16:06] <balloons> howdy!
[16:06] <balloons> ok, so we have Effenberg0x0 hggdh and superm1
[16:06] <balloons> let's see who else we can get trickling in here
[16:07] <superm1> tgm4883 should be here too from mythbuntu
[16:07] <tgm4883> I am now
[16:07] <tgm4883> I forgot and walked away to the server room :/
[16:07] <balloons> hello Thomas ;-)
[16:07] <tgm4883> o/
[16:08] <balloons> ok, looks like astraljava and knome can't make it so, I don't think we'll get anyone from xubuntu
[16:08] <balloons> any lubuntu folks here?
[16:09] <balloons> or kubuntu folks?
[16:10] <balloons> ok, well I will record this so I'll launch a meeting
[16:10] <balloons> #startmeeting QA Community Roundtable
[16:10] <meetingology> Meeting started Thu Jun 21 16:10:27 2012 UTC.  The chair is balloons. Information about MeetBot at http://wiki.ubuntu.com/meetingology.
[16:10] <meetingology> Available commands: #accept #accepted #action #agree #agreed #chair #commands #endmeeting #endvote #halp #help #idea #info #link #lurk #meetingname #meetingtopic #nick #progress #rejected #replay #restrictlogs #save #startmeeting #subtopic #topic #unchair #undo #unlurk #vote #voters #votesrequired
[16:11] <patdk-wk> hm?
[16:11] <balloons> Julien and Phil from ubuntu both don't seem to be around ;-(
[16:11] <balloons> checking on kubuntu folks then we'll get started
[16:12] <Effenberg0x0> balloons: you had scheduled 2 hours. Maybe they will still make it
[16:13] <balloons> ok, pm's sent to everyone ;-)
[16:14] <balloons> so, let's start
[16:14] <balloons> [TOPIC] UTAH Demo
[16:14] <balloons> huh, bot doesn't like topics
[16:15] <balloons> well, anyways, first item on the agenda was to demo UTAH
[16:15] <tgm4883> balloons, perhaps #topic
[16:15] <balloons> don't think we can get a demo together, but hggdh can explain a little bit about what it is
[16:15] <balloons> #topic UTAH Demo
[16:15] <superm1> i thought there was going to be a google hangout or something to show it off?
[16:15] <tgm4883> hmm, meetingology apparently is a liar when it comes to available commands
[16:15] <hggdh> no, no demo right now, unfortunately
[16:16] <balloons> superm1, hggdh should be able to demonstrate how VMs are used in automated testing. but sadly, not a UTAH demo atm
[16:16] <superm1> so for now everyone stare at http://www.enchantedlearning.com/usa/states/utah/map.GIF and imagine as hggdh describes :)
[16:16] <balloons> :-)
[16:17] <Effenberg0x0> I like the green. Very vivid.
[16:17] <hggdh> UTAH (Ubuntu Testing Automation Harness) is the new base set we are moving to for automated testing on Ubuntu
[16:18] <hggdh> it will be (when fully implemented) flavour-agnostic, so it can be used for all *ubuntu
[16:18] <hggdh> right now, beginning of development, it only supports VMs (via libvirt)
[16:18] <superm1> is the input it takes an ISO image?
[16:19] <hggdh> yes, it uses ISO images to build the target test environment
[16:19] <superm1> specifically, "desktop" ISO images, not alternate
[16:19] <hggdh> the installation is preseeded, so there is no input required.
[16:19] <hggdh> not only desktop, but also alternate and server
[16:19] <superm1> okay i see
[16:20] <superm1> sorry, i can hold off questions until the end if you would like, i just realized this might be a bit rude to interject
[16:20] <hggdh> (which is to say, ubiquity and debian-installer based installs)
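[The preseeded, no-input installs hggdh describes work by answering the installer's debconf questions from a file. A minimal illustrative fragment in debian-installer preseed syntax — the values and selections here are hypothetical, and real UTAH preseeds will differ per flavour:]

```
# Hypothetical preseed fragment: answers installer prompts so no input is needed
d-i debian-installer/locale string en_US.UTF-8
d-i keyboard-configuration/layoutcode string us
d-i partman-auto/method string regular
d-i partman/confirm boolean true
d-i passwd/user-fullname string Test User
d-i passwd/username string tester
d-i passwd/user-password password insecure
d-i passwd/user-password-again password insecure
```

[Flavours with custom preseeds, as superm1 mentions next, would adjust entries like these to their own package sets and defaults.]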
[16:20] <hggdh> superm1: no, please shoot the questions as we go
[16:20] <hggdh> I do not mind :-)
[16:21] <superm1> Ok. will you have a place to put preseeds in this tool for the different flavours?
[16:21] <hggdh> this, on the other hand, means that you might have to adjust the preseeds to your specific needs
[16:21] <hggdh> yes, we will
[16:21] <superm1> i know at least mythbuntu and ubuntu studio do have custom preseeds
[16:21] <superm1> Ok
[16:22] <hggdh> being able to adjust preseeds is pretty much a requirement if you want to automate tests
[16:22] <superm1> so the unfortunate flaw in doing it this way that comes to mind is that sometimes you will have bugs that are exposed only when the installation is run in interactive mode, or only in automatic mode, etc
[16:23] <superm1> so it can't be a complete replacement to lots of user testing, but instead a valuable supplement
[16:23] <hggdh> UTAH was originally called 'uath'. But we found that (1) nobody could pronounce it, and (2) it means 'fear, horror' in ancient Gaelic
[16:23] <hggdh> generically speaking, automated testing *cannot* replace actual hands-on
[16:24] <hggdh> it is just a way of getting the bits that do not directly depend on user input tested, and out of the way
[16:24] <hggdh> but we still need to have manual installs, and testing
[16:25] <Effenberg0x0> hggdh: Just to be clear, can you mention some cases in which UTAH will raise a flag?
[16:26] <hggdh> yes
[16:27] <hggdh> let's say you are testing upgrades from desktop -- it is easily automated: having an existing (say) Precise install, you run 'sudo do-release-upgrade -d', and use pexpect to drive the answers
[16:28] <hggdh> in this case, we are looking for failures to upgrade -- missing pre-reqs, new package version fails to install, etc
[16:28] <hggdh> of course, we cannot check for positioning of windows, and text visibility in windows, etc. But we will get the "backoffice" errors
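[The upgrade automation hggdh describes boils down to: spawn the upgrader, answer its prompts from a script, then scan the transcript for "backoffice" failures. pexpect adds pattern-matched waiting on a pseudo-terminal on top of this; since `do-release-upgrade` itself can't run here, this stdlib-only sketch drives a hypothetical stand-in prompt to show the pattern:]

```python
import subprocess
import sys

# Stand-in for an interactive upgrader prompt; the real target would be
# 'sudo do-release-upgrade -d' driven via pexpect, as hggdh describes.
CHILD = r"""
import sys
sys.stdout.write("Continue [yN]? ")
sys.stdout.flush()
answer = sys.stdin.readline().strip()
print()
print("upgrade complete" if answer == "y" else "aborted")
"""

def drive_upgrade(answer: str) -> str:
    """Feed a scripted answer to the child's prompt and return its verdict."""
    proc = subprocess.run(
        [sys.executable, "-c", CHILD],
        input=answer + "\n",
        capture_output=True,
        text=True,
        timeout=30,
    )
    # In a real run you would grep this transcript for failed upgrades,
    # missing pre-reqs, packages that fail to install, and so on.
    return proc.stdout.strip().splitlines()[-1]

print(drive_upgrade("y"))  # -> upgrade complete
```

[The same loop generalizes: each expected prompt gets a scripted answer, and anything unexpected in the output is flagged for a human to triage.]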
[16:29] <hggdh> or you are testing a specific package with the testset provided by it (say, mysql, or even coreutils). So you install the image at the version of Ubuntu you want, and run these tests
[16:30] <hggdh> (for coreutils, you need to _build_ the package again, coreutils tests are intermixed with the build process)
[16:30] <hggdh> but you will get errors caused by a change in libraries
[16:31] <superm1> what kind of failures get raised in the upgrade testing?  will bugs get filed?
[16:31] <hggdh> or you want to check on ecryptfs -- we have a set of tests for it, and they are fully automated
[16:32] <hggdh> no bugs are opened automagically. We still need to look at the failures and identify the cause
[16:32] <Effenberg0x0> hggdh: got it. Some debugging skills are needed by a human tester.
[16:33] <hggdh> the point here is to weed out false positives -- errors caused by test code, not by what is actually being tested
[16:33] <hggdh> Effenberg0x0: always
[16:33] <hggdh> when you are testing you have to look for real and false positives and negatives
[16:34] <hggdh> usually, we assume the negatives (which is to say, the expected results) are correct
[16:35] <hggdh> but, every so often, you should look at your "correct" results, and verify they are indeed correct
[16:35] <superm1> there will be some sort of notification mechanism for those flavors interested when things fail and need some further intervention?
[16:36] <hggdh> UTAH has a provision to alert users. You can set it and use it. In our case, most of the UTAH tests will be run via Jenkins (http://jenkins-ci.org/), so we use Jenkins to do the alerting
[16:37] <hggdh> another point we all should be careful about is destructive vs non-destructive tests
[16:37] <hggdh> I personally define a destructive test as a test that unilaterally changes your system configuration
[16:38] <hggdh> look at 'change your system configuration' as 'can destroy your system'
[16:38] <hggdh> it does not matter if it actually completely borks, but it might -- for example -- change the DNS resolution
[16:39] <superm1> will flavours be able to run utah in the canonical jenkins instance, or need to set up their own?
[16:39] <hggdh> on the other hand, this Monday we had an ecryptfs test that actually forces a reboot of the system: a piece of the kernel goes haywire, and I/O to ecryptfs cannot be guaranteed to work until the reboot
[16:40] <hggdh> superm1: I cannot really answer that, sorry. But... we -- right now -- do not have the resources to guarantee test space for the flavours
[16:41] <superm1> OK
[16:41] <superm1> Daviey warned that jenkins is a PITA to get set up
[16:41] <hggdh> so, at least right now, please do not expect we will be able to run tests for other than Ubuntu
[16:42] <Effenberg0x0> hggdh: The typical ISO-Test (Install/Partitioning, writing MBR/Grub, installing/removing drivers/kernel modules like VGA) is handled by UTAH?
[16:42] <hggdh> Effenberg0x0: yes indeed
[16:42] <hggdh> superm1: not really a pita, but it does have a learning curve
[16:43] <hggdh> the easiest way of using UTAH would be to deploy VMs
[16:43] <superm1> can UTAH be run without jenkins then on its own via VM deployments?
[16:43] <hggdh> deploying bare-metal will, of course, require bare-metal and additional packages/networks (so that MAAS, for example, can be deployed)
[16:44] <hggdh> superm1: yes, it can, and this is how I was starting to test it
[16:44] <superm1> ah great
[16:44] <balloons> if I can interject, it would also be good to throw some links at you for this : https://launchpad.net/utah
[16:44] <hggdh> all you need is libvirt and friends
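[For a sense of what "libvirt and friends" manages when UTAH deploys a VM, this is roughly the shape of a libvirt domain definition that boots a guest from an ISO. The names and paths here are hypothetical; UTAH generates its own definitions:]

```xml
<!-- Hypothetical KVM guest booting an installer ISO -->
<domain type='kvm'>
  <name>utah-test</name>
  <memory unit='MiB'>2048</memory>
  <vcpu>1</vcpu>
  <os>
    <type arch='x86_64'>hvm</type>
    <boot dev='cdrom'/>
  </os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/utah-test.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <disk type='file' device='cdrom'>
      <source file='/srv/isos/precise-desktop-amd64.iso'/>
      <target dev='hdc' bus='ide'/>
      <readonly/>
    </disk>
  </devices>
</domain>
```

[A definition like this is loaded with `virsh define` and booted with `virsh start`; the preseeded install then runs unattended inside the guest.]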
[16:44] <hggdh> balloons: thank you, did not really have time to prepare
[16:44] <superm1> so really need to just check it out and start playing to see where questions crop up
[16:44] <balloons> there's a wiki off that page with more info and a picture
[16:45] <hggdh> (and I am, right now, testing a new set of kernel SRU tests, and the KVM I am using is driving me nuts, popping up on my monitor every time the machine thinks about doing something)
[16:45] <balloons> additionally, if you have further specific questions once you're playing with it, there's a mailing list set up for it now: ubuntu-utah-dev@lists.ubuntu.com
[16:46] <hggdh> superm1: yes. Not only to learn, but to tell us where we, ah, did something wrong
[16:46] <superm1> great, thanks!  just need to find some time to actually use it and experiment now :)
[16:47] <hggdh> the code resides at...
[16:47] <balloons> it's linked above hggdh https://launchpad.net/utah, lp:utah
[16:48] <hggdh> heh
[16:48] <Effenberg0x0> hggdh: How do we prioritize/filter UTAH-Testing bug reports, amidst the constant flow of new bug reports on LP (correct/incorrect ones) everyday?
[16:48] <hggdh> so please bzr branch, and play -- or install the daily utah package from the PPA
[16:49] <hggdh> Effenberg0x0: what we are doing internally, is to tag all bugs we find (via whatever test process, including manual testing) as a qa finding
[16:49] <balloons> fyi, there's a daily and stable ppa.. you can find them here: https://launchpad.net/~utah
[16:50] <balloons> link to stable: https://launchpad.net/~utah/+archive/stable
[16:50] <Effenberg0x0> hggdh, OK
[16:51] <balloons> ok anymore questions on utah for hggdh ?
[16:53] <balloons> alright if not, next up is testcase management
[16:53] <balloons> which you get to listen to me for ;-)
[16:53] <balloons> I'll include visuals..
[16:53] <balloons> I'm wondering if it makes sense for me to type it or speak it
[16:54] <balloons> I'm concerned if I only have the video it won't make the log
[16:54] <balloons> I can screenshare and type to you all, or you can view a live feed of me speaking with my desktop :-)
[16:55] <balloons> let's try the type and view
[16:55] <balloons> http://www.screenleap.com/ubuntuqa
[16:56] <balloons> hopefully everyone can see my screen now?
[16:56] <Effenberg0x0> OK here balloons
[16:56] <superm1> yup
[16:56] <cariboo907> OK here too
[16:56] <balloons> As you know the qatracker just went through an update to bring testcase management
[16:57] <balloons> I'm going to show you how it works from a behind the scenes admin perspective
[16:57] <balloons> we'll use the staging site on dev.stgraber.org
[16:58] <balloons> so I have a couple products laid out here, mimicking the iso.qa.ubuntu.com tracker
[16:58] <balloons> testcases and pages look pretty similar, except for the addition of the extra links
[16:59] <balloons> clicking on a test (http://packages.qa.dev.stgraber.org/qatracker/milestones/224/builds/16281/testcases/1305/results) you can see there are some new links, and the testcase is now included inline
[16:59] <balloons> hi highvoltage!
[16:59] <highvoltage> hi balloons :)
[16:59] <balloons> http://www.screenleap.com/ubuntuqa
[16:59] <balloons> you can watch and follow along
[16:59] <balloons> ok, so that's nice having the testcase in there
[17:00] <balloons> you'll also notice the boilerplate text on the bottom for submitting your result and filing a bug
[17:00] <balloons> the bug reporting instructions are currently set at a product level
[17:01] <balloons> the testcases are defined and then grouped into testsuites which can then be used by any product
[17:01] <superm1> product meaning "ubuntu" "mythbuntu" etc?
[17:01] <balloons> superm1, yes
[17:01] <balloons> or, a package
[17:01] <balloons> like the calls for testing of the kernel
[17:01] <balloons> I'll show that quickly
[17:02] <balloons> you can see we've had a call for testing for the kernel, and it's had 2 versions
[17:02] <balloons> silly me called the 3.4 version precise1, because I was still learning the tool
[17:02] <balloons> so, there's one testcase in here, smoke test
[17:03] <balloons> similar boilerplate text
[17:03] <balloons> we're looking at (http://packages.qa.dev.stgraber.org/qatracker/milestones/223/builds/16283/testcases/1301/results)
[17:03] <balloons> filing a bug on this package has some further instructions than normal, and includes a link
[17:03] <balloons> that link is tagging the bug as well
[17:04] <balloons> if we look at installation instructions for the package, we get a howto install and howto uninstall
[17:05] <balloons> finally the detailed information on the testcase has the history of what the testcase looked like
[17:05] <balloons> this is useful for when we update a case, but have old results
[17:05] <balloons> the results and case will match up
[17:05] <balloons> So you can see some of my earlier attempts and playing with formatting, etc
[17:06] <balloons> ok, so let's take a look at the admin side of things
[17:07] <balloons> As you can see, we allow you to define a template for new testcases
[17:07] <balloons> heh, I changed it a bit
[17:07] <superm1> does that mean we need to work through an admin like you to make our own test cases?
[17:07] <balloons> err, actually.. there
[17:08] <balloons> the template has gone thru a couple revisions before going live
[17:08] <balloons> superm1, no it doesn't mean you'll need me to help you make testcases
[17:08] <balloons> we'll get to that in a min ;-)
[17:09] <balloons> ok, so anyways, that's the example template for a testcase as decided upon last dec by the qa community
[17:09] <balloons> it could change of course if we decided to change it.. and if so, we could update the template at that time
[17:09] <balloons> :-0
[17:09] <balloons> ok, so let's look at the testcases quickly
[17:10] <balloons> you can see my smoke test mostly follows the template -- barring it has no expected results.
[17:10] <balloons> but the admin side, you simply title it, and add the testcase
[17:10] <balloons> notice we do use some html / drupal markup
[17:10] <balloons> some is allowed and we use it to help display as well as for machine parsing
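[To illustrate the kind of markup balloons means: steps and expected results in a testcase are marked up so the tracker can display them and tools can parse them. This is a hypothetical sketch only — the tracker's admin guide documents the actual conventions:]

```html
<!-- Hypothetical testcase body: steps as <dt>, expected results as <dd> -->
<dl>
  <dt>Boot the ISO and select "Try Ubuntu"</dt>
    <dd>A live session starts and the desktop loads</dd>
  <dt>Launch the installer from the desktop icon</dt>
    <dd>The installer opens at the language selection screen</dd>
</dl>
```

[Keeping the markup regular like this is what lets the tracker render the same testcase inline on every result page.]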
[17:11] <balloons> here's an example of an isotest testcase
[17:12] <balloons> this wiki page http://testcases.qa.ubuntu.com/Install/DesktopFree has been converted into the testcase you see
[17:12] <balloons> http://packages.qa.dev.stgraber.org/qatracker/milestones/224/builds/16281/testcases/1306/results
[17:12] <balloons> ok, so that's testcases more or less
[17:12] <balloons> now, we can organize testcases into testsuites
[17:13] <balloons> you see the 'free software only' testcase is part of the ubuntu desktop suite
[17:13] <balloons> well.. 'ubuntu desktop extras' actually :-)
[17:13] <balloons> there are 3 tests defined in here you can see
[17:13] <balloons> I can set the order, weight and status as expected
[17:14] <balloons> now, let's go assign our testcases/testsuites to a product
[17:15] <balloons> you can see I've linked both the 'ubuntu desktop' and 'ubuntu desktop extras' testsuites to the ubuntu desktop amd64 product
[17:15] <balloons> for precise
[17:15] <balloons> now, some of this may be rather foreign to all of you, but there is a guide to this interface which I'll link to at the end
[17:16] <balloons> the new stuff I am showing you is not yet in that guide. since, it's new (as in this week new :-) )
[17:16] <balloons> now to answer superm1's question a little bit, there is a new role to the admin interface
[17:17] <balloons> we will have a testcase admin role which will allow you to define and manage the testcases
[17:17] <balloons> I've setup a team on lp to do this
[17:17] <balloons> anyone who is on the team will have access to manage testcases
[17:17] <balloons> https://launchpad.net/~ubuntu-testcase is the team
[17:18] <balloons> Phil graciously agreed to help trial out this new interface
[17:18] <balloons> you'll notice the team is restricted
[17:18] <balloons> my goal is not to prevent anyone from contributing, but some control is needed
[17:19] <balloons> for anyone running the tests, I would like them to be able to suggest new tests or improvements by filing a bug against ubuntu-qa
[17:19] <balloons> like any other community, make a few contributions and it will make sense to make you an admin should you wish to be
[17:20] <balloons> this is all VERY new, so I'd love input from all of you on how to shape this
[17:20] <balloons> suffice to say, the flavors should all have at least one person who has this access
[17:20] <superm1> that's what i was just going to say
[17:20] <superm1> i'm glad tgm4883 volunteered for mythbuntu
[17:20] <balloons> :-)
[17:20] <mrand> hahaha
[17:21] <balloons> I'd like everyone on the team to also make sure the tests stay maintained and not fall into dis-use or out of date
[17:22] <balloons> so it's a bit of responsibility, but not too much I don't think
[17:22] <balloons> more or less if you're active in testing, you would be doing / have done this anyway
[17:22] <balloons> so questions?
[17:23] <Effenberg0x0> Balloons: Doesn't it sort of overlap with UTAH?
[17:23] <balloons> besides the testcase management piece, the new qatracker should be able to support all of our testing needs (insomuch as we want to have a test and record results)
[17:23] <balloons> it's my hope we can consolidate what we're doing by using it
[17:23] <balloons> Effenberg0x0, how so?
[17:23] <balloons> This is intended for manual testing, UTAH is intended for automated testing
[17:24] <balloons> of course, over time we will continue to close the gap on that.. how/where the results get recorded, etc
[17:24] <Effenberg0x0> I know, I mean some automated tests might kill the need for some manual test-cases
[17:24] <Effenberg0x0> How to keep things paired up
[17:25] <balloons> Effenberg0x0, my longer term view is to automate away everything that makes sense
[17:25] <balloons> the pairing up if you will of what is automated vs manual happens via our communication
[17:26] <Effenberg0x0> Ok, so the community looks at test-cases and define what's still valid or not, got it
[17:26] <balloons> this cycle once we have migrated all of our pre-existing tests over I would like to see us take a review of them
[17:26] <balloons> I took a work item personally to review the testcases for iso testing to ensure they make sense, are needed, etc
[17:26] <balloons> but yes, as a community, on an on-going basis, we should help frame what is needed for manual testing and where it makes sense for us to help
[17:27] <balloons> example of this is the kernel tests. you notice the "smoke tests" are rather simple and light
[17:27] <balloons> the intense tests are being automated by hggdh and the canonical and kernel teams
[17:27] <balloons> the manual testing piece of that is getting it out to many more different workloads and bits of hardware
[17:28] <Effenberg0x0> Ok
[17:28] <balloons> anything else?
[17:29] <balloons> If not we'll move on to the next piece and I'll shut down the screenshare for the time being
[17:30] <balloons> ok, so we talked about the new tracker and testcases
[17:30] <balloons> briefly I wanted to mention milestones.. Kate asked me to remind the flavors that you can skip milestones
[17:30] <balloons> but if you do commit to doing a milestone she asks you do it with full force.. aka, you see it through to the end ;-)
[17:31] <balloons> feel free to interrupt me at any time btw
[17:31] <balloons> next up we wanted to discuss collaboration
[17:31] <superm1> i haven't kept up with the thread about abolishing milestones, but what if that happens?
[17:32] <superm1> have you thought about how the tracker would scale for that?
[17:32] <balloons> superm1, yes, that thread has grown and been quite a discussion
[17:32] <balloons> from a tools point of view, our "milestones" can remain the same.. We can run nothing but dailies for iso testing if needed, and calls for testing are already of variable length
[17:33] <balloons> as far as what's going to happen, I am not completely sure as it's still being discussed
[17:34] <balloons> however, from a ubuntu perspective we're being asked (again, we were asked at UDS / before UDS) to test more regularly; meaning not just at milestones
[17:35] <balloons> from that perspective nothing in theory has changed.. but the cadence of exactly when we do this "regular" testing is being suggested to be 2 weeks
[17:35] <superm1> i see, okay
[17:35] <balloons> the schedule we adopted after UDS was pretty much the same.. about every 2 weeks
[17:35] <superm1> and i understand that flavours can follow the cadence they would like in this regime too
[17:36] <balloons> yes, of course flavors can choose to follow the cadence or not
[17:36] <balloons> I recommend adopting a cadence that works for the flavor
[17:36] <balloons> considering the devs, testers, etc
[17:36] <balloons> hence, adopting every ubuntu milestone doesn't always make sense..
[17:36] <GridCube> (i could speak very very unoficially for xubuntu)
[17:36] <balloons> I liked the LTS only approach some flavors have thought about as well
[17:37] <balloons> GridCube, hello :-)
[17:37] <GridCube> :)
[17:38] <balloons> on collaborating, part of us all getting together is to talk about needs we might have and how we can work together to benefit each other
[17:38] <balloons> maintaining testcases in mutual fashion and sharing them across flavors as it makes sense, is one such example
[17:39] <balloons> but I also think we can do things like collaborated calls for testing (like we have done with the kernel testing), or on specific packages that multiple flavors use, like firefox
[17:39] <balloons> whatever it is.. floor is open to whomever has a need or idea for collaboration
[17:40] <GridCube> o/
[17:40] <balloons> yes GridCube
[17:40] <balloons> brb, type away :-)
[17:41] <GridCube> hello, let me introduce myself: i'm a bug tester and support collaborator for xubuntu, i've been so for over a year now, never participated here tho.
[17:42] <GridCube> i proposed a small change to the qa tracker a while ago: that test cases should have some sort of area where all the reported bugs for that particular case on previous days could be seen
[17:42] <tgm4883> wait, what
[17:43]  * tgm4883 scowls at superm1 
[17:43] <superm1> ;)
[17:43] <balloons> that delay!
[17:43] <balloons> epic
[17:43] <balloons> GridCube, this exists: http://iso.qa.ubuntu.com/qatracker/reports/defects
[17:44] <balloons> additionally when you look at a testcase, it has a list of previously reported bugs
[17:44] <balloons> but only for that build I believe
[17:45] <GridCube> only for that build
[17:45] <balloons> regardless, it's worth filing a bug to discuss how it might look
[17:45] <balloons> I like the idea
[17:45] <GridCube> the wishlist bug is already there
[17:45] <GridCube> im trying to find it
[17:45] <balloons> GridCube, ahh, ok, point it out to me.. I'll subscribe
[17:46] <GridCube> ok
[17:47] <GridCube> give me a sec
[17:47] <balloons> ok, so anything else.. This last piece is general Q & A time. hggdh is still around (though dealing with his flailing computer) feel free to ask any more questions
[17:47] <GridCube> https://bugs.launchpad.net/ubuntu-qa-website/+bug/994816
[17:48] <GridCube> :)
[17:48] <balloons> it appears like that was implemented?
[17:48] <balloons> ahh I think you mean https://bugs.launchpad.net/ubuntu-qa-website/+bug/994812
[17:49] <GridCube> ahm, it was like "granted" but no one actually implemented it
[17:49] <balloons> I subbed.. I'll look into it.
[17:49] <GridCube> balloons, yes those two go together :)
[17:49] <balloons> thanks for the suggestion
[17:50] <GridCube> no problem, i just thought it would make reporting and testing easier, because people would know what to look at
[17:50] <hggdh> I am here
[17:51] <balloons> ok, so if there's no more questions we can discuss the testcase management stuff quickly.
[17:51] <balloons> The old wiki testcases need converting, of which I have done a couple.. In addition to converting, I placed them into the new templated format
[17:52] <balloons> you can see everything in powerpc and amd64+mac has the new testcase format
[17:52] <balloons> http://iso.qa.ubuntu.com/qatracker/milestones/219/builds/17586/testcases
[17:52] <balloons> I take that back, heh, 'Live Session' doesn't
[17:53] <balloons> So I will be going through and doing the same with the other testcases used by the ubuntu iso's
[17:53] <balloons> however, many of those testcases are used by the flavors as well
[17:53] <balloons> for instance, the xubuntu desktop i386 testcases
[17:54] <balloons> they use those same 4 tests
[17:54] <balloons> the only one not converted is the wubi (windows installer) testcase
[17:55] <balloons> convert that one and you can convert that iso over too
[17:55] <balloons> of course, the flavors can then pick and choose which tests to keep in common and which to write for themselves
[17:55] <balloons> you can pull any testcase you wish and include it with any other combination of testcases to go into a testsuite
[17:56] <balloons> the examples I gave that are live actually are 2 testsuites, so that even the testsuites can be shared across multiple isos
[17:57] <balloons> so for example, the xubuntu desktop i386 iso can use the 'ubuntu desktop' testsuite which contains all 4 testcases it's using, barring the wubi testcase
[17:57] <balloons> write the wubi testcase and then add it to a testsuite and include it on the iso
[17:57] <balloons> I trust that all makes sense :-)
[17:58] <balloons> So, please ping me to get access to help out in this area. Your isos and testcases will remain usable as-is (you can see the legacy mode on the tracker), but can be converted now at any time
[17:59] <balloons> let me know who is interested in joining this new team and I'll help get them going on how to use the tool, etc
[18:00] <balloons> and with that I think we're done. ;-) Thanks to you all for coming out. I appreciate your time. And thanks for suggesting we meet during the cycle. I think it was helpful
[18:00] <balloons> #endmeeting
[18:00] <meetingology> Meeting ended Thu Jun 21 18:00:10 2012 UTC.
[18:00] <meetingology> Minutes (wiki):        http://ubottu.com/meetingology/logs/ubuntu-testing/2012/ubuntu-testing.2012-06-21-16.10.moin.txt
[18:00] <meetingology> Minutes (html):        http://ubottu.com/meetingology/logs/ubuntu-testing/2012/ubuntu-testing.2012-06-21-16.10.html
[18:00] <Effenberg0x0> Thanks balloons
[18:00] <balloons> I'll post the log on the qa mailing list for reference.. Thanks everyone
[18:03] <hggdh> thank you balloons