[14:55] <jasoncwarner> hi everyone. I just updated the hangout instructions for those joining via hangout. if you have the link, let me know if things are working. I'll start the broadcast at the top of the hour (6 minutes)
[15:03] <tedg> https://blueprints.launchpad.net/hud/+spec/hud-20-client-base-libs
[15:09] <Satoris> Let's just use troff markup. It is the standard way, after all.
[15:13] <sladen> the "magic" noise-cancellation is all done with dual microphones
[15:13] <sladen> and differentials
[15:14] <sladen> put what in Multiverse?
[15:15] <tsdgeos> julius i think
[15:17] <tedg> sladen, Yes, julius is in multiverse, and we'd have to put hud-listen-julius there too
[15:18] <pete-woods> https://launchpad.net/~pete-woods/+archive/sphinx/+packages
[15:21] <pete-woods> http://www.voxforge.org/home/read
[15:21] <tedg> http://www.voxforge.org/home/read
[15:21] <tedg> https://blueprints.launchpad.net/hud/+spec/hud-20-voice
[15:23] <tedg> https://blueprints.launchpad.net/hud/+spec/hud-20-service
[15:35] <tedg> https://blueprints.launchpad.net/ubuntu/+spec/client-1303-unity-hud-2-ui
[15:36] <tsdgeos> my interwebs is running crazy
[15:36] <tsdgeos> 1sec lag
[15:39] <pete-woods> that's better than my normal ping :p
[15:39] <udsbotu> uds-client-2: 5 minutes left in this session!
[15:40] <udsbotu> uds-client-2: 4 minutes left in this session!
[15:41] <udsbotu> uds-client-2: 3 minutes left in this session!
[15:42] <udsbotu> uds-client-2: 2 minutes left in this session!
[15:45] <udsbotu> uds-client-2: This session has ended.
[15:45] <jasoncwarner> 10 minutes left
[15:46] <jasoncwarner> actually, done :/
[15:57] <jasoncwarner> Hi everyone. Session starts in 2 minutes.
[16:01] <jasoncwarner> session is live...
[16:02] <jasoncwarner> have about 45 minutes
[16:02] <jasoncwarner> feel free to ask questions
[16:02] <lool> may I get the hangout URL?
[16:03]  * bryce waves
[16:03] <tsimpson_> the link for the video/etherpad is in the topic
[16:03] <Lockyz> Waves
[16:07] <lool> http://summit.ubuntu.com/uds-1303/meeting/21689/client-1303-hw-video-decode-rendering-support/
[16:07] <lool> (topic has a truncated URL it seems)
[16:07] <slomo> jhodapp: considering your constraints it is definitely a good idea to continue to use stagefright for accessing the device codecs (for the reasons you mentioned)... however writing a gstreamer plugin around stagefright is absolutely no problem, we're doing that (very similar) in the gstreamer sdk on android (just using a layer on top of stagefright, the java mediacodec api)
[16:09] <lool> is the GStreamer Android SDK what you get from building as described on http://docs.gstreamer.com/display/GstSDK/Installing+for+Android+development ?
[16:09] <slomo> lool: yes
[16:11] <lool> jhodapp: slomo works on GStreamer upstream ^ in case you want to shoot questions
[16:11] <lool> (hey slomo!)
[16:11]  * lool hugs slomo 
[16:12] <slomo> lool: hi :) i already talked to jim earlier :)
[16:12] <lool> perfect  :)
[16:12] <slomo> tvoss: gstreamer is not like openmax il, it's more like a combination of al and il
[16:13] <tvoss> slomo, sure, I just wanted to point out that there is openmax and roughly map it to gstreamer
[16:15] <ptl> How does the flip between software and hardware decoding work? Is it automatic within the API, or through it?
[16:15] <ptl> does gstreamer also deal with 3D/OpenGL?
[16:16] <kgunn> ptl: in mobile anyway...hwa is default, if it's not present it falls back to sw
[16:16] <slomo> ptl: short answer would be yes
[16:16] <slomo> ptl: however you also can have full control over all that if you want to
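[Editor's note] The hardware-first behaviour kgunn and slomo describe can be sketched as plain selection logic. This is an illustrative model only, not any real GStreamer or Android API; the function name is invented:

```python
# Illustrative model of "hwa is default, falls back to sw":
# prefer a hardware decoder when one is present, else use software.
def pick_decoder(available):
    """Return the preferred decoder kind from the available set."""
    for kind in ("hardware", "software"):
        if kind in available:
            return kind
    raise RuntimeError("no decoder available")

print(pick_decoder({"hardware", "software"}))  # hardware
print(pick_decoder({"software"}))              # software
```

slomo's point that callers "can have full control" corresponds to skipping this default and requesting a specific decoder explicitly.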
[16:17] <Ford_Prefect> So dalvik is definitely not going to be available? One concern was that libstagefright can change any time Google feels like it
[16:17] <ptl> thanks
[16:17] <jhodapp> slomo, can you describe for everyone a bit more about how Gstreamer SDK on Android works?
[16:17] <Ford_Prefect> That might be a maintenance concern in the long term
[16:17] <kgunn> ptl i know of at least one impl of gstreamer that TI did to get texture streamed video frames
[16:17] <Ford_Prefect> (my question might make more sense in light of slomo's response)
[16:17] <slomo> jhodapp: basically it's a gstreamer "distribution" for android, including plugins to access the device's codecs, including sinks for audio and video for android
[16:18] <slomo> jhodapp: it just provides the normal gstreamer api to applications, and includes everything you need for using it on android
[16:19] <Ford_Prefect> video: egl, audio: opensles, codecs: wraps the java MediaCodec API
[16:19] <slomo> jhodapp: all the relevant code of it is also in gstreamer upstream. and you should be able to build upon many of these things (e.g. you could re-use the opensles audio sink, the egl/glesv2 video sink)
[16:19] <slomo> for accessing the device's codecs you would need to write something yourself that wraps stagefright or something else
[16:20] <slomo> because what we do is going through the java layer (because that's the only public API on android that actually allows access to the codecs)
[16:21] <slomo> tvoss: alternative: let gstreamer talk to your service, and let applications use gstreamer?
[16:21] <tvoss> slomo, yup, that's what I'm trying to come down to
[16:21] <slomo> tvoss: gstreamer is not like openmax il, it's not necessarily the lower layer directly on top of the hardware
[16:22] <kgunn> bfiller: right Qt is just command
[16:22] <tvoss> slomo, yup, so we would leverage gstreamer on the service side and on the app side
[16:22] <slomo> tvoss: ack
[16:23] <slomo> bfiller: note that qtmultimedia also has a gstreamer backend, which could make things easier for you
[16:26] <willcooke> do we expect that, when speaking with content owners, the availability of a "known good" DRM'd video playback service (i.e. from Android) will make those conversations easier?
[16:27] <tvoss> slomo, bfiller what api do we want to expose to html5/js? surely not qt?
[16:27] <Ford_Prefect> For whoever's editing the whiteboard: you typically will not see audio from GSM on the CPU
[16:27] <slomo> tvoss: the normal html5 media api?
[16:27] <slomo> tvoss: are you planning to use webkit btw?
[16:28] <Ford_Prefect> But the use-case is valid if you s/GSM/VoIP
[16:28] <ptl> I see that stagefright is going to be supported for now, but what is the long-term plan for not depending on it anymore? I mean, what are the migration paths and strategies?
[16:28] <tvoss> slomo, @api, sure, but someone needs to implement it
[16:29] <slomo> tvoss: well, use webkit and you're done ;) webkit has a very good gstreamer backend already that works fine on different platforms
[16:29] <Ford_Prefect> I don't believe that audio codecs are typically h/w accelerated
[16:30] <pmcgowan> agree
[16:30] <rsalveti> not at android anymore
[16:30] <slomo> yeah, they're definitely not on most socs
[16:30] <rsalveti> afaik they are all using software decode now
[16:30] <pmcgowan> tvoss: we should ensure whats working now with qtwebkit
[16:30] <rsalveti> at least latest android versions
[16:30] <tvoss> pmcgowan, sure, just curious
[16:32] <slomo> tvoss, pmcgowan: yeah, qtwebkit is one of the webkit ports that works fine with gstreamer (the only other possible backend currently is qtmultimedia, which itself could also use gstreamer)
[16:33] <lool> rsalveti: should I hop in the client-1 one?
[16:34] <rsalveti> lool: client-1 is done
[16:34] <lool> wow, good
[16:35] <kgunn> lool: to be clear, stagefright is still there as well?
[16:35] <slomo> lool: why wouldn't you expose gstreamer to apps though? i understand the reason for an abstraction (make it very simple for apps to do a few use cases), but what if someone wants to do something more complex you didn't consider?
[16:35] <kgunn> lool: otherwise you lose the known-good working drm plugins available on android
[16:36] <slomo> kgunn: not having stagefright on android would be silly imho
[16:36] <lool> kgunn: apparently stagefright is up for discussion on how we reuse it (listed as research)
[16:36] <kgunn> slomo: agreeing w u
[16:36] <lool> slomo: maybe there's a case to be made to have GStreamer as a high level API *and* a low level implementation
[16:36] <pmcgowan> I assume we don't have access to source for the drm pieces, so could we use it?
[16:37] <kgunn> http://developer.android.com/reference/android/drm/package-summary.html
[16:37] <lool> slomo: the main reason I framed it as not wanting to expose it comes from a prior session where we wanted to hide the implementation
[16:37] <kgunn> pmcgowan: ^
[16:37] <lool> but then GStreamer is also pretty good at hiding the implementation
[16:37] <slomo> lool: makes sense
[16:38] <pmcgowan> kgunn: makes sense to use the framework if we can
[16:38] <slomo> tvoss: short answer would be yes, it could wrap around drm frameworks (and people do that)
[16:39] <tvoss> slomo, interesting, that's good news. How do you account for restrictions where content is not allowed in ram?
[16:40] <kgunn> pmcgowan: drm typically hw accel too...probably going to need to reuse (to be practical)
[16:40] <slomo> tvoss: you could for example only pass some kind of "drm handle" through gstreamer... and code could then access that via drm framework specific API
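[Editor's note] slomo's "drm handle" idea can be sketched as buffers that carry an opaque token instead of plaintext sample data. Every name below is hypothetical, invented purely for illustration:

```python
# Hypothetical sketch: protected buffers hold only an opaque handle,
# so decrypted content never sits in ordinary RAM.
class ProtectedBuffer:
    def __init__(self, handle):
        self.handle = handle   # opaque token from the DRM framework
        self.data = None       # deliberately never populated

def drm_sink_render(buf, resolve):
    # 'resolve' stands in for the framework-specific API that turns a
    # handle into rendered output without exposing the raw bytes.
    return resolve(buf.handle)

fake_framework = {42: "rendered-frame"}
buf = ProtectedBuffer(handle=42)
print(drm_sink_render(buf, fake_framework.get))  # rendered-frame
```

As slomo notes later, the same indirection (memory not directly accessible, only through another API) is also common for hw-accelerated video buffers.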
[16:40] <pmcgowan> do specific implementations use the framework, like a netflix client? assume so
[16:40] <tvoss> slomo, ack, do you have some public example for that available?
[16:40] <kgunn> pmcgowan: i believe so
[16:41] <slomo> tvoss: we're talking about drm, do you expect people to make that open source ;)  however i could point you to some code that does something very similar (memory not directly accessible, only through another API)
[16:41] <slomo> tvoss: you need the latter very often for hw accel video playback too
[16:42] <tvoss> slomo, just curious :)
[16:43] <slomo> tvoss: (detail: in gstreamer 0.10 you'll need something hackish for this, in 1.0 we added things to make this very clean to implement)
[16:44] <ogra_> 0.10 is in universe (or on its way there at least)
[16:44] <tvoss> ogra_, what about 1.0?
[16:45] <ogra_> that's been in for a while, ask Laney
[16:45] <slomo> 1.0 is there too (fwiw, i'm doing the debian packages for gstreamer)
[16:45]  * ogra_ is actually following the rolling release session, just jumping IRC channels on the side 
[16:48] <slomo> jhodapp: don't focus too much on the gstreamer *SDK* part here, that's not very useful for you because it's only a gstreamer distribution and you'll have your own distribution
[16:48] <jhodapp> slomo, good point
[16:49] <slomo> jhodapp: for exposed API... imho there's a middle point between the two you have right now: plug what is available together in any possible way (and by that implementing new use cases, but not implementing new sources/sinks/filters)
[16:51] <pmcgowan> oh heck
[16:51] <rsalveti> pmcgowan: got some work items for you :-)
[16:51] <pmcgowan> sure drm, great
[16:51] <rsalveti> haha :-)
[16:52] <lool> :-)
[16:52] <Saviq|UDS> pmcgowan: it's not like it's the first time ;)
[16:52] <pmcgowan> ha nope
[16:53] <lool> tvoss: I think you'd want to lead the API question with me participating, ok with you?
[16:53] <tvoss> lool, ack, sorry for being distracted
[16:55] <Saviq|UDS> ubuntu-platform@?
[16:55] <lool> jhodapp: have requested creation of ubuntu-platform@l.u.c already
[16:55] <lool> Saviq|UDS: correct
[16:55] <lool> for API discussions
[16:55] <lool> (took that action in platform API session earlier)
[16:55] <lool> jhodapp: are you copying back the pad notes into the bp?
[16:56] <jhodapp> lool, yeah, I'll be doing that
[16:56] <lool> thanks
[16:56] <jhodapp> lool, and thanks for that creation request
[16:56] <lool> np; thanks for leading an interesting session!
[16:56] <rsalveti> thanks all
[16:56] <jhodapp> lool, cool, glad you enjoyed it
[16:57] <lool> it seems we have a ton of followup work and chats on this one  :)
[16:57] <jhodapp> yes, it's an important and large topic
[16:57] <jhodapp> can get quite complex too
[16:59] <slomo> lool, jhodapp: so if you guys have any questions, feel free to talk to me... also including all parts of your work items list, that list looks very familiar ;)
[16:59] <jhodapp> slomo, hehe, yes :)
[17:03] <slomo> jhodapp, lool: do you have an irc channel for this too?
[17:04] <jhodapp> slomo, we have the general ubuntu-touch one right now, but it may be a good idea to make a more specific one for media
[17:04] <jhodapp> slomo, I'll let you know
[17:06] <slomo> jhodapp: ok, thanks
[17:06] <jhodapp> slomo, thanks for your participation in the session
[17:06] <tvoss> slomo, yeah, thanks for being in the session
[17:11] <lool> slomo: Might make sense to have one or not; not sure
[17:12] <lool> would like to avoid fragmentation if we can
[17:12] <lool> but then #ubuntu-touch is a bit busy
[17:12] <lool> I'll defer to jhodapp to decide on this
[17:18] <slomo> lool: ok, i'll just stay there for the time being then :)
[17:18] <lool> that channel is definitely going away after UDS  :-)
[17:47] <jasoncwarner> Hi everyone. session starts in 1/2 hour.
[18:14] <jasoncwarner> hi everyone...going to start here very soon!
[18:14] <rsalveti> Ford_Prefect: hey!
[18:14] <Ford_Prefect> o/ rsalveti
[18:14] <jasoncwarner> if you have questions, make sure you ping me on IRC to bring to my attention
[18:14] <TheMuso> Urm, how do I unmute myself?
[18:14] <TheMuso> i.e what am I looking for visually?
[18:14] <TheMuso> Since this is mostly inaccessible...
[18:15] <diwic2> TheMuso, red icon upper right
[18:15] <rsalveti> TheMuso: there's a mic icon at the top, right after the hangout title
[18:16] <zyga-uds> is the video rolling yet?
[18:16] <nuclearbob> nope
[18:18] <Saviq|UDS> rsalveti: can't hear you at all
[18:18] <Saviq|UDS> rsalveti: audio is completely robot-like
[18:18] <Saviq|UDS> anyone else?
[18:18] <ptl> "This live event will begin in a few moments." - no image/audio
[18:18] <ptl> Let me reload.
[18:18] <diwic2> sounds like a lack of bandwidth problem
[18:19] <nuclearbob> yeah
[18:19] <Saviq|UDS> \o/
[18:19] <ptl> I can hear ricardo now
[18:19] <Ford_Prefect> ptl: should be rolling now
[18:19] <lool> issues with audio in an audio stack session...
[18:19] <Ford_Prefect> :D
[18:25] <ChickenCutlass> awe_: yes the data goes over binder
[18:25] <lool> will we have an API for (re-)routing audio?
[18:26] <lool> I guess we'll abstract away things like sound input/output
[18:26] <lool> but there might be need to change e.g. how devices are routed
[18:26] <zyga-uds> QUESTION: is that the direction we want to work towards for the long term? If so, does that affect any existing desktop apps as far as expecting the audio stack to work as it does on a non-android kernel?
[18:26] <ChickenCutlass> the problem with getting rid of audio flinger is the kernel drivers do not support them
[18:27] <ChickenCutlass> I mean the audio drivers do not support everything needed for alsa
[18:27] <ptl> so isn't it a case of just improving the features of the given audio drivers?
[18:28] <lool> other question (from hw decode session): how do we handle sync between audio and video?
[18:29] <tanuk_> Is the hardware configuration written for HAL or AudioFlinger? (afaik AudioFlinger works on top of HAL, but I don't know if AudioFlinger needs any hw-specific configuration)
[18:31] <lool> I was thinking of e.g. you press the speaker output button, audio gets to the speaker
[18:31] <lool> one needs to be able to do this from apps
[18:31] <lool> or you are implementing a conferencing app and want to send audio either to the speaker, or to the headset
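[Editor's note] A minimal sketch of the per-stream routing call lool is asking for. The class, method, and output names are invented and are not part of any existing Ubuntu or PulseAudio API:

```python
# Hypothetical routing API: an app (e.g. a conferencing app) retargets
# one of its streams between the speaker and the headset.
class AudioRouter:
    OUTPUTS = {"speaker", "headset", "earpiece"}

    def __init__(self):
        self.routes = {}              # stream name -> current output

    def route(self, stream, output):
        if output not in self.OUTPUTS:
            raise ValueError(f"unknown output: {output}")
        self.routes[stream] = output

r = AudioRouter()
r.route("conference-call", "speaker")  # user presses the speaker button
r.route("conference-call", "headset")  # user flips it back
print(r.routes["conference-call"])     # headset
```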
[18:33] <lool> diwic: Do we have tests that we can use to validate whether drivers are good enough for pulseaudio?
[18:35] <diwic2> lool: test suites are never complete, unfortunately
[18:36] <lool> awe: rsalveti: Could one of you two relay the concerns from hw decoding support and sync issues that we need to support?
[18:36] <lool> also, routing for voice (modem) stack
[18:36] <lool> (I am jumping between video streams, so not necessarily following everything which is being said unfortunately)
[18:40] <Saviq|UDS> QUESTION: what about bluetooth? is HFP not going through CPU either?
[18:42] <rsalveti> lool: modem audio usually goes via the modem directly
[18:42] <rsalveti> and don't know if we'd have any issue with hw decode support, at least Ford_Prefect didn't have any issue when replacing audioflinger with pulseaudio
[18:42] <rsalveti> as the media service will just end up using the audioflinger api, which will go via libpulse in this case
[18:43] <ptl> replacing? I thought pulseaudio was running on top of audioflinger..
[18:44] <rsalveti> in this case yes, not what Ford_Prefect did a while ago
[18:44] <rsalveti> ptl: http://arunraghavan.net/2012/04/pulseaudio-on-android-part-2/
[18:44] <lool> rsalveti: thanks
[18:44] <ptl> ah, thanks
[18:44] <ptl> brb, video stopped, will reload
[18:46] <lool> rsalveti: so libubuntu-app-api -> libpulse -> pulseaudio -> libasound -> kernel drivers, and mediaservice -> libpulse -> pulseaudio -> libasound -> kernel drivers?
[18:46] <Saviq|UDS> lool: or potentially s/libasound/HAL/
[18:47] <lool> Saviq|UDS: quite a big difference!
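[Editor's note] The layering lool spells out, with Saviq's HAL variant, modelled as simple lists to show that both entry points share the same tail. Purely illustrative:

```python
# Both the app API and the media service funnel into the same
# PulseAudio-based tail; Saviq's variant swaps ALSA for the Android HAL.
common_tail = ["libpulse", "pulseaudio", "libasound", "kernel drivers"]
app_path    = ["libubuntu-app-api"] + common_tail
media_path  = ["mediaservice"] + common_tail

hal_variant = [x if x != "libasound" else "audio HAL" for x in common_tail]

print(" -> ".join(app_path))
print(" -> ".join(["mediaservice"] + hal_variant))
```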
[18:47] <rsalveti> lool: do we want libubuntu-app-api for it as well?
[18:47] <lool> rsalveti: I guess
[18:47] <rsalveti> not something we discussed
[18:48] <lool> rsalveti: another question related to libubuntu-app-api that I've asked earlier is whether we want audio routing API for this
[18:48] <lool> rsalveti: e.g. I want to switch audio output from headset to speaker or vice-versa
[18:48] <lool> or I want to mute this or that
[18:49] <lool> I should have joined the session early rather than trying to follow multiple sessions
[18:49] <Saviq|UDS> lool: probably a topic for ubuntu-platform@
[18:49] <lool> definitely
[18:49] <lool> Saviq|UDS: good point though, might want to mention that we need to followup on platform APIs
[18:49] <lool> I'm a bit worried that we will end up requiring a media service, and then pulseaudio looks weird in the picture
[18:49] <Ford_Prefect> rsalveti: audio's broken up again :/
[18:50] <Saviq|UDS> gone
[18:51] <rsalveti> sorry
[18:51] <Saviq|UDS> rsalveti: try disabling your cam
[18:51] <rsalveti> Saviq|UDS: indeed
[18:52] <Saviq|UDS> Ford_Prefect: is echo cancellation a special case?
[18:52] <rsalveti> lool: it's fine to have both pulse and media service
[18:52] <rsalveti> at android we have media service + audioflinger
[18:53] <rsalveti> so we'd just be replacing the audioflinger part
[18:53] <rsalveti> lool: it scares me that we also want an ubuntu platform api abstraction for audio :-)
[18:53] <lool> right; I was kind of making a blob out of media service + audioflinger
[18:53] <lool> should be clearer
[18:53] <lool> rsalveti: me too, but I kind of understand where tvoss comes from to suggest this
[18:53] <rsalveti> still :-)
[18:54] <lool> rsalveti: On the other hand, I would be less comfortable committing to libpulse as a stable API instead of e.g. libasound2
[18:54] <lool> at least we can do libasound2 -> libpulse -> whatever
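[Editor's note] The libasound2 -> libpulse indirection lool mentions is what the standard alsa-plugins "pulse" plugin provides; a typical /etc/asound.conf for it looks like:

```
# Route default ALSA PCM and mixer clients through PulseAudio.
pcm.!default {
    type pulse
}
ctl.!default {
    type pulse
}
```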
[18:58] <rsalveti> diwic: Ford_Prefect: thanks for joining in
[18:58] <rsalveti> awe_: thanks for taking notes :-)
[18:58] <Ford_Prefect> Happy to pitch in. :)
[18:58] <awe_> rsalveti, anytime!
[18:59] <rsalveti> sorry if I'm a bit slow today, not feeling so good
[18:59] <lool> rsalveti: you got my ubuflu over the air
[18:59] <Ford_Prefect> I certainly couldn't tell
[18:59] <diwic> Ford_Prefect, yup thanks for joining. I'll need to investigate the UCM stuff, still feels a bit immature compared to just doing pulseaudio profiles
[18:59] <rsalveti> lool: lol, yeah
[18:59] <Ford_Prefect> In general, I'm usually around on IRC, so feel free to ping me if you hit any problems with Android integration or anything else
[18:59] <rsalveti> cool
[18:59] <Ford_Prefect> diwic: one advantage with doing UCM is testing with straight-up alsa becomes easier
[18:59] <rsalveti> Ford_Prefect: we usually hang around at #ubuntu-touch
[19:00] <diwic> Ford_Prefect, true
[19:00] <Ford_Prefect> rsalveti: cool, I'm there now
[19:00] <diwic> Ford_Prefect, do we even support hw mixer control through ucm? I mean, setting the mic gain e g.
[19:01] <Ford_Prefect> diwic: there is supposed to be API to say "use this mixer", but I don't think there's one to express a mixer hierarchy
[19:01] <diwic> Ford_Prefect, moving to #pulseaudio
[19:01] <Ford_Prefect> ack
[19:02] <jasoncwarner__> hi everyone. session will start in 3 minutes
[19:07] <lool> coming live
[19:07] <lool> (soon)
[19:07] <tedg> Faster!
[19:07] <tedg> :-)
[19:08] <jasoncwarner__> hi all...if you have questions, ping me here
[19:08] <mdeslaur> \o
[19:09] <lool> stgraber can't be there unfortunately (conflict)
[19:10] <tedg> jasoncwarner__, Why does Android need to be in a container?
[19:10] <tedg> Could we run the services without Android init?
[19:11] <tedg> Benefit: Being able to apt-get upgrade a kernel :-)
[19:12] <tedg> Seems like setting up different paths would be clearer.
[19:13] <tedg> Won't we need to remove those features anyway?  Or patch them?
[19:13] <tedg> We don't really want the system services writing somewhere odd.
[19:13] <rsalveti> we can't patch binaries
[19:14] <rsalveti> we'd need to abstract bionic calls
[19:14] <tedg> We could use apparmor to rewrite the paths.
[19:14] <rsalveti> right, but that's not specific to android, that changes depending on the device
[19:14] <rsalveti> so we'd need device specific rules, which is a pain
[19:14] <tedg> Sure, but it'd let us use the android services without a container.
[19:14] <tedg> We could use something like /etc/$(servicename)
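[Editor's note] tedg's /etc/$(servicename) suggestion reduces to a path rewrite. A hypothetical sketch; the function, service name, and path below are invented examples, not an actual mechanism:

```python
# Hypothetical per-service path rewrite: redirect an Android service's
# writes under /etc/<servicename> instead of running it in a container.
def rewrite_path(service, path):
    """Map an absolute path into the service's private prefix."""
    return f"/etc/{service}{path}"

print(rewrite_path("rild", "/data/misc/radio/config"))
# /etc/rild/data/misc/radio/config
```

In practice this remapping would be enforced by something like AppArmor or mount namespaces, which is exactly the device-specific-rules pain rsalveti raises above.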
[19:16] <tedg> awe_, I think you can upgrade the Android components as a single package.
[19:17] <tedg> awe_, So then you could make the hybris versions match, in a single upgrade.
[19:17] <rsalveti> tedg: sure, but we also want to avoid making more work for the porters
[19:18] <tedg> rsalveti, Seems like it'd be easier as they'd just make one package, no?
[19:18] <rsalveti> not necessarily, there might be device specific services as well
[19:18] <tedg> rsalveti, Instead of figuring how to start up the Ubuntu side, etc.
[19:19] <tedg> ChickenCutlass, I think if you're running in a container, you have to think of it as two OSes.
[19:19] <tedg> ChickenCutlass, Especially with a PID namespace.
[19:22] <tedg> A lot of command line tools people use for debugging.
[19:22] <tedg> Also, without /proc doesn't apport have problems?
[19:27] <mfisch> They both get the event I think
[19:27] <rsalveti> I'd expect so
[19:29] <awe_> mfisch, that's my guess too.  Haven't had a chance to talk to anyone on the kernel team.  I plan on bringing this up during our kernel session tomorrow...
[19:29] <tedg> ChickenCutlass, Don't we already have that with /opt/extras* ?
[19:29] <ChickenCutlass> tedg: for apps you mean
[19:29] <tedg> Yes
[19:30] <ChickenCutlass> tedg: yes, but would need to force apps there
[19:30] <tedg> ChickenCutlass, We already do, no?
[19:30] <ChickenCutlass> we do
[19:30] <mfisch> can you use bind mounts so that ubuntu apps don't have to change?
[19:30] <mfisch> or change as much
[19:31] <tedg> mfisch, We looked at that, but were told that the mount command is too slow.  Would actually affect startup time.
[19:32] <mfisch> tedg: I'd love to have any data you have on that
[19:32] <tedg> I think "One System" is easier -- Two is super confusing.
[19:32] <tedg> mfisch, I believe that I was told that by mdeslaur
[19:35] <tedg> I think the most sane solution to target short term is put Android in a container.
[19:35] <mdeslaur> +
[19:35] <tedg> Then we can pull it out if it makes sense.
[19:35] <mdeslaur> +1
[19:36] <tedg> But then it can be one package easily.  With a reasonable update.
[19:36] <tedg> jasoncwarner__, It seems like we've now listed the options again, can you force a decision?  :-)
[19:38] <tedg> Do we care about any of those devices?  I mean, having a port is not our goal.
[19:38] <tedg> Our goal is to have a device of our own.
[19:38] <rsalveti> tedg: can't be one package necessarily, it'd be one package per device
[19:39] <tedg> Using someone else's is the same problem we've got on the desktop today.
[19:39] <rsalveti> or one package for android + another package for device specifics
[19:39] <tedg> rsalveti, Yes, I was thinking a "provides: android stuff" and then multiple packages could provide that.
[19:39] <rsalveti> tedg: porting is kind of our goal
[19:39] <rsalveti> we want people to use and work with it
[19:39] <awe_> tedg, the problem is that it may not be technically possible
[19:39] <rsalveti> that's why we created the porting effort
[19:39] <rsalveti> let's not drop that
[19:39] <tedg> awe_, We still need a decision for our plan from this session.
[19:40] <awe_> tedg, we don't have enough information to make a decision today
[19:40] <tedg> rsalveti, I think we should drop it if it at all puts at risk the possibility of having our own device that is great.
[19:40] <awe_> If I was a betting man, I would bet against this being technically possible without a lot of custom tweaking to Android
[19:40] <tedg> awe_, I find it hard to believe you're not a betting man ;-)
[19:41] <tedg> I think we should say where we want to go and try to get there.  Then we can reevaluate.
[19:41] <tedg> The problem is other people need to figure out if they have to work around all the BS in the Android container stuff.
[19:42] <rsalveti> tedg: we're not putting anything at risk :-)
[19:42] <rsalveti> we're trying to find a better solution, that's all
[19:42] <mfisch> tedg: I (or a delegate) will probably investigate the mount time stuff and get back to you
[19:42] <rsalveti> so let's just not break other devices or any porting effort
[19:42] <ali1234> i think there's a misunderstanding here
[19:42] <tedg> mfisch, Great, thanks!
[19:42] <ali1234> lool is talking about eg extract_files.sh
[19:42] <ali1234> the existing system doesn't use any files from the previous full android install
[19:43] <ali1234> except during build
[19:52] <tedg> That was me.
[19:52] <lool> any question from here?
[19:52] <tedg> Sure, but if we're looking at one service.
[19:52] <tedg> It might be easier to do a single path rewrite than a whole container.
[19:52] <tedg> No questions.  No decisions to question.
[19:56] <ogra_>  /dev as overlayfs !
[19:56] <ogra_> :)
[19:56] <xnox> 8) wtf?!
[19:56]  * xnox is clearly missing context
[19:56] <ogra_> xnox, i was commenting on the hangout
[19:57] <mfisch> where will the results of these investigations go?
[19:57] <mfisch> a link to a doc from the blueprint would be ideal
[19:59] <ogra_> enjoy the bar !!!
[20:00] <lool> awe_: updated blueprint with notes and wis
[20:00] <awe_> thanks lool!