=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Track: Client | Completing HUD 2.0 | Url: http://summit.ubuntu.com/uds-1303/meeting/21624/client-1303-hud-20/
[14:55] hi everyone. I just updated the hangout instructions for those joining via hangout. if you have the link, let me know if things are working. I'll start the broadcast at the top of the hour (6 minutes)
[15:03] https://blueprints.launchpad.net/hud/+spec/hud-20-client-base-libs
[15:09] Let's just use troff markup. It is the standard way, after all.
[15:13] the "magic" noise-cancellation is all done with dual microphones
[15:13] and differentials
[15:14] put what in Multiverse?
[15:15] julius i think
[15:17] sladen, Yes, julius is in multiverse, and we'd have to put hud-listen-julius
[15:18] https://launchpad.net/~pete-woods/+archive/sphinx/+packages
[15:21] http://www.voxforge.org/home/read
[15:21] http://www.voxforge.org/home/read
[15:21] https://blueprints.launchpad.net/hud/+spec/hud-20-voice
[15:23] https://blueprints.launchpad.net/hud/+spec/hud-20-service
[15:35] https://blueprints.launchpad.net/ubuntu/+spec/client-1303-unity-hud-2-ui
[15:36] my interwebs is running crazy
[15:36] 1sec lag
[15:39] that's better than my normal ping :p
[15:39] uds-client-2: 5 minutes left in this session!
[15:40] uds-client-2: 4 minutes left in this session!
[15:41] uds-client-2: 3 minutes left in this session!
[15:42] uds-client-2: 2 minutes left in this session!
=== alex_abreu is now known as alex-abreu
[15:45] uds-client-2: This session has ended.
[15:45] 10 minutes left
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Currently no events are active in this room - http://summit.ubuntu.com/uds-1303/client-2/ - http://ubottu.com/uds-logs/%23ubuntu-uds-client-2.log
[15:46] actually, done :/
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Track: Client | HW Accelerated Video Decode and Rendering support | Url: http://summit.ubuntu.com/uds-1303/meeting/21689/client-1303-hw-video-d
[15:57] Hi everyone. Session starts in 2 minutes.
[16:01] session is live...
[16:02] have about 45 minutes
[16:02] feel free to ask questions
[16:02] may I get the hangout URL?
[16:03] * bryce waves
[16:03] the link for the video/etherpad is in the topic
[16:03] Waves
[16:07] http://summit.ubuntu.com/uds-1303/meeting/21689/client-1303-hw-video-decode-rendering-support/
[16:07] (topic has a truncated URL it seems)
[16:07] jhodapp: considering your constraints it is definitely a good idea to continue to use stagefright for accessing the device codecs (for the reasons you mentioned)... however writing a gstreamer plugin around stagefright is absolutely no problem, we're doing that (very similar) in the gstreamer sdk on android (just using a layer on top of stagefright, the java mediacodec api)
[16:09] is the GStreamer Android SDK what you get from building as described on http://docs.gstreamer.com/display/GstSDK/Installing+for+Android+development ?
[16:09] lool: yes
[16:11] jhodapp: slomo works on GStreamer upstream ^ in case you want to shoot questions
[16:11] (hey slomo!)
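Stepping back to the HUD 2.0 voice discussion above: wiring a recogniser such as sphinx (or julius, via the hypothetical hud-listen-julius package mentioned) into the audio path is typically done through a GStreamer element. Below is a minimal Python sketch using the pocketsphinx GStreamer element; the element name, its default models, and the exact fields of the bus messages it posts vary by version, so treat them as assumptions -- the handler just prints whatever structure arrives rather than assuming field names.

    #!/usr/bin/env python3
    # Minimal sketch: feed microphone audio into a speech recogniser through
    # GStreamer. Assumes the pocketsphinx GStreamer plugin is installed; the
    # message fields differ between versions, so we dump the whole structure.
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)

    # autoaudiosrc picks the platform source (PulseAudio on an Ubuntu desktop).
    pipeline = Gst.parse_launch(
        'autoaudiosrc ! audioconvert ! audioresample ! pocketsphinx ! fakesink')

    def on_element_message(bus, msg):
        structure = msg.get_structure()
        if structure is not None and structure.get_name() == 'pocketsphinx':
            print(structure.to_string())  # partial/final hypotheses end up here

    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect('message::element', on_element_message)

    pipeline.set_state(Gst.State.PLAYING)
    try:
        GLib.MainLoop().run()
    except KeyboardInterrupt:
        pass
    finally:
        pipeline.set_state(Gst.State.NULL)

A julius-based recogniser would slot into the same place in the pipeline; only the element name and its message format would change.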
[16:11] * lool hugs slomo
[16:12] lool: hi :) i already talked to jim earlier :)
[16:12] perfect :)
[16:12] tvoss: gstreamer is not like openmax il, it's more like a combination of al and il
[16:13] slomo, sure, I just wanted to point out that there is openmax and roughly map it to gstreamer
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Track: Client | HW Accelerated Video Decode and Rendering support | Url: http://summit.ubuntu.com/uds-1303/meeting/21689/client-1303-hw-video-decode-rendering-support/
[16:15] How is the flip between software and hardware decoding? Is it automatic within the API, or through it?
[16:15] does gstreamer also deal with 3D/OpenGL?
[16:16] ptl in mobile anyway... hwa is default, if it's not present it falls back to sw
[16:16] ptl: short answer would be yes
[16:16] ptl: however you also can have full control over all that if you want to
[16:17] So dalvik is definitely not going to be available? One concern was that libstagefright can change any time Google feels like
[16:17] thanks
[16:17] slomo, can you describe for everyone a bit more about how GStreamer SDK on Android works?
[16:17] That might be a maintenance concern in the long term
[16:17] ptl i know of at least one impl of gstreamer that TI did to get texture streamed video frames
[16:17] (my question might make more sense in light of slomo's response)
[16:17] jhodapp: basically it's a gstreamer "distribution" for android, including plugins to access the device's codecs, including sinks for audio and video for android
[16:18] jhodapp: it just provides the normal gstreamer api to applications, and includes everything you need for using it on android
[16:19] video: egl, audio: opensles, codecs: wraps the java MediaCodec API
[16:19] jhodapp: all the relevant code of it is also in gstreamer upstream. and you should be able to build upon many of these things (e.g. you could re-use the opensles audio sink, the egl/glesv2 video sink)
[16:19] for accessing the device's codecs you would need to write something yourself that wraps stagefright or something else
[16:20] because what we do is going through the java layer (because that's the only public API on android that actually allows access to the codecs)
[16:21] tvoss: alternative: let gstreamer talk to your service, and let applications use gstreamer?
[16:21] slomo, yup, that's what I'm trying to come down to
[16:21] tvoss: gstreamer is not like openmax il, it's not necessarily the lower layer directly on top of the hardware
[16:22] bfiller: right Qt is just command
[16:22] slomo, yup, so we would leverage gstreamer on the service side and on the app side
[16:22] tvoss: ack
[16:23] bfiller: note that qtmultimedia also has a gstreamer backend, which could make things easier for you
[16:26] do we expect that, when speaking with content owners, the availability of a "known good" DRM'd video playback service (i.e. from Android) will make those conversations easier?
[16:27] slomo, bfiller what api do we want to expose to html5/js? surely not qt?
[16:27] For whoever's editing the whiteboard: you typically will not see audio from GSM on the CPU
[16:27] tvoss: the normal html5 media api?
[16:27] tvoss: are you planning to use webkit btw?
[16:28] But the use-case is valid if you s/GSM/VoIP
[16:28] I see that stagefright is going to be supported for now, but what is the long-term plan for not depending on it anymore? I mean, what are the migration paths and strategies?
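slomo's description of the GStreamer Android SDK above (an EGL/GLESv2 video sink, an OpenSL ES audio sink, and a wrapper around the Java MediaCodec API for the device codecs) means the application side stays plain GStreamer. A rough Python sketch, assuming the element names eglglessink and openslessink from that discussion (they differ between GStreamer versions and platforms), with a fallback to the automatic sinks so the same script also runs on a desktop:

    #!/usr/bin/env python3
    # Sketch: playbin-based playback that prefers Android-SDK-style sinks and
    # falls back to the automatic sinks elsewhere.
    import sys
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)

    def first_available(*names):
        # Return the first element that actually exists on this system.
        for name in names:
            element = Gst.ElementFactory.make(name, None)
            if element is not None:
                return element
        return None

    uri = sys.argv[1] if len(sys.argv) > 1 else \
        'file:///usr/share/sounds/alsa/Front_Center.wav'

    player = Gst.ElementFactory.make('playbin', 'player')
    player.set_property('uri', uri)
    player.set_property('video-sink', first_available('eglglessink', 'autovideosink'))
    player.set_property('audio-sink', first_available('openslessink', 'autoaudiosink'))

    loop = GLib.MainLoop()
    bus = player.get_bus()
    bus.add_signal_watch()
    bus.connect('message::eos', lambda *_: loop.quit())
    bus.connect('message::error', lambda *_: loop.quit())

    player.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    finally:
        player.set_state(Gst.State.NULL)

playbin auto-plugs whichever decoder elements are registered with the highest rank, which is roughly how the hardware/software fallback ptl asked about happens without the application being involved.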
[16:28] slomo, @api, sure, but someone needs to implement it
[16:29] tvoss: well, use webkit and you're done ;) webkit has a very good gstreamer backend already that works fine on different platforms
[16:29] I don't believe that audio codecs are typically h/w accelerated
[16:30] agree
[16:30] not at android anymore
[16:30] yeah, they're definitely not on most socs
[16:30] afaik they are all using software decode now
[16:30] tvoss: we should ensure what's working now with qtwebkit
[16:30] at least latest android versions
[16:30] pmcgowan, sure, just curious
[16:32] tvoss, pmcgowan: yeah, qtwebkit is one of the webkit ports that works fine with gstreamer (the only other possible backend currently is qtmultimedia, which itself could also use gstreamer)
[16:33] rsalveti: should I hop in the client-1 one?
[16:34] lool: client-1 is done
[16:34] wow, good
[16:35] lool: to be clear, stagefright is still there as well?
[16:35] lool: why wouldn't you expose gstreamer to apps though? i understand the reason for an abstraction (make it very simple for apps to do a few use cases), but what if someone wants to do something more complex you didn't consider?
[16:35] lool: otherwise you lose known/good working drm plugins available on android
[16:36] kgunn: not having stagefright on android would be silly imho
[16:36] kgunn: apparently stagefright is up for discussion on how we reuse it (listed as research)
[16:36] slomo: agreeing w u
[16:36] slomo: maybe there's a case to be made to have GStreamer as a high level API *and* a low level implementation
[16:36] I assume we don't have access to source for the drm pieces, so could we use it?
[16:37] http://developer.android.com/reference/android/drm/package-summary.html
[16:37] slomo: the main reason I framed it that we didn't want to expose it is a result from a prior session where we wanted to hide the implementation
[16:37] pmcgowan: ^
[16:37] but then GStreamer is also pretty good at hiding the implementation
[16:37] lool: makes sense
[16:38] kgunn: makes sense to use the framework if we can
[16:38] tvoss: short answer would be yes, it could wrap around drm frameworks (and people do that)
[16:39] slomo, interesting, that's good news. How do you account for restrictions where content is not allowed in RAM?
[16:40] pmcgowan: drm typically hw accel too... probably going to need to reuse (to be practical)
[16:40] tvoss: you could for example only pass some kind of "drm handle" through gstreamer... and code could then access that via drm framework specific API
[16:40] do specific implementations use the framework, like a netflix client? assume so
[16:40] slomo, ack, do you have some public example for that available?
[16:40] pmcgowan: i believe so
[16:41] tvoss: we're talking about drm, do you expect people to make that open source ;) however i could point you to some code that does something very similar (memory not directly accessible, only through another API)
[16:41] tvoss: you need the latter very often for hw accel video playback too
[16:42] slomo, just curious :)
[16:43] tvoss: (detail: in gstreamer 0.10 you'll need something hackish for this, in 1.0 we added things to make this very clean to implement)
[16:44] 0.10 is in universe (or on its way there at least)
[16:44] ogra_, what about 1.0?
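The "drm handle" idea slomo sketches above -- protected content never appears as plain bytes in the pipeline, only as an opaque handle that platform-specific code can resolve -- can be illustrated with a toy appsrc/appsink pipeline. This is only a conceptual sketch: the caps name and the dict standing in for the DRM framework are invented, and a real GStreamer 1.0 design would use a custom allocator/GstMemory or buffer meta rather than a payload convention.

    #!/usr/bin/env python3
    # Conceptual sketch: buffers in the pipeline carry only an opaque handle;
    # the real data stays behind a platform DRM API (a dict stands in for it).
    import struct
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst

    Gst.init(None)

    protected_store = {1: '<frame kept in memory the CPU cannot map>'}  # stand-in

    pipeline = Gst.parse_launch(
        'appsrc name=src caps=application/x-drm-handle ! appsink name=sink sync=false')
    src = pipeline.get_by_name('src')
    sink = pipeline.get_by_name('sink')
    pipeline.set_state(Gst.State.PLAYING)

    # Push a buffer whose payload is nothing but a 64-bit handle.
    src.emit('push-buffer', Gst.Buffer.new_wrapped(struct.pack('<Q', 1)))
    src.emit('end-of-stream')

    # The sink resolves the handle through the platform API instead of
    # touching the protected memory itself.
    sample = sink.emit('pull-sample')
    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    handle = struct.unpack('<Q', info.data)[0]
    buf.unmap(info)
    print('sink got handle', handle, '->', protected_store[handle])

    pipeline.set_state(Gst.State.NULL)

As slomo notes, the same pattern (buffers whose memory is only reachable through another API) shows up for hardware-accelerated video frames as well.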
[16:45] that's been in for a while, ask Laney
[16:45] 1.0 is there too (fwiw, i'm doing the debian packages for gstreamer)
[16:45] * ogra_ is actually following the rolling release session, just jumping IRC channels on the side
[16:48] jhodapp: don't focus too much on the gstreamer *SDK* part here, that's not very useful for you because it's only a gstreamer distribution and you'll have your own distribution
[16:48] slomo, good point
[16:49] jhodapp: for exposed API... imho there's a middle point between the two you have right now: plug what is available together in any possible way (and by that implementing new use cases, but not implementing new sources/sinks/filters)
[16:51] oh heck
[16:51] pmcgowan: got some work items for you :-)
[16:51] sure drm, great
[16:51] haha :-)
[16:52] :-)
[16:52] pmcgowan: it's not like it's the first time ;)
[16:52] ha nope
[16:53] tvoss: I think you'd want to lead the API question with me participating, ok with you?
[16:53] lool, ack, sorry for being distracted
[16:55] ubuntu-platform@?
[16:55] jhodapp: have requested creation of ubuntu-platform@l.u.c already
[16:55] Saviq|UDS: correct
[16:55] for API discussions
[16:55] (took that action in platform API session earlier)
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Currently no events are active in this room - http://summit.ubuntu.com/uds-1303/client-2/ - http://ubottu.com/uds-logs/%23ubuntu-uds-client-2.log
[16:55] jhodapp: are you copying back the pad notes into the bp?
[16:56] lool, yeah, I'll be doing that
[16:56] thanks
[16:56] lool, and thanks for that creation request
[16:56] np; thanks for leading an interesting session!
[16:56] thanks all
[16:56] lool, cool, glad you enjoyed it
[16:57] it seems we have a ton of followup work and chats on this one :)
[16:57] yes, it's an important and large topic
[16:57] can get quite complex too
[16:59] lool, jhodapp: so if you guys have any questions, feel free to talk to me... also including all parts of your work items list, that list looks very familiar ;)
[16:59] slomo, hehe, yes :)
[17:03] jhodapp, lool: do you have an irc channel for this too?
[17:04] slomo, we have the general ubuntu-touch one right now, but it may be a good idea to make a more specific one for media
[17:04] slomo, I'll let you know
[17:06] jhodapp: ok, thanks
[17:06] slomo, thanks for your participation in the session
[17:06] slomo, yeah, thanks for being in the session
[17:11] slomo: Might make sense to have one or not; not sure
[17:12] would like to avoid fragmentation if we can
[17:12] but then #ubuntu-touch is a bit busy
[17:12] I'll defer to jhodapp to decide on this
[17:18] lool: ok, i'll just stay there for the time being then :)
[17:18] that channel is definitely going away after UDS :-)
=== FunnyLookinHat_ is now known as FunnyLookinHat
[17:47] Hi everyone. Session starts in 1/2 hour.
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Track: Client | Audio support with PulseAudio and AudioFlinger | Url: http://summit.ubuntu.com/uds-1303/meeting/21627/client-1303-sound-support-pulse-audioflinger/
[18:14] hi everyone... going to start here very soon!
[18:14] Ford_Prefect: hey!
[18:14] o/ rsalveti
[18:14] if you have questions, make sure you ping me on IRC to bring them to my attention
[18:14] Urm, how do I unmute myself?
[18:14] i.e. what am I looking for visually?
[18:14] Since this is mostly inaccessible...
[18:15] TheMuso, red icon upper right
[18:15] TheMuso: there's a mic icon at the top, right after the hangout title
[18:16] is the video rolling yet?
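Before the audio session notes continue, here is a sketch of the "middle point" exposed API slomo suggested in the video session above: applications may combine whatever elements the platform already ships into new pipelines, but cannot register new sources, sinks or filters of their own. The whitelist-by-availability policy and the helper name are illustrative only, not a proposed Ubuntu platform API.

    #!/usr/bin/env python3
    # Sketch of a "compose existing elements, don't add new ones" API.
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst

    Gst.init(None)

    def build_pipeline(description):
        """Build a gst-launch style pipeline, but only from elements the
        platform actually provides (no application-registered elements)."""
        for part in description.split('!'):
            factory = part.strip().split()[0]
            if Gst.ElementFactory.find(factory) is None:
                raise ValueError('element not available here: ' + factory)
        return Gst.parse_launch(description)

    if __name__ == '__main__':
        # A use case a fixed high-level API might not have anticipated:
        # visualise a test tone as a waveform.
        pipeline = build_pipeline(
            'audiotestsrc num-buffers=300 ! audioconvert ! wavescope ! '
            'videoconvert ! autovideosink')
        pipeline.set_state(Gst.State.PLAYING)
        bus = pipeline.get_bus()
        bus.timed_pop_filtered(
            Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
        pipeline.set_state(Gst.State.NULL)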
[18:16] nope
[18:18] rsalveti: can't hear you at all
[18:18] rsalveti: audio is completely robot-like
[18:18] anyone else?
[18:18] "This live event will begin in a few moments." - no image/audio
[18:18] Let me reload.
[18:18] sounds like a lack of bandwidth problem
[18:19] yeah
[18:19] \o/
[18:19] I can hear ricardo now
[18:19] ptl: should be rolling now
[18:19] issues with audio in an audio stack session...
[18:19] LD
[18:19] :D
[18:25] awe_: yes the data goes over binder
[18:25] will we have an API for (re-)routing audio?
[18:26] I guess we'll abstract away things like sound input/output
[18:26] but there might be need to change e.g. how devices are routed
[18:26] QUESTION: is that the direction we want to work towards for the long term? If so, does that affect any existing desktop apps as far as expsting the audio stack to work as it does on non-android kernel?
[18:26] the problem with getting rid of audio flinger is the kernel drivers do not support them
[18:27] s/expsting/expecting/
[18:27] I mean the audio drivers do not support everything needed for alsa
[18:27] so isn't it a case of just improving the features of the given audio drivers?
[18:28] other question (from hw decode session): how do we handle sync between audio and video?
[18:29] Is the hardware configuration written for HAL or AudioFlinger (afaik AudioFlinger works on top of HAL, but I don't know if AudioFlinger needs any hw-specific configuration)
[18:31] I was thinking of e.g. you press the speaker output button, audio gets to the speaker
[18:31] one needs to be able to do this from apps
[18:31] or you are implementing a conferencing app and want to send audio either to the speaker, or to the headset
[18:33] diwic: Do we have tests that we can use to validate whether drivers are good enough for pulseaudio?
[18:35] lool: test suites are never complete, unfortunately
[18:36] awe: rsalveti: Could one of you two relay the concerns from hw decoding support and sync issues that we need to support?
[18:36] also, routing for voice (modem) stack
[18:36] (I am jumping between video streams, so not necessarily following everything which is being said unfortunately)
[18:40] QUESTION: what about bluetooth? is HFP not going through CPU either?
[18:42] lool: modem usually goes via modem directly
[18:42] and don't know if we'd have any issue with hw decode support, at least Ford_Prefect didn't have any issue when replacing audioflinger with pulseaudio
[18:42] as the media service will just end up using the audioflinger api, which will go via libpulse in this case
[18:43] replacing? I thought pulseaudio was running on top of audioflinger..
[18:44] in this case yes, not what Ford_Prefect did a while ago
[18:44] ptl: http://arunraghavan.net/2012/04/pulseaudio-on-android-part-2/
[18:44] rsalveti: thanks
[18:44] ah, thanks
[18:44] brb, video stopped, will reload
[18:46] rsalveti: so libubuntu-app-api -> libpulse -> pulseaudio -> libasound -> kernel drivers, and mediaservice -> libpulse -> pulseaudio -> libasound -> kernel drivers?
[18:46] lool: or potentially s/libasound/HAL/
[18:47] Saviq|UDS: quite a big difference!
[18:47] lool: do we want libubuntu-app-api for it as well?
[18:47] rsalveti: I guess
[18:47] not something we discussed
[18:47] rsalveti: another question related to libubuntu-app-api for it as well?
[18:48] 19:47 < lool> rsalveti: I guess
[18:48] ups
[18:48] rsalveti: another question related to libubuntu-app-api that I've asked earlier is whether we want audio routing API for this
[18:48] rsalveti: e.g.
I want to switch audio output from headset to speaker or vice-versa
[18:48] or I want to mute this or that
[18:49] I should have joined the session early rather than trying to follow multiple sessions
[18:49] lool: probably a topic for ubuntu-platform@
[18:49] definitely
[18:49] Saviq|UDS: good point though, might want to mention that we need to follow up on platform APIs
[18:49] I'm a bit worried that we will end up requiring a media service, and then pulseaudio looks weird in the picture
[18:49] rsalveti: audio's broken up again :/
[18:50] gone
[18:51] sorry
[18:51] rsalveti: try disabling your cam
[18:51] Saviq|UDS: indeed
[18:52] Ford_Prefect: is echo cancellation a special case?
[18:52] lool: it's fine to have both pulse and media service
[18:52] at android we have media service + audioflinger
[18:53] so we'd just be replacing the audioflinger part
[18:53] lool: it scares me that we also want an ubuntu platform api abstraction for audio :-)
[18:53] right; I was kind of making a blob out of media service + audioflinger
[18:53] should be clearer
[18:53] rsalveti: me too, but I kind of understand where tvoss comes from to suggest this
[18:53] still :-)
[18:54] rsalveti: On the other hand, I would be less comfortable committing to using libpulse as stable API instead of e.g. libasound2
[18:54] at least we can do libasound2 -> libpulse -> whatever
[18:58] diwic: Ford_Prefect: thanks for joining in
[18:58] awe_: thanks for taking notes :-)
[18:58] Happy to pitch in. :)
[18:58] rsalveti, anytime!
[18:59] sorry if I'm a bit slow today, not feeling so good
[18:59] rsalveti: you got my ubuflu over the air
[18:59] I certainly couldn't tell
[18:59] Ford_Prefect, yup thanks for joining. I'll need to investigate the UCM stuff, still feels a bit immature compared to just doing pulseaudio profiles
[18:59] lool: lol, yeah
[18:59] In general, I'm usually around on IRC, so feel free to ping me if you hit any problems with Android integration or anything else
[18:59] cool
[18:59] diwic: one advantage with doing UCM is testing with straight-up alsa becomes easier
[19:00] Ford_Prefect, true
[19:00] rsalveti: cool, I'm there now
[19:00] Ford_Prefect, do we even support hw mixer control through ucm? I mean, setting the mic gain e.g.
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Track: Client | Containers and host/client model for Android + Ubuntu | Url: http://summit.ubuntu.com/uds-1303/meeting/21626/client-1303-containers-host-client-ubuntu-android/
[19:01] diwic: there is supposed to be API to say "use this mixer", but I don't think there's one to express a mixer hierarchy
[19:01] Ford_Prefect, moving to #pulseaudio
[19:01] ack
[19:02] hi everyone. session will start in 3 minutes
=== netcurli_ is now known as netcurli
[19:07] coming live
[19:07] (soon)
[19:07] Faster!
[19:07] :-)
[19:08] hi all... if you have questions, ping me here
[19:08] \o
[19:09] stgraber can't be there unfortunately (conflict)
=== rsalveti_ is now known as rsalveti
[19:10] jasoncwarner__, Why does Android need to be in a container?
[19:11] Could we run the services without Android init?
[19:11] Benefit: Being able to apt-get upgrade a kernel :-)
[19:12] Seems like setting up different paths would be clearer.
[19:13] Won't we need to remove those features anyway? Or patch them?
[19:13] We don't really want the system services writing somewhere odd.
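Returning briefly to the routing and mixer-control questions from the audio session above (switch output between speaker and headset, mute a single stream, set the mic gain) before the container notes continue: with PulseAudio underneath, a hypothetical routing call in the platform API would boil down to operations like these. The sink, port, stream and mixer-control names below are typical desktop values and purely assumptions; a phone's UCM or PulseAudio profiles will expose different ones.

    #!/usr/bin/env python3
    # Sketch of what an audio (re-)routing API could do underneath when
    # PulseAudio sits below the platform API.
    import subprocess

    def pactl(*args):
        return subprocess.run(('pactl',) + args, check=True,
                              capture_output=True, text=True).stdout

    # What outputs (sinks) and running streams (sink inputs) do we have?
    print(pactl('list', 'short', 'sinks'))
    print(pactl('list', 'short', 'sink-inputs'))

    # "Press the speaker button": flip the active port on the default sink.
    pactl('set-sink-port', '@DEFAULT_SINK@', 'analog-output-speaker')

    # Send one particular stream (say, a conferencing app) to another sink.
    pactl('move-sink-input', '0', 'alsa_output.platform-sound.analog-stereo')

    # Mute just that stream rather than the whole device.
    pactl('set-sink-input-mute', '0', 'toggle')

    # Mic gain lives in the ALSA mixer; with UCM the control name is per-device.
    subprocess.run(['amixer', 'sset', 'Capture', '80%'], check=True)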
we can't patch binaries
[19:14] we'd need to abstract bionic calls
[19:14] We could use apparmor to rewrite the paths.
[19:14] right, but that's not specific to android, that changes depending on the device
[19:14] so we'd need device specific rules, which is a pain
[19:14] Sure, but it'd let us use the android services without a container.
[19:14] We could use something like /etc/$(servicename)
[19:16] awe_, I think you can upgrade the Android components as a single package.
[19:17] awe_, So then you could make the hybris versions match, in a single upgrade.
[19:17] tedg: sure, but we also want to avoid making more work for the porters
[19:18] rsalveti, Seems like it'd be easier as they'd just make one package, no?
[19:18] not necessarily, there might be device specific services as well
[19:18] rsalveti, Instead of figuring out how to start up the Ubuntu side, etc.
[19:19] ChickenCutlass, I think if you're running in a container, you have to think of it as two OSes.
[19:19] ChickenCutlass, Especially with a PID namespace.
[19:22] A lot of command line tools people use for debugging.
[19:22] Also, without /proc doesn't apport have problems?
[19:27] They both get the event I think
[19:27] I'd expect so
[19:29] mfisch, that's my guess too. Haven't had a chance to talk to anyone on the kernel team. I plan on bringing this up during our kernel session tomorrow...
[19:29] ChickenCutlass, Don't we already have that with /opt/extras* ?
[19:29] tedg: for apps you mean
[19:29] Yes
[19:30] tedg: yes, but would need to force apps there
[19:30] ChickenCutlass, We already do, no?
[19:30] we do
[19:30] can you use bind mounts so that ubuntu apps don't have to change?
[19:30] or change as much
[19:31] mfisch, We looked at that, but were told that the mount command is too slow. Would actually affect startup time.
[19:32] tedg: I'd love to have any data you have on that
[19:32] I think "One System" is easier -- Two is super confusing.
[19:32] mfisch, I believe that I was told that by mdeslaur
[19:35] I think the most sane solution to target short term is put Android in a container.
[19:35] +
[19:35] Then we can pull it out if it makes sense.
[19:35] +1
[19:36] But then it can be one package easily. With a reasonable update.
[19:36] jasoncwarner__, It seems like we've now listed the options again, can you force a decision? :-)
[19:38] Do we care about any of those devices? I mean, having a port is not our goal.
[19:38] Our goal is to have a device of our own.
[19:38] tedg: can't be one package necessarily, it'd be one package per device
[19:39] Using someone else's is the same problem we've got on the desktop today.
[19:39] or one package for android + another package for device specifics
[19:39] rsalveti, Yes, I was thinking a "provides: android stuff" and then multiple packages could provide that.
[19:39] tedg: porting is kind of our goal
[19:39] we want people to use and work with it
[19:39] tedg, the problem is that it may not be technically possible
[19:39] that's why we created the porting effort
[19:39] let's not drop that
[19:39] awe_, We still need a decision for our plan from this session.
[19:40] tedg, we don't have enough information to make a decision today
[19:40] rsalveti, I think we should drop it if it at all puts at risk the possibility of having our own device that is great.
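For reference, the short-term direction the room keeps coming back to above -- run the Android side inside an LXC container on the Ubuntu host -- looks roughly like this from the host side. The container name ("android") and the assumption that the image ships a ready-made LXC config describing the Android rootfs are purely illustrative.

    #!/usr/bin/env python3
    # Sketch: drive an Android LXC container from the Ubuntu host using the
    # stock lxc-* tools (needs root).
    import subprocess

    CONTAINER = 'android'  # hypothetical container name

    def run(*cmd):
        return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

    # Start the Android init in its own PID/mount namespaces.
    run('lxc-start', '-n', CONTAINER, '-d')
    run('lxc-wait', '-n', CONTAINER, '-s', 'RUNNING')

    # Show the container state and its init PID as seen from the host --
    # relevant to the "debugging two OSes" concern raised in the session.
    print(run('lxc-info', '-n', CONTAINER))

    # Run a command inside the container, e.g. to poke an Android service.
    print(run('lxc-attach', '-n', CONTAINER, '--', 'ps'))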
If I was a betting man, I would bet against this being technically possible w/out a lot of custom tweaking to Android
[19:40] awe_, I find it hard to believe you're not a betting man ;-)
[19:41] I think we should say where we want to go and try to get there. Then we can reevaluate.
[19:41] The problem is other people need to figure out if they have to work around all the BS in the Android container stuff.
[19:42] tedg: we're not putting anything at risk :-)
[19:42] we're trying to find a better solution, that's all
[19:42] tedg: I (or a delegate) will probably investigate the mount time stuff and get back to you
[19:42] so let's just not break other devices or any porting effort
[19:42] i think there's a misunderstanding here
[19:42] mfisch, Great, thanks!
[19:42] lool is talking about e.g. extract_files.sh
[19:42] the existing system doesn't use any files from the previous full android install
[19:43] except during build
[19:52] That was me.
[19:52] any questions from here?
[19:52] Sure, but if we're looking at one service.
[19:52] It might be easier to do a single path rewrite than a whole container.
[19:52] No questions. No decisions to question.
[19:56] /dev as overlayfs !
[19:56] :)
[19:56] 8) wtf?!
[19:56] * xnox is clearly missing context
[19:56] xnox, i was commenting on the hangout
[19:57] where will the results of these investigations go?
[19:57] a link to a doc from the blueprint would be ideal
[19:59] enjoy the bar !!!
=== udsbotu changed the topic of #ubuntu-uds-client-2 to: Currently no events are active in this room - http://summit.ubuntu.com/uds-1303/client-2/ - http://ubottu.com/uds-logs/%23ubuntu-uds-client-2.log
[20:00] awe_: updated blueprint with notes and WIs
[20:00] thanks lool!
=== tvoss is now known as tvoss|eod
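One of the follow-ups noted above is investigating the claim that bind mounts are too slow to use at startup; gathering numbers for that is only a few lines of timing code. A sketch (needs root; the directories are throwaway temp paths, so real measurements should be taken on the target device):

    #!/usr/bin/env python3
    # Sketch: time a batch of bind mount/unmount cycles to put numbers behind
    # the "mount is too slow for startup" concern.
    import subprocess
    import tempfile
    import time

    ITERATIONS = 100
    source = tempfile.mkdtemp(prefix='bindsrc-')
    target = tempfile.mkdtemp(prefix='bindtgt-')

    start = time.monotonic()
    for _ in range(ITERATIONS):
        # This includes the cost of spawning /bin/mount, which is roughly what
        # an init script paying per-mount would see anyway.
        subprocess.run(['mount', '--bind', source, target], check=True)
        subprocess.run(['umount', target], check=True)
    elapsed = time.monotonic() - start

    print('%d bind mount/unmount cycles: %.3fs total, %.2f ms each'
          % (ITERATIONS, elapsed, 1000.0 * elapsed / ITERATIONS))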