[00:17] ok, ubuntu store started to work after reboot, but when I enter 'terminal' in the search input, the results do not show it.
=== chihchun_afk is now known as chihchun
[03:35] If I root my phone and then break it... how easily can I restore it to its former glory?
[03:36] amazoniantoad: using the built-in recovery?
[03:36] lotuspsychje: I guess? It's an aquaris e5.
[03:37] amazoniantoad: system settings ==> back to defaults
[03:37] lotuspsychje: so pretty easily? My next question is... how can I root it? Google isn't being my friend. I need to emulate whatsapp
[03:38] amazoniantoad: there is a whatsapp web version for touch, why do you want to emulate?
[03:39] amazoniantoad: not a good idea to root the phone, for security reasons either
[06:47] can ubuntu touch be used on touch screen laptops to support the screen rotation features?
[06:48] SFC: ubuntu touch is not meant for laptops
[06:48] !devices | SFC
[06:48] SFC: You can find the full list of devices, official images, community images, and works in progress at https://wiki.ubuntu.com/Touch/Devices
[06:48] SFC: use ubuntu desktop for tablets
[06:49] SFC: but devs are working on convergence, so who knows what will be possible in the near future
[06:49] how would i get the screen rotation on a touch screen laptop to be a usable feature in ubuntu
[06:50] SFC: for now, not yet
[06:50] roger
[06:50] SFC: unless you port ubuntu touch specifically for your device
[06:50] would it work?
[06:51] SFC: but ubuntu desktop has touch support, so your device can run it already, but without rotation
[06:51] SFC: if the port is successful, sure
[06:51] SFC: but it's a hard job
[06:52] i see
[06:52] SFC: or check the XDA forums to see if someone already has an existing project
[06:53] i was just thinking it would be more user friendly when using touch screens to have the screen rotation
[06:53] SFC: for now it's not possible yet on your device
[06:54] SFC: works like a charm on nexus7 though...
[06:54] i have a feeling it will be, sooner or later
[06:54] SFC: yeah, me too
[06:54] SFC: let's hope one day touch can be installed on every device :p
[06:56] i like what they did with vlc, where you can use the vlc --video-wallpaper command
[06:56] and agreed with your last statement
[06:56] :p
[06:57] i run ubuntu already on all my devices
[06:57] bq, nexus7, netbook, desktop
[06:57] so there is a chance, though, if i port it, say the tablet version
[06:57] of ubuntu touch
[06:58] SFC: if you're handy enough to do it, sure, look at the porting guide in the topic
[06:58] between me and my brother we could do it
[06:58] SFC: you would do a great deal for the community if you succeed
[06:59] we both picked up computers back when windows 95 came out
[06:59] SFC: but keep in mind, it won't be simple
[06:59] and now that i'm out of the U.S. Army we're both going to college for computers
[07:00] bbl mate, breakfast here
=== shuduo_ is now known as shuduo
=== shuduo is now known as shuduo_afk
[19:07] ajalkane: how are you?
[19:10] good
[19:11] oops :D
[19:45] hello
[19:46] does anyone have information on the China Mobile N1 Max Maruko?
[20:05] Clubuntu: oh, that's the china mobile device?
[20:06] but will it be released officially as a device for convergence?
[20:09] Clubuntu, you can judge by the SoC whether it can do MHL or Slimport.
[20:10] and so, who took it, Bq or meizu?
[20:14] the bq or meizu don't support convergence, since the chip in those devices doesn't support it
[20:17] so should I sell my meizu, now that the new device has just been released?
[20:23] Clubuntu: hint: http://www.phoronix.com/scan.php?page=news_item&px=Aethercast-Ubuntu-WiDi
[20:25] not sure how the latency will be..
[20:25] but in the end it means proper bluetooth support and flash size and ram size are the key features
[20:27] Thank you, we hope you will be able to do something
[20:29] the mx4 display is actually so large that you only need a cardboard box to head-mount the mx4..
[20:33] then look and see what happens
[20:37] hello, does anyone know of an example of how to read the data of a GL handle/texture in a QML ubuntu-touch app? I can't get a valid GL context where I have access to the texture id ... I am trying some camera augmented reality
[20:40] BlackJohnny, it's quite difficult to do low-level gl operations from within qml, unless you can express what you want to do in a shader
[20:40] BlackJohnny, people usually do the heavy lifting in C++, and provide qml bindings to their code
[20:40] tvoss, I am using QML with a C++ backend
[20:41] BlackJohnny, ah okay
[20:41] tvoss, on ubuntu alone i get a buffer that I can easily access, however on ubuntu-touch i get a texture id ...
[20:42] BlackJohnny, yup, camera data streams never hit main memory unless explicitly requested
[20:42] BlackJohnny, on ubuntu touch, that is
[20:43] tvoss, so there is a way to ask/configure for this? I am basically handling the preview/viewfinder
[20:43] BlackJohnny, I think you want to take a look at: http://bazaar.launchpad.net/~phablet-team/qtubuntu-camera/stable/files/head:/src/
[20:43] tvoss, thanks ... an example is "goldlike"
[20:43] :)
[20:44] tvoss, hmm, I've been there before :)
[20:45] BlackJohnny, http://bazaar.launchpad.net/~phablet-team/qtubuntu-camera/stable/view/head:/src/aalvideorenderercontrol.h and http://bazaar.launchpad.net/~phablet-team/qtubuntu-camera/stable/view/head:/src/aalvideorenderercontrol.cpp
[20:46] BlackJohnny, I'm assuming you are familiar with http://doc.qt.io/qt-5/cameraoverview.html
[20:46] ?
[20:47] tvoss, the texture handling is done by some Qt class ... don't remember exactly which one
[20:49] BlackJohnny, okay, so what problem are you trying to solve then :) if you have the texture id, a glReadPixels would give you the raw pixel values in main memory (note that glReadPixels has a huge performance penalty, though)
[20:51] tvoss, I don't get there, because QOpenGLContext::currentContext() returns 0
[20:51] tvoss, maybe it is something i do wrong
[20:51] tvoss, thanks
[20:51] BlackJohnny, okay, so you would have to make sure that you only invoke any calls to gl if you are on the render thread. ensuring that depends on how your code is structured
[20:52] tvoss, i am in a QAbstractVideoSurface::present implementation
[20:52] BlackJohnny, ah, how do you do the actual rendering?
[20:53] tvoss, I am handling the communication between Camera and VideoOutput
[20:54] tvoss, I basically want to alter the frames, adding some AR on top of what the camera sends to the VideoOutput renderer
[20:55] tvoss, I am wandering how the tagger app works
[20:55] tvoss, I will check that
[20:56] wondering :)
[20:56] BlackJohnny, I was about to say: tagger does exactly that :)
[20:56] tvoss, :)
[20:56] BlackJohnny, in case you are still searching: https://launchpad.net/tagger
[20:56] tvoss, I am there already, thanks :)
[20:59] BlackJohnny, ack. so I did hud-like functionality before by implementing a QAbstractVideoSurface in a QQuickItem
[21:00] BlackJohnny, more specifically: I used an http://doc.qt.io/qt-5/qquickframebufferobject.html as I already had quite some gl code available for the rendering
[21:03] tvoss, I will check that, thanks
[21:13] tvoss, m_mainWindow->grabWindow() :))
[21:13] tvoss, that is what tagger does
[21:13] tvoss, to get the camera information ... I am 99% sure :)
[21:14] tvoss, I will check that QQuickFramebufferObject lead
[21:15] tvoss, have a nice day
[21:15] BlackJohnny, that's a glReadPixels under the hood, though :) I think the more elegant approach is to implement a QAbstractVideoSurface in terms of a QQuickItem. You would just store the frame handed to you in present() and render it (with stuff on top of it) in http://doc.qt.io/qt-5/qquickframebufferobject-renderer.html#render
[21:15] BlackJohnny, yup, you too
[21:15] tvoss, actually I also do edge detection and want to feed that to a neural network :)
[21:16] tvoss, anyway, thanks. I will continue later
[21:16] BlackJohnny, well, you could do edge detection in a shader (which is a lot faster) and grab the resulting features, less bandwidth needed usually :)
[21:17] tvoss, I already implemented the Sobel alg
[21:17] tvoss, and it is working on desktop but not on the mx4
[21:18] BlackJohnny, ah okay, so you are calculating the sobel image on the cpu, correct?
[21:18] tvoss, yup ... I do online learning with my net and that is sufficient in terms of performance
[21:20] BlackJohnny, ack, so yeah, you could still do a glReadPixels in render, and overlay detected edges plus anything else you calculate in there. render is called with a valid gl context