[03:53] <techgesture> I'm probably going to ask the dumbest question that everyone asks - but I have to
[03:54] <techgesture> I have a standalone physical ubuntu machine that I set up
[03:54] <techgesture> I followed the get started guide at https://jujucharms.com/get-started
[03:54] <techgesture> and I can't for the life of me figure out what the URL should be to open the juju gui from my other desktop on the local lan
[03:55] <techgesture> I try http://192.168.1.101... but it's not there - any thoughts?
[03:55] <hatch> techgesture: so did you deploy the GUI to an lxc on that machine?
[03:56] <techgesture> I have no idea - like I said I just followed the steps on the get-started url
[03:56] <hatch> ok let me take a look at those
[03:56] <hatch> ok yeah looks like it
[03:56] <techgesture> when I do a "watch juju status"
[03:57] <techgesture> I see juju-gui as a service
[03:58] <techgesture> the status has -    charm: cs:trusty/juju-gui-48
[03:58] <hatch> so you'll need to forward the port the GUI is listening on in that lxc to the host machine
[03:58] <hatch> then you'll be able to access it from your other machine
[03:58] <hatch> right now you're only going to be able to access your test machine
[03:58] <techgesture> is there a juju command to do that?
[03:59] <hatch> no, this would be independent of juju - done on the machine itself
[03:59] <hatch> does your other machine have a desktop?
[03:59] <hatch> your standalone machine
[03:59] <techgesture> how do I know what port that lxc is listening on?
[04:00] <techgesture> the standalone - no, it's running ubuntu server
[04:00] <hatch> when you run `juju status` you will be able to see the ports that are opened for the GUI
[04:00] <hatch> but it's 80 and 443
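(For reference, the open ports show up under the unit in `juju status`; on a juju 1.x / trusty deployment like this one the relevant fragment looks roughly like the following, with the values taken from the chat above - the exact layout varies by juju version:)

    services:
      juju-gui:
        charm: cs:trusty/juju-gui-48
        units:
          juju-gui/0:
            open-ports:
            - 80/tcp
            - 443/tcp
            public-address: 10.0.3.250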
[04:01] <techgesture> it has a public address of 10.0.3.250... is that on some kind of SDN on the machine for the containers? that's not the vlan for my network
[04:01] <techgesture> so I assume I need to map that 10.0.3.250 to the localhost port 80...?
[04:01] <hatch> the 10.0.3 space is what will be assigned for lxc instances
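(On a stock lxc install the containers sit behind the host's lxcbr0 NAT bridge, which defaults to 10.0.3.0/24; assuming the default interface name, something like this confirms it:)

    ip addr show lxcbr0    # host end of the bridge, typically 10.0.3.1/24
    sudo lxc-ls --fancy    # lists containers and their 10.0.3.x addresses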
[04:02] <techgesture> so how do I get the lxc port 80 mapped to be exposed?
[04:02] <hatch> so it's going to be something like...
[04:04] <hatch> sudo iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 1234 -j DNAT --to 10.0.3.250:443
[04:04] <hatch> I think 
[04:04] <hatch> :)
[04:04] <techgesture> shouldn't this be in some documentation somewhere? I can't be the first person to ask this
[04:05] <hatch> honestly first I've seen
[04:05] <hatch> you'll have better luck running it on your desktop or a cloud provider like AWS for now
[04:05] <hatch> I'm just not too up on my iptables
[04:06] <techgesture> I work for a cloud provider - and we are working on bringing in juju, so I'm sure this is something we will have to figure out - we have smarter engineers than I
[04:07] <techgesture> thanks hatch - it gives me a direction to go in
[04:07] <hatch> did you try my command above?
[04:07] <techgesture> should I copy and paste it just like that?
[04:08] <hatch> well I did just wing it so you might want to verify it :)
[04:08] <hatch> yup quick google search matches https://www.computersnyou.com/3047/forward-port-lxc-container-quick-tip/
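(A slightly fuller sketch of that rule - eth0 as the host's LAN-facing interface is an assumption, as is the extra FORWARD rule, which is only needed if that chain's default policy drops forwarded traffic:)

    # rewrite traffic arriving on host port 1234 to the container's https port
    sudo iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 1234 \
        -j DNAT --to-destination 10.0.3.250:443
    # allow the rewritten traffic through if FORWARD defaults to DROP
    sudo iptables -A FORWARD -p tcp -d 10.0.3.250 --dport 443 -j ACCEPT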
[04:09] <techgesture> so it should be at http://192.168.1.101:1234/ ?
[04:09] <hatch> from your desktop yes
[04:09] <hatch> if that 192 is your ubuntu machine
[04:10] <techgesture> right, ok
[04:11] <techgesture> yes - yours did it... had to do https
[04:11] <techgesture> got it
[04:11] <hatch> basically it's mapping all requests to port 1234 on your host machine to the 443 port on your lxc
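(A quick check of the mapping from the other desktop; the -k flag assumes the GUI serves a self-signed certificate, which would also explain why plain http didn't work:)

    curl -k https://192.168.1.101:1234/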
[04:11] <hatch> great :)
[04:11] <techgesture> thanks a million
[04:12] <hatch> glad to help
[15:34] <arosales> rick_h__:  https://github.com/juju/juju-gui/issues/1455 was the juju-gui install hook failure I was referring to. I was able to easily reproduce on our test zVM
[15:35] <rick_h__> arosales: ty, looking
[15:35] <hatch> arosales: thanks - we don't have a stable xenial version of the charm yet but this is good to know
[15:35] <arosales> rick_h__: thanks
[15:36] <arosales> hatch: I don't think I have seen this on my x86 xenial, but noted you don't have a xenial stable charm release
[15:37] <hatch> forgive me, but what's a zVM? :)
[15:37] <rick_h__> hatch: yea, this is going to be a problem with the python stack on z vs amd64
[15:39] <hatch> so this bug is actually in the wrong repo then :) arosales do you mind if I move it to https://github.com/juju/juju-gui-charm ?
[15:40] <rick_h__> hatch: can you copy/paste it and then @arosales in the new bug please?
[15:40] <hatch> yup that was the plan!
[15:40] <rick_h__> oh sorry, read that as asking him to copy it 
[15:42] <hatch> moved https://github.com/juju/juju-gui-charm/issues/43
[18:35] <arosales> anyone else getting a service not available at jujucharms.com/store?
[18:35] <arosales> "Sorry for the inconvenience, please pop back soon."
[18:36]  * arosales hard refreshing
[18:36] <arosales> still getting error message
[18:42] <rick_h__> uiteam ^
[18:44] <fabrice> arosales: we have a known issue where the charmstore times out from time to time on the store page only
[18:44] <fabrice> we are investigating the issue
[18:47] <arosales> fabrice: thanks 
[18:47] <urulama> arosales: seems openstack charms were being published en masse again https://api.jujucharms.com/charmstore/v4/changes/published?start=2016-03-08
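(That feed can be inspected directly; assuming the v4 endpoint returns JSON, something like:)

    curl -s 'https://api.jujucharms.com/charmstore/v4/changes/published?start=2016-03-08' | python -m json.tool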
[18:48] <arosales> fabrice: for debug info I also hit it when entering "jujucharms.com/q/neutron" into my browser.  I also hit it when clicking on the charm, https://jujucharms.com/neutron-gateway/trusty/10, from a charm search
[18:48] <urulama> arosales: hm, or not ... as fabrice said, we're trying to figure out what's going on
[18:49] <arosales> ok, I'll leave it in your capable hands. Just thought I would do a quick ping in here. Let me know if you would like me to open a bug or provide any other details.
[18:49] <fabrice> I have the log from prod so no need
[18:50] <fabrice> but thanks