[00:01] wagonboi: I think that goes into your main nginx config though. there's no concept of AllowOverride in nginx, iirc [00:01] dunno, I'm sure it's not difficult at any rate :) [00:01] Turning off password as well as root login, and only allowing public-key login with SSH is a good idea. [00:02] yup [00:02] worst comes to worst you can always get in via your provider's serial/vnc/whatever console [00:03] if your provider doesn't have that you should find a better provider :) [00:03] They offer it. It is Ramnode, and strangely they haven't been metering my bandwidth. I'm supposed to have 1000gb and it's stuck at 18k [00:04] Not that I'm complaining :) [00:04] they may only query X amount of times per month. [00:05] ah, that makes sense. I used to use WeLoveServers and they refreshed the meter as soon as I refreshed the page [00:05] ah [00:06] things like that can start to get really expensive in terms of page load [00:09] Hi, does anyone know what type of interface ax0 is, i.e. ifconfig ax0 ? thx [00:09] rostam: dmesg | grep -C 5 ax0 [00:10] rostam: a quick google search though looks like it's a packet radio interface? [00:14] jkitchen, thanks, it is funny that that interface exists on my system and I have no idea why... [00:15] dmesg will tell you why [00:17] where are you seeing this interface anyways? in "ip a" output? or /etc/network/interfaces, or? [00:18] is nfs robust over an internet connection? [00:19] nfs is sensitive to latency [00:19] very sensitive to latency. [00:21] so, while it may function, I wouldn't expect any sort of performance out of it. [00:21] I need a remote filesystem with local caching that culls when it grows to a certain number that is happy to live on an internet connection. [00:22] culls when it grows to a certain number? [00:22] could you elaborate a bit on that requirement? [00:23] you could use something like s3 or webdav with a local varnish server for caching, perhaps. but that's not going to give you posix semantics, of course. 
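The SSH hardening mentioned at [00:01] (no passwords, no root logins, keys only) comes down to a few directives; a minimal sketch of an /etc/ssh/sshd_config fragment, assuming OpenSSH as shipped on Ubuntu:

```
# /etc/ssh/sshd_config (fragment)
# Disable password logins entirely; public keys only.
PasswordAuthentication no
ChallengeResponseAuthentication no
PubkeyAuthentication yes
# Don't let root log in at all; use sudo from a normal account instead.
PermitRootLogin no
```

After editing, restart sshd (sudo service ssh restart on Ubuntu of this era) from a session you keep open, and keep the provider's serial/VNC console mentioned above as the fallback in case you lock yourself out.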
[00:24] jkitchen: the VPS has limited storage. I have a server at my house with a lot of storage. I'd like the VPS to have a directory mounted on the home server, but with local caching of common and recent files so performance is usually good and my internet connection is hit less hard. It's just a personal web app so it doesn't need amazing performance all the time. [00:25] there is unlikely to be anything which will meet your requirements [00:25] but presumably the files you're talking about are being served up via http? [00:25] you could run varnish on your vps and have it talking back to your server at home [00:25] it'll keep a local cache [00:25] nfs has cachefilesd [00:26] davfs2 has a problematic caching component. [00:26] ok [00:27] I'm unfamiliar with varnish. That sounds like the files would all still need to exist on the VPS. The //main// problem is that the VPS has limited storage. [00:27] nope [00:27] varnish is exactly an http cache [00:27] so if these files only need to be cached and accessible to http clients, it's pretty much exactly what you're looking for. [00:27] but if they also need to be accessible to local vfs clients, it's not [00:28] it's a drupal install, and I'm looking to remote-store *with local cache* its "files" directory. Wouldn't such an implementation confuse drupal? [00:28] probably [00:28] but there may be plugins for drupal for storing files in, say, ec2 [00:29] I know there are for other web frameworks [00:29] s/ec2/s3/ [00:29] there are. I already have the home server though and don't want to pay for s3. 
[00:29] I know we use one for our rails app [00:29] you miss my point [00:29] my point is that if there's a plugin for s3, it can probably be adapted to use, say, a webdav backend [00:29] and you have a webdav server at home that drupal can write to [00:29] via the plugin [00:30] varnish serves it up to clients [00:30] and locally caches [00:30] you might be able to get away with it with nfs, but I wouldn't bank on it [00:30] varnish will also handle the situation of the backend being inaccessible in a predictable fashion [00:31] files it has cached which are still valid will be served, no problem [00:31] files which aren't cached will simply error [00:31] probably with a 503 until it can confirm that the file does or does not exist on the backend [00:33] whereas nfs may just hang [00:33] you know, I was looking at implementing nginx anyway. Maybe I could use Varnish instead and see if I can't do what you're suggesting. It sounds elegant. [00:33] well, varnish is just a cache [00:33] it's not a webserver itself [00:34] you can use it in front of a webserver, but you still need a webserver [00:38] you would more than likely in this case want to use it *behind* your nginx [00:38] and have nginx proxy requests to /files to it [00:38] I could do that too [00:39] so it would be like nginx -> varnish -> http-at-home [00:39] plugin for drupal takes uploads and shoves them to home server [00:39] maybe it's even smart and preloads varnish [00:40] hmm... okay, trying to wrap my head around how this is organized [00:40] so... [00:40] also, varnish is by no means the only thing that does this [00:40] it's just the one I've seen used the most lately [00:41] so the means to get files from drupal to the home server is completely separate from varnish, correct? [00:41] correct [00:41] you can use the same endpoint on the home server (assuming webdav) [00:41] well, neat! I am already doing that with sshfs. 
[00:42] cool [00:42] I would not use sshfs though [00:42] So I could then expose the same directory in webdav [00:42] yea [00:42] why not? [00:42] because sshfs is a posix layer [00:42] meaning transient connectivity issues can cause weird problems [00:42] whereas webdav is just an http request [00:43] so you'd use webdav to write the file from drupal [00:43] you're right. [00:43] hopefully I can figure out how to do that. I'm not savvy enough to modify an s3 module I think. [00:43] or is s3 just webdav? [00:43] s3 is not webdav [00:43] one sec [00:44] I'm seeing if there's already some stuff available [00:44] Thanks. I'm looking too. [00:44] webdav may also not be the way to go [00:44] it's just what I would probably use if I had to do it right this second [00:44] webdav is easy to set up and there are plenty of clients for it [00:45] I take it this is what you're using for s3? https://drupal.org/project/amazons3_cors [00:45] I'm not using s3. [00:45] (note, I have nearly 0 experience with drupal, so I'm just flailing) [00:45] ahh ok [00:45] you might *start* by using your already-in-place sshfs thing [00:46] to get you going, proof-of-concept the varnish setup and such [00:46] baby steps sort of thing y'know :) [00:46] I might end up doing that but I suspect drupal using a remote file store instead of a local one is not an uncommon config. [00:47] https://drupal.org/project/amazons3 [00:47] that actually looks more like the thing to look at [00:47] for an example [00:47] just replace http calls to s3 with webdav calls to your home server [00:47] yes was looking at that [00:47] I'm not comfortable editing the module. [00:47] also: you'll want to make sure you properly secure the webdav server at home [00:48] use authentication and probably ssl [00:48] yes. I've set up webdav with certs and authentication before. [00:48] even if it's just self-signed ssl [00:48] ahh ok [00:48] I don't mean to patronize :) [00:48] Nah, you're fine. You don't know what I know already. 
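The sshfs-vs-webdav point above ([00:42]: webdav "is just an http request") can be made concrete; a WebDAV upload is an HTTP PUT. A minimal sketch using only Python's standard library, where the URL, username, and password are placeholder assumptions:

```python
import base64
import urllib.request


def build_put_request(url, data, username=None, password=None):
    """Build an HTTP PUT request suitable for a WebDAV file upload.

    The url and credentials here are hypothetical; pass the returned
    request to urllib.request.urlopen() to actually send it.
    """
    req = urllib.request.Request(url, data=data, method="PUT")
    if username is not None:
        # HTTP Basic auth, as suggested above; only sane over SSL.
        token = base64.b64encode(f"{username}:{password}".encode()).decode()
        req.add_header("Authorization", f"Basic {token}")
    return req
```

Because a failed PUT is just an HTTP error rather than a hung POSIX syscall, transient connectivity problems stay visible and recoverable in the app, which is the argument being made in the chat.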
Better to be safe. [00:49] you might pop into a drupal channel or mailing list and see if someone has done a webdav version of that s3 plugin [00:49] hell, maybe there's even an ftp version [00:50] who knows?! [00:50] webdav just felt like the right choice :D [00:50] I hate ftp. cleartext passwords? [00:50] can use ssl on the control connection! [00:50] ;) [00:51] if the module supports it [00:51] anywho, hopefully that will get you going [00:51] I think it will. thanks! [00:51] like I said, you might just start by layering in the varnish part [00:51] I need to read up on varnish. I don't know anything about it really. [00:51] since you already have a transport mechanism for writes [00:51] via sshfs [00:51] hmm. [00:52] and that may end up being an acceptable solution for you, in which case boom. [00:52] you're golden [00:52] what if I upload the file manually and post a link? https://drupal.org/project/remote_file_source [00:52] I'm a fan of keeping file uploads in-app [00:53] * Lownin nods [00:53] and letting the app deal with putting the file in the right place and generating the proper url to reference it [00:53] otherwise yea, you could do that and not even need a plugin [00:53] and have varnish cache it [00:54] you'll also probably want to monitor the cache [00:54] make sure your miss rate isn't too high, you're not expiring things too quickly, you're not evicting too early, etc [00:54] like if you get a surge in traffic to some large files and it starts kicking out commonly-accessed assets, that's not good :) [00:55] maybe at that point you need to invest in a bit larger disk for the cache [00:55] yeah. [00:55] because you're trading the expense of disk on your vps for the limited bandwidth of your home internet connection [00:56] so you want to minimize what has to be pulled out of that home connection [00:56] exactly. 
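The monitoring advice at [00:54] mostly comes down to watching the hit ratio: varnishstat exposes cumulative cache_hit and cache_miss counters, and the arithmetic is simple. A small sketch in Python (the counter names are varnishstat's; the example numbers are made up):

```python
def hit_ratio(cache_hit, cache_miss):
    """Fraction of requests served from cache, given varnishstat's
    cumulative cache_hit / cache_miss counters.

    Returns 0.0 when there has been no traffic yet, to avoid
    dividing by zero.
    """
    total = cache_hit + cache_miss
    return cache_hit / total if total else 0.0
```

A ratio that drops suddenly (e.g. after a traffic surge evicts commonly-accessed assets, as described above) is the signal to revisit cache size or TTLs.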
[00:56] there's a break-even point, and that may move [00:56] and monitoring things will let you know when you need to change it [00:57] also, and this may not even be a consideration, but most VPS providers I've seen measure bandwidth usage by adding both inbound and outbound [00:58] so if you have a lot of cache misses you may be effectively doubling the bandwidth consumption for that file [00:58] so keep *that* in mind, too :) [00:58] hmm [00:58] reading a bit about how varnish works... [00:59] I'd like some of these files to only be accessible to users who are authenticated in drupal and have had the files shared with them. [00:59] making the file somewhat non-static. [01:00] you might be able to get away with that via X-Sendfile [01:00] it appears varnish only works in the context of anonymous page views [01:00] googling. [01:00] I don't know if the X-Sendfile header generally allows for an http source [01:00] I know apache's X-Sendfile is specifying a file on disk [01:01] looks like lighty can do it: http://blog.lighttpd.net/articles/2006/07/22/mod_proxy_core-got-x-sendfile-support/ === Tm_T_ is now known as Guest28208 [01:06] ahh, maybe not, bummer. [01:08] feature request! === Ursinha_ is now known as Ursinha === NightmareMoon is now known as Luna [01:08] allow X-Sendfile headers to provide a url which will be requested via the various proxy mechanisms [01:09] I found this: https://drupal.org/project/storage_api === daker_ is now known as daker === xerxas_ is now known as xerxas [01:10] Prevents files being served to users who are not authorised. [01:10] Depending on the service, this can have significant performance implications. [01:10] Even without this enabled, URLs will only be generated for files that the user has permission to access. [01:10] I've just installed it and am playing with it. it appears to allow the generation of static links but with url generation only available to users who have permission to view the file. 
it can also do "access control" but states " [01:10] whoops [01:10] sorry, weird irc client [01:10] anyway [01:11] FTP - files are uploaded to a directory via FTP. A URL prefix can be defined for serving. [01:11] I assume they're talking about things like varnish ;) [01:11] so I may pursue this route [01:11] security through obscurity may be good enough for this application. [01:11] that seems to be where you would want to shim in webdav support [01:11] replace the ftp uploading bit with webdav upload [01:12] anywho [01:12] good luck :) [01:12] thanks so much for taking time to help me. [01:12] I'm a bit jealous, actually, this sounds like it could be a fun project. [01:13] * Lownin smiles [01:13] Yeah I'm having a lot of fun with it so far. [01:13] a little more complicated than I'd generally have done, I'd probably just throw it on S3 and call it good. [01:14] but that's the lazy route :) [01:14] I'd love to use s3 [01:14] I'm broke [01:14] yea [01:14] I hear you [01:21] wait... [01:21] okay [01:21] still there jkitchen? [01:22] yup [01:22] FTP, uploads to homeserver [01:22] cool. [01:22] but the url now has to be homeserver/stuff [01:22] how does varnish intercept that? It can't [01:22] it can't [01:23] does it actually have to be homeserver? [01:23] or can it be yourserver/otherstuff? [01:23] no, but I don't understand how varnish knows to go to my homeserver for stuff. [01:23] oh [01:23] you tell varnish to use your homeserver for its origin [01:23] and then point clients at the varnish [01:26] is the "origin" any of the config options referenced here? https://www.digitalocean.com/community/articles/how-to-install-and-configure-varnish-with-apache-on-ubuntu-12-04--3 [01:27] https://www.varnish-cache.org/docs/3.0/tutorial/backend_servers.html [01:28] oh perfect. thank you! [01:28] disclaimer: I haven't personally ever actually even looked at a varnish config. 
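The "origin" mentioned at [01:23] is declared as a backend in Varnish's VCL. A minimal sketch in Varnish 3.x syntax, matching the era of the tutorial linked above; the host name is a placeholder for the home server:

```
# /etc/varnish/default.vcl (fragment, Varnish 3.x syntax)
# Fetch cache misses from the web server running at home;
# Varnish keeps a local copy and serves subsequent hits itself.
backend default {
    .host = "home.example.com";   # placeholder: your home server
    .port = "80";
}
```

The port Varnish itself listens on (e.g. 8002, as discussed later in the log) is set separately, via varnishd's -a option (on Ubuntu, in /etc/default/varnish).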
[01:28] heh [01:28] another team at $OLDJOB used it extensively [01:29] but it's a pretty simple program, really [01:34] gah [01:34] hmm [01:34] thought [01:35] cached pages a la the ones served by varnish will mess with the piwik javascript, won't it? [01:37] Lownin: I would put varnish in front of your home server for now and proxy back to it for certain urls via your primary webserver [01:37] so like, in apache you might do ProxyPass /uploads/ http://localhost:8002/ [01:37] and varnish listening on port 8002 caching things from your home server [01:37] rather than putting it in front of the entire app [01:38] eventually maybe having varnish live in front of the entire app, but that's once you're more familiar with varnish and such. [01:39] clients would then pull data on 8002 for things in /uploads? [01:39] clientside? [01:40] they could, and it would seem that ftp plugin would support that and that would be valid [01:40] or you could have it transparently done through nginx [01:40] which is what I would do, personally [01:40] through nginx/whatever web server you're using [01:41] the ProxyPass example I mentioned above would be for apache, but nginx can do the same thing [01:41] syntax may be different but it's the same concept [01:41] I'm on apache at the moment. was only looking at nginx [01:41] I don't want clients to connect on anything other than 80/443 anyway [01:42] then yea, look at apache's mod_proxy and the ProxyPass directive [01:42] it's pretty straightforward === justizin_ is now known as justizin [07:10] hey guys, I am setting up apache2-mpm-event + php-fpm and mod_fastcgi on kubuntu 13.10 for website development and testing on my localhost, yet I am getting the error: Forbidden - You don't have permission to access /php5-fcgi/index.php on this server. 
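The ProxyPass arrangement described at [01:37] (apache handing /uploads/ to a local Varnish on port 8002, which in turn caches content from the home server) might look like the following vhost fragment. The path and port are the hypothetical ones from the discussion, and mod_proxy plus mod_proxy_http must be enabled:

```
# Apache vhost fragment: clients stay on 80/443; /uploads/ is
# transparently served by the Varnish instance on localhost:8002.
ProxyPass        /uploads/ http://localhost:8002/
ProxyPassReverse /uploads/ http://localhost:8002/
```

ProxyPassReverse rewrites Location headers in redirects coming back through the proxy, so clients never see the internal 8002 address.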
[07:10] any help would be greatly appreciated === frojnd_ is now known as frojnd [09:20] evening all === Maple__ is now known as Guest8737 === zz_Gurkenmaster is now known as Gurkenmaster === Guest8737 is now known as Mapley === ikonia_ is now known as ikonia [14:25] Would anyone be willing to help with a network adapter question? Running 13.10 Desktop, 3.11.0-12-generic on a new HP/AMD64 desktop. 13.04 used to work, but now ifconfig does not show any usage metrics on my eth0 device. Ifconfig: http://paste.ubuntu.com/6312474/ Adapters: http://paste.ubuntu.com/6312520/ Nothing in dmesg relating to the adapter === freeflying is now known as freeflying_away === freeflying_away is now known as freeflying [14:28] I'm configuring apache, but the vhosts give me a headache [14:30] For some reason, it just doesn't pick the right vhost. /var/www/vhosts/domain.com/httpdocs is my document root, but if I go to domain.com, there is no index.html [14:30] But it is there [14:31] I double-checked my vhost file for domain.com 3 times, so I think this is a DNS issue [14:31] Any ideas? [14:43] Unkn0wn: check your logs [14:44] Figured it out already... Wrong IP in the DNS [14:44] 65 instead of 56... === _Sieb is now known as Sieb === Sieb is now known as _Sieb === _Sieb is now known as Sieb === Gallomimia_ is now known as Gallomimia === Sieb is now known as _Sieb [16:52] If I want to roll back to openjdk-6-jre do I have to uninstall openjdk-7-jre or just install 6 right over 7? [17:02] Beatstreet: you can install 6 and 7 alongside each other [17:03] hey all, I am unable to get apache 2.4 to work and display a site for me on localhost. any help would be greatly appreciated to figure out what's missing [17:03] I am using apache2-mpm-event + php-fpm and mod_fastcgi [18:13] does anyone know the performance difference between raid10 over mdadm versus lvm? [18:17] and does grub2 support booting either without a separate /boot partition? 
[18:40] Hello all, I am trying to connect to an NFS server as a client on Ubuntu server 12.04, but running "mount -t nfs 127.0.0.1:/z/test ~/mnt" gives this error: mount.nfs: rpc.statd is not running but is required for remote locking. mount.nfs: Either use '-o nolock' to keep locks local, or start statd. [18:41] it seems to be related to rpcbind. The problem is the rpcbind package is in conflict with nfs-common, so both cannot be installed together. [18:41] and -o nolock simply results in... [18:42] mount.nfs: No such device [18:42] is there a way to update openssl on 10.10 ? [19:10] Hello, is there anyone I can talk to about an issue I'm having with ubuntu server? [19:11] there's a few people here, but I think most are away from keyboard :/ [19:11] !ask | point [19:11] point: Please don't ask to ask a question, simply ask the question (all on ONE line and in the channel, so that others can read and follow it easily). If anyone knows the answer they will most likely reply. :-) See also !patience [19:11] :) [19:13] Ok, I will just ask the question, hold on [19:23] Hey guys, I'm having some issues with setting up a cron job. This is the first time I've done this; I want to run a script every 2 minutes. So I am typing: */2 * * * * myuser /opt/plexWatch/plexWatch.pl [19:23] this however returns: -bash: */2: No such file or directory [19:23] and I am within the /etc/ folder [19:23] where crontab is [19:23] So I recently set up a server with an old computer in my home. I can control it through SSH on my laptop. I can put files on it and stuff, so in that regard everything is working perfectly. Only recently, after a couple of days where everything worked as expected, the screen I have connected to the server gave me a continuous error message: "hub 1-0:1.0: unable to enumerate USB device on port 1". This error message is only displayed on [19:23] the screen that is connected to the server. When using SSH it's not displayed. 
I would find this problem not so important if everything else worked fine, but that is not the case anymore. I have 3 problems with this. 1st, I can't turn off the server remotely anymore (sudo shutdown -h now). 2nd, I can't turn on the server remotely anymore by using WOL (I was able to do this before). 3rd (smaller issue), when I want to input commands into the server [19:23] on the server itself (not using SSH) my commands work, but aren't readable because the error message keeps popping up. The third issue could be fixed by turning off the error message itself, but I figured I'd solve it entirely by just fixing the issue. I have looked for a fix for two days now, but nothing really came up. I only have a keyboard connected by USB to the server, but even when it is unplugged and I restart the server, [19:23] it still shows the error message. Anyone know how I can fix this? [19:25] In addition, if I put it to run every minute, so I type: * * * * * singularity9 /opt/plexWatch/plexWatch.pl [19:25] it returns: No support for device type: power_supply [19:25] O_o [19:26] first time I tried using cron it said it needed 'acpi' … so I installed that [19:26] if I run "/opt/plexWatch/plexWatch.pl" it works as expected.. [19:27] point: what version of ubuntu do you have? It seems a newer linux kernel fixed that issue [19:28] 12.04.03 LTS [19:28] 12.04.3* [19:28] hmm, definitely shouldn't be happening then. 
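The cron errors at [19:23] and [19:25] come from typing the schedule line into a bash shell rather than into a crontab (bash tries to glob and execute "*/2"), and from mixing two crontab formats: a per-user crontab, edited with crontab -e, has no user field, while the system-wide /etc/crontab does. A sketch using the script path from the discussion:

```
# Per-user crontab (edit with: crontab -e). Five schedule fields,
# then the command; there is NO user column here.
*/2 * * * * /opt/plexWatch/plexWatch.pl

# System-wide /etc/crontab. The sixth field is the user to run as.
*/2 * * * * myuser /opt/plexWatch/plexWatch.pl
```

Either form runs the script every two minutes; putting the six-field form into a per-user crontab would make cron try to execute a command named "myuser".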
:( [19:29] singularity9: cron can be a pain sometimes, you might want to try watch inside of a screen (messy but eh) :/ [19:46] Hey all, newb here. I want to host a website from home using php etc. I've tried winserv 03 and it's a damn headache, so I decided I should try out ubuntu server [19:47] I'm wondering how hard it's going to be to set up an ubuntu server, and config a webserver that can host php, mysql databases, etc === justizin_ is now known as justizin === chuck_ is now known as zul [21:02] Hey all, wanting to set up a server from home to host a website with php scripts, maybe later on mysql databases and other website things. I have an old machine: Pentium 4 2.80ghz 32bit, 3 GB Ram [21:03] I'm wondering if I can use the latest ubuntu server or will I need to use an older version, and if I do, will it be safe and secure? [21:06] naz: You should use 12.04 LTS. [21:12] So I recently set up a server with an old computer in my home. I can control it through SSH on my laptop. I can put files on it and stuff, so in that regard everything is working perfectly. Only recently, after a couple of days where everything worked as expected, the screen I have connected to the server gave me a continuous error message: "hub 1-0:1.0: unable to enumerate USB device on port 1". This error message is only displayed on [21:12] the screen that is connected to the server. When using SSH it's not displayed. I would find this problem not so important if everything else worked fine, but that is not the case anymore. I have 3 problems with this. 1st, I can't turn off the server remotely anymore (sudo shutdown -h now). 2nd, I can't turn on the server remotely anymore by using WOL (I was able to do this before). 3rd (smaller issue), when I want to input commands into the server [21:12] on the server itself (not using SSH) my commands work, but aren't readable because the error message keeps popping up. 
The third issue could be fixed by turning off the error message itself, but I figured I'd solve it entirely by just fixing the issue. I have looked for a fix for two days now, but nothing really came up. I only have a keyboard connected by USB to the server, but even when it is unplugged and I restart the server, [21:12] it still shows the error message. Anyone know how I can fix this? (I asked this question a couple of hours ago but to no avail :( hoping anyone that logged in now will be able to help me) === Gurkenmaster is now known as zz_Gurkenmaster [21:19] thanks bekks, was worried since the machine is so old it might run slow, but I will give it a try :) [21:20] Well, it will run slow, but you should use an LTS release for a server at least. [22:31] jdstrand: please could you comment on bug 1245251? The reporter claims this is a regression. [22:31] Launchpad bug 1245251 in libvirt "Apparmor blocks usb devices in libvirt in Saucy" [Undecided,New] https://launchpad.net/bugs/1245251 [23:15] Ok, so I reported this problem 2 times before, but I'm going to give you a small recap because something interesting just happened. [23:15] So my server gives me the error "hub 1-0:1.0: unable to enumerate USB device on port 1" [23:15] But it stopped now [23:15] I typed in lsusb to give me an overview of all usb devices [23:16] and now it shows a device it didn't show when I had the error [23:16] Silicon 10 Technology Corp. Flash Card Reader [23:17] So I'm going to try to unplug this thing tomorrow [23:17] but does anyone have any idea what could have triggered the stop of this error? [23:18] just curious [23:19] But it will probably just work when I have unplugged the card reader, so it doesn't really matter [23:22] is there a way to update openssl on 10.10 ? [23:34] 10.10? 
that hasn't been supported for a long time [23:34] 10.10 was end of life in april 2012 [23:40] the good news is that upgrading to a supported release is relatively easy [23:40] the bad news is you'd have to do three upgrades to reach a supported release === bazhang_ is now known as bazhang