=== YamakasY_ is now known as YamakasY
=== peter is now known as Guest66287
=== RaptorJesus_ is now known as RaptorJesus
[08:17] question, I am working at a place that uses a shared file system, so there's one server which they install all the apps on, and the rest of the workstations run them from the server. each user has a homedir with a small quota. the server also has eclipse installed on it, so every time a user runs eclipse from his workstation, eclipse uses quota space to store osgi/bundles and that maxes out the quota. now my question, is there any way to
[08:18] config eclipse to use a fixed location for those osgi/bundles?
[08:18] so it won't have to deploy those osgi/bundles locally in each user's homedir?
[12:56] I have a server at home on which I host a website with some domain, and I also have two internet connections from two internet providers. How can I use the second connection as a backup, so that when the first one is unavailable my website will still be reachable through the domain?
[13:03] you have to switch the DNS entry from the first to the second provider. that will be an issue.
[13:08] is it possible to have two A records
[13:09] one for each gateway
[13:21] vegnt: Did you see my suggestion for a load-balancing proxy outside your network?
[13:38] TJ-, yes thanks
[13:38] I need to read what that means exactly
[13:38] if round-robin DNS with multiple A records counts as load balancing
[13:38] then i think that will work
[13:50] It won't.
[13:51] Because round robin implies that the failed connection will still be used. you have to disable the failing entry. round robin doesn't help you at all at that point.
[14:38] sarnold: ping.
=== lazyPower is now known as lazypower_travel
=== thesheff17_ is now known as thesheff17
[16:54] Hello there
[16:56] hi
[17:00] RoyK: What would you recommend I use to create a full backup of my Ubuntu VPS?
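For the Eclipse-on-NFS question at the top of the log: Eclipse's launcher takes a -configuration option that relocates the configuration area, which is where the OSGi bundle cache lives. A minimal sketch, assuming Eclipse is installed under /opt/eclipse and that /var/tmp is local, unquota'd scratch space (both paths are assumptions, not from the log):

```shell
# Put each user's Eclipse configuration area (OSGi bundle cache) on
# local scratch space instead of the quota'd NFS home directory.
# /var/tmp and /opt/eclipse are hypothetical; adjust to the site.
CONFIG_DIR="/var/tmp/$USER/eclipse-config"
mkdir -p "$CONFIG_DIR"
# The launch itself, commented out since /opt/eclipse is a stand-in:
# /opt/eclipse/eclipse -configuration "$CONFIG_DIR" -data "$HOME/workspace"
echo "configuration area: $CONFIG_DIR"
```

The workspace (-data) can stay in $HOME since it holds the user's own projects; only the regenerable bundle cache needs to move off the quota.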
[17:01] By backup I mean create a type of backup that I can download and restore elsewhere. I want to DL the backup to my local Ubuntu machine
[17:04] Joe_knock: I believe deja-dup is rather good
[17:05] Will I still need to do an SQL dump even with a backup solution? RoyK
[17:05] I use dirvish here myself.
[17:05] Joe_knock: depends on the DBMS used
[17:05] Joe_knock: which one do you use?
[17:06] RoyK: MySQL
[17:06] then make regular dumps
[17:06] lordievader: But will the process I explained work with dirvish?
[17:07] mysqldump --all-databases | gzip -c > mydump.sql.gz
[17:07] or perhaps a separate dump per database for easier restore
[17:07] RoyK: The server itself hasn't been used in a while, which is why I want to back up everything and cancel use.
[17:08] then just make a dump of the database(s) and back it up
[17:08] Joe_knock: even rsync will do it
[17:09] Will I require another piece of software to DL whatever I've backed up?
[17:09] Joe_knock: rsync -avPAHX yourserver:/ /place/to/put/it
[17:09] Joe_knock: rsync -avPAHX root@yourserver:/ /place/to/put/it
[17:09] make sure to enable ssh root login first if it's not enabled
[17:10] Joe_knock: dirvish is a wrapper around rsync; if your server has ssh you can transfer the backup data.
[17:10] RoyK: By using that command above, I will be able to take the remote file and put it on my local machine?
[17:11] interesting, lordievader.
[17:11] Joe_knock: the command above is for transferring the entire filesystem(s) from your server to your local machine
[17:11] as it is or as they are
[17:12] RoyK: I've read that I must avoid certain directories when doing a backup.
Directories like /dev make it difficult to restore
[17:12] Joe_knock: This is a nice guide on it: http://wiki.edseek.com/howto:dirvish
[17:12] Joe_knock: then
[17:13] Joe_knock: rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys root@yourserver:/ /place/to/put/it
[17:13] should do it
[17:14] RoyK: Based on what you've said, would my best solution be to do an SQL dump first of the database(s) and then run the command you've given above, so that all files and the SQL DB are backed up properly?
[17:15] Joe_knock: the command above transfers the entire system - not just the database dumps
[17:15] Joe_knock: if you only need the database(s), just copy the dump files
[17:16] oh okay. So I can store the dump files on the remote server and they will be backed up onto my local machine too? I was thinking I would first DL the dump files separately and then run the command above for all other files
[17:16] RoyK: ^^
[17:16] Joe_knock: erm - do you want to back up the whole server, or just the databases?
[17:17] Everything, but I want dump files for MySQL so that I can restore it properly
[17:17] Everything except /dev, /proc, /sys
[17:17] then dump the database somewhere
[17:17] and rsync the lot
[17:17] that'll include the dump files, obviously
[17:18] Thanks RoyK, that is what I will do. Your help is much appreciated
[17:18] np
[17:24] Hi, I'm using Ubuntu Server and I have set up my web server with Apache, Webmin and Virtualmin. What is the best way to get an email like info@domain.com to send and receive?
[17:25] !webmin
[17:25] webmin is no longer supported in Debian and Ubuntu. It is not compatible with the way that Ubuntu packages handle configuration files, and is likely to cause unexpected issues with your system.
[17:27] so what is the best and simplest way to run a web server?
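Putting RoyK's backup recipe together, the whole job is two commands: dump the databases on the server, then pull the filesystem. A sketch using the log's placeholder host root@yourserver and destination /place/to/put/it; the commands are echoed rather than executed, since the host is a stand-in:

```shell
SERVER="root@yourserver"   # hypothetical host from the log
DEST="/place/to/put/it"    # hypothetical local destination
# 1. Dump all MySQL databases into a file on the server.
echo ssh "$SERVER" "mysqldump --all-databases | gzip -c > /root/mydump.sql.gz"
# 2. Pull the entire filesystem, skipping the kernel pseudo-filesystems.
#    The dump from step 1 rides along with everything else.
echo rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys \
     "$SERVER:/" "$DEST"
```

Doing the dump first matters: rsync copies whatever is on disk at transfer time, and raw MySQL data files copied from a running server may not restore cleanly, whereas the dump will.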
[17:27] i found that webmin is very easy, that's why i used it
[17:28] and also how can i fix it so i can have an email for my domains (i have three)
[17:31] ahmadgbg: apt-get install apache2
[17:31] then configure it
[17:31] it's not that hard
[17:31] RoyK: i already have it
[17:31] just learn the command-line basics
[17:32] but don't use webmin
[17:32] RoyK: the website is up and running, but i have a problem with the mail
[17:34] then configure postfix or exim or whatever MTA you're using
[17:36] MTA?
[17:39] yes, i have postfix
[18:35] hi! i have 2 different physical servers, both with virtualmin installed. server1 has DNS for domain.tld and I managed to make it point to server2 for subdomain.domain.tld (external IP of server2), but now i want the sub.subdomain.domain.tld public_html files to also be on server2. how does one go about something like that?
[18:48] ntz_: if subdomain.asdf.tld is delegated, all subdomains are too
[18:50] RoyK: they are. but where do i define/create the sub.subdomain.asdf.tld? on the server1 or server2 machine?
[18:53] in the zone file? or apache config? I don't quite get it
[18:59] oh, wasn't aware you could do it on either. i would say go with the zone file
[19:04] so which machine's zone file should i edit?
[19:06] ntz_: if you have delegated the subdomain from server1 to server2, so that server2's DNS is the SOA for it, then server2 would generally be the one to host an additional subdomain zone file
[19:09] TJ-: how can i be 100% sure this is the case?
[19:10] i mean, that i have correctly delegated the subdomain from server1 to server2
[19:12] see what the NS records are for the sub-domain
[19:12] ntz_: "dig -t NS sub.domain.tld"
[19:15] TJ-: ;; ANSWER SECTION: subdomain.asdf.tld. 38400 IN NS server1.
[19:15] does that mean it's not delegated?
[19:15] ntz_: Looks that way; depends where you are querying from and what the TTL on domain.tld is, of course.
[19:16] i run the command from server1
[19:16] not delegated, huh?
[19:58] hi.
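On the delegation question above: a subdomain is delegated only when the parent zone on server1 contains NS records (and, if the nameserver's name falls inside the delegated zone, glue A records) pointing at server2. A sketch of what the missing records and the verification look like, with ns2.asdf.tld and 203.0.113.2 as hypothetical stand-ins for server2:

```shell
# Records that would need to exist in server1's zone file for asdf.tld
# (names and address are stand-ins, not from the log):
#   subdomain   IN NS   ns2.asdf.tld.
#   ns2         IN A    203.0.113.2
# Then re-run TJ-'s check from a host OTHER than server1; if the answer
# still lists only server1, the delegation is not in place:
QUERY="dig -t NS subdomain.asdf.tld +short"
echo "$QUERY"
```

Once the delegation answers with server2's nameserver, sub.subdomain.asdf.tld is created purely in server2's zone file, which matches what TJ- says about the SOA holder hosting it.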
small question: what is the best filesystem for storing an unlimited amount of files with Ubuntu? with ext4 it looks like there is a limited amount of files that can be stored. i want to create hundreds of billions of small files for a test project and i don't think ext4 will serve me!
[20:01] xperia: Well, I doubt you can find any filesystem which allows you to store an unlimited amount of files.
[20:01] xperia: That said, ZFS is supposed to be able to handle 2^48 files and Btrfs is supposed to handle 2^64 files.
[20:03] andol: thanks a lot for the reply! will look into it. i have read that reiserfs should provide the possibility to store an unlimited amount of files. could this be true?
[20:04] xperia: Realize what a large number 2^48 is, not to mention 2^64?
[20:08] andol: yeah it is huge, 2^64 => 18'446'744'073'709'551'616. i have heard that btrfs however is not really stable. one person i know tried to use btrfs but he changed back. the problem however could maybe be that he tried to use it on a raid array...
[20:09] When I run this command: mysqldump --all-databases | gzip -c > mydump.sql.gz Where will my sqldump appear?
[20:09] Joe_knock: where you execute the command
[20:10] so if I ls, I will see it in the place I execute it? xperia
[20:11] yes
[20:14] Joe_knock: pwd also shows you the current working directory.
[20:15] I guess it makes sense to store my SQL dump in a specific folder though
[20:18] Joe: in this case use the full path you like, for example: mysqldump --all-databases | gzip -c > /tmp/mydump.sql.gz
[20:44] i have a problem installing a realtek ethernet NIC needing the r8169 driver
[20:45] i get the message: INSTALL /home/fai/Downloads/r8169-6.018.00/src/r8169.ko
[20:45] Can't read private key
[20:45] DEPMOD 3.11.0-18-generic
[20:45] what should i do about it?
[20:48] faiss, have you installed firmware-realtek?
[20:48] i don't think so, how should i do that?
[20:49] faiss, not sure about ubuntu, but on Wheezy this package solved all realtek driver issues
[20:49] what's the package name? let us check it in apt
[20:49] apt-cache search firmware | grep real
[20:50] i think it's gonna work :p
[20:50] i forgot about that
[20:50] but i think it's apt-file, no?
[20:53] faiss, firmware-realtek is not found on ubuntu-server 12.04
[20:55] no, it works fine on 12.04 but not in 13.10
[20:56] i just upgraded to 13.10, since that time i have the problem
[21:19] faiss: what does => lsmod | grep r816 return?
[21:22] xperia, it returns "r8169 0" but after a simple reboot it returns nothing
[21:23] hmmmm, looks like you had loaded it somehow ... try
[21:23] rmmod r8169
[21:23] modprobe -r r8169
[21:23] lsmod | grep r816
[21:23] what does it output?
[21:28] xperia, returns nothing
[21:29] i'll test another release, 014 instead of 018, and i'll retest
[21:30] so the module might not be loaded then. some people report that this is actually not really the right driver for the network card. can you post the output of this command here
[21:30] lspci -v
[21:32] done. ----> http://paste.debian.net/plain/90652
[21:33] okay let me check this here => Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168/8411 PCI Express Gigabit Ethernet Controller (rev 07)
[21:34] faiss: yeah, 8169 looks like it's wrong! you need => r8168 ...
[21:35] try this howto here => http://ubuntuforums.org/showthread.php?t=2204946
[21:36] even with r8168 i got the same problem, so i tried with r8169 because it worked for a friend with the same model of computer
[21:37] i'll follow the howto again, and i'll retest
[21:37] thank you xperia :)
[21:38] faiss: check also this here => http://ubuntuforums.org/showthread.php?t=2201556
[21:38] it has much more info in case you still have problems.
[21:39] with the module driver r8168 it should work
[21:47] is it normal to wait 2 hours for resizing an SSD partition from 500 GB to 125 GB?
Don't understand why resizing an SSD partition takes that long and is still not finished.
[21:48] xperia, ifconfig -a shows p3p2 but no way to connect
[21:49] xperia, what did you expect?
[21:49] and no rules in /etc/udev/rules.d/ for p3p2
[21:50] Patrickdk: Hmmm, other hard drives normally take 1 hour and the job is done, but maybe i don't remember right anymore.
[21:51] faiss: what module is loaded? when you do lspci -v it should output something like "Kernel driver in use: ..."
[21:51] well, it depends on the size of the files
[21:51] if it's all large files, an hour
[21:51] if it's all tiny files, it could take a long long time
[21:52] huuuhhh, well i have about 4'000'000 tiny files on the SSD
[21:52] xperia, Kernel driver in use: r8168, and ifconfig shows the ethernet card, and it is activated using ifconfig p3p2 up
[21:53] so are you able to "ping google.com"?
[22:00] faiss: should p3p2 actually be eth0?
[22:13] xperia, with all tiny files, you're limited by drive latency :(
[22:16] Patrickdk: i guess i need to leave it for the next 6 hours then. Thanks for the info.
[22:17] what model ssd?
[22:17] must be an older one
[22:20] Patrickdk: Samsung SSD 840 Series MZ-7TD500
[22:21] evo?
[22:22] sas ssds are still expensive as hell
[22:25] Patrickdk: the package does not say anything about PRO, so i guess it is the EVO version. Planning to buy the 1 terabyte SSD in the next months, hopefully.
[22:26] well, the rounded ones are evo
[22:26] 250, 500, 750, 1000
[22:26] the base-2 ones are pro: 256, 512
[22:27] ahh okay, then it is EVA as i have the 500GB Samsung SSD
[22:27] sorry, EVO
[22:27] ya, that isn't its best workload :)
[22:27] still sounds a bit slow though
[22:28] I almost never shrink filesystems though
[22:29] Patrickdk: the problem is i just realized that i need to use btrfs from now on on all my new drives to avoid the inode limits. so i started now to resize the SSD to test a BTRFS partition.
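On the inode limit xperia keeps running into: ext4 fixes its inode count when the filesystem is created, and its inode numbers are 32-bit, so it tops out at roughly 4.3 billion files no matter how it is tuned. Hundreds of billions of small files genuinely will not fit. A small sketch for checking the situation; /dev/sdb1 is a hypothetical device:

```shell
# How many inodes does the root filesystem have, and how many are free?
df -i /
# When creating a new ext4 filesystem for lots of tiny files, a denser
# inode ratio (one inode per 4 KiB) raises the ceiling, but never past
# 2^32 inodes. Commented out since the device is a stand-in:
# mkfs.ext4 -i 4096 /dev/sdb1
```

This is why the conversation points at Btrfs/ZFS: they allocate inodes dynamically, so the file-count limit is the address-space bound (2^64 / 2^48) rather than a number fixed at mkfs time.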
[22:31] Patrickdk: what do you think about this 1 terabyte SSD? it's quite cheap, only 50 cents per gigabyte => http://www.steg-electronics.ch/de/article/samsung-ssd-840-evo-basic-609032.aspx
[22:43] depends on the usage
[22:44] I've been leaning more towards the m500 960gb one
[22:44] it's a little slower, but not triple-layer
[22:44] for a laptop/desktop either should be fine
[22:44] but I was going to put it into server usage
[22:45] though, the m550 just came out last week, and is close to the same specs, but still not triple-layer
[22:48] Patrickdk: ahh, the "Crucial m500 960GB". interesting, it's in the same price range.
[22:49] I'm looking at filling a few servers with those
[22:49] When trying to run this command: rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys root@yourserver:/ /place/to/put/it I am getting a "Connection refused" message.
[22:49] yeah, it has somewhat fewer IOPS, 80'000 IOPS vs. 98'000 IOPS
[22:49] Joe_knock, and?
[22:50] xperia, you will never see those iops
[22:50] Patrickdk: rsync doesn't follow through then
[22:50] unless you always have like a 32 qd
[22:50] Joe_knock, that message is pretty self-explanatory
[22:50] connection refused
[22:50] what more do you want to know?
[22:51] Patrickdk: I'd like to know if there might be something wrong with my command. It says root@myserver.com port 22: Connection refused (but I changed my port)
[22:52] I don't see anywhere in your rsync command where you changed the port
[22:53] Patrickdk: I changed my port a long time ago. Do I need to add the current port in my command?
[23:04] Joe_knock: i am not sure but i think you need something like this => root@yourserver:YourPortNumber/ /place/to/put/it
[23:04] Do I have to use vconfig to create a VLAN? Can I just enable the 8021q kernel module and configure /etc/network/interfaces properly for a vlan interface? I am stuck in a vlan and can't apt-get install vlan
[23:04] xperia: It seems even my mysql dump is being prevented for the root user
[23:05] mysqldump?
[23:06] Joe_knock: what do you mean by prevented for the root user? are you able to make a mysqldump as a normal user but not as root?
[23:06] WannaBeGeekster: I am trying to create a backup of it
[23:06] You are using mysqldump and not mysql dump, right?
[23:06] xperia: I tried as a normal user using sudo, and I am in root right now getting this error:
[23:06] mysqldump: Got error: 1045: Access denied for user 'root'@'localhost' (using password: NO) when trying to connect
[23:07] ahh, -p
[23:07] mysqldump -p ...
[23:07] Then you will need to put in your root password if you are running as the root user. If you are running as a userland user then use mysqldump -u root -p
[23:08] You have to put user= and pass= in your .my.cnf to make the password work without having to supply it on the command line. You can make a heading specifically for mysqldump: [mysqldump] user=root pass=mypassword
[23:10] Success!
[23:10] Is this something I should worry about: Warning: Skipping the data of table mysql.event. Specify the --events option explicitly. ??
[23:11] I wouldn't worry about it. Unless you really want to keep them.
[23:12] Ahh. Actually. Do you have scheduled events configured? If so then I would do your backup again with that option.
[23:13] scheduled events? no, I didn't get that far. This VPS was to host a code-management tool and an outgoing mail server. The guys running the company are screwing around, so I'm backing up and leaving them.
[23:14] Gotcha.
[23:14] Yes, then I wouldn't worry about it.
[23:14] They don't even have a working customer portal anymore. That is how bad it is.
[23:15] Wow, that is nuts.
[23:16] I am at the datacenter right now setting up a new cloudstack cluster.
[23:16] My billing system works and I don't even have any customers right now. lol
[23:16] WannaBeGeekster: I am now going to transfer everything to my local PC using this: rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys root . Will I need to include -p in this as well?
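A note on the ~/.my.cnf trick WannaBeGeekster describes just above: the option name the MySQL client tools actually read is password=, not pass=. A sketch of the fragment, written to a side file rather than the live ~/.my.cnf so nothing is clobbered:

```shell
# Write the credentials fragment to an example file; move it to
# ~/.my.cnf yourself once the contents look right.
cat > "$HOME/.my.cnf.example" <<'EOF'
[mysqldump]
user=root
password=mypassword
EOF
# The file holds a plaintext password, so lock the permissions down.
chmod 600 "$HOME/.my.cnf.example"
```

With that in place, `mysqldump --all-databases` runs without -p and without the 1045 access-denied error seen in the log.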
[23:16] WannaBeGeekster: Are you selling VPS stuff?
[23:16] Actually I am
[23:17] On your own?
[23:17] You can PM me if you like
[23:17] And the -p is only necessary for mysql and its related tools if you have a password set for the root account in MySQL specifically.
[23:17] Let me give the rsync command a try.
[23:18] Go for it. I am curious to see if it works for you.
[23:25] that command failed :'D
[23:27] Joe_knock: You'll need at least one "--include" too
[23:27] TJ-: I want to backup everything except those 3. Does the command require a specific include?
[23:28] I get a "ssh: ... port 22: Connection refused"
[23:28] Joe_knock: Oh well, that isn't the parameters. That's no ssh server listening, or the firewall DROPing connections to port 22
[23:29] TJ-, the full command goes like this:
[23:30] rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys root@myserver.com:/ /path/to/backup
[23:30] I am trying to get the backup copied to my local system
[23:32] Joe_knock: Well, you need to focus on the ssh connection first. Can you do "ssh root@myserver.com" and get an interactive log-in?
[23:34] TJ-: Do I need to run that command when I am logged in to my VPS or when I am on my local system? I can SSH into my remote server
[23:35] Joe_knock: From the local PC, to the server. If that works, but the rsync command doesn't, then there's something of a syntax or typo error in your remote server
[23:39] TJ-: I am using a custom port on my server. Could the port 22 error be because of that?
[23:40] Joe_knock: Yes! You need to tell rsync the port
[23:41] TJ-: I tried this: rsync -avPAHX --exclude=/dev --exclude=/proc --exclude=/sys root@myserver.com:[portnum]/ /path/to/backup but that didn't work. Any other options?
[23:49] Joe_knock: Does rsync connect, or get the same connection-refused message?
[23:50] TJ-: Same error
[23:51] Joe_knock: Then you have some kind of typo/syntax error on the command line
[23:51] Joe_knock: try adding debug output: "-vvvv"
[23:52] It could be this: -avPAHX
[23:52] Joe_knock: If you get the connection error, then it's the remote system specification
[23:54] It doesn't give any connection errors.
[23:54] Perhaps it could be my client that is blocking it.
[23:54] Joe_knock: you said "Same error"
[23:56] Oh, you mean the same "connection refused" error?
[23:57] is there a step-by-step guide at all for setting up a mail server with IMAP/POP3 and SMTP anywhere that works? I couldn't find any that were really complete enough for what I need...
[23:58] Have you looked at Postfix? teward
[23:58] Joe_knock, I've looked at postfix, yes, but never found a guide that completely explains its configuration and setup
[23:59] while I may be decently fluent with nginx, I'm essentially a newbie at setting up mail servers
[23:59] and the other thing is i need a mail server that can work without linux user accounts tied to it (not sure if postfix can do this?)
[23:59] teward: are you looking for something highly custom?
[23:59] teward, I think my mail server is running on a single user account
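The custom-port problem that runs through the end of the log never gets resolved there: rsync has no host:port/path syntax over ssh, so neither xperia's `root@yourserver:YourPortNumber/` guess nor the `[portnum]` attempt can work. The port belongs to the ssh transport, passed through rsync's -e option. A sketch with 2222 standing in for Joe_knock's unstated custom port; the command is echoed rather than run, since the host is a placeholder:

```shell
# -e tells rsync which remote shell to use; the port goes to ssh itself.
# 2222 and myserver.com are stand-ins for the real values.
RSYNC_CMD='rsync -avPAHX -e "ssh -p 2222" --exclude=/dev --exclude=/proc --exclude=/sys root@myserver.com:/ /path/to/backup'
echo "$RSYNC_CMD"
```

Testing first with a plain `ssh -p 2222 root@myserver.com`, as TJ- suggests for port 22, confirms the transport before rsync enters the picture.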