[00:57] got real quiet here...
[01:08] it happens
[13:23] g'morning all....are there any Python experts around, and if so are you familiar with paramiko?
[13:32] morning Lupine
[13:32] hey man, how's things been
[13:32] good, what do you need help with?
[13:33] http://paste.ubuntu.com/556510/
[13:33] it looks like it should be so simple, it's driving me crazy
[13:35] I got this from examples found all over the place...and it's an exact duplicate of the examples
[13:38] hmmmm
[13:38] I don't see anything wrong
[13:39] did you install from the repos?
[13:39] damn....then that means it must be the host I'm trying to connect to
[13:39] a Cisco router
[13:39] yes...installed from the repos, and even tried older/newer versions...I bet ya it's the stupid host
[13:41] yeah, I just tried your script on my box and it ran fine
[13:41] grrrrr....stoopid me...it's the host, when I try the above with a normal Linux host, it works as expected.
[13:41] thanks for the sanity check...guess I'll have to try something else for the Cisco router
[13:42] good luck
[13:42] thx
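(The paste above has since expired, so this is a guess at the scenario rather than the actual script: a minimal paramiko sketch of the standard SSHClient pattern, plus the workaround that often helps with Cisco gear -- many IOS SSH servers refuse the exec channel that exec_command() opens, while an interactive shell via invoke_shell() works. Host, credentials, and commands below are placeholders.)

    import time
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    # Placeholder host/credentials; key lookup and the agent are disabled
    # because network gear usually wants plain password auth.
    client.connect('192.0.2.1', username='admin', password='secret',
                   look_for_keys=False, allow_agent=False)

    # client.exec_command('show version') is the textbook call, and it works
    # against a normal Linux sshd -- but many Cisco IOS devices reject the
    # exec channel, so drive an interactive shell instead:
    shell = client.invoke_shell()
    shell.send('terminal length 0\n')   # disable IOS output pagination
    shell.send('show version\n')
    time.sleep(2)                       # crude; real code would poll shell.recv_ready()
    print(shell.recv(65535).decode())

    client.close()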
[15:33] hey guys, anyone know a way around wget limitation of 2gb or whatever random number it is?
[15:33] I have a server with a 2.4gb file i need to download that is tgz
[15:33] i downloaded all the rest fine
[15:33] but this one through http or wget gives a 403 error
[15:34] wget has a file size limit?
[15:35] Does Curl also have a file size limit?
[15:35] I dont think its wget or curls problem
[15:35] Can you ftp the file instead using get?
[15:35] i think its on my source server thats having a problem
[15:36] well im trying to do a direct transfer
[15:36] because i tried doing it through my work computer and its saying its gonna take 18 hours to download
[15:36] whereas a direct transfer would take about 10 minutes
[15:36] so i dont mind wasting a bit of time to figure it out..
[15:40] wget uses a pointer to a LONG to store where it's up to in a file.
[15:41] 403 == forbidden
[15:41] it's returned by the webserver
[15:41] yea
[15:41] change your --user-agent?
[15:42] --user-agent=FireFox/Mozilla or something like that...the server may be refusing any requests from *wget*
[15:42] i tried directly from firefox and got the same
[15:43] time to edit the webserver config then...
[15:43] maybe the webserver is timing out the connection
[15:44] gah, the in-laws windows computer won't boot
[15:44] I'll be making a house call this evening
[15:45] windows is such a pain to fix
[15:45] can you use rsync to pull files from remote server?
[15:45] yes
[15:45] i know you can push
[15:45] its the same as rsh
[15:45] but the server is locked down and wont let me push to it lol
[15:46] I hate taking over crap from previous IT people
[15:46] weirdest thing.. I tried even ssh'ing from another server and it wont let me in with same login credentials im using to login from work
[15:50] yay think its working!
[15:50] zsync, ftw
[15:50] rsync -avz -e ssh credentials@mysite.com:/path/to/file /new/path/to/file
[15:50] if you're going to have to keep pulling updates to that file, you want zsync
[15:51] looks like its gonna work.. we will see
[15:51] mhall119: nope server migration
[15:51] ok
[15:51] company has been paying for 2 dedicated servers for 6 months because previous IT talked it up and talked it up and then never actually migrated
[15:51] and they are pissed lol
[15:51] nice
[15:51] yea to the tune of 700/month for the new server
[15:53] dual quad core, 8gb mem, 6 redundant raid 73gb hd's, 10 ips and 2tb bandwidth
[15:53] fully managed
[15:53] where?
[15:53] plus hardware firewall and managed backup
[15:53] rackspace
[15:55] beats the pants off my dual single core 1gb mem single 36gb scsi server i have in my office lmfao
[15:55] on their cloud or no?
[15:55] uhm.. its a dedicated server.. dunno lol
[15:55] i just have the list of equipment
[15:55] ok
[15:55] i didnt set it up
[15:56] I'm looking around at hosting options for my brother
[15:56] im just left with crossing my fingers and praying for minimal downtime when i do the transfer
[15:56] getting prices, etc
[15:56] well I love rackspace cloud
[15:56] for vps
[15:56] how's their support?
[15:56] I have like 6 clients i have on there
[15:56] have you ever needed it?
[15:56] support is great. 24/7
[15:56] I needed it when i was up at 2 am and did an oopsie
[15:56] but never for down time or anything
[15:56] cool, that's hopefully all he'll need it for too
[15:56] and cheap
[15:57] do you host any Windows systems with them?
[15:57] nah i hate windows servers
[15:57] every client ive ever had with a windows server permissions were nightmarish
[15:57] how big of a server does he need?
[15:58] I like the cloud servers cause you can get them starting at 10.95 a month + bandwidth
[15:58] and their bandwidth charges are fairly cheap
[15:58] or, if you prefer to go ahead and buy bandwidth at the same time.. i dunno if you know that slicehost is rackspace
[15:58] not too big, I think
[15:59] slicehost 256mb slice with 15gb bw is 20 a month
[15:59] a dozen or so ASP.NET websites
[15:59] bigger than that
[15:59] hmm... so like a 512 or 768
[15:59] i dunno how resource hungry asp.net is
[15:59] I would say closer to 2GB
[15:59] I don't know much about asp.net either
[15:59] ah ok.. lol im no asp.net/windows fan
[16:00] well 2gb with 1200gb transfer is 130 from slicehost
[16:00] me either, but he'll be hosting asp websites, so there's not much choice
[16:01] hmm. slicehost doesnt offer windows
[16:01] so it'd be around 116/month for a 2gb rackspace vps + bandwidth
[16:02] yeah, I have no idea what kind of bandwidth he'll need
[16:02] bandwidth is fairly cheap
[16:02] and i find people overestimate their bandwidth a lot
[16:03] better than underestimating it
[16:03] had a large client with thousands of pictures and about 500 hits a day, art gallery.. only used like 30gb of bw a month
[16:03] hmm maybe they had more than that for hits
[16:04] either way... bandwidth is only 18 cents a gig out and 8 cents a gig in
[16:04] hmm me hopes rsync hasnt frozen
[16:05] it doesnt give any indications..
[16:05] wow, amazon's prices for windows servers are expensive
[16:05] this is all it says...
[16:05] What's the use of a windows server?
[16:05] tiemonster: hosting asp.net websites
[16:05] why would you want to do that?
[16:06] paying clients with asp.net websites who need hosting
[16:06] http://screencast.com/t/KjcNnGfdM
[16:06] must be desperate
[16:07] thats how its been sitting since i said yay! it worked. lmao
[16:08] oh wait.. maybe i underestimated the size
[16:08] oops
[16:09] 2733971998
[16:09] how big is that in gigs? lol
[16:09] 27
[16:09] is that bytes?
[16:10] yes
[16:10] thats what ls -la gives me
[16:10] 2.5
[16:10] so i assume its bytes
[16:10] not 27
[16:10] 2.5
[16:10] thats what i thought.. rsynce shouldnt take that long!
[16:10] rsync*
[16:10] unless it's the first run
[16:10] it is
[16:10] just one file
[16:10] just
[16:10] i meant 2.7... my period key didnt fire
[16:10] uh huh
[16:10] lolol
[16:10] I would give it a few hours
[16:11] damn..
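(For the record, the arithmetic being done by eye above: 2733971998 bytes is about 2.5 GiB, not 27 -- the "27" was just the first two digits. Also, rsync prints nothing by default, which is why it looks frozen; adding --progress (or -P, which also keeps partial files across restarts) to the rsync command shown earlier makes it report transfer status. A one-liner to check the size, assuming the byte count from ls -la:)

    print(2733971998 / 1024.0**3)   # -> 2.546..., i.e. ~2.5 GiB, not 27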
[16:11] well i guess its better than the 18 it told me it was gonna take through sftp
[16:11] just to download to my local computer
[16:11] subsequent runs will go much faster
[16:11] because it will calculate deltas
[16:11] leave me alone... i'm on a windows machine with a keyboard whose keys are raised 2 inches
[16:11] lol dont need subsequent runs
[16:11] you have to literally punch the keys
[16:11] which is defeated by using tar...
[16:11] i just need to get this backup file of one of our websites out to new server
[16:12] one time only?
[16:12] yes
[16:12] scp would be faster
[16:12] tar isn't the problem, gzip is
[16:12] mhall119: yeah - that's what I meant
[16:12] i tried using wscp... it said 13 hours
[16:12] to download to local computer
[16:12] then who knows how long to upload it back
[16:12] probably accurate
[16:13] winSCP i mean
[16:13] lol
[16:13] if you've got a slow connection, that would be right
[16:13] it usually takes 6-8 hours for me to download Ubuntu releases at home
[16:13] course, encrypting and decrypting 2.5GB for the transfer will slow things down a bit too
[16:13] erm.. i wish there were a faster way
[16:13] oh - yeah I didn't think about using rsync without an ssh tunnel
[16:14] breaking up the file or something and grabbing it with wget
[16:14] that would work
[16:15] zsync does essentially that
[16:15] wow.... it only takes me around 40 minutes to download an ubuntu release....
[16:15] if that
[16:15] bittorrent, ftw
[16:15] wget....
[16:15] greiser: yeah - it takes about 25 minutes at work
[16:16] but we have 100MB symmetric
[16:16] i have rr turbo at home
[16:16] something like 10mb down, 1mb up.. or something..
[16:16] yeah, I like to download from the USF mirrors while on the USF network at work
[16:16] haha.. yea we dont even have rr turbo at work
[16:16] its just rr
[16:16] lmao
[16:17] so like 2mb down, 512kb up?
[16:17] uhm.. heres my speed test
[16:18] something doesnt seem right
[16:18] lol
[16:18] http://screencast.com/t/QdVDhcK9He
[16:20] is there a cli to compress an entire directory but split into multiple files under a certain size automatically?
[16:20] you mean like rar?
[16:20] yea i guess
[16:21] rar/unrar are in the ubuntu repos
[16:21] there's also p7zip-rar
[16:21] i'll have to see if its available on red hat, cause old server is running that
[16:21] im sure it is
[16:22] gotta be a quicker way than this rsync or 13 hours for ftp just to download
[16:22] probably not
[16:22] either way you'll be transferring the same amount of data
[16:22] yea.. but it only took me like 10 minutes to direct transfer a 1 gig file from server to server
[16:23] its just this other one is over 2 gigs and i cant just wget it
[16:23] y?
[16:23] because wget and curl and http all give me a 403 error instantly when i try to go there
[16:24] so im guessing the apache is refusing to send a file that large?
[16:24] lol
[16:24] 403 is a forbidden error
[16:24] permissions and such
[16:24] yea.. the thing is i copied all the tars to same directory at same time
[16:25] all have same permissions.. all are identical
[16:25] only thing different is size
[16:25] others copied just fine with wget
[16:26] all run same backup script nightly with same user permissions, and i copied from backup spot to public http spot for me to quickly grab them to new server by just cp ./* /path/to/public/
[16:37] apache?
[16:37] apache doesn't support large files until version 2.2, so if you are running a previous version you're hitting apache's file size limit, which is why you're getting a 403....
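(Given that diagnosis, the "break up the file and grab it with wget" idea from earlier needs nothing exotic: on the old server, split -b 1900m backup.tgz part_ produces pieces safely under the 2GB limit, and cat part_* > backup.tgz reassembles them after download. The same idea as a Python sketch -- the filename and chunk size are assumptions:)

    # Split a large archive into pieces under Apache's pre-2.2 2GB limit.
    # Crude but fine for a one-off: each chunk is held in memory briefly,
    # unlike split(1), which streams.
    CHUNK = 1900 * 1024 * 1024              # 1900 MiB per piece
    with open('backup.tgz', 'rb') as src:   # placeholder filename
        n = 0
        while True:
            data = src.read(CHUNK)
            if not data:
                break
            with open('part_%03d' % n, 'wb') as dst:
                dst.write(data)
            n += 1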
[17:42] Oooh, the new auto-hiding dock in Natty looks nice.
[17:54] I just noticed that LibreOffice is the default in Natty as well. Cool.
[17:58] any mdadm/software raid ninjas around?
[18:51] zoopster: i had to uninstall and reinstall google voice in order for it to work with the latest cm7 .... ymmv, but just fyi
[18:54] dantalizing: I updated over lunch but gv seems to work ok