[00:57] <greiser> got real quiet here...
[01:08] <mhall119> it happens
[13:23] <Lupine> g'morning all....are there any Python experts around, and if so are you familiar with paramiko?
[13:32] <mhall119> morning Lupine
[13:32] <Lupine> hey man, how's things been 
[13:32] <mhall119> good, what do you need help with?
[13:33] <Lupine> http://paste.ubuntu.com/556510/
[13:33] <Lupine> it looks like it should be so simple, it's driving me crazy
[13:35] <Lupine> I got this from examples found all over the place...and it's an exact duplicate of the examples 
[13:38] <mhall119> hmmmm
[13:38] <mhall119> I don't see anything wrong
[13:39] <mhall119> did you install from the repos?
[13:39] <Lupine> damn....then that means it must be the host I'm trying to connect to
[13:39] <Lupine> a Cisco router 
[13:39] <Lupine> yes...installed from the repos, and even tried older/newer versions...I bet ya it's the stupid host
[13:41] <mhall119> yeah, I just tried your script on my box and it ran fine
[13:41] <Lupine> grrrrr....stoopid me...it's the host, when I try the above with a normal Linux host, it works as expected.  
[13:41] <Lupine> thanks for the sanity check...guess I'll have to try something else for the Cisco router
[13:42] <mhall119> good luck
[13:42] <Lupine> thx
[15:33] <amouge> hey guys, anyone know a way around wget limitation of 2gb or whatever random number it is?
[15:33] <amouge> I have a server with a 2.4gb file i need to download that is tgz
[15:33] <amouge> i downloaded all the rest fine
[15:33] <amouge> but this one through http or wget gives a 403 error
[15:34] <mianosm> wget has a file size limit?
[15:35] <mianosm> Does Curl also have a file size limit?
[15:35] <amouge> I dont think its wget or curls problem
[15:35] <mianosm> Can you ftp the file instead using get?
[15:35] <amouge> i think its on my source server thats having a problem
[15:36] <amouge> well im trying to do a direct transfer
[15:36] <amouge> because i tried doing it through my work computer and its saying its gonna take 18 hours to download
[15:36] <amouge> where as a direct transfer would take about 10 minutes
[15:36] <amouge> so i dont mind wasting a bit of time to figure it out..
[15:40] <mianosm> wget uses a pointer to a LONG to store where it's up to in a file.
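mianosm's point in numbers: a signed 32-bit long tops out just under 2 GiB, which matches the "2gb or whatever random number" limit amouge mentioned. A quick check:

```shell
# Largest byte offset a signed 32-bit long can hold: 2^31 - 1.
# Builds of wget without large-file support stall at this boundary.
echo $((2**31 - 1))      # 2147483647, i.e. just under 2.0 GiB
```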
[15:41] <mhall119> 403 == forbidden
[15:41] <mhall119> it's returned by the webserver
[15:41] <amouge> yea
[15:41] <mianosm> change your --user-agent?
[15:42] <mianosm> --user-agent=FireFox/Mozilla or something like that...the server may be refusing any requests from *wget*
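mianosm's workaround as a command line, with a hypothetical URL. Some servers return 403 for any User-Agent matching *wget*, so sending a browser-like string sidesteps that one rule:

```shell
# Hypothetical URL and UA string; -c resumes a partial download.
url="http://example.com/backup.tgz"          # placeholder
ua="Mozilla/5.0 (X11; Linux x86_64)"         # any browser-ish string works
echo wget -c --user-agent="$ua" "$url"       # echo prints the command; drop it to actually run
```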
[15:42] <amouge> i tried directly from firefox and got the same
[15:43] <mianosm> time to edit the webserver config then...
[15:43] <mhall119> maybe the webserver is timing out the connection
[15:44] <mhall119> gah, the in-laws windows computer won't boot
[15:44] <mhall119> I'll be making a house call this evening
[15:45] <mhall119> windows is such a pain to fix
[15:45] <amouge> can you use rsync to pull files from remote server?
[15:45] <mianosm> yes
[15:45] <amouge> i know you can push
[15:45] <mianosm> its the same as rsh
[15:45] <amouge> but the server is locked down and wont let me push to it lol
[15:46] <amouge> I hate taking over crap from previous IT people
[15:46] <amouge> weirdest thing.. I tried even ssh'ing from another server and it wont let me in with same login credentials im using to login from work
[15:50] <amouge> yay think its working!
[15:50] <mhall119> zsync, ftw
[15:50] <amouge> rsync -avz -e ssh credentials@mysite.com:/path/to/file /new/path/to/file
[15:50] <mhall119> if you're going to have to keep pulling updates to that file, you want zsync
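A sketch of mhall119's zsync suggestion, with hypothetical host and paths. zsync does rsync-style block transfers over plain HTTP, so the server only has to host two static files:

```shell
# Server side: run next to the file; writes backup.tgz.zsync.
server_cmd="zsyncmake backup.tgz"
# Client side: fetches backup.tgz, or updates an existing local copy.
client_cmd="zsync http://example.com/backup.tgz.zsync"
printf '%s\n%s\n' "$server_cmd" "$client_cmd"
```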
[15:51] <amouge> looks like its gonna work.. we will see
[15:51] <amouge> mhall119: nope server migration
[15:51] <mhall119> ok
[15:51] <amouge> company has been paying for 2 dedicated servers for 6 months because previous IT talked it up and talked it up and then never actually migrated
[15:51] <amouge> and they are pissed lol
[15:51] <mhall119> nice
[15:51] <amouge> yea to the tune of 700/month for the new server
[15:53] <amouge> dual quad core, 8gb mem, 6 redundant raid 73gb hd's, 10 ips and 2tb bandwidth
[15:53] <amouge> fully managed
[15:53] <mhall119> where?
[15:53] <amouge> plus hardware firewall and managed backup
[15:53] <amouge> rackspace
[15:55] <amouge> beats the pants off my dual single core 1gb mem single 36gb scsi server i have in my office lmfao
[15:55] <mhall119> on their cloud or no?
[15:55] <amouge> uhm.. its a dedicated server.. dunno lol
[15:55] <amouge> i just have the list of equipment
[15:55] <mhall119> ok
[15:55] <amouge> i didnt set it up
[15:56] <mhall119> I'm looking around at hosting options for my brother
[15:56] <amouge> im just left with crossing my fingers and praying for minimal downtime when i do the transfer
[15:56] <mhall119> getting prices, etc
[15:56] <amouge> well I love rackspace cloud
[15:56] <amouge> for vps
[15:56] <mhall119> how's their support?
[15:56] <amouge> I have like 6 clients i have on there
[15:56] <mhall119> have you ever needed it?
[15:56] <amouge> support is great. 24/7
[15:56] <amouge> I needed it when i was up at 2 am and did an oopsie
[15:56] <amouge> but never for down time or anything
[15:56] <mhall119> cool, that's hopefully all he'll need it for too
[15:56] <amouge> and cheap
[15:57] <mhall119> do you host any Windows systems with them?
[15:57] <amouge> nah i hate windows servers
[15:57] <amouge> every client ive ever had with a windows server permissions were nightmarish
[15:57] <amouge> how big of a server does he need?
[15:58] <amouge> I like the cloud servers cause you can get them starting at 10.95 a month + bandwidth
[15:58] <amouge> and their bandwidth charges are fairly cheap
[15:58] <amouge> or, if you prefer to go ahead and buy bandwidth at the same time.. i dunno if you know that slicehost is rackspace
[15:58] <mhall119> not too big, I think
[15:59] <amouge> slicehost 256mb slice with 15gb bw is 20 a month
[15:59] <mhall119> a dozen or so ASP.NET websites
[15:59] <mhall119> bigger than that
[15:59] <amouge> hmm... so like a 512 or 768
[15:59] <amouge> i dunno how resource hungry asp.net is
[15:59] <mhall119> I would say closer to 2GB
[15:59] <mhall119> I don't know much about asp.net either
[15:59] <amouge> ah ok.. lol im no asp.net/windows fan
[16:00] <amouge> well 2gb with 1200gb transfer is 130 from slicehost
[16:00] <mhall119> me either, but he'll be hosting asp websites, so there's not much choice
[16:01] <amouge> hmm. slicehost doesnt offer windows
[16:01] <amouge> so it'd be around 116/month for a 2gb rackspace vps + bandwidth
[16:02] <mhall119> yeah, I have no idea what kind of bandwidth he'll need
[16:02] <amouge> bandwidth is fairly cheap
[16:02] <amouge> and i find people over estimate their bandwidth a lot
[16:03] <tiemonster> better than underestimating it
[16:03] <amouge> had a large client with thousands of pictures and about 500 hits a day, art gallery.. only used like 30gb of bw a month
[16:03] <amouge> hmm maybe they had more than that for hits
[16:04] <amouge> either way... bandwidth is only 18 cents a gig out and 8 cents a gig in
[16:04] <amouge> hmm me hopes rsync hasnt frozen
[16:05] <amouge> it doesnt give any indications..
[16:05] <mhall119> wow, amazon's prices for windows servers are expensive
[16:05] <amouge> this is all it says...
[16:05] <tiemonster> What's the use of a windows server?
[16:05] <mhall119> tiemonster: hosting asp.net websites
[16:05] <tiemonster> why would you want to do that?
[16:06] <mhall119> paying clients with asp.net websites who need hosting
[16:06] <amouge> http://screencast.com/t/KjcNnGfdM
[16:06] <tiemonster> must be desperate
[16:07] <amouge> thats how its been sitting since i said yay! it worked. lmao
[16:08] <amouge> oh wait.. maybe i underestimated the size
[16:08] <tiemonster> oops
[16:09] <amouge> 2733971998
[16:09] <amouge> how big is that in gigs? lol
[16:09] <greiser> 27
[16:09] <tiemonster> is that bytes?
[16:10] <amouge> yes
[16:10] <amouge> thats what ls -la gives me
[16:10] <mhall119> 2.5
[16:10] <amouge> so i assume its bytes
[16:10] <mhall119> not 27
[16:10] <tiemonster> 2.5
[16:10] <amouge> thats what i thought.. rsynce shouldnt take that long!
[16:10] <amouge> rsync*
[16:10] <tiemonster> unless it's the first run
[16:10] <amouge> it is
[16:10] <amouge> just one file
[16:10] <tiemonster> just
[16:10] <greiser> i meant 2.7... my period key didnt fire
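Both figures are defensible depending on the divisor: dividing by 10^9 gives greiser's 2.7, dividing by 2^30 gives the 2.5 that mhall119 and tiemonster quoted. A quick check:

```shell
# 2733971998 bytes from ls -l: /1e9 is decimal gigabytes, /2^30 is the
# GiB figure most tools report.
bytes=2733971998
awk -v b="$bytes" 'BEGIN { printf "%.2f GB  (decimal, /1e9)\n", b/1e9
                           printf "%.2f GiB (binary, /2^30)\n", b/2^30 }'
```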
[16:10] <mhall119> uh huh
[16:10] <amouge> lolol
[16:10] <tiemonster> I would give it a few hours
[16:11] <amouge> damn.. well i guess its better than the 18 it told me it was gonna take through sftp
[16:11] <amouge> just to download to my local computer
[16:11] <tiemonster> subsequent runs will go much faster
[16:11] <tiemonster> because it will calculate deltas
[16:11] <greiser> leave me alone... i'm on a windows machine with a keyboard whose keys are raised 2 inches
[16:11] <amouge> lol dont need subsequent runs
[16:11] <greiser> you have to literally punch the keys
[16:11] <tiemonster> which is defeated by using tar...
[16:11] <amouge> i just need to get this backup file of one of our websites out to new server
[16:12] <tiemonster> one time only?
[16:12] <amouge> yes
[16:12] <tiemonster> scp would be faster
[16:12] <mhall119> tar isn't the problem, gzip is
[16:12] <tiemonster> mhall119: yeah - that's what I meant
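The gzip problem tiemonster and mhall119 are pointing at has a stock workaround on many distros, hedged because not every gzip build carries it: a small change early in the input normally ripples through the whole compressed stream, so rsync sees the entire .tgz as different. Builds with the `--rsyncable` patch (Debian/Ubuntu's, for one) resynchronize the output so deltas survive:

```shell
# Hypothetical path; check `gzip --help | grep rsyncable` first, since
# not every build has the option.
cmd="tar cf - /path/to/site | gzip --rsyncable > backup.tgz"
echo "$cmd"
```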
[16:12] <amouge> i tried using wscp... it said 13 hours
[16:12] <amouge> to download to local computer
[16:12] <amouge> then who knows how long to upload it back
[16:12] <tiemonster> probably accurate
[16:13] <amouge> winSCP i mean
[16:13] <amouge> lol
[16:13] <mhall119> if you've got a slow connection, that would be right
[16:13] <tiemonster> it usually takes 6-8 hours for me to download Ubuntu releases at home
[16:13] <mhall119> course, encrypting and decrypting 2.5GB for the transfer will slow things down a bit too
[16:13] <amouge> erm.. i wish there were a faster way
[16:13] <tiemonster> oh - yeah I didn't think about using rsync without an ssh tunnel
[16:14] <amouge> breaking up the file or something and grabbing it with wget
[16:14] <mhall119> that would work
[16:15] <mhall119> zsync does essentially that
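amouge's split-and-fetch idea, sketched end to end on a throwaway local file; over HTTP you would wget each `backup.tgz.part-*` and cat them back in the same order:

```shell
# Self-contained demo in a temp dir: carve the file into pieces smaller
# than whatever the server will serve, reassemble, and verify bytes match.
tmp=$(mktemp -d) && cd "$tmp"
head -c 1048576 /dev/urandom > backup.tgz       # 1 MiB stand-in for the real archive
split -b 262144 backup.tgz backup.tgz.part-     # 256 KiB chunks: part-aa, part-ab, ...
cat backup.tgz.part-* > rebuilt.tgz             # the glob sorts the parts back into order
cmp backup.tgz rebuilt.tgz && echo "files match"
```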
[16:15] <greiser> wow.... it only takes me around 40 minutes to download an ubuntu release....
[16:15] <greiser> if that
[16:15] <mhall119> bittorrent, ftw
[16:15] <greiser> wget....
[16:15] <tiemonster> greiser: yeah - it takes about 25 minutes at work
[16:16] <tiemonster> but we have 100MB symmetric
[16:16] <greiser> i have rr turbo at home
[16:16] <greiser> something like 10mb down, 1mb up.. or something..
[16:16] <mhall119> yeah, I like to download from the USF mirrors while on the USF network at work
[16:16] <amouge> haha.. yea we dont even have rr turbo at work
[16:16] <amouge> its just rr
[16:16] <amouge> lmao
[16:17] <greiser> so like 2mb down, 512kb up?
[16:17] <amouge> uhm.. heres my speed test
[16:18] <amouge> something doesnt seem right
[16:18] <amouge> lol
[16:18] <amouge> http://screencast.com/t/QdVDhcK9He
[16:20] <amouge> is there a cli to compress an entire directory but split in to multiple files under a certain size automatically?
[16:20] <mhall119> you mean like rar?
[16:20] <amouge> yea i guess
[16:21] <mhall119> rar/unrar are in the ubuntu repos
[16:21] <mhall119> there's also p7zip-rar
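If rar/p7zip turn out not to be on the old Red Hat box, coreutils alone can produce split volumes (the rar equivalent would be roughly `rar a -v500m archive.rar dir/`; flag from memory, so verify against `rar` help). Demoed here on a throwaway directory:

```shell
# Coreutils-only split archive: stream tar+gzip through split to get
# fixed-size volumes, then reassemble and list to prove it's intact.
tmp=$(mktemp -d) && cd "$tmp"
mkdir site && head -c 524288 /dev/urandom > site/data.bin
tar czf - site | split -b 131072 - site.tgz.    # 128 KiB volumes: site.tgz.aa, .ab, ...
cat site.tgz.* | tar tzf -                      # reassemble and list the archive contents
```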
[16:21] <amouge> i'll have to see if its available on red hat, cause old server is running that
[16:21] <amouge> im sure it is
[16:22] <amouge> gotta be a quicker way than this rsync or 13 hours for ftp just to download
[16:22] <mhall119> probably not
[16:22] <mhall119> either way you'll be transferring the same amount of data
[16:22] <amouge> yea.. but it only took me like 10 minutes to direct transfer a 1 gig file from server to server
[16:23] <amouge> its just this other one is over 2 gigs and i cant just wget it
[16:23] <greiser> y?
[16:23] <amouge> because wget and curl and http all give me a 403 error instantly when i try to go there
[16:24] <amouge> so im guessing the apache is refusing to send a file that large?
[16:24] <greiser> lol
[16:24] <greiser> 403 is a forbidden error
[16:24] <greiser> permissions and such
[16:24] <amouge> yea.. the thing is i copied all the tars to same directory at same time
[16:25] <amouge> all have same permissions.. all are identical
[16:25] <amouge> only thing different is size
[16:25] <amouge> others copied just fine with wget
[16:26] <amouge> all run same backup script nightly with same user permissions, and i copied from backup spot to public http spot for me to quickly grab them to new server by just cp ./* /path/to/public/
[16:37] <greiser> apache?
[16:37] <greiser> apache doesn't support large files (over 2GB) until version 2.2, so if you're running a previous version you're hitting apache's file size limit, which is why you're getting a 403....
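An easy way to check greiser's theory without touching the server config (hypothetical URL; assumes the version banner isn't hidden by `ServerTokens`):

```shell
# A HEAD request is enough to read Apache's version from the Server
# header, and costs no bandwidth.
cmd='curl -sI http://example.com/backup.tgz | grep -i "^Server:"'
echo "$cmd"
```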
[17:42] <maxolasersquad> Oooh, the new auto-hiding dock in Natty looks nice.
[17:54] <maxolasersquad> I just noticed the LibreOffice is the default in Natty as well.  Cool.
[17:58] <dantalizing> any mdadm/software raid ninjas around?
[18:51] <dantalizing> zoopster: i had to uninstall and reinstall google voice in order for it to work with the latest cm7 .... ymmv, but just fyi
[18:54] <zoopster> dantalizing: I updated over lunch but gv seems to work ok