[08:31] <head_victim> A dry-run rsync to a fresh location gives me a total size of 463GB, but an rsync to the accumulated destination gives me 850GB. I'm using rsync -vtlrh source destination - any suggestions?
[08:31] <head_victim> Oh and I've thrown a --delete in there as well to no avail.
[09:21] <head_victim> For the record --ignore-errors did the trick.
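(For reference, the working invocation looked roughly like the following; the paths are placeholders, not the actual ones used. --ignore-errors tells rsync to proceed with --delete even when I/O errors occur, which is what was blocking the deletions.)

    # Preview what would be copied and deleted first (-n = dry run)
    rsync -vtlrhn --delete --ignore-errors /path/to/source/ /path/to/destination/
    # Real pass: -v verbose, -t preserve mtimes, -l keep symlinks as symlinks,
    # -r recursive, -h human-readable sizes
    rsync -vtlrh --delete --ignore-errors /path/to/source/ /path/to/destination/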
[09:24] <sagaci> head_victim: how long did it take to sync
[09:25] <head_victim> It took bugger all (came back to the PC within 30 mins) and the extra 400GB was gone
[09:26] <head_victim> The 850GB collection built up over months.
[09:27] <sagaci> ah ok
[09:28] <head_victim> I've noticed aarnet has slowed to about 2.2MB/s these days though.
[09:28] <head_victim> The first few seconds ramp up to over 4MB/s, but within 5-10 seconds it's back to 2.2-2.4MB/s.
[09:34] <sagaci> yeah, I'm down to 1.5MB/s
[09:43] <head_victim> The speed is there because I haven't noticed any drop-off anywhere else; it's probably being throttled somewhere, I'd say.
[09:44] <sagaci> bigpond, i'd say
[09:48] <head_victim> Or aarnet suddenly having a massive load of Telstra clients.
[09:50] <sagaci> yeah, maybe
[11:17] <head_victim> I wonder how bad "a few bad sectors" is on a SMART test for a 2TB drive.
[11:29] <gorilla> head_victim: I'd be backing up the drive and replacing it. But that's more my level of paranoia.
[11:30] <head_victim> Yeah, I'm working on backing it up now. 
[11:31] <head_victim> It's showing a bad sector count of 79 on a 2TB drive.
[11:31] <head_victim> It's only been powered on for 1.2 years.
[11:32] <head_victim> gorilla: would a fsck help at all?
[11:32] <head_victim> Just researching the topic on Google and working out whether it's worth using the drive at all
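(A sketch of how those counters can be read with smartctl from smartmontools; /dev/sdb stands in for the actual device. Note that fsck checks filesystem consistency rather than the physical media, though on ext filesystems e2fsck -c will run badblocks and map bad blocks out of use.)

    # Dump all SMART attributes; Reallocated_Sector_Ct (ID 5) and
    # Current_Pending_Sector (ID 197) are the ones to watch here
    sudo smartctl -a /dev/sdb
    # Kick off a long surface self-test, then check the results later
    sudo smartctl -t long /dev/sdb
    sudo smartctl -l selftest /dev/sdb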
[11:54] <sagaci> head_victim, is it your ubuntu archive drive?
[11:54] <head_victim> sagaci: that's one of the things on it. No real problem with that though, I can get that again any time I need it.
[11:54] <head_victim> It's also my primary network drive for documents, etc.
[11:55] <sagaci> I wouldn't share that kinda stuff on the same drive
[11:55] <head_victim> I do semi-regular backups so just doing a manual rsync of everything I can't replace elsewhere to put my mind at ease.
[11:55] <head_victim> It's only shared across the lan. No security threat.
[11:56] <head_victim> On a side note, I'm getting almost as much LinkedIn spam on mailing lists as I'm getting other spam on my email account :/
[11:59] <sagaci> what I mean is, for myself I'd have a separate drive for the ubuntu backup/archive and another one for personal stuff
[12:00] <head_victim> I was meant to, but I got sidetracked learning how to correctly build the server to run it all.
[12:00] <head_victim> So the personal data sits on raid 10 and the mirror on an external drive.
[12:01] <sagaci> ah ok
[12:01] <head_victim> And then anything not easily replaceable rsynced on a weekly cron to another network drive.
[12:02] <head_victim> And then I also burn some DVDs for semi-static data like photos, etc.
[12:02] <head_victim> I think that's a decent enough backup plan for a home network, just don't want to ever have to test it.
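(The weekly rsync step could be wired up with a crontab entry along these lines; the schedule, paths, and log file are illustrative only.)

    # m h dom mon dow  command -- run every Sunday at 03:00
    0 3 * * 0  rsync -a --delete /srv/irreplaceable/ /mnt/netdrive/backup/ >> /var/log/weekly-rsync.log 2>&1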
[12:03] <sagaci> I'm going to see if I can hack around on testdrive to include sync/launches of lubuntu daily isos
[12:03] <head_victim> I would but that's metered data :/
[12:04] <head_victim> Unless you can convince aarnet to mirror dailies as well ;)
[12:04] <sagaci> well it's 700MB straight up but then it just syncs the changes
[12:04] <sagaci> which is convenient
[12:04] <head_victim> Ah zsync the changes?
[12:06] <sagaci> yeah, zsync
[12:06] <sagaci> otherwise falls back to curl/wget
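(The zsync flow looks roughly like this; the URL follows the cdimage.ubuntu.com daily-live layout, and the filenames are illustrative. With -i pointing at the existing ISO, only the changed blocks come down the wire.)

    # Seed from the local copy (-i) and fetch only the blocks that changed;
    # without a seed this degrades to a full download
    zsync -i lubuntu-daily.iso \
        http://cdimage.ubuntu.com/lubuntu/daily-live/current/lubuntu-desktop-i386.iso.zsync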
[12:09] <head_victim> Nice work. I'm concentrating on getting the basics right. My previous approach was leaving gaps in my knowledge so now I'm sticking strictly to a book to work through.
[12:10] <head_victim> Trying to plug the holes so I can do things right instead of just the quickest way I could find on Google
[12:21] <sagaci> head_victim, is an email saying "Sunday week" too confusing to send out now?
[12:22] <head_victim> Nah, just give the time and date and a shortlist of topics to try to generate conversation on them