Ask Slashdot on Website Backup Options

A few days ago Ask Slashdot featured a question on how to handle local backups of files on remote web servers. I’ve been trying to figure out a good answer to that question for the past few days.

My problem is this: I’ve been using cPanel to do a daily full backup of my server. That was fine when the .tar files were 600 MB to 1 GB. But lately I’ve been dumping more and more of my life onto my web server, and the daily .tar file has ballooned to 12 GB and growing.

I’ve got 16 Mbps download speed on my cable Internet, so that file still takes a while, but if I set it up to download when I go to bed, it’s finished long before I wake up. Still, I’m sure at some point Charter is going to ring me up and wonder why I’m downloading hundreds of gigabytes per month. Plus, I can see the day when that download is 50 GB (FiOS, where are you?)
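For the curious, here’s the back-of-envelope math on why the overnight window still works, and why it won’t forever. This is a quick sketch that assumes the full 16 Mbps is actually usable, which real-world throughput rarely is:

```python
# Rough transfer-time estimate; assumes the link's full 16 Mbps is usable.
GBYTE = 1024 ** 3           # bytes per gigabyte
LINK_BPS = 16_000_000 / 8   # 16 Mbps expressed in bytes per second

for size_gb in (12, 50):    # today's backup, and the feared future one
    hours = size_gb * GBYTE / LINK_BPS / 3600
    print(f"{size_gb} GB: about {hours:.1f} hours")

# 12 GB: about 1.8 hours
# 50 GB: about 7.5 hours
```

So even the 50 GB scenario technically fits in a night’s sleep. The monthly total is the real problem: at 12 GB a night that’s already roughly 360 GB a month, which is exactly the kind of number that gets Charter’s attention.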

So what to do? The consensus in the Slashdot thread seems to be rsync. I guess I’ll talk to my web host about getting that up and running (there is an rsync install on the server, but I can’t make heads or tails of the rsync documentation on the web).
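For what it’s worth, the usual setup people describe is rsync over SSH, pulling from the server to a local directory: the first run copies everything, and after that only changed files come down. Here’s a minimal sketch of the idea; the user, host, and paths are placeholders for whatever your web host actually gives you, and it assumes the account has SSH access:

```python
#!/usr/bin/env python3
"""Nightly pull of a remote site via rsync over SSH.

REMOTE and LOCAL below are made-up placeholders; substitute your
own account details. After the first full copy, rsync transfers
only files that changed, so the nightly download shrinks from
"everything" to "what's new".
"""
import subprocess

REMOTE = "user@example-host.com:/home/user/"  # hypothetical account and path
LOCAL = "/backups/website/"                   # local destination directory

subprocess.run(
    [
        "rsync",
        "-az",        # -a: recurse, preserve permissions/times; -z: compress in transit
        "--delete",   # mirror deletions so the local copy matches the server
        "--partial",  # keep partially transferred files if the connection drops
        "-e", "ssh",  # run the transfer over SSH
        REMOTE,
        LOCAL,
    ],
    check=True,  # raise if rsync exits nonzero
)
```

Dropped into cron, something like that would replace the nightly tar download entirely. The one catch with --delete is that a file deleted (or mangled) on the server vanishes from the backup too, so a mirror like this is usually paired with some kind of snapshot or rotation scheme.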

Anyone else have to do local backups of very large remote file sets?