Ask Slashdot on Website Backup Options

A few days ago Ask Slashdot featured a question on how to handle local backup of files on remote web servers. I’ve been trying to figure out a good answer to that question for the past few days.

My problem is this: I’ve been using cPanel to do a daily full backup of my server. That was fine when the .tar files were 600MB-1GB. But lately I’ve been dumping more and more of my life onto my web server, and the daily .tar file has ballooned to 12GB and growing.

I’ve got 16Mbps download speed on my cable Internet, so that file still takes a while, but if I set it up to download when I go to bed, it’s finished long before I wake up. Still, I’m sure at some point Charter is going to ring me up and wonder why I’m downloading hundreds of gigabytes per month. Plus, I can see the day when that download is 50GB (FiOS, where are you?)

So what to do? It looks like the consensus in the Slashdot thread is to use rsync. I guess I’ll talk to my webhost about getting that up and running (there is an rsync install on the server, but I can’t make heads or tails of the rsync documentation on the web).
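
From what I can piece together, the one-liner everyone means is something along these lines (the hostname, username, and paths here are made up, and I’d want my webhost to confirm the details before relying on it):

    # pull the whole account down over SSH, only transferring files that have changed
    rsync -avz --delete -e ssh user@example.com:/home/user/ /backups/myserver/

    # roughly the same thing as a nightly cron entry on the home machine, kicked off at 3am
    0 3 * * * rsync -az --delete -e ssh user@example.com:/home/user/ /backups/myserver/ >> /var/log/site-backup.log 2>&1

The appeal over the cPanel .tar download is that after the first run rsync only moves what actually changed, so the nightly transfer should be a small fraction of 12GB instead of the whole thing.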

Anyone else have to do local backups of very large remote file sets?

One thought on “Ask Slashdot on Website Backup Options”

  1. I’ll resist commenting on your data-related OCD; however, rsync over SSH is what you need. There are a number of ways you can set it up, but ultimately it will just come down to a one-liner unix command with the right flags that you can run via cron or another scheduler.

    If you google “Rsync over SSH”, you’ll find lots of info.

    If you want to get a little more detailed, you might also consider using Git, the version control system, to manage the files, which also gives you very smart storage of modifications and versioning that you wouldn’t get with rsync (there’s a rough sketch of that setup in the P.S. below).

    greg.
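
    P.S. In case it helps, a bare-bones version of the Git idea looks roughly like this (the paths and hostname are just placeholders):

        # on the server, one-time setup in the web root
        cd ~/public_html
        git init
        git add -A
        git commit -m "initial snapshot"

        # server-side cron entry: commit whatever changed each night
        0 2 * * * cd ~/public_html && git add -A && git commit -m "nightly snapshot" >/dev/null 2>&1

        # at home: clone once over SSH, then a nightly pull brings down just the changes
        git clone ssh://user@example.com/home/user/public_html myserver-backup
        cd myserver-backup && git pull

    Every commit is a full snapshot you can roll back to, so you get history on top of a plain copy.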
