5

Last week I started making daily backups of my relatively small (~300 MB) site. The backups are zipped, so they take less space, but they will quickly add up, especially as my site grows.

I was thinking of unzipping the latest backup and, from now on, just using Mercurial to add/update files from newer backups.

Are there any drawbacks to this? Is there a better way to do it?

asked May 15, 2012 at 20:17
  • Why not just put the site itself in version control? Then back up the repo? Commented May 15, 2012 at 20:42

5 Answers

8

Git is your best choice, since it allows "cheap" commits, branches, etc. Each commit is a snapshot, but unchanged files are shared between commits (and packfiles delta-compress the rest), so the size overhead is minimal. For example, if you have

index.html
image.png
README.txt
Makefile
main.c

and you only change index.html, then that is the only new content stored for the commit, making git extremely space efficient. What's more, once git packs its objects, it stores just the one or two lines you changed as a delta; it could be a 500,000-line HTML file (whoa!) and the change would still be tiny and fast.
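
As a rough illustration (a minimal sketch; the repository path and file name are hypothetical), you can watch how little a packed repository grows after a one-line change:

cd /path/to/site               # hypothetical repository
git gc                         # pack loose objects with delta compression
git count-objects -vH          # note the size-pack value
echo '<!-- tweak -->' >> index.html
git commit -am "one-line change"
git gc
git count-objects -vH          # size-pack grows by only a tiny delta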

This does mean that the first push to whatever git server you're using might take a while. But everything after that should take almost no time at all.

And the beautiful thing is that everything can be done locally: you can make as many changes as you want to a local repository, commit as many times as you want, and then push all at once. And since repositories can live anywhere, you can push to just another folder (physical, in the virtual sense :P) on your hard drive, to a webserver, or even send changes through email!
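
For instance (a sketch; all paths are hypothetical), backing up to another drive is just a push to a bare repository:

git init --bare /mnt/backup/site.git       # bare repo on the backup drive
cd /path/to/site
git remote add backup /mnt/backup/site.git
git push backup master                     # later pushes transfer only new objects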

answered May 15, 2012 at 21:16
1

I use git for this purpose. Assuming Unix/Linux:

cd /path/to/site
git init
chmod 700 .git    # keep the repository private to your user (a directory needs the execute bit)
vi .gitignore     # add any tmp files you don't want tracked here
git add .
git commit -m "initial commit"
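
A starting .gitignore might look like this (hypothetical entries; adjust to whatever your site actually generates):

*.log
*.tmp
cache/
tmp/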

Then, add this to the nightly crontab:

0 3 * * * cd /path/to/project; git add . && git add -u && git commit -m "Daily Commit" 

You could also set up a push to a remote (like GitHub) for true off-site backups.
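
For example (a sketch; the remote URL is hypothetical), add the remote once and extend the cron entry to push after committing:

git remote add origin git@github.com:you/site-backup.git

0 3 * * * cd /path/to/project; git add . && git add -u && git commit -m "Daily Commit" && git push origin master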

answered May 15, 2012 at 21:11
  • Thanks! Also, what about the SQL database? Can I back that up automatically somehow? And what about the password? Commented May 15, 2012 at 21:22
0

IMO this is the best way; I'm doing it like this as well.
(just commit every change and push the whole thing to a private repository on Bitbucket)

It makes absolutely no sense to always copy the whole 300 MB if only a few files have changed.
Plus, you get a changelog "for free" (which you don't have if you just make a ZIP file every day).

answered May 15, 2012 at 20:21
0

Definitely the way to go here. One additional plus -- you can easily revert changes in many cases if need be. Saved me from more than a few bad updates...

answered May 15, 2012 at 20:35
0

Git is a great way to go, as others mentioned. For the database (like MySQL), you should be able to create a cron job that runs a script to back up the database and rotate the log files.

To dump the database and rotate the dumps daily, I do something like:

/usr/sbin/logrotate -s status-file logrotate.conf

mysqldump --tables --add-drop-table --user=my-db-username --password='my-password' --host=dbase-server my-database > /the/backup/dir/my-database.sql

My logrotate.conf says to do daily logs, compress them, and rotate every 14 days. I wildcard it to apply to all *.sql files.
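
That config might look something like this (a sketch based on the description above; the backup path is hypothetical):

/the/backup/dir/*.sql {
    daily
    rotate 14
    compress
    missingok    # optional: skip silently if no dump exists yet
}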

answered May 16, 2012 at 23:39
