
wget - one simple/"cheap" way to back up your web site

Okay, so I have this site and want a simple way to back it up, since my web host provider doesn't back up sites on the cheap hosting plan :-)

I am sure there are tons of solutions for this kind of backup, but this one was at the top of my head (although I already have a backup of sorts, since I create the files on my machine and upload them to the live site).

Anyway, I was curious how long the backup takes, hence the use of the time command.

Note: I prefer /usr/bin/time over the built-in shell command, since its output is somewhat more convenient, as in the example below.
# /usr/bin/time sleep 15
real       15.0
user        0.0
sys         0.0

# time sleep 15
0.00u 0.01s 0:15.04 0.0%
So here you go, a "cheap" backup :) (example.com below stands in for your own site's URL)
/usr/bin/time -p wget --mirror -t 10 -o etcfstab_backup.log http://example.com

-t = number of retries, here 10 (to remove the retry limit, use -t inf or -t 0; the default is 20 retries)
-o = log file 
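Once a run finishes, the log file is the quickest way to see whether anything failed: wget logs each fetch, and failed ones contain lines with "ERROR" (e.g. "ERROR 404: Not Found"). A small sketch; the sample log contents here are fabricated for illustration:

```shell
# Skim a wget log for failures. The log below is a fabricated
# stand-in for what a real mirror run would have written.
log=etcfstab_backup.log
printf 'saved index.html\nERROR 404: Not Found.\n' > "$log"

# wget marks failed fetches with "ERROR" lines.
if grep -q 'ERROR' "$log"; then
    echo "some fetches failed - check $log"
fi
```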
No need to type a destination directory; one is created automatically in the current directory. Note that wget only mirrors what it can reach by following links, so a subdirectory that isn't linked from the site has to be backed up manually, like (with example.com again standing in for your site's URL):
/usr/bin/time -p wget --mirror -t 10 -o etcfstab_backup.log http://example.com/kumon/
The subdirectory 'kumon' will also be created automatically under the destination directory.

Another thing: if you rename a file in the live server's directory and back the site up again, the new file name gets backed up, but the old one still stays in your backup, since wget never deletes local files. So the '--mirror' option is not really what it claims to be.

Say you want to see the size of your backup. Check the disk usage of the directory with the du command:
# du -s
-s = print only a total for the directory

Note: This shows the size in 512-byte blocks, so multiply what you see by 512 to get the size in bytes: 34028 x 512 = 17422336 bytes. Use -k to see the size in kilobytes:
# du -sk
Or use -h for human-readable output, i.e. in bytes, kilobytes or megabytes:
# du -sh
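As a sanity check on the 512-byte block arithmetic above, the shell can do the conversion itself (34028 is the block count from the du output, not something you need to type):

```shell
# Convert du's 512-byte block count to bytes and kilobytes
# using plain shell arithmetic.
blocks=34028
echo $((blocks * 512))         # bytes:     17422336
echo $((blocks * 512 / 1024))  # kilobytes: 17014
```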