Lightning Fast Linux Web Server Backups!

With all the time I spend working on web servers, I've found ways to back up files and MySQL databases quickly, right on the server, so I don't have to download files across the Internet or FTP stuff back and forth.

So here is my BASH script...

#!/bin/bash
cd ~/folder/forum/
tar -zcvf forum-backup-$(date '+%m-%d-%Y').tar.gz ~/folder/forum/
cd ~/folder/articles/
tar -zcvf articles-backup-$(date '+%m-%d-%Y').tar.gz ~/folder/articles/
This little but effective script creates a tarball (similar to a ZIP file in Windows), and the file name has a date stamp in it, so if there are several backups made they are all dated.


Just as a fun fact, today's backup (06/08/2012) is roughly 1.3 GB for the forum alone! :stuned::wow:

Then it switches directories and backs up the article files in the same manner.


But it lets me back up the two big chunks of the site in under 10 minutes. :wink:
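If you want to double-check one of those tarballs before trusting it, you can list its contents without extracting anything. Here's a minimal, self-contained sketch — the /tmp paths and sample directory are throwaway examples, not the real site layout:

```shell
#!/bin/bash
# Build a small sample directory to stand in for the real site files.
mkdir -p /tmp/demo-forum
echo "hello" > /tmp/demo-forum/index.html

# Create a dated tarball the same way as above; -C keeps paths relative.
backup="/tmp/forum-backup-$(date '+%m-%d-%Y').tar.gz"
tar -zcf "$backup" -C /tmp demo-forum

# List the archive's contents without extracting -- a quick sanity check.
tar -ztf "$backup"
```

Swapping -t for -x (with -C pointing at a scratch directory) restores the files when you actually need them.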

Same with the MySQL databases... :whistle:

#!/bin/bash
cd /usr/bin/
/usr/bin/mysqldump -h forummysqladdress -u username -ppassword dbname | gzip > $HOME/folder/folder/folder/forum_$(date +"%Y-%m-%d").sql.gz
/usr/bin/mysqldump -h articlesmysqladdress -u username -ppassword dbname | gzip > $HOME/folder/folder/folder/articles_$(date +"%Y-%m-%d").sql.gz


No downloading required! :thumb1:
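And restoring from one of those dumps is just the pipe run in reverse. A minimal sketch — to keep it self-contained, a tiny fake dump is created first; in real use the file comes from the mysqldump lines above, and the host/user/database names are placeholders:

```shell
#!/bin/bash
# Stand-in for a real dump so this sketch runs on its own.
dump="/tmp/forum_$(date +"%Y-%m-%d").sql.gz"
echo "CREATE TABLE demo (id INT);" | gzip > "$dump"

# Always test the gzip archive before trusting a restore to it.
gzip -t "$dump" && echo "archive OK"

# Actual restore (commented out -- needs a live MySQL server):
# gunzip < "$dump" | mysql -h forummysqladdress -u username -ppassword dbname
```

The gzip -t check costs a second or two and catches truncated or corrupted dumps before they can make a bad restore worse.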
