I'm brand new to shell scripting and have been searching for examples of how to create a backup script for my website, but I haven't been able to find anything, or at least anything I understand.
I have a Synology Diskstation server that I'd like to use to automatically (through its scheduler) take backups of my website.
I'm currently doing this via Automator on my Mac in conjunction with the Transmit FTP program, but turning it into a command-line process is where I'm struggling.
This is what I'm looking to do in a script (my rough guesses at each step are below the list):
1) Open a URL without a browser (this URL creates a MySQL dump of the databases on the server so it can be downloaded later). An example URL would be http://mywebsite.com/dump.php
2) Use FTP to download all files from the server. (Currently Transmit FTP handles this as a sync function and only downloads files where the remote file date is newer than the local file. It will also remove any local files that don't exist on the remote server.)
3) Create a compressed archive of the files from step 2, named website_CURRENT-DATE
4) Move the archive from step 3 to a specific folder and delete any file in that folder that's older than 120 days.
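From the examples I've managed to find, here is my rough guess at step 1. All four snippets below would go in the same script, and I'm assuming the Diskstation has curl (or wget) available, which I haven't checked:

```sh
#!/bin/sh
# Step 1: open the dump URL so the server generates the MySQL dump.
# -s keeps curl quiet and -o /dev/null throws away the page output,
# since all I need is for dump.php to run.
curl -s -o /dev/null "http://mywebsite.com/dump.php"

# If only wget is available, I think this would be the equivalent:
# wget -q -O /dev/null "http://mywebsite.com/dump.php"
```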
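For step 2, lftp's mirror command looks like it might do the same kind of sync Transmit does (only download newer files, and delete local files that are gone from the server), but I don't know whether lftp is installed on a Diskstation by default or has to be added, and the username, password, and paths are placeholders:

```sh
# Step 2: mirror the remote site into a local folder over FTP.
# --only-newer skips files whose remote date isn't newer than the
# local copy; --delete removes local files no longer on the server.
lftp -u myuser,mypassword ftp://mywebsite.com <<'EOF'
mirror --only-newer --delete /public_html /volume1/backup/website_files
bye
EOF
```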
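For step 3, I think tar plus the date command gives me the website_CURRENT-DATE name; the folder layout is just what I imagine it would be on the Diskstation:

```sh
# Step 3: compress the downloaded files into website_YYYY-MM-DD.tar.gz.
# -C makes the paths inside the archive relative to /volume1/backup.
TODAY=$(date +%Y-%m-%d)
tar -czf "/volume1/backup/website_${TODAY}.tar.gz" -C /volume1/backup website_files
```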
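And for step 4, find with -mtime looks like it could handle the 120-day cleanup. The archive folder is another placeholder, and this reuses the TODAY variable from the step 3 snippet:

```sh
# Step 4: move the new archive to the long-term folder, then delete
# any archive there that was last modified more than 120 days ago.
mv "/volume1/backup/website_${TODAY}.tar.gz" /volume1/backup/archives/
find /volume1/backup/archives/ -name 'website_*.tar.gz' -mtime +120 -exec rm -f {} \;
```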
Right now steps 1 and 2 are where I'm most lost. I don't really understand the snippets I've pasted for those, and the only other approach I found was using wget to download the whole site, but that seems as though it will download everything each time it runs, even if nothing has changed.
Steps 3 and 4 are probably the easy part, but I can't really test any of this until I get past step 1.
Thanks!
Also, FYI, my web host doesn't do these types of backups, which is why I'd like to do my own.