dongzhimin2231 2014-01-29 16:32

Bash/shell script to automatically back up a website

I'm brand new to shell scripting and have been searching for examples of how to create a backup script for my website, but I'm unable to find anything, or at least anything I understand.

I have a Synology DiskStation server that I'd like to use to automatically (through its scheduler) take backups of my website.

I'm currently doing this via Automator on my Mac in conjunction with the Transmit FTP program, but making this a command-line process is where I struggle.

This is what I'm looking to do in a script:

1) Open a URL without a browser (this URL creates a MySQL dump of the databases on the server to be downloaded later). An example URL would be http://mywebsite.com/dump.php

2) Use FTP to download all files from the server. (Currently Transmit FTP handles this as a sync function and only downloads files where the remote file date is newer than the local file. It also removes any local files that don't exist on the remote server.)

3) Create a compressed archive of the files from step 2, named website_CURRENT-DATE

4) Move the archive from step 3 to a specific folder and delete any file in that folder that's older than 120 days.

Right now I don't know how to do step 1 or the synchronization in step 2 (I see how I can use wget to download the whole site, but it seems as though that would download everything each time it runs, even if it hasn't changed).

Steps 3 and 4 are probably easy to find by searching, but I haven't looked into them yet since I can't get past step 1.

Thanks!

Also, FYI, my web host doesn't do these types of backups, which is why I'd like to do my own.


1 answer

  • doucitan2544 2014-01-29 20:32

    Answering each of your questions in order (a sketch tying the pieces together into one script follows the list):

    1. Several options, the most common of which would be one of wget http://mywebsite.com/dump.php or curl http://mywebsite.com/dump.php.

    2. Since you have SSH access to the server, you can very easily use rsync to grab a snapshot of the files on disk with e.g. rsync -essh --delete --stats -zav username@mywebsite.com:/path/to/files/ /path/to/local/backup.

    3. Once you have the snapshot from rsync, you can make a compressed, dated copy with cd /path/to/local/backup; tar czvf /path/to/archives/website-$(date +%Y-%m-%d).tgz *

    4. find /path/to/archives -mtime +120 -type f -exec rm -f '{}' \; will remove all backups older than 120 days.
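
    If it helps, here is a rough sketch of how those four pieces could go into a single script run by the DiskStation's scheduler. The URL, paths, and username are placeholders from this thread that you would adjust, and it assumes rsync plus an SSH login to the host are available, as in step 2:

        #!/bin/sh
        # --- placeholders; adjust to your own setup ---
        DUMP_URL=http://mywebsite.com/dump.php
        REMOTE=username@mywebsite.com:/path/to/files/
        SNAPSHOT=/path/to/local/backup
        ARCHIVES=/path/to/archives

        # 1) hit the dump URL so the server writes a fresh MySQL dump
        wget -q -O /dev/null "$DUMP_URL"

        # 2) mirror the remote files; --delete drops local files removed on the server
        rsync -e ssh --delete --stats -zav "$REMOTE" "$SNAPSHOT"

        # 3) compressed archive named with today's date
        cd "$SNAPSHOT" || exit 1
        tar czf "$ARCHIVES/website-$(date +%Y-%m-%d).tgz" .

        # 4) prune archives older than 120 days
        find "$ARCHIVES" -type f -mtime +120 -exec rm -f '{}' \;

    Saved somewhere on the DiskStation and pointed to from its Task Scheduler, that covers all four steps in one run; the --stats output from rsync also gives you a short log of what changed each time.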

    This answer was accepted by the asker.

