【Title】: Copy two most recent files to another directory using bash script
【Posted】: 2015-05-27 02:30:59
【Description】:

I'm trying to put together a bash script that makes a daily backup of a MySQL database and a web directory. It should tar each of them, then copy the two most recent .tar.gz files to a weekly directory on day 0 of the week, to a monthly directory on day 1 of the month, and to a yearly directory on day 1 of the year.

I'm having trouble getting the "copy the two most recent files" part to work.

What I have so far (using the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base):

#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share

# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1

# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'

# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')

#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | head -2)

# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1

# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
        mkdir "$DIR/tmp"
        echo 'Created tmp directory...'
fi

# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
        mkdir "$DIR/tmp/daily"
        echo 'Created daily directory...'
fi

# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
        mkdir "$DIR/tmp/weekly"
        echo 'Created weekly directory...'
fi

# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
        mkdir "$DIR/tmp/${YEAR}"
        echo 'Directory for current year created...'
fi

# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
        mkdir "$DIR/tmp/${YEAR}/$MONTH"
        echo 'Directory for current month created...'
fi

# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf  $DIR/tmp/database.sql
echo 'Made daily backup...'

# Check whether it's Sunday (0); temporarily set to today's value (2) for testing.
if [ $DOW -eq 2 ] ; then
        cp $DAILY $DIR/tmp/weekly/
        echo 'Made weekly backup...'
fi

# Check whether it's the first day of the year; temporarily set to today's value (146) for testing. If so, copy the two most recent daily backups to the $YEAR folder.
if [ $DAY_OF_YEAR -eq 146 ] ; then
        cp $DAILY $DIR/tmp/${YEAR}/
        echo 'Made annual backup...'
fi

# Check if it's the first day of the month; temporarily set to today's value (26) for testing. If so, copy the latest daily backups to the monthly folder.
if [ $DAY_OF_MONTH -eq 26 ] ; then
        cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
        echo 'Made monthly backup...'
fi

# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo "Backup complete. Log can be found under $DIR/logs/."

I've commented out some sections for now while I try to get this working, and I've set the day/month/year checks to today's values so I can see the files being copied. I've also kept my earlier commented-out attempts at the $DAILY variable.

The problem I'm having is that when the script executes, it returns the following:

./backup-rotation-script.sh                            
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...                                                      
Backup complete. Log can be found under /path/to/backups/logs/.  

But when I check /path/to/backups/tmp/daily/, the files are clearly there; the errors even return the file names.

As far as I can tell, this is because $DAILY (find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | head -2) returns the two results on one line? I'm assuming the easiest way to get it working is probably a for loop that copies both results to the weekly/monthly/yearly directories?

I tried adding variations of:

for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
   cp $file /path/to/backups/tmp/weekly/
done

But no joy. :S
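The loop above fails because `ls -1t` prints bare file names relative to the daily directory, while `cp` resolves them against the script's own working directory. A minimal sketch of a corrected version (the directory defaults are placeholders):

```shell
#!/bin/sh
# Sketch (directory defaults are placeholders): copy the two most
# recently modified files from $src to $dst. Prefixing "$src/" is the
# key fix, since `ls -1t` prints names relative to $src.
src=${1:-/path/to/backups/tmp/daily}
dst=${2:-/path/to/backups/tmp/weekly}
for file in $(ls -1t "$src" | head -n 2); do
    cp "$src/$file" "$dst/"
done
```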

Ideally I'd also like it to report when it fails, but I'm not that far along yet. :)

Any help would be much appreciated!

【Discussion】:

    Tags: linux bash backup


    【Solution 1】:

    Never mind! Figured it out.

    I dropped the 'daily' variable entirely and used the following instead:

    find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
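    The one-liner sorts by path name, which tracks recency only because the archive names begin with a date stamp. A sketch of a variant that sorts by modification time instead and survives whitespace in file names, assuming GNU findutils/coreutils (the directory defaults are placeholders):

```shell
#!/bin/sh
# Sketch (assumes GNU find/sort/head/sed/xargs; paths are placeholders):
# print "mtime path" pairs NUL-delimited, sort newest first, keep two,
# strip the timestamp prefix, and copy each surviving path.
src=${1:-/path/to/backups/intranet/daily}
dst=${2:-/path/to/backups/intranet/weekly}
find "$src" -type f -printf '%T@ %p\0' \
    | sort -z -rn \
    | head -z -n 2 \
    | sed -z 's/^[^ ]* //' \
    | xargs -0 -I{} cp {} "$dst"
```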
    

    So the script now looks like:

    #!/bin/sh
    # Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
    # Edited/hacked/chopped/stuff by Khaito
    
    # Redirect all script output to log file located in log directory with date in name.
    exec 3>&1 4>&2
    trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
    exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1
    
    # Local Source
    SOURCE=/path/to/source
    # Create directories etc here
    LOCAL=/path/to/backups
    DIR=/path/to/backups/intranet
    DIRD=/path/to/backups/intranet/daily
    DIRW=/path/to/backups/intranet/weekly
    DIRM=/path/to/backups/intranet/monthly
    
    # Local Destination
    DESTINATION=/path/to/network/share
    
    # Database Backup User
    DATABASE='dbname'
    DATABASE_USER='dbuser'
    DATABASE_PASSWORD='password'
    DATABASE_HOST='localhost'
    
    # DO NOT EDIT ANYTHING BELOW THIS
    # Date Variables
    DAY_OF_YEAR=$(date '+%j')
    DAY_OF_MONTH=$(date '+%d')
    DAY_OF_WEEK_RAW=$(date '+%w')
    WEEK_OF_YEAR=$(date '+%W')
    DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
    DAY=$(date '+%a')
    NOW=$(date +"%Y-%m-%d-%H%M")
    MONTH=$(date '+%m')
    YEAR=$(date '+%Y')
    DOW=$(date '+%u')
    YEARMONTH=$(date +"%Y-%m-%B")
    
    # Make Intranet Folder
    if [ ! -d "$DIR" ]; then
            mkdir "$DIR"
            echo 'Intranet directory created...'
    fi
    
    # Make Daily Folder
    if [ ! -d "$DIR/daily" ]; then
            mkdir "$DIR/daily"
            echo 'Daily directory created...'
    fi
    
    # Make Weekly Folder
    if [ ! -d "$DIR/weekly" ]; then
            mkdir "$DIR/weekly"
            echo 'Weekly directory created...'
    fi
    
    # Make Folder For Current Month
    if [ ! -d "$DIR/monthly" ]; then
            mkdir "$DIR/monthly"
            echo 'Monthly directory created...'
    fi
    
    # Make Folder For Current Year
    if [ ! -d "$DIR/${YEAR}" ]; then
            mkdir "$DIR/${YEAR}"
            echo 'Directory for current year created...'
    fi
    
    # Tar the intranet files then dump the db, tar it then remove the original dump file.
    tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
    mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
    tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
    rm -rf  $DIR/database.sql
    echo 'Made daily backup...'
    
    # Check if it's Sunday, if so, copy the two most recent daily files to the weekly folder.
    # Note: date '+%w' gives 0 for Sunday; date '+%u' gives 7, so $DOW would never equal 0.
    if [ $DAY_OF_WEEK_RAW -eq 0 ] ; then
            find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
            echo 'Made weekly backup...'
    fi
    
    # Check if it's the first day of the month, if so, copy the two most recent daily files to the monthly folder
    if [ $DAY_OF_MONTH -eq 1 ] ; then
            find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
            echo 'Made monthly backup...'
    fi
    
    # Check if it's the first day of the year, if so, copy the two most recent daily files to the current year folder
    if [ $DAY_OF_YEAR -eq 1 ] ; then
            find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
            echo 'Made annual backup...'
    fi
    
    # Rsync the new files to the network share for backup to tape
    rsync -hvrPt $DIR/* $DESTINATION
    
    # Delete local backups
    # find $DIRD -mtime +8 -exec rm {} \;
    # find $DIRW -mtime +15 -exec rm {} \;
    # find $DIRM -mtime +2 -exec rm {} \;
    # find $DIR/${YEAR} -mtime +2 -exec rm {} \;
    
    # Delete daily backups older than 7 days on network share
    # find $INTRANETDESTINATION/daily -mtime +8 -exec rm {} \;
    # Delete weekly backups older than 31 days on network share
    # find $INTRANETDESTINATION/weekly -mtime +32 -exec rm {} \;
    # Delete monthly backups older than 365 days on network share
    # find $INTRANETDESTINATION/monthly -mtime +366 -exec rm {} \;
    
    echo 'Backup complete. Log can be found under /path/to/logs/.'
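    The commented-out retention steps can be made safer by restricting `find` to regular files, so the rotation directories themselves can never be matched and removed. A sketch using the script's own 8-day daily window (the path default is a placeholder):

```shell
#!/bin/sh
# Sketch (path default is a placeholder): delete daily archives older
# than 8 days; -type f ensures directories are never removed.
DIRD=${1:-/path/to/backups/intranet/daily}
find "$DIRD" -type f -mtime +8 -exec rm -f {} +
```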
    

    【Discussion】:
