
Mount, Copy, Compress/Archive

  • 16-05-2005 10:45am
    #1
    Registered Users, Registered Users 2 Posts: 11,987 ✭✭✭✭


    So here's what I want to do:

    1. Mount a windows share
    2. Backup to backup drive
    3. Compress and archive it off as Mon/Tue/Wed/Thur/Fri
    4. Notify me of any errors

    Simple enough, I want to back up a windows share onto my linux box. In my head it's like this:
    - Create backup folder, overwrite each day.gz
    smbmount //server/public /backup -o username=guest%,fmask=644,dmask=755,uid=1000,gid=100,debug=0,workgroup=mshome
    
    That mounts it - works grand

    Now to rsync it (don't even think I need to do this, but hey)
    rsync -a /backup /backup/public
    

    now to compress and archive it
    #!/bin/sh
    
        export PATH=/usr/local/bin:/usr/bin:/bin
    
        DAY=`date "+%A"`
    
        tar -cvzf /backup/$DAY /backup/public
    
    

    Now I'm just running through these steps one by one on the command line, and I'm left with a tar file that's not really compressed.

    So I can probably throw in a
    gzip -9 public.tar
    
    or whatever way this all works out, maybe pipe some of those commands as well.

    Any ideas/improvements?

    The folder to be backed up is around 10GB.
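For what it's worth, steps 2-4 can be rolled into one script. A minimal sketch, using throwaway directories so it's safe to try (swap SRC/DEST for /backup/public and /backup on the real box; the notify step is just a placeholder comment):

```shell
#!/bin/sh
# Sketch of the tar step against throwaway directories; SRC/DEST
# stand in for /backup/public and /backup.
SRC=`mktemp -d`
DEST=`mktemp -d`
echo "hello" > "$SRC/doc.txt"

DAY=`date "+%A"`              # Monday, Tuesday, ...

# -z gzips while tar runs, so no separate gzip pass is needed;
# naming the archive .tar.gz makes that obvious
if tar -czf "$DEST/$DAY.tar.gz" -C "$SRC" . 2>/tmp/backup.err
then
    echo "backup OK: $DEST/$DAY.tar.gz"
else
    # step 4: notify on error, e.g. pipe /tmp/backup.err into mail(1)
    echo "backup FAILED, see /tmp/backup.err"
fi
```

Dropped into cron, a script like this covers the mount-then-archive part in one go.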


Comments

  • Technology & Internet Moderators Posts: 28,832 Mod ✭✭✭✭oscarBravo


    bazH wrote:
    rsync -a /backup /backup/public
    
    Um. You probably realise this, but you're creating a copy of the mounted share in a subdirectory of the mounted share. Given that the rsync copies recursively, this just doesn't seem like all that great an idea.
    bazh wrote:
    now to compress and archive it
    #!/bin/sh
    
        export PATH=/usr/local/bin:/usr/bin:/bin
    
        DAY=`date "+%A"`
    
        tar -cvzf /backup/$DAY /backup/public
    
    
    Now I'm just running through these steps one by one on the command line, and I'm left with a tar file that's not really compressed.

    So I can probably throw in a
    gzip -9 public.tar
    
    or whatever way this all works out, maybe pipe some of those commands as well.
    You've included the -z option to the tar command - are you sure it's not compressed? If you use -j instead of -z it should use bzip2 compression, which should make it smaller.
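A quick side-by-side of the two flags on some throwaway text, assuming bzip2 is installed (real Word documents will compress differently, but the mechanics are the same):

```shell
#!/bin/sh
# Compare tar -z (gzip) with tar -j (bzip2) on sample text files;
# paths are throwaway, and -j needs bzip2 on the box.
SRC=`mktemp -d`
i=0
while [ $i -lt 5 ]; do
    yes "the quick brown fox jumps over the lazy dog" | head -5000 > "$SRC/file$i"
    i=`expr $i + 1`
done

tar -czf /tmp/sample.tar.gz  -C "$SRC" .   # gzip compression
tar -cjf /tmp/sample.tar.bz2 -C "$SRC" .   # bzip2 compression
ls -l /tmp/sample.tar.gz /tmp/sample.tar.bz2
```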


  • Registered Users, Registered Users 2 Posts: 11,987 ✭✭✭✭zAbbo


    I changed the names of the folders to make it look simpler.

    The backup and mount folders are in two totally diff locations, so rest assured there.

    The raw folder is 10GB, and the .tar is around 9.8GB.

    These are around 120,000 files, some small, some big, including around 35,000 Word docs.


  • Registered Users, Registered Users 2 Posts: 354 ✭✭AndrewMc


    You might also consider using incremental hard-link based backups (if you're backing up onto a unix filesystem). It works like this:
    • Make your first backup, like
      cp -a /stuff /backup/2005-05-16
      
    • Every day, duplicate the previous day's backup, but using hard-links, like
      cp -al /backup/2005-05-16 /backup/2005-05-17
      
      This duplicates the directory structure, but not the files. The entries in the new directory point to the same actual files on disk, so it takes up very little space.
    • Then use rsync to transfer over the changed files, like
      rsync -av --delete /stuff/ /backup/2005-05-17/
      
      The reason this works (safely) is that rsync transfers each file into a temporary file, and when finished it deletes the old directory entry and moves the temporary file into place. The space used by the old version is still used (since yesterday's backup still points to it), and new space is now used to store today's version.
    This allows you to keep incremental backups, but still allowing you to see a full directory structure of what /stuff looked like on the day in question. You can still tar/gzip the backup afterwards, but unless you've got a huge number of changes per day, this is frequently far more efficient. With a 10GB area to backup, and 1GB of changes per day you could keep about a month's backups, unzipped and ready-to-use, with just 40GB of space.


  • Registered Users, Registered Users 2 Posts: 11,987 ✭✭✭✭zAbbo


    Alright, after a lengthy read and some testing, I've settled on
    #!/bin/bash
    # Backup smbmounted source to internal 300gb sata drive
    
    SOURCE_FOLDER=/backup;
    TARGET_FOLDER=/backup2;
    
    
    # Check to see if user = root
    if (( `id -u` != 0 )); then
    { echo "Sorry, must be root. Exiting..."; exit; }
    fi;
    
    # Rotating 5 daily backups:
    
    # Step 1: force delete the oldest backup, if it exists
    if [ -d $TARGET_FOLDER/backup.5 ] ; then
    rm -rf $TARGET_FOLDER/backup.5 ;
    fi;
    
    # Step 2: Shift the backups by 1
    
    if [ -d $TARGET_FOLDER/backup.4 ] ; then
    mv $TARGET_FOLDER/backup.4 $TARGET_FOLDER/backup.5 ;
    fi;
    
    if [ -d $TARGET_FOLDER/backup.3 ] ; then
    mv $TARGET_FOLDER/backup.3 $TARGET_FOLDER/backup.4 ;
    fi;
    
    if [ -d $TARGET_FOLDER/backup.2 ] ; then
    mv $TARGET_FOLDER/backup.2 $TARGET_FOLDER/backup.3 ;
    fi;
    
    if [ -d $TARGET_FOLDER/backup.1 ] ; then
    mv $TARGET_FOLDER/backup.1 $TARGET_FOLDER/backup.2 ;
    fi;
    
    # Step 3: make a hard-link-only copy of the latest backup, if it exists
    # cpio options are single (p)ass, create dir and (l)ink files
    if [ -d $TARGET_FOLDER/backup.0 ] ; then
    # the next 2 lines are for AIX
    cd $TARGET_FOLDER/backup.0 && find . -print |
    cpio -pdl $TARGET_FOLDER/backup.1 ;
    # the next line is for GNU cp
    # cp -adl $TARGET_FOLDER/backup.0 $TARGET_FOLDER/backup.1
    fi;
    
    # Step 4: create backup by updating previous
    # rsync options are (a)rchive and (delete) extra
    rsync -a --delete $SOURCE_FOLDER/ $TARGET_FOLDER/backup.0/ ;
    
    # Step 5: update backup.0 to reflect the backup date and time
    touch $TARGET_FOLDER/backup.0 ;
    

    Works grand. Now all I need to do is archive it off to tape; I can probably throw in a cpio command to do that on a weekly basis.

    Thanks for all the help

