
rsync resume issue

  • 18-06-2013 4:42pm
    #1
    Registered Users, Registered Users 2 Posts: 3,683 ✭✭✭DeepBlue


    I have a small basic script to upload to a server overnight.
    #!/bin/bash
    
    SOURCEDIR=/home/blah1
    DESTDIR=/server/blah2
    TIMER=5m          # how long to wait between retries
    RETURN_CODE=99
    
    # first attempt
    rsync -P -e ssh "$SOURCEDIR" "$DESTDIR"
    RETURN_CODE=$?
    
    # keep retrying until rsync exits cleanly (exit code 0)
    while [ "$RETURN_CODE" -ne 0 ]
       do
         sleep "$TIMER"
         rsync -P -e ssh "$SOURCEDIR" "$DESTDIR"
         RETURN_CODE=$?
       done
    

    The intention was to recover from instances where I might lose my internet connection: rsync would try again after 5 minutes and resume uploading from where it left off, e.g. if 20% had been uploaded it would upload the remaining 80% instead of starting again at 0%.

    When rsync is running, it copies filename.ext from the source to .filename.extQWUT on the destination (QWUT seems to be some random letters added).
    If I use Ctrl+C to quit the script before it completes, the name of the file on the server is changed back to filename.ext.

    However, if the connection drops, the name of the file on the server remains .filename.extQWUT.

    This causes rsync to start again at 0% when the while loop restarts.

    Is there some way to ensure that rsync will recognise that x% has already been uploaded rather than starting again from scratch?


Comments

  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie




  • Closed Accounts Posts: 18,966 ✭✭✭✭syklops


    DeepBlue wrote: »
    I have a small basic script to upload to a server overnight. [...] Is there some way to ensure that rsync will recognise that x% has already been uploaded rather than starting again from scratch?

    If you are uploading over the internet and the files you are transferring are not already encrypted, then as per Khannie's post, I'd recommend scp. However, if you are uploading a lot of information, there would be less overhead in encrypting it and using rsync.

    To answer your question, there is a switch called '--existing' which will skip the transfer of existing files, so it won't start back at 0%.
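
    Purely to show where it would go (hypothetical, reusing the variables from the script above), an extra switch like that just sits alongside the flags already there:

    # hypothetical placement of an extra switch; check the man page for exactly what it does
    rsync -P --existing -e ssh "$SOURCEDIR" "$DESTDIR"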


  • Registered Users, Registered Users 2 Posts: 3,683 ✭✭✭DeepBlue


    I've seen Khannie's link before - it's what persuaded me to switch from using scp to rsync as the ability to break off an upload and resume is handy for me.

    Since I'm using rsync over ssh I would have thought that would be secure enough.

    The --partial switch may help, and I'm also looking at the --timeout=TIME switch to see if that might help (rough sketch of both below).

    Of course, perhaps it might be the ssh portion of the script that's causing the issue when the connection times out...

    The --existing switch would mean that rsync abandons any attempt at completing the upload of the files which timed out, but that's not what I want.
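
    A rough sketch of how those two switches might slot into the existing loop. Note that -P is already shorthand for --partial --progress, so --timeout is the real addition; the 60-second value is just a guess for illustration:

    #!/bin/bash
    SOURCEDIR=/home/blah1
    DESTDIR=/server/blah2
    TIMER=5m

    # --timeout=60 makes rsync exit with a non-zero code if no data moves for
    # 60 seconds, so the while loop gets a chance to retry instead of hanging
    rsync -P --timeout=60 -e ssh "$SOURCEDIR" "$DESTDIR"
    RETURN_CODE=$?
    while [ "$RETURN_CODE" -ne 0 ]
       do
         sleep "$TIMER"
         rsync -P --timeout=60 -e ssh "$SOURCEDIR" "$DESTDIR"
         RETURN_CODE=$?
       done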


  • Closed Accounts Posts: 18,966 ✭✭✭✭syklops


    DeepBlue wrote: »
    Since I'm using rsync over ssh I would have thought that would be secure enough. [...] The --partial switch may help and I'm also looking at the --timeout=TIME switch.

    If you're using rsync over ssh then that is fine security-wise. At the end of the day it's the same protocol: scp is an SSH-based replacement for rcp, while rsync was designed to be more flexible than rcp.

    The -P switch for rsync enables resumable file transfers (it's shorthand for --partial --progress). Is it possible your while loop is breaking that functionality?

    If you use the same command " rsync -P -e ssh $SOURCEDIR $DESTDIR" on its own and the connection goes down, it should resume when the connection comes back up.
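
    If the plain command still leaves the stray .filename.extQWUT file behind when the link drops, one thing that might be worth trying is --partial-dir, which tells rsync to keep partial files in a named directory on the destination and to look there for something to reuse on the next run (the directory name here is just an example):

    # sketch only: partial transfers are kept in .rsync-partial on the destination
    # and picked up again by the next attempt
    rsync -P --partial-dir=.rsync-partial -e ssh "$SOURCEDIR" "$DESTDIR"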


  • Registered Users, Registered Users 2 Posts: 3,683 ✭✭✭DeepBlue


    syklops wrote: »
    If you use the same command " rsync -P -e ssh $SOURCEDIR $DESTDIR" on its own and the connection goes down, it should resume when the connection comes back up.
    That's the command I used to use but it failed when the connection was lost and didn't resume.
    I'll try simulating the failure again to see what the exact error code was.
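
    A simple way to capture the exact code each attempt returns (just a logging line added to the existing script; the log path is arbitrary):

    rsync -P -e ssh "$SOURCEDIR" "$DESTDIR"
    RETURN_CODE=$?
    # rsync's man page lists the exit values, e.g. 30 = timeout in data send/receive,
    # 12 = error in rsync protocol data stream (typical when the connection dies)
    echo "$(date): rsync exited with code $RETURN_CODE" >> "$HOME/rsync-retry.log"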

