I need to transfer a number of large files over a low-quality broadband link to a server. Each file takes approximately 30 minutes to transfer.
I use scp, but it sometimes hangs — the transfer doesn’t fail with an error, it keeps running, but no further data is transferred.
So, I’m looking for a “failsafe” upload solution, one that will work even if the link fails for a few minutes or is otherwise unreliable.
My idea is:
- split the big file into small parts
- upload the parts, with a timeout and a retry on failure
- is there a ready-to-run tool that implements this idea? (No specific need for scp; it could be FTP or anything else.)
- is there a way to detect when scp hangs? (That is, it is still running, but no longer transferring data.)
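In case it helps frame the question, here is a rough sketch of the split/upload/retry idea using only standard tools (the filename, server, and remote path are hypothetical; the timeout and part size are arbitrary). Note that `timeout` kills scp after a fixed wall-clock time rather than on lack of progress, which is a crude but workable hang detector when the parts are small:

```shell
#!/bin/sh
# Sketch only: bigfile, user@server, and /upload/ are placeholders.

# 1. Split the big file into 10 MB parts.
split -b 10m bigfile bigfile.part.

# 2. Upload each part; kill scp if an attempt takes longer than 300s
#    (a hung scp never finishes, so this bounds each attempt), then retry.
for part in bigfile.part.*; do
    until timeout 300 scp "$part" user@server:/upload/; do
        echo "upload of $part failed or hung, retrying" >&2
        sleep 5
    done
done

# 3. On the server, reassemble with: cat bigfile.part.* > bigfile
```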
You can use rsync to copy your files from one computer to the other; rsync can use ssh as its underlying transport. Combine rsync --partial (which keeps partially transferred files on the destination so a retry can resume them) with a script such as this one that retries on network failure, and you should be able to move your files even in the face of network errors.
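A minimal retry wrapper along these lines might look like the following (filename, server, and path are hypothetical). rsync's --timeout option also addresses the hang-detection question: rsync aborts on its own if no data moves for the given number of seconds, instead of sitting there like a hung scp:

```shell
#!/bin/sh
# Sketch only: bigfile, user@server, and /upload/ are placeholders.
# --partial keeps a partially transferred file on the server so the
# next attempt resumes it; --timeout=60 makes rsync exit with an
# error after 60s of no I/O rather than hanging.
until rsync --partial --timeout=60 bigfile user@server:/upload/; do
    echo "rsync failed, retrying in 5s" >&2
    sleep 5
done
```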
Another way to do it would be to mount the remote filesystem on your local computer with sshfs -o reconnect, and then just cp the file(s); sshfs and the kernel will take care of re-establishing the connection. Based on some testing I did today, this seems to be much slower than rsync.
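Concretely, that approach amounts to something like this (the mount point, server, and paths are hypothetical; -o reconnect is the option that makes sshfs transparently re-establish a dropped SSH session):

```shell
#!/bin/sh
# Sketch only: user@server, /upload, and the local paths are placeholders.
mkdir -p "$HOME/remote"
sshfs -o reconnect user@server:/upload "$HOME/remote"

# Ordinary file operations now go over the (auto-reconnecting) mount.
cp bigfile "$HOME/remote/"

# Unmount when done.
fusermount -u "$HOME/remote"
```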
Finally, you can set up a VPN between the two machines. This involves the most work, and either of the above solutions is much easier, but it would address the flaky network connection at a lower layer. As some people have noted, though, the VPN itself can be flaky too; it can work if the VPN is very aggressive about re-establishing connections, as OpenVPN is, but the above solutions are still much better.