I’ve been backing up my data with rsync, but now that my data directory has grown to 90 GB, it can be pretty slow at times.
So I looked around and found lftp. Backing up with it is very simple:
lftp -u myusername,mypassword -d -e "mirror -vnR /the/source/directory /the/target/directory" the.ftpserver.com
Explanation of the switches
-u : quite self-explanatory: put your username and password on the command line, so this can be automated
-d : turn on debugging. Once you’re happy that lftp is running smoothly every single time, you can omit this.
-e : define the command for lftp to execute.
mirror : mirror the source into the target. Very nifty.
-v : be verbose.
-R : reverse mirror; lftp will issue PUT commands (uploads) to the FTP server. Without it, the mirror command makes lftp issue GET commands (downloads) from the FTP server.
-n : only upload newer files (saves time)
A few tips
1. lftp seems to hang? Try putting set ftp:passive-mode off in /etc/lftp.conf.
Some FTP servers expect an active FTP connection and hang on passive ones. At least the one that ships with my Mac OS X (tnftpd) shows this behaviour.
2. For even faster backup speed, try putting set ftp:sync-mode off in /etc/lftp.conf as well.
This makes lftp pipeline its commands; however, be advised that not all FTP servers and routers can handle it. A sample config with both settings follows below.
Since setting up an FTP server is pretty easy (hint: on Ubuntu, aptitude install gproftpd; on Windows, download and install FileZilla Server), I can now back up my data to more machines, and the backup speed is faster too. All I need on my laptop is lftp.
And did you know that lftp can handle eight file transfer protocols: ftp, ftps, http, https, hftp, fish, sftp and file? This is the Swiss Army knife of file transfer. Very useful. Kudos to Alexander V. Lukyanov for this.