Python script to backup remote directory using sftp
This is the sftp version of the ftp script I published in an earlier post. It logs into a remote host using sftp and recursively backs up a remote directory, including all of its subdirectories.
This one uses Paramiko instead of ftplib, so the file copies are encrypted.
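The script itself is not reproduced here, but the recursive copy it describes can be sketched roughly like this. The function name `download_dir` and the host, credentials, and paths in the commented usage are placeholders of my own, not taken from the original script:

```python
import os
import stat

def download_dir(sftp, remote_dir, local_dir):
    """Recursively copy remote_dir from the SFTP server into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    for entry in sftp.listdir_attr(remote_dir):
        remote_path = remote_dir + "/" + entry.filename
        local_path = os.path.join(local_dir, entry.filename)
        if stat.S_ISDIR(entry.st_mode):
            # Directory on the remote side: recurse into it.
            download_dir(sftp, remote_path, local_path)
        else:
            sftp.get(remote_path, local_path)

# Hypothetical usage with Paramiko (host and credentials are placeholders):
# import paramiko
# client = paramiko.SSHClient()
# client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# client.connect("example.com", username="backup", password="secret")
# download_dir(client.open_sftp(), "/home/backup/data", "local_copy")
# client.close()
```

Paramiko's `SFTPClient.listdir_attr` returns entries with a Unix-style `st_mode`, which is why the standard `stat` module can tell files from directories.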
I live in Chandler, Arizona with my wife and three daughters. I work for US Foods, the second largest food distribution company in the United States. I have worked in the Information Technology field since 1989. I have a passion for Oracle database performance tuning because I enjoy challenging technical problems that require an understanding of computer science. I enjoy communicating with people about my work.
Is it possible to get all subdirectories and zip the whole backup with a timestamp?
That is what I do now, but at the time I wrote the blog post I did not have the ability to SSH into the host my blog was on; I could only FTP individual files. I should probably post my current backup script, but I have to review it first to take out any secure information.
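A minimal sketch of the timestamped zip step, assuming the backup has already been downloaded into a local directory (the `archive_backup` helper is my own invention, not from the original script):

```python
import datetime
import shutil

def archive_backup(backup_dir, dest_base="backup"):
    """Compress backup_dir into a zip file named with a timestamp.

    The archive preserves the subdirectory layout under backup_dir.
    Returns the path of the archive that was written.
    """
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    # shutil.make_archive appends the .zip extension itself.
    return shutil.make_archive(f"{dest_base}_{stamp}", "zip", root_dir=backup_dir)
```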
Thanks for your comment/question!
Thanks for your answer.
I'm trying to back up an SFTP server to my Mac, zip it there, and push the zip to a NAS.
That's working fine.
The only thing is, I would like to keep the directory structure I have on the SFTP server.
The script creates a new directory for all the files. The directory path contains the path where the script found each file, but it is not the old structure, and if I had to restore the backup to the SFTP server I would be copy-pasting for days 😀
As I'm very inexperienced with Python, I'd like to ask whether you have a script that puts the files from the NAS back onto the SFTP server in the original paths, or whether you could help me change the script to keep the old file structure.
Thanks for your comment, but I am not sure that I understand your question. If you zip your SFTP server, the zip will preserve the directory structure, and to restore you would just unzip into the same top-level directory. I use tar and gzip, but zip is basically the same thing: the archive stores the directory structure.
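For example, with Python's standard `tarfile` module, both creating and restoring an archive keep the directory tree intact (the function names and paths here are placeholders for illustration):

```python
import os
import tarfile

def create_archive(src_dir, archive_path):
    """Write src_dir and everything under it into a gzipped tar."""
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))

def restore_archive(archive_path, dest_dir):
    """Unpack the archive; the original directory tree reappears under dest_dir."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(dest_dir)
```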
Can you fix the script so that only new files are downloaded and existing ones are not overwritten?
You would have to compare the remote and local files, perhaps by size and date, before choosing to overwrite them. Or you could simply check whether they already exist. I think you would add these checks around line 49 in the listed code. In my case I used this to download the files into a new directory, so there was nothing to overwrite.
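A hedged sketch of such a check, assuming a Paramiko `SFTPClient` (the helper name `should_download` and the size-plus-mtime rule are my own choices; Paramiko's `stat` does return attributes with `st_size` and `st_mtime`):

```python
import os

def should_download(sftp, remote_path, local_path):
    """Return True when the local copy is missing or looks out of date."""
    if not os.path.exists(local_path):
        return True
    remote = sftp.stat(remote_path)  # SFTPAttributes with st_size / st_mtime
    local = os.stat(local_path)
    # Same size and a remote file no newer than the local one: assume unchanged.
    return remote.st_size != local.st_size or remote.st_mtime > local.st_mtime

# Around the download step in the script, the transfer would then become:
# if should_download(sftp, remote_path, local_path):
#     sftp.get(remote_path, local_path)
```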