First of all: I don't know if this is the correct place to ask. If it isn't, please tell me where I can go with this before you remove my post, that would be great, thank you!
What I want to do (and what my options are):
I have a Synology NAS which can execute scheduled tasks that (yes, I'm a noob) are "Linux commands". My goal is to back up my whole webspace, or a specific folder on it, to the NAS, but only new or changed files (like it would work with git).
I can't use SSH keys (which would be the best way, I assume) because I can't set them up correctly on my NAS. It is possible, but I'm missing the knowledge; even though I would appreciate help with that, it's just too complicated for me. I read a bunch of stuff and it just doesn't work, so I'm trying the way without SSH keys (at least this way I understand a little bit of what's going on).
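From what I read, the key setup would be something like this (just a sketch of my understanding; it assumes the webspace host accepts SSH logins at all, and MY-SSH-USER is a placeholder):

# generate a key pair on the NAS; no passphrase, so a scheduled task can use it unattended
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519 -N ""
# install the public key on the webspace host (only works if it offers SSH access)
ssh-copy-id -i ~/.ssh/id_ed25519.pub MY-SSH-USER@MY-WEBSPACE-URL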
So my pseudo code would be something like:
- Connect the NAS to the webspace
- Go to my specific folder (in my case the FTP login is already limited to only that folder, so we can skip that)
- Create a folder on my NAS / or navigate to it (it already exists)
- Clone all the stuff from the webspace folder initially the first time
- gzip the whole folder and name the zip by date
- When executing it again, the script should only check whether any files have changed, and then update only those, download new ones, and also remove deleted ones (so every one of my zips would be a fully working webspace without any unnecessary files)
- So now my main folder is up to date with the webspace and gets zipped again
What I currently have:
# connect and mirror the webspace into the local backup folder
lftp -u MY-FTP-USERNAME,MY-FTP-PASSWORD MY-WEBSPACE-URL -e 'mirror /test; quit'
# pack the mirrored folder into a tarball named by date
tar -zcvf /volume1/BACKUPS/backup-$(date +%Y-%m-%d-%H-%M-%S).tar.gz /volume1/BACKUPS/MY-WEBSPACE-NAME/
# delete the local copy again
rm -rf /volume1/BACKUPS/MY-WEBSPACE-NAME/
Some problems with that:
- It downloads the whole webspace every time, because I couldn't get the "only new files" part to work. The file size is not the problem, but there are so many small files that it takes a really long time and just blocks the resources of the NAS (see the first snippet after this list for what I think the fix might be)
- For some reason the archive, when unzipped, contains the whole path /volume1/BACKUPS/MY-WEBSPACE-NAME/, and only in the innermost folder are my files. I just want the MY-WEBSPACE-NAME folder itself, with my files inside, to be zipped (second snippet below)
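For the first problem: if I read the lftp man page correctly, mirror can do the incremental part itself, as long as the local copy is not deleted between runs, so the rm -rf at the end is probably what forces the full download every time. Something like this (a sketch, untested; the paths are placeholders):

# fetch only files newer than the local copy, and delete local files
# that no longer exist on the webspace, so the mirror stays an exact copy
lftp -u MY-FTP-USERNAME,MY-FTP-PASSWORD MY-WEBSPACE-URL -e 'mirror --only-newer --delete --verbose . /volume1/BACKUPS/MY-WEBSPACE-NAME; quit'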
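For the second problem: tar's -C option seems to be made exactly for this; it changes into the parent directory first, so only the folder name ends up in the archive (again just a sketch):

# -C switches to /volume1/BACKUPS before archiving, so the tarball
# contains MY-WEBSPACE-NAME/... instead of the whole absolute path
tar -zcvf /volume1/BACKUPS/backup-$(date +%Y-%m-%d-%H-%M-%S).tar.gz -C /volume1/BACKUPS MY-WEBSPACE-NAME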
I would really appreciate it if you could help me with this. It doesn't have to be lftp, I also tried wget but that didn't work either, so anything that works, just go for it. It's been a little while since I last worked on this, but if I remember correctly I can't use git, though I don't remember why anymore.