
Backup my webspace to NAS


First of all: I don't know if this is the correct place to ask. If it isn't, please tell me where I can go with this before you remove my post, that would be great, thank you!

What I want to do (and what my options are):

I have a Synology NAS which can execute scheduled tasks that are (yes, I'm a noob) "Linux commands". My goal is to back up my whole webspace, or a specific folder on it, to the NAS, but transfer only new or changed files (like it would work with git).

I can't use SSH keys (which I assume would be the best way) because I can't set them up correctly on my NAS. It is possible, but I'm missing the knowledge; I read a bunch of guides and it just doesn't work. I would appreciate help with that too, but it's just too complicated for me right now, so I'm trying the way without SSH keys (at least this way I understand a little bit of what's going on).

So my pseudo code would be something like:

  1. Connect the NAS to the webspace
  2. Go to my specific folder (in my case the FTP login is already limited to only that folder, so we can skip that)
  3. Create a folder on my NAS / or navigate to it (it's already existing)
  4. Clone all the stuff from the webspace folder initially the first time
  5. gzip the whole folder and name the zip by date
  6. When executed again, the script should only check whether any files have changed, update those, download new ones, and also remove deleted ones (so each of my zips would be a fully working webspace without any unnecessary files)
  7. Now my main folder is up to date with the webspace and gets zipped again
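The steps above could be sketched as a single shell script for the NAS task scheduler, assuming lftp is available. The host, credentials, and paths are placeholders taken from the question, and the script defaults to a dry run that only prints the commands, so it can be sanity-checked before pointing it at the real webspace:

```shell
#!/bin/sh
# Sketch of steps 1-7, assuming lftp. All names below are placeholders.
set -eu

FTP_USER="MY-FTP-USERNAME"
FTP_PASS="MY-FTP-PASSWORD"
FTP_HOST="MY-WEBSPACE-URL"
MIRROR_DIR="/volume1/BACKUPS/MY-WEBSPACE-NAME"   # local mirror (steps 3/4)
STAMP=$(date +%Y-%m-%d-%H-%M-%S)
ARCHIVE="/volume1/BACKUPS/backup-$STAMP.tar.gz"  # step 5: archive named by date

# Steps 1-4 and 6: mirror --only-newer transfers only new or changed
# files, and --delete removes local files that vanished remotely, so
# the mirror stays a 1:1 copy of the webspace.
MIRROR_CMD="lftp -u $FTP_USER,$FTP_PASS -e 'mirror --only-newer --delete --verbose / $MIRROR_DIR; quit' $FTP_HOST"

# Steps 5 and 7: -C keeps the paths inside the archive relative, so
# unpacking yields just MY-WEBSPACE-NAME/... and not /volume1/...
TAR_CMD="tar -zcf $ARCHIVE -C /volume1/BACKUPS MY-WEBSPACE-NAME"

if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$MIRROR_CMD"   # default: dry run, just print the commands
    echo "$TAR_CMD"
else
    eval "$MIRROR_CMD"
    eval "$TAR_CMD"
fi
```

Run it with DRY_RUN=0 once the printed commands look right.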

What I currently have:

lftp -u MY-FTP-USERNAME,MY-FTP-PASSWORD -e 'mirror /test; quit' MY-WEBSPACE-URL

tar -zcvf /volume1/BACKUPS/backup-$(date +%Y-%m-%d-%H-%M-%S).tar.gz /volume1/BACKUPS/MY-WEBSPACE-NAME/

rm -rf /volume1/BACKUPS/MY-WEBSPACE-NAME/

Some problems with that:

  1. It downloads the whole webspace every time, because I couldn't get the "only new files" part to work. The file size is not the problem, but there are so many small files that it takes a really long time and blocks the resources of the NAS
  2. For some reason the archive, when unpacked, contains the whole path /volume1/BACKUPS/MY-WEBSPACE-NAME/ and my files are only in the last folder. I just want the MY-WEBSPACE-NAME folder with my files inside to be zipped.

I would really appreciate your help with this. It doesn't have to be lftp; I also tried wget, but that didn't work either. Anything that works, just go for it. It's been a while since I last worked on this, but if I remember correctly I can't use git, though I no longer know why.

Off topic now: Questions on Server Fault must be about managing information technology systems in a business environment. Home and end-user computing questions may be asked on Super User, and questions about development, testing and development tools may be asked on Stack Overflow.

For issue 1: if you are using GNU tar (which you probably are), there is the option --after-date=DATE (short form -N DATE). You can use an expression like --after-date='2 days ago', --newer 20210601, or -N '2021-06-01 12:00:00'; be careful to use proper quoting. The accepted formats are said to be a bit version-dependent with GNU tar (I haven't had problems yet, though), so if one doesn't work, at least one of the others should.

You may also use a path (starting with "/"!) instead of a date. In this case, tar will store only files modified or created after the modification time of said file. If you keep your backups in /volume1/BACKUPS, you might use '-N /volume1/BACKUPS/name-of-your-latest-backup'.

For issue 2, the easiest way is to use tar with the -C option, like this:

tar -zcvf /volume1/BACKUPS/backup-$(date +%Y-%m-%d-%H-%M-%S).tar.gz -C /volume1/BACKUPS/MY-WEBSPACE-NAME/ .

This instructs tar to "cd" to /volume1/BACKUPS/MY-WEBSPACE-NAME/ first and then archive the current directory (".", which now is /volume1/BACKUPS/MY-WEBSPACE-NAME/).
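To see the effect, here is a small self-contained check in a temporary directory (the file names are made up); the entries in the archive are relative, so the absolute /volume1/BACKUPS-style prefix never appears:

```shell
#!/bin/sh
# Show that -C makes tar store relative paths in the archive.
set -eu
WORK=$(mktemp -d)
mkdir -p "$WORK/BACKUPS/MY-WEBSPACE-NAME"
echo hello > "$WORK/BACKUPS/MY-WEBSPACE-NAME/index.html"

# Without -C the member names would carry the whole path;
# with -C they start at "." inside MY-WEBSPACE-NAME.
tar -zcf "$WORK/backup.tar.gz" -C "$WORK/BACKUPS/MY-WEBSPACE-NAME" .
tar -ztf "$WORK/backup.tar.gz"
```

If you would rather have the MY-WEBSPACE-NAME folder itself at the top of the archive instead of ".", use -C /volume1/BACKUPS MY-WEBSPACE-NAME (change to the parent, archive the folder by name).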

Comment from jona: Thank you very much. I didn't test the first part because I actually switched to SSH keys and rsync in the meantime, but the second part did the trick.

If I understand correctly what you want to do, I would suggest a program that archives all files in a versioned way, so that you keep every file version over time, for example with ZPAQ technology, or some kind of "snapshot-enabled" tar-like software.

If I understand correctly, your webspace is "mounted" by the Synology NAS as if it were a folder, correct? I'm not sure, because I moved to QNAP, not Synology. (PS: I can't comment, not enough reputation yet.)
