I just noticed this thread about rsync over ssh. I wanted to do it via the web interface, but the web interface doesn't allow it and requires ssh, so I ended up doing it directly by logging in to the NAS via ssh/telnet. It has worked well, but just lately I have had some difficulty (see this thread: ). The title of that thread says it is broken, but I have got it working somewhat since then.

SSH and rsync are both present on the NAS, so I set up a script to run overnight via cron, but I can just as easily log in to my QNAP NAS via ssh or telnet and issue the rsync command by hand. My command is something like the following. I had to run it manually the first time to save the keys from the other side, but that will depend on your set up.

# /usr/bin/rsync -av --delete --progress -e /usr/bin/ssh /share/bkup/pst_file_to_be_backed_up.zip

To pull, I think you just reverse the two locations, but I am not an rsync expert, so someone else can advise depending on the details of your system.

It's a good question, and you asking has made me think. For me, the purpose of the offsite backup is disaster recovery. If a computer breaks and I need to restore the data onto a new one, it is going to be quicker to restore from a local backup than to pull it from a remote site. The set up I have is: the PCs back up to the NAS, and the NAS is backed up both locally (for *when* it fails, so I can restore without having to pull from the remote) and offsite in case the house burns down or some similar disaster.