I use #Syncthing on a #Pi that sits somewhere (not at home) behind a firewall. I installed #Tor on it and can SSH into it via a hidden service, which is very convenient. That #RPi now has a 2TB external HDD that has to be mounted manually (due to encryption). The #RPi is used for #backup and it works like a charm; it has been up without a problem for over a month now and holds some tens of GB.
Really a very nice setup. I guess you could also use #Rsync via the hidden service, but that wouldn't be fast. The thing is, this way you can take the Pi to any place, hook it up to the LAN there, wait five minutes (max) and SSH into it from anywhere in the world. Off-site backup made easy: just drop the Pi at your grandmother's, hook it up to the router and go.

You can also do this without the external HDD and use a 512 GB micro SD.
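
For anyone wanting to replicate this: the SSH hidden service only needs two lines in /etc/tor/torrc (the directory name below is just an example); after restarting Tor, the .onion hostname shows up in that directory:

HiddenServiceDir /var/lib/tor/ssh_hidden_service/
HiddenServicePort 22 127.0.0.1:22

sudo systemctl restart tor
sudo cat /var/lib/tor/ssh_hidden_service/hostname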

@Utzer
I'm thinking about replacing my Pi. It regularly dies for no reason. I'm tired...

I had (maybe still have) a problem like that with another Pi; I bought a new one and that one is running just fine.
The older one now got a bigger micro SD, but I was too lazy to get it configured; I will do that next week, I think.
I am not sure why it died that often, but I think some Pis have problems with the internal power supply.

I don't suppose you documented your process of setting this up, did you?

@Stephen Judge Setting up Syncthing is a very straightforward process which is also documented. Adding a Tor hidden service to connect via SSH is a different story, though adding that capability probably wouldn't be necessary if you keep the backup drive mounted.

@utzer I built something roughly related, only that the remote machine would NAT-punch upon the first SSH connection, and I'd automatically tunnel through that one after the initial SSH-over-Tor trigger. (Full speed, less latency, still discoverable from everywhere. Needs a spot to connect to on demand, though.)
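
A rough sketch of that kind of setup, assuming a reachable middle host (relay.example.org, the users, and the port are placeholders): the remote machine opens a reverse tunnel, and afterwards you connect through that at full speed:

# on the Pi, triggered after the initial SSH over Tor:
ssh -N -R 2222:localhost:22 user@relay.example.org
# from anywhere else, jump via the relay onto the tunnelled port:
ssh -J user@relay.example.org -p 2222 pi@localhost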

Also, given that your latency is quite high over Tor, you may want to skip rsync.
You might be interested in restic, though.

Are you sure that Syncthing is enough for a backup? My main concern is that it merely mirrors your live data somewhere else. But what if you accidentally delete something and only notice it later, say a week later? By that time Syncthing would have propagated the deletion to the other drive, and the data would be gone there as well. I think a backup solution should also keep historical backups for a configurable period of time.

@setThemFree I don't have access to a Syncthing client at the moment, but IIRC, there was an option to have a trash bin of sorts if any files are deleted.
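
If I remember the config right, that is the "trashcan" file versioning type; per folder in config.xml (or via the web GUI) it looks roughly like this, with deleted files kept for a configurable number of days:

<versioning type="trashcan">
    <param key="cleanoutDays" val="14"/>
</versioning>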

Maybe, but what about accidentally overwritten files?

In general, I think Syncthing and/or rsync by themselves are not enough for a reliable backup. The ability to travel back in time is crucial. I have had a situation where I learned that a file had been deleted only a year after the fact. Luckily, I had an obnam snapshot of the data. Everything deduplicated and stored efficiently.

(Since obnam was retired I have moved to borg-backup).

Syncthing can be configured to keep revisions of each file. I use staggered file versioning for different durations (90 days, 180 days); this thins out the older revisions over time. Have a look at the documentation linked in the comments. My initial idea was to combine this with ZFS/BTRFS snapshots, but I have not implemented that yet; I could at any time.
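
For reference, the per-folder config for staggered versioning looks roughly like this (maxAge is in seconds, so 90 days ≈ 7776000; the values here are just examples):

<versioning type="staggered">
    <param key="maxAge" val="7776000"/>
    <param key="cleanInterval" val="3600"/>
</versioning>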

Syncthing can be reached through the firewall; there are relays out there that it will connect to, so it is an indirect connection then, but it works fine. The transferred data is encrypted, but you can also disable the relay function.
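
Disabling relaying is a single switch in the GUI (Settings → Connections) or, if I remember the config right, a single option in config.xml:

<options>
    <relaysEnabled>false</relaysEnabled>
</options>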

I use the Tor hidden service because it is just nice, also in the LAN, to simply plug in the Pi, wait two minutes, and have it available via SSH over the .onion address. This spares me from trying to find the local IP. Yeah, I know how to do that with a network scan.
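
In case anyone wonders how to reach the .onion address: route SSH through the local Tor SOCKS proxy, for example (the address is a placeholder):

torsocks ssh pi@exampleexampleexample.onion
# or, with OpenBSD netcat as a ProxyCommand:
ssh -o ProxyCommand='nc -X 5 -x 127.0.0.1:9050 %h %p' pi@exampleexampleexample.onion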

Also it is more fault-tolerant, but I learned that Tor updates do not work well when done in a screen session while connected via the hidden service. 🙁
Solution for this: sudo apt-get update && sudo apt-get install -o Dpkg::Options::="--force-confold" -y tor

Syncthing can be configured to keep revisions of each file. I use staggered file versioning for different durations (90 days, 180 days); this thins out the older revisions over time. Have a look at the documentation linked in the comments.
Thanks, good to know.
Solution for this: sudo apt-get update && sudo apt-get install -o Dpkg::Options::="--force-confold" -y tor
Thanks for the tip.
I was always looking for a backup script that lets me do a backup via an SSH connection to some server. It should run daily and retain a revision of each run up to a defined maximum age while using hard links (I think that would save space on the destination), but then I would need a more performant computer than an RPi, because we are talking about a few hundred GB.
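
That pattern (daily runs, unchanged files hard-linked against the previous run) is basically what rsync's --link-dest option does; a minimal sketch, assuming SSH access to the target, with placeholder host and paths:

#!/bin/sh
# daily snapshot: unchanged files become hard links to the previous run
DEST="pi@backuphost:/mnt/backup"
TODAY=$(date +%F)
rsync -a --delete --link-dest=../latest /home/ "$DEST/$TODAY/"
# point "latest" at the run that just finished
ssh pi@backuphost "ln -sfn $TODAY /mnt/backup/latest"
# pruning by maximum age is then just deleting old snapshot directories
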
I use borg-backup to make local backups of ~300 GB of data. It deduplicates everything while at the same time giving me a complete snapshot of each backup. I haven't tested it over a network, but backing up loc...

Yes, I guess Borg will need CPU resources on the destination; I assume it compares the files (checksums) before transferring. I will have a look at it, because I need to set up a backup for the home folder and some system folders, and Borg sounds interesting. Thanks.

Please ping back about your findings.

storeBackup might also be an option for you. It is what I use and I like that it is simple to configure and works flawlessly. On top of that, it is written in Perl.

@utzer: yes, you should. Borg-backup is software that's worth using. Very efficient. I use it to back up disk images and large amounts of data.

@Djan GICQUEL yes, I am aware of that, but currently I use Syncthing as a live "backup" synchronization. I am not home very often, so it is better to synchronize the data all the time to prevent losing it. I had a problem before with an HDD that died, and ever since I have used a cloud, or now my home cloud, to prevent loss of data.

And yes, I tested this: a new install is done by restoring the data via Syncthing.

I will at some point look closer at Borg; if it handles blockwise backup well (transferring only the changed parts of a file), it is a good option even for a backup over WAN. In the meantime, the plan is to set up some other computers to use Borg. I should start with that to retire my "tar|gz rsync" solution.

@Djan GICQUEL I just saw that Borg-Backup can do source-side encryption and backups via SSH. I really need to look into this.
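
A rough sketch of what that looks like (host and repository path are placeholders; the encryption key stays on the client, so data is encrypted before it leaves the machine):

# one-time repository setup on the remote side, over SSH
borg init --encryption=repokey-blake2 ssh://pi@backuphost/mnt/backup/borg-repo
# create a backup of the home folder and /etc
borg create --stats --compression lz4 ssh://pi@backuphost/mnt/backup/borg-repo::'{hostname}-{now}' /home /etc
# thin out old archives
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 ssh://pi@backuphost/mnt/backup/borg-repo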

Yes, do it 😃

I use storeBackup which is highly configurable, fast, and very efficient in the way it stores data. The whole thing is written in Perl and thus runs without extras on an RPi. It takes about five minutes to set up, including installation via apt.

I think #BorgBackup takes about the same effort. I needed the most time for the other things around it, like restricting SSH access on the remote host to a certain folder and only allowing one command to be run; things like that took much more effort.
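
For reference, the usual way to do that is a forced command in the remote host's authorized_keys, so the backup key can only run borg serve against one repository path (key shortened, path is a placeholder):

command="borg serve --restrict-to-path /mnt/backup/borg-repo",restrict ssh-ed25519 AAAA... backup-key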