Any suggestions on Crashplan home-to-home setup?

As a CrashPlan Home orphan I’m looking for a way to replicate my current “swapping” backup with my dad (I back my stuff up to me and him, he backs his stuff up to himself and me).

I run a few servers at my end so it’s no problem setting up SFTP, NextCloud, etc. and punching holes in firewalls but it’s a different story for my dad who is running Windows 10 and has little control over the pipe coming into his home.

Are there any tools people might suggest using (or avoiding) to simplify things like:

  • dealing with dynamic IPs (noip.com and the like)
  • getting through firewalls
  • receiving Duplicati backup files (such as FileZilla server for SFTP or IIS??? for WebDAV)

Note that I’m particularly interested in utilities that don’t require server-level hardware or configs - half of this monster is going to be running on plain old Windows 10. :open_mouth:

Thanks!


Oh, dear - I seem to have overlapped a bit with this topic, where the promising-sounding Minio is mentioned (along with the necessary `--s3-ext-forcepathstyle=true`)!


I’d also be curious about possible solutions, but unless you are sending huge amounts of data back and forth, I’m wondering whether it’s actually worth the hassle, given how cheap cloud storage has become…

Probably the easiest solution for your scenario is to set up a simple NAS at your dad’s.

I’m actually evaluating unRAID and NextCloud for that right now, but that’s still a few-hundred-dollar investment to get something like that going, and it still doesn’t deal with the firewall issues.

Mostly I’m looking into this because I’m a paranoid idiot who thinks everybody is out to get my data. Well, that and I’m cheap, so I would likely choose the crappiest provider out there who would then close up shop a week after I finished my first full backup. :blush:

The data that Duplicati encrypts for you?

Yea, like Crashplan, for example :wink:

Well, I wasn’t planning to name names, but… :rofl:

The other thing you might consider is the age of your Dad’s hardware. It sounds outdated, and hard drives eventually fail. It would be super frustrating to back all this up only to have his HD die at some point. I am playing with Duplicati on a single-board computer, and that could potentially be an option for you. However, I’m still in the testing phase and could go another direction.


You can use afraid.org for free dynamic dns.
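If you’d rather script the afraid.org update yourself than run a client, the classic token-in-URL update endpoint can be hit from a few lines of Python. This is a sketch: the token is per-host and the exact update URL should be confirmed from your afraid.org account page.

```python
# Minimal dynamic-DNS updater sketch for afraid.org.
# afraid.org issues each host a randomized update token; the endpoint
# records the public IP of whoever fetches it.  Substitute your token.
import urllib.request

UPDATE_ENDPOINT = "https://freedns.afraid.org/dynamic/update.php?{token}"

def build_update_url(token: str) -> str:
    """Return the per-host update URL for the given afraid.org token."""
    return UPDATE_ENDPOINT.format(token=token)

def update_dns(token: str) -> str:
    """Fetch the update URL and return the server's response text."""
    with urllib.request.urlopen(build_update_url(token), timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Run it from Task Scheduler (or cron) every few minutes and the hostname tracks the home IP.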

As for the file server, you could use ssh with public-key authentication for maximum security. Instructions here: Installing SFTP/SSH Server on Windows using OpenSSH :: WinSCP

Whichever server you use, remember to take extra precautions before exposing your father’s computer to the public internet (e.g. run the server on a separate account with limited privileges, use a random incoming port instead of the default, etc.)
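For the OpenSSH route, those precautions map onto a few `sshd_config` directives. A sketch for the Windows OpenSSH server - the port and account name are placeholders, and details can vary by OpenSSH version:

```
# C:\ProgramData\ssh\sshd_config

# Use a non-default incoming port
Port 50022

# Public-key logins only
PasswordAuthentication no

# Restrict a dedicated limited-privilege account to SFTP, no interactive shell
Match User backupuser
    ForceCommand internal-sftp
```

Restart the `sshd` service after editing for the changes to take effect.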

Thanks for the suggestions! I’ve got an original Raspberry Pi sitting in a drawer somewhere that was way under-powered for something like CrashPlan but should indeed be just fine for Duplicati! Time to read up on a few more topics!

And I’ve not heard of afraid.org or WinSCP’s server-setup guide (I’ve only seen the client) before, so I’ll check those out too. As far as ports go, I try to never use the default ones where possible, though I do know security through obscurity isn’t particularly useful - so thanks for the separate-account reminder.

Oh, and my comment about “not naming names” earlier might need to be adjusted now that we’ve got a final date on Google Drive being retired (though to be fair I haven’t actually looked at Google Backup & Sync). :frowning:


Hi JonMikeIV

I’m also a CrashPlan (CP) Home orphan with friend-to-friend backups. Fortunately my contract does not expire until March so I have some time left.

I just switched to FreeNAS 11 with ZFS on one of my two servers, which had been running Ubuntu with an ext4/mdadm RAID and CrashPlan. FreeNAS 11 includes Minio, and it was almost trivial to set up a bucket and connect to it through Duplicati. The only trick is that you have to enable the `--s3-ext-forcepathstyle=true` option AND you have to connect to the Minio web interface to create the buckets (this was not clear to me - it’s not possible to create buckets from the FreeNAS web interface).
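For anyone wiring this up, the Duplicati storage-target URL for a Minio bucket ends up looking roughly like this. Host, port, bucket, and keys are placeholders, and I’d double-check the exact option names against your Duplicati version - shown across several lines only for readability:

```
s3://mybucket/duplicati
    ?s3-server-name=freenas.local:9000
    &use-ssl=false
    &s3-ext-forcepathstyle=true
    &aws-access-key-id=MINIO_ACCESS_KEY
    &aws-secret-access-key=MINIO_SECRET_KEY
```

The same options can instead be supplied as `--option=value` advanced settings in the GUI.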

I am using afraid.org for dynamic DNS. This is all pretty new to me, but I also just got a trial of Wasabi (S3-compatible) to host a secondary remote backup.

After I get all the initial backups seeded locally this server will live at a remote location. I will let you know how it works.

I’m really hoping, fingers crossed, that Duplicati will consider some kind of Ethernet-link check, as I only want my laptops to back up via LAN cable, never over Wi-Fi (this was one of the great options in CP).

Damon

Thanks for the suggestions!

I tried FreeNAS, unfortunately it was during the Corral debacle so I ended up going with unRAID instead. I’m happy with it so far (though would have preferred free.)

It also supports Docker so I have set up Minio (yep, it was easy) but haven’t actually connected Duplicati to it yet. Luckily I have a fixed IP & domain so don’t have to worry about dynamic DNS.


The interface limit feature has been discussed and requested, but I haven’t seen much work done with it.

Maybe a Bounty might help?

Thanks! Yes, I have followed those threads, including the scripts that can be used, but I have to run this on Windows, Mac, and Linux, and I don’t like the portability problems with the scripts…
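For what it’s worth, the link-type check itself can live in one Python script. This is only a rough sketch: the interface-name prefixes are assumptions that hold on typical Linux systems, while macOS and Windows name adapters differently, so treat it as a starting point rather than a portable solution.

```python
# Sketch of a "wired NIC present?" check for a pre-backup hook script.
# Name prefixes are assumptions: eth*/enp* wired, wl* wireless (Linux).
# A fuller version would also verify the link is actually up, e.g. by
# reading /sys/class/net/<name>/operstate on Linux.
import socket

WIRED_PREFIXES = ("eth", "enp")   # assumed wired-NIC name prefixes
WIRELESS_PREFIXES = ("wl",)       # assumed Wi-Fi name prefixes

def pick_wired(names):
    """Filter interface names down to the ones that look wired."""
    return [n for n in names
            if n.startswith(WIRED_PREFIXES)
            and not n.startswith(WIRELESS_PREFIXES)]

def wired_interface_present() -> bool:
    """True if any interface on this machine looks like a wired NIC."""
    return bool(pick_wired(name for _, name in socket.if_nameindex()))
```

A hook script would exit 0 when `wired_interface_present()` is true and non-zero otherwise, so the backup only runs on cable.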

Maybe a bounty is a good idea - is there a suggested method for doing this? I imagine there might be several contributors to a crowd-funded initiative.

I’m also going to be evaluating Cloudberry, which although not free, might be less than a bounty.

This all seems great, but… unless I’m missing something here :slight_smile:

Setting up a Minio service on your end, using your server hardware, your hard disks, etc. nets you a service your Dad can connect to and back up to, and also yourself. But it doesn’t fulfill the “I want MY backups to be located remotely” part - i.e., at your Dad’s location.

looking for a way to replicate my current “swapping” backup with my dad (I back my stuff up to me and him, he backs his stuff up to himself and me).

You’d have to have the same setup at HIS location too, no?
and that was exactly the thing you were trying to avoid?

Shoutout to dynu.net, which I used for my dynamic DNS. :slight_smile:

Correct.

My primary “need” is to get all the people I used to support with CrashPlan switched to something else (such as Duplicati), getting THEM set up with offsite backups.

My secondary “need” is to get my content (not including their backups) somewhere offsite - such as at my father’s.

Since I know my unRAID box will be up almost 100% of the time, Duplicati works great for that - but even setting up my father’s Windows 10 desktop to Minio or an SFTP server gets a bit sketchy because I don’t know when he’s going to be online, and at the moment Duplicati doesn’t handle intermittent destination availability well (it just skips the backup and waits to try again at the next scheduled time).

I may do that just so I have SOMETHING going, but I’m also looking at building him a FreeNAS or unRAID box like mine (out of spare hardware I have sitting around) or using SyncThing to keep my local backup synchronized with his machine.

According to Wikipedia on SyncThing:

Device discovery is achieved via publicly-accessible discovery servers hosted by the project developers

I’m assuming if that all goes away for some reason the sync will no longer work, but I’ll still be able to manually get the files from the remote machine if necessary.

You could (temporarily) run the task more often so you can be sure it will start backing up when your father is online. But I agree it would be nice if it kept checking in the background for destination availability.
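In the meantime, an availability check can be scripted outside Duplicati via the `--run-script-before-required` option (a non-zero script exit code makes Duplicati skip the run). A minimal sketch, with a placeholder host and port:

```python
# Pre-backup reachability check, intended for Duplicati's
# --run-script-before-required hook: if the destination's SFTP port
# doesn't answer, exit non-zero so the backup is skipped this cycle.
import socket

def host_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# In the actual hook script, the placeholder host/port would be e.g.:
#   raise SystemExit(0 if host_reachable("dads-house.example.com", 50022) else 1)
```

Paired with a short scheduling interval, this approximates “back up whenever he comes online” without hammering a dead connection.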

If you are using the canary version you could reduce versions (and storage space) by using the retention policy.

Those are good suggestions, but I’m in the lucky case that my CrashPlan doesn’t go away until October 2018 - so I have time to wait for (and help) Duplicati to include the features I really want to see. :smiley:

Device discovery is achieved via publicly-accessible discovery servers hosted by the project developers

This was my problem with CrashPlan. Granted, I didn’t use any of their storage. I signed on with them years and years and years ago (about when Mozy went from unlimited storage to some fixed amount), set up my own server and my own storage. All I ever actually “took” from CrashPlan was the ability for remote users to discover my peered backup server. So I am also “good” until Oct 2018 - since I never signed up for a paid plan and never needed their storage. But I’d like to have tested and migrated to a different solution way before that happens.

As for your situation? I dunno… I guess a simple sftp service on your Dad’s box would work. Much more than that and you’re on the hook for a box you admin running firewalls, dynamic dns and storage etc… at a remote location vulnerable to vacuum cleaners… >_<


Yeah - I’m torn. I love the simplicity of CrashPlan / Syncthing style peer finding, but I dislike the need for a centralized “service” over which I have no control.

A “perfect” setup for me would be Duplicati natively providing:

  • backup (done)
  • SFTP server (nice & simple, but still needs manual firewall config)
  • peer-finding including firewall traversal (configurable to be disabled, since by its nature it requires broadcasting itself on the internet)

Well, I don’t know if it’s really decentralized, but I just remembered Hamachi from years ago. Maybe that could be a solution. It still seems to be available and updated: https://www.vpn.net

Oh, thanks for reminding me. My perfect scenario would include something that I believe Hamachi supported at one time (sorry, I gave up on it about 5 years ago) - defining WHO is the P2P server.

So with Hamachi, I could run my own private Hamachi P2P server, and when I configured a client I could point it to MY server instead of a public one. (I think that was Hamachi… sorry if I’m thinking of something else.)

Sure, I still needed to poke a hole in my firewall for Hamachi, but once that was done I didn’t have to worry about them deciding to go out of business or start charging. :wink:

So, yeah - if Duplicati could include that as part of its internal hosting functionality, that would be awesome too.

Hey, I just want to mention here that with the cheap prices of B2 and Wasabi, if you’re thinking of building a new system to put in your Dad’s place, it’s worth doing a spreadsheet to see whether one of these storage providers is more cost-effective (include reasonable estimates of your time, and maybe even billing of your backup “clients”).

I built a server (I had an old Quad-core Q6600 and a handful of old drives) for this purpose and I’m regretting not just doing Wasabi from the beginning.

We have 10TB of data for our photography (and some videography) business. For a while it was growing at 1TB a year, but the past year has seen more than 1.5TB of growth.

To have reasonably RAIDed data locally and offsite, I’m spending at least $500 a year on drives (between replacing failures or too-small drives and adding new capacity to a Drobo and two servers), and although CrashPlan seemed great, it’s been choking on this much data and eating RAM like crazy.

Basically Wasabi would cost us about $500 a year also, which seems like a lot but is actually cheaper than drives/hardware/maintenance/electricity and comes with easier maintenance and peace of mind. Of course, the charge to get the data OUT is also huge, but this is just for backup anyway, so ideally we would never use it.
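As quick arithmetic, the comparison above looks something like this. The per-TB cloud rate is an assumed placeholder - check the provider’s current pricing before trusting the output:

```python
# Rough yearly cost comparison using the figures from this post:
# ~10 TB growing ~1.5 TB/year, versus ~$500/year in replacement drives.
# CLOUD_RATE_TB_MONTH is an assumed placeholder (USD per TB per month).
CLOUD_RATE_TB_MONTH = 4.0
DATA_TB = 10.0
GROWTH_TB_PER_YEAR = 1.5
DIY_DRIVES_PER_YEAR = 500.0

def yearly_cloud_cost(start_tb: float, growth_tb: float, rate: float) -> float:
    """One year's storage bill, billing the average of start and end size."""
    return (start_tb + growth_tb / 2) * rate * 12

cloud = yearly_cloud_cost(DATA_TB, GROWTH_TB_PER_YEAR, CLOUD_RATE_TB_MONTH)
print(f"cloud ~ ${cloud:.0f}/yr vs DIY drives ~ ${DIY_DRIVES_PER_YEAR:.0f}/yr")
```

Note this ignores egress fees, which only matter in a real restore scenario anyway.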

Damon
