Duplicati vs. CrashPlan Home

In an effort to keep @kenkendk from being totally overwhelmed by CrashPlan Home (aka CP) orphans, I thought I'd start this topic as a centralized place for people to start from when considering Duplicati v2 (aka D).

Please note that this is just how I see things working in the two apps, so feel free to correct me if something I've said is incorrect. Oh - and this will likely get many updates as I think of / find new features in the two tools.

Here’s an OFFICIAL Fact Sheet about Duplicati.

And here's a table of the features I'm aware of that are the same and different between the two (x = supported, * = partial / with caveats, blank = not available):

| CP | D | Feature |
|:--:|:--:|---------|
| x | x | Client-side software |
| x | * | Server-side software (CP does maintenance on the server; *D does it on the client, though some server scripts are available) |
| x | x | De-duplication |
| * | x | Cloud storage (*CP only allows cloud storage on their own cloud; D can use many providers) |
| x | x | Local storage |
| x | * | Computer-to-computer backup / "Back up to a friend" (*D must back up to a file server, which a friend can set up, but D does not come with one) |
| x | x | Scheduled backups (D allows more granularity than the FREE CP service) |
| x | x | Pause backups |
|   | x | "Low" memory use (CP uses Java, which can be a big memory hog) |
| x | * | CPU throttling (*D adjusts thread priority, while CP allows actual percentage limits) |
| x | x | Bandwidth throttling |
| x | * | Run missed backups ASAP (*D does NOT detect drive mounts to trigger missed backups like CP does) |
| x | x | Runs on Windows |
| x | x | Runs on Linux |
| x | x | Runs on macOS |
| x | * | Single source to multiple destinations (*D can run two jobs with the same source going to separate destinations for a SIMILAR result) |
| x | * | Specific network port binding (*D allows binding to specific IPs) |
|   | x | Run pre-run and/or post-run scripts |
| x | * | Positive notifications (email if backups have NOT run in a while; *D has third-party hosted options including duplicati-monitoring.com and dupReport) |

Great effort. You may want to add the following:

  • CP simplifies multiple destinations: they are defined per backup set
  • CP can be restricted to specific network interfaces and even specific Wi-Fi networks, which is very important for laptops (to avoid backing up over a cell connection)

Thanks dgcom! Note that Duplicati CAN bind to specific IPs (though apparently not to specific NICs / devices).
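
For example, binding the web UI to one address looks something like this (the address is just an example; check `Duplicati.Server.exe help` for the exact option names):

```bat
rem Serve the Duplicati web UI only on one specific address
Duplicati.Server.exe --webservice-interface=192.168.1.10 --webservice-port=8200
```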


As a workaround, you can use the command-line option --run-script-before-required to launch a script that performs one or more tests. If any of the tests succeeds, the script exits with errorlevel 0; otherwise it exits with errorlevel 1.

Duplicati will abort the backup job if the script specified with --run-script-before-required returns anything other than 0.

A script could look something like this:

```bat
@echo off
setlocal enabledelayedexpansion

rem Assume failure until one of the tests below succeeds
set ErrLev=1

rem Test 1: connected to SSID "MyHomeNetwork"?
rem (find " SSID" matches the SSID line but not the BSSID line)
for /f "tokens=1,2 delims=:" %%a in ('netsh wlan show interfaces ^| find " SSID"') do (set SSID=%%b)
set SSID=!SSID:~1!
if "!SSID!" equ "MyHomeNetwork" (set ErrLev=0)

rem Test 2: can IP 172.16.1.254 be reached?
ping -n 1 -w 500 172.16.1.254 > nul 2> nul
if errorlevel 1 goto :IP_172_16_1_254_Not_Found
set ErrLev=0
:IP_172_16_1_254_Not_Found

rem Test 3: is wired interface "Ethernet" connected AND \\SERVER1\Share\Testfile.txt reachable?
for /f "tokens=1,2 delims=:" %%a in ('netsh interface show interface Ethernet ^| find "Connect state:"') do (set State=%%b)
if "!State!" equ "!State:Disconnected=!" (
   if exist "\\SERVER1\Share\Testfile.txt" (
      set ErrLev=0
   )
)

rem Hand the result back to Duplicati: 0 = run the backup, anything else = abort
exit !ErrLev!
```
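
To have Duplicati call the script before each run, point the option at it (the path and the placeholders are just examples):

```bat
rem Abort the backup unless the script above exits with errorlevel 0
Duplicati.CommandLine.exe backup <storage-url> <source-folder> --run-script-before-required="C:\Scripts\CheckNetwork.bat"
```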

DISCLAIMER: Script not tested, use at your own risk!
(Originally posted on GitHub; reposted because it fits better here.)


Oh, yes, I am totally aware of the workarounds… But workarounds are just that… not features.
And speaking of more features: CP did NOT have an option to run pre- and post-run scripts. That's a very good item to add to the list as well, with a note that it can help with such workarounds :slight_smile:


Hi,

I am also coming from CP, and I have to say I prefer Duplicati over CP :). I wish I had found it earlier.

One very interesting thing I find in Duplicati, which CP lacks, is that it runs without problems on a Raspberry Pi, which makes it very easy to set up a small, low-power local file server. CP worked with some workarounds some time ago, but after an upgrade it was totally broken.

Good work guys! Keep going!


Hi – I am one of those Crashplan Orphans :-).

I tried iDrive, but had to punt it when the sync function went a bit crazy and deleted everything on every computer. Sigh… Fortunately, I had a local backup running every night that I was able to recover from.

I’ve been playing with Duplicati 2.x for about a week and I am really impressed! My only concern is the “beta” status of 2.x.

My primary use case will be backing up via SFTP so that backups will happen even when I'm remote.
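
For reference, such a job from the command line looks roughly like this (host, paths, and credentials are placeholders; see `Duplicati.CommandLine.exe help ssh` for the backend specifics):

```bat
rem Back up Documents to an SFTP server (all names here are made up)
Duplicati.CommandLine.exe backup "ssh://backup.example.com/backups/marc" "C:\Users\Marc\Documents" --auth-username=marc --auth-password=secret --passphrase=my-encryption-key
```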

So here’s the question: In practice, is the beta designation anything I should worry about, or can I assume that things are mature enough that I will be getting good backups? Of course, I will try at least 1 trial restore…

Thanks!

Marc

This may be trivial but I thought I’d mention it anyway: if you search for “Crashplan” in the forum search :mag: you will find quite a number of posts related to CP.

tophee, the "search for CrashPlan" link you provided only shows me results in this topic. Is that intended?

Oops, no, that was not intended. Fixed it.

Re: "Single source to multiple destinations" - I have that working in my test setup. In my tests I have a single set of directories; one backup job goes to a local network drive, and a second goes to the B2 cloud. You do have to set up two separate backup jobs, so it's not as clean as CP's implementation, but it's not too hard and it gets the job done.
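
Roughly, the two jobs look like this from the command line (paths, bucket name, and credentials are placeholders; I'd double-check the B2 option names with `Duplicati.CommandLine.exe help b2`):

```bat
rem Job 1: Documents to a local network drive
Duplicati.CommandLine.exe backup "file://Z:\Backups\Documents" "C:\Users\Me\Documents" --passphrase=my-key

rem Job 2: the same Documents folder to B2
Duplicati.CommandLine.exe backup "b2://my-bucket/Documents" "C:\Users\Me\Documents" --b2-accountid=MY_ACCOUNT_ID --b2-applicationkey=MY_APP_KEY --passphrase=my-key
```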

Unless I have misinterpreted what was meant by the feature, please place an “X” in that column! :grinning:

The OP is now a wiki, which means anyone who is not a new user can edit it.

Correct me if I’m wrong, but I believe CP does a better job here in that it deduplicates across backup jobs, i.e. a file that’s already in the cloud will not be uploaded again, no matter what. Right?

So a big feature I've been seeing people complain about losing with CrashPlan is its "backup to a friend" system. The way it works is that you give out a code that your friend enters, and they can use you as another destination to back their own files up to.

Now, Duplicati doesn’t have that insofar as it’s not as easy as entering a code, but that functionality can be accomplished by setting up the file server you back up to in such a way that your friends can use it over FTP or any of the other supported methods.
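
For example, if your friend gives you an FTP account on their server, the job is just something like this (host, folder, and credentials are made up):

```bat
rem Back up to a folder a friend set aside on their FTP server
Duplicati.CommandLine.exe backup "ftp://friends-server.example.net/backups/from-me" "C:\Users\Me\Documents" --auth-username=me --auth-password=secret --passphrase=my-key
```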

Does this feature deserve a star or a plus?

Not more than a star, I’d say.

tophee (sorry, stupid auto-correct), I guess I always assumed CP de-duped only within a backup set (just like Duplicati does). So even if I had multiple sources going to a single destination in CP, they wouldn't de-dupe across them (potentially due to different encryption keys).

As I understand it, Duplicati does not re-upload files unless there is a change, or as part of archive maintenance (such as history cleanup resulting in the merging of multiple small archives).

I know somebody who works at Code42, but I don't know that he'd be able to confirm our assumptions one way or the other. :slight_smile:

Okay. Or wait: does CP have separate archives (and hence encryption keys) for each backup job? I thought it was one archive per client so that if I create multiple backup jobs on the same machine, de-duplication would work across backups.

tophee, no - you're right. I believe there's a single key for each CLIENT, not each job, and all jobs for a single client go to a single location and are de-duped as a set. I guess I was trying to say that if you've got multiple clients going to a single destination, they would each be de-duped individually.

As for Duplicati, I'm pretty sure each backup job is only de-duped against itself - so even if you have a single client with two different backup jobs going to the same destination (obviously either to separate folders or with a --prefix setting to distinguish the file sets), they will be de-duped separately.

I guess another way to look at it is that in your C:\Users\<user>\AppData\Roaming\Duplicati folder (assuming a Windows non-server install), de-duplication will only occur inside a single xxx.sqlite file. So if you make two backup jobs and end up with both xxxA.sqlite and xxxB.sqlite files, I'm pretty sure they'll de-dupe independently.
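
Something like this, for example - two jobs sharing one destination folder, kept apart by --prefix (paths, prefixes, and keys are placeholders):

```bat
rem Two jobs, one destination folder; --prefix keeps their file sets separate
Duplicati.CommandLine.exe backup "file://Z:\Backups" "C:\Data\ProjectA" --prefix=jobA --passphrase=key-a
Duplicati.CommandLine.exe backup "file://Z:\Backups" "C:\Data\ProjectB" --prefix=jobB --passphrase=key-b
```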

Unfortunately, this is just my guess, and neither of the "how it works" pages specifies what happens in this scenario.


So I guess we'll have to bug @kenkendk and ask: if a single client has multiple backup jobs that happen to include some of the same files, will de-duplication happen for each job individually (so a shared 100 MB file will be backed up twice, once for each job) or across all jobs on the client (so a shared 100 MB file will be backed up only once, no matter how many jobs point to it)?


Deduplication indeed doesn't work across multiple backups. In theory it could be implemented, but this question has already been answered by @kenkendk.


I second the comment about single source to multiple destinations.

Right now I am benchmarking several storage back ends and have eight (!) jobs of the same folder. It sure was a hassle to set up and maintain.

In the long term I expect to keep at least two, maybe three backends, so it's still worth implementing.
