Duplicati 2 vs. Duplicacy 2

Continuing the tests …

One of my main use cases for Duplicati and Duplicacy is backing up large files (a few GB) that change in small parts. Examples: VeraCrypt volumes, mbox files, miscellaneous databases, and the Evernote database (basically an SQLite file).

My main concern is how much the space used on the backend grows as incremental backups are performed.

I had the impression that Duplicacy made large uploads for small changes, so I did a little test:

I created an Evernote installation from scratch and then downloaded all the notes from the Evernote servers (about 4,000 notes; the Evernote folder then totaled 871 MB, of which 836 MB was the database).

I did an initial backup with Duplicati and an initial backup with Duplicacy.

Results:

Duplicati: 672 MB, 115 min
Duplicacy: 691 MB, 123 min

Pretty much the same, apart from a small difference in size.

Then I opened Evernote, made a few minor changes to some notes (a few KB), closed Evernote, and ran the backups again.

Results:

Duplicati (from the log):

BeginTime: 21/01/2018 22:14:41
EndTime: 21/01/2018 22:19:30 (~ 5 min)
ModifiedFiles: 5
ExaminedFiles: 347
OpenedFiles: 7
AddedFiles: 2
SizeOfModifiedFiles: 877320341
SizeOfAddedFiles: 2961
SizeOfExaminedFiles: 913872062
SizeOfOpenedFiles: 877323330

and

Duplicacy (from the log):

Files: 345 total, 892,453K bytes; 7 new, 856,761K bytes
File chunks: 176 total, 903,523K bytes; 64 new, 447,615K bytes, 338,894K bytes uploaded
Metadata chunks: 3 total, 86K bytes; 3 new, 86K bytes, 46K bytes uploaded
All chunks: 179 total, 903,610K bytes; 67 new, 447,702K bytes, 338,940K bytes uploaded
Total running time: 01:03:50

Of course it skipped several chunks, but it still uploaded 64 out of 176 chunks!
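
As I understand it (and this is just my rough mental model, not Duplicacy's actual algorithm), content-defined chunking cuts the file wherever a rolling hash over the last few bytes matches a pattern, so a physically localized change should only invalidate the chunk(s) it lands in; the boundaries after it stay where they were. A toy sketch of the idea in Python, with made-up parameters:

```python
import hashlib, os

def boundaries(data, mask=(1 << 16) - 1):
    """Toy content-defined chunker: cut wherever a rolling hash over roughly
    the last 32 bytes matches the mask (average chunk ~64 KB here; real tools
    use chunks in the MB range)."""
    cuts, h = [], 0
    for i, byte in enumerate(data):
        h = ((h << 1) & 0xFFFFFFFF) ^ byte   # a byte's influence fades after 32 shifts
        if (h & mask) == mask:
            cuts.append(i + 1)
    return cuts

def chunk_ids(data):
    """One hash per chunk -- stands in for the chunk IDs stored on the backend."""
    edges = [0] + boundaries(data) + [len(data)]
    return [hashlib.sha256(data[a:b]).hexdigest() for a, b in zip(edges, edges[1:])]

original = os.urandom(8 * 1024 * 1024)   # 8 MB of pretend database
edited = bytearray(original)
edited[5_000_000] ^= 0xFF                # flip a single byte in the middle

before, after = chunk_ids(original), chunk_ids(bytes(edited))
print(len(before), "chunks before,", len(after), "after,",
      len(set(after) - set(before)), "to upload")
# Only the chunk(s) around the edited byte come out different.
```

So if a few KB of note edits produce 64 new chunks, my guess is that the changed bytes are physically scattered all over the database file (SQLite rewrites pages in place, wherever they happen to live), and every scattered region drags a whole chunk with it.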

I decided to do a new test: I opened Evernote and changed a single letter in the content of one note.

And I ran the backups again. Results:

Duplicati:

BeginTime: 21/01/2018 23:37:43
EndTime: 21/01/2018 23:39:08 (~1.5 min)
ModifiedFiles: 4
ExaminedFiles: 347
OpenedFiles: 4
AddedFiles: 0
SizeOfModifiedFiles: 877457315
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 914009136
SizeOfOpenedFiles: 877457343

and

Duplicacy (remember: only one letter was changed):

Files: 345 total, 892,586K bytes; 4 new, 856,891K bytes
File chunks: 178 total, 922,605K bytes; 26 new, 176,002K bytes, 124,391K bytes uploaded
Metadata chunks: 3 total, 86K bytes; 3 new, 86K bytes, 46K bytes uploaded
All chunks: 181 total, 922,692K bytes; 29 new, 176,088K bytes, 124,437K bytes uploaded
Total running time: 00:22:32

In the end, the space used on the backend (including the 3 versions, of course) was:

Duplicati: 696 MB
Duplicacy: 1,117 MB

In other words, for these few tiny changes Duplicati added 24 MB to the backend, while Duplicacy added 425 MB.
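
If that is what is going on, the arithmetic roughly fits. Assuming (and these defaults are my assumption, I have not verified them in either manual) that Duplicati deduplicates on fixed blocks of about 100 KB while Duplicacy's variable-size chunks average around 4 MB, each dirty SQLite page costs roughly one 100 KB block in one case and can cost a whole multi-megabyte chunk in the other. A back-of-the-envelope simulation with guessed numbers:

```python
import random

FILE_SIZE   = 836 * 1024 * 1024   # the Evernote database from the first backup
PAGE        = 4 * 1024            # SQLite page size (assumed default)
DIRTY_PAGES = 100                 # pure guess at how many pages a small edit rewrites

# Deduplication granularity -- assumed defaults, not verified:
DUPLICATI_BLOCK = 100 * 1024      # fixed-size blocks
DUPLICACY_CHUNK = 4 * 1024 * 1024 # average variable-size chunk

random.seed(42)
dirty_pages = random.sample(range(FILE_SIZE // PAGE), DIRTY_PAGES)

def reupload_mb(unit_size):
    """Distinct dedup units touched by the dirty pages, converted to MB to upload."""
    touched = {p * PAGE // unit_size for p in dirty_pages}
    return len(touched) * unit_size / 2**20

print(f"~100 KB blocks: about {reupload_mb(DUPLICATI_BLOCK):.0f} MB re-uploaded per run")
print(f"~4 MB chunks:   about {reupload_mb(DUPLICACY_CHUNK):.0f} MB re-uploaded per run")
```

With 100 scattered dirty pages this toy model re-uploads roughly 10 MB with 100 KB blocks and a few hundred MB with 4 MB chunks (before compression), which is at least the right order of magnitude for what I'm seeing; the dirty-page count itself is a pure guess.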

The only problem on the Duplicati side: even with such a simple, small backup, Duplicati showed me a “warning” on the second and third runs, but when I checked the log:

Warnings: []
Errors: []

This looks like a behavior already known in Duplicati (this kind of “false positive”). What worries me is getting used to ignoring the warnings and then missing a real one.

Now I’m trying to work out the technical reason for such a big difference, thinking about how Duplicati and Duplicacy structure their backups. Any suggestions?