I'm a big, long-term user of Duplicati, but I want to share some observations of faults - the aim is to promote some healthy discussion and hopefully get some thoughts/energy into improving Duplicati further.
So I was experimenting with Arq Backup recently (gasp!) - while it is a bit rough around the edges and somewhat exposed to single-developer risk - I really like a couple of its features:
- Ability to get at backups from another computer
My main file server uses Arq/Duplicati as a backup in case of fire/disaster, but occasionally I need to get at those files from outside home. That's usually impossible because the file server is LAN-only, but with Arq I can reach the backup and pull out the files. It doesn't happen often, but it's helpful to have when I need it (CrashPlan used to allow this as well).
For Arq this is viable because the configuration is mostly self-contained: I just point to the backup location, put in the password, and can easily see the backup records. I find Duplicati much more trouble for one-off, ad-hoc access to backups.
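To be fair, Duplicati's command-line client can do an ad-hoc restore against a bare backend if you know the storage URL and passphrase - a sketch from memory (the command name is from the Duplicati 2 Linux package; the URL, passphrase, and paths here are placeholders, not my real setup):

```shell
# List the backup versions stored at the backend (placeholder URL/passphrase):
duplicati-cli list "s3://my-bucket/fileserver-backup" --passphrase="..."

# Restore one file from the latest version (version 0) into a local folder:
duplicati-cli restore "s3://my-bucket/fileserver-backup" "*/documents/report.pdf" \
    --passphrase="..." --version=0 --restore-path="/tmp/restored"
```

It works, but compared to Arq's point-and-click it's a lot of ceremony for a one-off pull, which is really my complaint.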
- Ability to easily trim the backup
Being a bit OCD, I sometimes want to optimize the space usage. In Arq it is easy to choose which backups to keep (discarding all older ones), and then, either through the normal periodic cleanup or on my own initiation, run a drop-unreferenced-objects pass to minimize my backend storage use.
It isn't clear how to do this easily in Duplicati (at least, I couldn't find an option for deleting specific backups), and Duplicati's cleanup/compact seems unstable/error-prone enough that I'm much less comfortable using it.
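For anyone else looking, the closest I've found is the CLI rather than the GUI - again a from-memory sketch with placeholder URL/passphrase, so double-check the option names against your version:

```shell
# Drop all but the 10 most recent backup versions (placeholder URL/passphrase):
duplicati-cli delete "s3://my-bucket/fileserver-backup" \
    --passphrase="..." --keep-versions=10

# Or delete one specific version by its index from the 'list' output:
duplicati-cli delete "s3://my-bucket/fileserver-backup" \
    --passphrase="..." --version=3

# Compact: repackage volumes and discard data no longer referenced by any version:
duplicati-cli compact "s3://my-bucket/fileserver-backup" --passphrase="..."
```

So the mechanism exists, but it's buried, and given the stability issues below I'm wary of running compact on a backup I care about.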
Sort of related to the two points above, Arq seems a little more graceful when hitting errors - e.g. missing data files or errors accessing the backend. On a few occasions I got the red error messages about corrupt/missing files, and even after choosing repair/rebuild the errors kept coming back. In fact, one time I got so frustrated trying to find the right "just go recreate it" option that I deleted the whole backup and redid it - viable for my 100 GB backup, but I have another close to 1 TB for which that would be really painful.
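The nearest thing I've seen to a "just go recreate it" is pointing repair at a fresh local database path, which forces Duplicati to rebuild it from the backend - a hedged sketch with placeholder URL/passphrase/path (back up the old database file first, since I'm not certain this behaves identically across versions):

```shell
# Repair against a database path that doesn't exist yet; Duplicati should
# recreate the local database from the remote files (placeholders throughout):
duplicati-cli repair "s3://my-bucket/fileserver-backup" \
    --passphrase="..." --dbpath="/tmp/fresh-local.sqlite"
```

Even so, in my experience the recreate can still surface the same missing-file errors, which is what drove me to the delete-and-redo route.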
On a more cosmetic note: Arq makes it really easy to see, in the restore menu/file picker, which files were added/modified/deleted in each backup, and that is rather handy sometimes.
As I said, the aim is healthy discussion. I like Duplicati's ability to pack the data files into one big volume (Arq uses many little files, and that really cripples the performance of basically everything else when running intensive operations), and Duplicati supports more backends, which is immensely useful. I just hope that some of the blemishes can be addressed.