Good point, although I’m not sure it impedes the planned use here (which isn’t fully known).
Possibly the only time the backup data is of interest is when somebody wants to restore.
Possibly the clients (especially the desktops) run backups more often than the drives get swapped.
For a restore, the onsite drive would be preferred; if disaster strikes, there’s still an older backup offsite.
Leaving a drive attached by default also makes that drive more vulnerable to damage.
It’s all tradeoffs. Multiple local jobs have their own drawbacks too (as you noted). Moving on:
I haven’t done much with that, so I’m not sure exactly what its limits are. It looks slow.
On the other hand, I’m not sure how well the other candidates handle having no drive attached.
That sounds like the temporary database that Direct restore from backup files builds for disaster recovery.
If the local DB is missing because no drive is attached, its backup files are missing too. Nothing is there…
If browsing around the backup with no drive attached is important (I don’t know), the database needs to be on C:.
This might become more useful someday as development progresses (all thanks to the volunteer developers…).
Volunteers in all areas (development of fixes and features, testing, docs, etc.) can help hugely.
There’s a pressing need to work on fixes, so features often must wait for special assistance such as the volunteer help mentioned above.
The reason I mention this is that, while it’s always nice to have a better view (including dates) of files,
physical retrieval of a drive might add some additional motivation to try to plan the restore in advance.
C: storage is especially useful if C: is an SSD (I don’t know). The portable drive is (I’d guess) a mechanical one.
There’s a C: space question, and there’s also a redundancy point. Sometimes keeping the working DB offsite (on the drive that rotates out) is better for local disaster recovery, as it may avoid discovering a DB recreate issue too late.
Once again, tradeoffs.
As I mentioned, are you sure this could easily, without additional work, steer Duplicati’s database there?
That might be beyond its design intent (which seems more aimed at backup files), but I have not tested.
Keeping the DB on C: rather than on the removable drive sidesteps the issue, but I don’t know what you mean.
You seem to be rebutting my concern about the database on a portable drive (from my lines above and below)
by responding
By the way (before anyone else hits it), this is best done on Options screen 5. Screen 2 drops it (a bug).
Lots of good advice, but I don’t know if that’s still too much for the network, or if there’s a remote device.
To clarify, the USN journal improves scanning speed; either way, the upload is only the source changes.
You can also directly view the historical upload sizes. They’re in the job log: click on Complete log for the stats.
"BackendStatistics": {
"RemoteCalls": 16,
"BytesUploaded": 104392420,
"BytesDownloaded": 53284903,
If the rate of change is highly variable, one might need to read a lot of logs. Other tools might track long-term changes better, but client complaints about Duplicati slowing their other Internet use might track short-term load.
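If reading each Complete log by hand gets tedious, here’s a minimal sketch of how one might total the uploads instead. It assumes each run’s Complete log JSON has been saved to its own file in a logs folder; that folder name and file-per-run layout are just assumptions for illustration, since Duplicati doesn’t export these files automatically (only the BackendStatistics and BytesUploaded keys come from the log shown above).

```python
# Sketch: total BytesUploaded across saved "Complete log" JSON files.
# Assumes each run's Complete log was pasted into its own .json file under ./logs
# (hypothetical layout; Duplicati does not write these files itself).
import json
from pathlib import Path

def find_backend_stats(node):
    """Recursively look for a 'BackendStatistics' object wherever it is nested."""
    if isinstance(node, dict):
        if "BackendStatistics" in node:
            return node["BackendStatistics"]
        for value in node.values():
            found = find_backend_stats(value)
            if found is not None:
                return found
    elif isinstance(node, list):
        for item in node:
            found = find_backend_stats(item)
            if found is not None:
                return found
    return None

total_uploaded = 0
for path in sorted(Path("logs").glob("*.json")):
    stats = find_backend_stats(json.loads(path.read_text()))
    if stats:
        uploaded = stats.get("BytesUploaded", 0)
        total_uploaded += uploaded
        print(f"{path.name}: {uploaded / 1e6:.1f} MB uploaded")

print(f"Total: {total_uploaded / 1e6:.1f} MB")
```

Dividing that total by the number of days covered would give a rough average daily upload to compare against the link capacity.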
One can throttle network usage if that helps (it’s still a bit bursty; smooth throttling needs QoS and a router).
YMMV, and I’m sorry if @Flea77’s head is spinning from all this, but it’s been a pretty thorough discussion.
Thanks to all who participated, but without knowing more about actual usage, it’s hard to say what’s best.