As the title of the topic says, I'd like to know if running two Duplicati 2.0 instances on the same machine is possible. I'd like to do a restore (as a test), but I fear it will run for days, and in the meantime I'd like to continue my regular backups.
Do you think it's possible?
I may be wrong, but I remember that Duplicati 1.3.4 had a "lock file" that prevented two instances of Duplicati from running on the same machine, even if one was launched in --portable-mode.
Can you help me ?
P.S.: I mean I will restore into a TMP folder; I will not overwrite my original files!
I feel this is needed at times the other way around: a backup is running and there is a need to restore a few files from a previous version due to an accidental file deletion.
The only quick solution for now is to use the command-line tool simultaneously while the web UI runs.
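For example, a CLI restore into a temporary folder can run alongside the server. This is only a sketch: the destination URL, file pattern, passphrase, and paths below are placeholders for your own setup, and `--no-local-db` is used so the restore does not touch the running job's database.

```shell
# Placeholders only -- substitute your real destination URL, filter, and passphrase.
Duplicati.CommandLine.exe restore "file:///mnt/backup-dest" "*/report.odt" \
  --restore-path="/tmp/restore-test" \
  --no-local-db=true \
  --passphrase="your-passphrase"
```

Restoring without the local database is slower (Duplicati must read the destination's index files), but it avoids contending with the database the web UI job is using.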
I would worry about the Duplicati instance running a backup doing something that drastically changes the destination. For example, the COMPACT command could run, though the `--no-auto-compact` option can prevent that. Defense against trojans talks about this some; there, the ultimate protection against damage from ransomware is a destination that can only be added to…

There's still a lock file. It can probably be worked around, but doing so might have bad results in some cases. Interestingly, Duplicati knows how to avoid other copies of itself using the default port 8200 for the web UI. See Duplicati.Server.exe. This can happen by accident when somebody installs Duplicati as a Windows service (which starts first and gets port 8200) and then starts the tray icon (which gets 8300).

By default the databases are separate, so the job view is separate. A restore doesn't need a complete database, and for realistically rehearsing a lost-disk recovery you wouldn't want one anyway, just the destination files.
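The port fallback described above can be illustrated with a small sketch. This is not Duplicati's actual code; it just mimics the observable behavior, where a second instance finds 8200 claimed and takes 8300 instead (the port numbers come from the post; the probing logic is mine):

```python
import socket

def pick_port(preferred=8200, fallback=8300):
    """Loose sketch of a server instance claiming its web-UI port:
    try the preferred port first, and fall back if it is taken.
    Illustrative only -- not Duplicati's implementation."""
    for port in (preferred, fallback):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            s.listen(1)
            return port, s  # keep the socket open so the port stays claimed
        except OSError:
            s.close()
    raise RuntimeError("neither port is free")

first, hold1 = pick_port()   # plays the role of the Windows service
second, hold2 = pick_port()  # plays the role of the tray icon
print("first instance port:", first)
print("second instance port:", second)
```

Run on an idle machine, the first call claims the preferred port and the second call lands on the fallback, which matches the accidental service-plus-tray-icon scenario.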
This issue on GitHub is somewhat related: Repair command deletes remote files for new backups · Issue #3416 · duplicati/duplicati · GitHub
It could be a use case of wanting to restore from a different machine, or from a different Duplicati server on the same machine, but you have to be careful not to break your backup with two jobs pointing at the same data.
Ouch on that GitHub wreckage. I agree care is required. I think this is getting far away from the nice, simple, coordinated system that is wished for (but doesn't exist yet), but some storage can provide limited access to avoid such issues. For example, Application Keys in Backblaze B2 can probably stop the reader from writing. Another option (possibly at some storage or transfer expense) is to clone the destination for the restore test. Cloning to a local drive could reduce storage costs, but would probably give an unrealistically fast restore.
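A clone-based restore test could look like the sketch below. All paths are stand-ins (a demo directory is created so the commands are self-contained); in practice you would copy your real destination and point "Direct restore from backup files" or the CLI at the clone, so the live backup never sees a reader:

```shell
# Stand-in paths for illustration; substitute your real destination.
SRC=/tmp/demo-backup-dest          # plays the role of the real destination
CLONE=/tmp/restore-clone           # working copy used only for the restore test
mkdir -p "$SRC"
touch "$SRC/duplicati-demo.dblock.zip.aes"   # fake destination file for the demo
mkdir -p "$CLONE"
cp -a "$SRC/." "$CLONE/"           # rsync -a also works and can resume/refresh
ls "$CLONE"
```

Since the clone is local, restore speed from it will not reflect restoring over the network from the real destination, which is the caveat mentioned above.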
Direct Restore from Backend Failing - Found X remote files that are not recorded in local storage, please run repair #3231 is a less harmful pit to fall into. Duplicati checks destination files to detect extra or missing files…