Restore with read-only database

Defense against trojans comes down to protecting the remote backup, sometimes for security's sake, sometimes for regulations. Even though normal Duplicati operation is create/read-only (after disabling compaction and retention deletes), there are occasional situations where a file delete may be attempted; for example, if an upload fails partway through, Duplicati will try to delete any pieces that made it out. You may then see warnings as it tries to delete a file that's missing. This may have gotten noisier in canary and experimental builds, so maybe I'll pursue it.
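To illustrate what I mean by create/read-only operation (this is only a sketch; the destination URL, source path, and dbpath are hypothetical), it's basically a backup with compaction off and no retention options set:

```
REM Sketch only: destination URL, source, and dbpath are placeholders.
REM Leaving out --keep-time / --keep-versions / --retention-policy means
REM retention never deletes, and --no-auto-compact stops compact deletes.
Duplicati.CommandLine.exe backup "gcs://my-bucket/backup?authid=..." "C:\TeamShare" ^
  --no-auto-compact=true ^
  --dbpath="C:\DuplicatiDB\team.sqlite"
```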

Because managers with restore privileges are apparently trusted to see anyone's files (fixable if you divide backups by team), and because this all seems to live on the server (except for the Google Cloud Storage backup files), an easy solution is to give managers web access to Duplicati on the server, and hope nobody hacks their way in.
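A minimal sketch of that (the web UI normally listens only on loopback, I believe; the port and password here are placeholders):

```
REM Placeholders: pick your own port and a strong password.
Duplicati.Server.exe --webservice-interface=any --webservice-port=8200 --webservice-password=<strong-password>
```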

Setting a certificate on the Duplicati server would be worthwhile so that you at least protect against network packet capture. Remote Desktop to the server would be less hackable (I assume); if policy allows that, managers could browse on the server itself.
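If I remember the server options right, HTTPS comes from pointing it at a PKCS #12 certificate, roughly like this (the file path and password are hypothetical):

```
REM Hypothetical certificate path and password; the .pfx must contain
REM the private key for the server's certificate.
Duplicati.Server.exe --webservice-sslcertificatefile="C:\certs\duplicati.pfx" --webservice-sslcertificatepassword=<pfx-password>
```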

There’s a feature request somewhere for better administrative partitioning. It likely needs webapp work. :wink:

Although it’s conceivable a manager could go into Database and delete it, it can in theory be (too slowly…) recreated from the remote backup files, which unfortunately must also be done if Duplicati breaks its own DB. People sometimes back up their DB (ideally keeping at least the two most recent copies) to avoid doing Recreate. Dropping in an old DB, though, means it will be surprised by remote files it hasn’t seen. Repair can delete new files, but in your situation I’m not entirely sure what it will do when faced with remote files it can’t delete. Please test.
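For reference, Recreate is what you get from the repair command when the database file is missing; a hedged sketch (URL and dbpath hypothetical, matching the earlier example):

```
REM With the file at --dbpath absent, repair rebuilds the local database
REM from the remote dlist/dindex files (this is the slow Recreate path).
Duplicati.CommandLine.exe repair "gcs://my-bucket/backup?authid=..." --dbpath="C:\DuplicatiDB\team.sqlite"
```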

Having an ever-increasing number of versions will also take its toll as the DB has more and more to track. This is another reason why smaller backups might be better. If something goes wrong, it’s a smaller fixup.

You’re somewhat protected against permanent remote damage by your access permissions; however, a push of the Backup button on a PC could upload partial file lists, e.g. due to permission differences or missing drive mappings. Fortunately, the next proper backup on the server would find all the blocks it needs, so there would be no big new upload.

If you really want to put Duplicati onto manager PCs, giving them a copy of the DBs (perhaps even by pull rather than server push) would avoid the update-a-read-only-DB problem, but worsen the all-DBs-using-the-same-remote one. Large enough DBs may take a while to copy, which favors smaller backups or your read-only-DB request.
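As a sketch of the pull approach (all names hypothetical; Duplicati's job databases sit in its data folder under randomized file names), a manager PC could copy the DB while no backup is running:

```
REM Hypothetical share and job-database name; run while backups are idle
REM so the sqlite file is not copied mid-write.
robocopy \\server\DuplicatiDB "%LOCALAPPDATA%\Duplicati" TEAMJOB1.sqlite
```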

Interesting goal, and you have some setup choices that make it a little more feasible than the general case.
