Backing up the backup database, and some cryptic messages in my logs

Which files in the %appdata%\Duplicati folder do I need to back up in case of a major disaster, like a flood or theft of the original hardware?

I have a working backup that sends files to OneDrive. BUT I realise that if the hard disk fails, recovery is hard without that Duplicati database.

I want to add a second backup to pick up that Duplicati folder of settings. Which files in there are the important ones to back up? And could I just simplify things by popping that database into my OneDrive?

I noticed that a single backup is not allowed to back up its own database, as it is in use. This is why I want to make a second backup to grab the first backup’s database. (Hope that makes sense.)

Technically you don’t HAVE to back any of those files up, if you have recorded information about your backup sets elsewhere (source, destination, credentials, encryption passphrase, options, etc.).

That being said, if you back up the Duplicati-server.sqlite file it will make your life a lot easier if you have to do a disaster recovery. This file contains your global settings as well as all job configurations.

Be sure to store this in a secure location as it contains sensitive data (credentials, encryption passphrase, etc).

The other, much larger, sqlite files don’t need to be backed up; Duplicati can recreate them automatically from the back-end data. Hopefully you are running 2.0.5.1_beta - it has a lot of improvements that speed up the database recreation process.
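If you do want to pop a copy into OneDrive, a one-off copy is all it takes. Here is a minimal Python sketch; the paths are placeholders for illustration, and it is best run while Duplicati is idle, since this is a live SQLite file:

```python
import os
import shutil
from datetime import datetime

# Placeholder paths - adjust for your machine. On a Windows service
# install the file lives under the service account's profile instead.
src = os.path.expandvars(r"%APPDATA%\Duplicati\Duplicati-server.sqlite")
dst_dir = os.path.expandvars(r"%USERPROFILE%\OneDrive\DuplicatiConfig")

os.makedirs(dst_dir, exist_ok=True)
stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
dst = os.path.join(dst_dir, f"Duplicati-server-{stamp}.sqlite")

# Plain file copy. Remember this file holds credentials and passphrases,
# so the target folder should be protected (or encrypt the copy).
shutil.copy2(src, dst)
print(f"Copied {src} -> {dst}")
```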

So as long as I know these details the recovery would be just as fast as having those files available? It doesn’t have to faff around with rebuilding a database?

You say “options” in there. I try and be as close to default as possible.

And yeah, I think it is using the newest beta. Or I did try and get it to update to that.

If you have all the details recorded, then in the case of a disaster you’d have to reinstall Duplicati and then recreate your backup job with all the correct details. Then you can tell it to do a database recreation and let it go through that process.

If you save Duplicati-server.sqlite it saves you from having to redefine your backup jobs, but you still have to go through the database recreation process.

The only way to avoid the database recreation process is to have a backup of the job-specific sqlite files too. But those are usually quite large, and they would have to be backed up constantly - you don’t want to try to use an out-of-date one. It’s not worth the trouble now that database recreation is usually a pretty fast process.
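If you decide it’s worth it anyway, you don’t need a whole second backup job just for this. Duplicati has a --run-script-after advanced option that runs a script when a backup finishes, and that script can simply copy the job database somewhere safe. A rough Python sketch follows; the paths are placeholders (your job database name is random), you’d launch it from a small .bat wrapper on Windows, and you should check the Scripts section of the manual for the exact environment variables Duplicati exports:

```python
import os
import shutil
import sys

# Assumption: Duplicati exports DUPLICATI__EVENTNAME ("BEFORE"/"AFTER")
# to run-scripts; only copy once the backup has actually finished.
# (Defaults to "AFTER" so the sketch also works when run by hand.)
if os.environ.get("DUPLICATI__EVENTNAME", "AFTER") != "AFTER":
    sys.exit(0)

# Placeholder paths: the job database name is random - yours is shown
# on the job's Database page in the web UI.
JOB_DB = os.path.expandvars(r"%APPDATA%\Duplicati\ABCDEFGHIJ.sqlite")
TARGET = os.path.expandvars(r"%USERPROFILE%\OneDrive\DuplicatiDB\primary-job.sqlite")

os.makedirs(os.path.dirname(TARGET), exist_ok=True)
shutil.copy2(JOB_DB, TARGET)
```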

The About button will show: You are currently running Duplicati - 2.0.5.1_beta_2020-01-18
If you have logs emailed, you can check for: "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)"
2.0.5.1 is a lot better at running reliably (which avoids some of the pain of fixing things). The old code base was 14 months old.

On a Windows service install, I think it’s guaranteed that the Activate button doesn’t actually do the Activate; a service restart solves that. On a non-service install, Activate seems intermittent, per some reports.
The first problem had one attempted fix, which was later removed. The second is still somewhat unknown.

I don’t know how many different jobs you set up per computer, but if it’s one or only a few, you can also use Export To File (which can encrypt the export for you), and Import is easy if you ever have to start over on a new drive.

How to restore data and then setup backup again? was a good discussion that used a job Import. Metadata such as Last successful backup and sizes will be lost, but will return after the next backup. There’s also a chance to disable scheduled jobs before Save, which may avoid some messiness…

Whether you record key data somehow, or Export jobs, or copy/backup the server DB is up to you.

This is the step I was trying to avoid. If database recreation takes more than 20 minutes, it seems more logical to me to set up a separate backup task that backs up the database files AFTER each backup has completed. That office has fast 200 Mbps broadband, and uploading 60GB to OneDrive overnight is surprisingly easy, so topping up the odd GB or two here and there is fine. Especially if it will save hours in the case of a disaster.

Yes - I have stupid clients. One of the PCs there has a 40GB(!!) Outlook mail store. (Time for me to hit some House Cleaning in that office :wink: )

@ts678 I haven’t had time to play with it yet, but I did notice the clearer logs in the newer version. Very nice to see, as deciphering them in the past used to make my head spin. There are other threads here from me, and I had been nervous about putting Duplicati back into play in this place, but I am going to keep it up. This is an office that suffers from floods, being a basement with a leaky roof. And they had a break-in last month - but the burglars didn’t bother nicking the PCs as they are a bit too naff and old. :smiley:

But then you’d still have to recreate the database for THAT backup job before you can restore the database. :wink:

I used to actually do this, but now that database recreation is pretty fast for me, I stopped backing up the job sqlite files.

You should experiment to see how long a database recreation takes so you can judge the best approach for you.

For speed, I suppose it’s a question of whether the DB (which can get large) downloads faster than the individual dindex files (whose size varies according to what’s in the dblock, but you could look at a few).

2.0.5.1 still downloads a few dblocks due to a bug which is fixed in Canary (so will be fixed in the next Beta), but there’s still some small chance that Duplicati will go on a big dblock download, chasing a missing block.

Having an actual DB backup at least ensures a predictable time frame, if that’s an essential part of this. Whatever you do, practice it before the disaster so that you know the steps. And feel free to document somewhere. I’m not sure if there was ever a complete step-by-step How-To for backup/restore DB plan.

But I was hoping recreating that backup’s database would be fast, as it only contains two or three files? Therefore speeding up the main restore itself?

I’ll find some time to attempt a “disaster recovery”. In theory, according to your explanation, I should be able to do this from my own house on a fresh test PC, as long as I have the source, destination, credentials, encryption passphrase, options, etc.

But that list is already a bit confusing. What is “source”? I won’t know the list of folders being backed up as this is different on each machine. “Destination”, “credentials” and “encryption phrase” are simple as I already recorded these.

-=-
I’m looking at the three computers now, trying to make sense of what is going on. It turned out the boss’s PC has been failing its backups for months and is throwing me cryptic errors. :frowning: Even with the new interface it is still talking about “synthetic filelists” - which might as well be Croatian for all the sense I can make of that warning.

And an even more vague “failed to process path” error on the Outlook PST files even though the path is correct.

2020-02-21 12:06:43 +00 - [Warning-Duplicati.Library.Main.Operation.BackupHandler-SnapshotFailed]: Failed to create a snapshot: System.UnauthorizedAccessException: Attempted to perform an unauthorized operation.
at Alphaleonis.Win32.Vss.VssBackupComponents..ctor()
at Alphaleonis.Win32.Vss.VssImplementation.CreateVssBackupComponents()
at Duplicati.Library.Common.IO.VssBackupComponentsHelper.GetVssBackupComponents()
at Duplicati.Library.Snapshots.WindowsSnapshot..ctor(IEnumerable`1 sources, IDictionary`2 options)
at Duplicati.Library.Snapshots.SnapshotUtility.CreateWindowsSnapshot(IEnumerable`1 folders, Dictionary`2 options)
at Duplicati.Library.Main.Operation.BackupHandler.GetSnapshot(String sources, Options options)
2020-02-21 12:09:53 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (54, duplicati-b22833b46e5fa4cc58af5b3e0700d2304.dblock.zip.aes), but none was found?
2020-02-21 14:18:32 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: C:\Users\FRED\Documents\Outlook Files\FRED@DOMAIN.co.uk.pst
2020-02-21 14:18:33 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: C:\Users\FRED\Documents\Outlook Files\SHAREDBOX@DOMAIN.co.uk.pst

The email addresses have been anonymised. But I’m not sure what that is saying, as the file path is otherwise exactly correct. This is what lost me last time I tried to make sense of the errors. The newer report interface is quicker for finding these errors, but they still don’t make much sense.

I may just delete this whole backup and start again with it. The newly rebuilt PC seemed to complete its fresh backup last night without troubles. It looks like it kicked 30GB of data onto OneDrive in 6.5 hours. This is what I like about Duplicati - the ability to do a backup like that! :smiley:

This is why it is hard to make sense of the error logs. Sentences like this make no sense to us users :frowning:

And I have tried to help out by documenting stuff I have done before - an example being what I learnt trying to run Duplicati on Windows as a service. Those posts are now long lost in the forum.

Technical questions like this sometimes get technical answers. Simple version: look at the destination’s dindex file sizes if you like, with whatever tool shows sizes. Or, lacking that, benchmark both approaches and see which path is fast.

How the backup process works is the manual section that explains how a dindex file indexes a dblock file. You don’t need to know that; just know that in a Recreate, all the dlist and dindex files you see get downloaded, plus you might get some dblocks. If you instead restore a backed-up DB, dblock downloading WILL be needed, because restoring file content always comes from dblocks.
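If you want actual numbers rather than eyeballing, here is a quick Python sketch of the comparison, assuming the OneDrive destination folder is synced locally (all paths are placeholders):

```python
import os
from fnmatch import fnmatch

# Placeholder paths - point DEST at the (locally synced) backup
# destination and JOB_DB at the job database from the Database page.
DEST = os.path.expandvars(r"%USERPROFILE%\OneDrive\DuplicatiBackup")
JOB_DB = os.path.expandvars(r"%APPDATA%\Duplicati\ABCDEFGHIJ.sqlite")

def total(pattern):
    """Total size in bytes of destination files matching the pattern."""
    return sum(e.stat().st_size for e in os.scandir(DEST)
               if e.is_file() and fnmatch(e.name, pattern))

# A Recreate downloads every dlist and dindex file (plus maybe dblocks).
recreate = total("*.dlist.*") + total("*.dindex.*")
print(f"Recreate download: about {recreate / 2**20:.0f} MiB")
print(f"Job database:      about {os.path.getsize(JOB_DB) / 2**20:.0f} MiB")
```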

If users configure that part of the backup themselves (I don’t know who set these up), then that rules out the Export plan and leaves you with the back-up-of-Duplicati-server.sqlite plan, as that file should always be up to date.

The SnapshotFailed warning is possibly from not running with Administrator privileges actually in use, in which case Windows won’t create a VSS snapshot (with Duplicati’s --snapshot-policy=required, that would abort the backup). Running as a service as SYSTEM should not have that issue, but running at login could, even on an Administrator account: Windows will generally need a UAC prompt answered by someone before it will put Administrator permissions into use.

This setting controls the usage of snapshots, which allow Duplicati to back up files that are locked by other programs:

- off: Duplicati will not attempt to create a disk snapshot.
- auto: Duplicati attempts to create a snapshot, and fails silently if that was not allowed or supported.
- on: Duplicati attempts to create a snapshot, but produces a warning message in the log if it fails.
- required: Duplicati aborts the backup if the snapshot creation fails.

On Windows this uses the Volume Shadow Copy Service (VSS) and requires administrative privileges.
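As a quick way to see whether a process would actually have Administrator permission in use, here is a tiny Windows-only Python sketch; run it the same way Duplicati runs (e.g. at login versus as a service) to compare:

```python
import ctypes

# Reports whether this process is elevated, which is what VSS snapshot
# creation needs on Windows.
print("Running elevated:", bool(ctypes.windll.shell32.IsUserAnAdmin()))
```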

The synthetic filelist warning is not something you asked about, but because this is my third mention of it today (I hope someday it will be fixed): it indicates an interrupted backup (if that sheds any light), where Duplicati tries to build a version which is the old version plus whatever progress the new version had made before the interruption. The feature is still buggy…

I found one that you originated in a How-To. Posts in Support topics do tend to be hard to find later on. Searching is possible either in the forum or externally (e.g. Google), but lots of people just ask again…

Yes, this is true, and it was also my reasoning when I set up a secondary backup job whose sole function was to back up the database of the primary backup job. It does achieve this goal, but I personally stopped doing it with 2.0.5.1 beta.

The list of folders being backed up would be discoverable once you recreate the database and view past backups. But I would recommend you either back up the Duplicati-server.sqlite file or take exports of all your backup configs; it really makes recovering from a disaster easier. You don’t need to back these up very often, either: once is fine, and then update the copies if you change any of the config options.


Ehh? Lost me at the first sentence. “Simple version” isn’t simple when you talk about dindex sizes. Your description gets lost in technical tangles. Look at how a dictionary defines words - it does not use the word being defined as part of the definition.

I really do not have a clue how this product works and don’t really want to have to spend hours in the manuals trying to understand the language of the internals of the product.

Thank you. Now this bit I do understand, because you avoided Duplicati language and talked plain computer language. So basically the PC rebooted in the middle of a backup. Is there a way to tell Duplicati to just discard the corrupted backup now and try again another day? Or am I going to have to understand dblocks, dindexes and the d-command line to dfix it? When I get to this level of having to untangle the mess, I just abandon the whole backup and start again. My clients can’t afford to pay me to debug beta software. :wink:

When I can use versions I trust without so much technical work, I’ll be donating. But the more hours I burn untangling problems caused by the software, the less I use it, and the less inclined I am to donate.

Thanks. That is a clear answer I can understand. So if I set up a new backup, then copy Duplicati-server.sqlite (or the files in %APPDATA%\Duplicati), I’ll be ready for a disaster.

This is getting pretty strange, but I’ll bite. Where’s the technical tangle in the below? Let me dissect:

“destination” refers to the Duplicati Destination screen, and wherever you set Duplicati up to keep its backup files.

“dindex sizes” refers to files with dindex in the name. Although it’s not the extension, it’s like your reference today to:

“whatever tool shows sizes” refers to whatever can show file sizes. The other post spoke of OneDrive, so maybe Windows Explorer.
The one phrase I wish I’d said was “dindex file sizes”, but I did refer to “dindex files” three times before…

Although there are bugs around hard stops, ideally you can just start the backup again and it keeps going. There will be some cleanup activity and file scanning, but it should not upload data that was already uploaded…
There’s not an easy way to discard the interrupted backup, because it isn’t visible to the user until it’s finished.