So… somehow I managed to jack up my Duplicati install after a Windows update.
It looks like I may have muddled up the database connection, and I also don’t seem to have a copy of the job definitions.
Is there a way to attach an existing backup and continue to use it going forward? I’ve been hacking away at this for a few days now, and I’ve managed to delete my backup files twice (I attempted a database repair, only to find that it wiped my backup set). Fortunately, I had a copy of it.
I feel like I’m rambling a bit, but basically: I don’t have a copy of the job definitions (a JSON file), and the SQLite DB might be messed up too. This all started after the last big Windows update.
I’ll put together a more detailed description of everything I’ve done, but I’m posting this in hopes that there is a process where I can just reattach and reuse my existing backups.
Is this a Windows service install (extra steps)? Windows version updates wipe Duplicati into \Windows.old
EDIT: Basically, make sure that you really lost everything you think you lost before trying to recreate anything.
There are at least two databases involved. Duplicati-server.sqlite holds your job definitions (what to back up, schedules, etc.). Each individual job also has its own database, with a seemingly random name, at the path shown on that job’s Database page.
An intact backup can work with Database Recreate to rebuild the job database. If you’ve lost track of what you backed up, you can piece it together from what’s in the Restore tree. Do you tend to customize heavily (filters, etc.)?
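If the job database is missing but the backup files at the destination are intact, the recreate can also be driven from the command line. A rough sketch only; the storage URL, database path, and passphrase below are placeholders, not values from your setup (use the path shown on your job’s Database page):

```shell
:: Rebuild the local job database from the remote backup files.
:: The storage URL and --dbpath are placeholders for your own values.
Duplicati.CommandLine.exe repair "file://X:\DuplicatiBackup" ^
  --dbpath="C:\DuplicatiDB\JOBDB.sqlite" ^
  --passphrase="your-backup-passphrase"
```

As I understand it, if `--dbpath` points at a file that doesn’t exist, repair recreates it from the destination; if the file exists, it tries to fix it in place.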
Yes, it was installed as a service. When I moved it to a service, I didn’t realize that you could use the --portable-mode option to help keep Windows updates from breaking things.
This might be part of my issue. I’m pretty sure I only copied 2 files from the Windows.old directory.
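For anyone hitting this later: as I understand it, arguments given at service install time are passed through to the server, so you can point its data folder somewhere outside the SYSTEM profile (and out of \Windows.old’s reach). The path here is just an example, not a recommendation:

```shell
:: Install Duplicati as a Windows service, keeping its databases in a
:: fixed location instead of the SYSTEM profile's AppData folder.
:: D:\DuplicatiData is an example placeholder path.
Duplicati.WindowsService.exe install --server-datafolder="D:\DuplicatiData"
```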
Here are the files that may be relevant:
12/03/2020 03:59 AM 10,264,072,192 NRGIGNKROV.sqlite
12/04/2020 05:14 AM 3,354,357,760 80746667816881868170.sqlite
12/04/2020 09:19 PM 176,128 Duplicati-server.sqlite
My backup job definition is pretty simple. I was able to double-check which folders are backed up by mounting the backup (restore from backup files). I think the only filter I use is the built-in “skip system and temp files” one.
Basically, I have 2 backup jobs. The file selection and any filter should be the same for both. The only difference is the target and the schedule. I alternate between targets every other day, skipping Sunday altogether.
I’ll dig into the links you supplied, but I thought I’d take the time to reply to provide some more details just in case it helps somehow.
I couldn’t quite tell whether you thought remote file damage had happened, but if (in retrospect) you think there was more than just a DB problem, you could run a restore with no-local-blocks to force it to fetch blocks from the destination instead of taking the shortcut of looking for blocks in the current source files.
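In CLI terms, the kind of test restore I mean looks roughly like this. The storage URL, restore path, and passphrase are placeholders:

```shell
:: Restore everything, forcing blocks to come from the destination
:: rather than from matching data in the current source files.
Duplicati.CommandLine.exe restore "file://X:\DuplicatiBackup" "*" ^
  --restore-path="C:\RestoreTest" ^
  --no-local-blocks=true ^
  --passphrase="your-backup-passphrase"
```

Restoring to a scratch folder (rather than over the originals) lets you verify that the destination data is actually readable end to end.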
I’m glad things are looking good, and hoping that someday the default service install will use a safe location:
I store one of my backups on a Samba share. When I was trying to generate a new job DB, I ran a repair. When it finished, it reported something along the lines of “XXXX files found on remote storage… something something…”
When I looked at the share, all the backup files were gone. Fortunately, I had made a copy before running the repair.
I got a good backup last night, so everything is back to normal.