How to refresh revisions imported from file and restore from another machine?

Hi,

I just did a big backup from Linux using v2.0.4.21-2.0.4.21_experimental_2019-06-28, then exported the config file and imported it on my Windows machine, including metadata (and disabled the schedules; this config is only meant for testing and, in the future, for doing restores).

At this point, I ran another backup on Linux. So now Linux has 2 backups, while Duplicati on Windows still says 1.

  1. I go to Restore on Windows, but after loading for a while, it only sees 1 backup, not 2. OK, it must need some sort of metadata refresh?
  2. I try the restore of the 1st version anyway just to test it, and get an error: Found 3 remote files that are not recorded in local storage, please run repair (I found a similar post here Found X remote files that are not recorded in local storage, please run repair · Issue #2449 · duplicati/duplicati · GitHub).

2019-08-07 17:25:21 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Restore has started
2019-08-07 17:25:21 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2019-08-07 17:25:43 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (1.92 KB)
2019-08-07 17:25:43 -07 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-ExtraUnknownFile]: Extra unknown file: duplicati-20190807T230440Z.dlist.zip.aes
2019-08-07 17:25:43 -07 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-ExtraUnknownFile]: Extra unknown file: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes
2019-08-07 17:25:43 -07 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-ExtraUnknownFile]: Extra unknown file: duplicati-b18792334cbd9483eb2c08ec0f1ea4aea.dblock.zip.aes
2019-08-07 17:25:43 -07 - [Error-Duplicati.Library.Main.Operation.FilelistProcessor-ExtraRemoteFiles]: Found 3 remote files that are not recorded in local storage, please run repair

So Duplicati can’t just auto-sync the latest remote backup info; I have to rebuild the local db? On top of that, it can’t even restore the 1st version it already knows about?
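For what it’s worth, the “extra unknown file” warnings above appear to come from comparing the remote listing against the local database. Here’s a minimal sketch of that comparison in Python, with made-up names (this is not Duplicati’s actual code, which is C#):

```python
# Hypothetical sketch of how a backup client can flag "extra" remote files:
# diff the remote listing against the files recorded in its local database.
# Function and variable names are made up for illustration.

def find_extra_remote_files(remote_listing, local_db_files):
    """Return remote files that the local database knows nothing about."""
    known = set(local_db_files)
    return sorted(f for f in remote_listing if f not in known)

# The Windows DB only knows about backup 1, so backup 2's files look "extra".
remote = [
    "duplicati-20190806T232509Z.dlist.zip.aes",  # backup 1 (in the local DB)
    "duplicati-20190807T230440Z.dlist.zip.aes",  # backup 2 (unknown locally)
]
local = ["duplicati-20190806T232509Z.dlist.zip.aes"]

extras = find_extra_remote_files(remote, local)
print(f"Found {len(extras)} remote files that are not recorded in local storage")
```

The key point is that the check is purely local-DB-centric: anything the DB doesn’t know about is treated as a problem, not as new information to absorb.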

  1. I initiate a repair. Here’s the result:

2019-08-07 17:27:13 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Repair has started
2019-08-07 17:27:15 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2019-08-07 17:27:34 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (1.92 KB)
2019-08-07 17:27:34 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20190807T230440Z.dlist.zip.aes (149.19 MB)
2019-08-07 17:27:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20190807T230440Z.dlist.zip.aes (149.19 MB)
2019-08-07 17:27:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes ()
2019-08-07 17:27:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes (106.29 KB)
2019-08-07 17:27:36 -07 - [Error-Duplicati.Library.Main.Operation.RepairHandler-FailedNewIndexFile]: Failed to accept new index file: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes, message: Unknown remote file duplicati-b18792334cbd9483eb2c08ec0f1ea4aea.dblock.zip.aes detected
System.Exception: Unknown remote file duplicati-b18792334cbd9483eb2c08ec0f1ea4aea.dblock.zip.aes detected
at Duplicati.Library.Main.Operation.RepairHandler.RunRepairRemote()
2019-08-07 17:27:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes (106.29 KB)
2019-08-07 17:27:37 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-i8e6d8d605a66409aa359b7044da48b52.dindex.zip.aes (106.29 KB)
2019-08-07 17:27:37 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-b18792334cbd9483eb2c08ec0f1ea4aea.dblock.zip.aes (155.08 MB)
2019-08-07 17:27:37 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-b18792334cbd9483eb2c08ec0f1ea4aea.dblock.zip.aes (155.08 MB)
2019-08-07 17:30:44 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Repair has started
2019-08-07 17:30:45 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2019-08-07 17:31:01 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (1.92 KB)
2019-08-07 17:31:15 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RebuildStarted]: Rebuild database started, downloading 1 filelists
2019-08-07 17:31:15 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20190806T232509Z.dlist.zip.aes (149.04 MB)
2019-08-07 17:31:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20190806T232509Z.dlist.zip.aes (149.04 MB)
2019-08-07 17:44:40 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FilelistsRestored]: Filelists restored, downloading 981 index files
2019-08-07 17:44:40 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-i95ea606b969f44bb91938076ac517f64.dindex.zip.aes (204.17 KB)
2019-08-07 17:44:41 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-i95ea606b969f44bb91938076ac517f64.dindex.zip.aes (204.17 KB)
2019-08-07 17:44:41 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-id4f6ce044ad648a0925954182b670cbc.dindex.zip.aes (50.84 KB)

2019-08-07 17:54:04 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RecreateCompletedCheckingDatabase]: Recreate completed, verifying the database consistency
2019-08-07 17:56:18 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RecreateCompleted]: Recreate completed, and consistency checks completed, marking database as complete

Cool, so at this point we should have a record of the 2nd backup, right? After all, I just rebuilt the db and it downloaded all the info from remote. But that’s not the case: the profile on the Windows machine still says there is only 1 backup, not 2, and I can’t restore the most recent one. Windows shows Backup: 231.00 GB / 1 Version, whereas Linux shows Backup: 231.29 GB / 2 Versions.

Is this scenario completely unsupported? I’m coming from Crashplan and other backup solutions, and some of Duplicati’s approaches are wild and confusing, and seem broken.

Thanks.

Now I really don’t get it. On Windows, I went to Restore and chose Direct restore from backup files ..., pointed it at Google Drive, added the token and the remote path, and had it load the remote listing. After it spent a while rebuilding the db, I’m seeing just 1 version, not 2:

[screenshot]

2019-08-07 23:05:57 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation List has started
2019-08-07 23:05:57 -07 - [Information-Duplicati.Library.Main.Operation.ListFilesHandler-NoLocalDatabase]: No local database, accessing remote store
2019-08-07 23:05:57 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2019-08-07 23:06:31 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (1.92 KB)
2019-08-07 23:06:31 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Repair has started
2019-08-07 23:06:31 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2019-08-07 23:06:50 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (1.92 KB)
2019-08-07 23:06:50 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RebuildStarted]: Rebuild database started, downloading 1 filelists
2019-08-07 23:06:50 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20190806T232509Z.dlist.zip.aes (149.04 MB)
2019-08-07 23:06:52 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20190806T232509Z.dlist.zip.aes (149.04 MB)
2019-08-07 23:15:39 -07 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RecreateOrUpdateOnly]: Recreate/path-update completed, not running consistency checks
2019-08-07 23:15:40 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation List has started

How’s this possible?

In comparison, here’s what the Linux backup shows:
[screenshot]

I just noticed that the Linux machine, which is the true source of the backups (currently 2 of them), tried to run a scheduled backup but ended up with this instead:
[screenshot]

I don’t even understand.

The discussion How to use the same external hard drive for Windows and Linux? might give you some pointers.

The problem being seen is likely exactly what was warned against here in the previous question/answer…

So the Windows machine has never seen backup 2, sees the additional files from backup 2 as extras, and “fixes” the inconsistency by making the remote look like what it knows about. That’s the end of backup 2. Direct restore won’t see backup 2 either, because its files are no longer on the remote.

The Linux machine’s DB tells it there are two backups, so it expects the corresponding remote files (it knows it changed nothing). The surprise comes when it tries to do something, checks the remote files in advance, and finds backup 2 deleted.
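The two-machine clash can be sketched like this (illustrative Python only; Duplicati’s real repair logic is C# and far more involved): each machine “repairs” the remote toward its own database, so the machine with the stale database deletes the newer backup.

```python
# Illustrative-only model of the destructive reconciliation described above.

class Machine:
    def __init__(self, name, db_files):
        self.name = name
        self.db = set(db_files)  # what this machine's local DB knows about

    def repair(self, remote):
        """Make the remote match the local DB by deleting unknown files."""
        extras = remote - self.db
        for f in sorted(extras):
            print(f"{self.name}: Delete - {f}")
        return remote - extras

remote = {"backup1.dlist", "backup1.dblock", "backup2.dlist", "backup2.dblock"}

# Windows imported its DB before backup 2 existed, so its DB is stale...
windows = Machine("Windows", {"backup1.dlist", "backup1.dblock"})
remote = windows.repair(remote)  # ...and "repair" deletes backup 2's files

# Linux still expects backup 2 and is surprised to find it gone.
linux = Machine("Linux", {"backup1.dlist", "backup1.dblock",
                          "backup2.dlist", "backup2.dblock"})
print("Linux finds missing:", sorted(linux.db - remote))
```

Neither machine is malfunctioning by its own lights; the damage comes from two databases disagreeing about what the remote should contain.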

Backup services such as CrashPlan presumably have server-side databases that allow things like an adoption process, where a new computer takes over the backup identity used by a previous computer.

Computer identities: How they work

Duplicati cannot rely on much from the remote, and certainly not on a specific API or database to help it. The remote is assumed to have only very basic capabilities: file put, get, list, delete, and folder create. This leaves Duplicati largely on its own to consult its local DB and periodically verify that the remote matches it.
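As a sketch of what “very basic capabilities” means, here is a hypothetical minimal interface in Python (Duplicati’s actual backend interface is defined in C# and named differently):

```python
# Hypothetical minimal backend interface: the only operations a remote
# is assumed to support. No server-side database, no queries, nothing else.
from abc import ABC, abstractmethod

class MinimalBackend(ABC):
    @abstractmethod
    def put(self, name: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, name: str) -> bytes: ...

    @abstractmethod
    def list(self): ...

    @abstractmethod
    def delete(self, name: str) -> None: ...

    @abstractmethod
    def create_folder(self, path: str) -> None: ...

class InMemoryBackend(MinimalBackend):
    """Toy implementation showing how little a client can ask of a remote."""
    def __init__(self):
        self.files = {}
    def put(self, name, data): self.files[name] = data
    def get(self, name): return self.files[name]
    def list(self): return sorted(self.files)
    def delete(self, name): del self.files[name]
    def create_folder(self, path): pass  # flat store; nothing to do
```

With only these five calls available, everything else (which blocks exist, which backups they belong to) has to live in each client’s local database, which is why two clients with different databases can disagree.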

EDIT:
Duplicati silently, permanently deleted backup from google drive - two-machine use case #3845
has some other thoughts about ways the problem might be tackled, but the collaboration goal there really isn’t a good fit for a backup program like Duplicati. I suspect CrashPlan doesn’t need a refresh to do another download at the other-system destination, but Duplicati (for the moment, anyway) refreshes the wrong way, “refreshing” the remote to match the local. I don’t know whether the repair rewrite will allow refreshing the DB.

Thanks for the explanation. I feel there isn’t enough warning against importing an existing profile for the purposes of a restore (as I noted, I immediately disabled backups on the Windows machine after the import and never ran a backup from there), and Duplicati itself suggests a repair as the way to reconcile (I assumed it would catch up on the differences and figure them out).

I’ve set up a backup from machine A and a restore to machine B using similar steps with Duplicacy, and thanks to differences in architecture and design, it had no issues restoring this way without messing up the backups (and, I might add, it did both almost an order of magnitude faster than Duplicati).

Not a knock on the devs here, just sharing my testing.

Probably the best thing for restore from a config is Restore from configuration just below “Direct restore”.

Going down the “Add backup” --> “Import from a file” path might not create the partial temporary database that the Restore tab options on the left create. I haven’t tested this lately, but if it offers a Recreate (since there’s no database), it will likely do a slower full Recreate so you can continue doing backups. The danger is if the exporting PC is also still doing backups, leading to the clash of two PCs backing up to one folder.

For your timing comparisons, do you recall whether you had Duplicati do a full Recreate? The restore options are usually faster, I think; however, since your backup was likely very new, it might not make much difference…

These are several years old, but might still be of interest:

Duplicati 2 vs. Duplicacy 2 (Duplicati forum)
Duplicacy vs Duplicati (Duplicacy forum)

One thing you can see there is that Duplicati by default does fine-grained deduplication with 100KB chunks, while Duplicacy uses (or at least used) 4MB chunks. I think Duplicacy also gains speed from parallel downloads. Duplicati is adding parallel uploads in the next true beta, but parallel downloads are only for Jottacloud, I think.

Actually, in my comparisons I used 10MB internal blocks with Duplicati, not the default 100KB (by the way, I was surprised that the Duplicati db only went down from 2GB to 1.3GB when I went from 100KB blocks to 10MB).
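One plausible reason the DB didn’t shrink proportionally (my speculation, sketched in Python): the block-hash side of the database shrinks roughly with block size, but the per-file path and metadata rows don’t change at all.

```python
# Sketch of fixed-size block deduplication. A 100x larger block size means
# roughly 100x fewer block-hash rows, but per-file path/metadata rows in
# the database stay the same, so total DB size shrinks far less than 100x.
import hashlib

def chunk_hashes(data: bytes, block_size: int):
    """Split data into fixed-size blocks and hash each one."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

data = bytes(10 * 100 * 1024)                   # 1,024,000 zero bytes
small = chunk_hashes(data, 100 * 1024)          # 100KB blocks -> 10 hashes
large = chunk_hashes(data, 10 * 1024 * 1024)    # 10MB blocks  -> 1 hash
print(len(small), len(set(small)), len(large))  # prints "10 1 1"
```

(The identical zero blocks also show why smaller blocks can deduplicate better: ten 100KB blocks collapse to a single unique hash here.)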

You can see the comparison here: Duplicacy demolishes duplicati and rclone for large backups - Praise - Duplicacy Forum. Duplicacy really blows away the competition.

https://twitter.com/ArtemR/status/1159392251992145920

I really wanted to like Duplicati more, and I gave it a fair shake for several days before finding duplicacy, but duplicacy’s robustness and speed have won me over.