What does it mean to report a duplicate path?

I have no idea how to respond to this log message. What is it a duplicate of?

Duplicate path detected: /Applications/Alfred 3.app/Contents/Services/Alfred Text Service.app/Contents/Resources/Base.lproj/!

I believe that warning message indicates that the path listed was found twice in the local Duplicati database. How that would happen I’m not sure, but my GUESS would be that a symlink, circular reference, or similar folder structure oddity might cause it.

As far as the implications of that, I don’t really know. Since it’s a warning and not an error, I expect it’s not a major issue. I wouldn’t know what to suggest for a response to it, though if I saw that in my backups I’d probably ignore it for a few runs and see if it goes away. If it didn’t, I’d come back here and ask whether I should be worried that it hasn’t resolved itself.

I recently had this with a different program which set up a symlink when I installed it. The warning didn’t go away, but the folder was backed up, and since I didn’t need the symlink backed up I set the backup to ignore the path ending with the !.
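In case it helps anyone, a filter like that can also be passed on the command line via Duplicati’s --exclude option. This is only a sketch: the paths are made up for illustration, and the trailing “/” marks the filter as a folder filter.

```shell
# Made-up destination and source paths, for illustration only:
# skip a symlinked folder so it no longer triggers the warning.
duplicati-cli backup file:///mnt/backupdest /home/me \
    --exclude="/home/me/linked-folder/"
```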


Ooh, good catch. I thought the “!” was added by the poster. :blush:

After upgrading to on OSX, I’m getting this error consistently on my home directory. The duplicate path is /Users/<name>/! That trailing exclamation point is not me, that is what duplicati is reporting.

I have tried removing home, saving the config, and then re-adding it. But I get the same error on every backup run.

I had the same thing today and had to delete and rebuild the database to get rid of the message. I am running on Windows Server 2016.

So it’s not version (both & ) or OS (both MacOS & Windows) dependent, but not super common. Odd… just did the same thing

Experimental just did it to my backup, on Linux.

[Warning-Duplicati.Library.Main.Database.LocalDatabase-DuplicatePathFound]: Duplicate path detected: /etc/rc0.d/!

It is not an option for me to delete the database and recreate it.

Below what I see in the restore tab: is it normal to see that folder named “/” inside rc0.d? Could it be the culprit?

I do not see it in the “Source data” tab of the job, so I do not know how to remove it.

I realized that the Restore operation always shows such " / " folders, so this is not related to the duplicate path problem.

I tried to exclude the duplicate path from the job with either “Exclude directories whose names contain” or “Exclude folder”, applied to “/etc/rc0.d/!”, to no avail.

This is not a critical issue, but having a warning at each backup forces me to check every time whether there are other, more serious warnings.

From duplicati/LocalDatabase.cs at master · duplicati/duplicati · GitHub it is clear that the “!” is not part of the name of the directory.

Repairing the database does not clear the warning, and I would rather avoid recreating it.

So I excluded from backup “/etc/rc0.d/” (without ending “!”), and the warning went away. Then I added “/etc/rc0.d/” again and the warning came back. This is frustrating.

I tried to have a look at duplicati/LocalDatabase.cs at master · duplicati/duplicati · GitHub . I am no expert in mono, but I tried to read lines 1080 to 1100. Am I correct that the warning is thrown when two subsequent entries in the database, path and lastpath, are equal?
So, does anybody have a clue why I get the warning twice on “/etc/rc0.d/”? It would mean I have three subsequent entries with path “/etc/rc0.d/”, but I am pretty confident I do not have symlinks to “/etc/rc0.d/” in my filesystem.
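If that reading is right, the check can be sketched like this. This is a toy Python rendering of my understanding of those lines, not the actual C# code; the message format is copied from the warning above.

```python
# Toy sketch (my reading of LocalDatabase.cs lines ~1080-1100, not the
# real Duplicati code): rows come back from the SQL query ordered by
# path, and a warning fires whenever the current path equals the last.
def find_duplicate_warnings(ordered_paths):
    warnings = []
    lastpath = None
    for path in ordered_paths:
        if path == lastpath:
            warnings.append("Duplicate path detected: %s!" % path)
        lastpath = path
    return warnings

# Three consecutive rows with the same path would indeed produce the
# warning twice:
print(find_duplicate_warnings(["/etc/rc0.d/", "/etc/rc0.d/", "/etc/rc0.d/"]))
```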

Also, I would remove the “!” on line 1092, because it confuses users.

@JonMikelV sorry to ping you, I fear nobody is seeing this post, since OP did not flag it as Support

That’s ok - it’s been on my list to get back to, I just hadn’t gotten there yet. :blush:

Thanks for digging into the code! (For the record, that’s C# code with the .NET framework, mono is the tool that lets .NET run on Linux/MacOS.)

From what version did you update? If it’s or newer, you could try downgrading to see if the error goes away. That could tell us whether the issue is stored in the database or just in the code.

Thanks, and sorry again for pinging, I noticed there are multiple open threads in the forum…

Unfortunately I migrated from previous beta to, and now I am on the new beta
In any case, the first post was in January and predates

OK - that lines up with what I’m seeing in the code where the “duplicate file” is being found in the database.

As a workaround, I’m guessing we can determine which backup version contains the duplicate and delete it, which should make the error go away.
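Something like the following might locate it. This is a toy sqlite3 reproduction in Python, using only table and column names visible in the query posted later in this thread (“FilesetEntry”, “File”, “FileID”, “FilesetID”) plus an assumed “Path” column on “File”; the sample rows are invented, so treat the SQL as a sketch rather than something verified against the real Duplicati schema.

```python
import sqlite3

# Miniature stand-in for the real database: two File rows that share a
# path, referenced by two different backup versions (FilesetID 7 and 8).
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "File" ("ID" INTEGER, "Path" TEXT)')
con.execute('CREATE TABLE "FilesetEntry" ("FilesetID" INTEGER, "FileID" INTEGER)')
con.execute("INSERT INTO \"File\" VALUES (1, '/etc/rc0.d/'), (2, '/etc/rc0.d/')")
# Backup version 7 references the same path twice; version 8 only once.
con.execute('INSERT INTO "FilesetEntry" VALUES (7, 1), (7, 2), (8, 1)')

# List (backup version, path) pairs that occur more than once.
dupes = con.execute('''
    SELECT "A"."FilesetID", "B"."Path", COUNT(*) AS "N"
    FROM "FilesetEntry" A, "File" B
    WHERE "A"."FileID" = "B"."ID"
    GROUP BY "A"."FilesetID", "B"."Path"
    HAVING COUNT(*) > 1
''').fetchall()
print(dupes)  # -> [(7, '/etc/rc0.d/', 2)]
```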

Of course that’s not a solution for whatever caused the problem in the first place…

Do you have access to an sqlite database reader? :wink:

    SELECT
        ""G"".""Path"",
        ""G"".""FirstMetaBlockHash"",
        ""H"".""Hash"" AS ""MetablocklistHash""
    FROM
        (
        SELECT
            ""B"".""Path"",
            ""F"".""Hash"" AS ""FirstMetaBlockHash"",
            ""C"".""BlocksetID"" AS ""MetaBlocksetID""
        FROM
            ""FilesetEntry"" A, 
            ""File"" B, 
            ""Metadataset"" C, 
            ""Blockset"" D,
            ""BlocksetEntry"" E,
            ""Block"" F
        WHERE
            ""A"".""FileID"" = ""B"".""ID"" 
            AND ""B"".""MetadataID"" = ""C"".""ID"" 
            AND ""C"".""BlocksetID"" = ""D"".""ID"" 
            AND ""E"".""BlocksetID"" = ""C"".""BlocksetID""
            AND ""E"".""BlockID"" = ""F"".""ID""
            AND ""E"".""Index"" = 0
            AND (""B"".""BlocksetID"" = ? OR ""B"".""BlocksetID"" = ?) 
            AND ""A"".""FilesetID"" = ?
        ) G
    LEFT OUTER JOIN
        ""BlocklistHash"" H
    ON
        ""H"".""BlocksetID"" = ""G"".""MetaBlocksetID""
    ORDER BY
        ""G"".""Path"", ""H"".""Index""

Thank you. I used sqlitebrowser on my Duplicati backup database. I had to substitute “” with " in your script. The result, however, is

0 Rows returned

Did you substitute any of the ? in the SQL with anything?

Each of those corresponds (in order) to a parameter at the end of the SQL definition. The trick is knowing what the parameter value is when the error is happening.
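To illustrate the positional binding, here is a self-contained Python/sqlite3 toy with a made-up two-column table; only the mechanics of the ? placeholders carry over to the real query.

```python
import sqlite3

# Made-up miniature table; only the "?" binding mechanics matter here.
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "File" ("ID" INTEGER, "BlocksetID" INTEGER)')
con.execute('INSERT INTO "File" VALUES (1, -100), (2, 5)')

# The tuple supplies the placeholders in order, just like the three "?"
# at the end of the big query are filled from its parameter list.
rows = con.execute(
    'SELECT "ID" FROM "File" WHERE "BlocksetID" = ? OR "BlocksetID" = ?',
    (-100, -200),
).fetchall()
print(rows)  # -> [(1,)]
```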

I recall somebody updating the error handler to allow printing the parameter values, but I don’t know if it is in regular releases yet.

Log sql variables #3314 has been in canary awhile, and it should be in the recent experimental and beta. Adding the --log-file option with --log-file-log-level=Profiling would get you a query that’s filled in, however I wonder if the code where the error occurs might be trying to use the SQL query to make a dlist file – and disliking what it got. If it made a backup anyway, it would be on the “Restore from” dropdown as number 0.
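For reference, a full command line with those options might look like this on Linux. The destination URL and source folder are placeholders, not taken from this thread; only the two --log-file* options are the point.

```shell
# Placeholder backup job; the two --log-file* options are what matter.
duplicati-cli backup file:///mnt/backupdest /etc \
    --log-file=/tmp/duplicati.log \
    --log-file-log-level=Profiling
```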

Using --log-file-log-level=Verbose is rather noisy (but less so than Profiling), and might give an idea about how paths seemingly get picked up twice, unless somehow it’s in the imagination of the SQL query.

Example output where I updated a file date. Maybe you’ll be able to see something being noticed twice…

2018-12-03 18:24:46 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: C:\BackThisUp\test.txt
2018-12-03 18:24:47 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: C:\BackThisUp\test.txt
2018-12-03 18:24:47 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\BackThisUp\test.txt, new: False, timestamp changed: True, size changed: False, metadatachanged: True, 12/3/2018 11:24:16 PM vs 11/30/2018 1:01:21 AM
2018-12-03 18:24:50 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-FileMetadataChanged]: File has only metadata changes C:\BackThisUp\test.txt

Since I do not know any SQL, I thought it was the definitive script to launch. Could you point me to a basic reference on SQL?

Actually, you got the SQL right; in this case it’s the C# variables that tripped you up. :blush:

The best way to get what we need is to use @ts678’s suggestion and add the --log-file=[path] and --log-file-log-level=Profiling (or Verbose) parameters.

That should generate a file at [path] with more detailed information - including actual SQL commands with parameter values (not just placeholders).

That SQL can then be run against the database like you already did, and should produce more meaningful results.