Force detailed log of all backup files and missing files

Sorry for what is a newbie question, but after 6 months of running Duplicati I am unable to force the software to detail the filenames of ALL the files it is backing up. How do I do this, please?

Additionally, it is confusing to me that when I select ALL files on a volume (136GB), Duplicati reports having backed up only about 49GB in total.

Are you sure the file enumeration tool is actually finding all the files on this volume? Hence my first question. IS there a way to query the remote storage server and generate a report of the backed up files?

Thank you.

Find (new manual)
The FIND command (old manual, which explicitly talks about * if you want to get all the files)
Using the Command line tools from within the Graphical User Interface is a good way to run these commands.
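For example, a listing of every file in the backup might look something like this (a sketch only; substitute your backup's actual target URL and any needed options, e.g. from the job's Export as Command-line):

Duplicati.CommandLine.exe find <storage-URL> "*"

In the GUI Commandline the equivalent is to pick the find command from the dropdown (if present in your version) and put * in the arguments box.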

Compression and deduplication can reduce size, and the degree depends on the content of the files.

You can do that manually, e.g. in File Explorer, and see if the count is similar to what the job log said:

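If File Explorer is too slow with that many files, a small script can produce the same count and total size for comparison. A rough sketch, assuming Python is available on the machine (adjust the root to the volume you selected for backup):

import os

root = "Q:\\"  # adjust to the volume you selected for backup
count = 0
total_bytes = 0
for dirpath, dirnames, filenames in os.walk(root):
    for name in filenames:
        try:
            total_bytes += os.path.getsize(os.path.join(dirpath, name))
            count += 1
        except OSError:
            pass  # unreadable files are skipped here; Duplicati would warn about them
print(count, "files,", round(total_bytes / 2**30, 2), "GiB")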

Unfortunately the backups that were configured are NOT enumerating all the files on this volume. I do not know why this is happening. Every time I use the find CLI to look at a backup, it is missing a large number of files.

Below is the exported JSON config file.

{
  "CreatedByVersion": "2.1.0.2",
  "Schedule": {
    "ID": 1,
    "Tags": [
      "ID=3"
    ],
    "Time": "2025-01-04T02:00:00Z",
    "Repeat": "2D",
    "LastRun": "2025-01-02T14:40:33Z",
    "Rule": "AllowedWeekDays=Monday,Wednesday,Friday",
    "AllowedDays": [
      "Monday",
      "Wednesday",
      "Friday"
    ]
  },
  "Backup": {
    "ID": "3",
    "Name": "Backup ALL incl AJRT AJDM JEM JOM Folders from Production",
    "Description": "",
    "Tags": [],
    "TargetURL": "ftps://xxxxx2200.is.cc/backup/jemjom?auth-username=xxxxx",
    "DBPath": "C:\\Users\\xxxxx\\AppData\\Local\\Duplicati\\CVFIVDUATD.sqlite",
    "Sources": [
      "Q:\\"
    ],
    "Settings": [
      {
        "Filter": "",
        "Name": "encryption-module",
        "Value": "",
        "Argument": null
      },
      {
        "Filter": "",
        "Name": "compression-module",
        "Value": "zip",
        "Argument": null
      },
      {
        "Filter": "",
        "Name": "dblock-size",
        "Value": "50mb",
        "Argument": null
      },
      {
        "Filter": "",
        "Name": "--no-encryption",
        "Value": "true",
        "Argument": null
      },
      {
        "Filter": "",
        "Name": "--exclude-files-attributes",
        "Value": "temporary,system,hidden",
        "Argument": null
      }
    ],
    "Filters": [],
    "Metadata": {
      "LastBackupDate": "20250103T184952Z",
      "BackupListCount": "64",
      "TotalQuotaSpace": "0",
      "FreeQuotaSpace": "0",
      "AssignedQuotaSpace": "-1",
      "TargetFilesSize": "123028375302",
      "TargetFilesCount": "4802",
      "TargetSizeString": "114.58 GB",
      "SourceFilesSize": "49711132369",
      "SourceFilesCount": "19355",
      "SourceSizeString": "46.30 GB",
      "LastBackupStarted": "20250103T184952Z",
      "LastBackupFinished": "20250103T185823Z",
      "LastBackupDuration": "00:08:31.3462085",
      "LastCompactDuration": "00:00:12.9290476",
      "LastCompactStarted": "20250103T185712Z",
      "LastCompactFinished": "20250103T185725Z",
      "LastErrorDate": "20241216T214215Z",
      "LastErrorMessage": "Backup aborted since the source path Q:\\ does not exist. Please verify that the source path exists, or remove the source path from the backup configuration, or set the allow-missing-source option."
    },
    "IsTemporary": false
  },
  "DisplayNames": {
    "Q:\\": "Q:"
  }
}

Probably due to the exclude you asked for:

That causes files to not be backed up, as requested. Try turning all that off. It should help.

EDIT 1:

In File Explorer, you can see file attributes by right-clicking a column header to turn that column on.
They are what Windows or the application decided to set, and not a decision by Duplicati.

EDIT 2:

File Attribute Constants are some guidelines to what these mean.
FileAttributes Enum is a higher level view likely used by Duplicati.
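If it helps to see which files on the source actually carry those attributes, here is a rough sketch (Windows-only, assuming Python; adjust the root path) that lists files marked hidden, system, or temporary:

import os
import stat

FLAGS = {
    "hidden": stat.FILE_ATTRIBUTE_HIDDEN,
    "system": stat.FILE_ATTRIBUTE_SYSTEM,
    "temporary": stat.FILE_ATTRIBUTE_TEMPORARY,
}

root = "Q:\\"  # the backup source to inspect
for dirpath, dirnames, filenames in os.walk(root):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            attrs = os.stat(path).st_file_attributes  # Windows-only field
        except OSError:
            continue
        hits = [label for label, flag in FLAGS.items() if attrs & flag]
        if hits:
            print(",".join(hits), path)

Any file printed here is one that --exclude-files-attributes=temporary,system,hidden would skip.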

EDIT 3:

On the Source screen, you likely checked every Exclude attribute:

Thanks. The problem is that we must exclude temp, hidden and system files! Why would this remove large numbers of valid files?

Additionally, I created a brand new backup to determine what is happening, and with that new configuration Duplicati seems to be seeing many more files.

Finally, it seems that Duplicati is creating large files on the source volume, and they are taking up all the space on that volume. Is there a way to have it create smaller files and send them out to the storage server in chunks, instead of filling my volume?

Thank you!

Why?

Probably because you excluded them per requirement. Check attributes.
If the attribute you set is present, then your exclusion will remove the file.

The default is to not Exclude based on attributes, as it can be dangerous.

Got a filename? Temporary files should be in the Temp folder on C:, and your source is Q:.

I looked to see if your Options had a big Remote volume size. Seems not.
Even the files in C:\Users\xxxxx\AppData\Local\Temp should only be about 50 MB each.
How large are the files you are seeing? Is the free space super tight?

There are ~140,000 files on this volume. The enumeration does not locate them all. I have tried creating brand new backups without changing the excludes, and I am still missing files. As for disk space, does Duplicati create any other temp files on the volume being copied? Yes, space is tight on the source volume.

Also check Settings to see if you put excludes there. Also please look at file attributes.
If you see any other pattern to what’s missing and what’s not, then please supply that.

Only if the volume is where the user’s Temp folder is (so likely C:) as explained already.

tempdir can relocate temporary files, but I don’t know what you saw on Q:.

Predefined filter groups is a path-based (not attribute-based) way to Filter.
It’s probably more oriented towards stuff on the C: drive you may not want.
You can get help on filter-groups in GUI Commandline to see details.
Who decides what "valid files" are? Duplicati is very customizable.

EDIT 1:

Of course don’t change Filters either, or you’re inviting files to be filtered out.

Maybe I am missing something important here. Does this software compare the remote backup to the local source volume and only do an incremental backup? For example, if I had picked directories 1,2,3,4 to back up originally, then I edit the backup and add directories 5,6,7,8, will Duplicati back up ALL directories 1-8 or will it just back up 5-8? Is there a way to force a complete backup across all files? I ask because the CLI find option only finds directories 5-8 when it lists files!!

Basically, but it looks at database records of the remote backup (since that is much faster).

You can double-check on the CLI by simply browsing the tree of the latest Restore version.
Only changes are backed up in terms of upload, but anything still selected should still exist.
Basically, any version should be exactly like what you asked for at the time that backup ran.
Although only changes get uploaded (they might even be pieces of files), those details aren't shown here.

That's exactly how it works. The chunk size is the Remote volume size on the Options screen, 50 MB by default.
These should flow through the Temp folder on C: though. How Backup Works explains this in technical detail.

Listing contents 0 (1/3/2025 8:10:19 PM):
C:\tmp\add\1\ 
C:\tmp\add\2\ 
C:\tmp\add\3\ 
C:\tmp\add\4\ 
C:\tmp\add\5\ 
C:\tmp\add\6\ 
C:\tmp\add\7\ 
C:\tmp\add\8\ 
Return code: 0

The one before the latest is called version 1, so the original backup looks like this for me:

Listing contents 1 (1/3/2025 8:09:31 PM):
C:\tmp\add\1\ 
C:\tmp\add\2\ 
C:\tmp\add\3\ 
C:\tmp\add\4\ 
Return code: 0

With that number of files, it is hard to do anything manually. Generally, if a file is skipped due to an error (permission denied, file locked, etc.) it will generate a warning. If it is skipped due to configuration (filters, symlink policy, etc.) it will not generate a warning.

Compare file paths

How did you count the 140,000 files?
If you can get a list of all the paths that are in that number, you can compare it to the paths in Duplicati. To do this, get SQLiteBrowser (or a similar tool) and query the “File” view, like:

SELECT DISTINCT Path FROM File;

You can then export to csv or similar and compare with the list you are expecting. Hopefully there is a pattern in the missing files.
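If comparing two long lists by hand is impractical, here is a rough sketch of the same comparison in Python (assumptions: the database path from the job export above, the File view queried as shown, and that Duplicati is stopped or you work on a copy of the database):

import os
import sqlite3

db_path = r"C:\Users\xxxxx\AppData\Local\Duplicati\CVFIVDUATD.sqlite"  # from the job export; copy it first if Duplicati is running
root = "Q:\\"                                                          # the backup source

con = sqlite3.connect(db_path)
backed_up = {row[0] for row in con.execute("SELECT DISTINCT Path FROM File")}
con.close()

on_disk = set()
for dirpath, dirnames, filenames in os.walk(root):
    for name in filenames:
        on_disk.add(os.path.join(dirpath, name))

missing = sorted(on_disk - backed_up)
print(len(on_disk), "on disk;", len(backed_up), "paths in database;", len(missing), "not in database")
for path in missing[:50]:  # print a sample to look for a pattern
    print(path)

Folder entries in the File view may carry a trailing separator, so the counts are only a rough comparison, but any large block of missing files should stand out.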

Log what happens

An alternative is to go with logging. Set the advanced options --log-file=<path to a file> and --log-file-log-level=Verbose and then run the job.

This will create quite a lot of noise, but you should see every single path in there with some information, like Excluding path due to ....

You can limit it a bit with:

--log-file-log-filter=Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess*
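Since even the filtered log can be large, a quick sketch (assuming Python, and whatever path you gave to --log-file; the path below is just a placeholder) to pull out only the exclusion messages:

log_path = r"C:\tmp\duplicati-verbose.log"  # placeholder: use the path you set with --log-file
with open(log_path, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        if "Excluding path" in line:
            print(line.rstrip())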

What are the differences between those two? Same machine? Same version of Duplicati? Same user?

Best guess here is that you are using a large volume size and/or a large setting for asynchronous uploads and compressors.

The volume size is set on the last page of the backup configuration, or otherwise with --volume-size, and defaults to 50MiB.

The compressors and uploaders can be limited with:

--concurrency-compressors=1
--asynchronous-upload-limit=1
--asynchronous-concurrent-upload-limit=1

(The last one will be removed in the future, in favor of the previous one)

They all default to the CPU core count / 2.

As described in another topic, you get: volumes = compressors + uploaders.

If you have 10 cores and 1GiB volume size, you get roughly 10 GiB temporary files (assuming transfer is the bottleneck).
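As a back-of-the-envelope check of that estimate (a sketch of the arithmetic described above, not a Duplicati API):

cpu_cores = 10
volume_size_gib = 1                 # Remote volume size from the Options screen
compressors = cpu_cores // 2        # --concurrency-compressors default
uploaders = cpu_cores // 2          # --asynchronous-upload-limit default

volumes_in_flight = compressors + uploaders
print("about", volumes_in_flight * volume_size_gib, "GiB of temporary volume files")

With the default 50 MiB volume size, the same math gives only roughly 0.5 GiB of temporary files.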

Is that Q: or C:? Your job export had Q: as the source. You didn't say what the new test's source is.

Thanks. The source volume is the Q: volume. It is a mapped drive to a NetWare volume.

I used the same volume. The “new test” was an attempt to create a completely new config file versus importing a previous config file to remember the server settings, etc.

As for the backup files… Below is a listing of the entire file set with the find CLI command and * as the option. It is below my signature.

What I have found is that this backup is broken. It still points to the original backup folder and does NOT seem to add new files…

ftps://xxxx.is.cc//backup/jemjom?auth-username=xxxxx&auth-password=xxxxx

Here is what I would expect…and please correct me if I am in the weeds.

  1. Backup from Q:\ volume and I selected the entire Q: volume at the top of the file select screen

  2. This should select ALL the files on the Q: volume regardless of their file type, status, etc. (I turned off any filters for temp, hidden and system files)

  3. Any NEW files added between backups would be automatically picked up at the next backup

  4. If I run the command line with find and *, I should see all the files from the Q: volume in the backup.

NOTE: I have done this with several other volumes and they work perfectly. There seems to be some persistence built into the program somewhere dealing with the Q: Volume.

As a test, I will map the same NetWare volume to a different letter and try it.

Thanks. I really want this to work and succeed, as the concept of Duplicati is great and I am just discovering how powerful it can be.

cheers!

Richard A. DeVito, Jr

If it’s always Q, then Duplicati Temp files on C: shouldn’t bother it (might bother C: though).

Interesting new info, adding another point of possible failure, or maybe even another writer.

???

If you made a new test backup in the old folder, it should immediately error from seeing the old files.
Each backup needs its own folder.

FTP in 2.1.0.2 is having some issues, but it should be complaining if it’s hitting any FTP issue.

You could still have something in the Filters or even Settings, but you’d likely know if you did it.

Only if NetWare makes the files visible, if you mean added to NetWare. Earlier you were testing adding folders to an existing backup, which in my test meant checkmarking additional Source folders that had been there all the time.

So you find that Q: behaves differently, and blame Duplicati for persistence? Why not NetWare?
If you remove “built into the program” (is this pointing at Duplicati), then I wouldn’t be protesting.

Some remote file protocols may not immediately reveal changes to clients. Did you confirm their visibility some other way before having Duplicati go looking? This probably wouldn't explain files disappearing, but slow folder appearance could make files be missed if Duplicati already looked.

AFAIK there’s persistence only in that it’s not going to look everywhere constantly during backup.

First, please stop personalizing my comments. I am trying to debug this and give you as much detail as I can. This issue has been a problem for 6+ months. I have done what you asked using the find command and have determined that only a small subset of files is being backed up. IF you are doing the standard system calls for enumeration, you should not care what path or files the filesystem returns. That said, I just ran this backup with the new N: mapping and here is the log that came back. What does not make sense is that the status field said that it got to ~115GB of data and ~65000 files, yet if you look at the backup, it only shows the following!! You can clearly see it is only backing up 3.5GB of data. And it seems to be starting somewhere down the file list.

IS THERE a way to force Duplicati to do a FULL backup each time? Maybe you are looking at the "backed up" bit to determine which files to back up? If so, can you force a full backup?

Last successful backup:
Today at 12:15 PM (took 00:33:08)

Next scheduled run:
Today at 1:00 PM

Source:
4.07 GB

Backup:
3.30 GB / 1 Version

Here is the complete log file!

        {

“DeletedFiles”: 0,
“DeletedFolders”: 0,
“ModifiedFiles”: 0,
“ExaminedFiles”: 2930,
“OpenedFiles”: 2866,
“AddedFiles”: 2866,
“SizeOfModifiedFiles”: 0,
“SizeOfAddedFiles”: 4178457842,
“SizeOfExaminedFiles”: 4374752222,
“SizeOfOpenedFiles”: 4178457842,
“NotProcessedFiles”: 0,
“AddedFolders”: 378,
“TooLargeFiles”: 0,
“FilesWithError”: 0,
“ModifiedFolders”: 0,
“ModifiedSymlinks”: 0,
“AddedSymlinks”: 0,
“DeletedSymlinks”: 0,
“PartialBackup”: false,
“Dryrun”: false,
“MainOperation”: “Backup”,
“CompactResults”: null,
“VacuumResults”: null,
“DeleteResults”: null,
“RepairResults”: null,
“TestResults”: {
“MainOperation”: “Test”,
“VerificationsActualLength”: 3,
“Verifications”: [
{
“Key”: “duplicati-20250106T164218Z.dlist.zip.aes”,
“Value”: []
},
{
“Key”: “duplicati-i22bea8991f4a4750ac63c8754371134e.dindex.zip.aes”,
“Value”: []
},
{
“Key”: “duplicati-bcedd525c382e47fbade24f1363acdd1a.dblock.zip.aes”,
“Value”: []
}
],
“ParsedResult”: “Success”,
“Interrupted”: false,
“Version”: “2.1.0.2 (2.1.0.2_beta_2024-11-29)”,
“EndTime”: “2025-01-06T17:15:22.335385Z”,
“BeginTime”: “2025-01-06T17:15:15.9118607Z”,
“Duration”: “00:00:06.4235243”,
“MessagesActualLength”: 0,
“WarningsActualLength”: 0,
“ErrorsActualLength”: 0,
“Messages”: null,
“Warnings”: null,
“Errors”: null,
“BackendStatistics”: {
“RemoteCalls”: 144,
“BytesUploaded”: 3540279535,
“BytesDownloaded”: 51816007,
“FilesUploaded”: 139,
“FilesDownloaded”: 3,
“FilesDeleted”: 0,
“FoldersCreated”: 0,
“RetryAttempts”: 0,
“UnknownFileSize”: 0,
“UnknownFileCount”: 0,
“KnownFileCount”: 139,
“KnownFileSize”: 3540279535,
“LastBackupDate”: “2025-01-06T11:42:18-05:00”,
“BackupListCount”: 1,
“TotalQuotaSpace”: 0,
“FreeQuotaSpace”: 0,
“AssignedQuotaSpace”: -1,
“ReportedQuotaError”: false,
“ReportedQuotaWarning”: false,
“MainOperation”: “Backup”,
“ParsedResult”: “Success”,
“Interrupted”: false,
“Version”: “2.1.0.2 (2.1.0.2_beta_2024-11-29)”,
“EndTime”: “0001-01-01T00:00:00”,
“BeginTime”: “2025-01-06T16:42:14.1777115Z”,
“Duration”: “00:00:00”,
“MessagesActualLength”: 0,
“WarningsActualLength”: 0,
“ErrorsActualLength”: 0,
“Messages”: null,
“Warnings”: null,
“Errors”: null
}
},
“ParsedResult”: “Warning”,
“Interrupted”: false,
“Version”: “2.1.0.2 (2.1.0.2_beta_2024-11-29)”,
“EndTime”: “2025-01-06T17:15:22.6353594Z”,
“BeginTime”: “2025-01-06T16:42:14.1777065Z”,
“Duration”: “00:33:08.4576529”,
“MessagesActualLength”: 290,
“WarningsActualLength”: 637,
“ErrorsActualLength”: 0,
“Messages”: [
“2025-01-06 11:42:14 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started”,
“2025-01-06 11:42:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()”,
“2025-01-06 11:42:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: ()”,
“2025-01-06 11:43:06 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba61ad178bc93453bab7e77ad11451894.dblock.zip.aes (49.19 MB)”,
“2025-01-06 11:43:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-be76569029bde4e89922d9125aa151a44.dblock.zip.aes (49.68 MB)”,
“2025-01-06 11:43:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b301886cf64fd4959b91062f70bd7d6dc.dblock.zip.aes (49.43 MB)”,
“2025-01-06 11:43:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ba61ad178bc93453bab7e77ad11451894.dblock.zip.aes (49.19 MB)”,
“2025-01-06 11:43:13 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i28b182c8741746da98b0a0e3d1f242e3.dindex.zip.aes (4.56 KB)”,
“2025-01-06 11:43:14 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i28b182c8741746da98b0a0e3d1f242e3.dindex.zip.aes (4.56 KB)”,
“2025-01-06 11:43:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b301886cf64fd4959b91062f70bd7d6dc.dblock.zip.aes (49.43 MB)”,
“2025-01-06 11:43:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-if20c7134a7ae41f2b2682e99a725afe3.dindex.zip.aes (5.48 KB)”,
“2025-01-06 11:43:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-if20c7134a7ae41f2b2682e99a725afe3.dindex.zip.aes (5.48 KB)”,
“2025-01-06 11:43:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf045cdb40aca4fed9ad2b3f02cc963a3.dblock.zip.aes (49.84 MB)”,
“2025-01-06 11:43:20 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b4f00746e38c3435c9649df46d0a086a0.dblock.zip.aes (49.84 MB)”,
“2025-01-06 11:43:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-be76569029bde4e89922d9125aa151a44.dblock.zip.aes (49.68 MB)”,
“2025-01-06 11:43:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ide3bee64a5354173b39fa9c33d567c2f.dindex.zip.aes (4.26 KB)”,
“2025-01-06 11:43:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ide3bee64a5354173b39fa9c33d567c2f.dindex.zip.aes (4.26 KB)”,
“2025-01-06 11:43:23 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b99b05267d6e34afc8831d18881fe0ee3.dblock.zip.aes (49.90 MB)”,
“2025-01-06 11:43:32 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf045cdb40aca4fed9ad2b3f02cc963a3.dblock.zip.aes (49.84 MB)”,
“2025-01-06 11:43:33 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b99b05267d6e34afc8831d18881fe0ee3.dblock.zip.aes (49.90 MB)”
],
“Warnings”: [
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\fairleigh dickinson university-master of science in hs with grad certs (2).pdf\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\fairleigh dickinson university-master of science in hs with grad certs (2).pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_DefaultNCOAMerge.dbf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_DefaultNCOAMerge.dbf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_DefaultNCOAMerge.dmt\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_DefaultNCOAMerge.dmt’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\fairleigh dickinson-global security and terrorism grad cert (1).pdf\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\fairleigh dickinson-global security and terrorism grad cert (1).pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.MetadataPreProcess.FileEntry-TimestampReadFailed]: Failed to read timestamp on "N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dbf"\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dbf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dmt\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dmt’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida institute of technology - bachelor of arts in crimnial justice with a concentration in homeland security.doc\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida institute of technology - bachelor of arts in crimnial justice with a concentration in homeland security.doc’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.MetadataGenerator.Metadata-MetadataProcessFailed]: Failed to process metadata for "N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.csv", storing empty metadata\r\nInvalidOperationException: Method failed with unexpected error code 50.”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\Importing text files into Excel.pdf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\Importing text files into Excel.pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - undergrad cert in em and hs (2).pdf\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - undergrad cert in em and hs (2).pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MailListCleaner - Sales Receipt - 00344187 - 20190312.pdf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MailListCleaner - Sales Receipt - 00344187 - 20190312.pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.csv\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.csv’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - grad cert in em and hs (2).pdf\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - grad cert in em and hs (2).pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MLC MergeEvent_List Duplicate Report.pdf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MLC MergeEvent_List Duplicate Report.pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - interdisciplinary social sciences with em track.docx\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - interdisciplinary social sciences with em track.docx’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.MetadataGenerator.Metadata-MetadataProcessFailed]: Failed to process metadata for "N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dbf", storing empty metadata\r\nInvalidOperationException: Method failed with unexpected error code 50.”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MLC_Layout.pdf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\MLC_Layout.pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - mpa with em speciality - copy.pdf\r\nIOException: The request is not supported. : ‘\\?\N:\JEM\Issue 19\Special Higher Education Issue\Directory\Distance Learning\florida state university - mpa with em speciality - copy.pdf’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\NCOA Documents.xls\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\NCOA Documents.xls’”,
“2025-01-06 12:14:18 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dbf\r\nIOException: The request is not supported. : ‘\\?\N:\OCP\ICOO-2020\MARKETING\LISTS\2019\CASS_NCOA_OUTPUT\ICOO_2019_Final_Mailing_List_NCOA_Output_Changed_V2.dbf’”
],
“Errors”: [],
“BackendStatistics”: {
“RemoteCalls”: 144,
“BytesUploaded”: 3540279535,
“BytesDownloaded”: 51816007,
“FilesUploaded”: 139,
“FilesDownloaded”: 3,
“FilesDeleted”: 0,
“FoldersCreated”: 0,
“RetryAttempts”: 0,
“UnknownFileSize”: 0,
“UnknownFileCount”: 0,
“KnownFileCount”: 139,
“KnownFileSize”: 3540279535,
“LastBackupDate”: “2025-01-06T11:42:18-05:00”,
“BackupListCount”: 1,
“TotalQuotaSpace”: 0,
“FreeQuotaSpace”: 0,
“AssignedQuotaSpace”: -1,
“ReportedQuotaError”: false,
“ReportedQuotaWarning”: false,
“MainOperation”: “Backup”,
“ParsedResult”: “Success”,
“Interrupted”: false,
“Version”: “2.1.0.2 (2.1.0.2_beta_2024-11-29)”,
“EndTime”: “0001-01-01T00:00:00”,
“BeginTime”: “2025-01-06T16:42:14.1777115Z”,
“Duration”: “00:00:00”,
“MessagesActualLength”: 0,
“WarningsActualLength”: 0,
“ErrorsActualLength”: 0,
“Messages”: null,
“Warnings”: null,
“Errors”: null
}
}

To me, path information underlies everything. Little point in debating, with more to discuss.

No. That’s what some software does. Duplicati looks at a variety of factors. Summary log info:

2024-12-31 12:51:00 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: C:\PortableApps\Notepad++Portable\App\Notepad++64\backup\webpages.txt@2024-12-30_092527
2024-12-31 12:51:00 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\PortableApps\Notepad++Portable\App\Notepad++64\backup\webpages.txt@2024-12-30_092527, new: False, timestamp changed: True, size changed: True, metadatachanged: True, 12/31/2024 5:50:02 PM vs 12/31/2024 12:20:41 PM

I think the first line is saying that the file wasn't filtered out of the enumeration, and the second explains why this file was being looked at to see if there's something to back up as a change.

You can run a Verbose log to see if FileEnumerationProcess is seeing your new files, however

“ExaminedFiles”: 2930,
“OpenedFiles”: 2866,
“AddedFiles”: 2866,

says it’s not seeing much, but most of what it sees was not there on the previous look (so new).
Weren’t those Warnings from attempts to enumerate NetWare a problem on previous attempts?

If NetWare keeps not working, there may be no solution here, however to answer your question:

This is not natural use, but you can force it to try a full backup by deleting the old destination files and the local database.
You can't force it to succeed unless the source cooperates, and lots of other things can go wrong.
Starting fresh each time loses all the usual advantages of Duplicati's versioning and change-only upload.

Technically, Duplicati does a full backup (or as full as Source allows) each time by default, but the deduplication makes it effectively an incremental backup in terms of the amount of data uploaded.

If you mean can one force NetWare to cooperate with full enumeration, that’s a NetWare question.

Have you considered using something else? Duplicati really needs the Source to cooperate with it.

Duplicati is written in C# and uses the standard enumeration and access. It is technically possible to use other Windows API calls, but the ones in C# are the ones the .NET team (mostly Microsoft) considers the “best ones”.

I agree with your statement, it should not matter, but there are always details.

Not really, it is not designed for that. It makes differential backups, only unknown stuff, but I don’t think that detail is relevant for your case.

This is likely the problem. Duplicati expects a local file system as the source. It should in principle work with a mounted folder, but it looks like there is some small quirk here.

I think this is the same issue we discussed in “Failed backup but why”, but there it was the destination that was a NetWare folder, and here it is the source?

I have found a new article that mentions the same problem and it states that it is an issue with NetWare not supporting long paths:

Based on that, I think there is some issue with the paths, where C#/.NET/Duplicati transforms the paths from old-style C:\Folder\File to the newer UNC style \\?\C:\Folder\File and this does not work with the NetWare drivers.
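One way to test that theory outside Duplicati is to try the same file with and without the \\?\ prefix. A rough sketch, assuming Python on the same machine (the path is a placeholder; substitute one of the full paths from the warnings above):

import os

path = r"N:\JEM\somefile.pdf"       # placeholder: use a full path from the warnings
extended = "\\\\?\\" + path         # the \\?\ extended-length form that .NET may use

for label, candidate in (("plain", path), ("extended", extended)):
    try:
        size = os.path.getsize(candidate)
        with open(candidate, "rb") as fh:
            fh.read(4096)
        print(label, "OK,", size, "bytes")
    except OSError as exc:
        print(label, "FAILED:", exc)

If the plain form works but the extended form fails with "The request is not supported", that points at the NetWare client rather than at Duplicati.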

Another comment here says anything larger than 5MB seems to fail:
https://community.opentext.com/img/oes/f/discussions/480899/windows-7-64-bit-can-t-copy-large-files-to-netware-6-5-sp8/1104886

And I saw a recommendation to update the Novell Client, if that is possible.


Thanks.
The tech notes don't really help in this case. If File Explorer or a DOS DIR command can find the files, I would expect the C# and .NET routines to find them too. This machine has the latest OES/NetWare client from OpenText.

Not really, it is not designed for that. It makes differential backups, only unknown stuff, but I don’t think that detail is relevant for your case.

Again, it seems there is persistence. We should be able to delete a backup on the remote end, and we should be able to force the system to do a full backup, but we still have only a limited set of files being backed up. The question is whether we have to delete ALL backups that refer to the Q: drive in order to force a full backup.

To test this, I tried to install the 32-bit version on a Windows 7 machine. Unfortunately, I am unable to get that machine to connect; it fails while testing the FTP connection with an authentication error.

This is likely the problem. Duplicati expects a local file system as the source. It should in principle work with a mounted folder, but it looks like there is some small quirk here.

The beauty of the NetWare client is that it looks just like a DOS volume, but with the extended file attributes supported in NetWare. Again, we have 3 good backups mapped to other NetWare volumes that worked perfectly. All files backed up to the exact count and size! So that tells me the client and volumes work properly. The one that fails was the FIRST backup we ran. Is there some data stored in the registry or a config file we can look at to see if we have hidden saved settings? In the client I have just turned OFF the UNC Path Filter and will run a test. I think this is not the issue, but testing will show. Note this volume has DOS, MAC, NFS, and LONG filename support, as do the volumes that worked. The 5MB file size issue is a non-issue, as the files that did back up were in many cases larger than 5MB.

I think we agree on how it should work, and in my understanding it generally works as we both expect, but for some reason not in one of your setups.

Duplicati basically does this:

  • List top-level folder
  • Recursively visit each folder
  • For each path (file or folder) check if the filter(s) exclude it
    • If excluded, skip it
  • For each file encountered, check if the path exists in the database
    • If path is known, check if modification time has changed
    • If the modification time and file size are the same as last time, skip the file
    • Otherwise treat as a new file
  • For each new or modified file, extract blocks, and deduplicate as needed

The main point is that the algorithm is always “full” meaning that it does not care what is already there, except that it will skip files it already knows. If the initial set of files is empty, all files are treated as new-or-modified. If you add or remove folders (or drives) from the backup sources, this does not make it do less than a full backup.
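As a rough sketch of that per-file decision in Python (not actual Duplicati code, just the logic described above):

def should_upload(known, size, mtime, excluded, filetime_check=True):
    """Decide whether a file needs to be read and uploaded on this run.

    known: (size, mtime) recorded in the local database, or None if the path was never seen.
    excluded: True if a filter or attribute exclusion matched the path.
    """
    if excluded:
        return False                         # skipped by configuration, no warning logged
    if known is None:
        return True                          # new file: treat as new-or-modified
    if filetime_check and known == (size, mtime):
        return False                         # unchanged since the last backup
    return True                              # modified, or the filetime check is disabled (see below)

print(should_upload(known=None, size=1234, mtime=1736180000.0, excluded=False))                   # True: new file
print(should_upload(known=(1234, 1736180000.0), size=1234, mtime=1736180000.0, excluded=False))   # False: unchanged

Every selected path goes through the same check on every run, which is why the backup is always "full" in coverage even though only changed data is uploaded.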

If you want to try disabling the file modification check (and treat each path as new-or-modified), you can set --disable-filetime-check=true in the advanced options.

No, there is nothing stored except the paths that are already backed up. These are stored in the (randomly named) local database. Duplicati does not store anything outside the data folder (plus, briefly, files in the temp folder).

I am not aware of how the drivers work, but my understanding is still that the issue IOException: The request is not supported is causing Duplicati to fail reading files (and perhaps folders), so the above algorithm never gets a chance to see the files. If that issue is fixed, the above algorithm will see the files and treat them as new-or-modified.

I don’t think the problem is with a previous backup, but you can test it:

  • Create a new backup, using the problematic drive as the source
  • Choose a local destination for the data
  • Run the backup and check the resulting file count

If you are out of space locally, you could use one of the paths that are reported as failed instead of the whole drive as the source.