Struggling with "Fatal error => System.IO.IOException: Das Verzeichnis oder die Datei kann nicht erstellt werden."

Hi,
I’ve been very happy with Duplicati - until now:

I’m struggling with the error message:
“Fatal error => System.IO.IOException: Das Verzeichnis oder die Datei kann nicht erstellt werden.”

I searched the forum and found the advice to have a look at the system log … with no success, there’s nothing helpful in it.
I also found the hint to use the command line, but there is no additional hint there either. Then I added “--debug-output=true” … and still there is no hint that gets me closer to the source of the problem.


This is my command:
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "file://U:\duplicati\Bilder\\" "H:\daten\bilder\\" --backup-name="Backup-USB-GUE - Bilder" --dbpath="C:\Users\ag2024\AppData\Local\Duplicati\CANRPICKWD.sqlite" --backup-id=DB-14 --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="mypassphrase" --retention-policy="2D:U,1W:1D,4W:1W,36M:1M" --disable-module=console-password-input --debug-output=true
First it seems that everything is working fine: the process starts and scans local and remote files. duplicati-*.dblock.zip.aes files are created and uploaded to the target. But then the following line appears:
"Fatal error => System.IO.IOException: Das Verzeichnis oder die Datei kann nicht erstellt werden."

And the debug output that follows gives no hint what the problem is.
Is it that the target can’t be written to?
Is it that a temporary file cannot be written?
… I don’t know, and I’m desperate.

What I have already tested:

  • the target is writeable (USB disk, FAT32)
  • there is enough space on the target
  • there is enough space on the source disk
  • deleted the entire configuration (incl. the already created target files) and started from scratch => same issue

Help would be greatly appreciated!

The full command-line output is enclosed in cli-log.txt.
cli-log.zip (3.0 KB)

Welcome to the forum @aschelli

Where exactly are you looking? Do you mean server log at About → Show log → Stored, or server live log?

Those are both GUI tools, but you are not using the GUI for this backup. You can get similar logs from the command line if you ask for them.

--console-log-level=retry might be a good place to start. It looks like you might have one file that went through the default --number-of-retries and then failed, but logging at retry level would make that clearer. You may even have had various other write attempts fail too. Each retry gets a new name. Did files make it onto the drive?
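For example, added to your existing command from above (a sketch - everything else stays exactly as in your post, only the logging option is new):

```shell
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "file://U:\duplicati\Bilder\\" "H:\daten\bilder\\" --console-log-level=retry
```

(plus all your other options from the original command)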

BackendUploader.<DoPut> is what writes to the destination. I don’t know why your Windows is blocking it. Recently somebody updated Duplicati, didn’t whitelist it with security software, and so blocked themselves. Did you change anything recently that might be related to this sudden new problem? If so, please describe.

EDIT:

Yours looks a little different than the other one, but the question of what changed before this problem still holds.

Fatal error => System.IO.IOException: Das Verzeichnis oder die Datei kann nicht erstellt werden.

   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize)
   at Duplicati.Library.Common.IO.SystemIOWindows.FileCreate(String path)
   at Duplicati.Library.Backend.File.<PutAsync>d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<DoPut>d__24.MoveNext()

Actually Backend.File.<PutAsync> is a later sign of the uploader at work. It then tries to create a file, but

The directory or file cannot be created

(translated) results. I don’t know why. Make sure whatever user Duplicati runs as has access to the folder.

So you apparently got some files. I wonder which ones? You can certainly try a tiny backup, then there will only be one dblock file to worry about. You can also point somewhere that’s not the drive. Does that work?


With your input I was able to fix my problem:
--console-log-level=retry and --number-of-retries delivered good hints:
I did in fact get messages that many of the created backup files could be written - but only up to a specific point. After that point every operation failed, including the retries.

Then I experimented with the native Windows Explorer, copying and pasting files onto the USB disk and into the Duplicati backup directory - and here is the point:
basically all file operations on the USB disk work, except in this one particular directory! Even a very simple file operation (“copy test.txt with a few bytes”) failed - only in this directory.
It seemed to me that this is a limitation of the FAT32 file system. I couldn’t find the limitations spelled out in detail, so I gave a file-system switch a chance. I reformatted the USB disk with NTFS and tried again:
now it works!

my environment:

  • source directory with approx. 330 GB, mostly images (.jpg), some .avi/.mp4
  • backup directory with approx. 270 GB
  • operating system: Windows 11 Home

Thank you very much for your support!


FAT32 limit on total length of all filenames in a directory combined?

I’ve kind of a weird problem on one of our customer’s backup harddrives: The harddrive is formatted in FAT32 and last night our backup jobs threw an error on a subdirectory, claiming that it couldn’t copy the files that it had to copy.

happens to sound extremely like what you wrote here, with the stack trace showing it couldn’t create a file.

The 65536-entry limit might be used up at about 3 entries per file (one short-name entry, plus a long name spanning two entries).

Roughly 20,000 files (each dblock having a dindex), at the 50 MB default, might run out near 500-some GB assuming all dblock files are filled; shorter dblock files, however, would use up the entries while occupying less space.
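As a rough sketch of that estimate (the 3-entries-per-file figure and the 65536-entry-per-directory limit are the assumptions; the numbers are illustrative, not exact):

```shell
# Rough FAT32 directory-entry estimate (assumptions: ~3 entries per file,
# 65536 entries per directory, 50 MB dblocks, one dindex per dblock)
entries_per_file=3
max_files=$(( 65536 / entries_per_file ))   # ~21845 files total in the directory
max_dblocks=$(( max_files / 2 ))            # dblock + dindex pairs -> ~10922 dblocks
echo "roughly $(( max_dblocks * 50 / 1024 )) GB of full 50 MB dblocks"
```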

This is also just a rough calculation, and regardless, I’m glad you bypassed the limit by switching to NTFS.

Yes - that sounds extremely like my problem.
In fact I did the same test: create “test.txt” and investigate what happens. It was exactly the same behaviour as described in that article.

Also the numbers are very similar:
The filenames are built in the format “duplicati-b00584be7f3584da6a4ee21bbc9b0d1b9.dblock.zip.aes”.
That is 59 characters. Now, with NTFS, I have 11,000 files in this directory. I’m sorry, I didn’t write it down, but if I remember right the problem started at about 10,000 files.
So when I multiply 59 × 10,000 … that would be about 600 k characters in total.
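As a cross-check (a sketch assuming FAT long file names store 13 characters per directory entry, plus one short-name entry per file, against the 65536-entry limit - the relevant limit being entry count, not total characters):

```shell
name_len=59                               # length of the duplicati-*.dblock.zip.aes names
lfn_entries=$(( (name_len + 12) / 13 ))   # ceil(59 / 13) = 5 long-file-name entries
per_file=$(( lfn_entries + 1 ))           # + 1 short-name entry = 6 entries per file
echo "max files per directory: $(( 65536 / per_file ))"
```

That comes out near 10922 files, which is close to the roughly 10,000-11,000 files observed before the failures started.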