Duplicati on Windows 10

Hello, we would like to roll out Duplicati in the company. The Linux and macOS versions work great.
The storage location is AWS S3. The user has no delete rights, so the backups cannot be manipulated. The problem is the Windows version, which about once a week decides it has to delete backups, even though all backups should be kept. Only the Windows computer has this problem, and it has the same configuration as the Linux/macOS machines. We have been testing this configuration for a month now without any problems, except on the Windows box mentioned above.

Amazon.S3.AmazonS3Exception: Access Denied ---> Amazon.Runtime.Internal.HttpErrorResponseException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
at System.Net.HttpWebRequest.GetResponse()
at Amazon.Runtime.Internal.HttpRequest.GetResponse()
--- End of inner exception stack trace ---
at Amazon.Runtime.Internal.HttpRequest.GetResponse()
at Amazon.Runtime.Internal.HttpHandler`1.InvokeSync(IExecutionContext executionContext)
at Amazon.Runtime.Internal.RedirectHandler.InvokeSync(IExecutionContext executionContext)
at Amazon.Runtime.Internal.Unmarshaller.InvokeSync(IExecutionContext executionContext)
at Amazon.S3.Internal.AmazonS3ResponseHandler.InvokeSync(IExecutionContext executionContext)
at Amazon.Runtime.Internal.ErrorHandler.InvokeSync(IExecutionContext executionContext)
--- End of inner exception stack trace ---
at Duplicati.Library.Main.BackendManager.Delete(String remotename, Int64 size, Boolean synchronous)
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)


* 24. Aug. 2020 10:41: Cannot open WMI provider \\localhost\root\virtualization\v2. Hyper-V is probably not installed.
* 24. Aug. 2020 10:41: Cannot find any MS SQL Server instance. MS SQL Server is probably not installed.
* 24. Aug. 2020 10:37: Cannot open WMI provider \\localhost\root\virtualization\v2. Hyper-V is probably not installed.
* 24. Aug. 2020 10:37: Cannot find any MS SQL Server instance. MS SQL Server is probably not installed.
* 24. Aug. 2020 10:34: Cannot open WMI provider \\localhost\root\virtualization\v2. Hyper-V is probably not installed.
* 24. Aug. 2020 10:34: Cannot find any MS SQL Server instance. MS SQL Server is probably not installed.
* 24. Aug. 2020 10:34: The operation Backup failed with the following error: Access Denied
* 24. Aug. 2020 10:34: Fatal error
* 24. Aug. 2020 10:34: Failed to recover from the error while deleting file duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes
* 24. Aug. 2020 10:34: Deletion of file duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes failed, checking whether the file exists
* 24. Aug. 2020 10:34: Backend event: Delete - Failed: duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes (641.37 KB)
* 24. Aug. 2020 10:34: Operation Delete with file duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes attempt 5 of 5 failed with message: Access Denied
* 24. Aug. 2020 10:34: Backend event: Delete - Started: duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes (641.37 KB)
* 24. Aug. 2020 10:33: Backend event: Delete - Retrying: duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes (641.37 KB)
* 24. Aug. 2020 10:33: Operation Delete with file duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes attempt 4 of 5 failed with message: Access Denied
* 24. Aug. 2020 10:33: Backend event: Delete - Started: duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes (641.37 KB)
* 24. Aug. 2020 10:33: Backend event: Delete - Retrying: duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes (641.37 KB)
* 24. Aug. 2020 10:33: Operation Delete with file duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes attempt 3 of 5 failed with message: Access Denied

The log output is not necessarily a backup-version delete, which would happen after a backup to remove old versions.
PreBackupVerify, as seen in the stack trace, runs before the backup, so it is probably trying to clean something up.
About → Show log → Live → Information might show you why it’s trying to delete that particular dblock.

The setup you use is likely to face the same issues as a cold-storage cloud solution, in that at minimum “Keep all backups” and no-auto-compact=true are required on Options screen 5. This means Compact won’t reclaim the space wasted by deleted versions, but since you keep all versions, nothing should be compacted anyway. Space usage will grow over time, and Duplicati will slow down. Any cleanup job will need delete access.
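
If you want to confirm that the deny-delete policy behaves as intended for the backup user (while a separate cleanup identity still has access), a quick probe works. Here is a minimal boto3 sketch, assuming the backup user’s AWS credentials are configured; the bucket and key names are placeholders, not from this thread:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def can_delete(bucket: str, key: str) -> bool:
    """Return True if the current credentials are allowed to delete objects."""
    try:
        # Deleting a key that does not exist still succeeds when permitted,
        # so probing with a dedicated dummy key will not touch real backups.
        s3.delete_object(Bucket=bucket, Key=key)
        return True
    except ClientError as e:
        if e.response["Error"]["Code"] == "AccessDenied":
            return False
        raise  # some other problem: wrong bucket, network, ...

print(can_delete("my-backup-bucket", "delete-probe.tmp"))  # placeholder names
```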

The fine print on the big picture above is that delete attempts also happen for other routine reasons, such as cleaning up after an upload error, where the file may or may not have actually reached the destination.

Can you see whether duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes was uploaded?
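
For example, with boto3 (a sketch; the bucket name is a placeholder, and note that without s3:ListBucket a missing key comes back as 403 rather than 404):

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
KEY = "duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes"

try:
    head = s3.head_object(Bucket="my-backup-bucket", Key=KEY)  # placeholder bucket
    print("present:", head["ContentLength"], "bytes")
except ClientError as e:
    if e.response["Error"]["Code"] == "404":
        print("not uploaded")
    else:
        raise
```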

You can also check the previous backup’s Show log → Complete log for RetryAttempts as a sign of fail-and-retry.
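
If you export that Complete log to a file, a small sketch like this can dig the counter out; it searches the whole JSON tree rather than assuming where the field sits, and the file name is a placeholder:

```python
import json

def find_key(obj, key):
    """Recursively collect every value stored under `key` in a JSON tree."""
    hits = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            if k == key:
                hits.append(v)
            hits.extend(find_key(v, key))
    elif isinstance(obj, list):
        for item in obj:
            hits.extend(find_key(item, key))
    return hits

with open("complete-log.json") as f:  # exported Complete log (placeholder name)
    log = json.load(f)

print("RetryAttempts:", find_key(log, "RetryAttempts"))
```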

Doing this thoroughly is more work: set log-file=&lt;path&gt; and log-file-log-level=retry, then look for lines like the Delete lines in your posted log, but with the action Put. The lines include file names, so each file’s history can be checked.
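
Once the log file is being written, a filter like the following can pull out the failed Put attempts; a sketch whose line pattern is modeled on the retry messages quoted above, with a placeholder log path:

```python
import re

# Matches retry lines of the form seen in the posted log, e.g.:
#   Operation Put with file duplicati-....dblock.zip.aes attempt 3 of 5 failed ...
PATTERN = re.compile(r"Operation (Put|Delete) with file (\S+) attempt (\d+) of (\d+)")

with open("duplicati.log", encoding="utf-8") as f:  # path passed via log-file=<path>
    for line in f:
        m = PATTERN.search(line)
        if m and m.group(1) == "Put":
            print(line.rstrip())
```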

EDIT:

How did you recover from something like the above? I’d think the next backup run would want to do the delete too.

The user for the S3 bucket has no right to delete; that is down to my company’s policies.
Regarding duplicati-b2c9a5e7702864a2d9d007bad34155e10.dblock.zip.aes:
the file was not uploaded.
I have set no-auto-compact=true.
I have now deleted the database and had it recreated.
After the recreate, the backup ran without errors.

thx

If the problem was an attempt to delete a file that had not finished uploading (as evidenced by it not being there), the DB delete would have cleared that issue by erasing the record that an upload was ever attempted.

Each upload attempt of the same content gets a different filename, so if the second try worked, great; however, cleanup of the first try remains impossible due to company policy. At least with S3, I think an upload either succeeds or fails without leaving a partial object behind, which matters because a partial would be a bad file that could break even a DB Recreate…

If you want to know whether this is what happened, you can check the log before the posted failure for a retry attempt, either in the log (similar to your original post, except for Put) or in the job log’s “RetryAttempts” (before deleting the DB).