Delete operation was not completed

Hello,

I get the following error message and the backup is not done because of that:

Error: A previously started delete operation was not completed, but the attempt to recover the operation details failed with the message: Invalid header marker.
If this problem happens more than once, it is possible that the delete transaction file is defect. Try renaming the file "duplicati-delete.transaction.aes" and the run the operation again. If no errors are reported, it is safe to delete the file, otherwise Duplicati will report what needs to be manually corrected.

I cannot find the file “duplicati-delete.transaction.aes” to rename it as suggested.
I think I changed the directory structure of the files to be backed up before I got this error message.
Any ideas what I can do now?

Hello @monolith, welcome to the forum!

Answers to a few questions might help narrow down what’s going on:

  • What version of Duplicati are you using?
  • What OS are you running it on?
  • Has this backup worked before (and if so, did it fully complete)?

Hello @JonMikelV, thanks for helping!

  • I use Duplicati 1.3.4
  • My OS is Windows 10 Pro
  • Yes, the backup worked fine and completed fully. But as mentioned, I think I changed the location of the two folders being backed up (Documents & Pictures), and since then I get this error.

Thanks for those answers. Unfortunately, Duplicati 1.x is no longer supported.

There are a few forum members who have used it in the past - hopefully one of them will be able to give you a hand.

Is there a particular reason you’re not using Duplicati 2?

Thanks for the hint - I will try to update to Duplicati 2.
Two reasons I did not do so earlier: First, Duplicati did not ask me to and there is no obvious update button, and second, never change a running system. :grinning:

Edit: I just wanted to download Duplicati from Heise, which is a very reliable and safe source for software in Germany, and there it still says Duplicati 2 is experimental. That is probably why I had chosen Duplicati 1.3.4 the last time.

Sorry this scrolled past my “to-do” list, but if you’re still interested…

There is no upgrade path from Duplicati 1 to 2 - they use completely different processes & storage formats so you’d basically be starting a fresh backup process. On the plus side, you might be able to run both simultaneously while evaluating Duplicati 2. :slight_smile:

Duplicati 2 is technically still in beta status - but has been for many years. It is generally quite stable and many people and businesses use it daily without issue.

That being said, there are some edge cases where Duplicati just doesn’t work well (lots of errors or warnings). While we haven’t figured out exactly what causes those issues, it seems to be the case that when Duplicati works, it works well, and when it has problems, it has lots of problems.

So if you try it out and it works well for you, then most likely it will continue to work just fine.

No matter what route (or tool) you go with, I wish you happy (and healthy) backups! :smiley:

Thank you @JonMikelV for your support! I just set up a backup with Duplicati 2.0.4.5 but encountered some problems:

  1. Testing the FTP connection to my NAS failed, although I entered the identical settings that worked before.

  2. When running the backup I got an error and the (remote) log output did not help me at all:

* Dec 28, 2018 9:32 PM: put duplicati-b42db950a72f34e2b934da529dd7e431d.dblock.zip.aes

{"Size":52330781,"Hash":"zP8cDFThlimE/0b+5FxHDpaqHVqzt0EF+Wt5o+qidrk="}

* Dec 28, 2018 9:32 PM: put duplicati-bf410f269703f404abcbc532f5048d81c.dblock.zip.aes

{"Size":52346429,"Hash":"iya66Xu6pEYeCEXc1fJ6x8QpTsTSre2KFyi87mZClVs="}

* Dec 28, 2018 9:32 PM: put duplicati-b5a8aa73f06c344d581355861c0df371e.dblock.zip.aes

{"Size":52346429,"Hash":"iya66Xu6pEYeCEXc1fJ6x8QpTsTSre2KFyi87mZClVs="}

I also wish you happy and healthy backups for 2019! :fireworks::confetti_ball:

That’s a tough thing to diagnose but there should be a related error message either in the job “General” log or the global About -> Show log -> Stored log.

That’s likely because the Remote log is just for logging what happens with the destination. In your case it looks like it’s connecting just fine and is able to send (put) files to the destination as expected.

If an error was reported it should be in the job “General” log or the global “Stored” log.

If you can find that (and/or the FTP error) and post that here we might be able to track down the issue.

I’m not so sure that FTP is completely fine. There are signs of retries, plus it ended in an error (not posted).

Although I’m not sure how they got there (was that copy-and-pasted, or an edit including final double quote?), seeing dblock uploads without dindex in between worries me, and even more when the size and hash repeats.

My own uploads were a little unreliable this morning, doing retries but finally succeeding, possibly unlike here.

From About --> Show log --> Remote, which is in reverse-chronological order:

Dec 28, 2018 6:47 AM: put duplicati-20181228T114458Z.dlist.zip.aes
Dec 28, 2018 6:47 AM: put duplicati-iea9625f6e0bf4e7494f0bfb2832391fa.dindex.zip.aes
Dec 28, 2018 6:47 AM: put duplicati-b63bae741141b4b1f8acc303676c104c3.dblock.zip.aes
Dec 28, 2018 6:46 AM: put duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes
Dec 28, 2018 6:45 AM: put duplicati-b8f60b52e00904d378d070974918874ef.dblock.zip.aes

From --log-file log at --log-file-log-level=Profiling (trimmed to above put tries):

2018-12-28 06:45:50 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8f60b52e00904d378d070974918874ef.dblock.zip.aes (3.13 MB)
2018-12-28 06:46:15 -05 - [Retry-Duplicati.Library.Main.Operation.Common.BackendHandler-RetryPut]: Operation Put with file duplicati-b8f60b52e00904d378d070974918874ef.dblock.zip.aes attempt 1 of 5 failed with message: Unable to read data from the transport connection: 
2018-12-28 06:46:15 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b8f60b52e00904d378d070974918874ef.dblock.zip.aes (3.13 MB)
2018-12-28 06:46:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b8f60b52e00904d378d070974918874ef.dblock.zip.aes (3.13 MB)
2018-12-28 06:46:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes (3.13 MB)
2018-12-28 06:46:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:12 -05 - [Retry-Duplicati.Library.Main.Operation.Common.BackendHandler-RetryPut]: Operation Put with file duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes attempt 2 of 5 failed with message: Unable to read data from the transport connection: 
2018-12-28 06:47:12 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-ba64191172fd249d5962228fecf68f2a0.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b63bae741141b4b1f8acc303676c104c3.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:22 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b63bae741141b4b1f8acc303676c104c3.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b63bae741141b4b1f8acc303676c104c3.dblock.zip.aes (3.13 MB)
2018-12-28 06:47:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-iea9625f6e0bf4e7494f0bfb2832391fa.dindex.zip.aes (5.50 KB)
2018-12-28 06:47:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-iea9625f6e0bf4e7494f0bfb2832391fa.dindex.zip.aes (5.50 KB)
2018-12-28 06:47:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20181228T114458Z.dlist.zip.aes (32.76 KB)
2018-12-28 06:47:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20181228T114458Z.dlist.zip.aes (32.76 KB)

When retrying “Put”, it uses a different file name, with same contents, so the “Size” and “Hash” are the same.
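For illustration: the "Hash" values in the Remote log look like base64-encoded SHA-256 digests of the file contents (an assumption based on their 44-character length), which is why identical contents produce identical hashes no matter what the file is renamed to. A minimal sketch:

```python
import base64
import hashlib


def content_hash(data: bytes) -> str:
    # Base64-encoded SHA-256 of the file contents; the filename plays no part.
    return base64.b64encode(hashlib.sha256(data).digest()).decode()


block = b"same dblock contents"
# A renamed retry re-uploads the same bytes, so the hash (and size) repeat:
assert content_hash(block) == content_hash(block)
```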

Agree with @JonMikelV that getting an error message would help. You can also test “FTP (Alternative)” as “Storage Type” if you’d rather see if you can get around the FTP problem instead of actually debugging it…

Hmm… I haven’t looked at the Remote logging code, but perhaps it is used to log the start of a remote action, as in “I’m about to do this; if you don’t hear anything else from me afterwards, now you know where to go look”.

That might explain the singular (and past) Remote log entries vs. the multiple Profiling retries.

Thanks folks for your answers! I now tried FTP (Alternative) and got a clearer error message:

Failed to connect: Error writing file: The file duplicati-access-privileges-test.tmp was uploaded but the returned size was 0 and it was expected to be 83 

And thanks for indicating that the log can be found at “About” (which is not intuitive). My log is the following:

* Dec 29, 2018 5:23 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 29, 2018 5:23 PM: Reporting error gave error

* Dec 29, 2018 5:08 PM: Failed while executing "Backup" with id: 1

* Dec 28, 2018 9:32 PM: Failed while executing "Backup" with id: 1

* Dec 28, 2018 9:23 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 28, 2018 9:23 PM: Reporting error gave error

* Dec 28, 2018 9:18 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 28, 2018 9:18 PM: Reporting error gave error

* Dec 28, 2018 9:05 PM: Failed while executing "Backup" with id: 1

* Dec 28, 2018 9:01 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 28, 2018 9:01 PM: Reporting error gave error

* Dec 28, 2018 9:00 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 28, 2018 9:00 PM: Reporting error gave error

* Dec 28, 2018 8:59 PM: Request for http://localhost:8200/api/v1/remoteoperation/test gave error

* Dec 28, 2018 8:59 PM: Reporting error gave error

Does this give you any clue?

That’s always a fun one. :slight_smile:

Based on the access test error message, it sounds like the FTP account you are using may not have write access, OR there is no free disk space on the FTP server.

Some FTP servers have logs that can say exactly what the issue was (from that end’s view). Does yours?

Alternatively, grab an FTP client and see what it can do. If it can’t upload files, Duplicati likely can’t either.

Windows has a simple FTP client included. Other options include WinSCP (doing FTP) and FileZilla, which offers both a client (for the test here) and a server (in case you get interested in a change of servers).
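If you prefer a scripted check, here is a minimal Python sketch of the same idea: upload a small file and compare the size the server reports against what was sent (much like Duplicati’s own access-privileges test). The host, credentials, and folder are placeholders you would replace with your NAS details.

```python
import io
from ftplib import FTP


def check_write_access(host: str, user: str, password: str, remote_dir: str,
                       payload: bytes = b"duplicati-style access test\n"):
    """Upload a small test file and return (bytes sent, bytes the server reports)."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        ftp.storbinary("STOR access-test.tmp", io.BytesIO(payload))
        ftp.voidcmd("TYPE I")  # SIZE requires binary mode on many servers
        reported = ftp.size("access-test.tmp")
        ftp.delete("access-test.tmp")  # clean up the test file
    return len(payload), reported


# Placeholder call - a mismatch (e.g. 0 reported vs. 83 sent, as in the error
# above) points at the server side (permissions or disk space):
# sent, reported = check_write_access("nas.local", "user", "pass", "/backup")
# print(f"sent {sent} bytes, server reports {reported}")
```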

I’m not sure what settings Duplicati 1.3.4 has, but you likely don’t want to point Duplicati 2 at the same folder. Instead, make a new folder, make sure permissions are correct (including writing), and change your config.
While on the Destination screen, use the “Test connection” button. If that fails, the actual backup likely will.

I think I found the problem. After I deleted the old backup of Duplicati 1, everything works fine now. It seems that I was out of disk space on my NAS, which is the target of my backup. On my Synology NAS I did not recognise that the disk was full. My data to be backed up is about 300 GB and my disk space on my NAS is 2.7 TB, so it is unclear to me how Duplicati 1 managed to use up all my free disk space. I hope Duplicati 2 has better disk space management.
This most likely also explains why my Duplicati 1 failed - also because of a lack of disk space. It seems to me that the NAS changed the access rights of the FTP user to read only (I found this in the user permissions) after the disk was full.
Thanks to both of you for your support!

I can’t compare to Duplicati 1 except through Block-based storage engine, where Duplicati 1 might have accumulated incremental changes forever unless it had a way to delete versions like Duplicati 2’s “Backup retention” setting on the “Options” screen of the job provides.

Duplicati 2 will definitely make noise when space runs out, and some storage types (but not FTP) that provide remote space information can give you a warning when space starts to run low. By default, 10% of the backup size must be free. I think SMB can do this, if you want to try. Below is from my backup job log run with SMB; I expect FTP will not give QuotaSpace.

    BackendStatistics:
...
        TotalQuotaSpace: 119507251200
        FreeQuotaSpace: 47542501376
        AssignedQuotaSpace: -1
        ReportedQuotaError: False
        ReportedQuotaWarning: False
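As a rough sketch of that default check (the 10% threshold comes from the text above; the function is illustrative, not Duplicati’s actual code):

```python
def quota_warning(backup_size: int, free_space: int,
                  threshold: float = 0.10) -> bool:
    """Warn when free space drops below `threshold` of the backup size."""
    return free_space < backup_size * threshold


# Plenty of room: no warning.
assert quota_warning(backup_size=1000, free_space=200) is False
# Free space under 10% of the backup size: warning.
assert quota_warning(backup_size=1000, free_space=50) is True
```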

Thank you for your reply and happy new year and healthy backups!