If I navigate in my Unraid terminal to the local SATA disk and run `zip --test duplicati-20200123T000000Z.dlist.zip`, I receive the following message:
What will be the next step from here?
OK, so this confirms Duplicati's claim that the .zip file is bad. Do you know if you ever used the advanced option `--throttle-upload` or the speed control at the top of the screen, to the right of the status area?
File corruption could happen in a way that would escape the size check. The bug is fixed in 2.0.5.1, but possibly you weren't on that version on Jan 23, because it had only been out for a couple of days by then.
You could examine the bad file to gather its information while it's still there, then either assume it's the only bad one, or scan all of them.
Seeing the screenshot of LOCAL BACKUP at about 3 TB source makes me worry about speed.
For this file-accessible backup, there's actually another way to test all files, but it still reads them. Possibly it will be a little faster than the `test` command. Does `python --version` say 2.something?
Basically, this is `--upload-verification-file`, then `utility-scripts/DuplicatiVerify.py` in your installation folder. If you're running in Docker, then I'm not sure exactly how you get to that script.
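Roughly, the flow looks like this (a sketch; the install and destination paths are assumptions, so adjust them for your setup):

```bash
# 1. Add this advanced option to the backup job, then run the backup once;
#    Duplicati will write a duplicati-verification.json file to the destination:
#      --upload-verification-file=true

# 2. From a machine that can see the destination folder, point the bundled
#    script at that folder:
python /usr/lib/duplicati/utility-scripts/DuplicatiVerify.py /mnt/disks/sata/backup
```

The script recomputes each file's hash and compares it against what the verification file says should be there, which is why it still has to read everything.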
The other option is to fix the one file, then maybe discover some others via periodic tests as time passes. Cranking up the test level is possible. This path should be quick, but it's unknown what it might have missed.
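If you go that way, two knobs I'm aware of (a sketch; the storage URL is a placeholder, and an encrypted backup will also need `--passphrase`, plus possibly `--dbpath` pointing at the job's local database):

```bash
# Verify more sample sets after each backup (the default is 1) by adding
# this to the job's advanced options:
#   --backup-test-samples=5

# Or run a one-off deep check; "all" tests every volume instead of a sample:
duplicati-cli test "file:///mnt/disks/sata/backup" all
```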
Do you have preference for how much effort you want to put in now to see what might be wrong?
Regardless, you have at least duplicati-20200123T000000Z.dlist.zip bad, and you haven't tested duplicati-20200105T000000Z.dlist.zip with `zip --test`. That would be another useful thing to do.
You can probably quickly test all of your dlist.zip files (which might not be the only bad ones) with:
`unzip -t '*dlist.zip'`
(type in the quotes)
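If you'd rather sweep every volume type (dblock and dindex included) in one pass, here's a minimal sketch, assuming you run it from the destination folder and the backup is unencrypted (plain `.zip`, not `.zip.aes`):

```bash
# Test every Duplicati volume and report only the ones that fail.
for f in duplicati-*.zip; do
  unzip -tq "$f" > /dev/null 2>&1 || echo "BAD: $f"
done
```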
Well, I deleted the file and Duplicati is now making backups without errors. So it looks like the local backup issue has been resolved. Thanks a lot!!
For the Cloud Dropbox backup I have the following error.
Object reference not set to an instance of an object
Do you have any idea what that message might mean?
It's a generic software error, with 726,000 hits on Google. For Duplicati, it would be necessary to see what happens before it to have any hope of even recognizing it as a specific issue that may be known.
The live log can get some useful information, but getting the best view might take a few tries. `Retry` is probably a reasonable starting level for the logging dropdown, but it might wind up needing more later.
I thought, let's start over cloud-based, so I removed the Dropbox backup and created a completely new one. When the backup has finished I will let you know if I still have errors.
Hello,
did you finally solve this issue?
I have the same error with dlist, dindex and dblock files.
I want to back up my NAS folders onto an S3-compatible server.
I set the storage class to 'GLACIER', so I am thinking maybe Duplicati is not able to check these files at the end of the backup because they are already unavailable. Restoration from the Glacier class could take up to 6 hours.
I can manually restore the files to live access, but maybe it would be better if Duplicati could apply the GLACIER flag only to backup files and not to the dindex, dlist, and dblock files?
This thread gives more details about that: Duplicati and S3 Glacier - #4 by WhoopsHelp
If you want to go straight to Glacier, you need to disable compaction and back-end verification. You probably also need to use unlimited retention.
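In Duplicati terms that maps to something like the following advanced options (a sketch; verify the names and defaults against your version before relying on them):

```bash
# On the backup job:
#   --s3-storage-class=GLACIER      # send volumes to Glacier
#   --no-auto-compact=true          # never rewrite old volumes; Glacier can't serve them on demand
#   --no-backend-verification=true  # skip the file listing/check before and after each backup
#   --backup-test-samples=0         # don't download sample volumes for post-backup testing
# and set retention to keep all versions, so nothing ever needs deleting or compacting.
```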