How do I check my backup's integrity? - Antivirus deleted some archives

Hello,
I recently installed a new antivirus, but did not exclude my backup drive before scanning (my old AV had started to detect false positives). Bitdefender, too, decided that some archives contained malicious content, and so deleted them.

I used verify to check the backups:
duplicati.txt (github.com)

but I could not work out how to use list-broken-files. Via the GUI's Commandline option I get:

  ErrorID: FiltersAreNotSupportedForListBrokenFiles
  Filters are not supported for this operation
  Return code: 100

Via the actual command line, I instead get asked for an encryption passphrase, which I don't have set as far as I know.

Do I need to do anything more, or did verify fix the issues?

Thanks!

Welcome to the forum @thelewiss1

If your Source screen has filters, e.g. to exclude things, delete them for this operation in Commandline.

You might have tried to write a CLI line from scratch. It is better to start from an Export As Command-line, because that will pick up essential information that matches the GUI backup, including a lack of encryption. Encryption for local files may seem pointless, but it might have kept the antivirus from disliking the file content.
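For illustration only (the destination URL, source path, and database path below are placeholders, not your actual job), an exported backup line looks roughly like the first one, and for list-broken-files you keep the destination and options but drop the source and filter arguments:

  REM rough sketch; the real values come from your own Export As Command-line
  Duplicati.CommandLine.exe backup "file://D:\Backup" "C:\Users\you\Documents" --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite" --no-encryption=true

  REM same destination and options, different command, no sources or filters
  Duplicati.CommandLine.exe list-broken-files "file://D:\Backup" --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite" --no-encryption=true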

I don't think verify fixes things; however, from testing, it should pop up an error if any files are truly missing.
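If you want a fuller check than the usual sampling, the CLI test command can be told to read and verify everything; a sketch with placeholder paths:

  REM "all" verifies every remote volume instead of a small sample
  Duplicati.CommandLine.exe test "file://D:\Backup" all --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"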

What's worrisome is the file of errors that you posted. A few of your files are mysteriously the wrong size. What are the dates on those? Maybe instead of deleting whole archives, the AV deleted a few zip parts. That would mean you've lost whatever data was in them. You can use the affected command on them.

Example of use here. Possibly just hiding the 8 files by prefixing their names with hidden- would let you run list-broken-files and get the same sort of output without having to retype the names into the affected command.
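A sketch of both approaches, with a placeholder destination and a made-up volume name:

  REM ask which source files a damaged remote volume touches
  Duplicati.CommandLine.exe affected "file://D:\Backup" duplicati-bXXXX.dblock.zip --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"

  REM or hide a volume so list-broken-files treats it as missing
  ren "D:\Backup\duplicati-bXXXX.dblock.zip" hidden-duplicati-bXXXX.dblock.zip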

If you can stand losing older file versions, you could delete the database and the backup files for a fresh start. Adding encryption at that time would probably keep the AV from flagging what I guess are false positives.

Hi @ts678

I ran AFFECTED (without changing any parameters) and it tells me no files are affected.

I tried running LIST-BROKEN-FILES again, ensuring there were no filters, but it still fails and tells me that filters are not supported.

Bitdefender is able to delete parts of archives, so it is likely that is what happened.


I agree that the file sizes look wrong. I assume this only applies to the .dblock.zip files, and not to the dlists? I've attached a picture of some dlists as well.

If I really need to, I’m happy to restart my backups.
Thanks!


I apologise for the separate replies - I can't embed more than 3 images at once.

There is still an exclude filter in the screenshot. Delete the filter using the x button.

AV will clobber whatever it dislikes; however, an unencrypted dblock .zip might look more suspicious.
The dlist and dindex files are pretty much just text, whereas the dblock files contain blocks from source files.

Reading images is a really hard way to find file times. Explorer or the dir command is far easier…
I did find three, rather close together, which suggests this is the AV damage. Are the times about right?

duplicati-b78aa27db10e74f1c9ecaa63e22b82e3e.dblock.zip 2020-12-10 21:28
duplicati-b3eee543686934942bb60fe1ac000d3d2.dblock.zip 2020-12-10 22:34
duplicati-b585bf5aaf30a4294b7b320ebb0bf4e56.dblock.zip 2020-12-10 22:04
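For reference, a cmd line like this (the backup path is a guess at yours) lists the dblock files oldest-first by last-write time:

  dir /O:D /T:W "D:\Backup\duplicati-*.dblock.zip"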

Hi there,
I'll use the command line in future.
Information on those files as follows:

2020-12-10  21:28        52,324,022 duplicati-b78aa27db10e74f1c9ecaa63e22b82e3e.dblock.zip
2020-12-10  22:34        52,278,639 duplicati-b3eee543686934942bb60fe1ac000d3d2.dblock.zip
2020-12-10  22:04        52,169,315 duplicati-b585bf5aaf30a4294b7b320ebb0bf4e56.dblock.zip

Those times are correct.

It's very long, but here's a listing of my Backup directory, if it is helpful:
https://gist.githubusercontent.com/thelewiss1/b343ded2aad8c7fe3f259b5a92123038/raw/ef55f8610ec9673cf9674e763112b729a34cd292/dir.txt

LIST-BROKEN-FILES returns:
listbrokenfiles (github.com)

Thanks!

I'd hope so, since I read them from your screenshot. It's the other 5 that I didn't find. Are the other 5 near these, with all 8 at the time when the AV ran? That would solidify the idea that the AV did the damage.
OK, since I had the file listing (lots of files…), I looked up the other 5, and the times are very nearby.

duplicati-b4000bf8bd9f648cb999efdf5e8560a15.dblock.zip 2020-12-10 22:33
duplicati-b78aa27db10e74f1c9ecaa63e22b82e3e.dblock.zip 2020-12-10 21:28
duplicati-b3eee543686934942bb60fe1ac000d3d2.dblock.zip 2020-12-10 22:34
duplicati-b585bf5aaf30a4294b7b320ebb0bf4e56.dblock.zip 2020-12-10 22:04
duplicati-b506bc89270784402930ee29ca2f8d8a7.dblock.zip 2020-12-10 22:13
duplicati-b94c43a3d4a384afcb4241c304bd6aaa3.dblock.zip 2020-12-10 20:55
duplicati-b50dfefc6e0914138a87b46d96495a98a.dblock.zip 2020-12-10 22:12
duplicati-b4561f22d22c0493fabd241b001d81a61.dblock.zip 2020-12-10 22:27

Having the list with a nice time format at the left also made it easy to sort the files by time.
Typically a backup churns out files at a fairly steady pace, and writes a dlist near the end.
The view here is different. It looks like the AV may have corrupted old files between backups.
You can see the 8 wrong-size files sitting by themselves, seemingly between the backups:

2020-12-10  17:15           803,410 duplicati-ice3a7d864cf8469d82b48caaf0fde015.dindex.zip
2020-12-10  17:15           852,587 duplicati-i880d73a7160a414daf8367e742ccff42.dindex.zip
2020-12-10  17:18        28,236,925 duplicati-20201209T230002Z.dlist.zip
2020-12-10  17:18            78,836 duplicati-ie2cabeb8cb4e419f919cf538df05af54.dindex.zip
2020-12-10  20:55        52,380,237 duplicati-b94c43a3d4a384afcb4241c304bd6aaa3.dblock.zip
2020-12-10  21:28        52,324,022 duplicati-b78aa27db10e74f1c9ecaa63e22b82e3e.dblock.zip
2020-12-10  22:04        52,169,315 duplicati-b585bf5aaf30a4294b7b320ebb0bf4e56.dblock.zip
2020-12-10  22:12        52,336,938 duplicati-b50dfefc6e0914138a87b46d96495a98a.dblock.zip
2020-12-10  22:13        52,405,431 duplicati-b506bc89270784402930ee29ca2f8d8a7.dblock.zip
2020-12-10  22:27        52,381,721 duplicati-b4561f22d22c0493fabd241b001d81a61.dblock.zip
2020-12-10  22:33        52,331,048 duplicati-b4000bf8bd9f648cb999efdf5e8560a15.dblock.zip
2020-12-10  22:34        52,278,639 duplicati-b3eee543686934942bb60fe1ac000d3d2.dblock.zip
2020-12-11  00:19            37,204 duplicati-i5769ee6ac08c4c30b926af5e67caeb7f.dindex.zip
2020-12-11  00:19            39,004 duplicati-i4dda2a3371574e7da35579f44095feff.dindex.zip
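If you want to pull just those 8 back out of dir.txt, one findstr does it; the search strings are the unique leading parts of the names above:

  findstr "b4000bf8 b78aa27d b3eee543 b585bf5a b506bc89 b94c43a3 b50dfefc b4561f22" dir.txt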

It looks like it's not going to be possible to trace issues back to the source level just from files being the wrong size.
You can copy and paste the 8 names into the affected command, or hide the 8 for a list-broken-files. Generally I just rename with a prefix, e.g. hidden-, but for 8 files, moving them to another folder may work; see the sketch below.
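For the moving option, a sketch run from inside the backup folder (interactive cmd; double the % signs if you put it in a .bat file):

  md damaged
  for %f in (duplicati-b4000bf8* duplicati-b78aa27d* duplicati-b3eee543* duplicati-b585bf5a* duplicati-b506bc89* duplicati-b94c43a3* duplicati-b50dfefc* duplicati-b4561f22*) do move "%f" damaged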

The test result seemed similar with either command. You want to see what the loss of these files means.
If you prefer the affected command, the easiest starter is Commandline: change the Command from backup to affected, then change the source file list to name the 8 damaged files, one per line.

Or do it both ways if you like, to see whether you also get similar results. If the loss is acceptable, then just take out the damaged files (renaming or moving effectively takes them out) and run a purge-broken-files.
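In CLI form (placeholder paths again), the sequence after hiding or moving the damaged files would look roughly like:

  REM preview which source files and versions the missing volumes break
  Duplicati.CommandLine.exe list-broken-files "file://D:\Backup" --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"

  REM then rewrite the dlists to drop the broken entries
  Duplicati.CommandLine.exe purge-broken-files "file://D:\Backup" --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"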

Disaster Recovery gets into these areas, and the topic applies here because you have a corrupted backup.

Hi there.

AFFECTED: gist:d9816fe6ac2b0b743df44360eaa67aad (github.com)

I’m not too concerned, so I think I’ll purge them, unless you have any objections or notes I should see beforehand.

Thanks!

Sounds good. I hope it goes well, and that your AV doesn't damage any files again.
If you ever do need to restart fresh, encrypting the backup would probably prevent a repeat.
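If you ever go that way, setting a passphrase is all it takes to turn encryption on (AES is the default module); a placeholder sketch:

  Duplicati.CommandLine.exe backup "file://D:\Backup" "C:\Users\you\Documents" --passphrase="pick-a-strong-one" --dbpath="C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"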

If you're super curious, you could examine the seeming damage in some other way.
Since you are not too concerned, the easiest path might just be to purge those files.


Hi there!

I ran the PURGE-BROKEN-FILES command, but it doesn’t seem to exhibit the same behaviour as in the Disaster Recovery doc.
PURGE-BROKEN-FILES (github.com)

Thanks

Since you only used the affected path, did you do the list-broken-files setup (the first part here)?

Your output still complaining about file sizes suggests you left the files there. The Disaster Recovery doc removed them.

Ah, I copied them instead of moving them. My mistake.
After running LIST-BROKEN-FILES again, it shows the list of affected files.
I then ran PURGE-BROKEN-FILES, but it returned:

  Listing remote folder ...
  Backend quota is close to being exceeded: Using 851.31 GB of 931.51 GB (79.88 GB available)

  ErrorID: CannotPurgeWithOrphans
  Unable to start the purge process as there are 184720 orphan file(s)
  Return code: 100

This seems to be pretty rare. I'm sorry it happened to you. Of the two reports I could find, one was also from AV damage, though I don't know why AV damage would be anything special. Regardless, it had a workaround:

When on your Database page, instead of simply doing Recreate you could consider making a copy of the database beforehand just in case something goes wrong in the Recreate and we want to try old DB again.
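The copy itself is just a file copy while Duplicati is stopped; on Windows the job databases live under the profile with random names, so this is a placeholder:

  copy "%LOCALAPPDATA%\Duplicati\XXXXXXXXXX.sqlite" "%LOCALAPPDATA%\Duplicati\XXXXXXXXXX.sqlite.bak"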

Maybe a Repair would help, but it's never clear what Repair can fix, and it sometimes harms the destination. Recreate doesn't always work well, but at least it doesn't change the destination (the actual backup data).
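If you do try Repair first, the CLI form has the same shape as the other commands (placeholders again):

  Duplicati.CommandLine.exe repair "file://D:\Backup" --dbpath="%LOCALAPPDATA%\Duplicati\XXXXXXXXXX.sqlite"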

You could also post a DB bug report so someone can maybe make some sense of its damage, but that's not guaranteed to be possible, and it probably also won't lead directly to knowing how things got that way.

To benefit anybody following this chase, I think this is the detection code. There's one table listing files. A file may appear multiple times if different versions exist. Each of those entries should be referenced from a different table, organized by backup version, which records the files each version contains. In this case, some file entries had no containing backup version.
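If anyone wants to look at the orphans themselves, something like this against a copy of the job database should approximate the check; I'm assuming the File and FilesetEntry table names from that code, and that sqlite3.exe is on the PATH:

  REM count file entries that no backup version (fileset) references
  sqlite3 "%LOCALAPPDATA%\Duplicati\XXXXXXXXXX.sqlite" "SELECT COUNT(*) FROM File WHERE ID NOT IN (SELECT FileID FROM FilesetEntry);"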

I copied the DB.

I told it to recreate the database, which spat out this error:
DUPLICATI-RECREATE-ERROR (github.com)

To be honest, I might just start from scratch. It seems the easiest way out. I don't think I'll need anything from previous versions.

Thanks!