Completed 1 TB backup now taking endless time, reading whole files again

Hi, I have a 1 TB audio-file backup set, going from one disk to another disk. It was complete. Some time after March, Duplicati suddenly decided to read the whole thing again; I see constant I/O of ~20 MByte/s. Since then I have always cancelled the backup after several hours, because at that rate it would take days. It’s sad, because maybe only 2 GB of data have been added, which could be backed up quickly.

I added --disable-filetime-check=true, but it still seems to read all files completely. In the backup destination folder the files are dated March, so it does not seem to be writing anything new there.
Why does it read the whole file contents?

Thx

The --disable-filetime-check documentation doesn’t say exactly what it really does, but the code suggests it makes Duplicati ignore timestamps and examine every file, which (if correct) means adding it will make things worse rather than better. Does File Explorer show any recent modified dates on the source files?
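
Roughly, the decision in question looks something like this sketch (illustrative Python, not Duplicati’s actual code; the function and parameter names are made up):

```python
import os

def will_read_file_fully(path, last_backup_time, disable_filetime_check):
    """Illustrative sketch of a timestamp-based change check (not Duplicati's code)."""
    if disable_filetime_check:
        # With the timestamp shortcut disabled, every file gets opened and read.
        return True
    # Otherwise only files whose modification time is newer than the last
    # backup are read in full; unchanged files are skipped quickly.
    return os.path.getmtime(path) > last_backup_time
```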

You can find out exactly what it thinks about a given file if you use verbose logging to show the file examination results.

Backups take very long with large data sets shows sample output and how you might set up the logging.
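
If you end up with a large verbose log file from that, a rough helper like this can pull out just the lines mentioning one file (purely a convenience script, nothing Duplicati-specific; the log path and file path are whatever you used):

```python
import sys

def grep_file_decisions(log_path, wanted_path):
    """Print every log line that mentions the given source file."""
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            # Verbose backup logs name each examined file and why it was
            # included, skipped, or re-read.
            if wanted_path in line:
                print(line.rstrip())

if __name__ == "__main__":
    grep_file_decisions(sys.argv[1], sys.argv[2])
```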

Does it show files going by in the GUI? If not, your I/O might be something other than it reading your files.

For a high-tech way of seeing what it’s actually doing on disk, you can use Sysinternals Process Monitor.
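
If you’d rather script a rough check than click through a GUI, something like this (using the psutil package; the process name is an assumption, adjust it to whatever your Duplicati process is actually called) at least shows whether the heavy I/O is coming from the Duplicati process, though it won’t show which files:

```python
import time
import psutil

def watch_io(process_name="Duplicati", interval=5):
    """Print the approximate read/write rate of the first matching process."""
    proc = next(p for p in psutil.process_iter(["name"])
                if p.info["name"] and process_name.lower() in p.info["name"].lower())
    prev = proc.io_counters()
    while True:
        time.sleep(interval)
        cur = proc.io_counters()
        print(f"read {(cur.read_bytes - prev.read_bytes) / interval / 1e6:.1f} MB/s, "
              f"write {(cur.write_bytes - prev.write_bytes) / interval / 1e6:.1f} MB/s")
        prev = cur

if __name__ == "__main__":
    watch_io()
```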

Yeah thx, but for now I’ve more or less given up. Maybe I’ll check at a later time, but at the moment I’m simply not running those backups. The same thing happens on another set: 450 GB, which took a long time to back up in the first place. Now, months later, a subsequent backup goes through everything again at 10 MByte/s disk-to-disk, which (if I’m not wrong) means the finished backup would need 27 full days from now. I’m thinking about deleting all the affected backup sets and spending the next months recreating them. IDK

I cannot look at a job log, because it will only be written in 27 days when the backup finishes. The “Remote” log is full of entries like:

  1. Aug. 2019 15:29: put duplicati-i3e8de79488a6485e813e56a16c48286b.dindex.zip.aes
  {"Size":18397,"Hash":"JenRkUHJTsezHCWoOejgR0qVW49br0g0AIDeNX28LzU="}

As for the 1 TB backup mentioned in the subject, which would take more than 2 months of backup time: I was sick of it all and just file-copied it via Total Commander, unencrypted, because encryption wasn’t too important there.

??? The log I’m suggesting isn’t the default job log presented at the end of the backup. About → Show log → Live is available at any time. Looking at file modification times with OS tools should also be readily available even while the backup runs.

??? Is it specifically that one file, repeating over and over, or are you just saying there are files being uploaded?

Even if something is triggering a re-scan of lots of files, what I would expect to be written is just your 2 GB of new data packed into dblock files (default 50 MB), with a dindex for each, and a dlist at the end listing all your files…
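
Rough arithmetic on what that should mean at the destination, just restating the defaults above:

```python
new_data_mb = 2 * 1024        # roughly 2 GB of added audio files
dblock_size_mb = 50           # Duplicati's default remote volume size
dblocks = new_data_mb / dblock_size_mb
print(f"about {dblocks:.0f} new dblock volumes, one dindex per dblock, and one dlist")
# about 41 new dblock volumes, one dindex per dblock, and one dlist
```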

Is the 2 GB of new data purely added files? If you’re editing old files or changing metadata, that’s a modification.
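
If you want a quick scriptable way to see what the OS thinks changed since March (a stand-in for checking modified dates in Explorer; the folder path and cutoff date are just examples):

```python
import os
from datetime import datetime

cutoff = datetime(2019, 3, 1).timestamp()   # "since March", as an example
root = r"D:\Audio"                          # assumed source folder

for dirpath, _, filenames in os.walk(root):
    for name in filenames:
        path = os.path.join(dirpath, name)
        mtime = os.path.getmtime(path)
        if mtime > cutoff:
            print(datetime.fromtimestamp(mtime), path)
```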

I don’t know. Duplicati reads and writes tons of temp files like AppData\Local\Temp\dup-6656d2a9-a8e4-49cc-ae5a-628e392672f3; somewhere in between there are some reads of the “to backup” files, and at times it writes files like “\Device\HarddiskVolume8\Backup\Duplicati\Michel\duplicati-b39d4f70131ad41ebaa1af8aeebc8605e.dblock.zip.aes” to the backup target.
About 4000 read/write events per second.

Please update when you are able to look. I can’t comment more on results without knowing the conditions.

Thx, I can’t afford the time. I’ve decided to do the huge backups on a local disk / rsync / TrueCrypt volume basis, and the smaller ones via cron / Duplicati / hosting.