Excessive SSD Writes

I recently built a new PC and have been trying to figure out how to continue using the backup sets I’d used previously on my old computer. The files are in exactly the same location with all the same names, so I figured I could just rebuild the database and continue.

The first issue is that the database never seems to finish rebuilding; I left it to run overnight and it never finished. It’s only a 380 GB backup set, and it should only be going to my local NAS.

While investigating this I noticed that during that time it had averaged 70.28 MB/s of writes to my brand new SSD, writing out a total of 2.3 TB, or 6.2x the actual size of the backup set, before I killed the process. These writes went to both C:\Users\<user>\AppData\Local\Temp\ and C:\Users\<user>\AppData\Local\Duplicati. It ‘only’ wrote out a total of 7.16 GB to Temp and 942 MB to Duplicati’s folder, so I guess it was constantly overwriting itself.

What am I doing wrong, and why is Duplicati writing so excessively?

Screenshots of HWiNFO and Process Explorer

Out of curiosity, why not just transfer the database itself from the old computer? This way you won’t have to recreate anything at all.

Regarding the problem of lengthy database recreation, there are a couple of possibilities. A flaw in the current beta release may (under some circumstances) require all dblocks to be downloaded, which is very time consuming. If you are hit by this bug, the latest canary may work better, but remember it is more bleeding-edge, so you may be affected by other bugs.

But if I were you I would transfer the Duplicati databases from the old computer: both Duplicati-server.sqlite and the job-specific sqlite files. (Make sure Duplicati is not running when you transfer them.) If successful, your backups should just work right away.
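If it helps to see the idea concretely, here is a rough sketch, assuming the default Windows data folder and that you have a copy of the old machine’s Duplicati folder available somewhere (the OLD_DATA path below is just a placeholder):

```python
# Sketch only: copy Duplicati's databases from the old machine's data folder
# into the new machine's data folder. Both paths are assumptions based on the
# default Windows location; adjust them to match your setup.
# Make sure Duplicati is fully stopped on the new machine before running this.
import shutil
from pathlib import Path

OLD_DATA = Path(r"D:\old-pc\AppData\Local\Duplicati")        # placeholder: copy of the old machine's folder
NEW_DATA = Path(r"C:\Users\<user>\AppData\Local\Duplicati")  # default location on the new machine

NEW_DATA.mkdir(parents=True, exist_ok=True)
for db in OLD_DATA.glob("*.sqlite"):   # Duplicati-server.sqlite plus the job-specific databases
    shutil.copy2(db, NEW_DATA / db.name)
    print(f"copied {db.name}")
```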

Thanks for your reply.

I didn’t think to copy it over when moving all the data, as not all of the backup sets will be identical. I started testing with a set that I knew was identical and in the same location to make sure everything was working alright before I tried to tackle the others that would have moved data.

I don’t think the flaw you mentioned is the one affecting me, if only because my NAS is local over gigE; even if it had to redownload everything from it, that shouldn’t take much time anyway (roughly ~1 hour). And it wouldn’t explain why it wrote 6 times the size of the data to the hard drive over a 10-hour period.

As for canary, I’m not willing to run it with important data.

Really hoping I don’t need to start the backups from scratch.

How different are the backup sets? If you are setting them up completely differently, as in splitting or combining backup sets compared to your old computer, then it may require redoing the back end completely.

If it’s just pathing differences or minor folder additions/deletions, it shouldn’t be a problem: in that case I still would have started by copying the Duplicati-server.sqlite database (which contains job definitions) and also job-specific sqlite files. You could then adjust as needed on your new system.

When recreating the database there is a lot more disk activity than just downloading the remote blocks. If you were hit by the flaw mentioned above, then each dblock that is processed creates a lot of disk activity on the sqlite journal, from what I recall. It’s also slow, and this is why database recreation can take so long. (The bug has since been fixed, but the fix is not yet available in the beta channel.)

Yep, understood. I would not normally run canary in production either, but I find myself doing it (for now) as I was hit by the same database creation bug (and some other issues). I plan to switch back to beta once the next one is released.

“Empty source file can make Recreate download all dblock files fruitlessly with huge delay” #3747 is likely the flaw driving the entire dblock content of the backup from your NAS and through Temp, probably fast given your equipment. As @drwtsn32 noted, there’s a beta with the fix expected, but it’s not here yet…

Although it’d be nice if one of the other suggested solutions worked for you, it’s also possible to move things around, including the database location and temporary files, at least until the Recreates are done.

Canary at the moment (and it varies) is nearing a point where it might go to Experimental and then Beta, and unless you’re in a GMT+# time zone it seems quite good, except for some bad bugs with “Stop after current file”.

So those are some of the non-ideal options now. Long-term, there might be ways to reduce Temp folder writing. Meanwhile, some folks who worry about SSD writes have had Duplicati use a RAM disk instead. Obviously you wouldn’t casually put any databases there, or you might be recreating them even more…
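As a rough sketch of that temp-redirect idea, assuming a RAM disk mounted as R:\ and Duplicati’s default install path (both of which are just my guesses): on Windows, .NET’s temp folder follows the TMP/TEMP environment variables, and I believe there is also a tempdir advanced option you could set on the job instead.

```python
# Sketch only: start Duplicati with its temporary files pointed at a RAM disk.
# R:\duplicati-temp and the install path are assumptions; adjust to your system.
# The sqlite databases stay in AppData; only the temporary working files move.
import os
import subprocess
from pathlib import Path

ramdisk_temp = Path(r"R:\duplicati-temp")   # assumption: RAM disk mounted as R:
ramdisk_temp.mkdir(parents=True, exist_ok=True)

env = os.environ.copy()
env["TMP"] = env["TEMP"] = str(ramdisk_temp)  # .NET's temp path follows these on Windows

# Assumed default install location of the tray icon executable.
subprocess.Popen([r"C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe"], env=env)
```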

It is all the same data, just in different places; I consolidated and moved things around due to differences in drive sizes and a reduction in the total number of drives. I’d done all this under the assumption this process would work; I didn’t think there’d be a bug in rebuilding the database, as I’d done so in the past without issue.

Redoing it completely would mean either losing all versioning from the previous machine, or duplicating content for a while, which I don’t have enough space to do for the ~30-day window I keep local copies for. In the cloud this would be less of an issue; it would just cost me more money, which isn’t ideal either.

I’ll give this a try. I’m also wondering what would happen if I chose to restore a directory from the backup: would it just download the database it has backed up there instead and then use that going forward?

Does this mean that if there’s a 0-byte file somewhere in my backup it’ll cause the issues I’ve experienced? I was initially thinking I could find the 0-byte file locally and just delete it, but if it’s recreating the database, the file causing the issue must already be in the backup. Damn.

I guess I’ll see if I can make use of the old databases.

Thanks guys

Reorganizing the data isn’t a big deal; it’s splitting a backup set into multiple sets, or combining several sets into a single one, that isn’t really doable, because Duplicati must store each backup set in a different back-end location.

If all you did was reorganize files, then I think you could have definitely copied the sqlite databases from the old computer. After starting up Duplicati you would then want to reconfigure the backup set to correct the source folder locations. Duplicati would undoubtedly need to reprocess any data that has been moved around (so the first backup job may take longer than normal), but it wouldn’t need to re-upload blocks that already exist on the back end.
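Not Duplicati’s actual code, but a minimal sketch of why moving files around only costs a re-scan rather than a re-upload: roughly speaking, data is split into blocks tracked by hash, and a block whose hash is already known at the back end is not uploaded again.

```python
# Conceptual sketch only (not Duplicati's implementation): block-level dedup.
# Files are read in fixed-size blocks; each block is identified by its hash,
# and only blocks whose hashes aren't already at the back end get uploaded.
import hashlib
from pathlib import Path

BLOCK_SIZE = 100 * 1024   # illustrative block size
known_hashes = set()      # stand-in for "blocks already on the back end"

def blocks_to_upload(path: Path):
    new_blocks = []
    with path.open("rb") as f:
        while chunk := f.read(BLOCK_SIZE):
            h = hashlib.sha256(chunk).hexdigest()
            if h not in known_hashes:   # moved/renamed files mostly hit the known set
                known_hashes.add(h)
                new_blocks.append((h, chunk))
    return new_blocks
```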

You don’t need to redo it in my opinion.

I’m not exactly sure what you mean here. If you mean using the “restore from backup files” option and pointing it at the back-end data, Duplicati will end up creating a small temporary database in order to facilitate the restore. Since your ultimate goal is to continue backups on this new machine, this isn’t the best approach. Either copy the databases from the old machine or do a full database recreation. Obviously the latter isn’t working so well for you with your beta version. Try copying the databases instead. Good luck!

Well, 3 out of 4 worked fine. The 4th is failing with an error unrelated to anything discussed previously.

It keeps failing with the error “One or more errors occurred.”, and when you view the log it doesn’t even show that the backup ran at all. Using the log-file option to output a text log, I was able to figure out it was failing on C:\Users\user\NTUSER.DAT, which is supposed to be excluded from the backup.

Hitting all the bugs! The difference between my old PC and the new one is 2.0.4.5_beta_2018-11-28 vs 2.0.4.23_beta_2019-07-14, which was supposed to have been a very minor change related to Amazon. So it’s probably not that.

Will have to experiment with this tomorrow.

Likely so, and they are probably not rare, e.g. Windows’ default New Text Document.txt is empty.
To say for sure would require testing with a release that has the “Check block size on recreate” #3758 bug fix.

Other ways to get logs are About → Show log → Live → Retry (or another level), or About → Show log → Stored. A non-run can also come from configuration issues if you rebuilt the configurations manually.
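If you’re already using the log-file option, something like this quick sketch can pull the warning and error lines out of a large text log (it only assumes the level name shows up somewhere in each line):

```python
# Small helper sketch: skim a Duplicati text log (written via the log-file
# option) for lines that look like warnings or errors. No particular log
# format is assumed beyond the level name appearing in the line.
import sys
from pathlib import Path

def skim(logfile: str):
    for line in Path(logfile).read_text(errors="replace").splitlines():
        if "warning" in line.lower() or "error" in line.lower():
            print(line)

if __name__ == "__main__":
    skim(sys.argv[1])  # e.g. python skim_log.py duplicati.log
```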

I don’t know if that is a bug exactly. The reason it is not backing it up is probably because you are not using snapshots (VSS). That file will be locked while you are logged in, so without snapshots Duplicati cannot back up open files.

Or do you mean you have configured exclusions and they aren’t working correctly?
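If it is the locked-file case, the usual fix is to enable VSS via the snapshot-policy advanced option on the job (Duplicati needs to run with administrator rights for that). As a rough command-line sketch of the same thing, with the executable path, back-end URL, and source folder all placeholders:

```python
# Sketch only: run a backup with VSS snapshots enabled via the snapshot-policy
# advanced option. The executable path, back-end URL, and source folder below
# are placeholders; substitute your real job settings.
# Needs to run elevated (administrator) for VSS to work.
import subprocess

cmd = [
    r"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe",  # assumed install path
    "backup",
    r"file://\\NAS\backups\pc",    # placeholder back-end URL
    r"C:\Users\<user>",            # placeholder source folder
    "--snapshot-policy=on",        # use VSS so locked files like NTUSER.DAT can be read
]
subprocess.run(cmd, check=True)
```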

Another potential example might be a lock file, I suppose, which I know I have many of for various applications in different locations. Thanks for your help.

Yes, the exclusion isn’t working correctly. I’ve tried a few different ways and it always tries to back up the file anyway. The exclusions do, however, seem to work on other files just fine.

However, your mention of VSS does explain other errors I had in one of the other 3 backup sets, which I’d fixed by just excluding folders I didn’t really care to back up. I didn’t realize VSS wasn’t on by default in a default install of Windows 10, and now that you mention it, it reminds me of when I first installed Duplicati on my old machine years ago. Thank you.