Hello, thanks to https://www.duplicati-monitoring.com I noticed that a backup which previously completed in 2-3 hours now, after six months and 180 versions, takes 6-7 hours.
The number of files and the total size are the same as 6 months ago:
Backup times 6 months ago:
Backup times now (after 180 versions):
The SQLite database size was 17 GB, so I thought: what if I vacuum the DB? I'm using 22.214.171.124, so vacuum is not run automatically.
The vacuum took 1.5 hours, and during the process it used 47 GB on the HDD (1x the original DB, 1x the new file in temp, 1x the journal for the original file).
After the vacuum, the DB size is 15.3 GB.
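For anyone who wants to try this manually: VACUUM is a plain SQLite statement, so it can be run with the sqlite3 command-line tool against the Duplicati database file while Duplicati is stopped. The sketch below demonstrates the effect on a throwaway database (the temp path and table are made up for the demo); on a real Duplicati .sqlite file you would just run the single `VACUUM;` statement.

```shell
# Create a throwaway database, fill it with ~4 MB of blobs, then delete
# the rows. SQLite keeps the freed pages inside the file, so it stays big.
db=$(mktemp /tmp/vacuum_demo.XXXXXX)
sqlite3 "$db" "CREATE TABLE t(x);
WITH RECURSIVE c(i) AS (VALUES(1) UNION ALL SELECT i+1 FROM c WHERE i<1000)
INSERT INTO t SELECT zeroblob(4096) FROM c;
DELETE FROM t;"
before=$(wc -c < "$db")

# VACUUM rewrites the database into a new file without the free pages,
# which is why it temporarily needs extra disk space (old file + new file
# + journal), as seen above with 47 GB for a 17 GB database.
sqlite3 "$db" "VACUUM;"
after=$(wc -c < "$db")

echo "before=${before} bytes, after=${after} bytes"
```

The same shrink effect is what reduced the 17 GB Duplicati database to 15.3 GB.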
So I started a backup to see if there is any improvement…
Backups after the vacuum are in the red rectangle.
So the speed improvement is noticeable.
Maybe Duplicati should inform the user about running vacuum after a certain number of versions? It's a pity that after 126.96.36.199, backups will gradually slow down for all users…