I needed to delete about 16 GB from a folder that was recently added to the backup by mistake.
(Windows 10, Duplicati 18.104.22.168_beta_2018-11-28)
I managed to run purge via the GUI, and it looks like the problem folder has been deleted from the backup (I no longer see it during a restore), but the problem .aes file stays the same large size no matter how many times I run the compact command.
I think the compact command does not force a compact; it behaves like the usual auto-compact, which only compacts if the configured parameters call for it. See my first link, and this example of forcing a compact.
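One way to force a compact is to run the compact command with a very low wasted-space threshold, so that almost any volume qualifies for repackaging. A minimal sketch for Windows; the storage URL and passphrase are placeholders you would replace with your own job's values:

```shell
:: Hypothetical example: force a compact by lowering the wasted-space
:: threshold (default is 25%) so even slightly wasteful volumes are
:: repackaged. Replace the URL and passphrase with your job's values.
Duplicati.CommandLine.exe compact "file://D:\Backups\MyJob" ^
  --passphrase="your-passphrase" ^
  --threshold=1
```

Run it with the backup job stopped, since the command-line tool and the GUI job share the same local database.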
Does that mean you set an extra-large remote volume size (--dblock-size) on the job's Options screen? Choosing sizes in Duplicati gets into that. A large remote volume size can mean slow restores, especially when a file's incremental changes are spread across a number of large files that all need to be downloaded.
Thanks for the reply, and I am sorry for the delayed answer.
Your links explain everything.
I solved my problem by deleting the new large .aes file and repairing the database. I did not even lose anything.
I have a 50GB --dblock-size, which is a mistake I made when configuring the backup task back in 2017. I have tried to change it to 15GB, but that does not affect old files. I see now that I can fix those by forcing the compact command.
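For anyone hitting the same situation: changing --dblock-size only applies to volumes written after the change, so the old oversized volumes stay as they are until they get rewritten. A hedged sketch of combining the new volume size with a forced compact (storage URL and passphrase are placeholders, and I am assuming compact honours --dblock-size when it rewrites volumes):

```shell
:: Hypothetical example: rewrite existing oversized volumes at the new
:: 15GB size by forcing a compact with a very low wasted-space threshold.
:: Replace the URL and passphrase with your job's actual values.
Duplicati.CommandLine.exe compact "file://D:\Backups\MyJob" ^
  --passphrase="your-passphrase" ^
  --dblock-size=15GB ^
  --threshold=1
```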
By the way, I am quite happy with the restore speed, as my backup is not very large and is stored locally.