On my QNAP NAS the tmp folder has grown to 76GB:
[/share/CE_CACHEDEV1_DATA/.qpkg/Duplicati/tmp] # du -hs .
76G .
The oldest file is 7 days old.
The chunks I am uploading are 100MB for one backup and 50MB for another.
How can I clean up these files without destroying anything important?
I had similar issues with 2.0.3.6. It seems to me that the asynchronous-upload-limit parameter is no longer honored, so Duplicati creates more and more temp files until the whole system becomes inoperable.
I don’t believe Duplicati will retroactively clean up local temp files from previous runs. The easiest method is probably just to make sure Duplicati isn’t running any tasks and manually delete all dup-* files from your temp folder.
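A manual cleanup along those lines could look like the sketch below. This is not an official Duplicati tool, just a hedged example: it assumes the tmp path from the original post (adjust it for your system), and it only prints candidates by default so you can review the list before deleting. Make sure Duplicati is fully stopped first.

```shell
# Sketch only: list (and optionally delete) Duplicati temp files.
# TMP_PATH defaults to the folder from the original post; override as needed.
TMP_PATH="${TMP_PATH:-/share/CE_CACHEDEV1_DATA/.qpkg/Duplicati/tmp}"

if [ -d "$TMP_PATH" ]; then
  # Dry run: show dup-* files older than one day without touching them.
  find "$TMP_PATH" -maxdepth 1 -type f -name 'dup-*' -mtime +1 -print

  # Once you have reviewed the list, uncomment to actually delete:
  # find "$TMP_PATH" -maxdepth 1 -type f -name 'dup-*' -mtime +1 -delete
fi
```

The `-maxdepth 1` keeps the cleanup from descending into anything unexpected, and matching only `dup-*` avoids touching non-Duplicati files that might share the folder.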
Please let us know if being on 2.0.3.5 has “resolved” the issue (at least in the sense that you’re not still getting more temp files).
Of course retroactive cleanup isn’t such a bad idea… maybe something that just says “delete any dup-* files from temp that are older than the last STARTED backup time”.
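That idea can be sketched with a timestamp marker. To be clear, this is not an existing Duplicati feature; the `last-backup-start` marker file is hypothetical, standing in for wherever the last backup start time would be recorded.

```shell
# Sketch of the proposed retroactive cleanup (hypothetical, not a
# Duplicati feature): select dup-* files not newer than a marker file
# whose mtime records when the last backup STARTED.
TMP_PATH="${TMP_PATH:-/share/CE_CACHEDEV1_DATA/.qpkg/Duplicati/tmp}"
MARKER="$TMP_PATH/last-backup-start"   # hypothetical marker file

if [ -f "$MARKER" ]; then
  # "! -newer" matches files whose mtime is at or before the marker's,
  # i.e. temp files left over from runs before the current backup.
  find "$TMP_PATH" -maxdepth 1 -type f -name 'dup-*' ! -newer "$MARKER" -print
fi
```

Anything created by the currently running backup would be newer than the marker and therefore left alone, which is what makes this safer than a blanket delete.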
A likely fix for this has been released in version 2.0.3.8 canary as mentioned below - if you try it out, please let us know if it resolved the problem for you.