The worst that SHOULD happen is that you end up with some leftover local temp files that didn't get cleaned up correctly.
The compacting process basically does the following (there's a rough sketch of the flow after this list):
- downloads the necessary remote files
- uses the data to be kept to create fewer new compressed files (with new file names) and records their sizes in the local database (I think it’s at this point that the downloaded local copies are deleted)
- uploads the new files
- verifies the uploaded file sizes against the local database
- if the sizes match, the old files are flagged for deletion
- as part of the current (or a future) compact / cleanup run, the files flagged for deletion are actually deleted
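
To make the ordering concrete, here's a rough Python sketch of that flow. It is not Duplicati's actual implementation (Duplicati is written in C#); the `backend` / `local_db` objects and every helper name below are hypothetical and only illustrate the sequence of steps above.

```python
from pathlib import Path

# Hypothetical sketch of the compact flow described above -- not Duplicati's
# real code, just an illustration of the ordering of the steps.
def compact(backend, local_db, tmp_dir: Path):
    # 1. Download the remote volumes that still hold data worth keeping
    old_volumes = local_db.volumes_eligible_for_compacting()
    downloaded = [backend.download(name, tmp_dir) for name in old_volumes]

    # 2. Repack the kept data into fewer new compressed volumes (new names)
    #    and record each new volume's size in the local database
    new_volumes = repack_kept_blocks(downloaded, tmp_dir)  # hypothetical helper
    for vol in new_volumes:
        local_db.record_volume(vol.name, vol.size)

    # (presumably the downloaded temp copies are removed around here)
    for path in downloaded:
        path.unlink()

    # 3. Upload the new volumes
    for vol in new_volumes:
        backend.upload(vol)

    # 4. Verify the uploaded sizes against what the local database expects
    for vol in new_volumes:
        if backend.remote_size(vol.name) != local_db.expected_size(vol.name):
            raise RuntimeError(f"size mismatch for {vol.name}")

    # 5. Only after verification are the old volumes flagged for deletion;
    #    the actual delete happens in this (or a later) compact / cleanup pass
    for name in old_volumes:
        local_db.flag_for_deletion(name)
```

The important property is that nothing remote is flagged for deletion until the replacement volumes have been uploaded and size-checked.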
I think (but haven’t verified) that the upload size verification and the database flagging of deletable files are transactional (so they all succeed or all fail as a set), but if I’m wrong, it’s possible you could kill Duplicati at just the right moment between those steps so it never realizes those old files can be deleted…
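
If that hunch is right, the key is that the size-verification bookkeeping and the flag-for-deletion update get committed together. Here's a minimal sketch of that idea with SQLite (Duplicati's local database is SQLite, but the table, column, and state names below are made up for illustration, not its real schema):

```python
import sqlite3

def flag_old_volumes(db_path: str, verified_new: list[tuple[str, int]],
                     old_volume_names: list[str]) -> None:
    """Hypothetical illustration: record the verified sizes and flag the old
    volumes for deletion in one transaction, so a kill at the wrong moment
    leaves either both updates or neither."""
    con = sqlite3.connect(db_path)
    try:
        with con:  # one transaction: commits on success, rolls back on error
            for name, size in verified_new:
                con.execute(
                    "UPDATE remote_volume SET state = 'verified' "
                    "WHERE name = ? AND size = ?",
                    (name, size),
                )
            for name in old_volume_names:
                con.execute(
                    "UPDATE remote_volume SET state = 'deleting' "
                    "WHERE name = ?",
                    (name,),
                )
    finally:
        con.close()
```

If those two updates were instead committed separately, the window between them is exactly where a kill could leave the database thinking the old volumes are still needed.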