Since the initial backup takes days for large backup sets, is it OK to abort the backup at the end of the day and start it again the next day? Or does that risk corruption, and should I just stick to pausing the backup and suspending my laptop?
@mohak, I believe you’re safe from corruption caused by an “abort” - you’re just not yet fully backed up. If you “abort now”, any local prepping of the next backup batch is abandoned. If you “abort after next upload”, the batch currently being worked on is uploaded and any remaining files are simply not yet backed up.
The way it works at the moment, when the backup is kicked off again it will re-scan ALL files (just like at the start of every backup) and only transfer stuff that is seen as changed (or created) versus what’s already been backed up.
So if you aborted your 100M backup after only the first 10M had been backed up, then immediately re-started the job, the first 10M would be seen as having no changes and the backup would continue on from there.
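To make the re-scan behaviour concrete, here’s a minimal sketch in Python of the idea described above: every run walks all files and only queues the ones that look different from what was recorded by the previous run. All names here (`files_to_upload`, `backed_up`) are illustrative, not Duplicati’s actual API, and real change detection is more involved than a size/mtime comparison.

```python
import os

def files_to_upload(paths, backed_up):
    """Return only the paths that appear new or changed.

    backed_up maps path -> (size, mtime) recorded by the previous run;
    a path missing from it, or with a different size/mtime, gets queued.
    """
    pending = []
    for path in paths:
        st = os.stat(path)
        if backed_up.get(path) != (st.st_size, st.st_mtime):
            pending.append(path)  # new or changed since the last backup
    return pending
```

The point is that an aborted run costs you nothing but the re-scan: already-uploaded files compare equal and are skipped, so the next run naturally picks up where the last one left off.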
HOWEVER, if that first 10M is very active, or you wait quite a while between backup runs (with each run aborted after 10M), you could end up with multiple versions of that first 10M and NO backup of the other 90M.
Still no corruption, but you’re not fully backed up yet either.
It might be worth asking for a “continue-from-last-point” boolean to tell Duplicati to prioritize “the files after the last one backed up” before the files “at the beginning” of the scan order to help avoid having multiple versions of “early” files and NO versions of “later” ones.
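Such a “continue-from-last-point” option could be as simple as rotating the scan order so that files after the last completed one come first. A hypothetical sketch (the function name and behaviour are my own illustration, not an actual or proposed Duplicati feature):

```python
def prioritized_order(paths, last_completed):
    """Return paths starting just after last_completed, wrapping around.

    Files never yet reached get backed up first; the "early" files that
    already have a version are revisited at the end of the run.
    """
    ordered = sorted(paths)
    if last_completed not in ordered:
        return ordered  # nothing completed yet: use the normal order
    i = ordered.index(last_completed) + 1
    return ordered[i:] + ordered[:i]
```

With repeated aborted runs this would spread coverage across the whole backup set instead of re-versioning the same early files every time.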