I’ve spent the past week slowly uploading a few more dirs at a time from my 300G home folder. I was finally ready to upload the whole thing when I noticed that it was uploading cache files. I thought I had excluded these with a *cache exclude expression (not a regex). So I stopped the upload, went back and modified the exclude to cache, and backed up again.
The only problem was that I had mistakenly entered an exclude of *, so the total upload size was 50MB.
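For anyone wondering why a bare * swallows everything: non-regex exclude expressions are glob-style patterns, and * matches any path. Here’s a minimal sketch of that matching behavior using Python’s fnmatch (my own illustration, not Duplicati’s actual filter engine; the *cache* pattern is just an example):

```python
import fnmatch

paths = [
    "/home/me/.cache/thumbnails/img1.png",
    "/home/me/Videos/holiday.mp4",
    "/home/me/Documents/notes.txt",
]

# A glob like "*cache*" matches only the cache paths...
print([p for p in paths if fnmatch.fnmatch(p, "*cache*")])
# ['/home/me/.cache/thumbnails/img1.png']

# ...but a bare "*" matches every path, so excluding "*"
# excludes the entire source set.
print([p for p in paths if fnmatch.fnmatch(p, "*")])
# all three paths
```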
I’ve corrected that and am re-uploading. However, it’s taking quite a while on some video files that were among the first files I backed up and definitely haven’t changed since this whole process started.
Am I back at ground zero, or are these incremental backups just going to take a lot longer than I expected? (Do they need to rescan large files each time to make sure they haven’t changed?)
Or might this just be an issue because I stopped the backup partway through?
I’m not sure how many files it checks. Per the reply in How does Duplicati Resume After Interruption? (less scary title), I could see it checking files (maybe big ones) that might have been stopped short, but that’s not the case here. Possibly this backup will be a chance to get another report on what that looks like. Note it also has to enumerate all files, though that’s less of a job than scanning file contents for changes. Personally, I don’t tend to watch the GUI that closely.
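On the rescan question: as I understand it, incremental backup tools in this style decide whether to re-read a file by comparing stored metadata (size, modification time) against the current scan, and only read contents when the metadata differs. A rough sketch of that decision logic, with made-up names (file_signature, needs_rescan), not Duplicati’s actual code:

```python
import os
import hashlib

def file_signature(path: str) -> tuple:
    """Cheap metadata check: size and mtime, no file reads."""
    st = os.stat(path)
    return (st.st_size, st.st_mtime_ns)

def needs_rescan(path: str, stored: dict) -> bool:
    """Re-read content only if metadata changed since last backup."""
    return stored.get(path) != file_signature(path)

def hash_contents(path: str) -> str:
    """The expensive part: reading the whole file (only when needed)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```

So enumerating every file is unavoidable on each run, but files whose size and mtime match the stored values can be skipped without being read, which is why unchanged large videos normally shouldn’t cost much time.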