Welcome to the forum @charlespick
It’s supposed to just pick up where it left off and finish the backup. If your internet tends to drop, there are the number-of-retries and retry-delay options to configure, in an attempt to keep retrying until your internet returns.
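Conceptually, those two options work like the usual retry-with-delay pattern. Here’s a minimal Python sketch of that idea (the function and variable names are mine, not Duplicati internals; the `sleep` parameter is injectable just so the sketch runs instantly):

```python
import time

def upload_with_retries(upload, number_of_retries=5, retry_delay=10, sleep=time.sleep):
    """Retry a failing upload, roughly how number-of-retries / retry-delay behave.

    `upload` is any callable that raises on a network error.
    """
    for attempt in range(number_of_retries + 1):
        try:
            return upload()
        except OSError:
            if attempt == number_of_retries:
                raise           # out of retries: surface the error
            sleep(retry_delay)  # wait for the connection to come back

# Example: an upload that fails twice, then succeeds.
attempts = []
def flaky_upload():
    attempts.append(1)
    if len(attempts) < 3:
        raise OSError("connection dropped")
    return "done"

result = upload_with_retries(flaky_upload, number_of_retries=5,
                             retry_delay=10, sleep=lambda s: None)
print(result, len(attempts))  # → done 3
```

The point is just that a long enough number-of-retries times retry-delay window lets the backup ride out a brief outage instead of failing.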
This is unlikely. It’s not that big, and is prepared before upload.
How do you know how long it takes, if it hasn’t been done yet?
You can see upload times in an information level log, e.g. About → Show log → Live → Information
The last file upload to begin (though maybe not the last to finish) is a file with dlist and the date in its name.
Here’s my dlist upload. Yours is almost certainly a different size. Job log says I have 6874 source files:
2021-11-11 12:53:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20211111T175000Z.dlist.zip.aes (994.92 KB)
2021-11-11 12:53:33 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 994.92 KB in 00:00:07.6602037, 129.88 KB/s
2021-11-11 12:53:33 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20211111T175000Z.dlist.zip.aes (994.92 KB)
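The speed figure in that Profiling line is just size divided by elapsed time, and you can use it to roughly estimate how long a bigger upload would take on the same link. A quick check with the numbers from my log (the 50 MB figure below is just an illustrative dblock size, not from the log):

```python
size_kb = 994.92     # dlist size from the log above
seconds = 7.6602037  # elapsed upload time from the log above
speed = size_kb / seconds
print(f"{speed:.2f} KB/s")  # → 129.88 KB/s, matching the log

# Rough estimate for a larger upload at the same speed, e.g. a 50 MB file:
eta = 50 * 1024 / speed
print(f"about {eta:.0f} s")  # → about 394 s
```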
Unless the backup got lots more files, the dlist upload time (whatever it is) should be similar to past backups.
Every backup gets the full file list. What varies is the amount of changed data found in the source files.
Adding files to the initial backup in stages is actually a good strategy (but too late if you’re already doing the full initial backup), because it lets you prioritize what to back up first, and also gets a dlist file uploaded. Until the dlist is up, a local disaster that destroys the local database would leave you without information on the backup work.
Restoring files if your Duplicati installation is lost explains how to Direct restore from backup files; however, this involves recreating a temporary database from destination files, and it requires a dlist file.
Do you know roughly what the size or file count of your backup source is? Multiple days of uploading file contents over a slow link (is yours slow?) would be quite possible. If you need to stop intentionally, use the stop button.
Duplicati gets slow on some operations like Database recreate (and Direct restore to some degree) at the default 100 KB blocksize with larger backups (perhaps over 200 GB), because of all the tracking of blocks.
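To see why block tracking gets heavy, a bit of arithmetic on the figures above (200 GB backup, 100 KB default blocksize):

```python
GB = 1024**3
KB = 1024

backup_size = 200 * GB  # the "perhaps over 200 GB" figure above
blocksize = 100 * KB    # Duplicati's default blocksize
blocks = backup_size // blocksize
print(f"{blocks:,} blocks")  # → 2,097,152 blocks to track

# Raising blocksize cuts the count proportionally, e.g. with 1 MB blocks:
print(f"{backup_size // (1024 * KB):,} blocks")  # → 204,800 blocks
```

That’s over two million rows of block bookkeeping at the default, which is why large backups benefit from a bigger blocksize (though blocksize can’t be changed on an existing backup).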
The idea from @Xavron of having multiple backup applications is a good one if you want to worry less. Duplicati has gotten pretty reliable (though not quite perfectly reliable), but any backup software can hit problems.