Backup started at 3/14/2019 6:24:02 PM
Checking remote backup …
Listing remote folder …
Expected there to be a temporary fileset for synthetic filelist (1, duplicati-20190314T201815Z.dlist.zip.aes), but none was found?
Scanning local files …
48795 files need to be examined (102.47 GB)
48793 files need to be examined (102.33 GB)
48792 files need to be examined (101.79 GB)
…
1004 files need to be examined (0 bytes)
The above is the console output for the instance that is currently trying to run. It's been stuck at 1004 files (0 bytes) for 4 hours and seems to have stopped working.

I'm backing up from one local drive to another local drive, and I set the job to create one large file instead of a bunch of tiny ones. There is a total of 47 GB being backed up.
What is your volume size? Did you change it from the default 50MB?
I, as well as anyone else around here, will strongly recommend against doing it this way. Duplicati relies on the “bunch of tiny” files architecture pretty extensively in its method of doing incremental, versioned backups. When old files or versions are removed, for example, and one of the dblock files is no longer needed (or its remaining contents are repacked into a newer dblock file), the old one can be deleted. If your initial backup is one giant 47GB dblock file, that cleanup can never happen, and you'll probably run into a host of other issues as well.
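For a concrete picture of that cleanup step, this is roughly how you'd invoke compaction manually from the command line (a rough sketch; the storage URL and database path below are placeholders, not your actual settings):

Duplicati.CommandLine.exe compact "file://D:\Backups\MyJob" --dbpath="C:\Duplicati\MyJob.sqlite" --threshold=25

The --threshold value is the percentage of wasted space that must accumulate before dblock files get repacked (25 is the default). With a single 47GB dblock, "repacking" would mean rewriting essentially the entire backup.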
If it were me, I wouldn't recommend going over maybe 2GB for your volume size, and I only do that when backing up to a locally attached removable hard drive where bandwidth isn't as much of a concern. For remote backups, the largest volumes I use are 500MB, and the backup jobs that handle smaller or more frequently changing files are set to 200MB.
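In case it helps, the volume size is the --dblock-size advanced option (the “Upload volume size” box in the web UI). As a hedged example, with a made-up target URL and source path, a 500MB remote job from the command line would look something like:

Duplicati.CommandLine.exe backup "ftp://backup.example.com/myfolder" "C:\Users\me\Documents" --dblock-size=500mb

Note that changing --dblock-size on an existing backup only affects newly created volumes; the ones already uploaded keep their old size until they're compacted.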