Hi,
I’m running Duplicati 2.0.3.9_canary_2018-06-30.
I have a rather slow uplink and a large quantity of photos to back up, so I thought I’d start with the smallest photos first and slowly raise the size limit over time. I set this backup job up in the GUI and tried to use ‘skip-files-larger-than’ there as well as on the CLI, and the result is the same: the file sizes seem to be ignored.
Example - here is a directory I wish to back up. I want only the first file (3608644 bytes) to be backed up, and not the second (4049000 bytes):
$ ls -al /mnt/data/pics/mypics/
total 7516
drwxr-xr-x 2 myuser myuser 4096 May 31 17:21 .
drwxr-xr-x 359 myuser myuser 20480 Jul 13 02:11 ..
-rw-rw-r-- 1 myuser myuser 3608644 Feb 12 14:57 20180212-145742.jpg
-rw-rw-r-- 1 myuser myuser 4049000 Feb 12 14:57 20180212-145745.jpg
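As a local sanity check (plain GNU find, nothing Duplicati-specific; the -size -Nc form matches files smaller than N bytes), only the first file falls under that size:
$ find /mnt/data/pics/mypics/ -type f -size -3608645c
/mnt/data/pics/mypics/20180212-145742.jpg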
So I set my skip-files-larger-than to 36086700 to ensure that ONLY 20180212-145742.jpg gets backed up:
# mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "googledrive://duplicati/backuptest/?authid=xxx" /mnt/data/pics/mypics/ --backup-name=test-backup --dbpath=/root/.config/Duplicati/test-backup.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase=xxx --retention-policy="30D:1D,16W:1W,36M:1M" --disable-module=console-password-input --skip-files-larger-than=36086700
Backup started at 17/07/2018 10:19:24
Checking remote backup ...
Listing remote folder ...
Scanning local files ...
3 files need to be examined (7.30 MB)
2 files need to be examined (7.30 MB)
Uploading file (7.32 MB) ...
Uploading file (6.31 KB) ...
Uploading file (1.11 KB) ...
Deleting file duplicati-20180717T085325Z.dlist.zip.aes (1.11 KB) ...
Deleting file duplicati-becf96572df9445049e4fbfa68f43a73e.dblock.zip.aes (7.49 MB) ...
Deleting file duplicati-ibcd9c808dc7a4fdea6f9d94a28146e07.dindex.zip.aes (6.39 KB) ...
Checking remote backup ...
Listing remote folder ...
Verifying remote backup ...
Remote backup verification completed
Downloading file (1.11 KB) ...
Downloading file (6.31 KB) ...
Downloading file (7.57 MB) ...
0 files need to be examined (0 bytes)
Duration of backup: 00:07:00
Remote files: 6
Remote size: 14.91 MB
Total remote quota: 90.28 TB
Available remote quota: 80.28 TB
Files added: 2
Files deleted: 2
Files changed: 0
Data uploaded: 7.33 MB
Data downloaded: 7.58 MB
Backup completed successfully!
As can be seen above, 2 files were added instead of the expected 1.
The WebGUI reflects this as well, and running a ‘find’ shows that the larger file was indeed backed up:
# mono /usr/lib/duplicati/Duplicati.CommandLine.exe find "googledrive://duplicati/backuptest/?authid=xxx" /mnt/data/pics/mypics/20180212-145745.jpg --backup-name=test-backup --dbpath=/root/.config/Duplicati/test-backup.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase=xxx
Listing files and versions:
/mnt/data/pics/mypics/20180212-145745.jpg
0 : 17/07/2018 10:19:28 3.86 MB
1 : 16/07/2018 22:54:52 -
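In case it helps with the diagnosis: if I understand the CLI correctly, the ‘compare’ command should list exactly what changed between version 1 and version 0 (same storage URL and credentials as above):
# mono /usr/lib/duplicati/Duplicati.CommandLine.exe compare "googledrive://duplicati/backuptest/?authid=xxx" 1 0 --dbpath=/root/.config/Duplicati/test-backup.sqlite --passphrase=xxx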
At first I assumed it was related to local file allocation sizes; however, setting the limit to 1 byte, 1KB or 1MB has the same effect - both files are still backed up.
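For reference, the 1MB variant was identical to the backup command above apart from the last option:
# mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "googledrive://duplicati/backuptest/?authid=xxx" /mnt/data/pics/mypics/ --backup-name=test-backup --dbpath=/root/.config/Duplicati/test-backup.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase=xxx --retention-policy="30D:1D,16W:1W,36M:1M" --disable-module=console-password-input --skip-files-larger-than=1MB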
Any pointers as to why this is happening would be appreciated.