Backing up to an external drive, but I noticed that even though it shows success, there are only 1KB files on the destination drive. The source side is reporting 0. Thoughts?
The destination drive is removed occasionally, but it’s typically present prior to backup.
Duplicati: Duplicati.GUI.TrayIcon, Version=2.0.3.3, Culture=neutral, PublicKeyToken=null (Duplicati.Library.Main, Version=2.0.3.3, Culture=neutral, PublicKeyToken=null)
Autoupdate urls: https://updates.duplicati.com/beta/latest.manifest;https://alt.updates.duplicati.com/beta/latest.manifest
Update folder: C:\ProgramData\Duplicati\updates
Base install folder: C:\Program Files\Duplicati 2
Version name: "2.0.3.3_beta_2018-04-02" (2.0.3.3)
Current Version folder: C:\Program Files\Duplicati 2
OS: Microsoft Windows NT 6.2.9200.0
Uname:
64bit: True (True)
Processors: 8
.Net Version: 4.0.30319.42000
Mono: False (0.0) ()
Locale: en-US, en-US, en-US
SQLite: 3.19.3 - System.Data.SQLite.SQLiteConnection
SQLite assembly: C:\Program Files\Duplicati 2\SQLite\win64\System.Data.SQLite.dll
Is it possible you have filters in place that are excluding everything from your backup?
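For example (a purely hypothetical filter, not something from your config), an exclude filter broad enough to match everything, say something along the lines of:

--exclude=*

could leave nothing to back up, so the job would still report success while uploading almost nothing.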
If you use the “Show log” link of the job menu and click on the most recent row that says “Result”, it should expand and show you details of the job run, including things like counts and sizes of processed files.
Here’s part of a recent one from one of my machines, where you can see that nothing had changed since the last backup of my 12,369 files (taking 3G of space).
A little further down is a “BackendStatistics” section that includes info about what Duplicati sees at the destination, such as the following (I’ve put arrows next to the ones you might want to check regarding your destination):
So the log for that backup indicates 2 files (totaling only 1,866 bytes) were uploaded to the destination. I’m not sure why it would be doing that.
However, it looks like there are 15,583 files (totaling 380G) at the destination, averaging about 24MB each, so there should be quite a few files on your USB drive far larger than 1KB.
If you look at your job’s “Show log” page and click on the “Remote” button, do you see any put lines around 05/16/2018 12:32 PM? I expect you should see 2 which, if expanded, should let us know what two files were being uploaded. I suspect at least one might be duplicati-verification.json (usually small, but not that small).
I (possibly incorrectly) recall a recent post involving @kenkendk about small / empty destination files, but I can’t seem to find it. Perhaps he or @Pectojin has a thought on why such small files would be created.
May 16, 2018 12:32 PM: put duplicati-ib09eca35ee3a49fe9c8532448d1d5199.dindex.zip.aes
{"Size":877,"Hash":"MFawFz8ZzQCEwKJAFDsz91FHW8gk+AXrTZyQXf0LVrM="}
May 16, 2018 12:32 PM: put duplicati-b64303e94d8ef40d8a41e852e7d55d83c.dblock.zip.aes
{"Size":989,"Hash":"eVd5gz4XTHcbGWSR43LkCP5LuBHAUnseOgyCJXKBRtA="}
While it’s possible for the “tail” of a backup to result in a small dblock file (and small related dindex file) it’s not very likely…
I don’t suppose you’re using the --upload-unchanged-backups parameter, are you?
--upload-unchanged-backups
If no files have changed, Duplicati will not upload a backup set. If the backup data is used to verify that a backup was executed, this option will make Duplicati upload a backup set even if it is empty.
Default value: “false”
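For reference, here’s roughly what that would look like on a command-line backup (the destination URL and source path below are placeholders, not values from your setup):

Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackup" "C:\Users\you\Documents" --upload-unchanged-backups=true

Left at the default of false, a run with no source changes should upload nothing at all rather than a pair of tiny files.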
Well, the overflow could easily be that extra block, but it shouldn’t create two small files in the same run since it could just merge them before uploading.
Is it two dblock files? If it’s dindex files then it’s fine, since they can be pretty much any size, although they’re usually pretty small.
My best guess is that “something” changes between the backups. Usually this would be metadata, like the folder timestamp changing. That would generate some very small volumes, which would then be uploaded.
You can use the affected command to see which files/folders a dblock file holds data for.
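If you want to try that against the small dblock from your log, it would look roughly like this (the destination URL and local database path are placeholders; use the values from your own job):

Duplicati.CommandLine.exe affected "file://E:\DuplicatiBackup" duplicati-b64303e94d8ef40d8a41e852e7d55d83c.dblock.zip.aes --dbpath="C:\Users\you\AppData\Local\Duplicati\<job-database>.sqlite"

That should list the source files/folders whose data (or metadata) ended up in that 989-byte volume.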
To avoid downloading the old volumes, Duplicati will create new small files. After some time, it will decide that there are too many small files and kick off a “compact” step that downloads all the small files and produces a single larger file.
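If you don’t want to wait for that, I believe you can trigger the step manually; the thresholds are controlled by options such as --small-file-size and --small-file-max-count (check the help output on your version for the exact defaults). A rough sketch, with placeholder URL and database path again:

Duplicati.CommandLine.exe compact "file://E:\DuplicatiBackup" --dbpath="C:\Users\you\AppData\Local\Duplicati\<job-database>.sqlite"

After that runs, the pile of tiny dblock files should be consolidated into fewer, larger volumes.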