Needless and Repeated Backups

Hi Folks

I’m using Duplicati 2.0.4.5_beta_2018-11-28.
I’ve noticed that I have two folders whose contents are always backed up, even though the contents have not changed. The reason I see is that for all of the files in question, their modification date is earlier than their creation date. This has come about for two reasons (1) the files were restored from cloud backup by a gaming platform (game save files), and (2) files were restored from a local backup created last year by a genealogy program. I have now excluded these two folders as an interim measure but of course, would prefer if Duplicati handled these files properly, as expected.

Having a mod date earlier than the create date would not seem to be unusual. Will Duplicati properly handle this situation in the future? It is important because unless one notices the repeated backups, there is no other way one would know of it.

There are utilities that can be used to change these dates but, again, unless you know of the situation you won’t know. Thoughts and advice would be appreciated (I’m a new user).
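If it helps anyone audit for this situation, here is a minimal Python sketch that lists files whose modification time precedes their creation time. It assumes Windows semantics, where `os.stat().st_ctime` is the creation time (on Unix it is the inode-change time, so the check means something different there):

```python
import os


def find_mod_before_create(root):
    """Walk a folder tree and return paths of files whose modification
    time is earlier than their creation time (Windows st_ctime)."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            if st.st_mtime < st.st_ctime:
                hits.append(path)
    return hits
```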

Thanks … Noel

What clues did you see? (The more extensive the list, the better, because I think this is actually hard to see.)

Block-based storage engine and How the backup process works would be worth reading if you haven’t yet. Basically, Duplicati is supposed to upload only file blocks that aren’t already backed up; otherwise, it simply points to the old blocks. The file appears in the dated backup tree whether or not anything actually changed.
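As an illustration only (not Duplicati’s actual code), block-level deduplication can be sketched like this; the 100 KiB block size matches Duplicati’s default, but the hashing and storage details here are simplified assumptions:

```python
import hashlib

BLOCK_SIZE = 100 * 1024  # Duplicati's default block size


def backup_blocks(data, known_hashes):
    """Split data into fixed-size blocks; return only the blocks whose
    hash isn't already known. Already-known blocks are just referenced,
    which is why unchanged files cost no new uploads."""
    new_blocks = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        if h not in known_hashes:
            known_hashes.add(h)
            new_blocks.append(block)
    return new_blocks
```

Running this twice over the same data shows the effect: the second pass finds nothing new to upload.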

You can find out how many files it saw as candidates for checking-for-changes in Show log of the backup, e.g. OpenedFiles: would probably count all files in your two suspect folders, if Duplicati checked their files.

One can probably find some names behind that count by logging at Verbose or higher, either using the live log at About --> Show log --> Live --> Verbose, or using --log-file with --log-file-log-level=Verbose. Example line:

2019-05-01 15:14:00 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\backup source\length1.txt, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 3/3/2019 11:29:09 PM vs 1/1/0001 12:00:00 AM

which in this case looks like a new file, but these are the sorts of tests that decide whether to open/backup.

If there is a problem where these folders are always backed up, a test for that would be to create a backup containing only these two folders and run it twice. If the issue exists, it would upload twice and take about the same time each run.

The “no issue” result would be no backup, because by default a new “view” of the files isn’t recorded if nothing changed. --upload-unchanged-backups in Advanced options can override that and record a view, but it can’t force uploads.

EDIT:

C:\backup source>perl -e "utime(time,time-24*60*60,'modified_before_created.txt');"

Start of first backup, which uploads files:
2019-06-15 12:13:57 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-BackupMainOperation]: Starting - BackupMainOperation
2019-06-15 12:13:57 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\modified_before_created.txt
2019-06-15 12:13:57 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\backup source\modified_before_created.txt, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 6/14/2019 4:09:19 PM vs 1/1/0001 12:00:00 AM

All of next backup run, which sees no work:
2019-06-15 12:14:06 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-BackupMainOperation]: Starting - BackupMainOperation
2019-06-15 12:14:06 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\modified_before_created.txt
2019-06-15 12:14:06 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-SkipCheckNoMetadataChange]: Skipped checking file, because no metadata was updated C:\backup source\modified_before_created.txt
2019-06-15 12:14:06 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-BackupMainOperation]: BackupMainOperation took 0:00:00:00.003

No second backup was made. Can you supply details on how to reproduce this, or was it maybe a misunderstanding?

The mechanism is already in place to handle this. The default behaviour Duplicati uses to identify files is filename + modification date + size. If these three match what’s already in the database, the file is skipped irrespective of its content, which satisfies the requirements of most users. In your case, even if a restored file gets re-examined after it has already been backed up, Duplicati will hash the file and possibly its contents, and since it already has the information in the database it will only update the expiration date for the blocks and not upload any data, because the data is already backed up.
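As a hypothetical sketch of that decision (not Duplicati’s real implementation), with the database modeled as a dict keyed by file path:

```python
def file_changed(db, path, mtime, size):
    """Default change test: skip the file if its path, modification
    time, and size all match what the database already has; otherwise
    it becomes a candidate for deeper checking."""
    entry = db.get(path)
    if entry is None:
        return True  # new file, must be examined
    return entry != (mtime, size)
```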

To change this behaviour you can use the --disable-filetime-check option. This will make Duplicati hash check all files. A mismatch of the hash will cause Duplicati to start looking at the contents of the file and upload any changes.
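A simplified sketch of what a full hash check looks like (SHA-256 and the chunked read are assumptions for illustration; Duplicati’s internals may differ):

```python
import hashlib


def changed_by_hash(path, stored_hash):
    """With filetime checks disabled, hash the whole file and compare
    against the stored hash; only a mismatch triggers a content scan."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() != stored_hash
```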

My assumption regarding repeated backups was prompted by the perceived behavior of another backup product and the appearance of more than one version for a number of files in Duplicati. On closer investigation, 26 of 143 files in the folder in question were found to have had their modified times changed by 11 minutes, and these were the ones backed up again. These game files were not “save” files but part of the game, probably updated by the gaming platform. There were some other files that appeared to be likewise backed up needlessly, all of which had been restored from earlier backups and hence had mod dates earlier than their create dates. It would seem to be a misunderstanding on my part - I’m sure my confidence with Duplicati will grow with use.
Thank you for your valuable assistance and advice. By the way, I have specified --log-file-log-level=verbose in my backup as suggested but I am not seeing anything different.
Many thanks and best regards … Noel

Thank you for your help and advice which is much appreciated.
I’ll be using the --disable-filetime-check option as suggested.

Thanks and best regards … Noel

That’s a logging option. Are you saying it didn’t change the log, or that it didn’t change some other view, which has been a continuing question? What exactly are you seeing that makes you suspect the issue?

Logging options are not supposed to change other things. The idea was to look in the log to see if your suspected-of-being-backed-up-needlessly-files were somehow logging like should-be-backed-up-files.

If all you’re seeing is the files (changed or unchanged) present at a given time in the backup from then, that’s what you’re supposed to see, and that’s how you restore. It’s supposed to offer all available files; however (unlike some other backup programs), you’re not shown anything specific about the file dates.

Show file-specific backup history when restoring shows how CrashPlan lets you see versions in views. Duplicati only displays point-in-time views. You can restore what you want but can’t view per-file details.

Hi ts678

In the backup log I was expecting to see the names of my files, or some of them, that were being considered for backup. This expectation was based on your earlier email where you said:

You can find out how many files it saw as candidates for checking-for-changes in Show log of the backup, e.g. OpenedFiles: would probably count all files in your two suspect folders, if Duplicati checked their files.

One can probably find some names behind that count by logging at Verbose or higher, either using live log at About --> Show log --> Live --> Verbose, or using --log-file with --log-file-log-level=Verbose. Example line:

2019-05-01 15:14:00 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\backup source\length1.txt, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 3/3/2019 11:29:09 PM vs 1/1/0001 12:00:00 AM

I haven’t seen anything like this nor any of my file names appearing in the log. I’m not particularly concerned about this as I have been running the ‘compare’ command after each backup, whereby I compare versions 1 and zero. This gives me a good idea of what has actually been backed up, which is what I’ve been wanting to see - as a new user I of course need to have confidence in the product.

Thanks and Regards … Noel

If absolutely nothing changed in the log when you added --log-file and --log-file-log-level, make sure you’re looking in the file path you put on --log-file. The log in the GUI (which is just a summary) won’t change. These log files can also get quite large, so let an editor or search tool do the searching for you.
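For example, a small helper (hypothetical, just a convenience for searching) that pulls out log lines mentioning a given path:

```python
def grep_log(log_path, needle):
    """Return the lines of a log file that mention a given substring,
    e.g. a suspect file path or 'CheckFileForChanges'."""
    with open(log_path, encoding="utf-8", errors="replace") as f:
        return [line.rstrip("\n") for line in f if needle in line]
```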

Yes, and “backed up” in the sense that it should be available for restore. For greatest confidence you can install Duplicati on another computer to simulate loss of your original, and run a “direct restore” for a test, but don’t ever add two configured backups with the same destination, as they will scramble each other’s backup files.

To see “backed up” in the sense of new uploaded data, you can either estimate from the backup time or run your backup’s Reporting --> Show log to look at your statistics for that backup, such as the size of the changes that were uploaded:

    BackendStatistics:
        RemoteCalls: 20
        BytesUploaded: 29266754

Don’t have so much confidence in the product that you start deleting original files on purpose, expecting it to keep them forever. As a beta-test-level product, there are still bugs, and sometimes restarting backups becomes the best option. Personally I consider it best suited for short-term protection. It’s not an archiver.