What happens when a file cannot be accessed?

Hi!

Thank you for this awesome software!

I have a question. Sometimes I use my files when the backup runs, and they cannot be accessed. This could be several days in a row. So I am missing the backups of that file during these days.

Then, what happens if I restore one of these backups?

Is that file missing totally? Or is it an older version of the file?

What if I have smart retention on - maybe it could happen that Duplicati chooses just the backups when I was using that file, and thus Duplicati couldn’t access - so that file ends up deleted totally?

The problematic files in question would be disk image or virtual machine image files.

I guess in any case I should pay more attention as to when the backup runs and keep the files closed at that time…

It depends on retention. Most people keep multiple versions so they can restore deleted files.
Nothing is gone completely until retention removes the last version that contains the file.

Figuring out which version has a given file is a little messy, but the find command can assist:

Usage: Duplicati.CommandLine.exe find <storage-URL> ["<filename>"] [<options>]

  Finds specific files in specific backups. If <filename> is specified, all
  occurrences of <filename> in the backup are listed. <filename> can contain *
  and ? as wildcards. File names in [brackets] are interpreted as regular
  expression. Latest backup is searched by default. If entire path is specified,
  all available versions of the file are listed. If no <filename> is specified, a
  list of all available backups is shown.

  --time=<time>
    Shows what the files looked like at a specific time. Absolute and relative
    times can be specified.
  --version=<int>
    Shows what the files looked like in a specific backup. If no version is
    specified the latest backup (version=0) will be used. If nothing is found,
    older backups will be searched automatically.
  --include=<string>
    Reduces the list of files in a backup to those that match the provided string.
    This is applied before the search is executed.
  --exclude=<string>
    Removes matching files from the list of files in a backup. This is applied
    before the search is executed.
  --all-versions=<boolean>
    Searches in all backup sets, instead of just searching the latest

Once you find out when you last had a file in backup, then you can go to that version to restore.
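As a rough conceptual sketch of what that search does (the version lists and file names below are invented for illustration; the real find command queries Duplicati's local database rather than anything like this):

```python
# Conceptual sketch: figure out which backup versions still contain a file.
# Version 0 is the latest, matching Duplicati's version numbering.

def versions_containing(backups, path):
    """Return the version numbers (0 = latest) whose file list includes `path`."""
    return [version for version, files in enumerate(backups) if path in files]

backups = [
    {"C:\\docs\\notes.txt"},                       # version 0: image file was locked
    {"C:\\docs\\notes.txt"},                       # version 1: image file was locked
    {"C:\\docs\\notes.txt", "C:\\vm\\disk.vhdx"},  # version 2: image file was readable
]

print(versions_containing(backups, "C:\\vm\\disk.vhdx"))  # -> [2]
```

Here the file only exists in version 2, so that is the version you would restore from (e.g. with `--version=2`).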

The exception to that advice is if Duplicati is on Windows and the application holding the file open is VSS-aware.
The snapshot-policy option can then work with the application to flush data (and such) to get a consistent backup.

There are other ways to get a backup of open files, but it depends on how much damage the application will tolerate from an inconsistent view. I would hope most can recover from a crash-consistent one.

Hi,

Sorry to take such a long time. I do not fully understand the answer but thanks for suggesting the find command.

I still want to try to rephrase my question(s).

What I am wondering is maybe like this:

Let’s say I keep 3 backups, A, B and C (latest).

During 2 of the last backups B & C I had this one file open where Duplicati couldn’t read it.

Question 1: If I restore B or C, no such file is restored, right?

Question 2: If I back up once more with that file still open / unreadable, will Duplicati remove the last backup of that file as well - so that no backup exists?

So in the end, at the moment, I think I really should make sure that this file is readable during the Duplicati run. Otherwise it will not exist in some of my backup copies, and if I'm unlucky, it could be that Duplicati doesn't retain any backup of that file (if it decides to cull those backups).
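To make that worry concrete, here is a toy sketch. The file names are invented, and the retention rule is a simplified "keep the newest N versions" rather than Duplicati's actual smart retention, but it shows how a file captured only in an older version disappears once retention deletes that version:

```python
# Toy retention sketch: if the locked file was only captured in an older
# version, deleting that version removes the last copy of the file.

def apply_retention(backups, keep):
    """Keep only the newest `keep` versions (index 0 = latest)."""
    return backups[:keep]

def restorable_files(backups):
    """Union of all files present in any retained version."""
    return set().union(*backups) if backups else set()

backups = [
    {"notes.txt"},               # C (latest): disk image was locked
    {"notes.txt"},               # B: disk image was locked
    {"notes.txt", "disk.img"},   # A: disk image was readable
]

print("disk.img" in restorable_files(backups))  # True - version A still exists

backups = apply_retention(backups, keep=2)      # retention culls version A
print("disk.img" in restorable_files(backups))  # False - no copy remains
```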

So, maybe put it this way,

Question 3: is an unreadable file like a deleted file from Duplicati's point of view?

Thanks to Duplicati devs for the awesome program by the way! :slight_smile:

Generally I would assume an unreadable file is missing from the backup, so there is nothing to restore.
Possibly a file could be partly readable at first but become unreadable later. What OS are you on?

The Handling locked files topic describes a Linux option that seems risky but can get data.
The Windows option was described earlier. That might be safe, depending on the application.

The “cull” is not an arbitrary decision; it follows the retention setting in Options. However, unless it's set to “Keep all backups”, you might delete the last version that contains the file.
Also, you might never have gotten a backup of the file at all if it is never readable.

If none of it can be read, it's treated as not there. Not every missing file was deleted, though; some might never have been there at all. If you mean the deleted-file count in the job log, that counts files that were there in the previous backup but are not there now. The added-file count works the other way around.

Additions and deletions are relative to the previous backup version. An unreadable file could stay in that state across many backups. I'm unsure which specific question you're asking about.
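A rough sketch of what "relative to the previous backup version" means (simplified set arithmetic on invented file lists; real job logs are computed from Duplicati's database, not like this):

```python
# Sketch: "added" and "deleted" counts are the difference between the
# current backup's file list and the previous backup's file list.

def compare(previous, current):
    added = current - previous    # present now, not in the previous version
    deleted = previous - current  # present in the previous version, not now
    return added, deleted

previous = {"notes.txt", "disk.img"}
current = {"notes.txt"}           # disk.img was unreadable this run

added, deleted = compare(previous, current)
print(sorted(added), sorted(deleted))  # [] ['disk.img']
```

In this simplified view, a file that became unreadable shows up the same way as one that was actually removed, which is the sense in which the two cases look similar.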

If you’re just asking loosely, the answer is they’re similar despite fine differences.

Making sure the file is readable during the backup is the safest plan. At a minimum, do that before your chosen retention runs out, assuming an old version is better than no version; if it isn't, then do it consistently.

There might be other options, depending on applications’ behavior and your OS.
