Custom Retention Policy - Is it right?

Hello,

New here and a bit confused … looking for help.

I have MS SQL Server (on Windows Server 2016). It makes backups automatically every day and keeps the last 7 days of versions (each day it deletes the oldest one), so there are always 7 files in my source folder.

I want my backup storage to keep the following: one backup for each of the last 7 days, then one backup per week for the next 4 weeks, then one backup per month for the next 12 months.
Based on that plan I have chosen Smart backup retention. Is that correct?
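(If it helps, I think my plan written as a custom retention rule would be something like `7D:1D,4W:1W,12M:1M`, assuming I've understood the format, but the Smart backup retention preset looked like it matches, so I chose that.)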

I have been running this plan since 15 Feb (5 days ago). Per the plan there should be 7 files per date when I start a restore, but for some dates (for example 17, 18, and 19 Feb) there are 8 files (the last 8 days) offered for restore. So what am I doing wrong?

Welcome to the forum @ttsquad

If you had just said that (which you didn’t), then the 8th could be the “one backup per week”. However, it should only look like “last 8 days” once; after that, the oldest version should just get older within its week without any additional ones being admitted into that time range. Times are measured to the second and aren’t calendar based.
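To put rough numbers on that, here is a simplified sketch of the idea (made-up timestamps, not Duplicati's actual code): keep everything inside the first timeframe, then keep only one version per interval, measured from exact backup times rather than calendar days.

```python
# Simplified sketch of interval-based thinning (not Duplicati's actual logic).
from datetime import datetime, timedelta

def keep_versions(backups, now, timeframes):
    """timeframes: (max_age, min_spacing) pairs, shortest timeframe first."""
    kept = []
    for ts in sorted(backups):                                  # oldest to newest
        age = now - ts
        spacing = next((s for a, s in timeframes if age <= a), None)
        if spacing is None:
            continue                                            # older than every timeframe
        if not kept or ts - kept[-1] >= spacing:                # far enough after last keeper
            kept.append(ts)
    return kept

now = datetime(2021, 2, 20, 2, 0, 0)                            # hypothetical "now"
backups = [now - timedelta(days=i, minutes=3 * i) for i in range(9)]   # ~daily versions
rule = [(timedelta(days=7), timedelta(0)),                      # last 7 days: keep all
        (timedelta(weeks=4), timedelta(weeks=1))]               # after that: one per week
print(len(keep_versions(backups, now, rule)), "versions kept")  # 8 = 7 daily + 1 weekly keeper
```

The oldest version becomes the single keeper for its week, and later versions that age past the 7-day mark get thinned until a full week has gone by.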

Got any actual dates and times to post? Maybe Feb 19 is still around, or maybe Feb 20 has been done. Either way, you can look at the current Restore menu and at historical job logs to see what was deleted.

EDIT: If it makes things easier, you can paste images from your clipboard into the forum.

It can’t be that, as one week has not passed yet since I started doing backups. It looks more like the local SQL backup job sometimes delays deleting the oldest version while a new one is being created. I will check overnight, as soon as the local SQL backup has finished.


[screenshot: job logs from consecutive daily runs]

EDIT:
According to this log, it seems like the SQL job is to blame …

OK, I have a clearer view after seeing the images and reading again. You’re backing up a Source folder maintained by SQL Server, but asking why it left 8 files? That has nothing to do with Duplicati’s retention setting, which controls only the Duplicati backups, i.e. the relatively small files kept at Duplicati’s Destination.

I thought you were asking about Duplicati retention policy, where what matters is its kept backups, e.g.:

[screenshot: example list of Duplicati's kept backup versions]

I don’t know what you mean, as all your Duplicati job names are obscured. If you mean it’s the SQL Server job, sure, I can go with that. What you highlighted is not meaningful without knowing the behavior of the Source files.

Specifically, Duplicati will examine your source files, and you seemingly have 28 of them in the Source tree.

EDIT 1:

A Duplicati backup is a point-in-time view of the Source. If a Source file doesn’t change by the next backup, there’s no need to open it to look for changes. Each SQL Server backup run seemingly adds 4 files and deletes 4 files, based on the posted Source Files information from the consecutive daily runs.
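If you want to double-check that add-4/delete-4 pattern yourself, a small sketch like this could diff two saved listings of the Source folder (the listing file names below are just placeholders):

```python
# Hypothetical helper: compare file names from two consecutive daily listings
# (saved as plain text, one name per line) to see what was added and deleted.
from pathlib import Path

def names(listing_file):
    lines = Path(listing_file).read_text().splitlines()
    return {line.strip() for line in lines if line.strip()}

day1 = names("source-files-feb17.txt")   # assumed saved listing from day 1
day2 = names("source-files-feb18.txt")   # assumed saved listing from day 2
print("added:  ", sorted(day2 - day1))
print("deleted:", sorted(day1 - day2))
```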

A Duplicati backup reads the Source; it doesn’t change it, although a restore can go to the original area if you wish.

EDIT 2:

Make sure Duplicati doesn’t back up the SQL Server backup until it’s done, otherwise it will get a partial file. Possibly whatever you use to write the SQL Server backups doesn’t delete the oldest one until the new one is finished?
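One way to guard against that is a pre-backup check that refuses to run while the newest backup file is still fresh. Just a sketch: the folder path, file pattern, and age threshold below are all assumptions.

```python
# Sketch of a pre-backup check (assumed path and threshold): bail out if the
# newest .bak file was modified very recently, i.e. SQL Server may still be writing it.
import sys, time
from pathlib import Path

SOURCE = Path(r"C:\SQLBackups")   # assumed SQL Server backup folder
MIN_AGE_SECONDS = 15 * 60         # treat anything younger than 15 minutes as still being written

newest = max(SOURCE.glob("*.bak"), key=lambda p: p.stat().st_mtime, default=None)
if newest and time.time() - newest.stat().st_mtime < MIN_AGE_SECONDS:
    print(f"{newest.name} looks unfinished, skipping this run")
    sys.exit(1)   # nonzero exit; check the Duplicati manual for how run-script-before treats exit codes
sys.exit(0)
```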

EDIT 3:

This is one of those times when per-file timestamps in the Restore tree would be nice, but you can look directly at the small Source tree to try to figure out what time of day the SQL Server backup completed.

More ideas were added above. Basically, check the timings and whatever creates the SQL Server backups. There’s nothing here indicating a problem with Duplicati retention; it just backs up what’s actually present.

Showing otherwise could be attempted, e.g. a run-script-before script could do a dir into a file for review. For a longer-form view, log-file=<path> with log-file-log-level=verbose shows what the backup is seeing.
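For example, a sketch of such a capture script (the folder and output paths are assumptions) might look like:

```python
# Sketch of a listing capture (assumed paths): append the source folder's file
# names, sizes, and modification times to a log so you can review what was
# present at the start of each run.
from datetime import datetime
from pathlib import Path

SOURCE = Path(r"C:\SQLBackups")               # assumed SQL Server backup folder
OUT = Path(r"C:\Temp\source-listing.log")     # assumed file to review later

with OUT.open("a") as out:
    out.write(f"--- {datetime.now():%Y-%m-%d %H:%M:%S} ---\n")
    for p in sorted(SOURCE.iterdir()):
        st = p.stat()
        when = datetime.fromtimestamp(st.st_mtime)
        out.write(f"{when:%Y-%m-%d %H:%M:%S}  {st.st_size:>12}  {p.name}\n")
```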

EDIT:

Your daily backup times seem to vary, and a retention interval of 24 hours measured to the second might find some versions too close together. Changing that to a custom retention using hours (lowercase h) might help, but it’s not what you asked about…

Once you figure out your timing variability, you might decide to use 23h instead of 1D. Or, if it’s always a scheduled run, why even bother trimming out excess backups, since there aren’t any? Maybe just use a U.
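For example (assuming the usual timeframe:interval format), that could look like `7D:23h,4W:1W,12M:1M` for the 23h idea, or `7D:U,4W:1W,12M:1M` if you simply keep every version made in the first week.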

If this refers to what creates the initial SQL Server backup, then we have the same thought, and you have some additional ways to check programmatically, or you could just check the Modified date on the folder.
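For the programmatic check, a tiny sketch (the path is an assumption) would be:

```python
# Quick check (assumed path): print when the backup folder and its newest file
# were last modified, to see what time of day the SQL Server backup finishes.
from datetime import datetime
from pathlib import Path

folder = Path(r"C:\SQLBackups")    # assumed backup folder
print("folder modified:", datetime.fromtimestamp(folder.stat().st_mtime))
newest = max(folder.iterdir(), key=lambda p: p.stat().st_mtime)
print("newest file:    ", newest.name, datetime.fromtimestamp(newest.stat().st_mtime))
```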