Is there a way to do full backups every time?

Hey guys, I need some help. First, let me explain the situation.
I started testing Duplicati to back up Firebird databases, and it seemed fine until I tried it with larger databases (~750 MB). The problem is that sometimes the database file size doesn't increase, and Duplicati just skips the backup. This is a huge problem for me: new data gets added but doesn't get backed up, causing potential data loss.
Before Duplicati I used Cobian, and it worked for this purpose; it did full backups every time, even if the DB size hadn't changed.
My question is: is there a way to work around this situation? Can I do full backups with Duplicati? Or is there a better way of doing this that doesn't rely on the DB size increasing every time?

UPDATE: So, I've been testing a little more. If I do a local backup (from my HD to my HD), the backup runs normally, keeping multiple versions even if the size doesn't change. But if I run the same backup, with the same changes and the same DB, to a storage service (Mega.nz in my case), it just doesn't work. It stays at 1 version (the original first version).

Strange; Duplicati's behavior shouldn't change based on the target storage.

Regarding the file in question, Duplicati should reprocess it if it detects ANY change in metadata (size, timestamp, etc.). You say the size of this file doesn't change even though data is added? (That alone seems strange to me…) Does the timestamp change?
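
If you want to check exactly what a backup tool sees, here is a minimal sketch that prints the file's size and last-modified time (the database path is a hypothetical placeholder; point it at your own .fdb file):

```python
import os
from datetime import datetime

# Hypothetical path to the Firebird database file; adjust it to your setup.
DB_PATH = r"C:\Data\mydatabase.fdb"

info = os.stat(DB_PATH)

# Size and last-modified time are the metadata a backup tool inspects
# when deciding whether a file has changed since the last run.
print("size :", info.st_size, "bytes")
print("mtime:", datetime.fromtimestamp(info.st_mtime))
```

Run it once before and once after new data is written; if both values are identical, any tool that relies on metadata alone will consider the file unchanged.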

Just checked it; it didn't change. It appears that someone leaves the system that uses the DB running, and Windows does not register timestamp changes while the file is open. So now I have to figure out a way to close everything during the night so the timestamp changes. Thanks for the help.

Maybe the option below will help, although you might want to split the database into its own backup job so you do NOT force scanning of all your files:
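
The option I have in mind is Duplicati's `--disable-filetime-check` advanced option: it tells Duplicati to ignore timestamps and examine every file's content on each run, which is exactly why you'd want the database in its own backup job. On the command line it would look something like `Duplicati.CommandLine.exe backup <storage-url> "C:\Data\mydatabase.fdb" --disable-filetime-check=true` (the storage URL and source path here are placeholders for your own setup); in the web UI you can add the same option under the job's Advanced options.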

That is exactly what I needed. Thanks a lot.