Hi All,
I am new to Duplicati. I installed it and ran the backup job. It created a ~27 MB file during the first run, but the second and third runs are only creating files a few KB in size.
I expected the file size to grow (or at least stay similar) every day. Is it capturing only the changes?
If my Ubuntu server fails and I end up re-installing Duplicati, how will it know that I have backups in my Google Drive? The current installation automatically shows that I have two backups. If I do a fresh install and connect to my Google Drive, will it recognize them on its own?
I installed it as a Docker container using Portainer. It shows permission denied for a few folders when the backup runs, and it looks like it is skipping them. How do I rectify the permission issues?
2024-05-16 05:00:41 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /source/nginx/letsencrypt/accounts/
UnauthorizedAccessException: Access to the path '/source/nginx/letsencrypt/accounts' is denied.
2024-05-16 05:00:41 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /source/nginx/letsencrypt/accounts/
UnauthorizedAccessException: Access to the path '/source/nginx/letsencrypt/accounts' is denied.
Incremental backups
Duplicati performs a full backup initially. Afterwards, Duplicati updates the initial backup by adding the changed data only. That means, if only tiny parts of a huge file have changed, only those tiny parts are added to the backup. This saves time and space and the backup size usually grows slowly.
which is odd, because you started three runs; maybe the second one was interrupted before it finished?
A given run should upload a number of dblock/dindex file pairs, plus a dlist file at the end that lists the backup contents.
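To make that concrete, a backup destination usually holds files along these lines (the names below are purely illustrative; the random portions and the .aes suffix depend on your settings):

duplicati-20240514T050000Z.dlist.zip.aes
duplicati-b0f3a9c2d41e6b58a7c1d0e2f3a4b5c6d.dblock.zip.aes
duplicati-i0f3a9c2d41e6b58a7c1d0e2f3a4b5c6d.dindex.zip.aes

The dlist is the one to look for when checking whether a backup version exists on the remote side.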
If it's just a reinstall of Duplicati on the same system, it should use the same local database. If it's a new system, move the old database over or recreate it.
Thanks for the reply.
It shows only two versions in the web UI even though it created files every day in my Google Drive. The web UI now shows the initial version (two days back) and the latest version (today), but yesterday it showed the initial version and yesterday's version.
I used the smart backup retention option. It was supposed to keep backups for 7 days, 4 weeks and 12 months; I'm not sure of the exact wording, but it was something like that, as per the options I had chosen.
Yes, I am running this through Docker, not as a direct app. I installed it using docker compose and mapped the source and backup locations as volumes, but in the web UI I used Google Drive as the backup destination.
Is there any way to specify or run it as root so that it doesn't have folder permission issues? The warning log showed it had an issue with the letsencrypt folder under nginx. That folder is system protected and my login ID doesn't have access to it either. But since Duplicati is supposed to back up everything, how do I give it the access it needs so that no files are missing from my backup?
So if you take more than one backup in a day, something will get deleted to thin them down to a density of one backup per day.
Your May 16 backup might have been less than 24 hours after your May 15 backup, and was therefore deleted.
This should all be in the job log, for example:
The current screenshot shows some files but no dlist, which is what represents a backup version, yet it apparently used to be there. So look at the third backup's log to see whether it deleted the second backup; deleting a version deletes its dlist.
If you want to keep multiple backups per day for a while, set a custom backup retention however you like.
It doesn't create backups; it controls how the created backups are deleted.
Flipping that around for labeling, it is how the created backups are retained.
Creation happens manually or via the Schedule screen.
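For reference, the Smart retention choice in the GUI is a preset for the retention-policy advanced option. If I remember the preset correctly (treat the exact value as an assumption and check it under your job's advanced options), it is equivalent to:

--retention-policy=1W:1D,4W:1W,12M:1M

which reads: for the last week keep at most one backup per day, for the last 4 weeks at most one per week, and for the last 12 months at most one per month. A custom retention is just your own set of timeframe:interval pairs in the same format.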
Which Docker? See above for some guesses.
This depends on which Docker image you use; the one Duplicati ships runs as root. If it is the LinuxServer image, see its Usage section:
environment:
  - PUID=1000
  - PGID=1000
is probably what controls how it runs. I don't use Docker, but you do. Maybe this article will also help.
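For context, here is a minimal sketch of where those lines sit in a LinuxServer-style compose file. The image name, port and volume paths follow LinuxServer's published example as far as I recall it, so verify against their documentation; the host paths are placeholders:

services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest
    environment:
      - PUID=1000      # user ID the container process runs as
      - PGID=1000      # group ID the container process runs as
      - TZ=Etc/UTC
    volumes:
      - /path/to/appdata/config:/config   # Duplicati's local job databases live here
      - /path/to/backups:/backups
      - /path/to/source:/source           # the data being backed up
    ports:
      - 8200:8200                         # web UI
    restart: unless-stopped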
Wow, thanks. I'm impressed with your line-by-line response.
Below is the compose file that I used inside Portainer. While searching through the forum, I saw someone mention that they manually added folder permissions, but I don't want to do that: the folder is deliberately set with limited access for my current user because it contains certificate files etc.
Put that together with the "Understanding" link, and you can possibly either edit or delete those lines, getting back to the default root behavior from before they tried to make life a bit safer at the cost of backup access.
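As a sketch of that edit (not an official recommendation), switching the container process to root would look like this in the environment block:

environment:
  - PUID=0   # 0 = root, so system-protected folders such as letsencrypt can be read
  - PGID=0

Running as root means the container can read everything mapped into it, so it trades away the safety those defaults added in exchange for complete backup coverage.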
Thanks for your help. I updated the compose file with a UID of 0, which makes Duplicati run under the root account. Now it finally doesn't show any warnings.