Hi @televisi, welcome to the forum!
When Duplicati runs, it only backs up the parts of files (by default, 100KB blocks) that have changed since the last run. Checking all those blocks for changes can take a while, especially on devices that don't have much spare CPU or memory to put to the task.
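To illustrate the general idea (this is a simplified sketch, not Duplicati's actual code), block-based change detection boils down to hashing each fixed-size block of a file and comparing the hashes against the previous run:

```python
import hashlib

BLOCK_SIZE = 100 * 1024  # Duplicati's default block size (100KB)

def block_hashes(data, block_size=BLOCK_SIZE):
    """Hash each fixed-size block of a file's contents."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

def changed_blocks(old, new):
    """Indices of blocks that are new or differ from the last run."""
    return [i for i in range(len(new))
            if i >= len(old) or new[i] != old[i]]

# A 250KB file changed in a single byte only dirties one block,
# so only that block needs to be uploaded again.
old = block_hashes(b"a" * 250 * 1024)
edited = bytearray(b"a" * 250 * 1024)
edited[0] = ord("b")
print(changed_blocks(old, block_hashes(bytes(edited))))  # [0]
```

Even when almost nothing changed, every block still has to be read and hashed, which is why a run over a large source set takes time even on a "quiet" day.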
I’d suggest looking at the job log and checking the first 10 or so lines of the Result entry, where it lists the number and size of things like modified and added files.
If those numbers are high, you may just have lots of activity needing to be backed up every run. If they are low then Duplicati may just be running slowly for some reason.
Here’s an example from one of my recent test runs:
- Apr 10, 2018 4:01 PM: Result
DeletedFiles: 5
DeletedFolders: 1
ModifiedFiles: 200
ExaminedFiles: 877
OpenedFiles: 361
AddedFiles: 161
SizeOfModifiedFiles: 29451864
SizeOfAddedFiles: 11594491
SizeOfExaminedFiles: 135511019
SizeOfOpenedFiles: 52831163
NotProcessedFiles: 0
AddedFolders: 1
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Please note that there is a newer beta version with performance improvements, so you might also benefit from updating to that.