Defense against trojans

That is your problem :slight_smile:

Don't make it too complicated!
Almost everything is there already.

For the moment I would just need an option to run a job that only does smart retention, then cleanup and compact afterwards. That's all.
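For anyone wanting this today, something close can be stitched together from the existing command-line tool. A rough sketch only: the storage URL, database path, and policy below are placeholders, and it is an assumption on my part that the standalone `delete` command applies `--retention-policy` the same way a backup run does:

```
# Apply "smart retention"-style thinning (placeholder values throughout);
# whether the standalone delete command honors --retention-policy the
# way a backup run does is an assumption here.
duplicati-cli delete "b2://my-bucket/backup" \
  --dbpath=/path/to/local.sqlite \
  --retention-policy="1W:1D,4W:1W,12M:1M"

# Compact afterwards to reclaim the space freed by the pruned versions.
duplicati-cli compact "b2://my-bucket/backup" \
  --dbpath=/path/to/local.sqlite
```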

It is, and I've yet to learn how to not cause problems for myself :wink:

But it is complicated. If you want to run Duplicati as a local service on your backend and back up from your source machine, then providing manual smart retention, cleanup, and compacting is not enough.

Each of those operations alters the files on the backend, which requires updates to the local database. So if you perform them from the backup destination server, you will need to repair the database on your source machine before you can back up from it again.
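In practice that repair step would look something like this on the source machine (a sketch; the storage URL and database path are placeholders):

```
# Re-synchronize the source machine's local database with the backend
# after the destination server has changed files (placeholder paths).
duplicati-cli repair "b2://my-bucket/backup" \
  --dbpath=/path/to/local.sqlite
```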

In effect, asking for these features to be used as a cleanup tool on the backend presupposes the ability for two Duplicati servers to work together on the same backup data. This isn't currently possible, which is why it is complicated.

Sorry, you are right!

I didn't think about the database.

There would be a way, but it is ruled out by the fact that the database stores full paths, not relative ones.
That was written somewhere in the forum. I can't find it anymore, but there was a discussion about reducing the database size by changing this.
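You can check this on your own job database with sqlite3. A sketch only: the database file name is randomly generated (placeholder below), and I am assuming a `File` view with a `Path` column, which may differ between Duplicati versions:

```
# Assumption: the job database exposes a File view with a Path column
# (the schema may differ between Duplicati versions).
sqlite3 ~/.config/Duplicati/RANDOMNAME.sqlite \
  "SELECT Path FROM File LIMIT 5;"
# Typical output shows absolute paths such as /home/user/docs/a.txt,
# not paths relative to the backup's source folder.
```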

When the credentials are not in use by Duplicati, are they stored encrypted?

An interview with Malwarebytes has just been published, saying that trojans are now shifting away from ransomware and focusing on collecting the credentials stored on machines:

"Great question. I think the primary task of a lot of these Trojans is to get user credentials for websites, for cloud servicesā€¦But theyā€™re going after pretty much anything on that end point. If itā€™s credentials for your online backup service. If itā€™s your email credentials. "
MSN

No. They are stored in an SQLite database. On Windows the credentials are scrambled with RC4 using a known password (read: not encrypted); on Linux/OSX they are not encrypted at all.
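That is easy to verify on a Linux machine, assuming the default data directory:

```
# The server database needs no password to open; every stored value,
# including backend credentials, is readable in the clear.
sqlite3 ~/.config/Duplicati/Duplicati-server.sqlite ".tables"
sqlite3 ~/.config/Duplicati/Duplicati-server.sqlite ".dump" | less
```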

See also my response above.

Thank you for clarifying this @kenkendk. For the sake of completeness, for others reading this thread, this issue can be followed here:


Thank you for the work you are doing to find a way around this issue.

I agree with your idea.
Along the same line of thought: for anyone else looking who does not have a Synology or QNAP NAS to use the aforementioned NAS-to-cloud backup features, pCloud offers the ability to back up a number of cloud storage destinations every 14 days.
For example: Duplicati → Google Drive, then Google Drive → pCloud.


It seems like B2 could use application keys as some sort of defense: see Application Keys and the thread Can Duplicati handle B2 without delete rights?

Maybe even limiting Duplicati to a bucket with unlimited version history would be enough?
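A sketch of what such a key could look like with the b2 command-line tool. Bucket and key names are made up, and depending on your CLI version the command may be spelled `b2 key create` instead:

```
# Scope the key to one bucket and deliberately omit deleteFiles,
# so stolen credentials cannot erase existing backup volumes.
b2 create-key --bucket my-duplicati-bucket duplicati-nodelete \
  listBuckets,listFiles,readFiles,writeFiles
```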

Using a destination without delete rights is certainly doable, though you would need to disable some features of Duplicati (like auto-compact) or use "Keep all versions" (as you suggest).

Of course this means your backup will grow in size forever, but if that isn't an issue it is a viable option.
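In option terms that roughly means a job with no retention settings ("keep all versions") and auto-compact turned off; a sketch with placeholder paths:

```
# Backup job that never deletes anything on the backend:
# no --keep-* / --retention-policy options, and auto-compact disabled.
duplicati-cli backup "b2://my-duplicati-bucket/backup" /home/user \
  --no-auto-compact=true
```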

I suppose if you really wanted to you could even temporarily switch destination permissions to allow deletes and do manual cleanup / auto-compact runs only when you know the source machine isn't compromised.
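With B2 that could even be scripted as a short-lived maintenance key, roughly like this (names are placeholders, and on newer CLI versions the commands may be `b2 key create` / `b2 key delete`):

```
# Temporarily grant delete rights, run maintenance, then revoke.
# Run this from a trusted machine, not the possibly-compromised source.
b2 create-key --bucket my-duplicati-bucket maintenance-key \
  listBuckets,listFiles,readFiles,writeFiles,deleteFiles
duplicati-cli compact "b2://my-duplicati-bucket/backup" \
  --dbpath=/path/to/local.sqlite
b2 delete-key <applicationKeyId>   # revoke as soon as you are done
```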