(Security) feature request: Import and export config files "remotely" on a NAS

Hi!

I’ve started using Duplicati on Asustor NASes.

Duplicati is accessed via the browser on the PC, which is fine.
But when I want to export or import config files, the browser stores and reads the files locally.
Having to transfer the config files to and from a local filesystem is a security risk (in my eyes).

So I want to store the config files in places on the NAS that the PC (Windows) does not know about.
It would be nice to have the choice to handle this via the same folder/file selection used for the backup destination and source.
The local computer then does not “know” about the config files at all (everything is kept on the remote system).

In my opinion this is “needed” on platforms with remote access.

I hope my explanation is clear enough.

Thank you very much in advance!

Ingo

Are you concerned about the job file being saved on the local machine, about it being transferred over the browser connection, or both?


Hi JonMike,

The browser connection is a minor issue, I believe, but to be on the safe side, HTTPS would be fine.
(I just searched the forum for “https”: no results.)

My main concern is to keep the job files in a “safe” place, and also to make things easier.

Right now I have to work as follows (see the sketch below):
Export: via the browser on the PC to a “public” area on the NAS -> then, on the NAS, move the file to the safe place
Import: move the file from the safe place back to the public area on the NAS -> import it from there in the browser
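
Just to make that manual workflow concrete, here is a minimal sketch of scripting the “move to the safe place” step. It assumes the Python paramiko package; the host name, credentials and paths are made up for illustration and are not part of Duplicati:

```python
# Minimal sketch, assuming paramiko; host, credentials and paths are hypothetical.
# It moves an exported job config from the NAS "public" share into a private
# folder over SFTP so the file does not linger anywhere the PC can reach.
import paramiko

NAS_HOST = "nas.local"                                       # hypothetical NAS address
PUBLIC = "/share/public/duplicati-config.json.aes"           # where the browser export landed
PRIVATE = "/share/private/configs/duplicati-config.json.aes" # the "safe" place

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())    # pin the host key properly in real use
ssh.connect(NAS_HOST, username="admin", password="***")      # or better: key_filename=...

sftp = ssh.open_sftp()
sftp.rename(PUBLIC, PRIVATE)                                 # move the file off the public share
sftp.close()
ssh.close()
```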

Maybe I’m a little paranoid, but on the other hand, why not reduce risks as much as possible?

Best regards,
Ingo

Thank you for clarifying. Is there a reason you don’t feel the “Encrypt file” option of “Export backup configuration” is secure enough?


The issue I see with what you’re wanting is that if Duplicati running on your NAS had access to non-public (“safe”) storage, you would then be trusting Duplicati to keep that non-public content private.

That means if somebody could get to the Duplicati web interface, they could dig through all the supposedly private storage that Duplicati also has access to.


That being said, I suppose it might be possible to add another Export / Import job location as something like “Config folder” such that it would be stored wherever the Duplicati-server.sqlite or config.json file is stored.

However, that might give some users a false sense of security as, for example, on Windows that folder would be %LOCALAPPDATA%\Duplicati, which is likely NOT a private location.


You could also simply export “As Command-line” and then copy/paste the contents into a file saved directly in your private storage. Though I do understand this only handles one side of your situation.

I suppose the other side could be dealt with by providing an “Import from Command-line text” text-box on the “Add a new backup” page.


If I got totally paranoid, I would also have concerns about keyloggers logging my password :slight_smile:

You are right, encrypting the job files should be sufficient.

OK, now I have a question:
Are the credentials and the connection type for the remote location stored encrypted?


I’m going to use Duplicati on my computer too, and I want to be sure that Windows (trojan horse, virus, …) has almost no chance of getting access to the remote backup location. Therefore I’m going to use SFTP, because Windows does not “know” this protocol.

If people have access to the Duplicati-server.sqlite file, then they have all your settings, including your backup destination password, in plain text. It’s essentially the same as having access to the config file, just in a database.
Here are the SFTP settings in the DB:


And all your options, including your encryption passphrase.

To secure this, Duplicati would have to require a password on login and then decrypt those variables in memory. But right now it’s game over if they have your DB and decide to look in it.
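
For anyone who wants to verify this, here is a minimal sketch that dumps the server database with plain sqlite3. The path below is an assumption (it varies by OS and install), and note that, as mentioned further down in this thread, the Windows build scrambles the database with RC4 and a default password, so plain sqlite3 will not open that copy:

```python
# Minimal sketch: open Duplicati-server.sqlite with a plain SQLite client and
# the stored settings come back as readable text. The path is an assumption;
# table names are read from the database itself rather than hard-coded.
import os
import sqlite3

db_path = os.path.expanduser("~/.config/Duplicati/Duplicati-server.sqlite")
db = sqlite3.connect(db_path)

tables = [row[0] for row in db.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

for table in tables:
    print(f"--- {table} ---")
    for row in db.execute(f'SELECT * FROM "{table}" LIMIT 5'):
        print(row)          # target URLs, options and passphrases show up as readable text

db.close()
```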


If Duplicati gets more widely spread and therefore becomes a more likely target for mass attacks by malware, this should be addressed.

Securing it against attacks from “companies with three letters” or against targeted personal attacks would be another story.

I just have another idea: PsExec / WinExe for centrally managed Duplicati.
(I will post it in: Client-side agent and centralized management / dashboard)


I think it would be cool to offer full at-rest encryption of the database, assuming it didn’t tank performance. But just encrypting sensitive fields might be enough.


That would already be fine, because if “somebody” is already on the frontend, it does not matter whether he finds the files directly or through the database. The database might be more interesting for forensic purposes.

Well, it depends on what you’re worried about. If you back up very sensitive data, then you might be similarly worried about the fact that the database holds information about all the backed-up files.

Also, it should be mentioned that encrypting just a few fields might be much more trouble to implement, because it would then be implemented in the data layer instead of in the SQL connector. Implementations in the data layer potentially have tons of dependencies around the code that would then have to be updated.
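
To make that trade-off concrete, a field-level approach in the data layer would look roughly like the sketch below (Python with the cryptography package, purely for illustration; the table and column names are made up, and key management is waved away):

```python
# Sketch of field-level encryption in the data layer: encrypt a sensitive value
# before writing it and decrypt it on read. Table/column names are hypothetical.
# The real difficulty is where the key comes from and who types the passphrase.
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in reality derived from a user-supplied passphrase
box = Fernet(key)

def store_secret_option(db: sqlite3.Connection, name: str, value: str) -> None:
    token = box.encrypt(value.encode())                       # only ciphertext hits the DB
    db.execute("INSERT INTO Option (Name, Value) VALUES (?, ?)", (name, token))

def load_secret_option(db: sqlite3.Connection, name: str) -> str:
    (token,) = db.execute("SELECT Value FROM Option WHERE Name = ?", (name,)).fetchone()
    return box.decrypt(token).decode()                        # plain text only exists in memory
```

Every piece of code that touches those columns would then have to go through helpers like these, which is exactly the dependency spread mentioned above.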


If it is easier to implement it database-wide, even better :slight_smile:
But it is as @Pectojin said: sensitive data or not, if somebody sits in front of the desk, he/she/it most likely has access to the data anyway. BUT I am thinking of the usernames/passwords for all kinds of destinations and encryption that are stored in the DB as well, and those he/she/it would not see so easily otherwise…


I just learned the other day that the Windows SQLite implementation actually does encryption by default.

So it’s just a matter of getting it onto the remaining OSes or, if that’s not possible, going for the data-layer solution.

I have not spent too much time implementing this feature, precisely because Duplicati should be able to run automatically, and locking down the database prevents this (the user would need to type the passphrase; otherwise the passphrase is already on the system and the whole thing adds no security). The RC4 scrambling is weak encryption, and it uses a well-known default password, so it only adds protection against casual hard-drive string searches.

Since full-disk encryption (or at least home-directory encryption) is standard on most OSes, I think this provides better protection against many “cold” attacks.

That said, I would like us to implement “keychain” storage of sensitive values, such that the OS can protect these with the user credentials:
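
As a rough illustration of what that keychain idea could look like (using Python’s keyring package here, just as a stand-in; the service and entry names are made up), the OS credential store holds the secret and releases it under the user’s login credentials:

```python
# Sketch: hand the secret to the OS credential store (Windows Credential Manager,
# macOS Keychain, libsecret on Linux) instead of keeping it in the database.
# Service and entry names are made up for illustration.
import keyring

# Stored once, e.g. when the backup job is created.
keyring.set_password("Duplicati", "backup1-passphrase", "correct horse battery staple")

# Fetched at backup time; the OS protects it with the user's login credentials.
passphrase = keyring.get_password("Duplicati", "backup1-passphrase")
print(passphrase is not None)
```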


The settings database has very low usage; I would not expect it to make any difference at all.

Ah, you’re right. The heaviest usage on that database is writing logs. And the job databases don’t hold credentials.