After 2 years: a new try with Duplicati

Two years ago I was very disappointed with Duplicati:

Back then I chose Duplicacy as my backup software. I recently suffered a full data loss and had to recover ~1 TB of files. Unfortunately, one of my two backups was corrupt, and Duplicacy offers no useful way to check backup integrity.

I was able to recover my files from the second Duplicacy backup, but going forward I do not want to rely on a single backup solution. So I am back at Duplicati.

My goal is to back up my files to my second (local) NAS via SFTP.
I was not able to achieve this.

  • In the wizard I am never asked to approve the SSH fingerprint → the backup fails
  • After adding the SSH fingerprint in the “expert” options, the backup still does not work (a way to test the SFTP connection outside of Duplicati is sketched after this list)
    It creates some files on the target but then stops with
    Session operation has timed out
    In the logs the backup attempt is completely missing
    (screenshot)
    The attempt is from today (30 Aug)
    Yes, you can tell me now to enable debug logging and so on, but all of this defeats the idea of an easy-to-use GUI. I will now look for a command-line alternative.
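
For reference, the SFTP connection can be tested outside of Duplicati to rule out the NAS side. A minimal sketch, assuming Python 3 with paramiko installed; host, port, and credentials are placeholders:

    # sftp_check.py -- connect to the NAS over SFTP, print the host key
    # fingerprint, and try a small write to rule out permission problems.
    import base64
    import hashlib

    import paramiko

    HOST, PORT = "nas.local", 22            # placeholders -- use your NAS
    USER, PASSWORD = "backupuser", "secret"

    transport = paramiko.Transport((HOST, PORT))
    transport.connect(username=USER, password=PASSWORD)

    # OpenSSH-style SHA256 fingerprint of the server host key
    key = transport.get_remote_server_key()
    digest = hashlib.sha256(key.asbytes()).digest()
    print(key.get_name(), "SHA256:" + base64.b64encode(digest).decode().rstrip("="))

    # Write and delete a small test file in the login directory
    sftp = paramiko.SFTPClient.from_transport(transport)
    with sftp.open("duplicati-connect-test.txt", "w") as f:
        f.write("test")
    sftp.remove("duplicati-connect-test.txt")
    sftp.close()
    transport.close()
    print("SFTP write test OK")

If this script also hangs with a timeout, the problem is between this machine and the NAS rather than in Duplicati.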

The worst thing is that Duplicati reports a successful backup:
(screenshot)

But there is no backup at all (13,63 KB / 2 versions). The source files exceed 1 TB.

Why am I writing this? Duplicati could be a great piece of backup software, but my problems already start while creating a backup, and I am forced to use “advanced” options to work around them. I am very disappointed to see similar issues two years after my first try.

Hello,

What kind of NAS is this? I searched the forum for this timeout error and see several threads involving various targets.

I have a Synology NAS at home, but I use the WebDAV protocol with Duplicati. I experimented with SFTP for a short while; it worked for me, but I ended up going back to WebDAV.

QNAP NAS TS-231P
:grinning:

One way that people get this is by setting up a Filter on the Source Data screen that filters out everything.
This can also result in the “successful backup” you saw, because Duplicati successfully backed up everything it was told to, which (after filtering) happened to be no files.
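
As an illustration (filter syntax from memory, so double-check it against the manual): the filter list has an edit-as-text view where each line is an include (+) or exclude (-) expression, evaluated top-down with the first match winning. A set like this excludes everything, because the exclude-all rule matches before the include is ever reached:

    -[.*]
    +*.jpg

Here `-[.*]` is a regex-style exclude that matches every path, so the `+*.jpg` include never applies.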

Filters can be complicated, and the GUI filter builder also has some bugs that are fixed in Canary but not yet in Beta.

Viewing the log files of a backup job gives another view of the source files. If zero files got examined, check the filters.
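
For example, the job summary in the Complete log has counters for the source scan (field names as I recall them, so they may differ slightly between versions). A run that examined nothing looks like:

      "ExaminedFiles": 0,
      "SizeOfExaminedFiles": 0,
      "AddedFiles": 0,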

Your 13,63 KB on the home page should reflect the total file size, meaning it’s probably not much. You can see more detailed stats in the BackendStatistics section of the Complete log if you like. One example:

      "KnownFileCount": 29,
      "KnownFileSize": 261787837,

but more interesting might be what sort of files you actually got. Each backup should upload a dlist and a variable number of dblock and dindex files (paired). Also make sure you don’t have a huge Remote volume size set on the Options screen.
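
If you have shell access to the NAS, counting the file types in the target folder shows what actually got uploaded. A minimal sketch, assuming Python 3 on the NAS and a placeholder target path:

    # Count Duplicati volume types in the backup target folder.
    # /share/backup/duplicati is a placeholder -- replace with your target path.
    from collections import Counter
    from pathlib import Path

    target = Path("/share/backup/duplicati")
    kinds = Counter()
    for f in target.iterdir():
        # Duplicati volumes are named like duplicati-*.dlist.zip[.aes] etc.
        for kind in ("dlist", "dblock", "dindex"):
            if f".{kind}." in f.name:
                kinds[kind] += 1
    print(dict(kinds))

A healthy backup has at least one dlist per version plus paired dblock/dindex files; at 13,63 KB total you will probably see only a tiny dlist or two.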

It might come to that, but let’s consider the steps above first.