Running Duplicati on a Raspberry Pi


#1

Dear interested readers,

Since Christmas 2018/19 I have been running Duplicati on a Raspberry Pi (v1.0). The motivation was to make incremental backups of live data on an old ReadyNAS Duo (from 2009, including the HDDs), so I back up each share once per week. I'm not a friend of clouds, and I'm not a Linux expert either. Any corrections are welcome!

The install process went smoothly after some minor issues; in short:

  • apt-get install mono-complete
  • install Duplicati DEB package
  • raspi-config: Boot Options > Wait for Network at Boot > Yes
  • /etc/fstab, add line (mount options must be comma-separated with no spaces): //192.168.1.xxx/share-name cifs defaults,uid=pi,gid=pi,user=XX,password=YY,vers=1.0,x-systemd.automount 0 0
  • map USB-HDD by UUID
  • configure backups for the individual NAS shares and redirect the local databases to the USB HDD (since the SD card on the RPi is said to be unreliable)
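For the "map USB-HDD by UUID" step, the usual approach looks something like this (a sketch; the device name, UUID, mount point, and options below are placeholders, not my actual setup):

```shell
# Find the partition's UUID (device name /dev/sda1 is an assumption;
# check `lsblk` for yours):
sudo blkid /dev/sda1
# Then reference the UUID in /etc/fstab so the mount survives the device
# being renumbered across reboots, e.g.:
# UUID=0123-4567  /mnt/usbhdd  ntfs-3g  defaults,uid=pi,gid=pi  0  0
```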

It works well for me for shares below 50 GB. I somehow have issues with shares over 80 GB:

LimitedWarnings: [

2019-01-22 07:05:26 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bc910cbb465a74cada11e07424cc4ad51.dblock.zip is listed as Verified with size 37138432 but should be 52400111, please verify the sha256 hash "EwZYachyI7eDweQD8LlA1AbcQidNoxDkh0LTVi2pXms="

but I hope it will be resolved soon.
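In case it helps others debugging the same warning: sha256sum prints hex, while Duplicati reports hashes as base64, so a direct comparison of the remote file needs a conversion (a sketch; assumes the dblock file is reachable on the mounted backup drive):

```shell
# Hash the file, convert the hex digest to raw bytes, then to base64,
# and compare the result against the hash quoted in the warning:
sha256sum duplicati-bc910cbb465a74cada11e07424cc4ad51.dblock.zip \
  | awk '{print $1}' | xxd -r -p | base64
```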

Has anyone else tried running Duplicati on such hardware (which, for other tasks, has already been replaced by a v3.0)? Could you share your experience?


#2

I'm also hoping it doesn't require a special hardware configuration (which would work against solving it soon). Something like an ordinary PC running Linux would be closer. I'm not sure how the SMB part would make much difference, but you could certainly try taking it out of the picture by, for example, backing up NAS files copied to the USB drive (as files) back onto the USB drive (as their Duplicati backup).
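A minimal sketch of that SMB-free test, assuming the share is mounted at /mnt/nas/share-name and the USB drive at /mnt/usbhdd (both paths are placeholders):

```shell
# Copy the share's files onto the USB drive as plain files...
mkdir -p /mnt/usbhdd/test-source
cp -a /mnt/nas/share-name/. /mnt/usbhdd/test-source/
# ...then point a test Duplicati job at /mnt/usbhdd/test-source with a
# local-folder destination such as /mnt/usbhdd/test-backup.
# If that backup verifies cleanly, SMB is the more likely suspect.
```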

For your current test configuration, though, could you find the short file using Job --> Show log --> Remote, then see if you can click on that line to open details such as the size and hash Duplicati thinks it originally uploaded?

Your warning looks somewhat suspiciously like the filesystem may be involved (what filesystem does the USB drive use?), because 37138432 is hexadecimal 236B000, so something got shortened to a multiple of 4 KB. Checking other troublesome files might be useful; if there's a consistent pattern of rounded hex sizes, it's a clue.
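The arithmetic behind that observation can be checked in any shell:

```shell
printf '%X\n' 37138432        # reported size in hex: 236B000
echo $(( 37138432 % 4096 ))   # 0 -> exactly 4 KiB-aligned, i.e. truncated?
echo $(( 52400111 % 4096 ))   # 4079 -> the expected size is NOT aligned
```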


#3

Hi ts678,

thanks for your interest. Currently the database recreation (started yesterday) is still running. For small backups it finishes in a few minutes, but for large backups it takes an enormous amount of time (it doesn't scale linearly with bytes or with the number of files). I will post an update tomorrow.
One question I can answer right now: the USB HDD is formatted with NTFS and I'm using ntfs-3g (but a manual copy of the same large amount of files is relatively fast and error-free), so the filesystem doesn't seem to be the issue.
I'll be happy if you keep following along!
Jura


#4

Apart from the fact that you can do it… I don't personally agree with the use case… but this is certainly an interesting exercise for tuning and solving performance issues.

  1. Mono on a Pi - not the greatest
  2. Hardware not ideal for the use case, but possible nonetheless
  3. If you are looking to back up the Duo: it's supposed to have redundant disks. If I remember correctly, Netgear did a disservice by using a special disk format (I used to have one); I have since replaced mine with Drobos
  4. If you are using a single USB HDD for backup, make sure it has external power

I am sure plain copies will work just fine. The open conversation needs to be about how you are performing the backups.

Are you using encryption? Are you forcing compression? Are you doing any special configuration? It is clear that the device is fetching over the network (problem one: uses CPU and network), more than likely compressing, and then dropping the result onto the USB HDD (problem two: uses CPU and I/O, and impacts the network). Indexing is being offloaded, as you indicated, which further complicates CPU, I/O, and network load, depending on whether it's scanning the remote storage or accounting for the writes and validations.

Can you provide your full configuration and a log from your device? Based on the current details, I am very interested to see what its exchange pattern looks like.