Failed to connect: No such file

#21

found it!!

You should provide the path as follows:
/home/duplicati_path

Not a path starting with /volume1 or /var or…

I discovered that by connecting via FTP and noticing that the URL did not have the /volume/ or /var path in it.

Could you please confirm this works for you as well?

Wim


#22

Great! I can’t confirm it myself (no Synology box, sadly), but perhaps @David_Jameson hasn’t uninstalled Duplicati yet and can give it a try…


#23

Bloody hell — that worked! You know, when I was experimenting with WebDav and opened a Finder folder to the Synology, I noticed there was a single “home” folder there as well as the “homes/user” folder. I actually went looking for that ‘home’ folder but couldn’t find it anywhere.

So ok, did that, and I have started a backup of one hard drive. Keeping my fingers crossed — and THANKS so much for figuring it out.

Of course from a technical point of view, I’m curious as to WHY this worked and why the previous mechanisms did NOT work?


#24

You are welcome & good luck!!

My 2 cents: Synology exposes the home folder differently than a normal Linux-based OS does…

Googling outside of Duplicati can probably help us further here, as I don’t think this is Duplicati-related…


#25

Wow! I had no idea you needed to do that. Awesome that you figured it out.

Now that I know this, I think you could have found out by using FileZilla or CyberDuck to connect, as it would show the paths.

I do recall that Synology monkey patches sshd with some login integration. It sounds like they patched sftp as well such that it does not expose the /volume stuff, which is really confusing as they have not (and could not) patch normal ssh access the same way.


#26

It was with an FTP client (Firefox 🙂) that I noticed the path difference…


#27

So is your hard drive backed up now?

If so, please consider clicking the checkmark in Wim_Jansen’s post so other users know that’s the trick that solved the issue. 🙂


#28

God no, it still has 361,000 files (522GB) to back up — I’m assuming it’s going to take 3 or 4 more days, given that we started with 750,000 files or so. But you’re right, the suggestion worked, so I’ll mark it.


#29

I don’t have CyberDuck any more, but I just tried to connect using Forklift and it displayed both “home” and “homes”. If I ask for the URL of the former, I just get

sftp://ipaddress/home

so yeah, that would have been a clue.


#30

I’m having the same issue - but with my own NAS which runs a mix of Debian testing and OpenMediaVault. I’ve set up a backup using borgbackup, which is working fine, but I wanted to try Duplicati just for fun (well actually it supports Amazon Drive out of the box, which is what I’m looking for).

I have a ‘backup’ user on my NAS whose home directory resides at /export/NAS/Backup — borgbackup just puts its files in a directory relative to that user’s home directory.

If I try doing the same thing with Duplicati I get the ‘Failed to connect: No such file’ error. Just to be clear, I’m using SSH file transfer (SFTP).

I tried ‘duplicati/pcname’, ‘/duplicati/pcname’, ‘~/duplicati/pcname’, and ‘./duplicati/pcname’.
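For what it’s worth, a tiny sketch of why those guesses can miss. This is purely illustrative: it assumes the server resolves relative paths against the session’s initial directory (taken here to be /home), and it leaves out the ~ form, which SFTP servers handle specially if at all.

```python
import posixpath

# Hypothetical: resolve each candidate against an assumed initial
# directory of /home, the way many SFTP servers would.
root = "/home"
for candidate in ["duplicati/pcname", "/duplicati/pcname", "./duplicati/pcname"]:
    # posixpath.join ignores root when the candidate is absolute
    resolved = posixpath.normpath(posixpath.join(root, candidate))
    print(f"{candidate} -> {resolved}")

# duplicati/pcname -> /home/duplicati/pcname
# /duplicati/pcname -> /duplicati/pcname
# ./duplicati/pcname -> /home/duplicati/pcname
```

None of those land in /export/NAS/Backup unless the server happens to map the initial directory there.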

I also tried connecting with FileZilla using SFTP which works fine, I can access the folder, create stuff, delete stuff. Just like borgbackup does as well.

I also created the directory and tried again with Duplicati — to no avail. If I leave the directory field empty it says okay but ends up with a permission error. Troubleshooting is hard because I have no idea where Duplicati is trying to go.

Solution (though this still seems like a bug in Duplicati): put

/home/../export/NAS/Backup/duplicati/pcname

into the folder field.

I find that rather strange and completely counterintuitive compared to how every other tool at my disposal handles this.
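If the server simply normalizes the `..` component (an assumption, but standard POSIX path behavior), the odd-looking path collapses to the real backup directory. A quick check with Python’s posixpath illustrates the mechanics:

```python
import posixpath

# The workaround path enters via /home (which the server accepts)
# and then climbs back out to the real location via "..".
path = "/home/../export/NAS/Backup/duplicati/pcname"
print(posixpath.normpath(path))  # /export/NAS/Backup/duplicati/pcname
```

Note that this is textual normalization only; if /home were a symlink elsewhere, a server resolving links before `..` could end up in a different place.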
