Any way to backup through LAN when on local network, WAN when outside?


#1

Hi!

I’m going to assume the answer is no, but is there any way to intelligently switch between backup destinations depending on if I’m inside or outside my LAN?

My laptop backs up to a local NAS through samba when on my local network. When I’m outside my LAN in the world, I don’t have samba access but I can SSH/SFTP to my NAS. Is there any way to have a single backup configuration where the connection method can be manually or automatically changed from samba<->SSH/SFTP while maintaining the same Duplicati database, backup files etc?

I have data caps on my home internet connection so I don’t want to use SSH all the time since it would SSH through the WAN and count against my data caps. On the other hand if I samba all the time it doesn’t work when outside my LAN.

Any clever soul got a solution?


#2

hi. How about VPN ? …


#3

Not a bad idea, but then all my traffic outside the LAN would unnecessarily be counted against my data cap. Plus, when I’m outside on another network for work purposes, I usually need access to local resources such as printers and network drives.


#4

This is all solved by setting up a VPN with the right routing. With the VPN you can tunnel only the traffic to your NAS, but that’s another story :slight_smile:


#5

I do this on my NAS by exposing authenticated/encrypted WebDAV port to the internet. (I would not do the same with a SMB share.)

I use a hostname that resolves to my NAS’s true internal IP when on the LAN, and to my gateway’s external IP when I’m elsewhere. The gateway forwards the WebDAV port to my NAS. So there’s nothing I have to do when traveling - it just works.

VPN would be more secure but doesn’t just automatically work.


#6

Aha. That’s kind of what I’m after. I could implement the same scheme only using SSH instead of WebDAV. How are you forcing resolution of hostname to local IP when on the LAN?


#7

I don’t see why you couldn’t use SSH instead of WebDAV.

I run my own DNS server that internal LAN clients query, and it returns the true internal IP for my dynamic DNS hostname. On the outside the DDNS hostname resolves to my external IP. This step may not be needed if you have a firewall that will do NAT reflection.
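For reference, the split-horizon setup described here can be sketched with dnsmasq as the internal DNS server (the hostname and addresses below are placeholders, not values from this thread):

```
# /etc/dnsmasq.conf on the internal DNS server (example values)
# LAN clients asking for the DDNS name get the NAS's private address:
address=/mynas.example-ddns.net/192.168.1.15
# Clients outside the LAN never query this server; public DNS resolves
# the same name to the gateway's external IP via the DDNS record.
```

This way the backup job can keep a single hostname in its destination URL while the actual endpoint differs by network.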


#8

2 possible solutions come to mind:

  • Assuming you’re behind a NAT router, if your router supports Hairpin NAT (most modern consumer-grade routers support it out of the box, and it should be possible to set it up on enterprise-grade appliances), just point your destination to the public IP address of your internet connection, using the protocol for external use (SSH/SFTP).
    Your router will then route the traffic back to your local network, and the configured port forwarding in your router will send the packets straight to your NAS.
  • According to the documentation in the example script (line 106-108), it should be possible to change the remote URL dynamically using a script that runs before the backup operation starts.
    Unfortunately I still haven’t got it working. It looks like this is a bug, unless I’m missing something: the --remoteurl option seems to be ignored. I’ve filed a bug report on GitHub.
    Anyway, this is how it should work:

Create a batch file containing this:

@echo off

rem Replace with the IP address and MAC address
rem of an always-on device in your local network
set IPADDRESS=192.168.1.15
set MACADDRESS=00-11-32-4d-f1-5a

rem Replace with Duplicati remote URL when on external network
set REMOTEURL=ftp://1.2.3.4/Backup

rem Replace with the credentials for the remote URL when on an external network.
rem Set to %DUPLICATI__AUTH_USERNAME% and %DUPLICATI__AUTH_PASSWORD%
rem to keep credentials for local backup destination
set USERNAME=username
set PASSWORD=password

rem check if IP address replies to ICMP packets
ping -n 1 -w 1000 %IPADDRESS% > nul 2>nul
if errorlevel 1 goto ExternalNetwork

rem Check the MAC address to make sure it's your local device and not another device with the same IP
for /f "usebackq tokens=1-3 delims= " %%a in (`arp -a ^| find /i "%IPADDRESS%"`) do set FOUNDMAC=%%b
if /i "%MACADDRESS%" neq "%FOUNDMAC%" goto ExternalNetwork

:InternalNetwork
exit 0

:ExternalNetwork
echo --remoteurl="%REMOTEURL%"
echo --auth-username=%USERNAME%
echo --auth-password=%PASSWORD%
exit 0
  • Replace the %IPADDRESS% and %MACADDRESS% values with the IP and MAC address of a local device that’s always available in your local network (internet router?). The MAC address is needed to make sure a device in an external network with the same IP address isn’t mistaken for your local device.
  • Replace %REMOTEURL%, %USERNAME% and %PASSWORD% with URL and credentials of the backend to use from an external location.
  • Create a backup job and configure it for internal use (Local folder to your NAS). Add --run-script-before to the configuration and call the batch file above.

The script PINGs the configured IP address and checks the error level to find out if it’s reachable. If the error level is 0 (the device replied to PING), the MAC address is retrieved and checked against the configured address. If IP and MAC have the expected values, the script just exits and the backup job runs using the internal destination (local folder).
If the IP address is not reachable and/or the MAC address is not the expected one, the script echoes a new value for the remote location and credentials. This is the part that doesn’t seem to work.
Maybe the problem can be addressed by a fix in the batch script, or by a bug fix in the software, making it work in a future version.
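For anyone on Linux or macOS, the same idea can be sketched in POSIX shell (a rough, untested-with-Duplicati equivalent; all addresses, URLs and credentials below are placeholders):

```shell
#!/bin/sh
# Rough shell equivalent of the batch script above, for use with
# --run-script-before on Linux/macOS. All values are placeholders.
IPADDRESS="192.168.1.15"
MACADDRESS="00:11:32:4d:f1:5a"
REMOTEURL="ssh://1.2.3.4/Backup"

# Print the override options that Duplicati reads from the script's stdout
emit_external_options() {
    printf '%s\n' "--remoteurl=\"$1\""
    printf '%s\n' "--auth-username=$2"
    printf '%s\n' "--auth-password=$3"
}

# We're on the LAN if the device answers one ping AND the ARP cache shows
# the expected MAC (so a foreign host with the same IP doesn't fool us).
# Note: -W is the Linux ping timeout flag; macOS uses -t instead.
on_lan() {
    ping -c 1 -W 1 "$IPADDRESS" >/dev/null 2>&1 || return 1
    arp -n "$IPADDRESS" 2>/dev/null | grep -qi "$MACADDRESS"
}

if ! on_lan; then
    emit_external_options "$REMOTEURL" username password
fi
```

As with the batch version, this only helps once the --remoteurl handling itself works.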


#9

Hi @bythos, welcome to the forum!

@kees-z beat me to the --run-script-before answer by a few minutes, but that would be my suggestion - though you could simplify it a bit by always using SSH and having the script just change between an internal and external IP as appropriate.
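A minimal sketch of that simplification, assuming --remoteurl works as documented (the addresses and hostname are made up): one protocol, one Duplicati database, and only the host part of the URL changes.

```shell
#!/bin/sh
# Hypothetical "always SSH" variant for --run-script-before:
# the same job connects to the same NAS, just via a different address.
LAN_URL="ssh://192.168.1.15/Backup"    # NAS's private address (placeholder)
WAN_URL="ssh://nas.example.net/Backup" # DDNS name / public side (placeholder)

pick_url() {
    # $1 is "lan" or "wan"
    [ "$1" = "lan" ] && echo "$LAN_URL" || echo "$WAN_URL"
}

# Cheap LAN test: can we reach the NAS on its private address?
# (-W is the Linux ping timeout flag; macOS uses -t.)
if ping -c 1 -W 1 192.168.1.15 >/dev/null 2>&1; then
    echo "--remoteurl=\"$(pick_url lan)\""
else
    echo "--remoteurl=\"$(pick_url wan)\""
fi
```

Since both destinations use the same protocol and credentials, only the --remoteurl override is needed, avoiding the username/password juggling from the batch example.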

On the VPN suggestion, take a look at ZeroTier VPN as I believe that will dynamically use a direct local connection for the VPN if possible.


#10

Problem is that the --remoteurl option doesn’t seem to work at all. I tried to replace the local folder C:\Folder1 (configured in the backup job) with C:\Folder2 using --remoteurl, but the backup files keep being stored in C:\Folder1. So either there’s a bug in processing this option, or I misunderstand how it works.


#11

I honestly haven’t tried it myself so couldn’t say about usage - does the environment variable work any better in your tests?


#12

Environment variable %DUPLICATI__REMOTEURL% returns the value that’s configured in the backup job (C:\Folder1), even after it’s changed using --remoteurl="file://C:\Folder2" earlier in the script.


#13

@bythos, don’t you have SSH access from the internal network? If you can make it so that you use the same protocol both when you are in and out of the network, the rest is simply DNS trickery (you will need an internal DNS server).