Missing XSRF token while trying to access "Restore" feature

Hi,

Duplicati seemed okay for a while, but now I want to verify a file discrepancy against what is backed up. In the web UI, if I try to look at the list of files by going to “Restore”, it keeps giving me a “Missing XSRF token” error after a few minutes of waiting. This happens even after reloading, clearing the cache, and restarting the web browser.

I thought it might be because I was accessing it over the LAN, so I tried SSH-forwarding the port; that didn’t work, same error.
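For reference, the forward I used was along these lines (8200 is Duplicati’s default web UI port; substitute your own port and host):

```shell
# Forward local port 8200 to the Duplicati web UI on the Pi
# (8200 is Duplicati's default port; adjust if yours differs)
ssh -L 8200:localhost:8200 root@kumo-cluster01
# then browse to http://localhost:8200 on the local machine
```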

Then I tried accessing it locally, running Firefox in a VNC session, same error.

It seems like I cannot access the restore functionality properly and I have no idea why.

My system

[root@kumo-cluster01 ~]# neofetch
██████████████████  ████████   root@kumo-cluster01
██████████████████  ████████   -------------------
██████████████████  ████████   OS: Manjaro ARM Linux aarch64
██████████████████  ████████   Host: Raspberry Pi 4 Model B Rev 1.4
████████            ████████   Kernel: 6.2.11-1-MANJARO-ARM-RPI
████████  ████████  ████████   Uptime: 5 days, 10 hours, 8 mins
████████  ████████  ████████   Packages: 861 (pacman), 4 (snap)
████████  ████████  ████████   Shell: bash 5.1.16
████████  ████████  ████████   Terminal: /dev/pts/0
████████  ████████  ████████   CPU: BCM2835 (4) @ 2.200GHz
████████  ████████  ████████   Memory: 1256MiB / 7809MiB
████████  ████████  ████████
████████  ████████  ████████
████████  ████████  ████████


[root@kumo-cluster01 ~]#

Duplicati installed is 2.0.6.105_canary_2023-04-09

Does anyone have any ideas what may be going on here?

It’s probably a cookie problem. Clear at least the ones for whatever host you browse to, e.g. localhost.
“How to do Hard Refresh in Chrome, Firefox, Edge and Mac’s Browser?” seems redundant, but is easy.
Having multiple Duplicati instances open in one web browser (do you?) is another way for them to confuse each other.

Though the method varies by browser, you can usually see cookies. This cookie is called xsrf-token. Curious people can see the cookies go in on network requests, e.g. with developer tools, often on F12.
There are timestamps involved, so make sure that your system times are doing something reasonable.
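If digging through the browser UI is a pain, something like curl can show the cookie exchange directly from the command line (hostname and port here are assumptions based on Duplicati’s defaults; adjust to your setup):

```shell
# Print the response headers from the web UI root, filtering for the
# cookie lines; the xsrf-token and its expiry show up right there
# (localhost:8200 is an assumption)
curl -sv http://localhost:8200/ -o /dev/null 2>&1 | grep -i 'cookie'
```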

I’ve tried multiple different browsers and two computers at this point. I usually have one tab.

I tried Firefox running locally on the same Raspberry Pi 4 that Duplicati is on, over VNC in both regular browsing mode and in incognito mode.

I’ve tried Chrome for Android and Firefox for Android via both a LAN domain and by forwarding the port for it over SSH.

I still get the same token error.

I even tried fetching a new auth token for my Google Drive thinking that was the issue.

I tried clearing the cache and clearing cookies. In the case of Firefox on the Raspberry Pi, it was running a completely new profile with no extensions installed. I really don’t know the issue.

The only thing I can think of is that maybe the system is “timing out” because it takes so long to grab all of the information for my large backup.

Source:
    6.32 TB
Backup:
    4.56 TB / 26 Versions

If that is indeed the case, is it possible to extend the timeout?

Whether or not that is the case, the existing timeout options are there to handle slow destinations; e.g.
http-operation-timeout is sometimes raised for OneDrive, where the 100-second default is sometimes too little.
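As a sketch only (the destination, paths, and source folder below are placeholders, not from this thread), the option is just appended to an exported command line or added under the job’s advanced options:

```shell
# Hypothetical example: raise http-operation-timeout from the 100 s
# default to 10 minutes; everything except the option name is made up
Duplicati.CommandLine.exe backup "onedrivev2://Backup/" "C:\backup source\\" --http-operation-timeout=10m
```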

Web browsers sometimes have response timeout settings (maybe hard to find), and sometimes not.

This one seems a little trickier because Duplicati got a request with a bad or missing cookie, hence the error.

You can probably keep an eye on that cookie (as mentioned, using the browser facilities to watch it either as it sits in the store or as it travels on the request) to see if it’s heading toward expiry.

Instead of watching and forecasting, it might also be possible to capture the traffic, then look back at failure time for clues on what led to it. If the cookie had expired, you’d probably see none sent on the failing request.
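For example, something like tcpdump can record the web UI traffic for later inspection in Wireshark (the interface and port below are assumptions, and this only works while the UI is plain HTTP):

```shell
# Capture loopback traffic to/from the web UI port into a file
# (port 8200 and interface lo are assumptions; adjust as needed)
sudo tcpdump -i lo -w duplicati.pcap 'tcp port 8200'
```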

I’d note that the command line client needs none of this, so that’s a way out if you’re ever in a pinch.

I hope you set a large blocksize (maybe 5 MB) or things will be slow. How many files? It’s in your log.

I have the blocksize currently set to 10240 KBytes for the backup.

I was a little confused by the CLI commands when I looked into them. How can I list a specific backup and the files/folders inside it? Is there a way to do it as easily as “ls” on Linux?

I suspect some files got deleted by accident, but I wanted to double-check by looking at the history of one of the backups from a few days ago.

The find command, which seems to have list as a synonym. It’s not as easy as ls because it’s not a standalone command: to be compatible with a GUI backup you need to tell it where the GUI’s database lives, and it also wants to know the destination, but Export As Commandline gives the necessary details.

Example, and I’m violating the documented order and getting away with it, so that’s apparently possible.

Duplicati.CommandLine.exe find "file://C:\ProgramData\Duplicati\duplicati-2.0.6.105_canary_2023-04-09\RUN\test 3\\" --dbpath="C:\ProgramData\Duplicati\duplicati-2.0.6.105_canary_2023-04-09\RUN\TLORVLFKHB.sqlite" "C:\backup source\*"
Listing contents 0 (4/26/2023 11:15:02 AM):
C:\backup source\length0.txt (0 bytes)
C:\backup source\length1.txt (1 bytes)