Could not find file "/tmp/ ..." error

Hello,

I’m trying to set Duplicati up to back up to S3. I’m running Ubuntu 20.04. No matter what I do, I get the same error:

Could not find file etc.

I have 900 GB of free space on this drive, so space in my /tmp/ folder isn't the issue. I've tried altering the remote volume size (from 20 MB up to 200 MB), but it still fails.

Any help?

Welcome to the forum @error

On your job's Destination screen (screen 2 of the job setup), does the Test connection button succeed? It's only a small test.
I assume this is a new setup, and it’s not getting far on the backup. Are any files showing up on S3?
You can watch About --> Show log --> Live --> Retry, but I’d prefer to get a log file if anything’s weird.
Could not find file gave directions for a similar recent case, where "Could not find file" was a follow-on error.
While more info is needed to figure out how to fix that directly, one way to avoid it may be to solve the earlier failure.

Thanks for your reply, ts678.

The test connection is successful. No files (or folders) show up on S3, but Duplicati does successfully create a new bucket with the designated bucket name. And yes, it's a brand-new setup.

I'm currently trying to upload a folder with about 6000 photographs. When I set the remote volume size to the default of 50 MB, Duplicati hits the error after it has processed about 70 files (which may equate to roughly 50 MB?). If I set the volume size to 20 MB, it hits the error earlier (after around 25 files), and when I set it to 200 MB, it hits the error much later (after several hundred files).

I get the impression that Duplicati is failing to write the newly created volume to /tmp/. I've tried changing the temporary directory, but the job still failed. No files are ever transferred to S3; the whole thing fails before that point.

I've read through several other threads on this forum, but honestly I'm quite new to Linux, so I'm perhaps slightly out of my depth.

Thanks again.

Please look at the live log to see whether any upload attempts appear. Failed uploads will get retried and will show a reason, although you might need to click on the line to expand it. Alternatively, set up a log file as described in the linked topic.
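If it helps, a log file can be turned on with a couple of advanced options (on the job's Options screen, or on the command line). Something like this should work, treating the path as just an example:

--log-file=/tmp/duplicati.log --log-file-log-level=Retry

Retry level is usually enough to show failed uploads and their reasons; Verbose gives more detail at the cost of a bigger file.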

In addition to the easier logging mentioned above, low-level logging can prove or refute the theory that Duplicati is failing to write the volume into /tmp/.

If you can find the PID (process ID) of the mono-sgen process for your Duplicati, you can run strace:

strace -f -e trace=file -o /tmp/strace.log -p <pid>
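If you're not sure how to find that PID, a standard tool can list the matching processes along with their command lines:

pgrep -af mono-sgen

Pick the entry that is actually your Duplicati instance, and use its PID in the strace command above.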

You can run strace as root or as your own user (which might require some steps to grant permission to trace).
Look in the strace log file (grep can help) to see what has been going on with the file name of interest.
The two basic possibilities are that the file was never created, or that it was created but then removed.
The previous guess was on the remove side, but no strace logs were taken, so who knows?
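For example, substituting whatever file name the error message actually complains about (the dup- name below is only a placeholder):

grep dup-xxxxxxxx /tmp/strace.log

That should show every file-related syscall touching that name, which tells you whether it was created and later unlinked, or never created at all.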

It would be nice to have a super-simple reproducible way for anybody to cause this issue.
If you’re willing to experiment, see how easy you can make it (ideally without needing S3).
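As a rough sketch of what such an experiment might look like (paths are placeholders, and I'm going from memory on the exact option names), the command-line client with a plain local-folder destination should exercise the same temp-file handling:

duplicati-cli backup file:///home/you/dup-test-dest /home/you/Pictures --dblock-size=50MB --no-encryption

If that reproduces the "Could not find file" error without S3 in the picture, it would make the problem much easier for anyone to investigate.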

I wish I could get to the root of the problem above, but alas it's now working (a good thing!) and I can't replicate the previous issues. It must have been an access issue of some kind; I allowed Duplicati to create its own bucket, and I also tried creating a bucket for it via the web console, but both approaches failed.

I deleted everything from my S3 account and pointed Duplicati at the one bucket I've been using for years. Now it's working like a dream.

I’m glad for that, but sorry there was not a chance to take a good look at this issue to understand it better.
The two remaining mysteries are what sort of access issue it was, and how it gets “Could not find file”…