List operation fails: Unparsed data

Hi,

I have migrated to a new storage provider (OVH) using OpenStack. I uploaded all the files and changed the connection details. When I tested the connection, it worked. I ran a first backup, and that worked too.
After rebooting my computer, I get the following error that I do not understand:
“Sin analizar datos”, which translates to “unparsed data”

Thank you for your help

I had a look at the live log and got the following error:

La operación List ha fallado con error: Sin analizar datos: true
That translates to:
The List operation failed with error: Unparsed data: true

The full context of the failure is the following:

I changed my cloud provider. I copied all the files to the new cloud and changed the connection parameters. Before changing the parameters, I exported the configuration and made a copy of the database (.sqlite file).
Then I ran a backup to test it. All good.

After that, I created a new backup job by importing the previously exported configuration and changing the database path to point at the copied database.
I ran a backup and all was good.

After that, I got the error mentioned in the previous post on the backup working against my new cloud provider.

and it sounds like this is the backup from the first paragraph that used to work. I don’t see how running the new backup could have broken the old one, though. Have you checked that the parameters are still correct, perhaps even using Export As Command-line, and tried the “Test connection” button on the Destination page?

In the live log, clicking on a one-line message will sometimes expand it into details. More details are needed.

might be the translation, but it doesn’t make sense for it to show up where yours does. It’s usually due to something unrecognized in a custom retention time string, for example as was discovered in this post.

You could also see if you can list with a command line tool. Duplicati.CommandLine.BackendTool.exe accepts the Target URL from the Export as Command-line I mentioned earlier. Don’t modify, just list.

C:\Program Files\Duplicati 2>Duplicati.CommandLine.BackendTool.exe help
Usage: <command> <protocol>://<username>:<password>@<path> [filename]
Example: LIST ftp://user:pass@server/folder

Supported backends: aftp,amzcd,azure,b2,box,cloudfiles,dropbox,file,ftp,googledrive,gcs,hubic,jottacloud,mega,msgroup,onedrive,onedrivev2,sharepoint,openstack,rclone,s3,od4b,mssp,sia,ssh,tahoe,webdav
Supported commands: GET PUT LIST DELETE CREATEFOLDER

C:\Program Files\Duplicati 2>

I have solved the issue the following way:
First I exported the backup job, deleted it, and imported it again: same error.
Then I deleted the backup job, reconfigured it manually, and changed the database path to the existing one: working.
Now everything is okay.

In case it is of interest to you, below is a screenshot of the detailed error at 19:14 yesterday:

Thanks a lot @ts678 for your support.

The log details confirm that the error came from an invalid timespan format. For a code view, it’s likely from

except the timespan wasn’t from a custom retention string (as in the case linked to earlier), but maybe --http-operation-timeout or --http-readwrite-timeout somehow got set with no value (so it ended up as true). Possibly you could spot this in one of the saved exports, or maybe Duplicati just imagined it somehow. Regardless, I’m glad the problem is gone and I hope whatever happened last time doesn’t occur again.
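Duplicati’s actual parsing happens in its C# code, but as a rough illustration of the failure mode, here is a hypothetical Python sketch (not Duplicati’s real parser) of how a flag supplied with no value can hand the literal string “true” to a timespan parser, which then reports the leftover text:

```python
import re

def parse_timespan(text):
    # Hypothetical sketch of a timespan parser: consume number+unit
    # pairs (e.g. "100s", "5m") and report any leftover characters,
    # mirroring the "Unparsed data: ..." message seen in the log.
    units = {"s": 1, "m": 60, "h": 3600, "D": 86400, "W": 604800}
    total, pos = 0, 0
    for match in re.finditer(r"(\d+)([smhDW])", text):
        if match.start() != pos:
            break
        total += int(match.group(1)) * units[match.group(2)]
        pos = match.end()
    if pos != len(text):
        raise ValueError(f"Unparsed data: {text[pos:]}")
    return total

print(parse_timespan("100s"))  # 100 (seconds)

# An option given with no value often defaults to the string "true",
# which then reaches the timespan parser:
try:
    parse_timespan("true")
except ValueError as e:
    print(e)  # Unparsed data: true
```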


FYI, I found that if you have set a retention policy in the web GUI, you cannot also have ‘retention-policy’ showing in the advanced options list with the value ‘true’, or it tries to apply both and throws the error ‘Unparsed data: true’, very similar to the above. So just close/delete that option line. No doubt this is also in the instruction manual! :blush:
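The failure described above can be sketched in the same spirit. This is a hypothetical Python illustration, not Duplicati’s actual C# code: each comma-separated retention rule must look like timeframe:interval (e.g. 7D:1D), so a stray option value of “true” gets echoed back as unparsed data.

```python
import re

def parse_retention_policy(value):
    # Hypothetical sketch, not Duplicati's real parser: a retention
    # policy is comma-separated "timeframe:interval" rules such as
    # "7D:1D,4W:1W,12M:1M"; any part that doesn't match is reported
    # back verbatim, which is where "Unparsed data: true" comes from.
    rule = re.compile(r"^(\d+[smhDWMY]|U):(\d+[smhDWMY]|U)$")
    rules = []
    for part in value.split(","):
        m = rule.match(part)
        if not m:
            raise ValueError(f"Unparsed data: {part}")
        rules.append((m.group(1), m.group(2)))
    return rules

print(parse_retention_policy("7D:1D,4W:1W"))  # [('7D', '1D'), ('4W', '1W')]

# The leftover advanced option with value "true" fails the same way:
try:
    parse_retention_policy("true")
except ValueError as e:
    print(e)  # Unparsed data: true
```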