Backup files are created in %TEMP%, not in the destination path

Hello,

I have been using 2.0.2.1_beta_2017-08-01 on my laptop for two months without any problem.

Since yesterday, I have been trying to use it on my other computer with the same configuration (backup to a USB HDD).

During the first backup, I stopped it to add an option (a prefix name).
I deleted the backup files that had already been created.
I tried to restart the job, but deleting those files was not a good idea.

I tried to repair the database and received an error. In the end, I think it was OK.

Now, when I start the job manually, the backup files are created in %TEMP% and not in my destination path!

I deleted the job twice; same issue.
I uninstalled and reinstalled Duplicati (2.0.2.1_beta_2017-08-01); same issue.

Do you have any idea what might cause this?

Thanks.

I don’t have any speed problem on my first computer, but on this second one it’s so slow… with the same settings…

Sometimes the destination file grows at 1 KB/second, sometimes at around 10 KB/second, sometimes at 1 or 2 MB/second.

With 5 TB to back up, it’s just impossible.

So, the destination files are copied to the correct destination path after all. It’s just far too slow; it’s impossible to use Duplicati on this computer…

Why is it different from my first computer?!

Strange…

Options :

  • compression-level: 9
  • zip-compression-method: Deflate
  • zip-compression-level: 9
  • backup-test-samples: 10
  • compression-extension-file

I expect a dramatic improvement if you remove the deprecated compression-level setting and change zip-compression-level to 1 (or even 0 to disable compression completely).

Remove the compression-extension-file setting to stick to the default list of extensions that are excluded from compression. If the option is set without a filename, or the specified file contains incorrect information, all files (including already-compressed files like .JPG or .ZIP) will be compressed again without any benefit.
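For reference, as far as I know that file is just a plain text list with one extension per line. Something like this (illustrative only, not the actual shipped list):

    .7z
    .jpg
    .mp3
    .zip

Files whose extensions appear in the list are stored in the backup volumes without being compressed again.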

You can also remove the zip-compression-method setting, because Deflate is the default setting.
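In short, if those assumptions match what you want, the job’s advanced options would shrink to something like:

  • zip-compression-level: 1
  • backup-test-samples: 10

Everything else would then fall back to its default value.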

Thanks for the message.

compression-extension-file: I didn’t enter a path, but the setting uses the default file.

Why change the compression level? I need to compress my 3.6 TB to save whatever space can be saved.

I removed:

  • compression-level
  • zip-compression-method: Deflate

I will test.

Thanks! :)

Duplicati still uses a single processor core for processing blocks, hashing, compression, and uploading to the backend. Compression at the default level is known to be a relatively resource-intensive task. Decreasing the compression level to 1 is reported to improve overall speed, at the cost of using a bit more storage space.
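To get a feel for the difference, here is a minimal sketch in plain Python (the standard zlib module implements Deflate; this is not Duplicati’s code, and the sample data is made up) comparing single-core throughput at level 1 and level 9:

    # Minimal sketch, not Duplicati code: it only illustrates why Deflate
    # level 9 is much more CPU-intensive than level 1 on a single core.
    import os
    import time
    import zlib

    # Made-up sample data: half incompressible (random), half highly compressible.
    data = os.urandom(4 * 1024 * 1024) + b"A" * (4 * 1024 * 1024)

    for level in (1, 9):
        start = time.perf_counter()
        compressed = zlib.compress(data, level)
        elapsed = time.perf_counter() - start
        mib = len(data) / (1024 * 1024)
        print(f"level {level}: {mib / elapsed:6.1f} MiB/s, "
              f"ratio {len(compressed) / len(data):.2f}")

On most machines, level 1 comes out several times faster than level 9, while the output is only somewhat larger.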

AFAIK, parallel processing is being worked on, so in the future overall backup speed may increase, even when using higher compression levels.