Backup to G Suite Drive not working anymore?

Hi, I’m using Duplicati for the first time to make backups to Google Drive. Everything seems configured correctly, and a few very small files even appear in the corresponding Google Drive folder (so the auth code is valid and the connection is there…?), but the upload itself doesn’t seem to work. I tried with many different files, smaller and bigger ones, but without success!

Can someone help?
Stefan

Here’s the log:

13. Dec. 2019 18:05: Result
DeletedFiles: 0
DeletedFolders: 1
ModifiedFiles: 0
ExaminedFiles: 0
OpenedFiles: 0
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 0
SizeOfOpenedFiles: 0
NotProcessedFiles: 0
AddedFolders: 1
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults:
    DeletedFileCount: 0
    DownloadedFileCount: 0
    UploadedFileCount: 0
    DeletedFileSize: 0
    DownloadedFileSize: 0
    UploadedFileSize: 0
    Dryrun: False
    MainOperation: Compact
    ParsedResult: Success
    Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
    EndTime: 13.12.2019 18:05:29 (1576256729)
    BeginTime: 13.12.2019 18:05:29 (1576256729)
    Duration: 00:00:00.0028435
    Messages: [
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-13 18:05:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 Bytes),
        2019-12-13 18:05:23 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
        2019-12-13 18:05:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
    ]
    Warnings: []
    Errors: []
BackendStatistics:
    RemoteCalls: 8
    BytesUploaded: 2503
    BytesDownloaded: 2503
    FilesUploaded: 3
    FilesDownloaded: 3
    FilesDeleted: 0
    FoldersCreated: 0
    RetryAttempts: 0
    UnknownFileSize: 0
    UnknownFileCount: 0
    KnownFileCount: 9
    KnownFileSize: 7861
    LastBackupDate: 13.12.2019 18:05:21 (1576256721)
    BackupListCount: 3
    TotalQuotaSpace: 11091849151219
    FreeQuotaSpace: 96732873459
    AssignedQuotaSpace: -1
    ReportedQuotaError: False
    ReportedQuotaWarning: False
    ParsedResult: Success
    Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
    Messages: [
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-13 18:05:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 Bytes),
        2019-12-13 18:05:23 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
        2019-12-13 18:05:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
    ]
    Warnings: []
    Errors: []
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    ParsedResult: Success
    Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
    EndTime: 13.12.2019 18:05:29 (1576256729)
    BeginTime: 13.12.2019 18:05:29 (1576256729)
    Duration: 00:00:00.0078009
    Messages: [
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-13 18:05:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 Bytes),
        2019-12-13 18:05:23 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
        2019-12-13 18:05:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
    ]
    Warnings: []
    Errors: []
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20191213T170521Z.dlist.zip.aes
        Value: [],
        Key: duplicati-i8aea351fbfca4a82a8722aadf0e53844.dindex.zip.aes
        Value: [],
        Key: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
    EndTime: 13.12.2019 18:05:35 (1576256735)
    BeginTime: 13.12.2019 18:05:32 (1576256732)
    Duration: 00:00:02.9939713
    Messages: [
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-13 18:05:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 Bytes),
        2019-12-13 18:05:23 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
        2019-12-13 18:05:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
    ]
    Warnings: []
    Errors: []
ParsedResult: Success
Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
EndTime: 13.12.2019 18:05:35 (1576256735)
BeginTime: 13.12.2019 18:05:21 (1576256721)
Duration: 00:00:14.0674594
Messages: [
    2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
    2019-12-13 18:05:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
    2019-12-13 18:05:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 Bytes),
    2019-12-13 18:05:23 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
    2019-12-13 18:05:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd008626090b942eb8a5d8aac82d67b5c.dblock.zip.aes (829 Bytes),
]
Warnings: []
Errors: []

Welcome to the forum @Stefan_Tinkhauser

Does the topic title “not working anymore” mean it used to work, or does “for the first time to try” mean you’re new to Duplicati, or something else? If new, please see the Overview if you’re surprised not to see your files show up under their original names in the backup. You should be getting one dated dlist file, plus pairs of dblock and dindex files holding the actual data of your source files. The problem is, your log shows no source (ExaminedFiles: 0, SizeOfExaminedFiles: 0), which seems strange.

Creating a new backup job shows how you checkmark “Source Data” items for backup. Are these in the checkmarked area? You can watch names scroll by in About --> Show log --> Verbose. Starting a Restore can also show what you backed up. From your log, there should be at least one folder (AddedFolders: 1).

Hi ts678, thank you for the fast response! I found the issue: a filter excluding all files (*) was applied (I probably clicked something wrong when configuring the backup).
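
For anyone who hits the same symptom: on the command line, the equivalent misconfiguration would look roughly like this. This is only a sketch; the destination URL and source path are made up, but the --exclude option itself is a real Duplicati option:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" ^
  "C:\Users\Stefan\Documents" ^
  --exclude="*"
REM --exclude="*" filters out every source file, so the job still reports
REM Success while the log shows ExaminedFiles: 0, exactly as above.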

Currently it’s working on a backup of a single file of almost 3 GB, but it’s very slow:

  • CPU only at about 30%
  • hard disk at about 10% (Windows Task Manager)

Is this the normal working speed of Duplicati while preparing files for upload? (By the way, the hard disk holding Duplicati’s temporary files has enough free space…) The upload itself (50 Mbit/s) then goes faster than the preparation of the files… I guess something is wrong there?

By the way: is there a way to disable the automatic remote file verification after each upload? (It needs to download the whole upload again, which in my opinion is unnecessary for an upload to G Drive!)

Thank you again!
Stefan

How many logical CPU cores does Task Manager show? If, for example, you had a quad core without hyperthreading, a single completely busy core would show up as only 25%. Duplicati processes files in parallel at some points but not at others. My own backup looks disk-limited, hovering around 100% disk with CPU maybe 30% above the usual background. I don’t think any systematic testing or survey has been done.

--concurrency-block-hashers and --concurrency-compressors are some of the knobs you can try turning…
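
For example, a rough sketch (the worker counts are only illustrative, and the destination URL is made up):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" "C:\source" ^
  --concurrency-block-hashers=4 ^
  --concurrency-compressors=4
REM Raising these beyond the defaults only helps if spare CPU cores are
REM actually sitting idle during the backup.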

Hash calculation is mandatory; encryption is optional (but a good idea unless the destination is trusted); compression has some adjustments which may help. Files with already-compressed extensions aren’t recompressed.

More settings are available if you really need to try to tune the speed up. Newer Duplicati versions have concurrent uploads, but that’s apparently not the limiting factor in your case. Presumably it was for the volunteer who added it…

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help zip
Zip compression (.zip):
 This module provides the industry standard Zip compression. Files created
 with this module can be read by any standard-compliant zip application.
 Supported options:
  --zip-compression-level (Enumeration): Sets the Zip compression level
    This option controls the compression level used. A setting of zero gives
    no compression, and a setting of 9 gives maximum compression.
    * values: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
    * default value: BestCompression
  --compression-level (Enumeration): Sets the Zip compression level
    [DEPRECATED]: Please use the zip-compression-level option instead
    This option controls the compression level used. A setting of zero gives
    no compression, and a setting of 9 gives maximum compression.
    * values: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
    * default value: BestCompression
  --zip-compression-method (Enumeration): Sets the Zip compression method
    This option can be used to set an alternative compressor method, such as
    LZMA. Note that using another value than Deflate will cause the
    zip-compression-level option to be ignored.
    * values: None, Deflate, BZip2, LZMA, PPMd
    * default value: Deflate
  --zip-compression-zip64 (Boolean): Toggles Zip64 support
    The zip64 format is required for files larger than 4GiB, use this flag to
    toggle it
    * default value: False


C:\Program Files\Duplicati 2>
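
If the Deflate work itself turns out to be the bottleneck, a lower level trades backup size for speed. A sketch, with an illustrative level and a made-up destination:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" "C:\source" ^
  --zip-compression-level=1
REM Per the help text above: 0 gives no compression, 9 gives maximum;
REM lower levels use less CPU per dblock volume but upload more bytes.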

How the backup process works gets into the rather extensive preparation; it’s hugely more than a copy. Features explains some of the benefits of doing the extra work. There isn’t enough instrumentation for you to easily identify the limiting resource, but (as mentioned earlier) it may be CPU-bound.

Thanks again for the detailed explanation. Before I get into testing different compression and encryption settings, I wanted to tell you that the CPU (4 logical cores) does not seem to be the bottleneck, since all cores are used at the same level of only about 30%! Only rarely, and just for a few seconds, does it go up to 60-70%.

Could it be some sort of CPU usage priority issue? (No other demanding task is running at the same time, though…)

Stefan

Resource Monitor can be launched from Task Manager to see that (or maybe you looked some other way), but I think you’re deeper into the chase than most people get. Ideally one might try profiling tools. Wringing maximum performance out of software is a lot of work, and the developers are busy with other bugs.

If you seriously want to pursue this, you should probably work with a developer, if a suitable one is available. The original author would be ideal, but he has very little time and is the only person who does certain things.

--thread-priority should be able to change that, but more people want it lower, to make Duplicati less impactful.
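
In your case you’d try raising it instead; roughly like this (the value is just an example, and I haven’t measured whether it actually helps):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" "C:\source" ^
  --thread-priority=abovenormal
REM Accepts the usual Windows priority names, e.g. idle, belownormal,
REM normal, abovenormal, high.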

EDIT:

--backup-test-samples controls a form of the TEST command, run as “Verifying backend files”, but note that it’s hardly a heavy verification. The biggest download is probably a single dblock file at the default 50 MB size… Occasionally such sampled file verification does turn up problems, but it’s your choice whether to run it.
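
That also answers the earlier question about disabling the after-upload check: setting the sample count to zero should skip those post-backup downloads entirely. A sketch (made-up destination; whether to give up the safety check is your call):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" "C:\source" ^
  --backup-test-samples=0
REM 0 disables the sampled download test after each backup; the default
REM of 1 verifies one sample of the uploaded files per run.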

And on that note: if you were producing 50 MB files faster than they upload (which you say is not the case, though I’m not sure exactly how that’s determined), you can watch 50 MB files flow through your --asynchronous-upload-folder, which on Windows is probably AppData\Local\Temp in the user profile that runs Duplicati.
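
If you’d rather put that spool folder somewhere you can watch, or on a faster disk, something like this works; both options are real, but the path here is just an example:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup ^
  "googledrive://Apps/Duplicati?authid=<AuthID>" "C:\source" ^
  --asynchronous-upload-folder="D:\DuplicatiSpool" ^
  --asynchronous-upload-limit=4
REM asynchronous-upload-limit caps how many finished volumes may wait in
REM the spool folder before the backup pauses to let uploads catch up.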