Where Does the Time Go?

This is a log from a backup that began at 20:00 local (01:00Z). The UI says it took 3h49m.

The only things I see that are time-stamped, however, took less than two minutes each.

The target is a remote SFTP server, so I know this will never be a snap-of-the-fingers matter. But I would really like to have some way of reducing the time this takes, which would start with being able to measure and predict it… neither of which seems available just now.

DeletedFiles: 51
DeletedFolders: 0
ModifiedFiles: 410
ExaminedFiles: 104332
OpenedFiles: 974
AddedFiles: 564
SizeOfModifiedFiles: 5290550266
SizeOfAddedFiles: 99304448
SizeOfExaminedFiles: 511260506609
SizeOfOpenedFiles: 5390379788
NotProcessedFiles: 0
AddedFolders: 14
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Success
    EndTime: 11/28/2017 4:47:48 AM
    BeginTime: 11/28/2017 4:47:14 AM
    Duration: 00:00:33.9211330
    BackendStatistics:
        RemoteCalls: 58
        BytesUploaded: 1701681377
        BytesDownloaded: 74635207
        FilesUploaded: 53
        FilesDownloaded: 3
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 0
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 16161
        KnownFileSize: 539494040493
        LastBackupDate: 11/27/2017 8:00:00 PM
        BackupListCount: 59
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20171019T050000Z.dlist.zip.aes
        Value: [],
        Key: duplicati-iceb12cab040543c1af053ee4c764d05d.dindex.zip.aes
        Value: [],
        Key: duplicati-bdd05ede1f223429fac454652040001ae.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 11/28/2017 4:49:34 AM
    BeginTime: 11/28/2017 4:48:05 AM
    Duration: 00:01:28.5230270
ParsedResult: Success
EndTime: 11/28/2017 4:49:34 AM
BeginTime: 11/28/2017 1:00:00 AM
Duration: 03:49:34.4969940
Messages: [
    No remote filesets were deleted
]
Warnings: []
Errors: []
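For what it's worth, the only timestamped sub-results in the log (the delete pass and the test pass) account for barely two minutes of the run. A quick back-of-the-envelope check (plain Python, with the durations copied verbatim from the log above) shows how much of the run is simply unaccounted for:

```python
from datetime import timedelta

# Durations taken verbatim from the log above
total = timedelta(hours=3, minutes=49, seconds=34.497)  # overall Duration: 03:49:34.4969940
delete = timedelta(seconds=33.921)                      # DeleteResults Duration: 00:00:33.9211330
test = timedelta(minutes=1, seconds=28.523)             # TestResults Duration: 00:01:28.5230270

unaccounted = total - delete - test
print(unaccounted)  # roughly 3h47m -- almost the whole run has no timestamped entry
```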

I don’t know how realistic prediction would be unless it was based on the past performance of the specific source and destination involved; there are too many variations in CPU, I/O, bandwidth, etc. to make accurate “cold” predictions.

However I completely agree that measuring would be great. Originally I thought we might be able to update the code to have a central logging object to record various times of tasks but now I’m wondering if the info we want might already be available in the .sqlite logs.

For example, I noticed in the Live logs that many actions are shown with a “took hh:mm:ss.sss” suffix. I’m hoping a little SQLite log digging might be able to produce some summaries of task times per backup. I guess it’s time to start learning SQLite tools… :slight_smile:
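I don’t know the sqlite schema offhand, so as a stopgap here’s a rough sketch of what I mean by summarizing those “took hh:mm:ss.sss” suffixes. It’s plain Python over a made-up live-log excerpt (the line format is my guess at what the live log shows, not copied from a real log):

```python
import re
from collections import defaultdict

# Hypothetical live-log excerpt -- the "took h:mm:ss.sss" suffix is the bit
# we care about; the surrounding line format here is invented for illustration.
log = """\
Backend event: Put - Completed: duplicati-b1.dblock.zip.aes took 0:00:42.120
Backend event: Put - Completed: duplicati-b2.dblock.zip.aes took 0:01:03.544
Backend event: Get - Completed: duplicati-20171019T050000Z.dlist.zip.aes took 0:00:05.310
"""

pattern = re.compile(r"Backend event: (\w+) .* took (\d+):(\d+):([\d.]+)")

totals = defaultdict(float)  # action name -> total seconds spent
for line in log.splitlines():
    m = pattern.search(line)
    if m:
        action, h, mnt, s = m.groups()
        totals[action] += int(h) * 3600 + int(mnt) * 60 + float(s)

for action, secs in sorted(totals.items()):
    print(f"{action}: {secs:.1f}s")  # e.g. total time spent in Put vs Get
```

The same grouping would apply to whatever the sqlite log tables turn out to contain.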

@Kahomono

There’s a verification step after the backup. On slow asymmetrical connections it can take a long time depending on the size of your volume. Have you confirmed it’s not the upload?

Similar uploads take the expected amount of time (based on 2GB per hour)

And these backups are similar in size but wildly variable in time. It could indeed be the verification.
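As a rough sanity check against the log above (plain Python; the 2 GB/hour rate is my own estimate, not a measurement): at that rate the ~1.7 GB uploaded should take well under an hour, so the upload alone can’t explain 3h49m:

```python
BYTES_UPLOADED = 1_701_681_377        # BytesUploaded from the BackendStatistics above
RATE_BYTES_PER_HOUR = 2_000_000_000   # assumed ~2 GB/hour upload speed

expected_minutes = BYTES_UPLOADED / RATE_BYTES_PER_HOUR * 60
print(f"expected upload time: {expected_minutes:.0f} minutes")  # about 51 minutes
```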

@Kahomono Enable logging to find out what is causing the delay. In my experience, delays in backups were related to verification. Verification can be disabled, although that’s not recommended, as it tells you whether the “backend” is consistent. Keep us updated so that we can sort the issue out for you.