Problem after update to 2.1.0.2 (Beta)

You likely meant the next Beta release. A plan for that and for the first Stable is here.
Plans may change, but I haven’t heard of any bug that would remove pCloud.
It’s not clear how many testers it has had, though. Fewer testers means less testing.

A Google search for `UNIQUE constraint failed: DuplicateBlock` finds only this topic.
That table is usually empty, so it rarely matters. I suspect a damaged dindex file.

If so, it would probably be easy to find by collecting even an information-level log.
After that, the challenge for an expert who gets the file would be to solve its puzzles…

I am not such an expert, but while I was studying recreate in 2023, I wrote this:

public UpdateBlock(hash, size, volumeID, transaction)
	m_findHashBlockCommand: SELECT VolumeID from Block table given Hash, Size
	if block found with the given volumeID
		return false
	else if block not found at all
		m_insertBlockCommand: INSERT into Block table Hash, Size, VolumeID
		return true
	else if block needs a volumeID
		m_updateBlockVolumeCommand: UPDATE Block table VolumeID given Hash, Size
		return true
	else (block is already set up)
		m_insertDuplicateBlockCommand: INSERT into DuplicateBlock table BlockID, VolumeID
			based on a SELECT with the given hash and size, plus the volumeID to record
		return false

The bottom INSERT into the DuplicateBlock table (BlockID, VolumeID) might have been tried repeatedly with the same values, which would trip its UNIQUE constraint.
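As a minimal sketch of that failure mode (this is not Duplicati’s actual schema; the column list and UNIQUE constraint here are assumptions for illustration), a repeated INSERT of the same row pair produces exactly this kind of SQLite error:

```python
import sqlite3

# Hypothetical simplified DuplicateBlock table with a UNIQUE constraint,
# to show how re-running the same INSERT raises the reported error.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE DuplicateBlock (
        BlockID  INTEGER NOT NULL,
        VolumeID INTEGER NOT NULL,
        UNIQUE (BlockID, VolumeID)
    )
""")

con.execute("INSERT INTO DuplicateBlock (BlockID, VolumeID) VALUES (1, 7)")
try:
    # Same INSERT again, e.g. from reprocessing the same (damaged?) dindex data
    con.execute("INSERT INTO DuplicateBlock (BlockID, VolumeID) VALUES (1, 7)")
except sqlite3.IntegrityError as e:
    print(e)  # UNIQUE constraint failed: DuplicateBlock.BlockID, DuplicateBlock.VolumeID
```

If the recreate code path re-enters that final branch with values it already recorded, SQLite rejects the duplicate row rather than silently ignoring it.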

Maybe we’ll have a comment on both of your questions from the lead developer sometime.

In terms of what to do: if you truly have backup damage (maybe helped along by pCloud flakiness), reverting to 2.0.8.1 or waiting for the next Beta likely won’t help a recreate. Do you run recreate that often?

Maybe your 2.0.8.1 database is still around and could be used. Its name would likely look like:

"DBPath": "C:\Users\user\AppData\Local\Duplicati\NXJLOMBWSA.sqlite"

except likely with “backup” or “.bak” and the date of the 2.1.0.2 update in the file name.

Even if this can be made to work, it’s dangerous to live in a can’t-recreate-database situation.
A fresh start is probably advisable at some point. If you need old data now, that’s a different task.

This sort of approach is always risky, as repeated errors can cause problems you don’t notice.

The discussion above is all quite speculative, as this is the first report of the issue and I’m not an expert.
