Remote files found for a brand new backup

Running: Duplicati - 2.0.4.5_beta_2018-11-28 in a Docker container on Linux

I created a brand new backup set, clicked the “Run now” link and after a couple of minutes got an email from myself stating:

Failed: Found 6510 remote files that are not recorded in local storage, please run repair

How are there 6510 files not recorded in local storage for a brand new backup?

I did attempt to run the repair, but it then told me that it couldn’t decrypt the files. I’d expect that, since each of my backup sets has a different encryption key. (Don’t ask why; that just seemed like the right thing to do.)

I do have 3 other backup sets that have been running for a few weeks, so it makes sense that it’s finding files at the remote (the destination for them all is the same remote location).

I’m pretty new, so please let me know what additional info you need to help me solve this issue and I’ll be more than happy to supply it, though I might need a bit of hand-holding in gathering it all up.

Thanks!
FreeMan

Did you target the same remote storage location as another backup? If you did, it will confuse Duplicati and possibly corrupt your backup data.

Each backup job on each computer must target a unique storage location. No two backups should use the exact same location.

The easiest solution is to create subfolders in your backup storage, one subfolder per backup job.
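For example, the target URLs for two jobs might look something like this (server, paths, and credentials are all made up):

```
# Each backup job gets its own subfolder on the same server (all names hypothetical)
duplicati-cli backup "ftp://backup.example.com/duplicati/photos?auth-username=user&auth-password=pass" /data/photos --passphrase="photos-key"
duplicati-cli backup "ftp://backup.example.com/duplicati/docs?auth-username=user&auth-password=pass" /data/docs --passphrase="docs-key"
```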

I guess that’s the problem. It’s weird because I’ve got 3 other backups running that all use the exact same destination.

I guess I’ll change this one to keep it happy.

Thanks.

Are you by chance using the --prefix option on at least 2 of those 3 backups? Setting a unique prefix on each backup job is a way to have more than one backup use the exact same destination.

If you aren’t using --prefix, then I don’t know how those 3 backup jobs aren’t throwing errors and corrupting backup data.
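For what it’s worth, sharing one folder with unique prefixes would look roughly like this (values made up):

```
# Two jobs sharing one destination folder, kept apart by --prefix (illustrative values)
# The default prefix is "duplicati", so two jobs in one folder without it will clash like this
duplicati-cli backup "ftp://backup.example.com/duplicati?auth-username=user&auth-password=pass" /data/photos --prefix=photos
duplicati-cli backup "ftp://backup.example.com/duplicati?auth-username=user&auth-password=pass" /data/docs --prefix=docs
```

The remote volumes then come out as photos-*.dblock.zip.aes versus docs-*.dblock.zip.aes, so each job only sees its own files.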

No, not using --prefix.

I’m getting no errors, but I haven’t attempted any test restores yet, so I may have a bunch of useless backups.

I guess I’ll just start over. (I’m still using CrashPlan, though my goal is to move away since that seems to be what they want, so I’ve still got a complete backup for now.)

Nope, I’m totally wrong here: I just took a look at my other backups and each one is in its own subdirectory, so all is good.

I was having issues with the amount of time it was taking to back up all my photos (about 1.5TB and around 300k files), so I’d started breaking them up into batches. I created the next batch of photo backups to go to the existing backup directory. I’ll go back and select the additional subdirectories in my original backup, and I think I’ll be good there, too.

I’ve done some reading on setting volume and block sizes, so I think it’ll run a bit faster now and that I’ll be OK to do them all in one backup set. After all, the pics themselves don’t change all that often; usually it’s dump an SD card, back ’em up. I do some editing, but not significant amounts, and I usually save a copy as well as the original, so there won’t be many updated files.
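For reference, this is roughly what I’m planning to run; the sizes are just my own guesses for ~1.5TB of mostly static photos, not official advice:

```
# Larger remote volumes and blocks for a big, mostly static photo set (my guesses)
# Note: --blocksize can't be changed after the first backup, so it has to be set up front
duplicati-cli backup "ftp://backup.example.com/duplicati/photos?auth-username=user&auth-password=pass" /data/photos \
  --dblock-size=200MB \
  --blocksize=1MB
```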

Yeah, initial backups can take a while. I have about 500GB of data in total that I’m protecting, and it took a few weeks to back up to Backblaze B2. Subsequent backups are very fast, though.

Where are you backing up your data? Cloud or…?

My own server. At the moment the main server and the backup server are sitting next to each other. The backup will soon go offsite to live with my son at college.

I’m accessing it via FTP. Yes, I know FTP isn’t secure; I’m working on a VPN solution between the two machines before the backup server moves offsite.
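In case it helps anyone else setting this up, the FTP destination URL looks something like this (host, path, and credentials changed, obviously):

```
# Duplicati FTP destination URL format (all values are placeholders)
ftp://backup.example.com/duplicati/photos?auth-username=user&auth-password=pass
```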