Strange backup sizes with rclone on OneDrive for Business


Since I had a very poor experience with Duplicati's built-in OneDrive for Business backend (like this and this), I tried the new rclone backend.

Now my backups seem to run fine; however, the reported backup sizes look strange to me. I have one "old" backup, let's call it Backup_old, and a new backup using the rclone backend, Backup_new. The only difference between the two is the volume size: 500 MB for Backup_old versus the default 50 MB for Backup_new. Both backups include exactly the same files. Duplicati reports the following:

Source: 207.45 GB
Backup_old: 167.91 GB / 38 versions
Backup_new: 160.71 GB / 1 version

However, the OneDrive for Business storage metrics report:
Backup_old: 168.2 GB
Backup_new: 321.8 GB

So the new backup uses twice as much space on the remote storage and is even bigger than the source files, which seems a bit strange.

So what Duplicati reports looks OK, but OD4B reports strange things, right? I'll verify my backup later. I added the rclone backend for OD4B :slight_smile:.

Not sure when I'll be able to verify. Can you already check whether it's counting stuff in the recycle bin or something?

Thanks for your reply.

Yes, what Duplicati reports is the same as with the OD4B backend in the old backup. Only the OD4B storage report is strange.

I emptied the recycle bin and nothing changed. I'd like to add that the backup got stuck several times with a 429 error, which I think is related, because there seem to be too many files in one directory. I thought maybe the restarts, until the backup finally completed, caused the huge directory.
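For what it's worth, rclone has generic rate-limiting and retry flags that can help with 429 (throttling) responses when uploading directly. A hedged sketch; `od4b:Backup_new` is the remote path from this thread, the source path and the limit values are arbitrary examples:

```shell
# Throttle HTTP transactions per second and retry on transient errors (429 etc.).
# The numbers here are illustrative, not recommendations.
rclone copy /local/source od4b:Backup_new \
  --tpslimit 4 \
  --retries 10 \
  --low-level-retries 20 \
  -v
```

Whether Duplicati's rclone backend passes such flags through is a separate question; this only applies to manual rclone runs.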

I want to update this.

Out of curiosity I tried `rclone size od4b:Backup_new` and it shows

Total size: 160.718 GBytes (172569890350 Bytes)

I then ran `rclone copy od4b:Backup_new some_local_dir`, waited a few hours, and checked the local dir, which also had ~160 GB of files. So I think something in my OneDrive account's storage metrics is wrong.
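The verification steps above can be sketched as follows; `od4b:` is the remote from this thread, and the local directory name is a placeholder:

```shell
# Report the total size rclone sees on the remote.
rclone size od4b:Backup_new

# Download the backup set, then measure it locally.
rclone copy od4b:Backup_new ./backup_verify
du -sh ./backup_verify

# Optionally verify that local and remote files actually match.
rclone check od4b:Backup_new ./backup_verify
```

If `rclone size` and the local `du` agree while the web UI shows double, the discrepancy is in the storage accounting, not in the data.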

Next I'll re-upload the whole backup set, which I just downloaded, directly via rclone (this will take some time) and look at the storage metrics for the new folder. Then I'll point Duplicati to the new directory and see if it stays "small".

So I think it's not a problem with the backend you've written.

After downloading and uploading the 160 GB again, the web storage metrics again count the files as 320 GB. Since this seemed not to be related to Duplicati, I searched the rclone forums again and found
OneDrive for Business uses 2x the space - bug - rclone forum and the corresponding issue
OneDrive for Business uses 2x the space · Issue #1716 · ncw/rclone · GitHub.

Sorry for not searching better in the first place.

OK, interesting… Thanks for the links. Same issue here.
Not sure whether OD4B counts the versions toward your storage limit or not?

I think they are counting it. At least in ht tps:// (sorry, I have to cut the link because the forum software says "Sorry you cannot post a link to that host.") ificator says

there’s a pretty strong guarantee in SharePoint that ALL changes will generate a version, and ALL versions will impact quota

so the workaround would be: after the initial upload, copy it to another folder, remove the old one, and copy back… I guess that will remove the versions and reset your storage metrics.
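That copy/remove/copy-back workaround could look like this with rclone; `Backup_tmp` is a hypothetical folder name, and `od4b:Backup_new` is the remote path from this thread:

```shell
# 1. Copy to a fresh folder (the copies start with no version history).
rclone copy od4b:Backup_new od4b:Backup_tmp
# 2. Delete the version-bloated original folder.
rclone purge od4b:Backup_new
# 3. Move the clean copy back to the original path.
rclone move od4b:Backup_tmp od4b:Backup_new
```

This is only a sketch: I haven't verified whether a server-side copy on OD4B carries version history along, so check the storage metrics after step 1 before purging anything.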

And you do that again from time to time for new uploads…

Either that, though I can't test it with my current backup because there are too many items in that folder and OD4B won't let me copy it :frowning: (so nice…), OR disable versioning until the issue is fixed.

Edit: Wow, I tried to put useful links in my previous posts and they all got flagged… I'll think twice before posting here again.

Yes, it was an automatic spam detection that thought you were trying to spam with links to the rclone forum :slight_smile:

I have hand-fixed it.

Thanks. I misinterpreted the "community flagged" message then :frowning: sorry.

@Wim_Jansen I tried the copying you suggested with a small file set, and it did indeed delete the versions and shrink the storage. However, this will of course only work if there aren't too many files in the backup set.