Why would that be a problem? It’s something you can automate, and if the data is important enough, that’s a small cost. Also, if the CrashPlan restore process is efficient, it can simply patch from version to version, modifying only the required files on each iteration.
That’s exactly how I’ve converted a lot of data between some very old version management systems. It’s still a lot easier than writing completely custom software to support the legacy formats.
As an example, a script that converts all Duplicati backup versions to git revisions, or all git revisions to Duplicati versions, is quite trivial, especially if you don’t need to preserve all metadata. You could use something simple like the Duplicati version timestamp as the commit message.
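A minimal sketch of that idea: replay each backup version into a working directory and commit it, oldest first. Here `restore_version` is a hypothetical hook you would implement on top of your backup tool’s restore command (for Duplicati, something along the lines of restoring with a `--version` selector); it’s a parameter so the loop itself stays generic.

```python
# Sketch: replay backup versions as git commits, oldest first.
# Assumptions (not from the original post): "restore_version" is a
# user-supplied callable that overwrites repo_dir with the contents of
# the given backup version, e.g. by shelling out to the backup tool.
import subprocess
from datetime import datetime, timezone

def versions_to_git(repo_dir, versions, restore_version, run=subprocess.run):
    """versions: list of (version_id, unix_timestamp), oldest first."""
    run(["git", "init", repo_dir], check=True)
    for version_id, ts in versions:
        # Overwrite the working tree with this backup version's files.
        restore_version(version_id, repo_dir)
        # Use the backup timestamp as both commit message and commit date.
        msg = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
        run(["git", "-C", repo_dir, "add", "-A"], check=True)
        run(["git", "-C", repo_dir, "commit", "--allow-empty",
             "-m", msg, "--date", msg], check=True)
```

Since `git add -A` stages only what actually changed, each iteration effectively patches from version to version, just like the restore process described above.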
I guess you don’t realize how often this kind of conversion is done in corporations with legacy systems. If it isn’t done, in the worst case you end up with data and formats that can no longer be processed without extensive reverse engineering, and guess how expensive that is.