I am looking into moving to Duplicati, but I have a lot of backups from other systems that I would very much like to import. I can easily extract them and import them into Duplicati, but I wonder if there is a way to keep the correct metadata on those, specifically the date. As you know, if I don’t, they might be incorrectly dropped by the retention policy.
Is there some hacky way to do this? Possibly editing some files in storage? Would that mess up the local database?
I’m not sure which dates (file vs. backup) you want to keep, but the closest to “doable” I can think of would be so much effort that it wouldn’t be worth the time.
Duplicati chops files up into small blocks (100 KB by default) and backs up the blocks. Unless your existing backup system produces exactly matching blocks, you’d have to re-process every version of every file.
Specifically, I imagine that (in theory) you could:
- restore the oldest backup from your current program to a fresh folder tree
- back it up with Duplicati (assuming the restore keeps original file dates, Duplicati should keep them as well)
- over that first restore, restore the next oldest backup
- repeat from the first step (again, and again, and again)
If you care about the actual dates of the backup, you might be able to tweak your system clock between each backup to simulate the date of the original backup…maybe…
The process you describe is exactly what I had in mind.
Changing the system clock is an option, but if I could just run it on my actual machine and fiddle with some files afterwards, that’d be faster.
Well, OK then - I’ve got about 4 years’ worth of daily CrashPlan backups that I’m willing to lose rather than go through that process, so you’re braver than I am.
Duplicati can be driven through the command line, so if your current tool can also be run via CLI, you should be able to write a script to automate those steps.
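To make that concrete, here’s a rough Python sketch of the restore-then-backup loop. `old-tool` is a placeholder for whatever your current backup program’s CLI is, and the Duplicati arguments are simplified (no passphrase, no options), so treat it as a starting point rather than something to run as-is.

```python
# Rough sketch of the restore-then-backup loop. "old-tool" is a
# placeholder for your current backup program's CLI; the Duplicati
# arguments are simplified and will need real options (passphrase, etc.).
import subprocess

def backup_cmd(source_dir, target_url):
    # Build the Duplicati CLI invocation for one imported version.
    return ["duplicati-cli", "backup", target_url, source_dir]

def import_versions(version_ids, source_dir, target_url):
    for vid in version_ids:  # oldest version first
        # Restore this version from the old tool over the same folder,
        # so only files that changed between versions actually differ.
        subprocess.run(["old-tool", "restore", "--version", str(vid),
                        "--to", source_dir], check=True)
        # Back the folder up with Duplicati; unchanged blocks deduplicate.
        subprocess.run(backup_cmd(source_dir, target_url), check=True)
```

Restoring each version over the previous one (rather than into a fresh folder) keeps the Duplicati runs fast, since only the changed blocks get uploaded.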
If you really want it, you can make the backups. Once you have all the backups as you like them, go and rename the dlist files to have the correct timestamps.
You can then delete the local database and rebuild it, and it will accept the new timestamps.
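A small sketch of that rename, assuming the dlist filenames follow the `duplicati-YYYYMMDDTHHMMSSZ.dlist.zip` pattern I see in my own storage (verify against your files before touching anything):

```python
# Sketch: compute a renamed dlist filename that carries the backup date
# you want Duplicati to adopt after the database rebuild. Assumes the
# "duplicati-YYYYMMDDTHHMMSSZ.dlist.zip" naming pattern -- check yours.
import re
from datetime import datetime, timezone

def retimestamp_dlist(filename, new_time):
    """Replace the UTC timestamp embedded in a dlist filename."""
    stamp = new_time.astimezone(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return re.sub(r"\d{8}T\d{6}Z", stamp, filename, count=1)
```

For example, `retimestamp_dlist("duplicati-20240101T000000Z.dlist.zip", datetime(2019, 6, 15, tzinfo=timezone.utc))` yields a filename dated 2019-06-15; rename the remote file to that, then delete and rebuild the database.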
For a harder (but potentially faster) method, you should also be able to simply edit the local database and change the filenames in the RemoteVolume table without issues.
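That direct edit could look something like this sketch; the table and column names are taken from the post above, and you’d want Duplicati closed and a copy of the `.sqlite` file saved before running anything like it:

```python
# Sketch: rewrite a remote volume name directly in Duplicati's local
# sqlite database. Back the .sqlite file up first and keep Duplicati
# closed while editing; table/column names per the RemoteVolume table.
import sqlite3

def rename_remote_volume(db_path, old_name, new_name):
    con = sqlite3.connect(db_path)
    with con:  # commits on success, rolls back on error
        cur = con.execute(
            "UPDATE RemoteVolume SET Name = ? WHERE Name = ?",
            (new_name, old_name))
    con.close()
    return cur.rowcount  # rows changed; should be exactly 1 per rename
```

The remote file itself must be renamed to match, of course, or the next verification pass will flag a missing volume.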
Are you positive that the dates are not stored inside the JSON files themselves, and that no files reference dlist files by name?
That seems easy enough, cool!
Yes. It does write the “created time” into each file, but this is not used for parsing. The reason for this is that it allows the restore command to show you what backup versions/times are available without needing to download anything except the file list.
There is nothing that references a filelist. Only dlist files reference other files, never the reverse.