I finally decided to set up a backup of my photos (mainly). I use a 3-2-1 strategy (copy on a hard drive + cloud copy).
I’m not an expert nor a developer, but I like to learn new stuff, and I just like to use complex solutions so I can pretend to myself that I’m some kind of genius (yes, I know).
Anyway, I decided to pick Duplicati as it’s probably the only one I found on YouTube with step-by-step explanations, plus it’s free, so I can’t really go wrong.
I used the CommandLine on Windows 10.
Here is my command (don’t worry, I hid the sensitive info):
Duplicati.Commandline.exe backup "b2://mybucket/Photos" "E:\Photos" --b2-accountid="myID" --b2-applicationkey="myKey" --send-mail-to="myemail" --send-mail-from="myemail" --send-mail-subject="Backblaze - Duplicati %OPERATIONNAME% Report" --send-mail-body="%RESULT%" --send-mail-url="smtp.office365.com:587" --send-mail-username="myemail" --send-mail-password="myPwd" --send-mail-level="Success, Warning, Error" --passphrase="mySuperSecretePassphrase"
I uploaded everything during Sunday. As you can see, there is no compression at all. This is something I didn’t fully understand, so I didn’t add any settings. Would I gain a lot of space by compressing? If so, what do I need to add to my command?
Also, how should I use dblock-size? My photos are in RAW format, so each file is around 24 MB. Is it relevant? Should I leave it at the default?
I’m sorry if you feel I’m asking too much. Feel free to let me know what I can do to improve speed, space usage, or anything else.
thanks in advance
You didn’t actually disable compression; Duplicati enables it by default. Whether it tries to compress a given file depends on its extension: for image files, Duplicati will either skip compression entirely or, if it does try, achieve essentially no savings.
Duplicati will perform deduplication (it is not possible to turn it off), but for image data it won’t save you much.
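If you ever want to control compression explicitly, it’s done with advanced options on the same backup command. A minimal sketch, with the option values being illustrative, not a recommendation:

```shell
# Sketch only - values here are examples, not recommendations.
# zip is Duplicati's default compression module; a low zip-compression-level
# spends less CPU on data (like RAW photos) that won't compress anyway.
Duplicati.Commandline.exe backup "b2://mybucket/Photos" "E:\Photos" --compression-module=zip --zip-compression-level=1
```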
dblock-size defaults to 50MiB which is fine for most cases. Some people increase this because the back end they are using has a limit on how many files can be stored, but B2 won’t be an issue in that regard.
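If you did want to change it, it’s a single extra option on the backup command. A hedged sketch (the 200MB value is only an example, not a recommendation):

```shell
# Example only: larger remote volumes mean fewer files stored on B2, but more
# data to re-download when restoring or repairing. 200MB is an illustrative value.
Duplicati.Commandline.exe backup "b2://mybucket/Photos" "E:\Photos" --dblock-size=200MB
```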
What is the total size of your data? If it’s quite large (perhaps 500+ GB) then you MAY want to use a different deduplication block size (the --blocksize option). It defaults to 100KiB, and with large data sets this results in a lot of blocks to track, so the job’s SQLite database gets larger and some operations get slower, especially if you keep a lot of backup versions. The bad thing is you can’t change this option after your first backup without deleting all the remote data and starting over.
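To put rough numbers on that, here is some back-of-the-envelope ceiling-division arithmetic. The 150 GiB figure matches the library size mentioned below, and --blocksize=1MB is just a hypothetical alternative:

```shell
# Back-of-the-envelope block counts for a ~150 GiB backup (illustrative).
data_bytes=$((150 * 1024 * 1024 * 1024))

# With the default --blocksize of 100KiB (ceiling division):
default_blocks=$(( (data_bytes + 100*1024 - 1) / (100*1024) ))
echo "$default_blocks"   # 1572864 blocks for the job database to track

# With a hypothetical --blocksize=1MB:
bigger_blocks=$(( (data_bytes + 1024*1024 - 1) / (1024*1024) ))
echo "$bigger_blocks"    # 153600 blocks - roughly 10x fewer rows to manage
```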
Thank you for answering my questions.
At the moment my photos are 150GB.
I plan to upload my Windows disk image as well later.
Also, I have a question to make sure I understand what happens with Duplicati:
- If I delete a subfolder (e.g. “summer 2020”) from the source (my computer) that has already been backed up, and then run a backup of the parent folder (my photos), will it be deleted on Backblaze? And the same question for adding: if I add a new subfolder, will Duplicati upload only that folder, or will it back up everything again?
Data is only deleted if your retention is set to delete old backup snapshots. If you have it set to not delete any versions, then you’ll always be able to restore data that was backed up.
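For concreteness, retention is controlled by options on the backup command. A hedged sketch of the common choices (the specific values are examples only):

```shell
# Illustrative retention options - pick at most one of these per backup job:
#   --keep-versions=10                 keep only the 10 most recent versions
#   --keep-time=6M                     delete versions older than 6 months
#   --retention-policy="1W:1D,1Y:1M"   thin out older versions over time
# With none of them set, no versions are ever deleted.
Duplicati.Commandline.exe backup "b2://mybucket/Photos" "E:\Photos" --keep-time=6M
```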
Duplicati only uploads new or changed data with each backup.