Hello,
Interesting, but would it be possible:
a) to have an estimate (after a few backups) of the size required in the cloud
b) to adapt the backup so that it does not exceed a maximum value?
I suggest this because my HubiC storage is limited to 25 GB.
Regards
If this is asking to limit Duplicati to some estimated usage within the 25 GB, then you can do this:
The best plan is to set Backup retention
on screen 5 Options
to something finite, or the backup will keep growing.
Watch the backup size shown on the home screen for a while, then see if it seems to be stabilizing.
--quota-size in screen 5 Options Advanced options
can be used for HubiC, which, fortunately, doesn’t report a quota on its own (unlike most backends), and therefore “should” let you set whatever limit you want. Try a small value, using an unimportant backup, to see what happens (there might be bugs). It should warn, then finally error…
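For illustration, a roughly equivalent command-line backup might look like this (just a sketch; the HubiC AuthID, source path, quota, and retention values are placeholders to replace with your own):

$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
    "hubic://Duplicati/backup?authid=<your-authid>" /home/user/data \
    --quota-size=20GB \
    --retention-policy="1W:1D,4W:1W,12M:1M"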
Thanks for pointing to --quota-size; I had not seen it before.
Still, I would love an extended smart-retention scheme, e.g. one that keeps yearly backups indefinitely but restricted by space, say an additional rule like S:12M. Is there any chance of that?
Welcome to the forum @steviehs1
Could you clarify the wish please? How would you keep them yet also restrict them?
Please also note that backups are not separate from version to version, but blended.
Features “Incremental Backups” and “Deduplication” get into that.
The backup process explained goes a bit further into the specifics.
So, the goal is the following: run unattended backups over a long, long time while making the most of the backup space you have.
I have read (and hopefully understood) the backup technology and the deduplication mechanism.
Right now, if I understand it correctly, the way to keep the backup storage size acceptable is to adjust a retention policy that keeps it below the size of the destination drive.
So what I would look for is a combination of a retention policy like
7D:0s,3M:1D,10Y:2M
and an automatic mechanism that deletes as many of the oldest backups as necessary to leave enough space for the next backup. At least, this is something I can (and have to) do by hand now, so it should be possible, but having Duplicati do it automagically would be great.
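For illustration, the manual cleanup I do today is roughly this (just a sketch; the target URL, database path, and version count are placeholders):

# delete all but the newest 20 versions, then reclaim the freed space
$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe delete \
    "ssh://myhost/backup" --keep-versions=20 --dbpath=/path/to/job.sqlite
$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe compact \
    "ssh://myhost/backup" --dbpath=/path/to/job.sqlite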
Is it somewhat clear now what I am aiming for?
Cheers
Steve
Hi,
thank you for this solution and this wonderful tool.
I will actually use this command on a test site with a smart backup.
Thanks, that helps. One thing needed (unless you’re on one of the few non-quota-giving destinations such as HubiC) might be to allow a user-specified quota, then deletion could perhaps be tied into the code somewhere near where a warning would currently be issued. Then just delete without warning?
Auto-delete without any limit on how far it goes seems dangerous. It might delete every backup there. There’s also no way to know how much the next backup needs. Additionally, freed space from deletes isn’t reclaimed until compacting files at the backend runs. This can run as part of post-backup deletes.
Seems simple on the surface, harder when thought through. There’s no telling when time will be available. Thanks for the suggestion, though. At some point (probably after bug fixes), feature additions may happen.
Thanks for sharing your thoughts and helping me understand this!
You are completely right! Still, I guess the feature could be worth it, as it could help provide easy, safe backups for everybody.
I am working right now on a FOSS backup appliance which mainly provides SSH backup space. Every backup account on this appliance is in a chroot jail and has a user quota on the ext3 volume. I have not checked again, but my impression was that when this quota was exceeded, I did not get any meaningful status, just write errors from Duplicati. So I will test this again soon.
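For reference, the per-user quota on the appliance is set along these lines (a sketch; the user name, device, and ~25 GB limit in 1K blocks are just examples):

# soft and hard block limits of ~25 GB, no inode limits
$ sudo setquota -u backupuser 26214400 26214400 0 0 /dev/sdb1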
It certainly could, but would probably best be done with additional volunteers. Perhaps you?
Duplicati is completely dependent on what volunteers can do. Recently, the focus has been stability.
There’s not much point in having easy backups that don’t work reliably. Duplicati is still Beta.
This should probably not be part of the feature request (open a support ticket if desired), but
The SFTP backend would have IQuotaEnabledBackend if SFTP could do quota. I see some signs that it can in version 6 of the protocol; however, Duplicati uses SSH.NET, which doesn’t mention quota in any source I can find, and which is also getting kind of old from a good-crypto viewpoint. Let us know if you know any good free SFTP client libraries.
Dupilcati Server is a thread from today talking about a more appliance-like (but still split client-server) approach, as well as the too-large-to-handle number of other possible approaches that people take…
There has been at least one attempt (I forget its name) to bundle storage software to get near that idea.
EDIT:
SFTP - check free space available has more on SFTP version dependency, workarounds, and Renci.
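As a manual workaround, OpenSSH’s sftp client can show free space when the server supports the statvfs@openssh.com extension (the host here is a placeholder):

$ sftp user@backuphost
sftp> df -h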
Hello,
Sorry, but I did not understand all of the technical part, and my English is bad.
I am doing a test with --quota-size limiting the backup to 10 MB.
I put ~9.5 MB of images and documents in the source directory.
I simulated a job: adding and deleting images, and modifying a text document.
My findings:
1/ If I exceed my quota of 10 MB, the image is not transferred, but the message in the logs is not clear. I would have liked a message such as:
imagexyz.jpg could not be transferred because the 10 MB quota was exceeded. Suggestions …
2/ If I modify my text document moderately, it is transferred to the remote. I then make a new modification locally and want to restore the remote text file. I get the message “file restored”, but if I open the text file, it is not the remote file!
Your opinions?
Regards
Hi,
In the end, it exceeds the stated capacity of 10 MB.
I scheduled a backup every 10 minutes and left it unattended:
Results:
Source: 12.90 MB
Backup: 18.70 MB / 12 Versions
There are no remote text files. The remote filenames and contents use a backup-specific format.
See above. Overview also explains that Duplicati is not a file synchronization program (or a copy).
If some of this is just wording difficulties, and you mean you actually restored some source text file somewhere (for tests, preferably to a different folder without such a file), what exactly did you get?
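For instance, a test restore to a separate folder might look roughly like this (the storage URL, filename, restore path, and dbpath are placeholders; --version=0 is the newest backup):

$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe restore \
    "file:///mnt/backup" "document.txt" --version=0 \
    --restore-path=/tmp/restore-test --dbpath=/path/to/job.sqlite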
sounds like a file-at-a-time copy is in mind. That’s far from what actually happens. See the backup link.
This explains the process in more detail. Note that files are found at one end of the whole process, whereas uploads are at the extreme far end. Inability to upload a dblock file may affect many files.
The AFFECTED command is a way to find out which source files are affected by a given dblock file.
Disaster Recovery gives an example of this use after intentionally causing damage to a test backup.
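For example, a run might look roughly like this (the storage URL, dblock filename, and dbpath are placeholders):

$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe affected \
    "file:///mnt/backup" duplicati-b1234567890abcdef.dblock.zip.aes \
    --dbpath=/path/to/job.sqlite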
Having said the above, I’m having trouble getting any quota message using various combinations of
--quota-size (Size): A reported maximum storage
This value can be used to set a known upper limit on the amount of space a
backend has. If the backend reports the size itself, this value is
ignored
--quota-warning-threshold (Integer): Threshold for warning about low quota
Sets a threshold for when to warn about the backend quota being nearly
exceeded. It is given as a percentage, and a warning is generated if the
amount of available quota is less than this percentage of the total
backup size. If the backend does not report the quota information, this
value will be ignored
* default value: 10
with backends that seem to be reporting quota (visible in job log) and ones that do not. There may be bugs here, and you can file an issue if you like. I didn’t see one, so perhaps few people rely on quota.
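For reference, my test attempts looked roughly like this (a sketch; the destination folder, source, and thresholds are arbitrary):

$ mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
    "file:///mnt/quota-test" /home/user/smallset \
    --quota-size=10MB --quota-warning-threshold=20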
Backing up to the auto-adapt idea: I’d be very nervous about trying to predict future space needs, because usage tends to grow, either through more versions being kept or through more and bigger files being made. The “smart” backup might wind up looking quite dumb when it automatically deletes precious backups.
Hello,
When I talk about the remote file, I am ignoring the backup format. I use the restore function on my text document to see which version was backed up remotely.
When I talk about synchronization, I mean transferring the content remotely (with the block method) as a backup and, if necessary, restoring the most recent document on my PC.
I also took some time to read the documentation and other threads.
However, I am still lost. These documents are very technical and very long, and I have difficulty understanding them.
I would like the simplest possible answer: a command to put in part 5 of the backup configuration.
I’m on Ubuntu and can’t use an exe.
For now, I just understand that I can set up a warning per Using Duplicati from the Command Line - Duplicati 2 User's Manual, but that this will not solve the problem.
As this is a suggestion for improvement, if it is not possible now I prefer to focus on my support request, Backup that fails!, because I have a really big problem if backups do not finish. I can’t keep an eye on it all the time!
Regards
If your restore of a text document isn’t restoring correctly, please give the detailed steps in a support request. Preferably, use non-sensitive data and paths so that it’s possible to see what you’re seeing.
Answer to what, why a command, and why on screen 5? This doesn’t sound like the original request.
If you’re talking about Duplicati.CommandLine.exe and its affected command, have you tried doing it? Ubuntu runs .exe files all the time. They’re not just for Windows, but you may need to say mono before the .exe name. Linux Mint (an Ubuntu derivative) doesn’t even require that; you can just run a Duplicati .exe directly…
$ file /usr/lib/duplicati/Duplicati.CommandLine.exe
/usr/lib/duplicati/Duplicati.CommandLine.exe: PE32 executable (console) Intel 80386 Mono/.Net assembly, for MS Windows
$ /usr/lib/duplicati/Duplicati.CommandLine.exe
See duplicati.commandline.exe help <topic> for more information.
General: example, changelog
Commands: backup, find, restore, delete, compact, test, compare, purge, vacuum
Repair: repair, affected, list-broken-files, purge-broken-files
Debug: debug, logging, create-report, test-filters, system-info, send-mail
Targets: tahoe, amzcd, aftp, hubic, googledrive, gcs, rclone, jottacloud, mega, ftp, s3, openstack, b2, cloudfiles, webdav, dropbox, azure, od4b, mssp, box, file, ssh, msgroup, onedrive, onedrivev2, sharepoint, sia
Modules: aes, gpg, zip, 7z, console-password-input, mssql-options, hyperv-options, http-options, sendhttp, sendmail, runscript, sendxmpp, check-mono-ssl
Formats: date, time, size, encryption, compression
Advanced: mail, advanced, returncodes, filter, filter-groups, <option>
http://www.duplicati.com/ Version: - 2.0.4.23_beta_2019-07-14
$
This shows how .exe files are not always native Windows code. In this case it’s portable code run by mono.
You really don’t want to get so full that you have to ask what files got broken, so that gets to the below.
That was the theory, but I couldn’t get it to work, so it might be my misreading docs, or might be a bug.
What problem? If you mean the possible bug, then maybe not, if the goal is a warning and if it really is a bug.
If you mean the original feature request, it definitely won’t give you all of that even if it worked, but it is a start.