Hi, what is the biggest vault you guys have?
Is it possible to have a 30TB encrypted vault in the cloud?
What performance issues can I expect? Is Smart Sync a problem or not?
Cryptomator does not provide any encrypted cloud space.
With this software you can create a vault (encrypted storage) on your local drive. When this vault is created in a directory that is synced with the cloud, your encrypted files are also in the cloud.
Yes. There is no storage constraint in Cryptomator; it depends on your local and cloud storage.
Depends on the vault provider you are using. The legacy WebDAV provider is slower but can be used on any OS; for Linux and Windows there are more native implementations with higher throughput. If you are interested in numbers, I can perform some benchmarks.
I don’t know what smart sync is.
Yes, I am aware of that, but I am curious: what is the biggest Cryptomator vault ever made and in use? xx GB, xxx GB, x TB, xx TB?
Dropbox has Smart Sync, which means Dropbox only shows placeholder icons for folders etc. in order to save space. Once we open a folder, it starts to sync.
In the case of Cryptomator we need to download the whole vault (Windows client), which could be a problem if a single vault holds 30TB.
What I would like to know from a user standpoint: will I notice that I am using cloud + Cryptomator in terms of CPU/RAM utilisation or bandwidth?
My goal is to put my encrypted personal library and documents in the cloud. But I don't want to upload 20TB only to find out that Cryptomator + Dropbox can't work well and fluently with a big vault in practice, due to some kind of bottleneck.
There is no size restriction imposed by Cryptomator; your filesystem and cloud provider impose the restrictions. So yes, your vault can possibly be 300TB or larger. But I guess there is other software out there that can handle such huge amounts of data better. Please keep in mind that Cryptomator is no backup solution!
I’m unsure if Dropbox Smart Sync works out of the box with Cryptomator. But I found a thread where someone uses Cryptomator with OneDrive Files On-Demand:
Bandwidth is not an issue. Cryptomator is designed for use in synced directories, so only files that have changed are actually uploaded. But if such a file is 4GB, the whole 4GB must be transferred, since we don’t split huge files into smaller chunks due to possible inconsistency when a file is not completely uploaded, etc.
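To illustrate the trade-off described above, here is a small sketch comparing the re-upload cost of whole-file sync (what Cryptomator does) versus a hypothetical chunked sync. The 4 MiB chunk size and the helper functions are my own illustrative assumptions, not anything Cryptomator implements:

```python
# Illustrative only: Cryptomator deliberately does NOT split files into
# chunks; the chunk size below is a hypothetical value for comparison.

CHUNK_SIZE = 4 * 1024 * 1024  # hypothetical 4 MiB chunk size

def whole_file_upload_cost(file_size, changed_ranges):
    """Without chunking: any change re-uploads the entire file."""
    return file_size if changed_ranges else 0

def chunked_upload_cost(file_size, changed_ranges, chunk_size=CHUNK_SIZE):
    """With chunking: only chunks overlapping a changed byte range re-upload."""
    dirty = set()
    for start, end in changed_ranges:
        for chunk in range(start // chunk_size, (end - 1) // chunk_size + 1):
            dirty.add(chunk)
    return sum(min(chunk_size, file_size - c * chunk_size) for c in dirty)

file_size = 4 * 1024**3   # a 4 GiB file
changes = [(0, 100)]      # 100 bytes edited at the start of the file
print(whole_file_upload_cost(file_size, changes))  # 4294967296 (all 4 GiB)
print(chunked_upload_cost(file_size, changes))     # 4194304 (one 4 MiB chunk)
```

The flip side, as noted above, is that a partially uploaded chunked file can leave the vault in an inconsistent state, which is why Cryptomator avoids chunking.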
Cryptomator can use at most 500MB of your RAM.
Regarding CPU usage, it depends. When you perform read/write operations in the vault, the data must of course be de-/encrypted, and this costs something. For example, if you want a preview of a directory with lots of images, you get a short peak of CPU usage until all image previews are loaded.
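To get a feel for that per-byte cost, here is a tiny stdlib-only sketch. SHA-256 stands in for AES decryption (the Python standard library has no AES), so the absolute numbers are not representative of Cryptomator; the point is only that every previewed file forces a pass of CPU-bound crypto work:

```python
# Rough illustration of the CPU peak when previewing many encrypted images:
# every byte must pass through a cipher before the preview can render.
# SHA-256 is a stand-in for AES here; real throughput depends on hardware.
import hashlib
import os
import time

def crypto_like_work(data: bytes) -> bytes:
    """Per-byte CPU work, standing in for decrypting one file."""
    return hashlib.sha256(data).digest()

# 50 fake "images" of 512 KiB each, as if thumbnailing a photo folder
thumbnails = [os.urandom(512 * 1024) for _ in range(50)]

start = time.perf_counter()
for img in thumbnails:
    crypto_like_work(img)
elapsed = time.perf_counter() - start
print(f"processed {len(thumbnails)} previews in {elapsed:.3f}s")
```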
So I tried Smart Sync in a few scenarios, and this is how it works with Cryptomator.
- Small amount of data (ca. 100MB, lots of files): it works great. I accessed the “cloud folders” and transferred them into the vault. They download and land in the vault. Normal operation, no delays or anything.
- Big amount of data (ca. 10GB): I tried to reproduce the same thing, copying from the cloud into the vault, but the Windows transfer got stuck at 0% after a few files had been transferred. I waited until all files were synced locally on the PC, but the Windows file transfer was still stuck and I had to kill it via Task Manager. Those few files were copied, though. I repeated the copy (this time the files were local) and it went smoothly.
@infeo do you have any idea why the file transfer hung? I understand why it happened, but why didn’t it continue to transfer files as they were downloaded, or after the full download from Dropbox?
- It is interesting that the vault works when files are online-only (Smart Sync on) and only the folder structure exists locally, without the files. I could log in to new vaults and browse files without problems (even the masterkey was not downloaded locally!).
I could also open small files like Word or Excel documents; Dropbox would download them and it was fluent.
I didn’t try bigger files, but I guess some delays would occur there.
Just to get things straight (correct me if I’m wrong):
You basically created a vault locally, unlocked it and then copied files into it. These files were special in the sense that they were not actually stored locally but in the cloud, and were downloaded only when they were needed.
A copy operation got stuck when the amount of data was fairly big (~10GB).
I can only guess why the copy operation didn’t continue. There can be lots of reasons: an issue in Dropbox Smart Sync, some faulty behaviour of Windows, something wrong with Dokany, etc.
When using Cryptomator with the Dokany vault provider, there is a timeout of 10s within which any file operation must be handled (in the sense that it must start). If an operation hits this time limit, it is discarded. But in your case the copy operation started and a few files were copied.
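The 10s figure and the discard behaviour are from the description above; the sketch below is only a generic illustration of such a timeout guard in Python, not Dokany's or Cryptomator's actual API:

```python
# Generic sketch of a timeout-guarded file operation, loosely modeled on the
# behaviour described above (an operation must be handled within the limit,
# else it is discarded). NOT Dokany's real interface.
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

TIMEOUT_S = 10  # the driver-side limit mentioned above

def guarded_operation(fn, timeout=TIMEOUT_S):
    """Run fn, but give up and report the operation as discarded on timeout."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn)
        try:
            return future.result(timeout=timeout)
        except TimeoutError:
            # a running task cannot really be cancelled; a real driver
            # would simply abandon the request at this point
            return None

def fast_op():
    return "copied"

def hung_op():
    time.sleep(0.5)  # simulates an operation stuck waiting on a cloud download
    return "too late"

print(guarded_operation(fast_op, timeout=1.0))   # copied
print(guarded_operation(hung_op, timeout=0.1))   # None (discarded)
```

This also hints at why an operation stalled on a slow Smart Sync download could hit such a limit, while an operation that has already started (like your partial copy) behaves differently.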
If the 10GB copy operation works smoothly to a normal location on your hard drive, then there is a problem with Dokany and/or Cryptomator.
Yes, correct. Just to add: the local vault had been synced to Dropbox before I started the transfer, so it was cloud to cloud (vault).
I used WebDAV and could reproduce the problem (Win 10, 64-bit).
Smart Sync and similar technologies are the future of the cloud, so I suggest taking a look at this in the near future.
What is also interesting (though I need to double-check it): the vault is downloaded in the background once it is unlocked, even if no file is touched.
This has pros and cons.
Pro: I can open any file I need. Smart Sync will download that file and open it fast, so fast that it seems local (Word, Excel, …). In the background the whole vault is downloaded, so I can work with the rest of the files.
Con: vaults should then not be big; it is better to split them into smaller ones. Why? The background download will saturate bandwidth and fetch a lot of data we will never actually use. So the same behaviour is both a pro and a con.
I am still uploading my 7TB to the cloud. The vaults are not that big, but Smart Sync’s indexing takes forever, and since I upload new stuff all the time, I can’t fully verify the vault auto-download-on-access while I write this, only before the next cycle of selecting files to upload. But I did notice it a few times.