Considerations for 1 big vault vs many smaller vaults?

Hi, I’m planning to use Google Drive to back up a large collection of independent files and folders (~1TB).

What should I consider when deciding whether to keep this backup in a single large vault or split it across several smaller vaults?

Some considerations I’ve thought of so far (let me know if any are unfounded):
On-Demand files to Save Space

  • Some cloud services download on-demand to save space on my local machine.
  • However, for Cryptomator to decrypt the filenames so that I can choose which file to open, Cryptomator would need all files to be downloaded. Thus, to save HDD space on my local machine, it would be better to split a large backup into several vaults. (Is this correct?)

Reduce CPU Usage when Decrypting

  • Same reason as above: if I know which file I want, having files in separate smaller vaults would be better. If all files were in one big vault, Cryptomator would need to decrypt my entire backup → higher CPU usage. (Is this correct?)

Encrypted file corruption

  • From what I know, each file is encrypted individually. So corruption of one file (in its encrypted or decrypted state) should not affect other files in the vault. (Is this correct?)
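To illustrate why per-file encryption contains corruption, here is a toy sketch in Python. This is deliberately NOT Cryptomator’s real file format (Cryptomator uses AES with per-file keys and authentication; the hash-based stream cipher and HMAC below are stand-ins purely for demonstration): each file is sealed independently, so flipping a byte in one ciphertext only breaks that file’s integrity check.

```python
import hashlib
import hmac
import secrets

MASTER_KEY = secrets.token_bytes(32)  # stand-in for a vault master key

def _keystream(nonce: bytes, length: int) -> bytes:
    # Toy hash-based keystream -- illustration only, not real crypto.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(MASTER_KEY + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_file(plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, _keystream(nonce, len(plaintext))))
    # Authenticate each file independently, so tampering is detected per file.
    tag = hmac.new(MASTER_KEY, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag

def decrypt_file(blob: bytes) -> bytes:
    nonce, body, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(MASTER_KEY, nonce + body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("file corrupted")
    return bytes(c ^ k for c, k in zip(body, _keystream(nonce, len(body))))

# Encrypt two files, then corrupt one ciphertext byte in the first.
files = {name: encrypt_file(data)
         for name, data in {"a.txt": b"alpha", "b.txt": b"bravo"}.items()}
corrupted = bytearray(files["a.txt"])
corrupted[20] ^= 0xFF  # flip a byte inside a.txt's ciphertext
files["a.txt"] = bytes(corrupted)

# b.txt is unaffected; only a.txt fails its own integrity check.
assert decrypt_file(files["b.txt"]) == b"bravo"
```

The point is structural, not cryptographic: because every file carries its own nonce and authentication tag, damage to one encrypted file cannot propagate to its neighbors.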

Does Cryptomator cache filenames and their mapping to the encrypted files before downloading and decrypting the actual files?

If so, a user would be able to browse their files without downloading and decrypting all of them.

Thanks!

I would go for multiple vaults if the content allows it logically…

Like you say, on-demand sync such as in OneDrive locks up Explorer while it checks the metadata of the file(s). If you have a fat connection and only small documents, maybe this is bearable, but not when Explorer waits for OneDrive to download 3 × 4 GB ISO files… The download is triggered not just by listing folders/files but by many other activities, like Explorer’s context menu, a tooltip, or whatever. AFAIK Windows Defender also seems more picky here, since many files are downloaded and no signatures/hashes or known file formats exist for the content being downloaded.

Also, I have a huge vault (3.4 TB) with large files. While Cryptomator itself doesn’t have issues, FUSE and Dokany perform very slowly… If there are additional middle layers such as Storage Spaces, or even network storage like CIFS shares, I have to bump the timeout value for Dokany (possible in >= 1.4.13).
Before 1.4.13 I used Cyberduck for reliable access to the huge vault.
Keep that in mind should you access the vaults remotely.


Nope, files are only decrypted during access.

Correct.

Yes, Cryptomator caches certain data while a vault is unlocked, including filename cleartext/ciphertext pairs. But this is only an efficiency optimization and doesn’t allow you to browse files while the encrypted files are unavailable. That said, several users do use Cryptomator with Google File Stream. But I would consider this experimental.
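To give an intuition for this optimization, here is a minimal sketch of how such a cleartext/ciphertext name cache could work. The names `FilenameCache` and `encrypt_name` are illustrative, not Cryptomator’s actual API; the point is only that a cache hit skips repeating the name-encryption work, while the encrypted files themselves are untouched.

```python
from collections import OrderedDict

class FilenameCache:
    """Tiny in-memory LRU cache of cleartext -> ciphertext filenames.

    Illustration only; not Cryptomator's real implementation. The cache
    lives only while the vault is unlocked and is discarded on lock.
    """

    def __init__(self, capacity: int = 5000):
        self.capacity = capacity
        self._map: OrderedDict[str, str] = OrderedDict()

    def ciphertext_for(self, cleartext: str, encrypt_name) -> str:
        if cleartext in self._map:
            self._map.move_to_end(cleartext)  # cache hit: no re-encryption
            return self._map[cleartext]
        ciphertext = encrypt_name(cleartext)  # cache miss: do the crypto once
        self._map[cleartext] = ciphertext
        if len(self._map) > self.capacity:
            self._map.popitem(last=False)     # evict least recently used entry
        return ciphertext

# Usage with a stand-in "encryption" function that counts its invocations:
calls = []

def fake_encrypt(name: str) -> str:
    calls.append(name)
    return name[::-1]  # placeholder for real filename encryption

cache = FilenameCache()
cache.ciphertext_for("report.pdf", fake_encrypt)
cache.ciphertext_for("report.pdf", fake_encrypt)  # second lookup hits the cache
```

Because the cache only holds name mappings, dropping it costs nothing but a bit of repeated CPU work; it never substitutes for having the encrypted files available.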


To answer your primary question regarding one big vs. multiple small vaults: You might decide to share a subset of your data with someone. For example, you might want to make some financial documents available to your tax accountant but not your private pictures. You don’t have that option if all your data is just one huge logical set.

Also, if a password leaks, the damage is contained to that vault rather than affecting all your files at once.


@overheadhunter Thank you so much for the detailed reply!

In addition to filenames, could you go into what else is cached? Is the cache encrypted as well?

How does Cryptomator work with an online-only file in Google File Stream? Does it need to download the file from Google File Stream before Cryptomator can show the filenames?