Source: deepmind.google

VaultGemma

An open model trained from the ground up with differential privacy to prevent memorization and leakage of training data examples


Adaptable for sensitive workflows

Its 1B-parameter size enables efficient training of custom models that are private by design

Helps ensure privacy compliance

Trained with a sequence-level differential privacy guarantee of (ε ≤ 2.0, δ ≤ 1.1e-10)
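A guarantee like this is typically obtained by training with DP-SGD, where each example's gradient is clipped to a fixed L2 norm and Gaussian noise calibrated to that norm is added before the update. The page does not show VaultGemma's actual training code, so the following is only a minimal NumPy sketch of the core DP-SGD aggregation step; the function name and parameters are illustrative, not from the VaultGemma codebase.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Illustrative DP-SGD aggregation: clip each per-example gradient to
    L2 norm <= clip_norm, sum, add Gaussian noise scaled by
    noise_multiplier * clip_norm, then average over the batch."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        scale = min(1.0, clip_norm / (norm + 1e-12))  # shrink only if norm exceeds the bound
        clipped.append(g * scale)
    summed = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)

# Demo: a gradient of norm 5 is clipped to norm 1; one of norm 0.5 passes through.
grads = [np.array([3.0, 4.0]), np.array([0.3, 0.4])]
avg = dp_sgd_step(grads, clip_norm=1.0, noise_multiplier=0.0)  # noise disabled for clarity
# avg ≈ [0.45, 0.6]
```

The clipping bounds each sequence's influence on the update (the "sensitivity"), and the noise multiplier, batch size, and number of steps together determine the final (ε, δ) via a privacy accountant.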

Roadmap for private model development

Implements novel scaling-law research that balances compute, privacy, and utility trade-offs