Alizishaan Anwar Hussein Khatri
Integrate differential privacy directly into simple PyTorch trainer templates to reduce model overfitting and memorization on small, noisy datasets. Create a drop-in library for 'Private-LoRA'.
Suggested repo: dp-train
"Keep your training data secret, even while fine-tuning."
Estimated effort: 30h
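The core mechanism such a trainer would wrap is DP-SGD: clip each per-example gradient to bound its sensitivity, then add Gaussian noise before the parameter update. Below is a minimal, dependency-light sketch of one DP-SGD step for linear regression with squared loss; the function name `dp_sgd_step` and all parameter names are illustrative, not part of any existing `dp-train` API. A real implementation for PyTorch would typically build on Opacus or per-sample gradients via `torch.func`.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD step for linear regression (squared loss).

    Each per-example gradient is clipped to `clip_norm` (bounding
    sensitivity), the clipped gradients are summed, and Gaussian noise
    with std `noise_multiplier * clip_norm` is added before averaging.
    This is the core of the Gaussian-mechanism DP-SGD recipe.
    """
    rng = rng or np.random.default_rng(0)
    n = len(y)
    clipped_sum = np.zeros_like(w)
    for xi, yi in zip(X, y):
        g = 2.0 * (xi @ w - yi) * xi                       # per-example gradient
        norm = np.linalg.norm(g)
        clipped_sum += g * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    return w - lr * (clipped_sum + noise) / n
```

With `noise_multiplier=0.0` this reduces to plain per-example-clipped SGD, which is a useful sanity check when testing a private trainer: the noisy run should converge to a slightly worse loss than the noiseless one, with the gap controlled by the clip norm and noise scale.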