Abdulmalek Saket
Build an 'Aletheia' trainer for HuggingFace PEFT. It should automatically prune LoRA layers that aren't learning, reducing VRAM usage.
Suggested repo: aletheia-lora
"Stop wasting VRAM on ineffective LoRA layers."
Estimated effort: 50h
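A minimal sketch of one way such a trainer hook could work, written as a `transformers` `TrainerCallback`. Everything beyond the one-line idea is an assumption: the class name `AletheiaPruneCallback`, the pruning signal (the Frobenius norm of the effective update `delta_W = B @ A`, which starts at zero because PEFT initializes `lora_B` to zeros), and the `threshold` / `check_every` parameters are all hypothetical. "Pruning" is approximated here by zeroing and freezing the adapter matrices.

```python
# Sketch only: the pruning criterion and parameters below are assumptions,
# not part of the original idea text.
import torch
from transformers import TrainerCallback
from peft.tuners.lora import LoraLayer


class AletheiaPruneCallback(TrainerCallback):
    """Periodically disable LoRA layers whose learned update stays near zero."""

    def __init__(self, threshold: float = 1e-4, check_every: int = 500):
        self.threshold = threshold    # prune layers whose update norm stays below this
        self.check_every = check_every
        self.pruned = set()           # names of layers already pruned

    @torch.no_grad()
    def on_step_end(self, args, state, control, model=None, **kwargs):
        if model is None or state.global_step == 0:
            return
        if state.global_step % self.check_every != 0:
            return
        for name, module in model.named_modules():
            if not isinstance(module, LoraLayer) or name in self.pruned:
                continue
            for adapter in module.lora_A:              # usually just "default"
                a = module.lora_A[adapter].weight      # shape (r, in_features)
                b = module.lora_B[adapter].weight      # shape (out_features, r)
                # The layer's effective contribution is delta_W = B @ A;
                # a tiny norm means the layer has learned ~nothing so far.
                if (b @ a).norm().item() < self.threshold:
                    # Zero and freeze: the layer now contributes nothing
                    # and stops accumulating gradients.
                    a.zero_()
                    b.zero_()
                    a.requires_grad_(False)
                    b.requires_grad_(False)
                    self.pruned.add(name)
```

Attaching it would be one line, `trainer.add_callback(AletheiaPruneCallback())`. Note that freezing alone does not free the optimizer state already allocated for those parameters; to realize the promised VRAM savings, a full implementation would also have to drop the pruned parameters from the optimizer's param groups.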