Dyah Adila
adila@wisc.edu
Hello! I am a final-year PhD student advised by Fred Sala in the Sprocket Lab. I’ve been fortunate to intern at Google Research and AWS AI Labs.
I like understanding why things work (or don’t) inside LLMs, and turning those insights into efficient and reliable methods for adapting them.
I will be joining Scaled Cognition as a research scientist in June 2026.
news
| Date | Update |
|---|---|
| Mar 11, 2026 | 🌟 Grow, Don't Overwrite: Fine-tuning Without Forgetting, the paper from my internship with Google Research last summer, is finally out! A very simple method that matches full fine-tuning on new tasks with almost zero forgetting. |
| Mar 3, 2026 | 🚨 Our new preprint, Weight Updates as Activation Shifts, is out! We move beyond trial and error by deriving a principled framework for activation steering. Code here. |
selected publications
- Preprint
- Preprint
- Preprint
- ICML 2024
- ICLR 2024
- UAI 2022 · Shoring Up the Foundations: Fusing Model Embeddings and Weak Supervision. In Proceedings of the Conference on Uncertainty in Artificial Intelligence, Aug 2022.
mentorship
I enjoy mentoring and collaborating with students. Some amazing undergrads I've worked with:
- Alexander Yun — Weight Updates as Activation Shifts (Fall'25–Spring'26) → Next: SWE in industry
- Yijing Zhang — Alignment: Simplified (Fall'24–Spring'25) → Next: PhD at UW-Madison
- Linrong (Chris) Cai — RoboShot (Fall'23–Spring'24) → Next: MSE at Princeton