I am a second-year undergraduate at MIT studying Computer Science and Mathematics. I work in the Programming Systems Group (PSG), led by Michael Carbin, where I am supervised by Alex Renda. At PSG I have led a project on Transformers that are invariant to variable renamings and an empirical investigation of how data dimensionality affects neural network prunability. I also work at MosaicML, where I study the pre-training of large encoder language models.
I am interested in a wide range of topics and happy to chat about anything ML-related, so feel free to reach out. My current research focuses on improving large language model pre-training and on energy models for language modeling.
Zachary Ankner*, Naomi Saphra, Davis Blalock, Jonathan Frankle, and Matthew L. Leavitt
Preprint
J. Ryan Shue*, Eric Ryan Chan*, Ryan Po*, Zachary Ankner*, Jiajun Wu, and Gordon Wetzstein
CVPR 2023, Poster
Zachary Ankner*, Alex Renda, Gintare Karolina Dziugaite, Jonathan Frankle, and Tian Jin
NeurIPS 2022, ICBINB Workshop
Zachary Ankner*, Purvaja Balaji, Ye Zhu, Chun Keat Hiew, Patrick Wang, and Amar Gupta
International Journal of Pattern Recognition and Artificial Intelligence