Zack Ankner

I am a second-year undergraduate at MIT studying Computer Science and Mathematics. I work in the Programming Systems Group (PSG), led by Michael Carbin, where I am supervised by Alex Renda. At PSG I have led both a project on Transformers that are invariant to variable renaming and an empirical investigation of how data dimensionality affects neural network prunability. I also work at MosaicML, where I study the pre-training of large encoder language models.

More broadly, I am interested in a wide range of topics and am happy to chat about anything ML-related, so feel free to reach out. My current research focuses on improving large language model pre-training and on energy models for language modeling.

Papers (* denotes equal contribution)

Dynamic Masking Rate Schedules for MLM Pretraining

Zachary Ankner*, Naomi Saphra, Davis Blalock, Jonathan Frankle, Matthew L Leavitt

Preprint

3D Neural Field Generation using Triplane Diffusion

J. Ryan Shue*, Eric Ryan Chan*, Ryan Po*, Zachary Ankner*, Jiajun Wu, and Gordon Wetzstein

CVPR 2023, Poster

Project page, Code

The Effect of Data Dimensionality on Neural Network Prunability

Zachary Ankner*, Alex Renda, Gintare Karolina Dziugaite, Jonathan Frankle, and Tian Jin

NeurIPS 2022, ICBINB Workshop

EntailSum: An Entailment-Based Approach to Aspect-Based Text Summarization with Automated Aspect Adaptation

Zachary Ankner*, Purvaja Balaji, Ye Zhu, Chun Keat Hiew, Patrick Wang, and Amar Gupta

International Journal of Pattern Recognition and Artificial Intelligence