6709 Gates Hillman
Pittsburgh, PA, USA
I earned my Ph.D. from UMass Amherst, where I worked in the Information Extraction and Synthesis Laboratory with Andrew McCallum. Before that, I earned a B.S. in Computer Science with a minor in mathematics from the University of Maine, where I applied epidemiological models to the spread of internet worms with David Hiebeler. I have also spent time as an intern and research scientist at Amazon, IBM, and Facebook. You can find a formal bio for me here.
I do research at the intersection of natural language processing (NLP) and machine learning. My broad research objective is to bridge the gap between state-of-the-art NLP methods and the wide variety of users who stand to benefit from that technology, but for whom it does not yet work in practice. You can learn more about my research group, SLAB, here.
Maintaining a healthy work-life balance is important to me. Outside of work, I enjoy cooking and baking (mostly vegetarian and vegan), fermenting (kombucha, kimchi, yogurt, sourdough), DIY renovating and restoring my weird old house, and hiking and camping with my two dogs, Nala and Pepper.
In 2019 I backpacked the first 250 miles (southbound) of the Colorado Trail with Nala, and I hope to finish the trail soon! I also like to summit the high points of U.S. states. So far I have completed Colorado, Connecticut, Maine, Massachusetts, New York, and Pennsylvania, and I have come close in Vermont and New Hampshire.
I am also co-author of Plant Jones, a semi-intelligent plant who tweets negatively about water when thirsty, and positively when not. Code is available here. Plant has been staying away from social media lately, but is still living his best life in my kitchen today!
I started programming in middle school on my TI-83 calculator, and started using Gentoo Linux in high school, all self-taught out of a strong motivation to h4ck the planet. Now I’ve sold out and use a Mac, but can still satisfy some of that system debugging itch when I teach On-Device ML.
AI Impact Prize: "Energy and Policy Considerations for Deep Learning in NLP." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. July 2019.
Best Paper: "Linguistically-Informed Self-Attention for Semantic Role Labeling." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. October 2018.
Outstanding Paper: "Learning Dynamic Feature Selection for Fast Sequential Prediction." In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). July 2015.