I'm a scientist in the Bay Area with an unhealthy love of complicated machines.
I currently work at Google DeepMind, where I try to summon spirits with matrix multiplications. I work with the JAX team and help develop the Flax neural net library. Since 2018 I've worked on scaling language models, building frameworks such as T5X and helping train large models such as PaLM and Gemini. Generally, I work on new modeling techniques and performance engineering at scale for deep learning.
I founded a startup that built a high-throughput molecular cloning, sequencing, and selection system for removing errors from synthetic DNA libraries. I was one of the first employees at Cell Design Labs, engineering new CAR-T cell therapies with synthetic Notch receptors before the company was acquired by Gilead. I've also worked shorter stints at other startups applying deep learning methods to medicine and biology.
I studied biophysics with Chris Voigt and Wendell Lim at UCSF, where I developed optogenetic tools based on the plant phytochrome-PIF system. I was then a postdoc in the Deisseroth Lab at Stanford, applying light-field microscopy to neuroscience.
I've made some toys on the internet that people seem to like:
I've given lots of talks, but most have disappeared from the internet.