I am currently a student researcher at Google Brain and a PhD student at the University of Colorado, Boulder, advised by Michael Mozer. Previously, I worked as a research intern at Facebook Reality Labs, a machine learning research scientist at Sensory Inc., and a software engineering intern at LinkedIn, and I assisted in teaching the graduate-level machine learning course at CU Boulder. Outside of research, I enjoy rock climbing, skiing, and reading.
My general research interests lie in deep learning. Specifically, I’m interested in representation learning, deep metric learning, and few-shot learning. In particular, my research topics include:
- Learning supervised, distributed representations of complex data that model inter- and intra-class variance
- Designing deep, stochastic embedding methods, i.e., methods that embed data as distributions whose spread explicitly reflects uncertainty
- Learning disentangled representations that capture the relevant factors of variation in data
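To make the stochastic-embedding idea above concrete, here is a minimal NumPy sketch of embedding an input as a diagonal Gaussian rather than a point: two linear heads produce a mean and a log-variance, and samples are drawn via reparameterization. All names, dimensions, and weights are illustrative placeholders, not the implementation from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (placeholders, not from any paper).
FEATURE_DIM, EMBED_DIM = 16, 4

# Two hypothetical linear "heads": one for the mean of the embedding
# distribution, one for its log-variance (so the variance stays positive).
W_mu = rng.normal(size=(FEATURE_DIM, EMBED_DIM))
W_logvar = rng.normal(size=(FEATURE_DIM, EMBED_DIM))

def stochastic_embed(x, n_samples=8):
    """Embed a feature vector as a diagonal Gaussian and draw samples.

    Returns (mu, var, samples): the distribution's parameters and
    n_samples draws via the reparameterization trick.
    """
    mu = x @ W_mu
    var = np.exp(x @ W_logvar)         # per-dimension uncertainty, > 0
    eps = rng.normal(size=(n_samples, EMBED_DIM))
    samples = mu + np.sqrt(var) * eps  # reparameterized sampling
    return mu, var, samples

x = rng.normal(size=FEATURE_DIM)
mu, var, samples = stochastic_embed(x)
```

The variance head is what lets ambiguous inputs map to broad distributions and clean inputs to tight ones, which is the sense in which the embedding "explicitly reflects uncertainty."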
- Scott, T. R., Ridgeway, K., and Mozer, M. C. (2019). Stochastic Prototype Embeddings. Submitted for publication. Also arXiv:1909.11702 [stat.ML].
- Scott, T. R., Ridgeway, K., and Mozer, M. C. (2018). Adapted Deep Embeddings: A Synthesis of Methods for k-Shot Inductive Transfer Learning. In S. Bengio et al. (Eds.), Advances in Neural Information Processing Systems 31 (pp. 76-85). Curran Associates. Also arXiv:1805.08402 [cs.LG]. [Spotlight presentation; 4% acceptance rate]
- Scott, T. R., Ridgeway, K., and Mozer, M. C. (2019). Stochastic Prototype Embeddings. Proceedings of the 36th International Conference on Machine Learning: Workshop on Uncertainty and Robustness in Deep Learning. Link.