About this Seminar

Throughout the history of machine learning, we have seen more and more automation of tasks, such as using neural networks to replace the tedious process of manual feature design. One of the next major steps is to automatically design and tune neural networks themselves: neural architecture search, and more generally, automated machine learning. In this talk, we discuss the next frontier in neural architecture search from both a theoretical and an empirical perspective. We start by describing performance prediction techniques – predicting the relative performance of a neural network before it is fully trained – and how they lead to advances in neural architecture search. Next, we present a case study of applying neural architecture search to a real-world application, face recognition, to achieve high accuracy and low bias. Finally, we switch gears to discuss theoretical results for data-driven hyperparameter optimization. We conclude by discussing future directions, including fundamental questions as well as applications in reticular chemistry and the climate sciences.

Colin White is Head of Research and Distinguished Scientist at Abacus.AI. He received his PhD from Carnegie Mellon University. His research focuses on automating the search for high-performing deep learning models, as well as explaining and de-biasing them, from both a theoretical and an empirical perspective. He is a recipient of the NDSEG Fellowship, has given tutorials on neural architecture search at AutoML’22 and the AutoML Fall School 2022, and is a program chair for AutoML’23.


Seminar Details
Seminar Date: Sunday, January 15, 2023, 12:00 PM - 1:00 PM
Status: Happened As Scheduled