They say deep learning is just curve fitting. But how good is it at curve fitting, exactly? Are DNNs as good as, or even better than, classical curve-fitting tools such as splines and wavelets? In this talk, I will cover my group’s recent paper on this topic and some interesting insights. Specifically, I will provide new answers to “Why are DNNs stronger than kernels?”, “Why are deep NNs stronger than shallow ones?”, “Why ReLU?”, “How do DNNs generalize under overparameterization?”, “What is the role of sparsity in deep learning?”, and “Is the lottery ticket hypothesis real?” — all in one package. Intrigued? Come to my talk and find out!
Link to the paper: arxiv.org/abs/2204.09664
Yu-Xiang Wang is the Eugene Aas Assistant Professor of Computer Science at UCSB. He runs the Statistical Machine Learning lab and co-founded the UCSB Center for Responsible Machine Learning. Prior to joining UCSB, he was a scientist with Amazon Web Services’ AI research lab in Palo Alto, CA. Yu-Xiang received his PhD in Statistics and Machine Learning in 2017 from Carnegie Mellon University (CMU). Yu-Xiang’s research interests include statistical theory and methodology, differential privacy, reinforcement learning, online learning and deep learning. His work has been supported by an NSF CAREER Award, an Amazon ML Research Award, a Google Research Scholar Award, and an Adobe Data Science Research Award, and has received paper awards from KDD'15, AISTATS'19 and COLT'21.
YOU ONLY NEED TO REGISTER ONCE TO ATTEND THE ENTIRE SERIES – We will send you email announcements with details of the upcoming speakers.
Register in advance for this webinar: https://usc.zoom.us/webinar/register/WN__0VhakI6Q6i3JsasdmNWcA
After registering, you will receive an email confirmation containing information about joining the Zoom webinar.
If the speaker approves recording of this AI Seminar talk, it will be posted on our USC/ISI YouTube page within 1-2 business days: https://www.youtube.com/user/USCISI.
Host: Mike Pazzani POC: Pete Zamar