We are pleased to announce an upcoming talk by Prof. Dr. rer. nat. Thomas Martinetz from Universität zu Lübeck, who will be presenting on January 9, 2025, at 15:00.
Thomas Martinetz earned his degrees in Mathematics and Physics from the Technical University of Munich and completed his Ph.D. in Theoretical Biophysics at the University of Illinois at Urbana-Champaign, USA. He gained industry experience at Siemens AG before becoming an Associate Professor at the Institute for Neuroinformatics in Bochum. Since 1999, he has been a Full Professor at the University of Lübeck, where he directs the Institute for Neuro- and Bioinformatics. He also chairs the Center for Artificial Intelligence Lübeck. His research focuses on developing neural network architectures and machine learning algorithms for pattern recognition and image analysis.
Abstract:
Traditional machine learning wisdom tells us that a neural network needs more training data points than it has parameters to be able to learn a given task. Learning theory based on the VC-dimension or Rademacher complexity provides an extended and deeper framework for this "wisdom". Modern deep neural networks have millions of weights, so one should need extremely large training data sets. That's the common narrative. But is it really true? In practice, these large neural networks are often trained with far less data than one would expect to be necessary. We show in experiments that even a few hundred data points can be sufficient for millions of weights. We provide a mathematical framework to understand this surprising phenomenon, challenging the traditional view.
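To make the phenomenon behind the abstract concrete, here is a minimal sketch (not the speaker's experimental setup) of what such an experiment might look like: a network with a few million weights is trained on only a few hundred points of a simple synthetic task and then evaluated on a large held-out test set. The task, architecture, and hyperparameters are illustrative assumptions chosen only so the script runs end to end.

```python
# Sketch: over-parameterised network trained on a few hundred points.
# All choices below (teacher task, layer widths, optimiser, steps) are
# illustrative assumptions, not taken from the talk.
import torch
import torch.nn as nn

torch.manual_seed(0)

d = 100                                   # input dimension
w_true = torch.randn(d)                   # hidden "teacher" direction

def make_data(n):
    """Binary labels given by the sign of a fixed random linear map."""
    x = torch.randn(n, d)
    y = (x @ w_true > 0).float().unsqueeze(1)
    return x, y

x_train, y_train = make_data(300)         # only a few hundred training points
x_test, y_test = make_data(10_000)        # large held-out test set

# Wide two-layer MLP with roughly 4.2 million weights.
model = nn.Sequential(
    nn.Linear(d, 2000), nn.ReLU(),
    nn.Linear(2000, 2000), nn.ReLU(),
    nn.Linear(2000, 1),
)
print("parameters:", sum(p.numel() for p in model.parameters()))

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):                  # plain full-batch training
    opt.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = ((model(x_test) > 0).float() == y_test).float().mean()
print(f"test accuracy with 300 training points: {acc:.3f}")
```

Despite having orders of magnitude more parameters than training examples, such a network typically ends up far from chance level on the test set, which is exactly the gap between classical sample-complexity intuition and practice that the talk addresses.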