Physics of Learning and Computation in Natural and Artificial Neural Networks
Speaker: Hidenori Tanaka
Join us for a Special Seminar co-hosted by the Center for Brain Science (CBS) and the Kempner Institute!
Talk Abstract: Neuroscience is experiencing accelerating advances in the scale and resolution of neural activity recordings from animals engaged in natural behaviors. At the same time, in machine learning, large-scale artificial neural network models are holding natural conversations and generating realistic videos. In this talk, I will show how we can take advantage of these technological advances to derive fundamental principles of neural learning and computation by integrating scientific methods from physics, neuroscience, and machine learning. This understanding, in turn, allows us to invent new algorithms that make AI models more reliable and efficient. I will present three key contributions: (i) a new framework for studying compositional generalization in multimodal generative models, (ii) a generalization of Noether's theorem from physics that explains how symmetry constrains the geometry of neural learning dynamics, and (iii) an interpretable-AI approach to decoding the retinal neural code for responses to natural scenes.
Northwest Building, Rm 243 and streaming (link to follow)