Spring into Science
Join us for our second Spring into Science retreat! This all-day event will showcase work from across the Kempner community through talks, posters, and discussions. We strongly encourage students and fellows to submit their work!
Date: Wednesday, March 26, 2025
Time: 8:30am - 6:00pm


Registration
Attendance is free, but pre-registration is required. Registration is now closed.
Program
8:30-9am: Registration & Breakfast
9am-12pm: Session I Talks
12-1:30pm: Lunch Discussions
1:30-4pm: Session II Talks
4-6pm: Poster Session & Reception
Confirmed Speakers
- M Ganesh Kumar: A Model of Place Field Reorganization During Reward Maximization
- Rosie Zhao & Alex Meterez: Anamnesis in Language Models: RL Post-training Amplifies Behaviors from Pretraining
- Binxu Wang: An Analytical Theory of Power Law Spectral Bias in the Learning Dynamics of Diffusion Models
- Sara Fish: EconEvals: Benchmarks and Litmus Tests for LLM Agents in Unknown Environments
- T Anderson Keller: Galilean Equivariant Neural Dynamics
- Ann Huang: Measuring and Controlling Solution Degeneracy across Task-Trained Recurrent Neural Networks
Confirmed Posters
- A Fine-grained Video Editing Benchmark for Evaluating Emerging Diffusion and Rectified Flow Models, Minghan Li
- A Hopfield network model of neuromodulatory arousal state, Mohammed Osman
- A knowledge-grounded foundation model for AI-guided scientific discovery and precision medicine in neurological disease, Ayush Noori
- A Universe of Biomedical Knowledge Graphs, Iñaki Arango
- Active Electrosensing in Artificial Fish Collectives, Sonja Johnson-Yu
- Adaptive kernel predictors from feature-learning infinite limits of neural networks, Clarissa Lauditi
- AI/ML Open-Source Tools and Research Codebases for the Kempner Community, Max Shad and Sarah Leinicke
- Archetypal SAE: Adaptive and Stable Dictionary Learning for Concept Extraction in Large Vision Models, Thomas Fel
- ATOMICA: Universal modelling of molecular interactions, Ada Fang
- Beyond the Lazy versus Rich Dichotomy: Geometry Insights in Feature Learning from Task-Relevant Manifold Untangling, Chi-Ning Chou
- Challenge Me: Enhancing Conversational Consistency in LLMs to Mitigate AI Sycophancy via Questioning Feedback, Mengyu Wang
- ChatGPT Doesn’t Trust Chargers Fans: Guardrail Sensitivity in Context, Victoria Li
- Comparing robust neural networks to primate vision in a binary classification task, Bastien Le Lan
- Cortical reactivations as new compositions of pre-existing ensembles, Josh Stern
- CurveFlow: Curvature-Guided Flow Matching for Image Generation, Drake Du
- DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows, Jonathan Geuter
- Decoding Human Movement Objectives: AI-Driven Insights into Neuromotor Control for Adaptive Rehabilitation, Jordan Feldman
- Efficient optimization of ODE neuron models using gradient descent, Ilenna Jones
- Expand-and-Cluster: Parameter Recovery of Neural Networks, Flavio Martinelli
- LEAPS: A discrete neural sampler via locally equivariant networks, Peter Holderrieth
- Learning and Stability of Value Representation in Ventral Striatum, Farhad Pashakhanloo
- Learning Low-dimensional Latent Dynamics from High-dimensional Observations: Non-asymptotics and Lower Bounds, Yuyang Zhang
- Memory and Knowledge Injection in Gated Large Language Models, Xu Pan
- Multimodal Medical Code Tokenizer, Shvat Messica and Lukas Fesser
- Multimodal SAEs are not Input Compressors, Chloe Su
- No feature learning needed: stimulus sensitivity and readout tuning can explain complex odor discrimination in mice, George Cai
- Odors as “natural language”: sparse neural networks reinforced in mammalian olfactory systems and large language models, Bo Liu
- One fish, two fish, but not the whole sea: Alignment reduces language models’ conceptual diversity, Sonia Murthy
- Optimizing KV Cache for VQA, Lyndon Lam
- Phyla: Towards a Foundation Model for Phylogenetic Inference, Yasha Ektefaie
- Projecting Assumptions: The Duality Between Sparse Autoencoders and Concept Geometry, Sumedh Hindupur
- SD-LoRA: Scalable Decoupled Low-Rank Adaptation for Class Incremental Learning, Yichen Wu
- Sparse Hebbian Learning for Pattern Discrimination in Cerebellar Ensembles, Benjamin Ruben
- The cerebellar components of the human language network, Colton Casto
- The Effects of Complexity and Scale on the Fitness of Intelligence, Aaron Walsman
- The Geometry of Prompting: Unveiling Distinct Mechanisms of Task Adaptation in Language Models, Artem Kirasov
- Train-Test Task Alignment in a Solvable model of In-Context Learning, Mary Letey
- Traveling Waves Integrate Spatial Information Through Time, Mozes Jacobs
- Unified Predictive Model for Whole-Brain Neural Dynamics of Larval Zebrafish, Yu Duan
- Using Deep Networks to Model Development, Hierarchical Learning, and Robustness in the Primate Visual Stream, Dianna Hidalgo
Venue & Transportation
We are excited to hold our 2025 retreat at Loft on Two, an executive event center with sweeping views of Boston’s Seaport. Transportation options are below.
- Via public transit: Loft on Two is situated right next to South Station. If you’re coming from the SEC, the 70/86/66 bus will bring you to Harvard Square. Take the Red Line from there to South Station.
- Via bike: There are limited bike racks at the intersection of Purchase St. and Summer St. Bluebikes stations are available at South Station and at the corner of Summer St. and Surface Rd.
- Via car: Parking in this area is garage-only and limited, so we suggest using public transportation if possible. If you must drive, Loft on Two offers valet parking in their garage off Essex St., and additional garage parking is available at South Station.