BBL Speaker Series: Scaling Expertise via Language Models with Applications to Education
Talk Title: Scaling Expertise via Language Models with Applications to Education
Speaker: Rose Wang, Computer Science PhD candidate, Stanford University
Location: HBK 2105 and Zoom
Abstract: Access to expert knowledge is essential for fostering high-quality practices across domains like education. However, many novices—such as new teachers—lack expert guidance, limiting their growth and undermining student outcomes. While language models (LMs) hold potential for scaling expertise, current methods capture surface patterns rather than the latent reasoning experts use. In this talk, I’ll discuss how my research addresses this by (1) identifying problematic practices for intervention from noisy, large-scale interaction data, (2) developing benchmarks that measure the expert quality of practices, and (3) extracting latent expert reasoning to adapt LMs for real-time educational interventions. I’ll highlight how my methods have been deployed to improve K-12 education at scale, positively impacting millions of live interactions between students and educators.
Bio: Rose E. Wang is a Computer Science PhD candidate at Stanford University. She develops machine learning and natural language processing methods to tackle challenges in real-world interactions, with a focus on education. Her work directly improves the education of under-served students through partnerships she has cultivated during her PhD, including Title I school districts and several education companies, impacting 200,000+ students, 1,700+ teachers, and 16,100+ tutors in millions of tutoring sessions across the U.S., the UK, and India. Her work has been recognized with the NSF Graduate Research Fellowship, a CogSci Best Paper Award, a NeurIPS Cooperative AI Best Paper Award, an ICLR Oral, a Rising Star in Data Science award, a Building Educational Applications Ambassador Paper Award, and the Learning Engineering Tools Competition Award.