Bay Area Private Schools Are Bringing AI into Their Classrooms: What Can They Tell Us About the Future of Education? 

The closing panel featured ten students discussing their experiences with AI. (Eleanor Jackson/Peninsula Press)

This fall, while most Bay Area public schools continued debating if and when AI tools should enter classrooms, private schools began integrating generative AI tools into their core curricula.

Less constrained by funding shortfalls and government bureaucracy, these schools can test tools and adjust curricula more nimbly, providing a window into the future of AI and education.

Than Healy, head of Menlo School, said the school is investing in AI because “these tools are the weakest they are ever going to be in our kids’ lives and if we really want to prepare them for the world that they’re gonna find, this has to be a part of it.”

In an interview over lunch at Khan Lab School’s November AI Summit, a gathering of a few hundred teachers, students, and AI startups, Healy said the one thing a teacher can’t do is ban AI outright.

Students at the conference, who are thinking deeply enough about AI to have spent their Saturday at one, largely believed that AI already enhances their education. One student on the closing panel said that “if you’re trained to use it as a tool…to make learning easier for you and to make your work higher quality then it is very useful and will actually boost your ability to learn.”

Students gave examples of how they use it in their learning: honing debate skills by recording themselves debating, then asking AI to critique their speech patterns and filler words; asking AI to simplify complex topics, like special relativity, so they could teach themselves the basics; and even creating AI clones of their teachers, with the teachers’ consent.

Douglas Kiang, a teacher at Menlo School, shared a story about how students in one of his coding classes worked on an app to help people with ALS communicate without speaking. Students started “vibe-coding” the product – using AI to write code – but they ended up caring so much about the project that they didn’t trust AI to do the job. They wanted to have ownership over it. 

When properly motivated, educated, and provided with the right AI tools, students can use AI as a way to enhance their learning rather than avoid it. 

However, even at Khan Lab School, one of the most AI-friendly schools in the country, students are concerned about how AI may be affecting their education.

On Nov. 11, Khan Lab School hosted its AI Edu Summit. (Eleanor Jackson/Peninsula Press)

Sam Leen, a senior at Khan Lab School, noticed his classmates beginning to use AI as a way to get to their destination faster instead of using it to enhance their learning journey. 

“When people take the thing that AI says and put it on the page, it doesn’t go through their own head,” said Leen. “The biggest debate that I’ve seen in AI is between people who think that the result of AI is really good and people who are worried that the intermediary is being lost.”

This past semester, Leen grew concerned enough about AI’s impact on education that he researched, authored, and rolled out an “AI Honor Code” for Khan Lab School seniors. As of Dec. 4, all 89 high school students and their teachers had signed it.

Leen used his love of philosophy to build the honor code, boiling it down to just four bullet points designed to preserve the “intellectual dimension of human being.” 

The first two bullets direct students to avoid using AI for the skills an assignment is designed to teach (for example, don’t use AI to write code if you’re learning to code) and to evaluate and interpret any knowledge AI provides. The other two direct teachers to explicitly explain the purpose of their lessons and to identify the skills that AI will never be able to substitute for in their class.

Teachers at Khan Lab School are working to help their students understand these tools, their role in society, and their potential flaws. This fall, Kanishka Seth, the ed tech lead and humanities specialist at Khan Lab School, taught a class called “Building with AI” focused on showing students how these systems work and demonstrating the implicit bias present in both humans and algorithms.

She co-created the class with a parent who worked at Meta and used his experience to create interactive games and to provide insights into how the systems are designed in the real world. 

After the success of the first semester, Seth and her colleagues have proposed a mandatory AI class for every middle schooler, “just like a research skills or physical education class.”

Of course, the school already teaches a range of digital citizenship lessons that touch on the safe use of AI. However, the new course would “aim to address the most pressing trends and concerns around unrestricted use of AI, and also inspire students to come up with solutions and boundaries.”

While private schools in Silicon Valley are grappling with precisely how to integrate AI into the classroom, some public school students are also using AI to enhance their education and are craving more guidance from their teachers.

Kai Etkin presents his research on AI’s impact on reading comprehension. (Eleanor Jackson/Peninsula Press)

Kai Etkin, a junior at Los Altos High School, presented a research paper on AI and reading comprehension that he published with his brother and their guidance counselor. 

Etkin said that AI policy and AI use look very different at his school than at the private schools he heard from at the conference.

While students at Menlo might be required to use AI, at Los Altos, a public high school, “the policy is just no AI,” said Etkin. He didn’t fault his teachers or administrators for the gap, but he hoped the policy would change soon, since it “completely ignores all the potential benefits of AI.”

Many public schools are actively working on that change, trying to set the right guardrails, pick the right tools, and train their teachers.

State legislators are trying to provide more concrete guidance for districts. In California, Senate Bill 1288 established a working group on artificial intelligence in public schools, but the group isn’t set to deliver its guidance until July 2026.

Until then, public schools are left to their own devices to navigate this challenging new arena. 

Correction: This story has been updated to correct the name of the Khan Lab School’s November AI Summit.

Author

  • Eleanor Jackson

    Eleanor (Ella) Jackson grew up in Houston, Texas. She graduated from Colby College in 2019 with a BA in Global Studies and a minor in Creative Writing. After graduation, she packed up her bags and moved to Washington, D.C. where she joined the World Resources Institute’s (WRI's) Urban Mobility team and began working on vehicle electrification projects in the United States, Latin America, and Asia. She spent the majority of this time researching the intersections of vehicle electrification and environmental justice in the United States for WRI's Electric School Bus Initiative. She then built out both student and teacher engagement workstreams to bring the project closer to those most impacted by the transition. She plans to continue working to accelerate the climate transition in a way that acknowledges, addresses, and corrects injustice. At Stanford, Ella plans to learn how to weave together narratives and data to share impactful stories and to build connections between communities both locally and globally.
