Building AI Readiness
Actionable K-12 Insights and Investment Pathways
By Amy Chen Kulesa, Marisa Mission, Mary K. Wells, and Alex Kotran
Introduction
Generative artificial intelligence (GenAI) technology is transforming how we live, work, and learn, and the U.S. education sector faces unique challenges in keeping pace (Appendix A). While businesses are using artificial intelligence (AI) to increase efficiencies and scientists are using it to accelerate discovery, the value proposition for K-12 education is more complex. AI could personalize learning, support teachers, and improve student outcomes, but it also brings risks such as data security and privacy, deepening learning gaps, and overreliance on technology. Unlike other sectors, education must weigh not just performance and cost, but also how AI affects human development, trust, and outcomes for all students. These considerations are even more urgent given the national data showing declines in reading and math1 and challenges in youth mental health2 following the COVID-19 pandemic. If implemented with care, AI could help reinforce core instruction and accelerate recovery — but without high-quality content and thoughtful integration, it risks compounding existing learning gaps.
The pace of AI advancement is accelerating. The global market for AI in education is anticipated to grow rapidly from $5.2 billion in 2024 to $112.3 billion in 2034.3 More than 50% of teens aged 13 to 18 say they have used AI chatbots or text generators.4 There are now hundreds of AI education tools across a range of use cases, from lesson plan generation to classroom management simulations to career coaching.5 As “reasoning” models, agentic AI, and early signals of artificial general intelligence (AGI) take shape, that pace of innovation and change will likely only quicken.
Meanwhile, the K-12 public education ecosystem is not yet AI-ready. While AI capabilities expand, most schools remain underprepared. As of late 2024, 58% of teachers reported receiving no professional development related to AI.6 Of districts that are providing training, many are taking a “do it yourself” approach.7 This readiness gap, which spans awareness, knowledge, skills, and infrastructure, threatens to widen digital divides and deepen existing inequities. The decisions made today by funders, developers, and education leaders will determine whether AI becomes a lever for improved student outcomes and teacher support, or a missed opportunity marked by fragmented adoption and unintended consequences.
In light of this urgency, Bellwether and aiEDU: The AI Education Project convened a group of philanthropic funders and education leaders in March 2025 for an action-oriented discussion on building AI readiness across the K-12 education ecosystem (Appendix B). This effort builds on Bellwether’s Learning Systems series, which outlines holistic recommendations for AI in education grounded in AI literacy, policy, data, research, and a commitment to keep humans in the loop. While that systems-level perspective remains an essential foundation, the March 2025 convening focused more narrowly on educator- and student-facing readiness, where the impact of AI is already being felt and where the need for practical, high-quality solutions is most immediate for teachers with limited resources and students in underserved communities. Over two days, participants named the most pressing challenges schools and educators face, proposed five areas where philanthropic investment could make a meaningful difference, and surfaced ideas to help guide how the education field can move forward with AI in a clear, coordinated, and equitable way.
This report synthesizes key takeaways from the convening into a guide for funders and education leaders working to prepare K-12 systems for AI readiness. At its core, this report calls for thoughtful investment in people, capacity, and infrastructure so that the K-12 education sector can proactively shape an AI-powered future that works for all teachers and students.
Problems of Practice
In discussions through panels and breakout groups, attendees identified several key challenges that must be overcome to build AI readiness in education. These challenges span three major “problems of practice”: educator preparation, student readiness, and the evolving role of teachers in an AI-enhanced learning environment.
Educator Preparation and Capacity
Many educators lack guidance and professional development for using AI, especially in historically underserved districts.8 For many school systems, AI is acknowledged as important but not seen as an urgent priority, so dedicated AI focus remains scarce. Truly effective capacity building is time- and resource-intensive, which makes it hard to prioritize amid competing demands and funding constraints. Furthermore, the breakneck pace of AI advancement complicates training efforts — by the time educators learn one tool, new tools and/or features have emerged.
Concerns regarding AI’s role in society, data privacy, and potential bias in tools and products also make educators hesitant to embrace future AI use.9 If these preparation gaps persist, AI adoption could be uneven and exacerbate inequities, with only pockets of educators able to leverage new tools effectively.
The “people side” of innovation is lagging behind the technology, and building educators’ capacity is essential for any AI innovation to truly take root in K-12 classrooms across the country.
Student Readiness for an AI World
Students are often engaging with AI before schools have had a chance to respond. Conversations about AI’s opportunities and risks are not yet happening in most classrooms, leaving students without a foundational understanding of how these tools work or how to use them responsibly. Even in the postsecondary ecosystem, 86% of students globally are regularly using AI in their studies, yet 1 in 2 students do not feel AI-ready.10
One convening participant observed that it is “harder than ever to be a kid” when GenAI can act as an all-too-easy helper or companion, which could compromise students’ intellectual independence and social development.11 As GenAI handles increasingly complex cognitive tasks, students need stronger preparation in higher-order skills like critical thinking, creativity, and ethical reasoning.
If K-12 schools are not proactive, only a subset of students (likely in well-resourced schools) will gain AI proficiency and continue to develop critical thinking skills. As their peers get left behind, a new kind of digital divide will emerge.
Evolving Teacher Roles in the Age of AI
AI has the potential to reshape what teachers do and how their roles are perceived. Emerging AI tools often focus on efficiency to streamline tasks like grading, lesson planning, and basic tutoring — an appealing prospect amid widespread teacher burnout12 and staffing shortages.13 Yet teaching is more than a series of tasks; it is a deeply relational and adaptive profession. Teachers need space to wrestle with planning and pedagogy to grow their expertise and respond meaningfully to students’ needs.
Overreliance on AI risks deskilling educators14 and weakening the teacher-student connection.15 At the same time, competing priorities within schools16 and/or lack of prioritizing AI in teacher preparation programs17 — as well as teacher concerns about job security,18 classroom relevance,19 and diminished prestige20 — could limit the profession’s ability to evolve alongside GenAI.
AI can enhance and elevate the teaching profession, but realizing that promise requires K-12 districts to develop a clear, shared vision; take an inclusive approach that brings educators into the conversation; and stay committed to preserving the deeply human aspects of teaching.
Five Key Investment Areas
Addressing these problems of practice requires sustained, strategic investment across the K-12 education ecosystem. While many levers — such as policy, data, research, and human relationships — remain essential, this report focuses on a targeted set of five areas where philanthropic investment can play a catalytic role specific to accelerating readiness and capacity for educators and students alike. Across each of these areas, bright spots are emerging with early-stage organizations and promising pilot programs that offer proof of concept. Yet their reach remains limited, and additional investments are needed to scale what works, close gaps, and build coherence. These five areas are not a comprehensive solution but rather a starting point for coordinated action that can support the broader K-12 system in adapting to an AI-powered future.
1. Educator Capacity Building for AI
Develop the human capital needed for an AI-enabled K-12 education system through targeted training, new roles, and professional development.
Building an AI-ready K-12 education system starts with investing in people. Even the most powerful AI tools are only as effective as the educators who use them.21 This investment area focuses on developing educator capacity through targeted training, new professional roles, and sustained support — ensuring that human judgment, creativity, and expertise remain central in AI-integrated classrooms.
Opportunities span the educator pipeline. Pre-service teacher preparation programs can integrate foundational AI literacy and pedagogy, so new teachers enter the profession ready to engage with emerging tools. For current educators, robust in-service training and coaching are essential — not one-off workshops but ongoing professional learning communities where teachers can explore, test, and refine AI-supported practices and hone their teaching skills.
Specialized roles such as AI instructional coaches or ed tech leads can serve as school-based experts and peer mentors, helping to build collective capacity among school staff over time.
Furthermore, there are opportunities to connect educators more directly with the AI field. Teacher externships and cross-sector advisory boards could expose educators to cutting-edge developments and broaden their instructional perspectives.22 But for these models to be effective, they must go beyond isolated partnerships. Systematizing opportunities through structured matching, incentives, and accessible resources will ensure that all teachers benefit, not just the most well-connected.
Investing in educator capacity is not just about supporting today’s classrooms; it is also about shaping the future of the profession so that educators model the kind of lifelong learning they aim to cultivate in students. This people-first approach ensures that AI integration enhances — not replaces — the human elements of teaching, and that all educators, regardless of setting, are equipped to lead in an AI-powered world.
“Preparing educators to understand, use, and engage with AI is critical for navigating the future. This focus in preparation and development will empower both teachers and students in the evolving world of AI.”
—Tim Hemans, Executive Director of College and Career Development, Gwinnett County Public Schools, Georgia
2. District Capacity and Implementation Support
Help school systems pilot, implement, and scale AI initiatives through coordinated support and knowledge-sharing.
To move from experimentation to systemwide impact, K-12 schools need coordinated support for AI adoption. While teacher training is essential, it must be matched with district-level capacity to pilot, implement, scale, and learn from AI initiatives effectively. Achieving this requires clear road maps, shared resources, and sustained technical assistance.
Philanthropy can help fill this gap by investing in structured implementation support for districts at various stages of readiness. This could include funding pilot programs in a range of settings — urban, rural, and under-resourced districts — to test AI tools in real classrooms, then turning those lessons into practical resources like toolkits, case studies, or implementation guides that other districts can use. A dedicated technical assistance hub or implementation network could offer districts access to expert guidance, peer learning opportunities, and coaching on critical areas like policy development, workforce training, and curriculum integration.
This investment is not just about selecting the right tools — it is also about building the infrastructure and leadership muscle to adopt them strategically. Without such support, districts risk “reinventing the wheel” or making costly, avoidable mistakes in isolation.23 A shared implementation infrastructure would accelerate collective progress, reduce duplication of effort, and help ensure that smaller or under-resourced districts can participate fully in AI innovation — not just well-funded early adopters.
Investing in district AI implementation capacity transforms one-off successes into scalable, sustainable change. It helps K-12 education systems move from reactive experimentation with AI toward proactive, coherent AI strategies that serve all students.
3. Evidence-Based Tools That Enhance Teaching (Shaping “Smart Demand”)
Shape the ed tech market by strategically investing in AI tools that enhance — not just streamline — teaching practices, cultivating informed demand for pedagogically sound innovations in a crowded, fast-moving marketplace.
Philanthropy can shape the direction of AI innovation in education through strategic investments in tools that elevate teaching quality and research-backed practices, not merely efficiency. While many existing AI tools focus on surface-level productivity — like automating email responses or administrative tasks — this investment area prioritizes pedagogy, supported by research and intentionally designed with guardrails that guide educators toward exemplary instructional practices. The goal is not just easier teaching, but better teaching.
With an influx of new tools on the market, it is understandable that practitioners and funders are approaching new offerings with caution. However, the AI hype makes it even more important to invest in and elevate high-quality options. By funding early-stage research and tools that prioritize high-quality teaching, philanthropy can raise expectations across the ed tech sector, influencing AI developers to build tools collaboratively with educators rather than merely for them. Additionally, funders can support consortia of K-12 districts to clearly articulate needs, develop evaluation criteria, and shape early demand — steering the market toward effective innovations rather than superficial solutions.
Several attendees emphasized that philanthropy should not just fund end-stage products. It should also play a role in bridging research and product development earlier — bringing instructional rigor into AI tool design processes. Evidence-based tools do not just offer stronger outcomes for students; they can also be market differentiators, opening new pathways for sustainable growth. In this moment of rapid expansion, funders can help shape demand and supply — by setting expectations, aligning incentives, and making quality the norm, not the exception.
“For funders looking to invest in high-quality AI solutions, prioritizing evidence and capacity building will be critical. Teachers and school leaders are being inundated with claims about AI’s potential. Funders can maximize their impact by backing early-stage research and development, with an emphasis on implementations that meet the needs of all students.”
—Cameron White, Senior Partner, NewSchools
4. Quality Vetting and a Clearinghouse for AI Tools
Establish a national independent intermediary to help educators and system leaders vet AI-powered tools for quality, efficacy, and safety.
Philanthropy can invest in an independent intermediary to help K-12 educators and district leaders confidently navigate the crowded landscape of AI-powered tools. With hundreds of new AI products flooding the market, school leaders and educators face overwhelming choices, often relying on informal word-of-mouth recommendations or limited pilot experiences rather than systematic evaluation. A national independent clearinghouse — similar to a Consumer Reports for AI in education — could rigorously vet and certify products based on criteria such as bias mitigation, data privacy, instructional alignment, and demonstrated efficacy.
While centralized vetting can support better decisions, it may not be enough to change how schools choose what to buy. Districts still rely heavily on trusted peer recommendations when making procurement decisions, underscoring the need to pair rigorous evaluation with active dissemination strategies and practitioner networks. Additionally, given the rapidly evolving nature of AI technology, traditional static, rubric-based evaluations of AI tools are unlikely to keep pace. Instead, ongoing benchmarking — leveraging agile AI performance indicators and dynamic feedback loops — can more effectively track tools’ evolving capabilities and relevance. The goal is not to endorse more tools, but to elevate the right ones — those that are grounded in research, built to enhance instruction, and tested for real-world impact.
Creating a trusted vetting mechanism can protect districts from adopting unsafe or ineffective tools. It can also incentivize developers to build toward real-world impact and safety rather than marketing hype. Philanthropy can help shape the ed tech market by establishing clear benchmarks and standards that elevate effective, safe, and instructionally sound AI tools, making responsible adoption more likely across the field.
“Teaching is a social profession, and AI won’t replace teachers, but it can either erode or elevate the role, depending on how we design and deploy it. If districts, ed tech innovators, and funders act with intention, we can create AI tools that enhance teacher longevity, effectiveness, and student connection. Let’s not repeat the mistakes of social media by moving fast without forethought.”
—Oliver Sicat, CEO and Co-Founder, Ednovate
5. Field-Building and Storytelling Guiding AI in Education
Build understanding by elevating a range of perspectives and connecting efforts across educators, communities, and AI tool developers to align on a shared vision.
The current conversation around AI in education is often fragmented — driven by hype, fear, or a focus on specific tools — without a shared sense of purpose. Convening attendees noted the need to build a clearer understanding of what AI readiness looks like for K-12 students, teachers, and schools. That work includes not only technical definitions and standards but also the stories and examples that make the purpose of AI real and relevant for communities, families, and educators.
Rather than promoting a top-down vision, attendees emphasized the importance of surfacing an array of grounded narratives — especially from the people closest to classrooms. These stories could inform the creation of model profiles of “AI-ready” schools, guidance on what students should know about AI by graduation, or examples of effective classroom use. Attendees also highlighted the importance of embedding human-centered values — like equity, relationships, and student agency — into how AI’s role in learning is framed.
Furthermore, AI in education must center the perspectives of those closest to the work — teachers, students, families, and school leaders — to ensure the vision is not only technically sound but also resonant, reflective of the school community, and enduring. Engaging parents and caregivers is especially critical. Too often, education reform efforts have given too little attention to families and caregivers. With the rise of school choice policies such as education savings accounts,24 alternative learning models,25 and parent-directed learning tools, families are playing a more active role in shaping students’ educational paths both in and out of school.26 AI will inevitably intersect with those choices, creating new risks and opportunities. The education sector must treat families not just as stakeholders, but as authentic collaborators in this moment, ensuring their insight and agency help to shape the future of AI in K-12 education.
This work cannot be done in silos. Field-building requires collaboration among educators, nonprofit leaders, policymakers, funders, researchers, and AI developers. Cross-sector coordination can help align language, elevate promising practices, and bring greater coherence to a fast-moving space. Attendees also pointed to lessons from past education movements: Reforms like the Common Core State Standards struggled without local buy-in,27 while efforts like the science of reading gained traction by lifting up proximate voices and building trust.28 A successful approach to AI must follow that latter path — grounded in lived experience, strengthened by collaboration, and focused on shared understanding that can guide action across the field.
Conclusion
The conversation around AI in education is shifting, from curiosity to clarity, from experimentation to urgency. Where leaders once asked, “What can AI do?” they are now asking, “What should it do, for whom, and to what end?” The focus is turning toward purpose, coherence, and impact.
Insights from the Bellwether–aiEDU convening point to a clear path forward: one rooted in strategic alignment, collaboration, and community voice. AI in education will not succeed through technology alone. Lasting progress depends on the alignment of tools, training, policy, and local context. Rather than overinvesting in any single piece of the puzzle, funders and education leaders must pursue solutions that address clear needs, are anchored in evidence, are grounded in shared values, and work across roles and systems. Most importantly, these efforts must center student learning, ensuring that AI helps all young people grow, think critically, and thrive.
Strong partnerships are essential. No single organization, whether a district, nonprofit, research lab, or ed tech firm, can go it alone. Cross-sector collaboration helps scale what works, reduces duplication, and builds the connective tissue between innovation and classroom practice. Philanthropy has a unique role to play in building this collaborative infrastructure: funding networks, intermediaries, and relationships that move ideas from pilot to practice.
This moment is also a chance to shape what “good” looks like. District leaders, educators, and funders can set the bar for AI tools that are safe, that are effective for all students’ learning (especially those furthest from opportunity), and that support, rather than replace, educators. At the same time, healthy skepticism must be part of the equation. Embedding critical thinking about AI into student learning and decision-making can help the K-12 education sector demand more transparency and accountability from the tools it adopts.
The choices made today will shape how AI shows up in classrooms for years to come. If the sector acts with strategy and intention, philanthropic, district, and ed tech leaders can work together to ensure that AI does not just enter education but truly strengthens it — bolstering learning, uplifting educators, and helping every student thrive in a rapidly changing world.
“The real upside [of GenAI] is not in students learning from AI but in collaborative environments where educators and young people leverage technology to tackle meaningful community problems. This impact-focused, human-centered, AI-leveraged approach develops agency, technical proficiency, and the collaborative skills essential for students to thrive in this fast-changing world.”
—Zachary Kennelly, Teacher and AI Pilot Lead, DSST Public Schools, Colorado
Appendices
Appendix A
GLOSSARY OF COMMON AI-RELATED TERMS
Agentic Artificial Intelligence (Agentic AI): The key differentiator of agentic AI models (“agents”) or systems (groups of agents working together) lies in the machine’s ability to make decisions and act autonomously.29 AI agents can react to dynamic environments or inputs and do not need continuous human interaction to execute complex, multistep tasks. Increasingly, agents can also use computer vision to interact with other computer programs or applications and take actions based on a prompt or task.30
AI Readiness: Ensuring students, educators, and school systems have the skills, tools, and critical-thinking mindsets to navigate and shape a world increasingly influenced by AI. For more, see aiEDU’s AI Readiness Framework, which introduces a comprehensive way for students, teachers, and school districts to understand, apply, and evaluate what being “AI-ready” means in a time of rapid technological change. (Note: AI Readiness differs from AI Literacy, which refers to building and maintaining a baseline understanding of the technology.)
Artificial General Intelligence (AGI): Generally used to describe a level of AI that can fully replicate or even exceed the cognitive abilities of human intelligence across any task,31 although the exact benchmarks that signal its achievement are widely debated.32 While AGI is still a concept more than a reality, leaders of technology companies such as Anthropic and OpenAI are seeing “systems that start to point to AGI”33 and have predicted that technology might achieve AGI as early as 2026 or 2027.34
Generative Artificial Intelligence (GenAI): Models that use algorithms, machine learning, and deep learning to create entirely new content. These models use families of artificial neural networks in combination with techniques like natural language processing (which allows computers to process human language) and Large Language Models (or LLMs, which allow computers to respond in human language).35 The result is a system that can generate new data in the form of “audio, code, images, text, simulations, and video.”36
Appendix B
AI READINESS CONVENING PARTICIPANTS, MARCH 2025
In addition to the authors, the following individuals lent their expertise to this report:
Allison Scott, Kapor Foundation
Emma Doggett Neergaard, The AI Education Project
Pete Fishman, NewSchools
Andrew J. Rotherham, Bellwether
Ian Connell, Charter School Growth Fund
Rebecca Holmes, Colorado Education Initiative
Anona Shugart Walker, Google.org
Kenji Treanor, Stuart Foundation
Robert Crosby III, Valhalla Foundation
Babak Mostaghimi, LearnerStudio
Kevin Hall, Charter School Growth Fund
Shayne Spalten, Charles and Lynn Schusterman Family Philanthropies
Cameron White, NewSchools
Loni Mahanta, The AI Education Project
Tiffany Taylor, GSV Ventures/ASU+GSV Summit
Christian Pinedo, The AI Education Project
Mark Teoh, Raikes Foundation
Tim Carey, Chan Zuckerberg Initiative
Don Daves-Rougeaux, California Community College Chancellor on Workforce Development, Strategic Partnerships, and Generative AI
Neeru Khosla, CK-12 Foundation
Tim Hemans, Gwinnett County Public Schools, Georgia
Eden Xenakis, Bezos Family Foundation
Oliver Sicat, Ednovate
Zachary Kennelly, DSST Public Schools, Colorado
Endnotes
- “Explore NAEP Long-Term Trends in Reading and Mathematics,” The Nation’s Report Card, https://www.nationsreportcard.gov/ltt/.
- Zara Abrams, “Kids’ Mental Health Is in Crisis. Here’s What Psychologists Are Doing to Help,” Monitor on Psychology 54, no. 1 (January 1, 2023), https://www.apa.org/monitor/2023/01/trends-improving-youth-mental-health.
- “AI in Education Market Size, Share and Trends 2024–2034,” Precedence Research, updated August 9, 2024, https://www.precedenceresearch.com/ai-in-education-market.
- Mary Madden, Angela Calvin, Alexa Hasse, and Amanda Lenhart, “The Dawn of the AI Era: Teens, Parents, and the Adoption of Generative AI at Home and School,” Common Sense Media, 2024, 4, https://www.commonsensemedia.org/sites/default/files/research/report/2024-the-dawn-of-the-ai-era_final-release-for-web.pdf.
- Ben Kornell, Alex Sarlin, Sarah Morin, and Laurence Holt, “The Edtech Insiders Generative AI Map,” Edtech Insiders, November 14, 2024, https://edtechinsiders.substack.com/p/the-edtech-insiders-generative-ai.
- Lauraine Langreo, “’We’re at a Disadvantage,’ and Other Teacher Sentiments on AI,” Education Week, October 29, 2024, https://www.edweek.org/technology/were-at-a-disadvantage-and-other-teacher-sentiments-on-ai/2024/10.
- Melissa Kay Diliberti, Robin J. Lake, and Steven R. Weiner, “More Districts Are Training Teachers on Artificial Intelligence,” RAND Corporation, April 8, 2025, https://www.rand.org/pubs/research_reports/RRA956-31.html.
- Melissa Kay Diliberti et al., “Using Artificial Intelligence Tools in K-12 Classrooms,” RAND Corporation, April 17, 2024, 8–9, https://www.rand.org/pubs/research_reports/RRA956-21.html.
- Ibid, 8.
- “What Students Want: Key Results from DEC Global AI Student Survey 2024,” Digital Education Council, August 7, 2024, https://www.digitaleducationcouncil.com/post/what-students-want-key-results-from-dec-global-ai-student-survey-2024.
- Convening participant, personal communications, March 11, 2025.
- Sy Doan, Elizabeth D. Steiner, and Rakesh Pandey, “Teacher Well-Being and Intentions to Leave in 2024: Findings From the 2024 State of the American Teacher Survey,” RAND Corporation, June 18, 2024, https://www.rand.org/pubs/research_reports/RRA1108-12.html.
- National Center for Education Statistics, “Most Public Schools Face Challenges in Hiring Teachers and Other Personnel Entering the 2023–24 Academic Year,” U.S. Department of Education, October 17, 2023, https://nces.ed.gov/whatsnew/press_releases/10_17_2023.asp.
- Alexander M. Sidorkin, “Artificial Intelligence: Why Is It Our Problem?,” Educational Philosophy and Theory, May 2024, https://www.tandfonline.com/doi/full/10.1080/00131857.2024.2348810; John Bailey, “AI in Education: The Leap Into a New Era of Machine Intelligence Carries Risks and Challenges, but Also Plenty of Promise,” Education Next 23, no. 4 (2023), https://www.educationnext.org/a-i-in-education-leap-into-new-era-machine-intelligence-carries-risks-challenges-promises/.
- Abdulrahman M. Al-Zahrani, “Unveiling the Shadows: Beyond the Hype of AI in Education,” Heliyon 10, no. 9 (2024): e30696, 5, https://www.sciencedirect.com/science/article/pii/S2405844024067276.
- Julia H. Kaufman et al., “Uneven Adoption of Artificial Intelligence Tools Among U.S. Teachers and Principals in the 2023–2024 School Year,” RAND Corporation, February 11, 2025, 15, https://www.rand.org/pubs/research_reports/RRA134-25.html.
- Steven Weiner, Robin Lake, and Jessica Rosner, “AI Is Evolving, but Teacher Prep Is Lagging: A First Look at Teacher Preparation Program Responses to AI,” Center on Reinventing Public Education, October 2024, https://crpe.org/ai-is-evolving-but-teacher-prep-is-lagging/.
- Colleen McClain et al., “How the U.S. Public and AI Experts View Artificial Intelligence,” Pew Research Center, April 3, 2025, 12, https://www.pewresearch.org/internet/2025/04/03/how-the-us-public-and-ai-experts-view-artificial-intelligence/.
- Langreo, “’We’re at a Disadvantage,’ and Other Teacher Sentiments on AI.”
- Benjamin D. Parker, “Considering the Impact of AI on the Professional Status of Teaching,” The Clearing House: A Journal of Educational Strategies, Issues and Ideas 97, no. 6 (2024), https://www.tandfonline.com/doi/full/10.1080/00098655.2024.2441805.
- For example, Khan Academy Chief Learning Officer Kristen DiCerbo said that students often struggled to engage with the pilot Khanmigo tool, highlighting the importance of teacher guidance when using AI in classrooms. Tharin Pillay, “Kristen DiCerbo,” Time, September 5, 2024, https://time.com/7012801/kristen-dicerbo/.
- Convening participant, personal communications, March 11, 2025. Relevant examples include: “The EdSafe AI Industry Council,” EdSafe AI Alliance, https://www.edsafeai.org/industry-council; “First State Tech Partnership,” Tech Council of Delaware, https://techcouncilofdelaware.org/page/fstp.
- See example: Alex Spurrier and Marisa Mission, “The Leading Indicator: Issue Two,” Bellwether, July 11, 2024, https://bellwether.org/ai-newsletter/the-leading-indicator-issue-two/; Dana Goldstein, “A.I. ‘Friend’ for Public School Students Falls Flat,” The New York Times, July 1, 2024, https://www.nytimes.com/2024/07/01/us/ai-chatbot-los-angeles-schools.html.
- Juliet Squire, Kelly Robson Foster, Lynne Graziano, and Andy Jacob, “Public Money, Private Choice: The Components and Critiques of Education Savings Accounts,” Bellwether, March 25, 2025, 12–13, https://bellwether.org/publications/public-money-private-choice/.
- Juliet Squire and Alex Spurrier, “Some Assembly Required: How a More Flexible Learning Ecosystem Can Better Serve All Kids and Unlock Innovation,” Bellwether, August 2022, 7, https://bellwether.org/wp-content/uploads/2022/08/SomeAssemblyRequired_BetaByBellwether_August2022_FINAL.pdf.
- See example: Greg Toppo, “Homeschoolers Embrace AI, Even as Many Educators Keep It at Arm’s Length,” The 74, June 25, 2024, https://www.the74million.org/article/homeschoolers-embrace-ai-even-as-many-educators-keep-it-at-arms-length/.
- Phyllis W. Jordan, “Local Control Is Part of the Problem, a New Book About Education Contends,” Education Next, June 2, 2021, https://www.educationnext.org/local-control-is-part-of-the-problem-a-new-book-about-education-contends/.
- See example: David Casalaspi, Marisa Mission, and Hailly T.N. Korman, “From Policy to Impact: North Carolina’s Implementation of the Science of Reading,” Bellwether, January 2025, 6, https://bellwether.org/publications/from-policy-to-impact/.
- Mark Purdy, “What Is Agentic AI, and How Will it Change Work?,” Harvard Business Review, December 12, 2024, https://hbr.org/2024/12/what-is-agentic-ai-and-how-will-it-change-work.
- See example: “Introducing Computer Use, a New Claude 3.5 Sonnet, and Claude 3.5 Haiku,” Anthropic, October 22, 2024, https://www.anthropic.com/news/3-5-models-and-computer-use.
- Dave Bergmann and Cole Stryker, “What Is Artificial General Intelligence (AGI)?,” IBM, September 17, 2024, https://www.ibm.com/think/topics/artificial-general-intelligence.
- Lauren Leffer, “In the Race to Artificial General Intelligence, Where’s the Finish Line?,” Scientific American, June 25, 2024, https://www.scientificamerican.com/article/what-does-artificial-general-intelligence-actually-mean/.
- Sam Altman, “Three Observations,” blog post, February 9, 2025, https://blog.samaltman.com/three-observations.
- Dario Amodei, “Machines of Loving Grace: How AI Could Transform the World for the Better,” October 2024, https://darioamodei.com/machines-of-loving-grace; “Anthropic’s Recommendations to OSTP for the U.S. AI Action Plan,” Anthropic, March 6, 2025, https://www.anthropic.com/news/anthropic-s-recommendations-ostp-u-s-ai-action-plan.
- Faisal Kalota, “A Primer on Generative Artificial Intelligence,” Education Sciences 14, no. 2 (2024): 172, https://doi.org/10.3390/educsci14020172.
- Ibid., 8.
Acknowledgments, About the Authors, About Bellwether and aiEDU: The AI Education Project
Acknowledgments
We would like to thank the March 2025 convening participants who gave their time and shared their knowledge with us to inform our work. Thank you also to NewSchools and CK-12 Foundation for their financial support of this project.
We would also like to thank our Bellwether colleague Teresa Mooney for her input, as well as Christi Shingara and Janine L. Sandy for their support. Thank you to Amy Ribock, Kate Neifeld, Andy Jacob, McKenzie Maxson, Zoe Cuddy, Julie Nguyen, Mandy Berman, and Amber Walker for shepherding and disseminating this work, and to Super Copy Editors.
The contributions of these individuals and entities significantly enhanced our work; however, any errors in fact or analysis remain the responsibility of the authors.
About the Authors

Amy Chen Kulesa
Amy Chen Kulesa is a senior associate partner at Bellwether in the Strategic Advising practice area. She can be reached at amy.chenkulesa@bellwether.org.

Marisa Mission

Mary K. Wells

Alex Kotran
Alex Kotran is CEO of aiEDU: The AI Education Project. He can be reached at alex@aiedu.org.
About Bellwether and aiEDU: The AI Education Project
Bellwether is a national nonprofit that exists to transform education to ensure systemically marginalized young people achieve outcomes that lead to fulfilling lives and flourishing communities. Founded in 2010, we work hand in hand with education leaders and organizations to accelerate their impact, inform and influence policy and program design, and share what we learn along the way. For more, visit bellwether.org.
aiEDU: The AI Education Project is a 501(c)(3) nonprofit devoted to making sure that all students are ready to live, work, and thrive in a world where AI is everywhere. We work with education systems to advance AI literacy and AI readiness through high-quality curriculum, professional development, and strategic partnerships with states, school districts, and other systems. Learn more about the work we do at aiEDU.org.
© 2025 Bellwether and aiEDU: The AI Education Project
This report carries a Creative Commons license, which permits noncommercial reuse of content when proper attribution is provided. This means you are free to copy, display, and distribute this work, or include content from this report in derivative works, under the following conditions:
Attribution. You must clearly attribute the work to Bellwether and The AI Education Project, and provide a link back to the publication at www.bellwether.org.
Noncommercial. You may not use this work for commercial purposes without explicit prior permission from Bellwether and The AI Education Project.
Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one.
For the full legal code of this Creative Commons license, please visit www.creativecommons.org. If you have any questions about citing or reusing Bellwether or The AI Education Project content, please contact us.