With AI in schools, local leadership matters more than ever


Credit: Julie Leopo/EdSource

Last week, the Trump administration’s draft executive order to integrate artificial intelligence (AI) into K-12 schools made national headlines. The order, still in flux, would direct federal agencies to embed AI in classrooms and partner with private companies to create new educational programs. The move comes as China, Singapore and other nations ramp up their AI education initiatives, fueling talk of a new “AI space race.” But as the world’s biggest players push for rapid adoption, the real question for American education isn’t whether AI is coming — it’s who will shape its role in our schools, and on whose terms.

AI is not simply the next classroom gadget or software subscription. It represents a fundamentally new kind of disruptor in the education space — one that doesn’t just supplement public education but is increasingly building parallel systems alongside it. These AI-powered platforms, often funded by public dollars through vouchers or direct-to-consumer models, can operate outside the traditional oversight and values of public schools. The stakes are high: AI is already influencing what counts as education, who delivers it and how it is governed.

This transformation is happening fast. In the Los Angeles Unified School District (LAUSD), for example, the district’s ambitious “AI friend” chatbot project, meant to support students and families, collapsed when its startup partner folded, exposing the risks of investing public funds in untested AI ventures. Meanwhile, major tech firms are pitching AI as a “tutor for every learner and a TA for every teacher,” promising to personalize learning and free up educators’ time. The reality is more complex: AI’s promise is real, but so are its pitfalls, especially when it bypasses local voices and democratic control.

The rise of AI in education is reshaping three core principles: agency, accountability and equity.

  • Agency: Traditionally, public education has empowered teachers, students and communities to shape learning. Now, AI platforms — sometimes chosen by parents or delivered through private providers — can shift decision-making from classrooms to opaque algorithms. Teachers may find themselves implementing AI-generated lessons, while students’ learning paths are increasingly set by proprietary systems. If local educators and families aren’t at the table, agency risks becoming fragmented and individualized, eroding the collective mission of public schooling.
  • Accountability: In public schools, accountability means clear lines of responsibility and public oversight. But when AI tools misclassify students or private micro-schools underperform, it’s unclear who is answerable: the vendor, the parent, the state, or the algorithm? This diffusion of responsibility can undermine public trust and make it harder to ensure quality and fairness.
  • Equity: AI has the potential to personalize learning and expand access, but its benefits often flow unevenly. Wealthier families and districts are more likely to access cutting-edge tools, while under-resourced students risk being left behind. As AI-powered platforms grow outside of traditional systems, the risk is that public funds flow to private, less accountable alternatives, deepening educational divides.

It’s tempting to see AI as an unstoppable force, destined to either save or doom public education. But that narrative misses the most important variable: us. AI is not inherently good or bad. Its impact will depend on how — and by whom — it is implemented.

The U.S. education system’s greatest strength is its tradition of local control and community engagement. As national and global pressures mount, local leaders — school boards, district administrators, teachers, and parents — must drive how AI is used. That means:

  • Demanding transparency from vendors about how AI systems work and how data is used.
  • Prioritizing investments in teacher training and professional development, so educators can use AI as a tool for empowerment, not replacement.
  • Insisting that AI tools align with local values and needs, rather than accepting one-size-fits-all solutions from distant tech companies or federal mandates.
  • Building coalitions across districts and states to share expertise and advocate for policies that center agency, accountability, and equity.

As Dallas schools Superintendent Stephanie Elizalde put it, “It’s irresponsible to not teach (AI). We have to. We are preparing kids for their future.” But preparing students for the future doesn’t mean ceding control to algorithms or outside interests. It means harnessing AI’s potential while holding fast to the public values that define American education.

The choices we make now — especially at the local level — will determine whether AI becomes a tool for equity and empowerment, or a force for further privatization and exclusion. Policymakers should focus less on top-down mandates and more on empowering local communities to lead. AI can strengthen public education, but only if we ensure that the people closest to students — teachers, families and local leaders — have the authority and resources to shape its use.

The world is changing fast. Let’s make sure our schools change on our terms.

•••

Patricia Burch is a professor at the USC Rossier School of Education and author of “Hidden Markets: The New Educational Privatization” (2009, 2020).

The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.
