K-12 World Reacts to Trump’s Executive Order to Block State AI Regulations

President Donald Trump signed an executive order this month aimed at blocking states from crafting their own regulations for artificial intelligence.

To win the “AI race,” U.S.-based AI companies “must be free to innovate without cumbersome regulation. But excessive State regulation thwarts this imperative,” according to the executive order.

It directs federal agencies to identify “onerous” state AI regulations and to pressure states not to enact them by withholding federal funding or challenging the laws in court. It also starts a process to develop a lighter-touch national regulatory framework that would override state AI laws.

The “patchwork” of regulations across 50 states impedes AI companies’ growth and “makes compliance more challenging,” the order says. The president has also said state regulations are producing “Woke AI.”

Lawmakers from both parties, as well as civil liberties and consumer rights groups, have pushed for more oversight on the quickly evolving, powerful technology.

Four states—California, Colorado, Texas, and Utah—have passed laws that set some rules for AI across the private sector, including limiting collection of certain personal information and requiring more transparency, according to the International Association of Privacy Professionals. Many states also have regulated parts of AI, according to the Associated Press, such as barring the use of deepfakes in elections and the creation of nonconsensual sexually explicit images.

At the K-12 level, at least two states—Ohio and Tennessee—require school districts to have a comprehensive policy about the use of artificial intelligence in schools, according to an Education Week tally.

Education Week asked K-12 organizations and ed-tech leaders how this executive order could affect the use of AI in schools. Below are their responses, which have been edited for length and clarity and include additional insights about how to put meaningful guardrails around the use of AI in education.

   The executive order addresses a real challenge we’ve experienced: districts navigating compliance requirements designed for foundation model developers like OpenAI, not for curriculum-aligned classroom tools. A clearer national framework is welcome. But removing barriers is only half the equation. The opportunity now is to define what responsible AI in education actually looks like—curriculum alignment, student-data protection, teacher control, and transparency into how AI supports instruction. Without that proactive framework, the field risks a race to the bottom that erodes district trust and invites stricter regulation later. We’re eager to work with the administration, other ed-tech leaders, and school systems to get this right.

— Arman Jaffer, CEO, Brisk Teaching, an ed-tech company

   Federal preemption is appropriate only when it establishes a clear, well-designed, widely accepted federal alternative. This executive order instead appears to displace state AI oversight without providing a meaningful federal framework to replace it.

— Keith Krueger, CEO, Consortium for School Networking, which represents school district chief technology officers

   AI offers an unprecedented opportunity to innovate in education. Broad federal restrictions on what states can and can’t do risk slowing the very innovation that AI promises to bring. Clear rules are essential in areas like student-data privacy, safety, and security. At the same time, schools have a responsibility to provide age-appropriate learning environments. Just as we don’t give students an unfiltered internet, educators need the ability to apply sensible safeguards and content protections to AI tools used in classrooms. Any approach should preserve schools’ ability to protect students while supporting educators in using AI responsibly to advance learning.

— Joseph South, chief innovation officer, ISTE+ASCD, which provides professional development about AI use in schools

   Rarely do I agree with Sen. Marsha Blackburn, Gov. Ron DeSantis, or Rep. Marjorie Taylor Greene, but it’s telling that they and so many other Republican policymakers have firmly stood up to Trump on AI to say: No, wrong, this is bad, stop it. … As for educators, it’s a well-worn truism that many have seen the latest ed-tech fads come and go and thus adopt a mindset of ‘this too shall pass’—this will serve them well here. It’s true that AI in education is more pernicious than many prior technologies, but this EO will surely be litigated to extinction in court, and it definitely won’t affect any of the day-to-day realities that teachers face with their students in the classroom. Once again, they’ll rightly see this as politicians monkeying with the education system in ways that are completely divorced from their actual needs. What a colossal waste of time and energy.

— Ben Riley, founder and CEO, Cognitive Resonance, a think tank that helps people understand how generative AI works

   This outrageous and likely illegal directive to sue states that exercise their right to regulate AI shows, in bold-faced type, the administration’s loyalty to Big Tech over kids, families, educators, nurses, and other workers. Let’s be clear—this is not about whether AI should or shouldn’t be used as a tool. It should. It’s about how to ensure that young people are safe, that educators and parents retain control, and that society is protected, not exploited. That is, after all, the entire purpose of regulation. … Our state lawmakers have stepped up because the federal government won’t. … We are working alongside states like New York and Kentucky so we don’t make the same mistake with AI that we did with social media. Right now, a generation of kids has become isolated, distracted, and addicted to devices because adults failed to act.

— Randi Weingarten, president, American Federation of Teachers

   Artificial intelligence has the potential to transform student learning and the professional experience of educators, but only if it is regulated responsibly, with safety, privacy, and environmental harms in mind. Unfortunately, the Trump administration appears to be taking reckless steps to override state and local authority; sideline experts, educators, parents, and community leaders; and consolidate control in the hands of Big Tech. … It is important that we learn from our experience with social media. We failed to regulate that technology and hold Big Tech responsible, and as a result, the U.S. now faces a teen mental health crisis. AI is infinitely more powerful, and decisions about its use must be driven by the well-being and needs of humans—our students, our educators, our families—not corporations.

— Becky Pringle, president, National Education Association

   As policymakers debate how to govern artificial intelligence, there’s a real risk of overlooking the most foundational policy question of all: whether students are learning about how this technology actually works. You cannot regulate what people don’t understand. And if we wait to teach AI until every policy question is resolved, we will already be too late. We’ve seen this movie before. When social media reshaped young people’s lives, we failed to give them even a basic understanding of the algorithms shaping their feeds, their beliefs, and their insecurities. That gap in understanding left families, schools, and policymakers scrambling to respond after harm had already occurred. We cannot make that mistake again with AI. Our focus is on ensuring educators and education leaders have the resources they need to build foundational literacy that will enable students to use, question, and shape AI responsibly. The message to education leaders is simple: Move now. Preparing students for an AI-powered world cannot wait on perfect policy.

— Cameron Wilson, president, Code.org, an advocacy group for computer science education

   The executive order offers helpful clarity at the federal level, but it does not change the responsibility states and districts already carry for guiding the use of AI in schools. Education leaders remain on the front lines of determining how AI shows up in classrooms and how to ensure its impact is intentional, equitable, and well-governed. Districts tell us they want to avoid overreacting while also not standing still. Students and educators are already using AI tools—often outside formal policies—so schools must manage real-world use even as regulations continue to evolve. The most valuable work leaders can do right now is not rushing to adopt new technologies but establishing clear expectations, guardrails, and foundational AI literacy for staff and students. Thoughtful leadership in this moment means building capacity. Investing in AI literacy, governance frameworks, and professional learning allows districts to align with state priorities, support student safety, and stay adaptable as policies and technologies advance. This approach prepares learners to engage with AI responsibly, regardless of how regulatory details change.

— Lisa O’Masta, CEO, Learning.com, an ed-tech company focused on students’ digital skills

   Notably, the White House executive order does carve out exemptions for current state legislation related to child safety—that’s critical, because we cannot afford to wait to take action on ensuring we’re protecting kids. While there isn’t a definitive count on legislation directly focused on child safety, according to the Future of Privacy Forum, over 120 bills concerning the use of AI in education were introduced in 2025. Federal and state policies alone, of course, won’t protect our kids. Districts and school leaders, along with parents, need the tools and insights now on how students are interacting with AI, especially when we see signs of well-being risks. We learned these lessons too late in the Web 1.0 and social media eras, and kids paid the price. While the policy debates play out, our collective grassroots efforts need to ensure each school has the transparency and visibility to move as fast as technological innovation. The time to get this right is now.

— Tammy Wincup, CEO, Securly, an ed-tech company that provides device management solutions


