Want AI in Schools? Start With the Teachers
Serious AI integration requires serious investment in teacher expertise
AI is coming to classrooms whether we’re ready or not.
The real debate is whether it will amplify teachers—or quietly automate them.
So the last piece traveled a little farther than I expected.
That’s always a strange moment as a writer. You toss something out into the void, thinking a few colleagues might nod along, or look at you and shake their heads disapprovingly at a meeting, and suddenly thousands of people are reading it.
Which brings us to the natural next move: train’s moving, shovel coal.
So let’s talk about AI.
(Polite wave toward the SEO gods.)
I’ve written about this before. A while back, I published two pieces: “What AI Means for the 99%” and “The Kids Have AI, the Teachers Have Icebreakers.” The point of those articles wasn’t panic or prophecy. They were observations. Students were already experimenting with powerful tools, while the adult response often felt procedural and surface-level. Something big was happening, and we were responding with small strategies.
That imbalance hasn’t gone away.
If anything, it’s scaled up.
Editor’s Note: I’m writing from the perspective of my own district — an under-funded, below-average district in a state that is itself well below average in public education funding. Your mileage may vary, obviously. For example, it took my district roughly two years to release formal guidance on AI use in classrooms, and students were granted access to AI tools on the district network before those guidelines appeared. And we were using it on the down-low in classes before that, while district officials were publicly saying there was no way students could use AI in the schools. So…yeah.
Students are using AI. Teachers are experimenting with it. Or actively fighting it. Or somewhere in between. District leaders are drafting policy language about integration. Federal conversations increasingly frame AI literacy as essential preparation for the future economy. Adoption is accelerating, while the narratives surrounding it are, to put it politely, contradictory.
What remains less clear is whether we intend to prepare the people expected to make that integration meaningful.
Seeds…and Studies
This piece was sparked by a study that had been sitting on my computer for a while: “AI Use in Schools Is Quickly Increasing but Guidance Lags Behind,” published by RAND using their national survey panels. The report came out last September and is reinforced by similar studies.
At its simplest, the RAND study says this: Schools are already saturated with AI—used by over half of students and teachers—but most districts still haven’t provided clear rules, training, or guidance. Students are largely figuring it out themselves while adults argue about cheating. And the divide between AI skeptics and AI proponents inside school systems breaks down almost exactly as you’d expect.
That finding isn’t especially surprising to anyone working in schools, but RAND’s survey panels give it something classroom anecdotes don’t: scale.
I haven’t run my own survey, but I’d wager not much has changed in the five months since that report came out. And there isn’t a veteran teacher in this country who hasn’t heard this song before.
What I want to look at here is AI in education at the system level—as part of the larger design.
The Cleanest Signal Yet?
If you want to understand the systemic problem in American public education in one clear moment, look at how we’re handling AI.
At every level, from federal to local, schools are being told that AI must be integrated into classrooms. Teachers must prepare students for an AI-shaped future. Schools must innovate. It’s the moon shot that will rocket us into the future.
That part is clear.
What accompanies the mandate is harder to identify. There is no coherent national framework outlining what effective AI integration should look like across disciplines. There is no sustained professional structure built into the school calendar that allows teachers to redesign curriculum thoughtfully. There is no meaningful credentialing pathway recognizing AI literacy as a professional specialization for teachers. And there is no dedicated funding stream that matches the scale of the expectation.
In other words, the expectation is systemic, but the preparation is improvised.
Decades of research on educational technology adoption show that implementation depends less on the tools themselves and more on “facilitating conditions” such as institutional support, training, and organizational readiness.
Every reform eventually lands in the same place: a classroom, where a teacher has to translate policy language into something a room full of students can actually learn from. That translation requires time, expertise, and thoughtful design.
It’s also worth saying that many administrators are navigating the same contradictions teachers are. State directives arrive. Federal initiatives appear. Technology companies make promises. School leaders are asked to demonstrate innovation and accountability at the same time, often without additional resources. The pressure flows downward through the system, and eventually it reaches a classroom where someone has to make it real.
Right now, the system seems to assume that design and implementation will simply “happen.”
This pattern is not new. Education historian Larry Cuban has long documented how schools repeatedly overestimate the instructional impact of new technologies while underinvesting in the professional learning required to use them well.
What the Research Actually Says
This isn’t speculation. Technology integration research has been consistent for decades: tools alone do not change instruction. Teacher learning does.
Educational technology research has long reflected this reality. The widely cited TPACK framework, developed by Mishra and Koehler, argues that effective technology integration occurs only when teachers combine technological knowledge with deep pedagogical and subject expertise.
Research on technology-enhanced learning repeatedly emphasizes that digital tools alone do not transform instruction; their impact depends on how thoughtfully they are integrated into pedagogy.
Studies of technology adoption repeatedly show that meaningful classroom integration is tied to sustained, pedagogically grounded professional development.
A recent systematic review of teacher professional development and digital instruction similarly found that sustained training, collaboration, and institutional support are the strongest predictors of meaningful technology integration.
When teachers receive ongoing training tied to real classroom practice, instructional change deepens. When preparation is minimal or fragmented, adoption tends to remain surface-level, frustrating teachers and students alike.
Recent reviews of artificial intelligence in education suggest a similar pattern. One systematic review of nearly one hundred studies concluded that research attention has largely focused on applications of the technology itself, while comparatively little has been paid to preparing teachers to use those tools pedagogically. In practice, that means the technology is advancing faster than the professional structures needed to support it.
Teachers are not resisting AI.
Research on teacher technology adoption consistently shows that teachers’ beliefs, professional knowledge, and institutional support structures—not the tools themselves—determine whether digital technologies meaningfully reshape instruction.
Research also consistently shows that teachers generally hold positive attitudes toward classroom technology, but report barriers such as insufficient training, time, and institutional support as the main obstacles to meaningful implementation.
Our skepticism comes from a simple recognition: meaningful integration requires thoughtful design. Thoughtful design requires expertise. And expertise takes time.
The Funding Paradox
This is where the contradiction becomes hard to ignore.
In many states, public education funding is tightening or being gutted by design. Professional development funds are among the first areas trimmed. Planning time is compressed. Staffing gaps widen.
At the same time, schools are being told to prepare students for an AI-driven economy—and, sooner rather than later, to demonstrate that AI is being integrated into classrooms.
There’s no anger in saying this, just the usual confusion.
I’m a sixteen-year veteran. I’ve watched standards shifts, assessment reforms, device rollouts, and LMS transitions come and go. In every wave of change, meaningful transformation depended less on the sophistication of the technology and more on the seriousness of the investment in teacher learning.
When training was sustained and embedded, practice evolved. When training was perfunctory, practice complied.
I’m still not really good at Infinite Campus, even after using it all year. Our “training” consisted of being told, at the end of last year, that we’d be using it this year, plus a suggestion to watch some videos over the summer.
AI represents a larger shift than any of those earlier waves.
But structurally, we’re treating it like a plugin. Like switching from Blackboard to Canvas.
When a system mandates transformation but withholds investment, the problem isn’t teacher resistance; it’s structural incoherence.
And then there’s the critique that lands squarely.
AI and the Relational Work of Teaching
Some of the most revealing research compares AI-generated lesson plans with those designed by teachers. Generative systems often perform well in structural alignment. They can quickly produce clear objectives, organized sequences, and standards-aligned frameworks.
Where they tend to fall short is contextual nuance—the parts of teaching that depend on human judgment. Depth of inquiry, cultural responsiveness, and authentic intellectual challenge tend to emerge most strongly when teachers adapt and reshape what the tool produces.
Teaching requires noticing who didn’t sleep. Who didn’t eat. Who’s bored because the lesson is too easy, and who is quietly drowning because it’s too hard. Who’s faltering because a grandparent passed away last month.
It requires adjusting mid-lesson when the energy shifts or confusion surfaces. Responding when the temperature of the room changes—for any of a dozen reasons.
AI can scaffold, but it cannot read a room. It can’t tell a joke to lighten the mood, or push a student a little further with a quiet signal that you believe in them.
Teaching has always required more than structural coherence. International education research has similarly shown that the mere presence of digital technology rarely improves learning outcomes unless teachers redesign instruction around it.
And The Accountability Wave Is Coming…
This is not just a classroom pattern—it’s an institutional one. If you’ve been in education long enough, you know what follows any new initiative.
Measurement.
Soon enough, administrators and upper management will want evidence of AI integration. Walkthroughs will look for it. Lesson plans will reference it. Data dashboards will attempt to measure its effects. There will probably be school-, district-, and state-level “AI report cards.”
That instinct is understandable. If we introduce powerful tools and initiatives into classrooms, we should care about whether they improve learning. But accountability without preparation yields a predictable outcome: documentation rather than transformation. The box-checkers will have another box to check. And they’ll check it.
AI will appear in lesson plans. Screenshots will pollute slide decks. The language of innovation will be visible. But the underlying instructional design will not have shifted in the ways leaders hope—or have been ordered to produce.
How can I be sure? I’ve done this myself.
Teachers are not afraid of accountability. We live inside it. We’re accountability sinks—it all flows downward and eventually lands on us. That’s part of the job.
What we are wary of is being held accountable for outcomes tied to tools we were never given the time or training to master.
If We’re Serious, Then Be Serious
If AI integration is important enough to mandate, it is important enough to structure and fund. As a former Vice President once said, “Don’t tell me what you value. Show me your budget, and I’ll tell you what you value.”
Serious integration requires serious conditions. At minimum, that would include:
Bring real expertise into the room.
Professional learning should be led by people who understand AI’s pedagogical implications, limitations, bias structures, and ethical dimensions—not just its surface functionality. Expertise matters, and the field deserves professionals whose depth goes beyond a handful of workshops and an online certification. Ideally, these would be experts who can respond to the skepticism of veteran teachers with practical solutions and clear paths forward. You’re challenging our perceived usefulness, our validity, and the core of what we do. Some of us are going to push back.
We don’t need to be shamed or called Luddites. We’ve seen this before. Heck, there are probably a few of us who remember when VCRs were the tool that was going to revolutionize education. Help us understand how this is actually the real thing.
Provide sustained, in-contract learning time.
One-off workshops or optional after-hours modules cannot support tools that reshape assessment, feedback, and curriculum design. Teachers need embedded time during the contract day to experiment, reflect, and collaborate.
Research on technology integration consistently finds that sustained support systems—combining leadership backing, expert guidance, and peer collaboration—allow instructional practices to evolve over time rather than remain superficial.
If it’s important enough to revolutionize education at the classroom level, it should be important enough to fund. If it’s not important enough to fund, don’t expect teachers to jump on board.
Recognize AI mastery as professional expertise.
If AI literacy is important enough to mandate, it should connect to licensure credit, credentials, or compensation pathways. Professional growth should be treated as advancement, not just another plate teachers are expected to keep spinning. If the plan is to have local, school-based “AI-gurus” to lead, give them the time, money, and room to help other teachers adopt and develop best practices.
Clarify acceptable use of AI.
Students and teachers need consistent frameworks around authorship, academic integrity, and ethical boundaries. These expectations should be codified and enforced clearly rather than improvised classroom by classroom and left largely unknown to leadership.
Include teachers in the design process.
I understand that in many places, this horse has already left the barn.
Research on co-design models consistently shows deeper integration when teachers help shape how tools are implemented, rather than receiving directives after decisions have already been made.
And because it doesn’t get said enough: when it comes to education, teachers are the experts. Not consultants. Not upper management. The education experts are teachers. If you want us to do something well, involve us from the beginning. Want to know how new technology will work with kids? Ask the teachers.
None of this is radical. It’s simply designing the reform with the teacher in mind for once.
If this isn’t happening, we all need to push this conversation up the chain.
Teachers should point out the gap between expectations and support to administrators. Administrators should raise it with district leadership. District leadership should raise it with school boards, and school boards should raise it with state legislatures. And there should be accountability at every level. Too many people in education shrug and point to the status quo in the face of change, and, unsurprisingly, nothing changes.
If AI truly is the future we claim it is—if it matters for our students, the future workforce, and national competitiveness—then we should be able to find ears willing to listen at every level.
“This is the best we can manage for where we are now” isn’t good enough.
Otherwise, what are we doing any of this for?
Automation or Amplification
AI will be present in classrooms. That much is already clear. The battle is effectively over—if there was ever going to be one. But the real question is whether it arrives as automation or amplification.
Automation reduces professional judgment to a process, while amplification strengthens it with tools. The difference between those futures is not the software. It is whether we are willing to invest in the teachers expected to make it work.
If AI integration is important enough to reshape instruction, then it is important enough to reshape professional learning. Otherwise, we are placing ambitious expectations on schools while constraining the very resources required to meet them — or, as teachers call it, business as usual.
The AI moment feels less like a technology debate and more like a structural stress test. It reveals how education systems translate ambitious ideas into classroom reality. It exposes a familiar pattern: mandates flow downward, resources hesitate, and teachers absorb the implementation burden.
Take all of this not as an argument against AI, but rather a plea for coherence.
If we invest in teachers as experts, AI can amplify human judgment and deepen learning. If we attempt to bypass that investment in pursuit of efficiency, we risk thinning the professionalism that gives education its power.
Give teachers the time, training, and respect that match the mandate, and we will build classrooms that are intentional rather than reactive.
That’s not resistance.
It is simply a request that the math of a mandate finally make sense.
Thanks for reading, and hey - quick favor?
We’re not going to get any change made if we all just read this, nod, agree, and click “Like.” Can you do me a solid and share this with others, and hopefully up the chain to some decision makers? I’m not thinking things will change overnight, but I think that educational leadership and decision-makers need to know we see them, we see the decisions, and we know what’s best for our kids. Thanks.