
Before We Hand Classrooms to AI, Let’s Ask the Right Questions

  • Writer: Jacqueline Vickery
  • Aug 27, 2025
  • 5 min read

When I moved from Texas to Colorado this summer, I didn’t expect my first taste of local education debates to come from fellow Dallasite Mark Cuban. At the National Governors Association meeting in Colorado Springs, Cuban sat down with Governor Jared Polis and proclaimed AI “the most consequential technology” in history, one that would “democratize education like never before” because “now all students have access to literally everything.”


As a media literacy researcher, educator, and program director, I’ve heard this story before. First it was the internet. Then smartphones. Then social media. Then iPads. The promises were grand. But history shows that technology without equity-centered design rarely solves systemic problems. More often, it amplifies them. If the past 30 years of ed-tech have taught us anything, it’s this: hype without guardrails leaves the most vulnerable paying the highest price.


The danger isn’t that teachers will ignore AI. It’s that we’ll hand it over without asking the hard questions: Who benefits? Who pays? Who gets left behind?


Lessons From the Past


Cuban claims AI is “unprecedented,” but transformative technologies have been here before. Let’s go back, way back, to the 15th century and another democratizing invention: the printing press.


The printing press revolutionized human communication, fueling the spread of science, politics, and art. But it wasn’t the machine alone that changed the world. It was the infrastructures and social systems that grew around it — publishers, libraries, postal systems, laws about who could read and write.


Mass literacy disrupted entrenched powers like churches, monarchies, patriarchies, and slaveholding economies. Yet those benefits were never equally distributed. Geography, wealth, race, and gender dictated who got to participate.


That’s the lesson: new technologies don’t reshape society in isolation. They interact with existing systems of power. AI is no different. Lofty promises and commercial agendas arrive first. The structural work to make those promises equitable lags far behind.


Access ≠ Opportunity


Cuban painted AI as a curiosity machine: kids asking questions, fact-checking, exploring. But students already do that with books, teachers, podcasts, films, mentors, and play.


What’s supposedly “new,” Cuban argued, is that kids can now access “every book ever published, every doctor, every mentor.” That’s simply false. Much of that content is behind paywalls or was scraped without permission — including my own books.


But even if universal access were real, access alone has never guaranteed opportunity. Remember the One Laptop per Child initiative? Laptops were handed out with the promise of revolutionizing education. But in many places, there was no electricity to charge them, no tech support when they broke, no teacher training to use them effectively. Technology alone didn’t erase structural inequities. AI won’t either.


The Hidden Threats of “Personalization”


Cuban touted AI’s ability to personalize learning. But personalization depends on surveillance.


These systems already collect and analyze vast amounts of student data — not just academic performance, but behaviors, choices, and interactions — to sort, predict, and shape futures.

In practice, this often means funneling some students into college-prep coursework while steering others into vocational tracks, low-wage labor pipelines, or even the school-to-prison pipeline. These patterns disproportionately affect lower-income students and students of color, reinforcing the very inequities AI is marketed as solving.


And here’s the uncomfortable truth: when students and teachers pour their knowledge, creativity, and problem-solving into AI, they’re not just “personalizing” their own learning. They’re training the very systems that could one day replace their roles.


The risks aren’t only social. They’re environmental. AI models consume staggering amounts of energy and water, far more than traditional search engines. In drought-prone states like Colorado, ignoring that footprint isn’t just shortsighted; it’s dangerous. There is no equitable education on a degraded planet.


A Familiar Sales Pitch


We’ve heard these promises before. Governor Spencer Cox of Utah even remarked that Cuban’s comments could have been lifted from speeches during the internet boom or the rise of social media. Then, too, we were told technology would make us smarter, kinder, more engaged.

Ask parents and teachers today if smartphones and social media have fulfilled those promises, and the answer is mixed at best. Research shows targeted, well-supported tech use can help students build certain skills. But excessive or poorly integrated technology undermines focus, reduces deep learning, and worsens inequities when access to devices, internet, and guidance isn’t universal.


Meanwhile, parents are sounding alarms about screen time. Surveys by Common Sense Media and Pew Research show most worry about its effects on children’s sleep, physical activity, and emotional well-being. It’s one reason schools are restricting phones — yet we’re being sold more tech in classrooms under the guise of innovation.


And we can’t ignore the business model driving this push. Companies have a financial incentive to get students hooked early. The longer kids use these tools, the more data is collected, the more the systems become ingrained, the harder it is to opt out. Social media platforms built their empires this way. Educational AI risks repeating that pattern, cloaked in the language of “personalized learning.”


The Pattern We Keep Repeating

I respect Cuban’s passion for public education. But his unchecked optimism fits a familiar pattern: privileged men declaring technology will save us, while lived experience from marginalized communities shows it often deepens inequity.


Educational disparities aren’t caused by a lack of technology; they’re sustained by systems of inequity. Dropping AI into classrooms without addressing those systems won’t close gaps. It will hardwire them into the next generation.


Cuban calls the AI race an “arms race.” I say scrap the war talk. Let’s speak in the language of building, cultivating, sustaining — treating AI as a tool for collective growth, not conquest.

The real question isn’t whether AI will shape education. It’s whether we will shape it to serve the public good. That requires asking:


  • What can AI actually do that teachers and existing tools cannot — and are the trade-offs worth it?

  • Who owns and controls the data?

  • How do we prevent bias, surveillance, and inequity from being baked into the system?

  • What environmental costs are acceptable, and who gets to decide?


Unchecked optimism may make for flashy soundbites, but without clear-eyed questions and equity-driven policies, AI risks becoming the most efficient inequality machine we’ve ever built.


Why This Matters for My Work


As someone who has spent more than 15 years designing and evaluating youth programs, I’ve seen how quickly technology can either expand or constrict opportunity depending on the systems around it. Through my consulting work, I partner with schools, nonprofits, and communities to navigate emerging technologies like AI with equity and wellbeing at the center.

The stakes couldn’t be higher. AI will shape classrooms — but whether it does so in ways that cultivate curiosity, protect students, and promote fairness depends on the choices we make now. My job, and my commitment, is to help communities ask the right questions before the hype takes over.
