If Higher Ed Doesn't Change, Students Will - With Their Wallets
A professor's take on why academic tradition won't survive disruption
Earlier today, I responded to a post that got me thinking. (You can find it here.) The author did a solid job highlighting ways to adapt AI use in education, but what stuck with me was how much of what we call “higher ed” still feels stuck in time.
Here’s part of what I wrote in the comments:
“For the large majority of professions, traditional higher education needs a complete revamp. We can’t keep our heads in the sand... Not having students use AI is like telling them not to use the internet 30 years ago.”
I stand by that.
As a professor in healthcare, I have a little protection — most students still need to pass boards and licensure exams, which keep our assessments relatively grounded in proctored exams and oral presentations. But the moment we assign a project or paper, AI is in the room whether we acknowledge it or not.
And that’s the disconnect. Educators who treat AI as some futuristic add-on rather than the default tool students are already using are missing the point. Completely.
We’ve passed the stage of “Should students use AI?” and are squarely in the territory of “Can they use it well?”
Because the world is already adapting.
The Washington Post recently ran a piece warning that AI is replacing many entry-level, white-collar roles, and that if colleges don't evolve with it, they'll be pumping out grads for jobs that no longer exist (https://www.washingtonpost.com/opinions/2025/07/08/ai-entry-level-jobs-talent/).
The Learning Policy Institute flat-out says we need to redesign education from the ground up, focusing less on rote memorization and more on real-world application, project-based learning, and digital literacy (https://learningpolicyinstitute.org/blog/educating-ai-era-urgent-need-redesign-schools).
Deloitte released a full breakdown of how universities can systematically implement AI, not just in student learning but in faculty workflows and curriculum redesign (https://www.deloitte.com/content/dam/assets-zone3/us/en/docs/services/consulting/2024/oncloud-how-higher-ed-is-using-AI-to-innovate-for-growth.pdf).
So yeah, we’re not just talking about AI being the calculator in the backpack — we’re talking about an entire shift in how people learn, work, and prove value.
I’ve got a doctorate. And frankly, unless you’re aiming to be a physician or work in a hard science lab, I’d have a hard time recommending a PhD right now. It takes 10,000 hours, and most of that time could be better spent learning how to build something useful, scalable, and relevant with AI. Especially if the alternative is going into debt to write papers that a well-prompted chatbot could draft in 12 seconds.
Let’s get real.
Students don’t just need to know things. They need to build, test, and apply. And if we as educators aren’t helping them do that — using the best tools available — then we’re failing them.
We don’t need to panic. But we do need to pivot.
- Start with AI literacy.
- Use more oral or in-person assessments.
- Push students to create, not just consume.
- And above all, don’t underestimate what 10,000 hours of self-guided learning + curiosity + the right tools can do.
Stay curious. Stay flexible. Stay human.
– Dr. D

As a K–12 teacher, I definitely see where you’re coming from — AI is already in the room whether we acknowledge it or not. But I think it’s worth asking a deeper question here, especially from those of us in education: Do we really want to be handing over even more of how students learn and think to the same institutions pushing tech as the one-size-fits-all solution?
We’ve already seen what happens when “experts” tell us something is safe, necessary, or inevitable — whether that’s medical interventions or education reforms. The same people who once said “trust the science” about vaccines are now saying “trust the AI.” It’s the same pattern of compliance over curiosity.
The danger I see isn’t students using AI — it’s students losing the ability to think critically without it. We’re raising kids in an environment where the answer is always a click away, but discernment isn’t. AI might help with writing a paper or brainstorming, but it doesn’t replace wisdom, life experience, or genuine human judgment. And it definitely doesn’t teach kids how to challenge a narrative when that narrative is wrong.
You’re right: the conversation isn’t about if students use AI — it’s about how. But I’d argue it’s also about why we’re so quick to embrace it without questioning the long-term impact on autonomy, privacy, and independent thought.
Pivot? Sure. But not blindly. Not just because everyone else is.
The pivot is what I’m working on. More to come on that.