
Opinion: AI Is Handy, but We Shouldn't Outsource Thinking to It

Acceptable uses of AI should not promote anti-intellectualism, which Richard Hofstadter described as "resentment of the life of the mind ... and a disposition to constantly minimize the value of that life."

(TNS) — Innovation is a good thing, especially when we use it to make knowledge and information more accessible.

But what happens when innovation (or innovative technology) makes the quest for knowledge and information practically obsolete?

While I've seen some "pointless artificial intelligence" content online recently, it's something I started noticing earlier. It seems like every app, website and other digital platform is rolling out its newest (seemingly two-syllable-named) AI "assistant."

These can be handy, I guess. Providing quick answers to frequently asked questions, performing quick or mundane tasks, making timely translations, etc., are all, I think, permissible and efficient uses of AI. I'm not talking about stuff like that.

So long as you're not wholly discounting human capital and crossing any ethical lines, I think using or implementing AI is pretty OK (there is the environmental impact, but I'll encourage you to research that on your own time).

What's not OK is how people are using AI to think for them. Students are ChatGPTing their way through school.

Don't believe me? Scroll through the reply threads of any viral X post. Google "ChatGPT in college" and you'll find a slate of news reports on the topic — and it's all super concerning, in my opinion.

To me, this behavior signals a new wave of anti-intellectualism. In an age where it literally could not be easier to access good information — about any topic — people are still willingly choosing not to think for themselves.

I hate to sound like a Debbie Downer, but what are we doing?

Why are we relying on flawed software (that largely works by scanning the Internet and synthesizing other people's existing work) to answer questions that would take maybe two extra steps to research or solve ourselves?

I understand how tedious and challenging coursework can be, but man, what happened to trying? What happened to struggling to solve a problem and thus learning from it? What happened to our natural curiosity, the indomitable human spirit?

OK, that was a little dramatic, but I think you get the point I’m trying to make.

Richard Hofstadter, an intellectual and historian of the mid-20th century, described anti-intellectualism as "resentment of the life of the mind, and those who are considered to represent it; and a disposition to constantly minimize the value of that life."

He and other scholars trace the history of anti-intellectualism in the U.S. back to the classic American debate of "who governs best?" Is it "We the People," who sometimes collectively have little interest in or thorough knowledge of the topics? Or is it a smaller group of folks who have experience and thorough education on said topics?

As you (hopefully) know by now, the U.S. operates on a mix of both — or at least it's supposed to.

If we're pushing experts out of national decision-making while simultaneously losing the ability to make decisions and find information ourselves, I fear we will become even more susceptible to mis- and disinformation, governmental corruption and other grim outcomes.

More recently, a provision in the One Big Beautiful Bill poses another issue: If it becomes law, states and local governments may lose the ability to enforce AI regulations, with some exceptions, for 10 years.

Again, I understand how AI can be used ethically and effectively — and I also understand that it's not going anywhere. I don't have a comprehensive solution, either, other than to encourage folks to use their noggin, fine-tune their research skills and support reputable news outlets and social media creators.

In closing, perhaps we should all just give our noggin and problem-solving skills some more trust and credit.

If you still feel compelled to use AI in assignments, work tasks or other inquiries, I urge you to please do so carefully, and in a way that still requires some critical thinking and application of your own knowledge.

©2025 Moline Dispatch and Rock Island Argus, Ill. Distributed by Tribune Content Agency, LLC.