
ChatGPT is increasingly substituting critical thought

OpenAI’s latest usage report on ChatGPT sheds light on how people are moving away from having the chatbot complete ‘doing’ tasks and are instead leaning on it for advice, problem solving, and outsourced judgement calls. Is this cause for concern?

You know who doesn’t ever get brain fog? ChatGPT.

Use of OpenAI’s chatbot has become nigh-on ubiquitous, and people are leaning on the technology for just about everything in 2025. It’s gone from a barely entertaining gimmick to a life fixture for countless people within the last year.

The company’s latest analysis, based on 1.5 million conversations between July 2024 and July 2025, shows how habits with the chatbot have evolved.

Writing tasks, once dominant, dropped from 36 percent of usage to just 24 percent. By contrast, information-seeking queries nearly doubled from 14 percent to 24 percent, while requests for practical guidance — tutoring, troubleshooting, step-by-step instructions — remained steady at 29 percent.

Arguably the most pertinent stat: advice-driven interactions now make up more than half of all ChatGPT usage. In OpenAI’s terms, ‘Asking’ has overtaken ‘Doing’, essentially marking the chatbot’s pivot from ghost-writer or PA to life-coach. It’s what the people want.

This change is easy to understand. ChatGPT is quick, confident, and endlessly available. It offers ready-made clarity where the internet often provides noise. The problem is that clarity is not the same thing as critical thinking. Where even Google requires some effort — sifting sources, cross-checking, identifying bias — ChatGPT delivers conclusions neatly packaged in a single paragraph, usually rounded off by a nice kiss-arse sentiment.

Ironically, you could argue that convenience is the product, while also being the major trade-off… like a lot of things in 2025.

Critical thinking is a skill that relies on repetition, on literally flexing the brain’s muscles. Without the practice of weighing competing ideas and drawing one’s own conclusions, that skill dulls. Outsourcing reasoning to a machine may not feel like a dramatic loss in the moment, but over time it changes the way problems are approached altogether, and it starts to border on the dystopian.

The data points in that direction. With ‘Asking’ now representing more than half of all activity, and ‘Doing’ on the decline (no doubt due in part to the proliferation of AI-detection platforms like GPTZero), people are eerily comfortable letting ChatGPT perform the work of interpretation as well as execution.

The implications are larger than just productivity. If critical thinking is treated as optional, the consequences extend into education, politics, and, most importantly, our social lives.

Decisions made with the help of an AI may be faster, but they risk being shallower, less interrogated, more homogenous, and tied to what the model thinks you want to hear. In reality, maybe that toxic boyfriend does need chucking, or you should leave a margin for error while travelling cross-country. Even Sam Altman himself warns against using it for therapy.

Critical thought is already harder to come by these days, and it’s not just an endearing trait: it’s essential to developing the additional skills needed to succeed in life. ChatGPT won’t impress an employer in an interview, or give you the ability to advise your friends and family through their own quarrels.

The fact we’re living in a world where critical thought is increasingly viewed as optional is scary. Can we have anything in moderation anymore?
