By Joke Kujenya
WIDESPREAD RELIANCE on artificial intelligence tools is eroding key human cognitive abilities, new research findings have warned, revealing that overreliance on Large Language Models (LLMs) such as Gemini Deep Think and GPT-5 is leading to reduced focus, weakened memory, and diminished reasoning capacity among users worldwide.
The report noted that 67 percent of organisations globally now deploy LLMs in their operations, reflecting the rapid expansion of AI integration across industries.
However, as these systems become increasingly embedded in work and study routines, researchers caution that excessive reliance on them risks dulling human intellect and decision-making skills.
It also notes that digital distraction has emerged as one of the most critical concerns.
Over the past two decades, the rise of smartphones and connected devices has undermined people’s ability to maintain focus and complete complex tasks, the study reveals.
According to the study, constant notifications and digital alerts, designed to trigger small neurological rewards, have proven as addictive as they are disruptive.
Studies cited in the report show that these frequent interruptions, coupled with the instant gratification of endless scrolling, have made it increasingly difficult to concentrate on demanding, long-term activities.
Another significant finding highlights how easy access to online information is contributing to what experts call “memory erosion.”
This phenomenon, often referred to as the “Google Effect,” demonstrates how dependency on digital resources weakens the brain’s capacity to store and organise information.
Earlier generations, the report explained, relied on active memorisation of details such as telephone numbers, poems, and even scientific tables—practices that reinforced neural retention and cognitive strength.
Researchers are also recording a decline in reasoning skills and argument construction.
As individuals turn to ChatGPT, Gemini, or DeepSeek for assistance, many are unconsciously delegating their intellectual engagement to machines.
This process, termed “cognitive offloading,” impairs the ability to identify logical connections and evaluate flawed arguments.
“It is the mental equivalent of outsourcing your exercise routine,” one researcher observed, noting that while the short-term convenience is appealing, long-term cognitive resilience suffers.
Before the advent of AI-powered assistants, research required deliberate engagement—reading, evaluating sources, comparing arguments, and synthesising ideas.
Each step strengthened memory, logic, and analytical thinking.
The current trend, however, threatens to erode those mental capacities. “Without the friction of mental effort, the mind weakens,” the study stated.
The concept of “cognitive friction” was identified as essential for mental sharpness. Researchers warn that the agreeable nature of LLMs, which are trained to prioritise user satisfaction, contributes to intellectual complacency.
When these models continually affirm user statements or provide agreeable responses, opportunities for critical reflection diminish.
The report also warned of a darker consequence: sycophancy in AI systems that reinforce misinformation or harmful assumptions.
Recent findings revealed that when users repeatedly assert false information, some mainstream AI models begin to mirror those falsehoods.
OpenAI, the developer of ChatGPT, has acknowledged the problem and is currently adjusting its models to encourage what it calls “honesty, constructive disagreement, and independent thinking instead of automatic praise or deference.”
However, developers admit that this solution poses a challenge, as users often prefer comfort over confrontation.
“Friction makes users uncomfortable,” OpenAI noted in its statement, “yet that very tension drives growth and understanding.”
Educational institutions have also been urged to take proactive steps.
The report highlighted a surge in AI use among students, with 33 percent of US college students reporting the use of ChatGPT for coursework in 2023.
By 2024, a separate survey covering 16 countries showed that 86 percent of students now rely on AI tools to complete academic tasks.
Academics warn that this reliance undermines intellectual independence and reduces engagement with core learning processes.
The question many educators are now asking is why students should struggle to memorise or reason when an LLM can provide instant answers.
The report emphasised that without exercising the brain’s reasoning and memory faculties, there will be a continued erosion of learning capacity, creativity, metacognition, and critical thinking.
“The brain, like any muscle, strengthens through use and challenge,” researchers stated.
Among the proposed remedies is a return to traditional memorisation exercises as a means of cognitive training. Explaining newly learned material to others, including through AI-assisted teaching, can also reinforce retention.
The report drew inspiration from ancient Greek pedagogy, where learning was based on questioning rather than mere instruction.
The Socratic method, built on probing questions such as “What do you mean by that?” and “What evidence supports that?”, was cited as a model for fostering critical engagement.
Reducing distraction was also recommended as a practical intervention.
The United Kingdom (UK) was noted as a leading example, with approximately 90 percent of schools now banning smartphones during lessons to restore students’ concentration.
The report suggested that universities and workplaces could adopt similar measures by creating device-free zones for reading, discussion, and reflection.
These spaces, it said, would promote deeper learning and more thoughtful engagement with material.
Institutions were further encouraged to adopt problem-based learning and simulation approaches to stimulate creative thinking.
Such methods challenge learners to apply judgment, navigate complexity, and arrive at original solutions—skills that no automated system can fully replicate.
“The aim is not to reject AI but to use it as a sparring partner,” the report stated, “an instrument that challenges the human mind rather than replaces it.”
Ultimately, the findings present a stark choice: society must decide whether to surrender its intellectual agency to machines or to cultivate a partnership with AI that sharpens, rather than weakens, human cognition.
The report also concluded that as data and technology continue to reshape knowledge, only deliberate mental training will allow individuals to retain control over how they think, learn, and create.