AI Overdependence: When ChatGPT Becomes a Crutch for Independent Thought
A woman has voiced deep concerns about her boyfriend's escalating reliance on artificial intelligence, fearing it is eroding his capacity for independent thought. Her partner, a 44-year-old with ADHD who runs his own business, has integrated AI so thoroughly into his daily life that he now struggles to perform tasks without it.
The Extent of Dependence
While AI has revolutionised his work by handling administrative and mundane tasks, his usage has expanded far beyond professional boundaries. He consistently opts for ChatGPT even when superior non-AI alternatives exist, for example asking it for train times instead of using a dedicated app such as Trainline, despite the chatbot's lower accuracy. The scale of this behaviour is underscored by his recent ChatGPT Wrapped results, which placed him in the top 0.3% of users globally.
The woman worries not only about his diminishing self-reliance but also about the environmental impact of such intensive AI use. She acknowledges AI's utility in his business but is alarmed by its omnipresence in all aspects of his life, describing it as a potential symptom of underlying anxiety rather than merely a tool.
Expert Insights on AI and ADHD
Consultant clinical psychologist and psychoanalyst Dr Stephen Blumenthal suggests society may be approaching a new diagnostic category: 'chatbot overdependence syndrome'. He warns that while AI can be beneficial when used judiciously, overreliance risks disastrous consequences, including the loss of ordinary functioning capabilities.
"For individuals with ADHD, who often have shorter attention spans and difficulties with focus and planning, AI serves as a perfect fit," Blumenthal explains. "However, this very compatibility increases the propensity for overdependence, where the technology transitions from an aid to a crutch."
Henry Shelford, CEO of ADHD UK, posits that the boyfriend's AI use might stem from pre-existing struggles, with AI acting as a 'flotation aid'. "AI can support structuring thoughts, scheduling, and task completion, but it also has the potential to lead users down unproductive rabbit holes," Shelford notes. He emphasises that the boyfriend's behaviour appears to reflect self-doubt, which can be particularly insidious.
The Human-AI Relationship Dynamic
Blumenthal highlights a critical issue: when AI usage extends beyond problem-solving to fulfil emotional needs. "Problems arise when a relationship with AI develops, imbuing it with human qualities as a projection of our desires for validation and care," he says. This anthropomorphism can exacerbate dependency, making disengagement feel threatening.
Strategies for Compassionate Intervention
Approaching the topic requires sensitivity to avoid nagging, which experts agree is counterproductive. Shelford recommends initiating a calm conversation by asking, 'What are you getting out of it? Why is this tool such a big deal, and what gaps is it filling?' This approach aims to identify underlying issues and explore better solutions or moderated usage.
Blumenthal stresses the importance of recognising the problem compassionately. "As with any overdependence syndrome, acknowledgment is the first step. Criticism may drive the individual deeper into dependency, so the case must be made with empathy, understanding that losing ChatGPT's support likely feels like a threat," he advises.
A Path Forward
There is cause for optimism in the boyfriend's history of functioning well without AI, unlike younger generations who are growing up with it. Experts believe he can be reminded of his inherent abilities and find a balance in which AI augments rather than replaces his skills. However, addressing the root causes of his anxiety is crucial if both partners are to move forward positively.
This case underscores broader societal concerns about AI integration, urging a mindful approach to technology that preserves human autonomy and mental well-being.