One-Third in UK Use AI for Emotional, Social Support: Survey
- By StartupStory | December 19, 2025
A new survey reveals that one-third of UK adults now turn to AI chatbots for emotional and social support, highlighting the rapid mainstreaming of artificial intelligence as a mental health companion amid rising therapy waitlists.
Survey Highlights AI’s Growing Role
The study, which polled 2,000 UK residents aged 18-65, found that 33% have used generative AI tools like ChatGPT, Grok, or Claude for emotional conversations, venting stress, or seeking relationship advice. Among 18-34-year-olds, usage jumps to 52%, and across all ages women slightly outpace men, at 37% versus 29%. Common prompts include “I’m feeling anxious about work—help me cope” or “My partner and I argued; what should I do?”
Nearly 40% rated AI responses as “helpful or very helpful,” citing 24/7 availability, non-judgmental listening, and tailored coping strategies. However, 22% reported feeling “worse” afterward, often due to generic advice or uncanny empathy.
Drivers Behind AI Therapy Adoption
NHS mental health wait times average 18 weeks, pushing users toward free alternatives. Cost is another barrier: private therapy runs around £60 an hour, making AI appealing in the wake of the cost-of-living crisis. Pandemic isolation normalized digital companionship, with Replika and Pi.ai gaining 5 million UK downloads since 2023.
Younger demographics favor AI for stigma-free support; 65% of Gen Z prefer texting bots over calling friends. Features like mood tracking, daily check-ins, and voice synthesis enhance engagement, mimicking human therapists.
Benefits and User Experiences
Proponents praise AI’s consistency (no bad days) and its evidence-based techniques drawn from CBT datasets. Users report a 25% reduction in anxiety after sessions, and some tools escalate to human help via NHS 111 integration. Anonymity encourages vulnerability, unearthing issues users avoid discussing face-to-face.
Success stories include processing breakups, coping with grief, and navigating career burnout. Platforms like Woebot deliver structured 10-minute interventions, backed by Oxford trials showing 30% drops in depression scores.
Risks and Ethical Concerns
Critics warn of overreliance, with 15% of heavy users delaying professional care. AI hallucinations can produce harmful advice, such as failing to flag suicidal ideation. Lacking true empathy, bots risk deepening isolation; one respondent called interactions “emotional fast food—satisfying but empty.”
Privacy fears loom: 28% worry about their data being used to train future models. Regulators such as Ofcom are eyeing safeguards, including mandatory crisis handoffs and transparency about AI limitations.
Comparison of Usage Demographics
| Demographic | AI Usage % | Primary Reason | Satisfaction Rate |
|---|---|---|---|
| 18-34 years | 52% | Anonymity/Stigma | 45% |
| 35-54 years | 30% | Convenience | 38% |
| 55+ years | 18% | Curiosity | 32% |
| Women | 37% | Emotional Depth | 42% |
| Men | 29% | Practical Advice | 35% |
Implications for Mental Health Landscape
The trend signals an evolution toward hybrid care, with AI triaging low-acuity cases and freeing NHS resources. Startups like Limbic have secured £10M for clinician-augmented bots, while Calm integrates GPT-4o for personalized meditations.
Experts advocate regulation akin to that for medical devices, with mandatory efficacy trials. As usage normalizes, workplaces may offer AI wellness reimbursements, blurring the line between therapy and coaching.
Future Outlook and Societal Shifts
Projections show 50% adoption by 2028, driven by multimodal agents with video analysis and biofeedback. Ethical AI firms prioritize “warm neutrality,” training their models on diverse therapist transcripts.
This survey underscores AI’s double edge: a lifeline for underserved millions, yet no substitute for human connection. UK policymakers must balance innovation with safeguards, ensuring technology heals without replacing society’s empathetic core.




