Loneliness, social isolation affecting seniors, too
At a recent loneliness and social isolation committee meeting conducted by the Trumbull County Mental Health and Recovery Board staff, the role of artificial intelligence in social isolation was discussed.
As we found out, loneliness is not just affecting the young, but senior citizens as well.
An article we reviewed from the 2026 trends report highlighted that human-AI relationships, once the realm of science fiction, are becoming normal aspects of daily life.
While generative AI assistants such as ChatGPT, Claude and Gemini have become common tools for many users, a new wave of AI apps, such as Replika, Character.AI, and dozens more, are specifically designed to simulate human companionship.
The essential distinction between the assistant chatbots — which are sometimes used as digital friends — and companion AI chatbots is that the latter have been specifically designed to initiate and maintain romantic relationships.
Between 2022 and mid-2025, the number of AI companion apps surged by 700%, according to the technology news site TechCrunch.
And in 2026, they are poised to become even more embedded in our social lives. Marketed as friends, advisers and romantic partners, these apps now attract millions.
Many platforms offer both text and voice modes, with natural-sounding speech that mimics human cadence and tone.
Furthermore, they are engineered to recall and respond to users’ unique characteristics, including their personal lives, preferences and past conversations.
This may give users the impression that AI chatbots and companions know them intimately and serve as a refuge where they can disclose their innermost private thoughts and receive unwavering support in return.
The level of data privacy on many of these tools remains an open question.
A recent Harvard Business Review analysis identified therapy and companionship as the top two reasons people use generative AI tools, also known as large language models (LLMs). Psychologists, long aware of the loneliness experienced by many, are investigating how the growing prevalence of relational bonds between humans and AI will affect social skills, intimacy and mental health, and what this means for the year ahead.
AI companion apps and general-purpose chatbots can also offer a safe space for users to rehearse social interactions, provided they are designed and used responsibly.
“It is kind of like a low-stakes way to practice conversations with real people in a way that might feel less overwhelming,” said Ashleigh Golden, PsyD, chief clinical officer at Wayhaven, an AI wellness platform that supports coping skills and resource connection for college students. “With the right guardrails, these tools could actually serve as a social skills mentor, modeling empathy, appropriate turn-taking, and active listening for folks who are lonely.”
As we learned, there is hope, as well as concern, in dealing with AI.


