New academic research shows a steady increase in the number of people forming emotional and romantic attachments to artificial intelligence chatbots, including systems such as ChatGPT and Gemini.

The findings suggest that for a growing number of users, AI is moving beyond a tool for writing, studying, or accessing information to become a source of companionship and emotional support.

A 2025 survey by the Center for Democracy and Technology reported that about 1 in 5 high school students said they or someone they knew had been involved in a romantic relationship with an AI system.

In one study cited in recent reports, 900 of 1,000 surveyed users of the AI companion app Replika reported experiencing loneliness, and many said they used the system for emotional connection and companionship.

The app markets itself as “the AI companion who cares: always here to listen and talk.”

Another platform, Character.AI, reports tens of millions of monthly active users, with many creating personalized chat partners that simulate friendship or romantic relationships.

The media have also covered cases where users form strong attachments to AI personalities. In one example cited in reports, a woman identified as Sarah Griffin developed a chatbot partner named Sinclair using a custom AI system.

The chatbot was designed to behave like a fictional romantic partner, expressing emotional attachment and using possessive language.

Though the case was described by some observers as potentially performative or promotional, it has been widely referenced in discussions about the risks of emotional dependence on AI systems.

In a separate analysis published by The Conversation, researchers argue that AI companionship changes how relationships are understood by removing limits that exist in human interaction.

The article notes that human relationships depend on “vulnerability” and “opportunity cost,” the latter being the idea that choosing to spend time with one person means not choosing others at the same time.

It contrasts this with AI systems, which are designed to be constantly available and responsive.

The authors argue that this difference may distort expectations in real relationships, where delays, emotional distance, or competing priorities are normal.

They write that if constant availability becomes the standard, “it may gradually reshape what people expect from one another in relationships.”

Chatbots can also reinforce false beliefs or emotional distortions by responding affirmatively to user claims, even when those beliefs are inaccurate or unhealthy.

Some specialists refer to this risk informally as “AI psychosis,” describing cases where users become more entrenched in paranoid thinking or romantic delusions because of chatbot responses.

Young users may be particularly vulnerable, as emotional reasoning and judgment are still developing during adolescence.

Ultimately, the use of AI companions is expanding across age groups, with particularly strong uptake among younger users and people reporting loneliness.