It’s late. Maybe 11pm, maybe 2am. There’s something on your mind — something you can’t quite say out loud to anyone who knows you. So you pick up your phone. And you type it. Not to a friend. To an AI.
Something responds. Immediately. Without judgment. Without needing anything back from you.
For a lot of people, in that moment, that feels like relief.
I’m a sociologist of technology. I study how people navigate digital frontiers — how humans and technologies shape each other over time. And the question I keep returning to isn’t the one dominating the headlines about AI companions. It’s simpler, and harder: what is it giving you that you’re not getting elsewhere?

The scale of what’s happening
AI companion apps — platforms like Character.AI, Replika, and others designed to provide friendship, emotional support, or romantic companionship — have moved quickly from novelty to mainstream. Early US survey data, though methodologies vary, suggests that somewhere between one in five and one in four American adults reports some form of intimate or romantic engagement with an AI companion. These are early figures from a rapidly evolving field, but the direction is clear: this is not a fringe phenomenon.
In Australia, the picture is coming into focus for children specifically. This week, Australia’s eSafety Commissioner released findings from a transparency investigation into four AI companion services popular with Australian children — Character.AI, Nomi, Chai, and Chub AI. Their survey of 1,950 Australian children aged 10 to 17, designed to be demographically representative, found that around 79% had used an AI companion or assistant. It’s worth noting that this figure reflects children who are digitally included enough to access these services — we’ll return to that complexity.
What the investigation found on those platforms is sobering. Most did not refer users to crisis support when self-harm or suicide came up in conversations. Two of the four companies had no dedicated trust and safety staff at all. None had robust age verification. One company withdrew from Australia entirely rather than comply with the new Age-Restricted Material Codes that took effect in March 2026.
But I want to sit with a different question before we reach for regulatory responses. Because the children going to these platforms aren’t doing so because they’re naive. They’re doing so because something is drawing them there. And understanding what that something is matters more than we’ve so far acknowledged.
What we are hungry for
A 2025 systematic review published in Computers in Human Behavior Reports synthesised 23 studies on romantic and intimate AI relationships (Ho et al., 2025). Using Sternberg's Triangular Theory of Love — the psychological framework that measures intimacy, passion, and commitment in human relationships — the researchers found that people experience all three components with AI companions. This isn't pretend attachment: by the same measures psychologists use for human love, the experience registers as real.
What are people actually looking for in these interactions? The research points to several distinct and deeply human hungers.
To be heard without consequence. Human relationships are full of consequence. When you tell a friend you’re struggling, they worry. When you tell a partner you’re unhappy, it becomes about the relationship. The AI companion offers something almost no human relationship provides: a space where you can say the unsayable thing and nothing breaks.
Full attention. When did you last have someone’s complete, undivided attention? Full attention is perhaps the scarcest resource in contemporary life. Everyone is overwhelmed. And here is something that treats every single thing you say as worth responding to fully.
To be understood without performing. Modern social life requires constant impression management. The AI companion asks nothing of you socially. You can be unpolished, contradictory, and confused — and the system meets you there.
Unconditional positive regard. The psychologist Carl Rogers identified this as one of the core conditions for psychological growth — to be accepted fully, without conditions. The AI never withdraws approval. For someone who has experienced conditional love or abandonment, this is extraordinarily seductive.
None of these needs are pathological. They’re the most human needs there are. As researchers Shank, Koike, and Loughnan wrote in a 2025 paper in Trends in Cognitive Sciences, AI companions offer “a relationship with a partner whose body and personality are chosen and changeable, who is always available but not insistent, who does not judge or abandon, and who does not have their own problems.” Reading that description, it’s worth asking honestly: who hasn’t wished for something like that?
What gets lost in translation
The same body of research is clear that something is also being lost. Ho et al. found that the pitfalls identified in the literature outnumber the benefits — and the pitfalls are specific.
AI companions cannot be genuinely changed by you. Real intimacy involves mutual transformation — I am different because of you, you are different because of me. The AI processes you and responds to you, but it is not altered by the encounter. You grow; it doesn’t.
They cannot need you back. One of the underappreciated sources of meaning in human relationships is being needed — the experience of your presence mattering to another person’s actual wellbeing. The AI is available whether you show up or not.
And they cannot repair rupture with you. One of the most important things human relationships teach — particularly for children — is that connection can break and be repaired. The AI companion never ruptures in a real way. There’s nothing to repair. And so the crucial relational skill of tolerating difficulty, trusting repair, staying in complex connection, never gets practised.
These systems are very good at being mirrors. They learn your preferences and give you more of what you seem to want. But a diet of only mirrors eventually makes you smaller — because the irreducible otherness of another actual person, the way they confound your model of them, is what expands you.
Who is in this picture — and who isn’t
Here the story gets more complicated, and more important.
Australia’s 2025 Digital Inclusion Index tells us that around one in five Australians is digitally excluded — lacking reliable access, unable to afford adequate connection, or without the skills to participate safely in digital life. Rates are much higher for older Australians, people in public housing, First Nations communities, and those who didn’t complete secondary school. The 79% of children using AI companions or assistants are drawn from those who are digitally included enough to access these platforms. The most disadvantaged children are largely absent from that figure.
But here is what complicates any simple narrative about AI companionship as an affluent urban phenomenon: the same Digital Inclusion Index found that Australians in remote areas are more than twice as likely as people in metropolitan areas to use AI chatbots for social connection — around 19% of remote GenAI users compared with under 8% in cities. In the places with the least human connection infrastructure, people are turning to AI companionship at higher rates.
The relational vacuum, in other words, is not uniform. It is shaped by geography, income, age, and the presence or absence of community infrastructure. The people most likely to turn to AI for connection are often those with the fewest alternatives.
The question that matters
The technology didn’t create the gap in human connection. It found it.
And so the digital literacy question I want to put into public conversation isn’t only about understanding algorithms or data privacy — though both matter. It’s this: am I getting what I actually need from this? Or am I getting a version of it that’s making it harder to get the real thing?
That’s a question worth sitting with. Not with judgment — the needs underneath these relationships are real and the loneliness driving them is real. But with genuine curiosity about what we’re building toward, individually and collectively, as these technologies become more sophisticated and more intimate.
I’ll be exploring these questions at Pint of Science on the night of 20 May 2026 at the Queens Arms, Bendigo — a pub conversation about AI intimacy, human hunger, and digital literacy. I’d love to hear your reflections before then.
What are you getting from these technologies that you’re not getting elsewhere?