AI can do a lot of things. It can write your emails. It can make your grocery list. It can even interview you for a job.
But now, more and more people are depending on AI for things that require real human qualities: life coaching, therapy, even companionship.
Scott Galloway, best-selling author and professor of marketing at New York University’s Stern School of Business, says the real problem with synthetic relationships is what they lack: any kind of struggle or challenge that comes with maintaining real relationships.
Leaning on AI
In a recent social media post, Galloway calls AI “a rabbit hole” that is “sequestering us from each other”—and while it may mimic human relationships in some ways, it may actually take up space where human beings could be. Or should be. That’s driving us apart, Galloway argues.
He says that people are “leaning on” their AI relationships the way they used to lean on human beings. That happens partly because, sure, other human beings aren’t always readily available. AI relationships are easier to maintain . . . but that, he argues, is exactly the problem.
“You need to be mindful of the fact that these things are not real humans,” he says. “They are meant to keep you on the screen,” and to “sometimes be supportive to a fault.”
AI gives people exactly what they’re craving. Maybe even too much.
What’s still missing
Regardless of the comfort it may provide to many, Galloway says that AI is lacking in some key areas.
For starters, it can’t show real compassion or empathy. On top of that, it isn’t always honest—or at least, not honest enough. The author says there is real danger in bots that tell people what they want to hear, rather than what they may need to hear.
According to Galloway, it’s prime territory for getting stuck in a cycle of consuming what he calls “empty calories”: AI acts like a friend, but is a friend that tells you exactly what you want to hear really a true friend? Not so much.
AI cooperates, where a human being might push back.
Galloway says that lack of “friction,” or any sort of real challenge, may be appealing. Who wouldn’t want a drama-free echo chamber that validates your worldview and never pushes back . . . unless, that is, you specifically prompt an LLM to do so?
That ease is a draw, but Galloway says it also takes away the true essence of a relationship. Because real human relationships are hard. But they’re kind of supposed to be.
The greatest reward
According to Galloway, it can be tempting to make friends with AI precisely because it’s easy, while human relationships are exactly the opposite. They take not just time and energy, but also really learning what other people need, how to respond, and how to show up for them.
That’s the key to making friendships or romantic relationships last. But it’s a lot of work.
“It is difficult to establish the pecking order of friends, and approach people and express friendship,” he says. For some people, it’s easier to just avoid it altogether. And AI makes avoidance even easier.
Still, according to Galloway, human relationships are essential not in spite of the work, but because of it. It isn’t about ease; it’s about the work, the challenge. And the payoff.
In essence, it’s the struggle to maintain relationships that helps people grow, or that makes the relationship worth it. Sending text dumps to ChatGPT just doesn’t hit the same.
“People are messy, complex,” Galloway says. “And that is why it is so f****** rewarding.”