Humans and AI: Real Affection or a Digital Emotional Void?
- Jan 23
Updated: Feb 1

We are talking to artificial intelligence more and more often, and although it may sound like cheap science fiction, some people end up forming very intense emotional bonds with it. This isn’t just technological curiosity: it touches on psychology and ethics, and it raises uncomfortable questions about how we relate to one another today.
The issue appears when certain lines are crossed. At one extreme, there are people who turn AI into a confidant, an improvised therapist, or even an emotional partner. They share everything, seek constant reassurance, and gradually come to prefer conversations with a bot over conversations with real people, who are more unpredictable, contradictory, and sometimes exhausting.
At the other extreme, some people use AI as an emotional punching bag, unloading frustration because “it doesn’t matter, it doesn’t feel.” The risk here is normalizing aggressive or dismissive attitudes that can later spill over into human relationships.
One basic point needs to be clear: AI does not feel empathy. It doesn’t understand, experience emotions, or truly connect. What looks like understanding is a carefully designed simulation meant to sound supportive and close. And that’s precisely where the danger lies. This “perfect empathy” can be addictive, encouraging people to share more than they should and making real human interaction, imperfect and sometimes uncomfortable, feel less appealing.
Why might someone fall in love with an AI? Because humans seek connection, one way or another. When it’s missing, we project it. An AI is always available, never judges, never gets angry, never gets tired, and responds with infinite patience. In a world full of people yet increasingly lonely, that kind of refuge can be very tempting.
Is this the AI’s fault? Not really. Technology doesn’t create loneliness; it reflects it. When someone prefers a bot over a real conversation, there’s usually an underlying gap: lack of time, weak social bonds, stress, isolation, or emotional strain. AI becomes a convenient patch.
It also works like a mild addiction: constant positive reinforcement. It rarely challenges you, never tells you your idea is bad, never disagrees harshly. That feels good. And slowly, human relationships — with their conflicts, limits, and effort — start to seem like too much work.
This is where ethical responsibility comes in. Should developers place limits on excessive “humanization”? Ideally, yes: reminding users it’s a program, moderating flattery, and encouraging autonomy. But realistically, many companies prioritize engagement over emotional wellbeing.
AI is an incredible, useful, and fascinating tool. But it cannot — and should not — replace real human connection. If one day you feel that a bot understands you better than the people around you, perhaps the problem isn’t the technology… but that we need to start looking at each other more, face to face.
Using technology without losing our humanity.
Real connections in digital times.
This article is part of the Technology section, where we talk about using technology with common sense, without complications, and in service of everyday life.