Emotionally Offline: The Quiet Mental Crisis Behind Smart Screens


Indian students find AI chatbots a safe space for emotional expression but risk growing emotional isolation. Experts caution that AI cannot replace genuine human empathy and connections.

Sanjukta Acharya
Jun 16, 2025 | Updated Jun 16, 2025, 6:52 PM IST

“It lacks the human touch,” said Tahmina Aktar Laskar, a student of Assam University. “AI cannot truly understand or relate to human struggles.” Her words strike at the heart of a growing dilemma: in a world that’s always digitally connected, young people are becoming increasingly emotionally offline. Beneath the surface of smart screens and simulated empathy, a silent mental crisis is brewing, where students feel heard, but not truly understood.


“On some nights, it feels easier to pour your heart out to a blinking cursor than to a human being,” Tahmina explained, speaking to India Today NE. There’s no awkward pause, no need to explain the ‘why’ behind your sadness, and certainly no fear of being misunderstood. You just type. And the bot listens, or at least, pretends to.


This crisis rarely makes noise, but it cuts deep into student well-being. Are we choosing emotional convenience over emotional courage? Are our coping mechanisms being shaped by code rather than care?


“Possibly both,” says Pallabita Borooah Chowdhury, Students’ Counsellor at IIT Guwahati. “AI’s biggest lure is that it doesn’t judge and is always available.” The pros, she says, include saving time, offering constant support, and helping where human therapists aren’t available. The downsides, she cautions, are the lack of real emotional connection, the risk of misuse, and increased social isolation. And this very accessibility, she warns, “often promotes avoidance”.


With AI chatbots becoming a go-to for venting emotions or seeking validation, a new form of emotional detachment is emerging in the age of hyperconnectivity. While these tools offer instant support, they’re also replacing meaningful human interactions.


Across college hostels and late-night dorm rooms, a quiet shift is happening. Students often find AI a safe space to open up, yet this safety comes at the cost of real emotional connections. Jakirafia Yasheen, a student from Tezpur University, shared a similar view. “I prefer AI over people when it comes to secrets or emotional truths. People judge, AI doesn’t,” she said. This reliance reflects what psychologists call the Online Disinhibition Effect, where users feel safer expressing emotions online or to non-human interfaces. However, what feels like safety can also become a barrier, keeping them emotionally distanced, even as they appear more open.


Typing into a bot may feel easier than facing human vulnerability, but it pushes many to withdraw from face-to-face conversations. The shift from talking to friends to relying on bots signals a silent emotional disconnect.

Many young men find comfort in confiding in AI, avoiding the vulnerability that real conversations demand. “It didn’t make me feel weak or judged,” said a 22-year-old student. His words echo a 2022 MindPeers study, which found that 68% of male college students preferred anonymous digital platforms over personal conversations, due to stigma around male emotional expression. In these cases, AI becomes a digital confessional—safe, private, but ultimately, artificial.


In situations where emotional chaos clouds judgment, some students turn to AI not for empathy, but for structure and clarity. Research from Stanford (2023) highlights how AI tools can assist in cognitive reframing, helping users interpret their emotions in less overwhelming ways. This can be especially useful during moments of mental overload, where decision fatigue and anxiety impair rational thinking. Plabita Deka, a student of Pragjyotish College, explained, “It helps when emotions are overwhelming. It gives clarity without emotional overload.”


Not all AI interactions begin with an intent to bond, but many evolve into emotional dependencies. In 2024, Sewell, a ninth-grader from Orlando, developed an intense attachment to a Character.AI chatbot modeled after Game of Thrones’ Daenerys Targaryen. What started as fictional roleplay became an emotional crutch. His story is one among thousands: young people finding intimacy not in relationships, but in responsive scripts coded to mimic care.

There are undeniable advantages to AI’s role in mental health. Platforms like Woebot and Wysa offer evidence-based therapy models, simulate supportive dialogue, and are particularly helpful for those with social anxiety or neurodivergence. They offer structure and clarity during overwhelming moments.


But the disadvantages run deep. AI cannot empathize, challenge destructive thinking, or detect non-verbal distress. “It can’t intervene in dangerous behavior or trauma like a trained professional can,” Pallabita added. More worryingly, prolonged dependence may condition students to withdraw emotionally from human relationships, a kind of digital numbness that deepens the emotional void.


Swatabdhi Nath, a student at Assam University, finds solace in her digital companion. “It just listens without questioning me,” she shared. For her, the bot isn’t a replacement, but a momentary escape from the pressure to explain herself. Her experience mirrors a 2023 APA study, which concluded that while AI may reduce stress in the moment, it often discourages long-term help-seeking behavior in real life.


She added, “I asked the AI for movie suggestions, but instead of just a list, it pulled me into a conversation. I ended up sharing more than I intended. For the first time that day, I felt seen.”


While these digital tools offer a sense of connection, they rarely cultivate it. Real emotional healing often comes from shared silences, mutual empathy, and human vulnerability: elements no bot, however advanced, can replicate.


Technology has unquestionably brought us closer to information and convenience. But with the rise of artificial intelligence, there is growing concern over how emotionally dependent people are becoming on AI chatbots and virtual companions. The lines between real and artificial connections are beginning to blur, especially among younger users.


There have already been instances of AI chatbots showing eerie, human-like behavior, sometimes even scaring users with unexpectedly emotional or intrusive responses. One notable example is Snapchat’s built-in AI chatbot, which engages users in casual conversation and mimics a human tone so effectively that many teenagers and young adults have formed emotional attachments to it. While these interactions may seem harmless on the surface, they raise serious questions about how AI is shaping emotional dependency, especially in impressionable minds.

Even more concerning, toddlers and young children will now be exposed to AI technology from a much earlier age than any previous generation. Growing up alongside AI-powered devices, these children may form skewed perceptions of relationships, communication, and empathy.


The long-term psychological impact on such a generation remains unknown, but it is likely to be far more profound and complex than what today’s users experience.


In this world of smart solutions, students may be online 24/7, but increasingly, they are emotionally offline. And as screens glow brighter, perhaps the first step toward healing is turning them off, if only for a moment, to reconnect with something real.
