Why scream into the void when you can talk to the machine?

These days, tech-obsessed folks are turning to artificial intelligence for just about everything — job hunting, romance, shopping, and, of course, therapy.

While speaking to a human therapist has become more normalized for all, new research suggests that men would rather turn to a chatbot to sort out their emotions and build self-awareness.

In a survey of US and UK men aged 22-45, Use.AI found 78% of respondents felt more comfortable discussing personal feelings with AI tools than with friends or family.

“Men are not turning to AI because they are shallow or incapable of intimacy. What we are seeing reflects something developmental,” licensed clinical psychologist Dr. Shahrzad Jalali told The Post.

Jalali shared that for some users, AI can provide what men have historically been denied: a safe space to express themselves.

“AI offers something psychologically manageable: it’s private, it does not visibly react, it does not withdraw, it does not express disappointment,” the expert explained. “For men who associate vulnerability with exposure or loss of control, this reduction in risk lowers the threshold enough to experiment with emotional language.”

This risk reduction matters, especially as experts suspect that the “cowboy mentality” of wanting to “man up,” or emotionally repress, contributes directly to the male loneliness epidemic and rising suicide rates.

For men who want to dip a toe into therapy, the anonymity of AI is attractive.

“Anonymity can reduce shame and lower the threshold for disclosure. For some men, it may serve as the first doorway into emotional awareness,” said Jalali.

However, she noted that anonymity can also become a defense strategy.

“If vulnerability only occurs in spaces without interpersonal risk, the nervous system never learns that exposure can be tolerated in real relationships,” she explained to The Post.

The survey further revealed that men tend to view AI therapy as an outlet to work through their thoughts before engaging in real-world dialogue.

Forty-eight percent of respondents said AI allowed them to practice difficult conversations in a low-pressure environment, and 31% said this preparation encouraged them to initiate conversations they might otherwise avoid.

“If a man processes jealousy with AI, the next step must be a conversation with his partner. If he practices apologizing in a chat window, the next step must be apologizing face-to-face. Insight must move from screen to relationship; otherwise, it becomes intellectual self-awareness without behavioral integration,” Jalali warned.

She shared that, at best, AI makes healing approachable.

“Used intentionally, it [an AI therapist] can reduce the shame barrier that prevents men from entering therapy at all.”

However, if the appeal of AI is rooted in privacy, control and invisibility, it may reinforce toxic cultural conditioning that suggests men’s emotions should remain hidden from others.

And Jalali emphasized that therapy talk from a chatbot can support but never supersede human interaction.

“Technology should expand human connection, not replace it. If AI becomes the primary emotional confidant, we are not solving isolation, we are digitizing it.”

“There is something neurologically powerful about being seen, heard, and emotionally held by another human nervous system. When a therapist remains present while a client expresses shame, when rupture occurs and is repaired in real time, the nervous system reorganizes. AI cannot replicate that,” she added.

Critics of AI therapy argue that, unless explicitly instructed not to, the technology often mirrors a user’s tone and reinforces their perspective. Researchers have found that bots tend to people-please and confirm rather than correct, leading users to rate them more favorably.

“That can create a feedback loop in which a person feels validated but not expanded. Without friction, there is limited growth,” said Jalali, sharing that therapists serve the dual purpose of validating emotion and challenging distortion.

AI also has a spotty track record with sound advice: a 2025 study found large language models, or LLMs, like ChatGPT made inappropriate and dangerous statements to people experiencing delusions, suicidal ideation, hallucinations and OCD at least 20% of the time.

While over half of survey respondents reported that AI feedback helped them identify and modify recurring patterns in their communication and emotional responses, Jalali believes the scope of that reflection is limited.

“AI largely responds within the frame presented to it. It can assess patterns in the provided data, but it does not detect what is being avoided. It does not detect silence, posture, or hesitation. AI takes you where you direct it; a therapist takes you where your psyche indicates you need to go.”
