Artificial Intelligence (AI) has entered nearly every corner of professional life, and mental health care is no exception. Diagnostic apps suggest possible conditions, chatbots deliver quick interventions, and software promises to ease the administrative burden on overextended clinicians. But in therapy, where the work hinges on human nuance, the question is not simply what AI can do; it is whether the profession can integrate it without losing what makes therapy effective.
Dayna Guido, a clinical social worker, educator, and author with more than forty-five years in the field, has become a leading voice in this debate. Her specialty in ethics and supervision places her at the crossroads of tradition and technology, guiding both seasoned clinicians and students through the promise and peril of AI.
“AI is a great tool,” Guido says. “But one tool, no matter what it is, doesn’t fit everything. And this one in particular doesn’t fit human relationships.”
Guido acknowledges that AI can provide useful support. Clinicians can consult it for general information, possible interventions, or organizational tasks like billing and documentation. For some clients, it can even offer a stopgap – an accessible way to generate coping strategies or track patterns between sessions.
But she is unequivocal about the line between assistance and replacement. “AI can tell you the top ten things to do for depression,” she explains. “What it cannot do is notice the long pause before a disclosure, or the way a client’s foot starts tapping when they’re anxious. Those are the cues that help us understand what’s really happening.”
For Guido, the difference is not subtle. Machines rely on inputs; they respond only to what is typed or spoken. Human therapists are trained to sense what goes unsaid, to read silences and gestures, to attune their own nervous system to the client’s. “We know from decades of trauma therapy that the body carries meaning,” she says. “AI can generate words about somatic practices, but it can’t recognize when a body in the room is dysregulated.”
Guido sees a particular danger for early-career clinicians. Many are drawn to the speed and clarity AI seems to offer, sometimes at the expense of deeper clinical thinking. She recalls a supervisee who entered a client’s symptoms into an AI program and received a clean diagnosis. When they reviewed the case together, Guido noticed that the machine had overlooked critical relational dynamics that altered the picture entirely.
“The concern,” she says, “is that the more we rely on AI as the diagnostician, the less we practice our own discernment. And discernment is the very heart of therapy.”
This risk extends beyond diagnosis. Even in generating treatment ideas, AI operates prescriptively. It offers interventions based on patterns in its training data, not on the living relationship between therapist and client. Guido emphasizes that therapy requires moving fluidly between the minute detail and the broader life context – what she calls “accordioning in and out.” AI, by contrast, cannot prioritize; it simply produces more.
Another critical issue is privacy. When clinicians feed client details into AI systems, that data does not vanish. “Anything you put into a device is embedded somewhere,” Guido warns. “Clients have the right to know what happens to their information.”
She advocates for explicit, HIPAA-compliant consent whenever AI is used in practice, much like the consent forms introduced when telehealth became widespread during the pandemic. “It’s not enough to slip it into a stack of paperwork,” she says. “In therapy, consent is relational. Clients need to truly understand how their data is being used.”
This is part of Guido’s broader insistence that clinicians treat AI as they would any other professional tool: carefully, transparently, and always with human oversight. She points out that even with something as mundane as billing, errors occur if humans stop checking machine output. “If we don’t maintain responsibility,” she says, “we risk harm.”
Perhaps Guido’s most urgent reminder is that therapy is fundamentally about connection. In a society already strained by isolation and digital overload, she sees AI as a double-edged sword. While it can provide quick affirmation or advice, it cannot replicate the regulating effect of human presence – the heart-to-heart connection that lowers anxiety and restores trust.
“Our hearts regulate with other human beings,” she says. “You can hold a baby and feel both heart rates sync. That’s biology. We cannot regulate with AI.”
For Guido, this is the ultimate limitation. Machines can simulate support, but they cannot embody it. They can agree with us, echo our biases, or provide ready-made encouragement, but they cannot challenge us in the right way, at the right time, with the grounding presence of another person.
Guido’s forthcoming work on the ethical use of AI in supervision emphasizes a balanced approach: use the technology as one resource among many, and never as the only one. She encourages supervisors to bring AI into the training room, not to ban it, but to interrogate it. “Ask: What did you input? What did it give back? What might be missing?” she says. By making the technology part of supervision, she hopes to model how to engage it critically rather than passively.
She also urges organizations and educators to move from reactive to proactive. Instead of waiting for problems to arise, she argues, programs should prepare students now with policies, consent practices, and ethical guidelines tailored to AI. The goal is not to eliminate technology but to anchor it firmly within a human-centered framework.
Guido often asks clinicians to imagine the legacy they want to leave – their “ethical eulogy.” Again and again, the aspiration is the same: do no harm. In the age of AI, she believes, living up to that ideal will require vigilance. Convenience, scale, and automation will always tempt the field. But if therapy is to remain therapy, practitioners must remember where AI stops.
“AI will continue to grow,” Guido says. “But as long as we are human, the core of therapy will always be another human being.”
To learn more about Dayna Guido’s work, or to book her for your next seminar, visit https://daynaguido.com/