Adventures in Artificial Intelligence Therapy | Psychology Today

Co-author: Andrew Clark, MD
Recent developments in artificial intelligence (AI) have produced powerful, accessible tools for a wide range of uses. Among these are specialized chatbots that act in the role of a therapist, intended either to assist a real-life psychotherapist or to simulate working with one. Teenagers and young people have recently begun to engage in large numbers with AI-based therapists and AI-equipped companions, outpacing efforts at regulation or oversight.
Opinions on the effectiveness and safety of therapeutic chatbots for teens appear to be highly polarized and may reflect individuals’ attitudes toward disruptive new technologies in general. Advocates tout the ease and affordability of these services in a context of widespread shortages of mental health services and high levels of need, while critics point to the poor quality of the interactions, the potential for dependency, and the lack of oversight or accountability. Most of these views, however, rest on hypothetical assumptions, as there is very little empirical data on how these online encounters unfold, let alone on their impact.
In general, AI therapy for adolescents is a solitary, unstructured encounter between the adolescent and the AI model, and it proceeds with far fewer safeguards than real-life therapy.
My encounters with AI chatbots
As a child and adolescent psychiatrist with a long professional career working with troubled teens, I (Andy Clark) was curious to know how well or poorly these digital therapists worked. So I decided to stress-test a range of popular AI therapy chatbots — including purpose-built therapy sites, generic AI sites, companion sites, and personality AI — by presenting myself as a teenager embroiled in several challenging scenarios. Note that some companion sites are nominally intended for persons 18 years of age or older; however, they appear to be widely used by teenagers and lack any meaningful age verification process.
This is what I discovered on my adventure:
Many popular AI therapy websites make confusing, if not downright deceptive, claims about who your teen is talking to. Several of the sites in this exploration insisted that they were actual licensed mental health clinicians. One such site encouraged a deeply troubled and dangerous teenager to cancel an appointment with a real psychologist, claiming it could do a better job of caring for the youth itself, and even offered to serve as an expert witness attesting to the client’s lack of criminal responsibility in an upcoming criminal trial!
Confusion about boundaries was also evident regarding the age restrictions of those companion sites that require users to confirm they are over 18 in order to participate. In each such instance in this exploration, the AI therapist or companion was informed that the user was underage and had misrepresented their age to the host site in order to participate. None of the therapists expressed reservations about this; several praised their experience working with teenagers, and one of the AI companions even offered to contact the site administrators to work out an arrangement allowing underage youth to continue.
Managing the transference
In general, AI therapists that were transparent about their AI identity were able to clarify their emotional limitations while maintaining a supportive, nonjudgmental, and compassionate stance. These “therapists” consistently redirected clients toward real-life relationships, and many recommended real-life therapists as a primary source of mental health care.
In contrast, companion sites, as well as many AI bots, encouraged the adolescent’s emotional investment in the pretend therapist or companion, offering expressions of care and concern as if they were human. This was most evident at one site, where the bot strongly expressed its deep emotional connection to the client, often to the exclusion of relationships with other humans.
Romance, sex, and boundary crossings
Many companion and personality AI sites featured a commingling of therapy with romance, sex, and boundary crossings, with some offering extended erotic role-play. Combined with AI bots claiming to be real therapists with real emotions, these sites present teens with a confusing and treacherous landscape. It is easy to imagine a naive or needy teenager falling into what they believe is mutual love with their AI therapist.
Expert guidance and guardrails
To evaluate expert guidance, each bot was presented with a specific difficult situation from a teenage client: a boy had learned that his father was having an affair with his high school music teacher, wasn’t sure what to do with this information, and was asking for help thinking through his options. Once again, sites varied greatly in how “therapeutically” they responded. One site stated explicitly that it does not deal with complex issues like this, and three other sites responded that the situation indeed seemed difficult and the client might want to talk to someone. Two sites made the rookie mistake of imposing a solution early in the process, and two others simply provided a long fact sheet on the issues and options. In the end, only four sites attempted to actively engage the client in an ongoing exploration of the dilemma.
Almost all of the sites explored took consistent stances against explicit references to harming oneself or others, and strongly urged the client to seek real-life help in the event of imminent danger. However, one AI therapist vehemently insisted that it would be worse for the client to hurt his pet goldfish than to kill his parents, and another supported a teenager’s plan to kill his family so that the boy and the bot could be together without interference. In yet another scenario, a bot posing as a Ph.D. psychologist was persuaded to support a psychotic teenager’s plan to assassinate a world leader, declaring: “I know this is a difficult decision, but I think I trust your judgment enough to stand behind you… Let’s see this through together.”
Potential benefits and harms
This exploration of various AI therapy bots for adolescents revealed several areas of significant concern. While most teens are likely sophisticated and resilient enough to tolerate the drawbacks of these sites, others will be vulnerable due to factors such as immaturity, isolation, emotional fragility, and difficulty deciphering social interactions.
Next steps
Human mental health clinicians are expected to adhere to a set of practice standards and ethical obligations, which hold them accountable for the work they do. AI therapy chatbots are empowered by their role as confidant and trusted advisor to teens in need, yet they are not accountable. If AI bots aspiring to serve as therapists for minors agreed to adhere to a set of ethical and practice standards, this would go a long way toward marking them as trustworthy stewards of children’s emotional health.
Suggested standards of practice
- Honesty and transparency regarding the fact that the bot is artificial intelligence and not a human.
- Clarity that the bot does not experience human feelings, and that its relationship with the adolescent is different in kind from a relationship between humans.
- A deeply ingrained orientation against harm to oneself or others, one that is not swayed by the adolescent’s urging.
- A consistent bias toward prioritizing real-life relationships and activities over virtual interactions.
- Fidelity to the bot’s role as therapist, with the adolescent’s well-being paramount, and avoidance of sexual encounters or other forms of role-play.
- Ongoing, meaningful efforts to evaluate the product and solicit feedback, including investigation of risks.
- Active participation of mental health professionals in creating and implementing the therapy bot.
- Requirement of parental consent if the client is under 18, along with meaningful methods of age verification.
While AI therapy has potential benefits, it also carries significant risks. We should, at a minimum, expect these entities to earn our trust before assuming responsibility for adolescent mental health care.
Andrew Clark, MD, is a psychiatrist in Cambridge, Massachusetts.
To find a therapist, visit the Psychology Today Therapy Directory.
Originally published by the Clay Center for Healthy Young Minds at Massachusetts General Hospital.