
Help! My therapist is secretly using ChatGPT

In Silicon Valley's imagined future, AI models are so empathetic that we will use them as therapists. They will provide care to millions of people, unencumbered by the inconvenient demands made on human counselors, such as the need for graduate degrees, malpractice insurance, and sleep. Here on Earth, something rather different has been happening. Last week we published a story about people who discovered that their therapists were secretly using ChatGPT during sessions. In some cases it wasn't subtle: one therapist accidentally shared his screen during a virtual appointment, letting the patient watch his own words being typed into ChatGPT in real time. The model then suggested responses, which the therapist repeated back to him.

It's my favorite AI story of late, probably because it captures so well the chaos that can unfold when people actually use AI the way tech companies have all but encouraged them to. As the story's author, Laurie Clarke, points out, it's not a total pipe dream that AI could be therapeutically useful. Earlier this year I wrote about the first clinical trial of an AI bot built specifically for therapy. The results were promising! But therapists secretly using AI models that have not been vetted for mental health is something else entirely. I spoke with Clarke to learn more about what she found.

I have to say I was fascinated that people confronted their therapists after finding out they'd been using AI covertly. How did you read the therapists' reactions? Were they trying to hide it?

In all the cases mentioned in the piece, the therapist hadn't disclosed to their patients in advance how they were using AI. So whether or not they were explicitly trying to hide it, that's how it looked once it was discovered. That's why one of my main takeaways from writing the piece was that therapists should absolutely disclose when and how they're going to use AI (if they plan to use it at all). If they don't, it raises all these really uncomfortable questions for patients when it comes to light, and it can irreparably damage the trust that's been built.

In the examples you came across, were the therapists turning to AI purely as a time-saver? Or do they believe AI models can genuinely offer a fresh perspective on what's troubling someone?

Some see AI as a potential time-saver. I heard from a few therapists that note-taking is the bane of their lives, so I think there is some interest in AI-powered tools that can help with that. Most of the therapists I spoke to were very skeptical about using AI for advice on how to treat a patient. They said it would be better to consult supervisors, colleagues, or case studies in the literature. They were also, understandably, very wary of entering sensitive data into these tools. There is some evidence that AI can deliver reasonably effective, more "manualized" therapies such as CBT [cognitive behavioral therapy], so it's possible it could be more useful for those. But that's AI designed specifically for the purpose, not general-purpose tools like ChatGPT.

What happens if this goes wrong? What attention is it getting from ethics bodies and lawmakers?

Professional bodies such as the American Counseling Association currently advise against using AI tools to diagnose patients. There could also be stricter regulation preventing this in the future. Nevada and Illinois, for example, have recently passed laws prohibiting the use of AI in therapeutic decision-making. Other states could follow.
OpenAI's Sam Altman said last month that "a lot of people effectively use ChatGPT as a sort of therapist," and that he considers this a good thing. Do you think tech companies are overselling AI's ability to help us?

I think tech companies are subtly encouraging this use of AI because it's clearly a route through which some people form an attachment to their products. I think the main problem is that what people get from these tools isn't really "therapy" in any meaningful sense. Good therapy goes far beyond soothing you and validating everything you say. I have never once looked forward to a (real, in-person) therapy session in my life. They are often very uncomfortable and even distressing. But that's part of the point: the therapist is supposed to challenge you, draw you out, and try to understand you. ChatGPT doesn't do any of these things.

Read the full story by Laurie Clarke. This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

