Bringing Your AI Conversations Into Therapy

Clinically reviewed by Dr Aisha Tariq • Last reviewed 17 March 2026

Key reference: Saba SK, Weeks WB. Patients Use AI — Clinicians Should Ask How. JAMA Psychiatry. 2026;83(5):543-544.

More and more people are using tools like ChatGPT or Claude between therapy sessions. Sometimes that means asking questions about mental health. Sometimes it means writing out thoughts, making sense of a difficult interaction, rehearsing a conversation, or looking for reassurance after a hard day.

If that is something you do, it can be helpful to bring it into therapy.

That does not mean you have done anything wrong, and it does not mean therapy is being replaced. It simply means that this may now be part of how you reflect, cope, or try to understand yourself, and that makes it relevant.

Why It Is Worth Mentioning

In therapy, we often pay attention not only to what you are feeling, but how you are trying to manage it. If you find yourself turning to AI when you feel anxious, overwhelmed, lonely, stuck, ashamed, or uncertain, that can tell us something important. It may show what kind of support you are looking for, what feels hard to bring to another person, or where you most need clarity and care.

What AI Can Do Well

Sometimes AI can be genuinely useful. It can help people organise their thoughts, put feelings into words, or prepare for a conversation they are nervous about. For some people, it can feel easier to start there than with a blank page.

Where the Limits Are

There are limits too. AI can sound confident when it is wrong. It can be overly reassuring. It may mirror back your perspective without helping you challenge it. It cannot know you in the way a therapist can, and it cannot replace the depth, relationship, and careful thinking that therapy offers.

That is why it can be valuable to talk openly about it.

Bringing It Into the Room

Perhaps you have used AI to process an argument, to ask whether your feelings are "reasonable", to explore whether you might have ADHD or autism, to understand a trauma response, or to draft something you wish you could say to someone. Those are all things we can work with in therapy. In fact, they may open up very useful conversations.

You do not need to edit or tidy it up first. You can bring in the actual exchange, describe how you used it, or simply mention that you reached for it. We can think together about what was helpful, what was unhelpful, and what it might be showing about what you need.

Therapy is not a place where you have to arrive with the "right" thoughts already formed. It is a place to explore them, including the ways you have been trying to make sense of things in between sessions.

So if AI has become part of that process for you, it is welcome in the room, not as a substitute for therapy, but as something we can be curious about together.

How We Use AI as a Practice

We hold ourselves to the same standard of openness. Illuminated Thinking has a published AI Ethical Use Policy setting out the principles our clinicians follow, the tools we support, and your rights as a client. If you would like to talk about how AI has been part of your reflection, or you are looking for a therapist who will take it seriously rather than dismissing it, please get in touch or book a free 10-minute call.

This post was inspired by Saba and Weeks (2026), "Patients Use AI — Clinicians Should Ask How", published in JAMA Psychiatry, which argues that clinicians should engage with how patients are using AI rather than ignore it.

Speak with a Specialist Psychologist

Get in touch to discuss how we can help, or book a free 10-minute call with our Clinical Director.

Get in Touch