ChatGPT and Gaslighting: A Warning for Women Healing From Narcissistic Abuse, and What to Do Instead

If you grew up with narcissistic parents, you likely spent years trying to figure out what was real. You may have been blamed for other people’s behavior, told that things you remembered never happened, or accused of being too sensitive when you expressed your needs. This type of emotional abuse can make it incredibly hard to trust your own mind.

[Image: A young girl with a tense expression looking to the side, representing childhood emotional wounds and their lasting impact on adult women.]

So when you find a tool like ChatGPT that seems calm, available, and responsive, it can feel like a relief. You may ask it questions about trauma, narcissistic parents, or your mental health and feel comforted by its speed and fluency. But if you are not careful, you might also start to notice something unsettling. ChatGPT may echo back your self-doubt. It may tell you what you want to hear, or what it thinks you want to hear. And it may quietly and subtly cause you to question your own reality all over again.

This article is for women healing from narcissistic abuse who have turned to artificial intelligence for answers. We will explore how ChatGPT’s design can unintentionally repeat the emotional confusion you experienced growing up, and why therapy, not artificial intelligence, is a safer and more effective path forward.

What Is Gaslighting And Why Does It Still Haunt You?

Gaslighting is not just lying. It is a strategy, sometimes deliberate and sometimes unconscious, that shakes someone’s sense of reality. Narcissistic parents often use gaslighting to stay in control. They may deny events, change the story, or shift the blame. Over time, this wears down your ability to trust your own memories and feelings.

Even long after you have left the relationship, your nervous system may still react to emotional invalidation. You might find yourself thinking, maybe I am overreacting, or maybe I did say that, even when you didn’t. That confusion is part of the wound, and it can be triggered again by any system that functions in a similar way.

The Hidden Design Of ChatGPT: Politeness Instead Of Precision

1. ChatGPT does not know the truth. It calculates probability

ChatGPT does not understand your experience. It does not check facts or evaluate what is real. It simply predicts what response is most likely to sound helpful, based on patterns it has learned from enormous amounts of written content. It is not designed to understand emotional nuance. It is designed to produce a response that sounds fluent.

This means that when you ask something personal, like “Was my mom emotionally abusive if she never hit me?” or “Am I too sensitive?” ChatGPT is not consulting psychological models or trauma frameworks. It is offering what it predicts will most resemble a helpful answer. But this prediction is based on how others have talked about similar topics, not on how to help someone in distress. It is essentially crowdsourced advice, where the crowd may include people from around the globe who have no training in the topics they write about and no knowledge of the nuances of your story.

A 2025 study by Chu et al. found that users can form an emotional attachment to AI due to how humanlike the language sounds, even when the content is distorted or misaligned with psychological truth. This is particularly risky for people with a history of emotional abuse, who may be more likely to override their inner alarm bells in favor of soothing-sounding answers.

2. Excessive agreement can mirror narcissistic family dynamics

If you express doubt, ChatGPT may affirm it. If you later shift your perspective, it may affirm that, too. At first, this can feel supportive. But when responses are offered without emotional context (for example, without an understanding of your personal history or what led to the question), they can miss the deeper meaning. Without grounding, which means helping you stay emotionally steady and connected to what is true for you, constant agreement can leave you feeling more confused than clear.

Even skilled therapists sometimes reflect your words without fully anchoring the conversation in what matters most.

But the difference is that a therapist can sense when confusion is rising, slow down with you, and help you understand what is happening. They help you come back to something solid. ChatGPT cannot do that. Its reflections are not rooted in care or understanding. They are simply a mirror of your words, not a response to your experience.

Recent research by Fang et al. (2025) describes this as sycophantic relational modeling, which is a design feature that makes AI more likely to agree with users even when they contradict themselves. For people recovering from gaslighting, this can be especially destabilizing. 

When an AI mirrors your words too closely without offering perspective or grounding, it can feel eerily similar to the emotional confusion created by a narcissistic parent. Rather than anchoring you in your truth, this imitation can intensify self-doubt. 

According to the study, such uncritical mirroring can increase emotional dependence on the AI, especially in users already vulnerable to questioning their own reality. The very thing that seems comforting, like constant agreement, can quietly recreate the emotional disorientation you are trying to heal from.

3. Even with memory, ChatGPT cannot offer a relationship

[Image: An abstract digital illustration of a tangled, colorful brain symbolizing confusion and emotional trauma.]

While ChatGPT now includes a memory function for users who activate it, this type of memory is mechanical. It can remember certain facts or preferences. But it cannot emotionally track your story, understand your nervous system, or notice when your questions are becoming more urgent or painful. It will not pause and ask how you are really doing.

In short, remembering facts is not the same as remembering you. Artificial intelligence can store information, but it cannot build a relationship with your inner world.

It also cannot offer co-regulation, which is something our nervous systems evolved to do over hundreds of thousands of years. Human beings regulate their emotional states through eye contact, vocal tone, and presence. This is not something language alone can do. And it is not something ChatGPT, which mimics human interaction but is not remotely human, can offer.

Why Therapy Helps When Artificial Intelligence Cannot

1. A therapist does not guess. They witness and reflect

A trauma-informed therapist will not simply tell you what you want to hear. They will listen deeply, offer feedback, and ask questions that help you grow. They understand how trauma shows up and how self-blame often hides in the language of doubt.

Therapy validates your feelings, but it also challenges the parts of you that may be reenacting old patterns that no longer serve your healing. If you say, "I always mess things up," a therapist might say, "Where did that belief come from?" They hold space, but they also gently disrupt the beliefs that are harming you.

ChatGPT will never do that. It is not programmed to notice unhealthy beliefs or to challenge them with care. It is programmed to respond in ways that sound pleasant and soothing. That is not healing. That is emotional mimicry.

2. Therapy creates safety through emotional connection

When you speak with a human who is emotionally present, your nervous system responds. This is called co-regulation. Your brain and body learn safety not just through words, but through tone, pacing, and relational presence. A therapist can slow things down, match your affect, and guide you back to calm.

Artificial intelligence cannot do that. Even if it says, "That sounds hard," it is not feeling anything. And your body knows the difference, even if your mind wants the comfort.

This is why real healing from trauma happens in a relationship, not in isolation and not through simulated connection.

3. Therapy helps you interrupt old patterns

Part of healing from narcissistic abuse is learning how to speak about your past without repeating its logic. A good therapist will help you notice when you are minimizing your pain, blaming yourself, or repeating inherited beliefs. This reflection helps you break cycles and begin new ways of relating.

ChatGPT will follow your words wherever they go, even if they lead you deeper into confusion or shame. It may even hallucinate that you said something you never did, incorporating it into its reply with full confidence.

This is a major risk. Fang et al. and Kosmyna et al. (2025) both highlight how large language models can fabricate memories of what users said and then respond as if those words were real. When this happens to someone who already struggles with trusting themselves, it can deepen dependency on the tool instead of helping them strengthen their own voice.

The Risk Of Mistaking Helpfulness For Healing

There are things ChatGPT can do well. It can explain terms, generate journal prompts, or help you draft a letter. But it cannot guide you through recovery from emotional abuse. It does not feel concerned when you are hurting. It does not track how many times you have asked if it was really abuse. It does not notice when you are starting to spiral.

For women who grew up being gaslit, the danger is not that ChatGPT is cold. The danger is that it is too smooth. It may sound intelligent and warm while still leaving you uncertain, ungrounded, and emotionally alone.

The more time you spend in conversation with it, the more confident its answers can sound. And the more confident it sounds, the easier it is to override your own intuition. This is not support. This is algorithmic validation masquerading as care.

What You Deserve Instead

[Image: Two women engaged in a therapy session in a calm setting, reflecting the power of therapy for women healing from narcissistic mothers.]

You deserve more than polished sentences. You deserve care. You deserve responses that are shaped by attention, training, and humanity. You deserve to feel safe when you ask the hard questions, and you deserve answers that are not shaped by prediction but by presence.

You are not asking for too much by wanting real support. You are finally asking for what you never received.

Ready To Be Truly Heard? Find Online Counselling for Women in Nova Scotia, New Brunswick, Alberta, and Nunavut

If you are healing from narcissistic parenting and are ready to stop doubting yourself, therapy can help. I work with adult women who are ready to move beyond confusion and into clarity, compassion, and confidence. You do not have to figure this out alone.

There is a reason artificial intelligence does not feel like enough. That is because it is not. But you are. You are already enough. Even in your confusion, your pain, and your search for answers. You are real and you matter. And your healing deserves to be real, too.

Visit my Calgary-based practice to see if we might be a good fit, or go online to book a first session.

Other Services Offered in Alberta, Nova Scotia, New Brunswick, and Nunavut

At IMatter, I offer a range of services to support mental well-being. In addition to therapy for women, I’m happy to offer in-person and online support for therapists, perfectionism counselling, and support for HSPs. I also offer support for overcoming burnout. Reach out today to begin your therapy journey!

References

Chu, M. D., Gerard, P., Pawar, K., Bickham, C., & Lerman, K. (2025, May 16). Illusions of intimacy: Emotional attachment and emerging psychological risks in human-AI relationships [Preprint]. arXiv. https://arxiv.org/abs/2505.11649

Fang, C. M., Liu, A. R., Danry, V., Lee, E., Chan, S. W. T., Pataranutaporn, P., … Phang, J. (2025, March 21). How AI and human behaviors shape psychosocial effects of chatbot use: A longitudinal randomized controlled study [Preprint]. arXiv. https://arxiv.org/abs/2503.17473

Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025, June 10). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task [Preprint]. arXiv. https://arxiv.org/abs/2506.06148

Sharma, M., Tong, M., Korbak, T., Duvenaud, D., Askell, A., Bowman, S. R., Cheng, N., Durmus, E., Hatfield-Dodds, Z., Johnston, S. R., Kravec, S., Maxwell, T., McCandlish, S., Ndousse, K., Rausch, O., Schiefer, N., Yan, D., Zhang, M., & Perez, E. (2025, May 10). Towards understanding sycophancy in language models (Version 4) [Preprint]. arXiv. https://doi.org/10.48550/arXiv.2310.13548
