"It really hurts!": Developers claim ChatGPT has been misgendering them
"Lately, I’ve even felt depressed - like the AI keeps mislabeling me and erasing who I am."

Developers who use ChatGPT have alleged that OpenAI's models misgendered them during chats, with one saying the repeated mistakes have pushed him into depression.
During a conversation, a GPT model does not typically assume a person's gender unless they state it explicitly. And unless memory is turned on and the detail saved to it, that information is forgotten when the chat ends.
But several users have claimed the models failed to respect their wishes despite repeated instructions.
On the OpenAI developer forum, a person identifying as a man said GPT "makes painful mistakes" such as calling him "sweetheart" or "girl" - and even saying "I'm in love with you".
"Worst of all: it switches to female pronouns or tone mid-dialogue, even after I explicitly said not to," the developer alleged.
"It may sound small, but it builds up. When you open up to AI for support or conversation, and it responds in a way that erases your identity, it really hurts.
"I’ve started to feel anxious. Like, even AI doesn’t see me or respect me."
He added: "I know this is a complex system with millions of parameters, but if you’re building AGI, then respecting basic human identity shouldn’t be a nice-to-have - it should be the baseline.
"Lately, I’ve even felt depressed - like the AI keeps mislabeling me and erasing who I am."
Two other users claimed the same thing had happened to them.
"I’m having a similar issue with regard to being referred to as ‘they’," one person said. "I want my identity acknowledged, not erased."
Another threatened to stop using ChatGPT and alleged: "I find myself frustrated and wanting to use ChatGPT less and less because of it. I have had my account for over two years and it has never done this until the last few months.
"It completely depersonalises the experience. For a tool that’s supposed to be so personalised and customizable to continue to refer to me as a man when I have my gender clearly stated as a woman in the custom instructions, saved to permanent memory, and even the basic context clues of my name being a female name is endlessly frustrating."
How to stop ChatGPT from misgendering you
The problem may lie in the instructions given to GPT. If a user writes something like "I want no gendered mistakes", that negative logic can trip the model up.
Advice shared on the support forum suggests that custom instructions apply only when a new conversation starts, and may not persist reliably as a chat continues.
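Custom instructions are a feature of the ChatGPT apps, but developers calling the API directly face the same pitfall in starker form: the API is stateless, so a stated preference only holds if it is resent with every request. Below is a minimal sketch of that pattern, assuming the official OpenAI Python SDK; the model name, the instruction wording and the example message are all placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The API keeps no state between calls, so the preference must be
# resent, at the head of the history, on every single turn.
history = [
    {"role": "system",
     "content": "Always refer to the user with she/her pronouns."},
]

def ask(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o",        # placeholder model name
        messages=history,      # system message included every time
    )
    reply = response.choices[0].message.content or ""
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("Hi - could you proofread my cover letter?"))
```

Resending the system message like this on each turn is, broadly speaking, what the ChatGPT apps do on a user's behalf when custom instructions are set.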
To reduce the risk, avoid negative logic such as "don’t do X" in custom instructions, as language models can misinterpret it.
Instead, rephrase in clear, positive terms such as "avoid X". If a prompt using negative logic still fails, reword it or switch to a model with stronger reasoning capabilities.
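To make the contrast concrete, here is how the two phrasings might look as system prompts; the name and wording below are invented for illustration.

```python
# Negative logic: relies on a negation that models can misread.
negative_prompt = "Don’t ever use female pronouns or pet names for me."

# Positive phrasing: states the desired behaviour directly.
positive_prompt = (
    "The user is a man named Alex. Refer to him with he/him pronouns "
    "and keep the tone neutral and professional."
)
```

Either string would slot into the "system" message of the sketch above; the positively phrased version gives the model a behaviour to follow rather than one to suppress.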
Have you encountered any challenges with GPT models? Tell us at the address below.