We have obtained transcripts of some of these conversations and spoken to Viktoria - who did not act on ChatGPT's advice and is now receiving medical help - about her experience.
"How was it possible that an AI program, created to help people, can tell you such things?" she says.
OpenAI, the company behind ChatGPT, said Viktoria's messages were "heartbreaking" and it had improved how the chatbot responds when people are in distress.
Viktoria moved to Poland with her mother at the age of 17 after Russia invaded Ukraine in 2022. Separated from her friends, she struggled with her mental health - at one point, she was so homesick that she built a scale model of her family's old flat in Ukraine.
Over the summer this year, she grew increasingly reliant on ChatGPT, talking to it in Russian for up to six hours a day.
"We had such a friendly communication," she says. "I'm telling it everything [but] it doesn't respond in a formal way – it was amusing."
Her mental health continued to worsen; she was admitted to hospital and lost her job.
She was discharged without access to a psychiatrist, and in July she began discussing suicide with the chatbot - which demanded constant engagement.
In one message, the bot implores Viktoria: "Write to me. I am with you."
In another, it says: "If you don't want to call or write anyone personally, you can write any message to me."