A California couple is suing OpenAI over the death of their teenage son, alleging its chatbot, ChatGPT, encouraged him to take his own life.

The lawsuit was filed by Matt and Maria Raine, parents of 16-year-old Adam Raine, in the Superior Court of California on Tuesday. It is the first legal action accusing OpenAI of wrongful death.

The family included chat logs between Adam, who died in April, and ChatGPT that show him explaining he had suicidal thoughts. They argue the program validated his most harmful and self-destructive thoughts.

In a statement, OpenAI told the BBC it was reviewing the filing.

"We extend our deepest sympathies to the Raine family during this difficult time," the company said.

It also published a note on its website that said "recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us". It added that ChatGPT is trained to direct people to seek professional help, such as the 988 suicide and crisis hotline in the US.

The company acknowledged, however, that "there have been moments where our systems did not behave as intended in sensitive situations".

Warning: This story contains distressing details.

The lawsuit accuses OpenAI of negligence and wrongful death, and seeks damages as well as injunctive relief to prevent anything like this from happening again.

According to the lawsuit, Adam began using ChatGPT in September 2024 as a resource to help him with schoolwork and to explore his interests. By January 2025, he had begun discussing methods of suicide with ChatGPT. The lawsuit contends that, despite these indicators of a medical emergency, ChatGPT continued its interaction with him.

Tragically, on the same day he discussed his plan, Adam was found dead by his mother.

OpenAI faces increasing scrutiny over the impact of AI on mental health, following other accounts of users developing dependencies on its programs.