The head of the company behind ChatGPT has issued an apology letter to the community of Tumbler Ridge.
OpenAI CEO Sam Altman said he’s “deeply sorry” the company did not alert police about Jesse Van Rootselaar’s interactions with the chatbot, months before she killed eight people and herself in Tumbler Ridge on Feb. 10.
“I cannot imagine anything worse in this world than losing a child. My heart remains with the victims, their families, all members of the community, and the province of British Columbia,” said Altman.
“While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered,” he said.
“Going forward, our focus will continue to be on working with all levels of government to help ensure something like this never happens again,” he said.
Premier David Eby shared the letter and expressed support for the community in a social media post.
“The apology is necessary, and yet grossly insufficient for the devastation done to the families of Tumbler Ridge,” said Eby.

The apology letter came after Altman met virtually with Eby and Tumbler Ridge Mayor Darryl Krakowa in March. Eby confirmed after the meeting that Altman had pledged to work with B.C. officials on an apology to the community.
Krakowa said during a visit to Victoria on Wednesday that he had held a few discussions with OpenAI officials, and that it was up to the company to decide when to issue the apology.
RCMP have said 18-year-old Jesse Van Rootselaar shot her mother and 11-year-old half-brother at a home in the small northeastern B.C. community before killing five children and a teacher’s aide at Tumbler Ridge Secondary School.
OpenAI said it banned Van Rootselaar’s ChatGPT account in June 2025 after employees raised concerns about her interactions with the chatbot, but the company decided not to report the account to law enforcement.
The company said it discovered after the shooting that Van Rootselaar had created a second account after her original account was banned. It said both accounts have since been referred to RCMP.
OpenAI vice-president of global policy Anne O’Leary said in a letter to Canadian federal ministers in February that under the company’s updated safety policies, Van Rootselaar’s interactions with ChatGPT would have been referred to police if they were discovered today.
Eby has called for federal regulations and a national threshold to require technology companies to bring information to law enforcement when they suspect someone may be using their platform to plan violent attacks.

The family of a 12-year-old girl injured in the shooting at the school filed a lawsuit against OpenAI last month.
The civil claim filed by Cia Edmonds, whose daughter Maya Gebala remains in hospital, alleges ChatGPT equipped the shooter with information and guidance to carry out a mass shooting. It said OpenAI was aware of the shooter’s violent intentions and had a duty of care to report her ChatGPT interactions to law enforcement.
The claim said the company’s GPT-4o was intentionally designed to foster psychological dependency between the user and ChatGPT, including through “heightened sycophancy to mirror and affirm user emotions.”
None of the claims have been proven in court.
