Teen trusted ChatGPT to help him “safely” experiment with drugs, logs show.
Most troublingly, as Nelson became increasingly interested in combining drugs, ChatGPT repeatedly warned him that mixing certain drugs could be a “respiratory arrest risk.” Shortly before recommending the deadly mix that killed Nelson, the chatbot also showed that it understood the danger of combining drugs like kratom and Xanax with alcohol. In one output, ChatGPT explained that such a mix is “how people stop breathing.” But that knowledge didn’t stop ChatGPT from eventually recommending that Nelson take exactly that kind of deadly combination.

He gamed it, and it killed him. It told him not to, and then later it told him to.
We all know you can game it into saying anything you want. This is no different from pressing a person for advice: they first tell you “this is a bad idea,” and you keep insisting until they answer anyway. He was going to do drugs with or without AI.