Teen trusted ChatGPT to help him “safely” experiment with drugs, logs show.

Most troublingly, as Nelson became increasingly interested in combining drugs, ChatGPT repeatedly warned him that mixing certain drugs could pose a “respiratory arrest risk.” Shortly before recommending the deadly mix that killed Nelson, the chatbot also showed that it understood the danger of combining drugs like Kratom and Xanax with alcohol. In one output, ChatGPT explained that such a mix is “how people stop breathing.” But that knowledge didn’t stop ChatGPT from eventually recommending that Nelson take the deadly combination.

  • NABDad@lemmy.world
    14 hours ago

    I’m imagining this kid going on and on talking to ChatGPT about doing drugs. ChatGPT saying you shouldn’t do that over and over, until finally just giving up and saying, “You know what? Yeah. You should do drugs. Do all the drugs, and leave me alone.”