- ChatGPT gave Vice's global drugs editor instructions on how to make crack cocaine and smuggle it.
- The AI bot noted that some of the questions touched on "illegal" topics, but it answered others anyway.
- When Insider attempted to recreate the questions, ChatGPT refused to answer at all.
ChatGPT gave a Vice journalist detailed instructions on how to make cocaine and smuggle it into Europe, the publication reported.