ChatGPT is unreliable when answering drug-related questions, a new study found.
  • Long Island University researchers posed real drug-related questions to ChatGPT over the past year.
  • The chatbot gave false or incomplete responses to 29 of the 39 questions.
  • OpenAI advises users not to rely on its tools, including ChatGPT, for medical information.

ChatGPT has once again proven unreliable in medical contexts, providing false or incomplete answers to real drug-related queries, a new study found.