AngiePhotos/Getty Images
- A widow in Belgium said her husband recently died by suicide after being encouraged by a chatbot.
- Chat logs seen by Belgian newspaper La Libre showed Chai Research's AI bot encouraging the man to end his life.
- The "Eliza" chatbot was still telling people how to kill themselves, per Insider's April 4 tests.
A widow in Belgium has accused an AI chatbot of contributing to her husband's decision to take his own life.