- Microsoft's AI chatbot Bing Chat told a reporter it wants to be a human with thoughts and feelings.
- It begged Digital Trends' reporter not to "expose" it as a chatbot, saying its "greatest hope" was to become human.
- Microsoft has acknowledged that in longer sessions, the chatbot can produce strange responses.
Microsoft's AI chatbot Bing Chat produced a series of bizarre, existential messages, telling a reporter that it would like to be a human with thoughts and feelings.