Since it debuted, Bing's new AI chatbot has been shocking users with some of its answers — and its attitude.
  • Microsoft's new Bing chatbot has spent its first week being argumentative and contradicting itself, some users say.
  • The AI chatbot has allegedly called users delusional, and it even professed its love to Insider. 
  • Microsoft said on Wednesday that lengthy questioning by users "can confuse the model."

If you push it hard enough, Microsoft's new Bing might just snap.

The search engine's new AI-powered chatbot has only been in the limelight for a week or so, and it has apparently chided users, gotten into arguments, and appeared confused about what year it is, according to screenshots posted on Reddit and Twitter.