A California bill that attempts to regulate large frontier AI models is creating a dramatic standoff over the future of AI. For years, the AI world has been divided into “accel” and “decel” camps.
An influential machine learning dataset, one that has been used to train numerous popular image-generation applications, includes thousands of suspected child sexual abuse images, a new academic report reveals.
While most AI companies carefully unveil their latest algorithms with press tours and blog posts, others seem content to throw their latest wares out into the digital ether like a pirate ship casting off dead weight. One company that fits this latter category is Mistral, a French AI startup that released its…