OpenAI launched its Superalignment team almost a year ago with the ultimate goal of controlling hypothetical super-intelligent AI systems and preventing them from turning against humans. Naturally, many people were concerned—why did a team like this need to exist in the first place?
You need only two letters to describe the latest iteration of Google’s big yearly developer shebang: AI.
OpenAI and Google showcased their latest and greatest AI technology this week. For the last two years, tech companies have raced to make AI models smarter, but now a new focus has emerged: make them multimodal. OpenAI and Google are zeroing in on AI that can seamlessly switch between its robotic mouth, eyes, and ears.
What started as a 24-hour hackathon project last weekend could empower the open-source community to upend the smart glasses industry. A five-person team built a $20 pair of smart glasses, dubbed Open Glass, that connects what you see and hear to an AI chatbot, such as Meta’s Llama 3.