- In the keynote of OpenAI's DevDay event, CEO Sam Altman demoed ChatGPT's new custom GPT feature.
- As he typed in various things live on screen, he made a lot of typos.
- Is it possible Sam Altman needs Mavis Beacon?
OpenAI held its DevDay conference on Monday, with CEO Sam Altman giving a highly anticipated keynote address in the wake of the company's huge success with ChatGPT. Altman made several announcements, including GPT-4 Turbo, which will be cheaper and trained on more up-to-date data (extending past the previous 2021 cutoff).
Microsoft CEO Satya Nadella also showed up and raved onstage about how the two companies are doing something "magical".
One huge new feature is the ability to create custom versions of ChatGPT without needing to code. To show how this worked, Altman got behind a laptop and did a live demonstration where he created a ChatGPT that he could use to give advice to founders. Nifty!
But the demonstration revealed something strange. As Altman typed into the ChatGPT box, he made … typos. A lot of them. Typos like "thigns" instead of "things" and "tjats" instead of "that's".
Let's investigate.
"thjat's"
"founder" instead of "founders"
"thigns"
"thats"
Almost every time he typed, he made a small typo: switching letters around, forgetting an apostrophe or the "s" at the end of a plural, or failing to capitalize the pronoun "I."
Yes, these are tiny errors, and some of them he caught and fixed while typing. And yes, he was also speaking out loud while typing, and probably slightly nervous to be public speaking — no one's optimal typing situation.
I'm a terrible typist. The fact that you're reading this with minimal errors is made possible only by my having an editor. I recognize my own kind, and I smell it all over Sam Altman: he's a bad typist.
This opens an intriguing possibility: Is it possible that Altman's quest to develop artificial intelligence that can automate much of human written output has a very personal meaning to him? A mission born out of a shameful void in his own life? Did he create ChatGPT because he himself sucks at typing?
We may never know.