SAN FRANCISCO - OpenAI brought back CEO Sam Altman four days after its board fired him.
So what does this back and forth mean for the future of artificial intelligence, and why should the public care?
Futurist tech entrepreneur Sinead Bovell said everyone should lean into learning about the technology, people and companies that are leading us into the future.
And that's even more important after watching the corporate theater play out at OpenAI.
"The best case scenario for us is that the board made a mistake. The board was wrong. They saw a risk that doesn't actually exist, or one that wasn't worth firing such a significant part of OpenAI. But if they weren't wrong, that does raise some new questions," said Bovell.
Bovell said it's important to remember that we still don't know why Altman was fired by the board.
She explained that OpenAI was initially founded as a non-profit. Eventually, a commercial arm was created, and that is where software such as ChatGPT was developed.
But the non-profit board still governs the company. Bovell said the board's mission is to prioritize the safety of AI and make sure it stays aligned with human values. It's concerning that, for whatever reason, the board thought Altman was a threat to that mission.
"The general public is probably a little spooked. Regulators are also a little spooked. Sam Altman has been kind of the go-to person in all the regulatory rooms globally for building the regulatory frameworks for this technology," said Bovell.
Rangapriya (Priya) Kannan, Dean of Lucas College and Graduate School of Business at San Jose State University, has been studying the immense power of CEOs like Altman.
"Sam Altman’s story – firing and rehiring – is probably the technology story of the decade, if not the century," said Kannan.
She said Altman is the face of the AI revolution.
"It is going to revolutionize the world, just like the internet did. And so you are founding this company that’s at the helm of that revolution," said Kannan.
Bovell said watching this all play out is further proof that it matters who is leading us into the future with this consequential technology, and that tech companies shouldn't be regulating themselves.
"I hope that this was a little bit of a wake-up call that we do need private sector input, but we need third-party, much more neutral bodies really shaping what it means to drive the safety of these tools in this next chapter," said Bovell.