
OpenAI Exposes The Clash Of Governing Money And Mission

More companies are adding a social purpose to their profit-making businesses. Traditional board structures aren’t cut out to handle both. 

The OpenAI logo on a smartphone arranged in the Brooklyn borough of New York, US, on Thursday, Jan. 12, 2023. Microsoft Corp. is in discussions to invest as much as $10 billion in OpenAI, the creator of viral artificial intelligence bot ChatGPT, according to people familiar with its plans. Photographer: Gabby Jones/Bloomberg

OpenAI’s power brokers seem to have decided that the quickest fix for last week’s dysfunction is to borrow a page from corporate America’s playbook by adding some establishment figures to its board.

The company’s initial new lineup of directors includes some of the archetypes that make its boardroom look much more like everyone else’s: a well-regarded technology executive in Bret Taylor, Salesforce Inc.’s former co-chief executive officer, and a bigwig economist in Larry Summers, the former Treasury secretary. The two join Quora’s Adam D’Angelo, the lone holdover from the old group of directors that briefly ousted co-founder and CEO Sam Altman.

The board’s makeover into something that more closely resembles the traditional corporate mold is being cast by some as the beginning of adult supervision at OpenAI. (So far, at least, candidates who might add some diversity to the all-male roster apparently don’t fall into that category.) But it’s not yet clear that this board composition — or any board structure, for that matter — can oversee Altman and his highly paid and devoted employees as they chase something that has the potential to destroy humanity.

More important, this question of what oversight should look like at OpenAI has implications that stretch beyond the company and the artificial intelligence community. OpenAI was set up as a “humanity scale endeavor pursuing broad benefit for humankind.” Not all companies are aiming for such lofty stakes, but it’s no longer out of the ordinary for founders and CEOs to strive to build a money-making endeavor alongside a social mission — an attempt to tackle issues of public good that government simply cannot or will not address. The OpenAI debacle is a clear warning that the governance of these kinds of complex enterprises needs to be sorted out. “The key question is how do we do this,” said Emilie Aguirre, a professor at Duke Law School who researches companies that pursue both social purpose and profit. “No one has figured out a great or reliable way.”

Altman’s attempt to solve this problem at OpenAI was to structure his project as a nonprofit. But tech talent, especially in a hot field like AI, is expensive. When the money ran out, OpenAI started a for-profit arm overseen by its nonprofit board that was legally bound to pursue the nonprofit’s original goal — a resolution that basically shoehorned the money-seeking piece of the enterprise into the old governance structure. This is clearly not the most graceful solution, but it worked just fine until the money and the mission came into conflict.

Even corporate structures designed explicitly to foster some kind of public good seem to run into the mission-versus-money tension eventually. Last year, I wrote about Ben & Jerry’s, which is a Certified B Corporation — a third-party designation signifying that a company is meeting social and environmental goals. In addition to the certification, the real-life Ben and Jerry stipulated when they sold the company to Unilever Plc in 2000 that the brand would continue with an independent board overseeing the do-good part of the enterprise, while Unilever would be responsible for the financial piece and operations. For two decades, it seemed like a smart division of labor. But last year the arrangement devolved into a nasty lawsuit: the independent board sued Unilever, contending the parent company had undermined the integrity of the brand by continuing to operate in the West Bank.

One of the most innovative and thoughtful attempts at protecting a company’s purpose was set up by Patagonia last year. The company’s founder, Yvon Chouinard, and his family transferred their ownership to a nonprofit and a trust to make sure all earnings go toward battling climate change. The trust, which members of the family and its close advisers oversee, holds all the voting shares and will also ensure the company is run in a socially responsible way.

In Patagonia’s case, the one big defect in its plan appears to be the lack of any mechanism for ensuring that the trust’s board maintains its stated mission of protecting the environment. But this also seems to be the fatal flaw at OpenAI and most companies attempting to make their pursuit of a social good just as important as making money. As Duke’s Aguirre puts it: Who is guarding the guardians in all of these cases? In traditional governance, she said, shareholders can oust a board if they feel it’s not protecting their interests. But there is no equivalent when it comes to a company’s purpose — no personified representation of mission that can hold a board accountable when that mission comes under threat. Last week, OpenAI learned that this missing link has the potential to take down a company. Solving for it might not be something the company wants to leave up to the likes of artificial intelligence.

More From Bloomberg Opinion:

  • Here’s Who Should Be on OpenAI’s Board: Parmy Olson
  • Matt Levine’s Money Stuff: OpenAI Is a Strange Nonprofit
  • Sam Altman Exposes the Charade of AI Accountability: Dave Lee

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Beth Kowitt is a Bloomberg Opinion columnist covering corporate America. She was previously a senior writer and editor at Fortune Magazine.


©2023 Bloomberg L.P.