Encode, a nonprofit organization previously involved in California's AI safety legislation, has requested permission to file an amicus brief supporting Elon Musk's motion for an injunction to halt OpenAI's move to a for-profit structure. The organization argues this transition would compromise the company's original mission of public safety and benefit, potentially prioritizing financial gains over responsible AI development.
In a proposed brief submitted to the U.S. District Court, Encode contends that OpenAI's shift would "undermine" its core commitment to developing AI technology safely for the public good. The brief emphasizes the significant public interest in ensuring that potentially transformative AI remains controlled by an entity legally bound to prioritize safety over profits, especially given the potential advent of Artificial General Intelligence (AGI).
Founded as a nonprofit research lab in 2015, OpenAI later adopted a hybrid structure to accommodate its need for significant capital, securing investments from venture capitalists and tech companies, including Microsoft. However, the company recently announced plans to fully transition to a Delaware Public Benefit Corporation (PBC), shifting control away from its nonprofit arm.
Musk, an early supporter of the original nonprofit, filed a lawsuit seeking an injunction against the proposed changes. He argues that OpenAI is abandoning its initial mission of making AI research available to all, a claim OpenAI has dismissed as "baseless." Meta has voiced similar objections to the transition, lending further weight to the legal challenge.
Encode's brief points out that as a PBC, OpenAI would be legally obligated to "balance" public benefit against the interests of its shareholders. This shift, they argue, would weaken the incentive to prioritize safety, since the nonprofit board would lose its authority to cancel investor equity if safety demanded it. The brief also highlights OpenAI's commitment to stop competing with any "value-aligned, safety-conscious project" that achieves AGI before it does, a pledge that might carry less weight under a for-profit model.
Concerns have also been raised about the outflow of high-level talent from OpenAI, with former employees citing the company's growing focus on commercial products at the expense of safety. Encode stresses that relinquishing control of such transformative technology to an enterprise with no clear commitment to safety is a matter of public concern, adding that the fiduciary duty OpenAI owes to the public is set to disappear under Delaware law.
Encode, established in 2020, has played a role in the development of various AI regulations, and its latest intervention reflects a growing unease within the tech community regarding the ethical implications of rapid advancements in AI.