Growing Demands for OpenAI to Fulfill Its Commitments to the Public Good

The Investigation into OpenAI’s Regulatory Compliance
Overview of the Investigation
The Attorney General’s office is currently investigating OpenAI, the company behind ChatGPT. In a formal letter sent in December, Deputy Attorney General Christopher Lamerdin cited specific clauses in OpenAI’s articles of incorporation stating that "OpenAI’s assets are irrevocably dedicated to its charitable purpose," language that suggests a commitment to operate within ethical and charitable boundaries.
OpenAI’s Public Stance
During a 2023 Senate Judiciary Subcommittee oversight hearing on artificial intelligence, Samuel Altman, CEO of OpenAI, made a public plea, urging U.S. senators to adopt regulations that would ensure accountability from leading tech firms, including giants like Amazon, Google, and Microsoft, which is a major investor in OpenAI. Altman emphasized that there should be "incredible scrutiny" of OpenAI and its competitors, reflecting growing concern about the risks and responsibilities that come with advanced AI technologies.
Lobbying and Regulatory Actions
Recently, OpenAI has ramped up its lobbying efforts in Washington, reportedly increasing its lobbying spending sevenfold. For the first time, the company has also hired lobbyists to actively oppose legislative efforts aimed at regulating AI in California. Notably, Assembly Bill 501 (AB 501), originally intended to prevent OpenAI from transitioning to a for-profit model, was amended to address issues related to aircraft liens instead. David Burruto, district director for Assemblymember Diane Papan, explained that the change was necessary because of the complexity of the initial legislation.
Skepticism from Experts
Gary Marcus, a prominent expert in artificial intelligence, has publicly expressed skepticism about OpenAI’s commitment to responsible practices. While testifying alongside Altman, Marcus observed that lobbyists appeared to be working behind the scenes to shape the legislative process. He has written critically about the “gut-and-amend” tactic used on AB 501, arguing that it reflects a disregard for genuine regulatory efforts, and he has pointed to a pattern in which OpenAI presents itself as supportive of regulation while lobbying against measures designed to enforce oversight.
The Need for Accountability
The call for accountability in AI development is becoming increasingly urgent, especially as the technology continues to evolve rapidly. Many stakeholders, including technologists, ethicists, and lawmakers, are advocating for regulations that ensure AI does not harm society. Leaders like Altman recognize that the power of AI needs to be matched with corresponding oversight to mitigate potential risks.
The developing situation surrounding OpenAI and its lobbying efforts highlights the broader debate about how best to regulate emerging technologies. As public awareness and scrutiny grow, it remains to be seen how both the company and regulators will respond to these challenges.
Current Legislative Landscape
In light of the ongoing investigations and public discussions, the legislative landscape around AI is likely to keep shifting. Policymakers are grappling with how to craft effective laws that address the unique challenges posed by AI technologies while also weighing their economic and innovative potential.
As conversations about accountability, transparency, and ethical considerations in AI grow more prevalent, companies like OpenAI will find themselves at the center of this crucial dialogue. Their actions and responses in the coming months will be watched closely by stakeholders across various sectors.