President Joe Biden issued an executive order aimed at the safe development of artificial intelligence (AI).
High on the list of requirements for AI companies in the order, issued Monday (Oct. 30), is a rule that developers "of the most powerful AI systems" must share their safety test results and other key information with the federal government.
The executive order is being rolled out in advance of a global AI summit and comes as several countries — and AI companies — wrestle with how to regulate the technology.
The order says that “companies developing any foundation model that poses a serious risk to national security, national economic security, or national public health and safety must notify the federal government when training the model, and must share the results of all red-team safety tests,” according to a White House news release.
The order also said AI firms must come up with “standards, tools and tests” to make sure their systems are secure and trustworthy. It also called on companies to guard against the threat of “using AI to engineer dangerous biological materials” by establishing strong standards for biological synthesis screening.
It said companies must fight “AI-enabled fraud and deception” through the creation of standards and best practices to detect AI-generated content and authenticate official content.
However, Monday’s announcement noted that more action is required and that the White House will work with Congress in hopes of crafting bipartisan AI legislation.
That’s the only way AI can truly be regulated, Senate Majority Leader Chuck Schumer said last week.
“There’s probably a limit to what you can do by executive order … everyone admits the only real answer is legislative,” Schumer said.
Meanwhile, PYMNTS spoke Monday with Ryan Abbott, professor of law and health sciences at the University of Surrey, about questions surrounding AI and intellectual property rights.
As policymakers unpack this issue, Abbott said, the challenge is crafting effective regulations that balance innovation, consumer protection and ethical concerns.
“We are talking about different countries having a different interpretation,” he said. “But it is very important to get protection more or less globally. … AI is going to be doing a lot more heavy lifting in the creative space pretty soon.”