Trump Overturns Biden Rules on AI Development, Security
The new administration moved quickly to remove any constraints on AI development and collected $500 billion in investment pledges for an American-owned AI joint venture.
President Donald Trump revoked former President Joe Biden’s 2023 executive order aimed at putting security guardrails around artificial intelligence (AI) systems and their potential impact on national security, giving a major boost to private sector companies like OpenAI, Oracle, and SoftBank. They responded in kind with collective pledges to spend up to $500 billion on building out AI infrastructure in the US.
Biden’s AI executive order required developers of AI and large language models (LLMs) like ChatGPT to develop safety standards and share test results with the federal government, with the goal of preventing AI-powered cyberattacks against citizens and critical infrastructure, the development of dangerous biological weapons, and other threats to US national security.
Artificial Intelligence Private Sector Ponies Up
Fast on the heels of that revocation, the Trump administration unveiled Project Stargate, which is intended to funnel hundreds of billions into AI infrastructure in the US. The Stargate event at the White House was attended by SoftBank CEO Masayoshi Son, who had already pledged $100 billion to the fund. OpenAI CEO Sam Altman and Oracle co-founder Larry Ellison each pledged an initial $100 billion, all of which will be used to set up a separate company committed to US AI infrastructure. Microsoft, Nvidia, and semiconductor company Arm are also involved as technology partners.
During the ceremony, Ellison said there are already data centers in Texas under construction as part of Project Stargate.
Leading AI executives, including Vantiq CEO Marty Sprinzen, welcomed the news.
“As I sit here at the World Economic Forum in Davos, Switzerland, the atmosphere is charged with enthusiasm following President Trump’s announcement of the Stargate initiative — a collaboration between OpenAI, SoftBank, and Oracle to invest up to $500 billion in artificial intelligence infrastructure,” Sprinzen said in a statement.
One outlier with less enthusiasm for Project Stargate is Elon Musk, who claimed the companies don’t have the cash to cover the pledges.
Trump Administration’s AI Cybersecurity Plan
It’s still not completely clear whether, or how, there will be any federal oversight of AI technology or its development.
The Biden AI executive order was far from perfect, according to Max Shier, CISO at Optiv, but he still would like to see some federal oversight of AI development.
“I don’t disagree with the reversal per se, as I don’t think the EO that Biden signed was adequate and it had its flaws,” Shier says. “However, I would hope that they replace it with one that levies more appropriate controls on the industry that are not as overbearing as the previous EO and still allows for innovation.”
Shier anticipates that standards developed by the National Institute of Standards and Technology (NIST) and the International Organization for Standardization (ISO) will help “provide guardrails for ethical and responsible use.”
For now, the new administration is leaving the task of developing AI with adequate safety controls in private sector hands. Adam Kentosh at Digital.ai says he is confident the industry is up to the task.
“The rapid pace of AI development makes it essential to strike a balance between innovation and security. While this balance is critical, the responsibility likely falls more on individual corporations than on the federal government to ensure that industries adopt thoughtful, secure practices in AI development,” Kentosh says. “By doing so, we can avoid a scenario where government intervention becomes necessary.”
That might not be enough, according to Shier.
“Private enterprise should not be allowed to govern themselves or be trusted to develop under their own standards for ethical use,” he stresses. “There has to be guardrails provided that don’t stifle smaller companies from participating in innovation but still allow for some oversight and accountability. This is especially true in instances where public safety or national security is at risk or has the potential to cause risk.”
About the Author
Becky Bracken is a veteran multimedia journalist covering cybersecurity for Dark Reading.