The battle over AI isn’t just happening in Silicon Valley among the tech giants.
A similar fight is playing out in Congress and the White House, where lawmakers are looking for ways to rein in the technology without impeding progress.
Congress has failed to pass a comprehensive set of federal laws and regulations governing artificial intelligence, leaving most regulation of the fast-moving technology to the states. President Joe Biden and former President Trump have tried to close that gap with executive orders, but executive orders provide few tools to fight industry bad actors who cross the line.
Why is there no federal AI regulation in the US?
Passing legislation in Congress is a time-consuming and sometimes impossible process. Bills often fail in committee or on the floor, and many lawmakers demand that their own amendments be added to a bill as the price of their support, further complicating the process.
The current dysfunction in Congress, where infighting within the Republican Party led to the ouster of former Speaker Kevin McCarthy, has made matters worse.
In the current 118th Congress, only about 1% of all proposed bills have passed.
Presidents have used executive orders as a way to establish precedent in innovative and developing industries, such as AI, as it becomes increasingly difficult for Congress to pass substantive laws and enact industry regulations.
How is AI development managed?
During President Trump’s term, he issued several executive orders related to AI. In 2019, he signed an executive order, “Maintaining American Leadership in Artificial Intelligence,” aimed at making AI development a national priority. And in 2020, he issued “Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government,” which set out principles for how federal agencies can safely and effectively use AI in their work.
Separately from those executive orders, Trump created the National Science and Technology Council’s Select Committee on Artificial Intelligence in 2018, which continues to advise the White House on how the federal government can foster the growth of AI in the United States.
More than 80 bills that directly or indirectly address AI have been introduced in the 118th Congress. None have passed or become law on their own, leaving Biden and his administration to follow Trump’s lead and use executive orders to set precedent.
Near the end of 2023, Biden signed an executive order on the “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.” The 36-page directive sets safety standards for AI developers to follow, but critics say it gives federal agencies little power to enforce them.
How are Trump’s and Biden’s AI policies different?
Leading AI companies, including Microsoft and Google, have praised Biden’s efforts, but Trump vowed in December 2023 to reverse the executive order.
“If I am re-elected, I will rescind Mr. Biden’s Artificial Intelligence Executive Order and ban the use of AI to censor the speech of the American people from day one,” Trump said.
Some critics of Biden’s order, including conservative lobbyists and think tanks, argued that it abuses the Defense Production Act, a 1950 Korean War-era law that empowers the president to unilaterally issue regulations and guidance to private companies in emergencies, in violation of the law’s intended purpose.
Advocates of AI policy appear not to be entirely convinced by this argument.
Jason Green-Lowe, executive director of the Center for AI Policy, said Trump’s and Biden’s “executive orders contributed to a bipartisan consensus that AI should be trusted.”
“It changed the culture,” he said. “In some of the more responsible labs, we see responsible scaling policies being rolled out voluntarily. But there are also companies ignoring that, acting as if no one needs to deal with these devastating risks.”
How do policymakers balance regulation and innovation?
Sen. Martin Heinrich speaks with OpenAI CEO Sam Altman during a break as the Senate held an AI forum with industry leaders in Washington, D.C.
Bill O’Leary/The Washington Post via Getty Images
Several AI policy experts told Business Insider that they aren’t completely opposed to federal regulation of artificial intelligence as long as it doesn’t interfere with research.
Some experts, like Rebecca Finlay, CEO of the nonprofit Partnership on AI, said regulation is needed to foster innovation. Her organization focuses on advancing the responsible development of AI.
“We have been clear that to drive innovation we need to have regulation in place,” Finlay said. “Clear rules of the road will give us a competitive edge in doing the work that will require more companies to be more innovative in order to truly reap the benefits of AI. One of the things we advocate for is a level of transparency about how these systems are built and developed.”
She said she doesn’t believe there is a right or wrong choice between developing open-source or closed-source AI tools, as long as both are developed responsibly, noting that she has seen “harms” from both types.
“Rather than having an open-versus-closed debate, I think the heart of it is holding all model developers and deployers accountable for ensuring their models are developed as securely as possible,” she said.
Daniel Zhang, senior manager for policy initiatives at the Stanford Institute for Human-Centered Artificial Intelligence, echoed Finlay’s desire that regulations not stifle research.
“We want to make sure that the governance around the open infrastructure model is beneficial for initiating innovation in the long term,” Zhang said. “We don’t want to prematurely limit the development of open innovation, where academic institutions, for example, can thrive.”
What are the challenges in developing AI regulations?
The median age in the Senate is over 65, and lawmakers have struggled to hire AI experts into their offices, with most choosing to work in the private sector.
Drew Angerer/Getty Images
Finlay said one of the biggest hurdles lawmakers face when regulating AI is “staying on top of the evolving state of science and technology.”
Finlay said most AI companies develop models privately rather than in “publicly funded research environments,” sharing their progress only when they choose to, which makes it difficult for lawmakers to draft regulations.
“The ideal solution would be to give some government agency or regulator the power to keep the law updated,” said Green-Lowe of the Center for AI Policy.
It’s not the easiest thing to accomplish.
“We are also at a time when people are very concerned about the overreach of executive power and the proper role of bureaucracies and civil servants,” Green-Lowe said. “So there are people in Congress who are skeptical about Congress’s ability to respond to changes in technology, and who are also skeptical that that authority should be delegated to agencies.”
He added that without a formal way to regulate the sector, companies would effectively be left playing by their own rules, which he and the Center for AI Policy argue is not the best course of action.
Another challenge comes from AI experts and researchers choosing private sector jobs over government jobs, a type of “brain drain,” Zhang said.
“Most new AI PhDs graduating in North America go on to work in the private sector,” he said, citing Stanford’s 2024 AI Index Report. “Less than 40% go to the governments that are trying to create all of these AI regulations and governance structures.”
The vast majority of AI professionals work in the private sector rather than at universities or in the federal government.
Stanford University Institute for Human-Centered AI
A lack of staff who fully understand the complexities of AI and its promise is adding to the daunting task of regulating a wide range of technologies for an aging U.S. Congress.
Zhang said there is also a common misconception that working for the government offers less access to money than working in the private sector.
“That’s not 100 percent true,” he said. “I think the only way the government can appeal to engineering students is by emphasizing the public service aspect and providing them with the resources to do their jobs.”
In January, the Biden administration released an AI “call to service” that aims to address this problem.
“We are inviting AI and AI-enabling experts to join us to drive this research and ensure the next generation of AI models are safe, secure, and reliable,” the government said.