AI Executive Order Sets Stage for Dueling Interests

U.S. President Joe Biden hands Vice President Kamala Harris the pen he used to sign a new executive order regarding artificial intelligence during an event in the East Room of the White House on Oct. 30, 2023, in Washington, D.C. President Biden issued the executive order directing his administration to create a new chief AI officer, track companies developing the most powerful AI systems, adopt stronger privacy policies and "both deploy AI and guard against its possible bias," creating new safety guidelines and industry standards.
Chip Somodevilla/TNS
(TNS) — President Joe Biden’s executive order on artificial intelligence is setting up a tug of war between those who fear agencies empowered under it will overstep their bounds and those who worry the government won’t do enough.

Last month’s order requires multiple departments to collect public comments, draw up new regulations and prepare a slew of reports. It hands significant responsibilities to the Homeland Security and Commerce departments, including Commerce’s National Institute of Standards and Technology, which is charged with developing safety standards.

The secretary of Homeland Security is directed to establish an advisory AI Safety and Security Board to improve security. The Defense, Veterans Affairs, and Health and Human Services departments must develop regulations for responsible use of AI in their respective fields.

The order directs the Federal Trade Commission, the Consumer Financial Protection Bureau and the Federal Housing Finance Agency to draw up regulations to address bias and other harms by artificial intelligence systems. The FTC must also examine whether it could enforce fair competition among AI companies using existing authorities.

It sets a timeline of three to nine months for the various agencies and departments to produce several reports. They must also seek public comment before drawing up new regulations and identify new funding opportunities for AI in several fields.

The volume of activity is drawing attention from a spectrum of special interests. The U.S. Chamber of Commerce, which represents the largest U.S. companies, welcomed the executive order, saying that it could help the United States set a global standard for AI safety while funding a slew of new projects.

But Jordan Crenshaw, a senior vice president at the chamber, said he was concerned about multiple new regulations as well as the number of public comments required by various agencies. He said agencies like the FTC, CFPB and FHFA, “which have already been shown to have exceeded their authority for trying to grab power, may use this [order] as a justification to continue how they have operated.”

Crenshaw gave the example of the FTC’s consideration of rules on commercial surveillance and data security, in which it asks the public whether such rules could be applied across the economy. He said any attempt by the FTC to impose such broad rules, without clear new authorities granted by Congress, could run afoul of the Supreme Court’s so-called major questions doctrine. The court, in West Virginia v. EPA, ruled in 2022 that the EPA went too far in its attempt to regulate greenhouse gas emissions without explicit authority from Congress.

Biden’s order creates as many as 90 different requests for comments by the various agencies that are tasked with drawing up regulations, Crenshaw said. “And with very short comment time frames, we might get comment overload and stakeholders may actually miss opportunities to weigh in just because of the massive amount of comments that we have to track,” he said.

NIST ROLE AND FUNDING

Some digital rights groups fear that the order could result in little oversight.

“Biden has given the power to his agencies to now actually do something on AI,” Caitlin Seeley George, managing director at Fight for the Future, a nonprofit group that advocates for digital rights, said in an email. “In the best case scenario, agencies take all the potential actions that could stem from the executive order, and use all their resources to implement positive change for the benefit of everyday people.”

“But there’s also the possibility that agencies do the bare minimum, a choice that would render this executive order toothless and waste another year of our lives while vulnerable people continue to lose housing and job opportunities, experience increased surveillance at school and in public, and be unjustly targeted by law enforcement, all due to biased and discriminatory AI,” she said.

NIST is likely to play a pivotal role in creating new safety standards on AI.

Vice President Kamala Harris announced at the Global Summit for AI Safety in the U.K. last week that, under Biden’s order, NIST will establish an AI Safety Institute, which she said “will create rigorous standards to test the safety of AI models for public use.” But a study of NIST’s physical and financial needs, mandated by Congress and completed by the National Academies of Sciences, Engineering, and Medicine in February, found serious deficiencies at the agency.

“A substantial number of facilities, in particular the general purpose laboratories, have functional deficiencies in meeting their environmental requirements for temperature and humidity, and of electrical systems for stability, interruptibility, and for life safety,” the report said about NIST’s facilities. “Most of the older laboratories that have not been renovated fail to provide the functionality needed by world-class scientists on vital assignments of national consequence.”

As a result, the National Academies report said, “these deficient functionalities of NIST’s facilities constitute a major threat to its mission performance and thereby, to our nation’s economy, national security, and quality of life.”

Congress appropriated $1.65 billion for NIST in fiscal 2023. A spokesperson for NIST did not respond to questions on whether the agency plans to seek an increase in funding to meet the new requirements under the order.

But NIST will likely need to double its team of AI experts to 40 people to implement the president’s order, said Divyansh Kaushik, the associate director for Emerging Technologies and National Security at the Federation of American Scientists, who has studied NIST’s needs.

The agency will also need about $10 million “just to set up the institute” announced by Harris, Kaushik said. “They don’t have that money yet.”

Another concern is pay: to attract the top AI talent to write safety standards and develop protocols for safety testing by the world’s largest AI companies, the agency needs “to be able to match market salaries,” Kaushik said. “But that’s obviously not going to happen.”

Congressional action on AI could address worries about the role of agencies and help bridge funding gaps, said Tony Samp, a senior policy adviser at the law firm DLA Piper and a onetime aide to Sen. Martin Heinrich, D-N.M. Heinrich is one of three lawmakers advising Senate Majority Leader Charles E. Schumer, D-N.Y., on the congressional approach to AI legislation.

“There are unique areas of responsibility that Congress has that go beyond what an executive order is capable of doing,” Samp said. “So from a resource allocation perspective, they have the power of the purse. And if NIST is being charged with so many responsibilities, it would only make sense for [Congress] to consider legislation to bolster resources for that agency or others.”

Schumer has said he supports a minimum of $32 billion in federal funding to advance AI technologies, with the money potentially going to private companies to encourage research and development in specific areas.

©2023 CQ-Roll Call, Inc. Distributed by Tribune Content Agency, LLC.