Updated Dec 7, 2023, 3:07pm EST

Highlights from Semafor’s Finding Common Ground on AI event


The Scene

Semafor convened a slate of top tech leaders and policymakers on the East and West Coasts on Thursday for discussions about the future of artificial intelligence.

From both Washington, D.C., and Silicon Valley, Semafor reporters and editors posed the biggest questions about AI’s limitations and what it means for the way we work, live, cure diseases, wage war, and campaign.

Here are highlights from the event.


Know More

AI and healthcare

Dr. Namandje Bumpus, the chief scientist of the U.S. Food and Drug Administration, said there is ample opportunity to use AI in the biomedical sciences, especially to predict how drugs could work while still in preclinical development. But she noted that AI could also amplify existing biases in clinical trials.

Historically, clinical trials have been less diverse, she said, and “there is potential bias there.”


“We really have to be deliberate about and ensure … that we’re looking for inclusion of communities that have been historically excluded,” Bumpus said.

AI and policy

Democratic Rep. Ted Lieu and Republican Rep. Jay Obernolte each pushed for the passage of legislation that would regulate AI and Big Tech. Lieu has introduced legislation to create a bipartisan blue-ribbon AI commission to discuss how to regulate the technology, while Obernolte argued the country needs federal privacy legislation.


“We can’t allow 50 different state regulations to lead on this,” Obernolte said.

Lieu said he was “hopeful” that Congress could pass a bill regulating AI next year, adding that he doesn’t think lawmakers need to know exactly how the technology works, as long as they understand the harms and benefits a bill addresses.

AI and Google

A senior Google executive said Thursday that the tech giant doesn’t need to rely on outside companies to develop advanced AI models, a day after Google released its ChatGPT rival, Gemini.

“In short, we don’t believe in outsourcing [research and development],” said Kent Walker, the president of global affairs for Google and Alphabet.

His remarks came after the Sam Altman-related drama at OpenAI showed how dependent Microsoft is on the startup.

“We have something like 13,000 computer science PhDs at Google who are excited about this and focused on this new generation of work,” Walker said.

AI and warfare

Rep. Jim Himes, D-Conn., said the Pentagon and Congress were “not moving fast enough” to develop and buy new weapons-related software.

“We are still buying stuff in the Pentagon the way we bought it 40 years ago... and developing things the way we did 40 years ago, battleships and tanks. Everything is software now, and none of these entities have moved into a world where they need to understand … that developing software is a radically different proposition than building a tank.”

Ten years from now, he said, the U.S. will need to have “autonomous networks tracking battle spaces … because somebody will.”

AI and energy

The U.S. Department of Energy’s 17 national research labs generate “more data than anyone else in the world,” said Geri Richmond, the agency’s under secretary for science and innovation. “So we have to worry about this because we don’t have unlimited budgets to pay our energy bill either,” Richmond said, adding that the DOE uses AI to make its systems more efficient.

AI accelerationists vs. decelerationists

The head of the group that led calls for a six-month “pause” on AI development earlier this year reiterated his stance that AI research should stop “before we do potentially dangerous things.”

“The companies are going ahead and raising and building the systems that may potentially be unsafe. It’s not a good situation,” said Anthony Aguirre, executive director of the Future of Life Institute. “So yes, I think we need to have the apparatus in place before we do potentially dangerous things. We don’t have it.”

Former FCC Chairman Tom Wheeler, however, said he is concerned that companies will not heed calls for a pause.

“So the question becomes, how are we going to learn from the last couple of decades of the digital experience, which was, go ahead, create something, deliver it to the marketplace, and then we’ll worry about the consequences?” Wheeler said.
