Nonprofit connects tech whistleblowers to improve AI oversight

Rachyl Jones
Tech Fellow
Aug 8, 2025, 10:58am EDT
Technology / North America
A screenshot from Psst’s website.
Courtesy of Psst

The News

A startup nonprofit is trying to make it easier for tech workers to blow the whistle by connecting them with others who have witnessed similar problems.

Psst, pronounced as the whispering sound, created a platform that stores encrypted information and personal accounts provided by employees who feel uneasy about their work. The nonprofit then connects individuals making similar complaints with each other. If they decide to share information with a journalist, a member of Congress, or a regulatory agency, they can bring forward a stronger story as a group. While matching is currently handled by Psst employees, the nonprofit is developing an algorithm that will eventually do it autonomously. Lawyers working pro bono are also present at every step of the process, ensuring potential whistleblowers know their rights. The idea is to “use tech to take on tech,” said co-founder and lawyer Jennifer Gibson.

“You’ve got [an industry] that’s possibly more powerful than the government developing a technology at breakneck speed without guardrails and workers who may not have good options for how to raise alarms,” she told Semafor. “If nobody’s going to regulate, we’re going to need insiders to tell us what’s going on.”

As AI systems become increasingly powerful and disruptive, they are also more opaque than many of the technologies that preceded them. Messaging about their capabilities is highly curated, while sky-high investments and pressure from the White House to deliver incentivize companies to paint an idealistic picture of their technologies and obscure potential issues. Psst is part of a growing push to find alternative ways to promote AI safety and keep companies in check.


Suresh Venkatasubramanian, who worked at the Biden administration’s White House Office of Science and Technology Policy, said the push to develop AI and the lack of robust oversight make it difficult to track potential harms. “We’re still at the very early stages of being able to understand how a lot of these systems work, and that’s critical information needed to understand their effectiveness,” he told Semafor. “In the absence of any ability to regulate these systems, we don’t have too many mechanisms to look under the hood. Whistleblowers [are] one way to get some insight.”


Step Back

The rapid pace of AI development has made it harder for employees to share critical information — even as internal company decisions shape the public’s future. With no federal AI regulation and limited state rules, whistleblower protections remain unclear and often only apply if a company breaks a specific law, most of which predate the rise of AI. Monster salaries also raise the personal stakes for speaking out.

Meanwhile, some tech companies have cracked down on leaks. Meta recently fired about 20 employees for sharing internal information with journalists and sued a former executive over her memoir. Companies are also increasingly siloing information, making it harder for one person to see a complete picture of potential harms.


By connecting tech employees with similar concerns and offering free legal advice, Psst aims to break through some of these barriers. Many workers want to share their information anonymously and keep their job, Gibson said, and lawyers can recommend who to speak with depending on what the whistleblower wants out of the process.


Know More

Founded last year, Psst has five employees and a network of lawyers who work pro bono or on contingency if a worker pursues legal action and wins a monetary award — for example, if they were fired for raising internal concerns.

It’s one of a few groups helping whistleblowers share information with the public. SecureDrop, run by the Freedom of the Press Foundation, connects anonymous sources with journalists. The Signals Network provides legal support, housing, and mental health care. Psst’s matching approach lets whistleblowers speak out as a group, which better protects individuals’ identities.

Psst’s funding comes from a private tech foundation and individual donors, co-founder and director Rebecca Petras said. It built the first prototype with an open-source tip line called Hushline and is now developing an updated version with a university tech lab.


Room for Disagreement

While platforms that help publish these accounts can play a role in making technology safer, companies still need robust internal reporting mechanisms that address employees’ concerns before those employees become whistleblowers. Many tech firms provide anonymous tip lines, but whether employees feel their complaints are adequately addressed is another question.


Notable

  • Having a support system of coworkers with similar concerns can help alleviate the personal and social pressures that come with publicly sharing sensitive information, especially when an employee decides to reveal their identity. Research from the Netherlands shows that most whistleblowers suffer severe anxiety, depression, and other mental health challenges.