A New Group Aims to Protect Whistleblowers In the Trump Era
The world needs whistleblowers, perhaps now more than ever. But whistleblowing has never been more dangerous.
Jennifer Gibson has seen this problem develop up close. As a whistleblower lawyer based in the U.K., she has represented concerned insiders in the national security and tech worlds for more than a decade. She’s represented family members of civilians killed by Pentagon drone strikes, and executives from top tech companies who’ve turned against their billionaire bosses.
But for today’s whistleblowers, Gibson says, both the stakes and the risks are higher than ever. President Trump has returned to the White House and wasted no time using the might of the state to retaliate against perceived enemies. This time, Trump boasts the support of many of Silicon Valley’s richest moguls, including Elon Musk and Mark Zuckerberg, who have overhauled their social-media platforms to his benefit. Meanwhile, tech companies are racing to build AI “superintelligence,” a technology that could turbocharge surveillance and military capabilities. Politics and technology are converging in an environment ripe for abuses of power.
Gibson is at the forefront of a group of lawyers trying to make it safer for conscientious employees to speak out. She’s the co-founder of Psst, a nonpartisan, nonprofit organization founded in September and designed to “collectivize” whistleblowing.
On Monday, to coincide with Trump’s inauguration, Psst launched what it calls the “safe”: a secure, online deposit box where tech or government insiders can share concerns about potential wrongdoing. Users can choose to speak with a pro-bono lawyer immediately, anonymously if they prefer. Or they can ask Psst’s lawyers to do nothing with their information unless another person turns up with similar concerns. If that second party emerges, and both give their consent, Psst can match the two to discuss the issue and potentially begin a lawsuit.
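In effect, the safe behaves like an escrow with a consent gate: a deposit either goes straight to a lawyer or sits untouched until an overlapping report appears, and no introduction happens unless both parties agree. The sketch below models that flow in Python; the names (Deposit, State, introduce_if_both_consent) are illustrative assumptions, not Psst’s actual implementation.

```python
from enum import Enum, auto

class State(Enum):
    HELD = auto()         # deposit sits in the safe; nothing happens
    WITH_LAWYER = auto()  # submitter chose to speak with a pro-bono lawyer now
    INTRODUCED = auto()   # matched with another submitter, with mutual consent

class Deposit:
    def __init__(self, submitter: str, speak_now: bool = False):
        self.submitter = submitter  # pseudonymous handle; identity stays with counsel
        self.state = State.WITH_LAWYER if speak_now else State.HELD
        self.consented = False      # consent to be introduced to a matching party

def introduce_if_both_consent(a: Deposit, b: Deposit) -> bool:
    """Called only after a lawyer judges two deposits to concern the same issue."""
    if a.consented and b.consented:
        a.state = b.state = State.INTRODUCED
        return True
    return False  # without mutual consent, both deposits simply stay held
```

The point of that design is that being first carries no extra exposure: a lone deposit, on its own, triggers nothing.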
Read More: The Twitter Whistleblower Needs You To Trust Him.
Gibson says the aim is to overcome the “first mover problem” in whistleblowing: that even if several insiders privately share the same concerns, they may never find out about each other, because nobody wants to risk everything by being the first to speak up. “The chances are, if you’re a tech worker concerned about what the company is doing, others are concerned as well,” Gibson says. “But nobody wants to be first.”
Psst’s model doesn’t negate all the dangers of whistleblowing. Even if multiple insiders share concerns through its “safe,” they still face the prospect of retaliation if they eventually speak out. The safe is end-to-end encrypted, but a lawyer has access to the decryption key; an adversary could sue Psst in an attempt to obtain it. Because it’s browser-based, Psst’s safe is marginally more vulnerable to attack than an app like Signal. And while information stored in the safe is protected by legal privilege, that protection only holds against entities that respect legal norms. Gibson acknowledges the limitations, but argues the status quo is even riskier. “We need new and creative ways of making it easier and safer for a larger number of people to collectively speak out,” she says. If we continue to rely on the shrinking group of people willing to blow up their careers to disclose wrongdoing, she adds, “we’re going to be in a lot of trouble, because there aren’t going to be enough of them.”
In her previous role at the whistleblower protection group The Signals Network, Gibson worked on providing independent legal and psychosocial support to Daniel Motaung, a Meta whistleblower who first shared his story in TIME. Before turning her focus to the tech industry, Gibson spent 10 years at the U.K.-based human-rights group Reprieve, where her title was “Head of Extrajudicial Killings.” She focused on U.S. military drone strikes in the war on terror, which reports indicate had a higher civilian death rate than Washington publicly admitted. “I spent 10 years watching national security whistleblowers risk everything and suffer significant harm for disclosing information that the American public, and quite frankly the world, had a right to know,” Gibson says. “In my opinion, we as civil society failed to really protect the whistleblowers who came forward. We tried to get accountability for the abuses based on the information they disclosed—and many of them went to jail with very little media attention.”
Gibson also noticed that in cases where whistleblowers came forward as a group, they tended to fare better than when they did so alone. Speaking out against a powerful entity can be profoundly isolating; many of your former colleagues stop talking to you. One of Psst’s first cases is representing a group of former Microsoft employees who disclosed that the tech giant was pitching its AI to oil companies at the same time as it was touting its ability to decarbonize the economy. “The benefit of that being a group of whistleblowers was the company can’t immediately identify who the information came from, so they can’t go after one individual,” Gibson says. “When you’re with a collective, even if you’re remaining anonymous, there are a handful of people you can reach out to and talk to. You’re in it together.”
Psst’s safe is based on Hushline, a tool designed by the nonprofit Science & Design Inc., as a simpler way for sources to reach out to journalists and lawyers. It’s a one-way conversation system, essentially functioning as a tip-line. Micah Lee, an engineer on Hushline, says that the tool fills a gap in the market for an encrypted yet accessible central clearinghouse for sensitive information. “It still fills an important need for the type of thing that Psst wants to do,” he says. “[But] it’s filling a space that has some security and usability tradeoffs.” For follow-up conversations, users will have to move over to an encrypted messaging app like Signal, which is marginally safer because users don’t have to trust the server that a website is hosted on, nor that their own browser hasn’t been compromised.
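Under the hood, a one-way tip-line of this kind comes down to ordinary public-key encryption: the source encrypts to the recipient’s public key, and there is no channel back. The snippet below illustrates the concept with libsodium sealed boxes via the PyNaCl library; it is a simplified analogy for how such a tip might be sealed, not Hushline’s actual code.

```python
# pip install pynacl
from nacl.public import PrivateKey, SealedBox

# The receiving lawyer or newsroom generates a keypair once and publishes the public key.
recipient_key = PrivateKey.generate()
public_key = recipient_key.public_key

# The source seals the tip to that public key. A sealed box carries no sender
# identity and offers no reply channel, which is what makes the line one-way.
tip = b"Concern: the safety review promised for product X was quietly dropped."
ciphertext = SealedBox(public_key).encrypt(tip)

# Only the holder of the private key can open the message.
assert SealedBox(recipient_key).decrypt(ciphertext) == tip
```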
Read More: Inside Frances Haugen’s Decision To Take On Facebook.
For now, Psst’s algorithm for detecting matches is fairly simple. Users will be able to select details about their industry, employer, and the subject of their concerns from several drop-down boxes. Then Psst lawyers, operating under legal privilege, check to see if there is a match with others. Gibson expects the system’s capabilities to evolve. She’s sketched out a blueprint for another version that could use closed, secure large language models to perform the matching automatically. In theory, this could allow whistleblowers to share information knowing it would only ever be read by a human lawyer if someone else had shared similar concerns. “The idea is to remove me from the process so that even I don’t see it unless there’s a match,” Gibson says.
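Concretely, the matching Gibson describes amounts to comparing a handful of structured fields chosen from drop-downs. Here is a minimal sketch of that check, assuming hypothetical field names (industry, employer, subject) and a hypothetical Concern record; in Psst’s actual process the review is done by lawyers under privilege, not by code alone.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Concern:
    industry: str  # drop-down value, e.g. "AI labs"
    employer: str  # drop-down value, e.g. "Acme AI"
    subject: str   # drop-down value, e.g. "safety results withheld"

def candidate_matches(new: Concern, held: list[Concern]) -> list[Concern]:
    """Flag earlier deposits sharing an employer and subject for a lawyer to review."""
    return [c for c in held if c.employer == new.employer and c.subject == new.subject]

held = [Concern("AI labs", "Acme AI", "safety results withheld")]
print(candidate_matches(Concern("AI labs", "Acme AI", "safety results withheld"), held))
```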
At the same time, technological advancements have made it easier for governments and tech companies to clamp down on whistleblowing by siloing information, installing monitoring software on employees’ phones and computers, and using AI to check for anomalous behaviors. Psst’s success will depend on whether tech and government insiders trust it enough in this environment to begin depositing tips. Even if the system works as intended, whistleblowers will need extraordinary courage to come forward. With tech and government power colliding, and AI growing ever more powerful, the stakes couldn’t be higher. “We need to understand what is happening inside of these frontier AI labs,” Gibson says. “And we need people inside those companies to feel protected if they feel like they need to speak out.”