Bill Cosby has been accused of drugging and raping dozens of women over several decades. Roger Ailes is accused of harassing multiple women as far back as the 1960s. And then there are all those Catholic priests. Indeed, when sexual predators, especially those in positions of power, get away with such crimes once, they often do it again and again until an overwhelming preponderance of accusations, evidence, and outrage brings them down.
Perpetrators can get away with attacking so many because often when individuals report these crimes, they are ignored or discredited. But in the cases above, after multiple survivors—previously unbeknownst to one another—shared the same story, the authorities and the public started to take their accusations seriously. There’s strength (and credibility) in numbers, after all, but victims have few ways to connect with one another. We can fix this.
Imagine a scenario where a woman’s boss sexually assaults her but she can’t risk losing her job by accusing him directly. Still, she wants to report what happened and find out if he has done the same to others. She would need a way to communicate safely and anonymously with other victims—a mechanism for filing her story in a trusted, fully encrypted system that would allow her to maintain control of her identity. Then if others were to report the same perpetrator, victims could be alerted and invited to join a private messaging center. There they could communicate, coordinate, and, if they decide to, send a report to law enforcement and any organizations affiliated with their alleged assailant. Police would then be pressed to review the report, contact the victims, and weed out any impostors. (Campus-based reporting tools do already exist, but we need something more robust and universally available.)
Think of it like a sort of Dropbox, only no one can see the identity of the person being reported without permission from the accuser and only if that person has been accused by others. All the technologies exist now to make this happen: messaging, identity verification, encryption, anonymization, and social media controls, like blocking and muting.
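The matching step described above—keeping each report sealed and alerting accusers only once two or more independent reports name the same person—can be sketched in a few lines. This is a hypothetical toy, not an implementation: the `ReportEscrow` class and all names here are invented for illustration, and a plain hash stands in for the much stronger cryptography (such as an oblivious pseudorandom function) a real system would need so that the server itself cannot guess whom a report names.

```python
import hashlib
from collections import defaultdict


class ReportEscrow:
    """Toy escrow: holds sealed reports and reveals a match only when
    two or more independent accusers name the same perpetrator.
    Hypothetical sketch; a real system would use an oblivious PRF
    rather than a plain hash, so the server can't test identifiers."""

    def __init__(self):
        # match_key -> list of (accuser contact, sealed report) pairs
        self._reports = defaultdict(list)

    @staticmethod
    def _match_key(perpetrator_id: str) -> str:
        # Derive a matching key from a normalized identifier
        # (e.g., a work email address), without storing it in the clear.
        normalized = perpetrator_id.strip().lower()
        return hashlib.sha256(normalized.encode()).hexdigest()

    def submit(self, accuser_contact: str, perpetrator_id: str,
               sealed_report: bytes) -> list[str]:
        """File a sealed report. Returns the contacts of all matched
        accusers once at least two reports name the same person;
        until then, returns an empty list and nothing is revealed."""
        key = self._match_key(perpetrator_id)
        self._reports[key].append((accuser_contact, sealed_report))
        if len(self._reports[key]) >= 2:
            return [contact for contact, _ in self._reports[key]]
        return []


escrow = ReportEscrow()
# First report: stays sealed, no one is alerted.
escrow.submit("alice@example.org", "Boss@Company.com", b"<encrypted report>")
# Second report naming the same person: both accusers can now be
# notified and invited into a private channel.
matched = escrow.submit("dana@example.org", " boss@company.com", b"<encrypted report>")
```

Note that normalization matters: the two submissions above name the same person despite differing capitalization and whitespace, which is exactly the kind of detail a production system would have to get right.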
It might be easy to reject this idea as unnecessary and possibly even fraught. We don’t need such a system for, say, mugging, after all. But sexual assault is not a mugging. When you’re mugged, people don’t question your motives for coming forward, and robbery victims often don’t know who committed the crime. Victims of sexual violence usually do. Furthermore, a survivor may be financially dependent on their abuser or fear reprisal—they can’t afford to bring charges if the charges aren’t going to stick, and sexual violence is notoriously difficult to prosecute. Consider the Stanford student who sexually assaulted an unconscious woman last year and received just a few months in jail.
“There are stories like the Brock Turner case and others in the media that make it very clear that the system is just not there to support survivors,” says Riddhi Mukhopadhyay, the director of Sexual Violence Legal Services at the YWCA. Online organizing could make prosecutions easier. “When it’s ‘he said, she said, she said, she said’ and multiple victims are willing to testify,” Mukhopadhyay says, “it can really help the case move through the criminal justice system.”
Such a platform wouldn’t help everybody: Not every abuser is a repeat offender. And this is not to say that multiple reports are in any way more worthy of investigation than a single account. But it would help at least some victims shatter the culture of doubt that shrouds their stories.
Of course, there’s the risk that a website like this—or any reporting system—would be exploited to bring false accusations or abused by noxious trolls. So, to prevent such fraud and abuse, the tool would need to verify and then encrypt each user’s identity. The claims, of course, would also be vetted by investigators. And to prevent a situation where hackers break into the system to harass or threaten victims, everyone must have the ability to block other users from communicating with them. “One of the scariest parts about sharing stories of personal trauma online is the possibility of losing control of your narrative,” says Whitney Phillips, a professor at Mercer University, whose book, This Is Why We Can’t Have Nice Things, examines harassment online.
If done right, a system like this could also act as a deterrent. Companies could embed a link to the platform in their HR portals, signaling that abuse won’t be tolerated and assuring victims that their voices will be heard, using the platform like a kind of independent electronic ombudsperson. “If it was carefully designed, a reporting system that helps victims speak up on their own terms could really help expedite justice,” Phillips says. “It’s not going to be easy, but it’s not impossible.”
Companies have invested billions of dollars in robotic cars. Silicon Valley has poured tens of millions into liquefied meals for people who are too busy to chew. Entrepreneurs like to say they want to make the world a better place. One way to do that is to spend a few million dollars to help build this sort of system.
We must do better. The good news is that we’ve got all the tools we need. Let’s start now.
April Glaser (@aprilaser) is a staff writer at Recode.
This article appears in the December issue. Subscribe now.