Gab and the decentralized web

As a proponent of the decentralized web, I've been thinking a lot about the aftermath of the domestic terrorism that was committed in Pittsburgh at the Tree of Life synagogue over the weekend, and how it specifically relates to the right-wing social network Gab.

In America, we're unfortunately used to mass shootings from right-wing extremists, who have committed more acts of domestic terror than any other group. We're also overfamiliar with ethnonationalists and racist isolationists, who feel particularly emboldened by the current President. Lest we forget, when fascists marched in the streets yelling "the Jews will not replace us", he announced that "you had very fine people on both sides". The message could not be more clear: the President is not an enemy of hate speech.

As the modern equivalent of the public square, social networking services have been under a lot of pressure to remove hate speech from their platforms. Initially, they did little; over time, however, they began to remove many of the worst offenders. Hence Gab, which was founded as a kind of refuge for people whose speech might otherwise be removed by the big platforms.

Gab claims it's a neutral free speech platform in the spirit of the First Amendment. (Never mind that the First Amendment protects you from the government curtailing your speech, rather than corporations enacting policies for private spaces that they own and control.) But anyone who has spent 30 seconds there knows this isn't quite right. This weekend's shooter chose to post there before committing his atrocity; afterwards, many other users proclaimed him to be a hero.

It's an online cesspit, home to some of the worst of humanity. These are people who refer to overt racism as "wrongthink", and mock people who are upset by it. As the Huffington Post recently reported about its CEO, Andrew Torba:

[...] As Gab’s CEO, he has rooted for prominent racists, vilified minorities, fetishized “trad life” in which women stay at home with the kids, and fantasized about a second American civil war in which the right outguns the left.

Gab is gone for now - a victim of its service providers pulling the plug in the wake of the tragedy - but it'll be back. The way to fight this speech, it claims, is not deplatforming but more speech. In my opinion, this is a trap that falsely sets up the two opposing sides as equivalent. Bigotry is not an equal idea, but it's in their interests to paint it as such. While it's pretty easy to debate bigots on an equal platform and win, doing so unintentionally elevates their standing. Simply put, their ideas shouldn't be given oxygen. A right to freedom of speech is not the same as a right to be amplified.

I found this piece by an anonymous German student in Saxony instructive:

We also have to understand that allowing nationalist slogans to gain currency in the media and politics, allowing large neo-Nazi events to take place unimpeded and failing to prosecute hate crimes all contribute to embolden neo-Nazis. I see parallels with an era we thought was confined to the history books, the dark age before Hitler.

An often-repeated argument about deplatforming fascists is that we'll just drive them underground. In my opinion, this is great: when we're literally talking about Nazis, driving them underground is the right thing to do. Yes, you'll always have neo-Nazis somewhere. But the more they're exposed to the mainstream, the more their movement may gain steam. This isn't an academic problem, or a problem of optics: give Nazis power and people will die. These are people who want to create ethnostates; they want to prioritize people based on their ethnicity and background. These movements start in some very dark places, and often end in genocide.

When we talk about a decentralized social web, the framing is usually that it's one free from censorship; where everyone has a home. I broadly agree with that idea, but I also think the discussion must become more nuanced in the face of communities like Gab.

I agree wholeheartedly that the majority of our global discourse can't be trusted to a small handful of very large, monocultural companies that answer to their shareholders over the needs of the public. The need to make user profiles more valuable to advertisers has, for example, seen transgender users thrown off the platform for not using their deadnames. In a world where you need to be on social media to effectively participate in a community, that has had a meaningful effect on already vulnerable communities.

There's no doubt that this kind of unacceptable bigotry at the hands of surveillance capitalism would, indeed, be prevented by decentralization. But removing silos would also, at least in theory, enable and protect fascist movements, and give racists like this weekend's shooter a place to build unhindered community.

We must consider the implications of removing these gatekeepers very deeply - and certainly more deeply than we have so far.

A common argument is that the web is just a tool, oblivious to what people use it for. This is similar to the argument that was made about algorithms, until it became obvious that they were built by people and based on their assumptions and biases. Nothing created by people is unbiased; everything is in part derived from the context and assumptions of its creators. By being more aware of our context and the assumptions we're bringing to the table, we can hopefully make better decisions, and see potential problems with our ideas sooner. Even if there isn't a perfect solution, understanding the ethics of the situation allows us to make more informed decisions.

On the one hand, by creating a robust decentralized web, we could create a way for extremist movements to thrive. On the other, by restricting hate speech, we could create overarching censorship that genuinely guts freedom of speech protections, which would undermine democracy itself by restricting who can be a part of the discourse. Is there a way to avoid the second without the first being an inevitability? And is it even possible, given the possible outcomes, to return to our cozy idea of the web as a force for peace through knowledge?

These are complicated ethical questions. As builders of software on the modern internet, we have to know that there are potentially serious consequences to the design decisions we make. Facebook started as a prank by a college freshman and now has a measurable impact on genocide in Myanmar. While it's obvious to me that everyone having unhindered access to knowledge is a net positive that particularly empowers disadvantaged communities, and that social media has allowed us to hear new voices and understand a wider array of lived experiences, it has also been used to spread hate, undermine elections, and disempower whole communities. Decentralizing the web will allow more people to share on their own terms, using their own voices; it will also remove many of the restrictions on the spread of hatred.

Wherever we end up, it's clear that President Trump is wrong about the alt-right: these aren't very fine people. These are some of the worst people in the world. Their ideology is abhorrent and anti-human; their messages are obscene.

No less than the future of democratic society is at stake. And a society where the alt-right wins won't be worth living in.

Given that, it's tempting to throw up our hands and say that we should ban them from speaking anywhere online. But if we do that, the consequence is that a mechanism for censorship has to be built into the web, and that there must be single points of failure that can be shut down to prevent any community from speaking. Who gets to control that? And who says we should get to have this power?