Digital censorship has always been a passionately contested subject. Recent events have thrust the conversation front and center once again, pitting free-speech advocates against governments and against groups looking to curtail hate speech. At the center of the debate are technology platforms, which accept varying levels of responsibility for reducing cyberbullying or blocking racist agendas.
Determining where free speech ends and safety or censorship begins is a difficult task, subject to personal judgment and human error. But as we look deeper at the impact censorship has on different communities, it becomes apparent that we need a fundamental shift in how we handle content on the Internet.
The recently passed Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, H.R. 1865), which was used to take down backpage.com, has been lauded by some as a significant step toward reducing sex trafficking and the abuse of minors. For others, it makes the world less safe for the very groups the legislation was meant to protect.
“There was an entire elaborate web of safety and security services that Section 230 propped up, websites that would do background checks on clients, things like whitelists and blacklists, where women could check out whether a client was shady or not,” writes Baylor University professor Scott Cunningham, who studies the economics of sex work. “All of that is going to disappear. So I expect the work for the women who do remain in the market will be much more dangerous than it was before.”
This problem extends beyond government and goes straight to the doorstep of tech corporations. While Twitter and other services could once take a laissez-faire approach to certain content, the new law exposes them to far more liability for content appearing on their networks. Platforms will now have to self-regulate by removing content that puts them at legal risk, which will leave fewer online venues where marginalized communities can safely communicate.
In the backpage.com case, the door for abusive pimps and other nefarious actors has been thrown back open: FOSTA removed a safe channel for consenting, legitimate sex workers to communicate, vet clients, and work safely. Newsweek even reports that those affected in the industry are “devastated” by the shutdown.
The potential for abuse remains, as does the need for reliable, safe, digital spaces for citizens to engage in free speech and open communication.
Encrypted messaging apps
For many, encrypted messaging apps enable safe communication. Projects like Signal, which offers end-to-end encryption, have been critical for users concerned about privacy, security, and the ability to exercise free speech. Government officials, as well as journalists and human rights activists, also use these services frequently.
However, it’s clear these services are still vulnerable and can fairly easily be taken down or blocked by governments. Russia and Iran, for instance, have both banned the encrypted messaging app Telegram, citing national security threats.
The infrastructure beneath these platforms is moving toward a model that is more inclined to help governments than private citizens. Google’s decision to eliminate domain-fronting — a technique Signal and others were using to sidestep government censorship — has rendered these apps useless in the areas where they are most critical.
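The idea behind domain fronting can be sketched in a few lines. The hostnames below are hypothetical, and this is an illustration of the concept rather than a working circumvention tool: a censor inspecting the connection sees only the TLS handshake, which names an innocuous CDN domain, while the real destination travels inside the encrypted request's Host header and is only visible to the CDN.

```python
# Hypothetical hostnames: the censor-visible TLS SNI names a harmless front
# domain, while the Host header inside the encrypted tunnel names the real,
# blocked service hosted behind the same CDN.
FRONT_DOMAIN = "cdn.example"   # what appears in the TLS handshake
REAL_DOMAIN = "app.example"    # what the CDN actually routes to

def build_fronted_request(path: str) -> bytes:
    """Build an HTTP/1.1 request whose Host header differs from the TLS SNI."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {REAL_DOMAIN}\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    ).encode()

request = build_fronted_request("/v1/messages")
assert b"Host: app.example" in request  # the inner request targets the real service
```

Eliminating support for this trick means the censor-visible domain and the real destination must match, so blocking the service becomes as easy as blocking its domain.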
Right now, the most dominant platforms are centralized, contributing to a technological paradigm that is making it harder and harder for safe spaces to exist.
Censorship by the people
Decentralization makes it possible to give control of online services back to the people. A decentralized solution is fundamentally impossible to block in any permanent way. With centralized services, media and ideas can be censored easily: simply cut off the single access point, the centralized server.
Decentralized networks, on the other hand, are peer-to-peer. Even if one node of the network gets cut off, hundreds of other nodes still hold the same data. Many of those nodes are anonymous and scattered around the world, making censorship nearly impossible.
In this context, a decentralized network offers two defining features:
1. The individual owns their data. If a particular service or platform is no longer meeting their needs, they can pack up and move elsewhere, keeping their messages and contacts intact.
2. There are no servers that can be blocked. Messaging is peer to peer and redundantly stored on computers across the network in an encrypted fashion.
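These two features can be sketched together in a few lines of Python. This is an illustrative toy, not a real protocol: the `Node` class and `replicate` helper are invented for this example, and the payload is just an opaque byte string standing in for a properly end-to-end-encrypted message. It shows why knocking out individual nodes does not erase user data.

```python
import hashlib

class Node:
    """A toy network node that stores encrypted blobs, keyed by content hash."""
    def __init__(self, name: str):
        self.name = name
        self.store = {}

    def put(self, blob: bytes) -> str:
        key = hashlib.sha256(blob).hexdigest()
        self.store[key] = blob
        return key

    def get(self, key: str):
        return self.store.get(key)

def replicate(blob: bytes, nodes) -> str:
    """Store the same blob on every node; any single node can serve it."""
    key = ""
    for node in nodes:
        key = node.put(blob)
    return key

nodes = [Node(f"node-{i}") for i in range(5)]
key = replicate(b"<encrypted message bytes>", nodes)

# "Censoring" two nodes leaves the data intact on the survivors.
survivors = nodes[2:]
assert all(n.get(key) == b"<encrypted message bytes>" for n in survivors)
```

Because the user holds the key that addresses their data, they can fetch it from any surviving node and carry it to a different application entirely.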
In today’s world, centralized corporations are the arbiters of “content morality” for all their users. This can have disastrous consequences, as when the Chinese social media behemoth Weibo briefly banned all LGBT content. Of course, there will always be a need to flag hateful speech or illegal activity, which decentralized networks address with community blocklists, highly transparent ratings, and different versions of applications where users decide what content they find acceptable. The key difference is that in a decentralized paradigm, individuals arbitrate censorship within their particular networks and decide what content is acceptable as part of democratic communities.
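The community-blocklist mechanism can be sketched as an opt-in filter. Everything in this example is hypothetical (the list names, authors, and `filter_feed` helper are invented): the point is that each user chooses which lists apply to their own feed, rather than a central operator choosing for everyone.

```python
# Hypothetical community-maintained blocklists; names and authors are invented.
blocklists = {
    "anti-spam": {"spammer42"},
    "anti-harassment": {"troll99"},
}

def filter_feed(posts, subscriptions, blocklists):
    """Hide posts from authors on any blocklist this user has opted into."""
    blocked = set().union(*(blocklists[name] for name in subscriptions))
    return [post for post in posts if post["author"] not in blocked]

posts = [
    {"author": "spammer42", "text": "buy now!!"},
    {"author": "carol", "text": "hello, world"},
]

# One user opts into the anti-spam list; another opts into nothing.
assert filter_feed(posts, ["anti-spam"], blocklists) == [posts[1]]
assert filter_feed(posts, [], blocklists) == posts
```

The network carries all the content; moderation happens at the edge, per user, and is transparent because anyone can inspect which lists they have subscribed to.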
A final answer? Hopefully
Perhaps unintentionally, Sen. Kamala D. Harris (D-Calif.) recently made a clear real-world case for decentralization: “For too long, traffickers have hidden behind liability protections designed to safeguard free speech. As California Attorney General, I witnessed firsthand the difficulty of charging sex trafficking sites — even for crimes as egregious as pimping minors. That loophole must close,” she said. “I recognize that there are concerns about curbing innovation and free expression online. But it is simply a false choice to suggest that we either protect victims or we protect free speech. We can and must do both.”
Whitelists, blocklists, and hyper-transparency into applications help communities democratically handle bad actors and tailor their own online experience. A decentralized network puts online experiences in the hands of the people, because the network is unblockable and the data is fully user-owned.
With decentralized communication and social networking tools, we really can do both.