SERIES INTRODUCTION
Rebecca MacKinnon’s “Consent of the Networked: The Worldwide Struggle for Internet Freedom” is an interesting read on free speech on the internet, set in a world where corporations are challenging the sovereignty of governments. Having read the book, I will be familiarizing readers with some of the themes and ideas discussed in MacKinnon’s work.
In Part I, we discussed censorship in the context of authoritarian governments.
In Part II, we will be dealing with the practices of democratic governments vis-à-vis online speech.
In Part III, we shall discuss the influence of corporations on online speech.
Essentially, the discussion will revolve around the interactions between three stakeholders: netizens, corporations providing internet-based products, and governments (both autocratic and democratic). Each of the stakeholders has varied interests or agendas and works with or against the others depending on the situation.
Governments wish to control corporations’ online platforms to pursue political agendas, while corporations wish to attract users and generate profits, even as they acquiesce to government demands in order to access markets. The ensuing interactions between corporations and governments affect netizens’ online civil liberties across the world.
DEMOCRATIC GOVERNMENTS
In this section, we will be dealing with the actions of democratic governments and their effects on online speech.
MacKinnon notes that apart from authoritarian governments, even democratic institutions, albeit to a lesser degree, indulge in activities that are detrimental to free speech online. For instance, after the U.S. learnt that the Chinese government had access to a “kill switch” that would allow it to terminate all access to the internet in its territory, the U.S. legislature attempted to pass legislation that would provide the U.S. government with a similar capability. Though the legislation wasn’t passed, the attempt shows that there exist voices within democratic set-ups that seek governmental power over cyberspace.
Further, corporations in the U.S. might be asked to comply with warrantless demands for information or surveillance, with no recourse in law available to them. These corporations might even be asked to comply with specific “requests” from the government. For instance, Amazon was initially hosting WikiLeaks, but allegedly backed out under U.S. pressure. It is pertinent to note that such pressure from the government, exerted in an opaque manner, is problematic, as such actions sidestep Due Process.
The Panopticon Effect has consequences in democratic countries too. If government actions are opaque, citizens will be unaware of the breadth of surveillance and, believing that they could be watched at all times, will alter their behaviour accordingly.
Anonymity, Corporate Policing and Legitimization of Authoritarian Censorship
In addition to such opaque measures, democratic institutions also engage in legal censorship, which MacKinnon refers to as “Democratic Censorship”. The essential concern democratic countries face is balancing the curtailment of online criminality and problematic speech (e.g. child pornography) against safeguarding the civil liberties of other netizens. Issues relevant to this balancing include anonymity, corporate policing of platforms and the legitimization of authoritarian censorship.
The issue of anonymity features prominently in discussions balancing online privacy with online safety. While requiring netizens to identify themselves online would make them more accountable for their online transgressions, netizens engaged in political activity, fearing social sanctions (e.g. being judged for anti-abortion speech), might refrain from posting. Without the option of anonymity, cyberspace would cease to serve as a platform for unpopular speech. Further, a government, generally influenced by majoritarian views, cannot be expected to regulate without bias. Hence, any requirement of non-anonymity can serve as a potential tool for censorship even in democratic set-ups.
To protect netizens from problematic speech (e.g. child pornography), the government tasks the private sector with policing its platforms. For instance, Google is expected to screen its video-sharing platform, YouTube, for problematic speech. Legislating “Intermediary Liability” is one possible method of ensuring that corporations police their platforms, as intermediary liability laws make a corporation liable for problematic speech found on its platform. In Italy, Google executives were sentenced to prison for failing to prevent the uploading of a video of an autistic child, thereby violating the child’s privacy.
What are the consequences of requiring corporations to police their platforms?
First, issues of legitimacy arise. Should an entity that isn’t accountable to the public at all be given the authority to act as a gatekeeper for content? Customers’ accounts are intruded upon and regulated by entities that aren’t accountable to the public. We will revisit this argument in Part III.
Consider the case of the Internet Watch Foundation (IWF), an organization that maintains an updated list of websites it considers objectionable. U.K.-based Internet Service Providers use the list, of their own volition, to block access to the listed websites. It isn’t MacKinnon’s contention that the IWF is a fraud, but the example showcases the immense power that private entities could exercise over online speech and the absence of accountability measures.
Second, imposing liability on corporations for failing to remove problematic speech would push them towards extreme caution in screening content. In other words, corporations, in their zeal to avoid any liability whatsoever, would be inclined to block all content that seems problematic, even if it isn’t actually problematic. Hence, content that shouldn’t be blocked might be. The result is “collateral filtering”: the blockage of content that the regulator never actually intended to block. For instance, if the word “sex” were flagged for blocking to weed out pornographic websites, even content relating to health and marriage that uses the word would get blocked, as in the sketch below.
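To make the mechanism concrete, here is a minimal, purely illustrative sketch of naive keyword-based filtering in Python; the blocklist keyword and the sample pages are hypothetical and are not drawn from MacKinnon’s book:

# Purely illustrative sketch of naive keyword-based filtering.
# The flagged keyword and the sample pages below are hypothetical.

BLOCKED_KEYWORDS = {"sex"}  # a regulator flags this word to weed out pornography

def is_blocked(page_text: str) -> bool:
    """Block a page if any flagged keyword appears anywhere in its text."""
    words = page_text.lower().split()
    return any(keyword in words for keyword in BLOCKED_KEYWORDS)

pages = {
    "adult site": "explicit sex content",
    "health clinic": "advice on safe sex and reproductive health",
    "marriage counselling": "communication and sex in a healthy marriage",
}

for title, text in pages.items():
    print(title, "->", "BLOCKED" if is_blocked(text) else "allowed")

# All three pages are blocked. Only the first is the regulator's intended
# target; the health and marriage pages are "collateral filtering".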
Third, the intermediary liability model pushes corporations to adopt the practice of blocking potentially problematic content at the outset and reviewing the block, if necessary, much later. Such a practice runs contrary to one of the foundational principles of Due Process, i.e. “innocent until proven guilty”. Additionally, such a practice is especially detrimental to the efficacy of political speech, which often loses its impact with the passage of time. For instance, if a journalist writes a scathing article on the government for fuelling communal riots, the article will have its maximum impact when published during or shortly after the riots, when the issue is fresh in the minds of readers. Therefore, even if corporations republish content upon review, the content may have lost its potency.
Hence, there exist various problems with requiring corporations to police their platforms.
Moving on, MacKinnon argues that “Democratic Censorship” also leads to legitimization of censorship policies of authoritarian governments.
In this regard, the U.S. government’s actions in the realm of intellectual property are especially problematic. In their zeal to protect copyrights, the U.S. and other countries have overlooked Due Process. For instance, WikiLeaks revealed that the U.S. and 34 other countries were negotiating an international treaty, the Anti-Counterfeiting Trade Agreement, which required intermediaries to police their platforms and remove content without having to prove violation.
When democratic governments eschew Due Process in this manner, they legitimize the actions of authoritarian governments, allowing those governments to claim that their internet policy is in accordance with international standards. When the U.S. legislature, pushed by lobbyists, sacrificed civil liberties to protect intellectual property rights, it gave the Chinese and the Russians cover for suppressing dissent. For example, Russian authorities clamped down on dissenters by taking them to task for violating Microsoft’s copyright. As an aside, it is heart-warming to note that Microsoft changed its policy after the event.
Conclusion
In Part II, we have attempted to (a) understand the pressures that democratic governments place on corporations, (b) understand “democratic censorship” and democracies’ attempts to balance measures against problematic speech with the protection of netizens’ civil liberties, (c) understand “intermediary liability” and “collateral filtering”, and (d) understand the dilution of Due Process in democracies and how that dilution legitimizes the censorship policies of authoritarian regimes.
In Part III, we will analyse the influence of corporations on online speech.