Perceptions of Systems of Justice over the Internet.
Final critical text on Obscurity and the Right to Remove, 2018.



Introduction to Perceptions of Freedom of Information: the Right to Be Forgotten, free speech, privacy, and transparency.


As an Internet artist and activist, I see the Right to Be Forgotten at the intersection of crucial ethical and philosophical questions. With the projects Obscurity and Right to Remove, I investigate critical cultural discourses on the conflicts and contradictions of judgment, access, freedom, and responsibility surrounding private information exposed on the Internet.

The idea of the Right to Be Forgotten has been popularized by the press, yet the term itself is misleading. In fact, it is simply a privacy regulation that allows citizens to delist search engine results containing their personal information. The legal definition originated in France as “the right to oblivion,” which is often translated as “the right to obscurity.” I coined the term Right to Remove to provide a broader understanding of this principle, and I drafted an Internet privacy policy that I promote in the United States and beyond through the site Right2Remove.us.

The Right to Be Forgotten is a legal framework introduced in Europe in 2014, and it is now part of the General Data Protection Regulation (GDPR). It has also been adopted in other countries, such as Argentina, India, and South Korea. However, so far this right has been rejected in the United States, Japan, and China, where opponents consider it a threat to journalism, freedom of speech, and the Internet itself.

This legal right has created two main controversies: the fear that someone could potentially censor information, and the fact that it turns search engine companies into judges that decide opaquely on the removal of information about citizens. With the Right to Be Forgotten, free speech, privacy, and transparency arguments are at odds.


Critique of Systems of Justice over the Internet.

As an artist, I wanted to discuss the Right to Be Forgotten in order to investigate the cultural, political, and legal philosophy of the Internet. In this particular case, I am interested in how social perceptions of justice and judgment on the Internet are shifting. Through advocating for the obscuring of criminal records, I explore the crucial cultural conflicts among the right to know, the right to privacy, and the right to explanation.

Social norms embedded in our culture shift and adapt to the conditions of the Internet. For instance, being able to find and expose information on everyone may produce a feeling of empowerment, since it allows people to bring anyone to justice by permanently revealing negative information about them. The reality is that mediated information about someone will always be deceptive; this is especially true on the Internet, where the nuances of complex social contexts are reduced. Furthermore, publicly shaming someone can escalate conflict and strip away human respect and diplomacy. On the Internet such incidents are commonplace, and there is a sort of irrational fear of not being able to find information on someone. On the other hand, there is an even greater fear of being judged and slandered.

These cultural perceptions and the resulting personal emotions are the products of our inexperience with, and undeveloped understanding of, the environment of the Internet. This environment is still fairly new to humans; we are like teenagers entering the stage of discovering and confronting society. As we grow up, the idea of freedom of information over the Internet is being redefined: we learn how to negotiate personal freedom with the rest of society and to understand how society affects our freedom. We become more sophisticated at distinguishing what is significant to tell from what is merely harmful to people or unnecessary.

Human civilization is engaged in a constant process of learning ethical principles for laws and a culture that can help individuals live peacefully and respectfully with each other. For instance, we have developed notions of mercy in religions, post-war reconciliation, the right to a fair trial, the expungement of past offenses, or simply giving a second chance to a friend. However, on the Internet people seem not to accept those notions, preferring instead to deliver crude justice while hiding behind a screen. This undeveloped Internet culture, detached from the reality of human civilization, slows policies for the right to remove harmful information and for a fair judicial system to administer it.

Surely we have felt empowered by this new medium of the Internet, an instrument that finally gave speech rights to everyone. However, as we learn to speak in groups, we should develop languages for moderation, environments for civil debate, and democratic systems for judging information that can harm people. These developments cannot be managed by law enforcement agencies, private enterprises, or black-box machines, all of which are prone to misuse, bias, and negligence. Censorship, discrimination, and inequality can indeed be produced by the mismanagement of sensitive data if there is no democratic oversight and participation.

Private Internet companies have a major role in shaping these social processes: they have total control over the social instruments, and they manipulate their images in order to present themselves as public services and as custodians of justice and freedom on the Internet and beyond. They oppose regulation and promote the idea that the Internet will regulate itself, when, as with the economy, the government, and the media industry, humans have always required regulations to maintain the balance of our society.

So far, Google, the major Internet company, has refused to implement the Right to Be Forgotten in the United States, and it alone judges what gets removed and what stays exposed. Similarly, social media companies such as Facebook and Twitter filter and delete content undemocratically and without transparency. These Internet platforms aren’t just the major gatekeepers of what we see and know; they have also become a parapolitical suprastructure that plays a judicial role in society, exercising mass surveillance and psychological propaganda and influencing the rule of law, while branding themselves as guardians of free speech.

Nevertheless, this is not about censoring or deleting information; it is about removing it from private platforms that are economically motivated to capture and expose as much information as they can. A right to remove information could also be called “the right to accuracy,” since it would allow people to ensure that their sensitive information is correctly contextualized. It is thus a sort of right to speech, in that it empowers everyday people to have a say regarding their own personal information, something that search engine firms actively oppose. The reality is that Internet companies already edit and manipulate search results and post feeds based on the interests of advertisers and the performance of the platform. That is not what we would call freedom of information; it is the manipulation, withdrawal, and control of information by an authoritative centralized power.

Technological determinism can be fatal if not taken seriously. Internet companies need to be regulated through global governance, and it is toxic to think that they will always change us and themselves beyond our control. Technology is not ungovernable; we are in control. We have always fought to be in control of our lives and our society. Information ecology is not about technological innovation. It is, rather, a cultural, ethical, educational, philosophical, legal, and political field. The consequences of polluted information have very material effects on people’s lives, the social fabric, and the integrity of society as a whole.

For instance, slander, harassment, and hate speech on the Internet are often used to stir emotions and fear, both by bad actors and by platforms invested in generating voyeurism or populism. Instead, platforms could promote polite discourse and control over gratuitously harmful information, while educating people about the contexts in which controversial information circulates, which could resolve the debate over free speech. On Internet platforms it is not always a question of the free expression of personal, political, or religious ideas: speech can be weaponized for intimidation, harassment, and abuse. A contemporary understanding of free speech needs to recognize the use of powerful devices for the targeting of minorities and of vulnerable and harmless individuals. As modes of speech become more sophisticated, we need more sophistication in defending and understanding free speech.

Extending the frontier of a global right to remove sensitive personal information calls for a profound reflection on human rights on the Internet, one that entails civic engagement and a democratic process for significant change.

By Paolo Cirio, June 2018.


