Criminal Data
2022

The video essay Criminal Data—informed by the research, advocacy, and interventions unfolding through my projects Obscurity and Right to Remove—explores the social repercussions of the dissemination of digital criminal records and mugshots on the public Internet. With the artwork Obscurity I aggregated over fifteen million criminal records of people arrested in the United States. I then obfuscated the photos and records of six mugshot websites by cloning them and republishing ten million mugshots blurred, with their related booking data shuffled. I carried out this “hacking” from 2016 to 2019 in order to interfere with search engine results, thus protecting the privacy and dignity of those who are stigmatized by misleading information about their arrests. As a consequence, I have been subjected to legal threats from owners of mugshot websites and have received hundreds of messages with tragic stories from victims of mugshot extortion, to which I responded by offering as much support as I could. Beyond the reporting on mass incarceration, the social experiment, and the performative hack, I ultimately designed the Internet privacy policy Right to Remove, which advocates for the legal right to remove personal information from search engines by adapting a form of the European Union’s Right to Be Forgotten for the United States. For years I collaborated with lawyers, academics, and legislators to promote this privacy policy.
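The blur-and-shuffle technique described above can be illustrated with a minimal sketch in Python. The file names, record fields, and data layout here are hypothetical assumptions made for illustration only; this is not the code of the actual Obscurity project.

```python
# Minimal sketch of the obfuscation idea: blur a booking photo so the face is
# unrecognizable, and shuffle booking data across records so that names,
# charges, and dates are decoupled before the cloned pages are republished.
# All paths and field names below are hypothetical, for illustration only.
import json
import random
from PIL import Image, ImageFilter


def blur_mugshot(src_path: str, dst_path: str, radius: int = 12) -> None:
    """Save a heavily blurred copy of a mugshot."""
    image = Image.open(src_path)
    image.filter(ImageFilter.GaussianBlur(radius)).save(dst_path)


def shuffle_booking_data(records: list[dict]) -> list[dict]:
    """Shuffle each sensitive field independently so records no longer match people."""
    shuffled = [dict(record) for record in records]
    for field in ("name", "charge", "booking_date"):
        values = [record[field] for record in shuffled]
        random.shuffle(values)
        for record, value in zip(shuffled, values):
            record[field] = value
    return shuffled


if __name__ == "__main__":
    blur_mugshot("mugshot_0001.jpg", "mugshot_0001_blurred.jpg")
    with open("bookings.json") as source:
        records = json.load(source)
    print(json.dumps(shuffle_booking_data(records), indent=2))
```

Pages republished this way can still surface in search results, but the blurred photos and shuffled fields no longer tie a name to a charge or a face, which is roughly the interference with search engine results described above.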

Mugshot websites have been exposing photos of people who have been arrested, often just for minor offenses, regardless of the amount of time they spent in jail, even if they were later found innocent or the charges against them were dropped. These websites are designed to embarrass and shame: searchable booking photos can effectively ruin someone’s reputation, attaching social stigmas and attendant prejudices within their communities, families, and workplaces, especially when they are seeking employment, insurance, or credit. The mugshots often depict the most vulnerable members of society: victims of mass incarceration, economic inequality, and racial discrimination, along with people who lack treatment for mental illness in a country with weak welfare policies, harsh law enforcement agencies, and an unforgiving criminal justice system.

The United States has the highest rate of imprisonment in the world. Every year, city and county jails across the country admit between eleven and thirteen million people. Of those entering jail each year, sixty percent have not been convicted of any crime, and seventy-five percent of the non-convicted defendants behind bars are held on non-violent offenses. Over the course of a decade, 101 million arrest records and 45.7 million mugshots have been posted on the Internet by police departments. Many of the people depicted in them have been, or will be, found innocent.

The online mugshot phenomenon received public attention in 2013. Since then, the number of these websites has multiplied, and they continuously change their brand names in order to keep collecting and monetizing mugshots. This monetization takes multiple forms, from charging fees for the removal of images to selling the data for use by law enforcement agencies and for training artificial intelligence systems. In all these years, search engine firms and legislators have adopted only ineffective patches to fix a problem that keeps ruining the lives of millions of U.S. citizens. To this day, there has been no clear federal legislation on the publishing, trading, and public indexing of criminal data.

Currently, some online mugshots are over ten years old and relate to low-level or nonviolent crimes—such as driving without a license, court-related offenses, or soft drug offenses—and the sites make no distinction between people who were convicted, people whose charges were dropped, and people who were acquitted. The data even includes children as young as eleven.

The publication of criminal data online is legal under freedom of information and transparency laws in most U.S. states. Furthermore, several legislators and organizations devoted to freedom of the press have opposed bills that would regulate the publication of mugshots. Search engines such as Google are complicit: they can do what no legislator could, namely demote mugshot sites and thus reduce, if not eliminate, their power to stigmatize, yet they choose not to.

Making personal information concerning ordinary citizens and those most vulnerable available on search engines violates their dignity, security, and right to privacy. This project supports the so-called Right to Be Forgotten, originally a European privacy policy that allows individuals to remove sensitive information about themselves by requiring that search engines comply with requests to remove personal information from their results.

As an Internet artist and activist, I position the Right to Be Forgotten at the intersection of crucial ethical and philosophical questions. With the projects Obscurity and Right to Remove, I researched cultural and critical discourses on the conflicts and contradictions of judgment, access, freedom, and responsibility for sensitive and misleading information exposed on the Internet.

I wanted to discuss the Right to Be Forgotten in order to investigate the cultural, political, and legal philosophies of the Internet. In this particular case regarding criminal data, I am interested in how social perception shifts concerning justice and judgment on the Internet. Through advocating for obscuring criminal records, I explore the crucial cultural conflicts between the ideas of the right to know, the right to privacy, and the right to explanation.

Human civilization is engaged in a constant process of learning ethical principles for public policy and a culture that helps individuals live peacefully and respectfully with one another. For instance, we have developed notions of mercy in religions, post-war reconciliation, the right to a fair trial, the expungement of past offenses, and simply giving a second chance to a friend. However, on the Internet, it seems that people do not accept those notions, instead preferring the ability to deliver crude justice while hiding behind a screen. This undeveloped Internet culture—detached from the reality of human civilization—slows the adoption of policies on the right to remove harmful information and of a fair judicial process for exercising it.

These developments cannot be managed by law enforcement agencies, private enterprises, or black box machines, which are all prone to misuse, bias, and negligence. Censorship, discrimination, and inequality can indeed be produced by mismanagement of sensitive data if there isn’t democratic oversight and participation.

Private Internet companies have a major role in shaping these social processes: they have total control over our instruments of communication and manipulate their public image in order to present themselves as public services and custodians of justice and freedom on the Internet and beyond. They oppose regulation and promote the idea that the Internet will regulate itself when—as with the economy, the government, and the media industry—humans have always required regulations to maintain the balance of power in our society.

So far, Google—the major Internet search engine company—has refused to implement the Right to Be Forgotten in the United States, while acting as the judge of what gets removed or stays exposed in Europe. Similarly, social media companies such as Facebook and Twitter filter and delete content undemocratically and without transparency. These Internet platforms aren’t just the major gatekeepers of what we see and know; they have also become a parapolitical suprastructure, playing a judicial role in society, exercising mass surveillance, spreading psychological propaganda, and influencing the rule of law, all while branding themselves as guardians of free speech.

Nevertheless, this is not about censoring or deleting information; it is about removing it from private platforms economically motivated to capture and expose as much information as they can. A right to remove information could also be called “the right to accuracy,” since it would allow people to ensure that their sensitive information is correctly contextualized. Thus, it enacts a sort of right to speech, in the sense of empowering everyday people to have a say regarding their own personal information, something that search engine firms actively oppose. The reality is that Internet companies edit and manipulate search results and post feeds based on the interests of advertisers and the performance of the platform, which is not what we would call freedom of information; instead, it is actual manipulation, withdrawal, and control of information by an authoritative centralized power.

Technological determinism can be fatal if not taken seriously. Internet companies need to be regulated through global governance, and it’s toxic to think that they will always change us and themselves beyond our control. Technology is not ungovernable. We are in control; we have always fought to be in control of our lives and our society. Information ecology is not about technological innovations; rather, it is a cultural, ethical, educational, philosophical, legal, and political field. The consequences of polluted information have very material effects on the lives of people, the social fabric, and the integrity of society as a whole.

Slander, harassment, and hate speech on the Internet are often used to stir emotions and fear, both by bad actors and by platforms invested in generating voyeurism or populism. Instead, platforms could promote mediation and control over gratuitously harmful information while educating people about the context in which information circulates, which could help resolve debates over free speech. On Internet platforms, it is not always a question of the free expression of personal, political, or religious ideas: speech can be weaponized for intimidation, harassment, and abuse. A contemporary understanding of free speech needs to recognize how powerful tools are used to monetize stigmatizing criminal records and to target minorities and other vulnerable, harmless individuals. As modes of speech become more sophisticated, we need more sophistication in defending and understanding free speech.

Extending the frontier of a global right to remove sensitive personal information calls for a profound reflection on human rights on the Internet, one that entails civic engagement and a democratic process for significant change.

— Paolo Cirio
