Aljumaine Gayle

Producer & Design Technologist.

Twenty Four Seven / 365.
 

As we go about our daily lives, we enter into and are confronted by spaces that are surveilled without our consent and without consideration for privacy. As technology has advanced, it has become a tool for regulating our behaviour, whether we notice it or not. Privacy is particularly compromised for many Black communities and communities of colour. Police departments and governments adopt technologies to monitor, track and punish those communities. Twenty Four Seven / 365 is an interactive installation that illustrates the rising dangers of surveillance technology for communities of colour. Participants walk through an audio-visual installation that provokes reflection on privacy and non-consensual surveillance. The purpose of this installation is to start an accessible and inclusive conversation about surveillance and the technologies that enable it.

Where are the race-based surveillance statistics? Why are they not made available to the public?

Who has all of the race-based statistics on surveillance of Black, Indigenous and communities of colour in Canada?

Do you want to live in a future where you have no right to control your privacy in public based on your ethnicity, class, education, political affiliation and human rights?

What could a future look like where we have the power to protect our privacy in public from large corporations and the state?

“In what could be taken as the founding statement of computational thought, John von Neumann wrote: ‘All stable processes we shall predict. All unstable processes we shall control’” (Bridle 14). An immersive, multimedia installation, Aljumaine Gayle’s Twenty Four Seven / 365 explores state control and surveillance of communities of colour, specifically Black and Indigenous communities, through technological structures.

As participants navigate the installation, they encounter themselves in a series of digital forms and sketches that simultaneously shroud and reveal. These distortions and abstractions demonstrate the faulty logic inherent in the notion of surveillance as safety. Four depth cameras track and display participants’ movements through the installation; by moving their bodies, participants are encouraged to manipulate the sketches and create abstractions of their surveilled projections. The depth cameras are situated within the gallery space in a manner that is not immediately obvious to the viewer, and the projections are placed on the walls above, displaying a top-down perspective similar to that of a surveillance camera. Participants must watch themselves within the space in order to become aware of their surroundings.
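
To make that pipeline concrete, here is a minimal sketch of one way a depth-camera feed can be turned into an abstracted, surveillance-style projection. It assumes an Intel RealSense camera with the pyrealsense2 and OpenCV libraries; the installation's actual hardware, sketches, and rendering code are not documented here.

```python
import numpy as np
import cv2
import pyrealsense2 as rs

# Stream 640x480 depth frames at 30 fps from a RealSense camera
# (a hypothetical stand-in for the installation's four depth cameras).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue

        # Raw 16-bit depth, scaled down to an 8-bit image.
        depth_image = np.asanyarray(depth.get_data())
        scaled = cv2.convertScaleAbs(depth_image, alpha=0.03)

        # Posterize into a few coarse bands: the body reads as a
        # shifting silhouette that both shrouds and reveals.
        abstraction = (scaled // 64) * 64
        abstraction = cv2.applyColorMap(abstraction, cv2.COLORMAP_BONE)

        cv2.imshow("surveilled projection", abstraction)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    pipeline.stop()
    cv2.destroyAllWindows()
```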

Rooted in surveillance technologies such as computer vision, facial recognition, Amazon Rekognition, and ShotSpotter is the assumption that, by implementing a surveillance framework, the state can detect crime and be seen to be actively reducing crime rates. What you actually have is “racialized surveillance” and “racialized sight”. In Dark Matters: On the Surveillance of Blackness, Simone Browne defines “racializing surveillance” as “a technology of social control where surveillance practices, policies, and performances concern the production of norms pertaining to race and exercise a ‘power to define what is in or out of place’” (16). Racializing surveillance, she continues, “signals those moments when enactments of surveillance reify boundaries, borders, and bodies along racial lines, and where the outcome is often the discriminatory treatment of those who are negatively racialized by such surveillance” (Browne 16).

For example, ShotSpotter is a surveillance technology that collects acoustic data by placing microphones in targeted communities that are predominantly Black. The data that ShotSpotter provides to its clients, such as police departments, is proprietary and not released to the public. Despite being widely adopted by police departments across the United States, ShotSpotter has a rate of error as high as 30 to 70 percent. The Toronto Police Service recently stopped plans to spend $4 million on ShotSpotter after civil liberties groups and community activists raised concerns about privacy and racially targeted policing.
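
Although ShotSpotter's own algorithm is proprietary, acoustic gunshot locators of this kind are generally understood to rely on time-difference-of-arrival multilateration: the same sound reaches each microphone at a slightly different moment, and those offsets are solved for a source location. The toy example below illustrates the technique with simulated data; the sensor layout and numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # metres per second

# Hypothetical grid of four street-level microphones (coordinates in metres).
mics = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_source = np.array([120.0, 340.0])  # simulated gunshot location

# Simulated arrival time of the sound at each microphone.
arrivals = np.linalg.norm(mics - true_source, axis=1) / SPEED_OF_SOUND

def residuals(guess):
    # Compare predicted arrival-time differences (relative to mic 0)
    # with the observed differences.
    predicted = np.linalg.norm(mics - guess, axis=1) / SPEED_OF_SOUND
    return (predicted - predicted[0]) - (arrivals - arrivals[0])

estimate = least_squares(residuals, x0=np.array([250.0, 250.0])).x
print(estimate)  # approximately [120. 340.]
```

In deployment the arrival times are noisy, sensors drift, and echoes off buildings corrupt the measurements, which is one way error rates of the magnitude reported above can arise.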

Another example we can look to is a surveillance technology called the Investigative Case Management (ICM) system, software established and used by the U.S. Immigration and Customs Enforcement agency. “The ICM is a critical component of ICE’s deportation operations—it integrates a vast ecosystem of public and private data to track down immigrants and, in many cases, deport them” (Hao 1). At this time there is very little publicly available information explaining exactly how this technology and its ecosystem work and why it is effective. We can look to the Trump administration’s aggressive campaign to deport DREAMers and immigrants who may be in limbo with respect to their citizenship: “ICE arrests increased 42% compared with the same period in the previous year” (Hao 5). According to civil rights and immigration activists, the ICM is fueling the mass surveillance and targeting of immigrants at an unprecedented scale. Recently there has been rising pressure on big tech companies to be held accountable for their contributions to the development and deployment of technologies that can be used to track, monitor and police Black communities and other communities of colour. Historically, powerful new surveillance technologies left unchecked in the hands of the state have been used to over-police and punish innocent people simply for being Black or Indigenous. Ignoring these rising concerns while tech companies provide technologies to governments and law enforcement agencies will result in white nationalism disguised as crime fighting.
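
The phrase “integrates a vast ecosystem of public and private data” describes record linkage: joining separate databases on shared identifiers until they resolve into a single profile of a person. The fragment below is a toy illustration of that mechanism only; the ICM's actual data sources, schema, and matching logic are not public.

```python
import pandas as pd

# Hypothetical fragments standing in for two unrelated databases:
# a motor-vehicle registry and a commercial utility-records broker.
dmv = pd.DataFrame({
    "name": ["A. Doe", "B. Roe"],
    "dob":  ["1990-01-01", "1985-06-15"],
    "registered_address": ["12 Elm St", "9 Oak Ave"],
})
utility = pd.DataFrame({
    "name": ["A. Doe", "C. Poe"],
    "dob":  ["1990-01-01", "1979-03-02"],
    "service_address": ["14 Birch Rd", "31 Pine Ct"],
})

# Joining on shared identifiers links records across otherwise separate
# systems, yielding both a registered and a current service address
# for the matched individual.
profile = dmv.merge(utility, on=["name", "dob"], how="inner")
print(profile)
```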

Surveillance networks expand through a well-documented process called “crime displacement” (Ball 17): once cameras are installed in a target location, crime is displaced into neighbouring areas. To complicate matters, statistics on the displacement of crime are rarely kept, making it challenging to review and analyze. Whether a surveillance camera is placed on a university campus or in a busy area of downtown Toronto, cameras are used to push purported criminal activity out of sight. Once heavily surveilled areas alienate the communities that frequent them, the desertion of those public spaces invites new cameras to be set up elsewhere, while neither the problematic activity nor its causes are addressed.

Bibliography

Bridle, James. New Dark Age: Technology and the End of the Future. Verso, 2018.

Browne, Simone. Dark Matters: On the Surveillance of Blackness. Duke University Press, 2015.

Hao, Karen. “Amazon Is the Invisible Backbone of ICE’s Immigration Crackdown.” MIT Technology Review, 8 Oct. 2019, https://tinyurl.com/yyt4chac.

Ball, Kirstie, et al., editors. Routledge Handbook of Surveillance Studies. Routledge, 2012, https://tinyurl.com/yyqhk7we.

Additional reading resources:

Technologies of Citizenship: Surveillance and Political Learning in the Welfare System

Ethical aspects of facial recognition systems in public places

Black communities are already living in a tech dystopia

Ferguson is the future

White Supremacy and Artificial Intelligence

These devices may be spying on you even in your own home

Policing Black Lives by Robyn Maynard

Automating Inequality by Virginia Eubanks

Picturing algorithmic surveillance: the politics of facial recognition systems