
Friction-Free Racism

Friction-Free Racism by Chris Gilliard (Real Life)
Surveillance capitalism turns a profit by making people more comfortable with discrimination
Chris Gilliard in Real Life Magazine. All annotations in context.

Questions about the inclusivity of engineering and computer science departments have been going on for quite some time. Several current “innovations” coming out of these fields, many rooted in facial recognition, are indicative of how scientific racism has long been embedded in apparently neutral attempts to measure people — a “new” spin on age-old notions of phrenology and biological determinism, updated with digital capabilities.

This speaks to the need for diverse individuals in engineering, computer science, and STEM fields as these technological devices become ubiquitous in our lives.

Only the most mundane uses of biometrics and facial recognition are concerned with only identifying a specific person, matching a name to a face or using a face to unlock a phone. Typically these systems are invested in taking the extra steps of assigning a subject to an identity category in terms of race, ethnicity, gender, sexuality, and matching those categories with guesses about emotions, intentions, relationships, and character to shore up forms of discrimination, both judicial and economic.

Points about the use of technology as a means to identify and differentiate between groups, specifically in terms of race.

A key to Browne’s book is her detailed look at the way that black bodies have consistently been surveilled in America: The technologies change, but the process remains the same. Browne identifies contemporary practices like facial recognition as digital epidermalization: “the exercise of power cast by the disembodied gaze of certain surveillance technologies (for example, identity card and e-passport verification machines) that can be employed to do the work of alienating the subject by producing a ‘truth’ about the body and one’s identity (or identities) despite the subject’s claims.”

More about coding difference and using it as a means to reinforce the same power structures and ideologies.

Many current digital platforms proceed according to the same process of writing difference onto bodies through a process of data extraction and then using “code” to define who is what. Such acts of biometric determinism fit with what has been called surveillance capitalism, defined by Shoshana Zuboff as “the monetization of free behavioral data acquired through surveillance and sold to entities with interest in your future behavior.”


In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both.

Having just finished White Fragility, this is at the top of my mind right now: the consideration of the systems involved in racism and the codification of these differences, while distancing people from the system so they don’t feel like they’re a part of it.

At the same time racism and othering are rendered at the level of code, so certain users can feel innocent and not complicit in it.

Adding algorithms to the model intensifies the problem, as they double and triple down on user signals.

Once products and, more important, people are coded as having certain preferences and tendencies, the feedback loops of algorithmic systems will work to reinforce these often flawed and discriminatory assumptions. The presupposed problem of difference will become even more entrenched, the chasms between people will widen.

This makes me think of a recent piece examining how our social media feeds reify the powerful by using algorithms to modify what appears in them.

What would it look like to be constantly coded as different in a hyper-surveilled society — one where there was large-scale deployment of surveillant technologies with persistent “digital epidermalization” writing identity on to every body within the scope of its gaze?


Proponents of persistent surveillance articulate some form of this question often and conclude that a more surveillant society is a safer one. My answer is quite different. We have seen on many occasions that more and better surveillance doesn’t equal more equitable or just outcomes, and often results in discrimination being blamed on the algorithm. Further, these technological solutions can render the bias invisible.

A powerful takeaway from Gilliard that will resonate with me for some time.

The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.

