A few observations about Amazon being urged not to sell its facial recognition tool to police
Amazon’s decision to market a powerful face recognition tool to police is alarming privacy advocates, who say the tech giant’s reach could vastly accelerate a dystopian future in which camera-equipped officers can identify and track people in real time, whether they’re involved in crimes or not.
What is Amazon Rekognition Image?
Rekognition Image is a deep learning powered image recognition service that detects objects, scenes, and faces; extracts text; recognizes celebrities; and identifies inappropriate content in images. It also allows you to search and compare faces. Rekognition Image is based on the same proven, highly scalable, deep learning technology developed by Amazon’s computer vision scientists to analyze billions of images daily for Prime Photos.
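For concreteness, here is a minimal sketch of what this service looks like to a paying customer, using boto3 (AWS's Python SDK). The bucket and file names are hypothetical placeholders, not anything from Amazon's documentation:

```python
# Minimal sketch of two Rekognition Image calls via boto3.
# The S3 bucket and object names are hypothetical placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Detect faces in an image stored in S3; each detected face comes back
# with a bounding box, landmarks, and inferred attributes, all scored
# with confidence values.
faces = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "street-photo.jpg"}},
    Attributes=["ALL"],
)
for face in faces["FaceDetails"]:
    print(face["BoundingBox"], face["Confidence"])

# Compare a face from a source image against all faces in a target image;
# matches above SimilarityThreshold are returned with a similarity score.
matches = client.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "person.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "crowd.jpg"}},
    SimilarityThreshold=80,
)
for match in matches["FaceMatches"]:
    print(match["Similarity"], match["Face"]["BoundingBox"])
```

The point is how little friction there is: a credentialed API call against photos in a storage bucket is all it takes to search a crowd for a face.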
So, let’s get this straight: Amazon trained this algorithm with customers’ photos (Amazon provides a “free” photo storage service to Prime customers) and now they are planning to sell this technology to the police. I often talk about consent in relation to technology, so here are some issues:
- customers must have consented (through an EULA probably) to have their photos used to improve the algorithm
- customers, however, did not consent to having technology created with their photos as material resources (i.e. the training material through which the algorithm learns) sold to the police
- people whose family members, friends, acquaintances etc. took photos of them and uploaded those photos to Prime never consented to having their image used to train these algorithms
- people whose photos were taken in public spaces and eventually uploaded to Prime did not consent to having their image used for corporate profit (further reading: my 2016 essay, Private Internet, Public Streets)
- Amazon will obviously profit from this sale but, as is customary, the people who provided the resources for the training will not see a dime of this profit
- Amazon will not necessarily consent to scrutiny of their algorithm, i.e. to inspection of how its facial recognition performs in relation to race
- people whose photos were stored, and who might object to having them used as part of the surveillance apparatus, will have no say in the final sale
Non-consensual data extractivism as the basis of surveillance structures.
Also interesting is this tidbit from Amazon Rekognition’s page:
Rekognition Image enables you to detect explicit and suggestive content so that you can filter images based on your application requirements. Rekognition provides a hierarchical list of labels with confidence scores to enable fine-grained control over what images you want to allow.
Especially given the precedent of algorithms automatically banning all female nudity regardless of context. Who created Amazon’s “hierarchies”? (A taxonomy by any other name, etc.)
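For reference, this is roughly how that hierarchy surfaces through the API: each moderation label is returned with a confidence score and the name of its parent category, and the caller decides the cutoff. A minimal sketch, again with hypothetical bucket and file names:

```python
# Sketch of Rekognition's content moderation call: labels come back as a
# hierarchy (each label names its parent category) with confidence scores.
# Bucket and file names are hypothetical placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_moderation_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=60,  # caller-chosen threshold for which labels to return
)
for label in response["ModerationLabels"]:
    # ParentName is empty for top-level categories; the taxonomy itself
    # (which bodies and contexts count as "suggestive") is Amazon's.
    print(label["ParentName"], "->", label["Name"], round(label["Confidence"], 1))
```

The developer only picks the confidence threshold; the categories themselves, and what falls under them, were decided elsewhere.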
I keep going back to the issue of taxonomies because they are foundational to the idea of algorithms, and in this case I am particularly interested in how taxonomies have been built in regards to race, gender, body language, etc., especially given the possibility of police using this application to determine who is a potential criminal. Has this algorithm been “taught” what a woman looks like by being trained with stereotypical images of cis women? Has this algorithm been trained to recognize darker skin and, if so, in what context? With the levels of police violence directed at Black people, what safeguards did Amazon take to prevent this tool from being used in a way that negatively impacts this community?
A couple of past reflections for further context:
This thread: “this idea of Big Data as ‘a mythology’ (in the Haraway sense). What does this mythology say about relationships of power and domination in regards to this ‘right to name, designate, categorize etc’ and who wields it over us?”
these taxonomies are centuries old and simply repurposed in new format to continue serving a specific model of vigilance, discipline and control in regards to capitalism and accumulation of resources. Data extractivism is *a thing* not different from other tangibles
— Flavia Dzodan (@redlightvoices) April 18, 2018
I teach a course called "The coloniality of the algorithm" that traces the history of the belief system we actually program into the machines, the taxonomies and epistemes that actually *make* our technology. They are not sentient beings that make us do things.
— Flavia Dzodan (@redlightvoices) March 28, 2018
but I am of course interested in how "Big Data as mythology" and its colonial taxonomies (ie gender, race, class etc) become foundational to our contemporary political systems
— Flavia Dzodan (@redlightvoices) January 25, 2018
UPDATE: Amazon has released a statement: “Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes? Like any of our AWS services, we require our customers to comply with the law and be responsible when using Amazon Rekognition.”