Deconstructing Surveillance Week 6 — Student Projects

Nick Rabb
7 min read · Oct 10, 2023

As a reminder, this is part of a series of write-ups based on a class I’m teaching at Tufts University called Data and Power: Deconstructing Surveillance. If you would like more of a description of what the class is, check out the post from Week 1. If you want to keep up with this series, you can subscribe to receive email updates or follow along on Medium, where the write-ups are posted!

It is always a treat to see students present their findings on a topic that interests them. This week in class, we did just that, as students had spent the previous week working on a mid-semester project, deconstructing and analyzing a surveillance system of their choosing. We had a great time learning about each other’s work.

For the broader community that reads these write-ups, I want to share back some of the student projects. All of the projects were wonderful, but of course, I cannot share all of them. I have selected three projects that I think will interest readers of this series: emotional recognition AI, PayPal, and Project SHAMROCK. Each highlights interesting aspects of technological surveillance and its effects on society, but each also offers a slightly different take on surveillance. What each group found interested me, and I hope it interests you as well.

Emotional Recognition AI

Our first group focused on emotional recognition AI — systems that essentially scan faces to determine what emotion someone is feeling. They trace its origins back to the MIT researcher Rosalind Picard, who became well known for pioneering the idea of “affective computing.” The general idea was that human-computer interaction could benefit from a computer being able to detect a user’s affect and adjust its interface accordingly.

However, the group makes clear that emotion recognition systems have drifted far from the goals of affective computing. Companies and government organizations now use facial analysis systems to attempt to detect emotion, and they deploy that technology to further their own goals. Workplaces use emotion recognition to monitor workers on the job, trying to detect their level of engagement. Microsoft tried to develop a technology for blind and vision-impaired users that would describe scenes for them, including the emotions of others in the scene. The Chinese government has used emotion recognition in Xinjiang province to police the population.

This group was quick to note that there could be plenty of problematic biases in a system like this. The first is that the training data used to build these machine learning models may not represent every population, and different populations may express feelings differently. Another is that any use of emotional recognition AI by already oppressive organizations can simply further oppression by giving them more power. They also note that they are skeptical as to whether a machine learning system could ever detect emotion at all, pointing to the fraught history of the polygraph as a failed lie-detection technology.
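To make the first concern concrete, here is a minimal sketch using synthetic data I made up for illustration — it is not from the students’ project and does not reflect any real emotion recognition product. It shows how a classifier trained mostly on one group can perform far worse on a group that expresses the same emotion differently:

```python
# Hypothetical illustration of under-representation bias: synthetic
# "expression features" and group labels, invented for this sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic features for a group whose expression of the same
    emotion is shifted relative to the majority group."""
    X = rng.normal(0, 1, size=(n, 5))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # 1 = "happy", 0 = "not happy"
    X[:, 0] += shift * y                     # this group expresses it differently
    return X, y

# Training set dominated by group A; group B is barely represented.
Xa, ya = make_group(2000, shift=0.0)
Xb, yb = make_group(50, shift=-2.0)
clf = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on fresh samples from each group.
Xa_test, ya_test = make_group(1000, shift=0.0)
Xb_test, yb_test = make_group(1000, shift=-2.0)
print("accuracy, group A:", clf.score(Xa_test, ya_test))
print("accuracy, group B:", clf.score(Xb_test, yb_test))
```

The gap between the two printed accuracies is the point: nothing in the model is explicitly “biased,” yet the under-represented group systematically gets worse results.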

In total, I think this group chose a very interesting technology to analyze because it has several layers that are subject to critique. Researchers of emotion have often criticized this kind of technology as total pseudo-science, as emotions are far more complex than visual cues from a face or body. However, even an imperfect technology can be used by organizations for whatever purpose they deem necessary. Thus, even if the software worked perfectly, companies, police, and governments could still use it in an oppressive manner. This type of sociotechnical interaction is exactly what this course aims to examine, so this technology stood as a perfect example of the multifaceted nature of systems that seem purely technological.

PayPal

A different group chose to do their project on PayPal, the very popular money transfer service, originally called “Confinity” before becoming PayPal after a merger with Elon Musk’s company X.com. This system is different, as it does not surveil in the sense that would typically be imagined. PayPal engages in a type of surveillance more in line with Zuboff’s surveillance capitalism: tracking user behavior, inferring characteristics or future behaviors from that activity, and disclosing those predictions to others.

This group took the time to read through PayPal’s privacy policy, and concluded that it is full of language that enables the company to make predictions about you and give them to third-party companies. Once the data is acquired by a third party, PayPal no longer takes responsibility for what is done with it. Moreover, PayPal asks other third-party companies for information about you to improve its own predictions. It does this type of data analysis under the guise of detecting fraudulent purchases on your account, justifying it with the language of user safety.
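As a purely hypothetical sketch of the track–infer–disclose loop the group described — the categories, rules, and “partner” below are invented for illustration and do not come from PayPal’s policy or systems — the pattern looks something like this:

```python
# Toy illustration of surveillance-capitalism-style inference.
# Everything here (data, rule, partner) is made up.
from collections import Counter

# Step 1: track. Each record is (user_id, merchant_category, amount).
transactions = [
    ("u1", "baby_supplies", 42.10),
    ("u1", "baby_supplies", 18.99),
    ("u1", "groceries", 67.50),
    ("u2", "electronics", 899.00),
]

# Step 2: infer. A crude rule that guesses a life circumstance from spending.
def infer_traits(user_id):
    cats = Counter(c for uid, c, _ in transactions if uid == user_id)
    traits = []
    if cats["baby_supplies"] >= 2:
        traits.append("likely_new_parent")
    return traits

# Step 3: disclose. A prediction about the user, not the raw data,
# is what gets packaged up for a third party.
def disclose_to_partner(user_id):
    return {"user": user_id, "predicted_traits": infer_traits(user_id)}

print(disclose_to_partner("u1"))
```

The notable step is the third one: what leaves the platform is an inference about you rather than the transactions themselves, which is exactly the kind of prediction-sharing the group found the policy language making room for.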

I enjoyed this project because it touches on several important aspects of modern surveillance as it is embedded in data-driven platforms. The company obviously has a lot of power over users in terms of what it can do with their data, the predictions it can make, and the uses it entitles itself to make of those predictions. It fits into an ecosystem of other technologies that may, for example, in turn end up serving users advertisements that shape their thoughts and behaviors. Or this data may be used by banks or credit agencies to approve or deny loans. The scary part is that nobody tells you where this data is going or how it is being used. The front-facing financial transaction application is just the tip of the iceberg in terms of how data is used in conjunction with social systems.

Project SHAMROCK

Finally, another group chose to examine a U.S. government surveillance program called Project SHAMROCK. This system, in contrast to the others above, is historical and did not interact much with data-driven technology as we think of it today. But the ways that this system used older technology to do surveillance mirror what we see in modern technologies.

This surveillance program monitored telegraphic communications entering or leaving the U.S., allowing government agencies to monitor communications for purposes of national security. It was created at the conclusion of the Second World War so that the U.S. could intercept foreign intelligence to use for its war aims. However, the project continued past the end of the war and into the Cold War era, where it was turned on the domestic population to identify communists and enemies of the state. The project remained largely secret until it was revealed to the public during the infamous Church Committee hearings in the mid-1970s.

Here is a perfect example of disciplinary power in action, even when used in secret. Individuals targeted by domestic surveillance through SHAMROCK could be subject to disciplinary action, creating a norm of the “good citizen” who poses no threat to the state. The group noted that this program disproportionately affected certain groups of people, including anti-war protesters and Black activists. Thus, the government pushed a normative vision of the world in which disagreeing with its wars or racist policies makes you a criminal.

This program is also an interesting example of the interplay between government and corporate power. Telecommunications companies cooperated with the government by handing over copies of the telegrams that passed through their networks. For them, the relationship likely meant a good deal of profit through large government contracts. For the state, appeals to “national security” were likely used to sway companies into supporting it. There are obvious parallels in our society today, where companies like Amazon, Google, Palantir, and others work closely with the U.S. government to create systems that enforce immigration borders, bolster police surveillance capacities, and serve current war efforts.

For next week

Community announcements

None! This week there were no announcements sent in.

Readings and reflections

Next week, we are back to our normal schedule of reading and watching content to begin the “practice” portion of our course. First, we will watch an excellent talk by Ruha Benjamin called “Reimagining the Default Settings of Technology & Society.” We will also watch a snippet of a talk from Angela Davis, titled “How Does Change Happen?” (from 4:00 to 7:45). Then, we will read chapters 1 and 2 of Data Feminism by Catherine D’Ignazio and Lauren Klein.

Thanks for reading along, and as a reminder, if you want to keep following these weekly updates or share them with friends, either follow my blog on Medium, or subscribe to the email list via this Google Form.
