Deconstructing Surveillance Week 10 — Inside Resistance, Design Justice

Nick Rabb
10 min read · Nov 6, 2023


As a reminder, this is part of a series of write-ups based on a class I’m teaching at Tufts University called Data and Power: Deconstructing Surveillance. If you would like more of a description of the class, check out the post from Week 1. If you want to keep up with this series, you can subscribe to receive email updates or follow along on Medium, where the write-ups are posted!

This week is our last week of content and the last week of the “practice” section (our first section was called “problem”). We end by looking at a case that is perhaps the strongest culmination of every aspect of the problem we identified: surveillance in the Occupied Territories in Palestine. This case study, a current and ongoing surveillance program, is one of the most instructive in the world, as it sits at the cutting edge of digital surveillance systems deployed against a racialized population for the purpose of state-sponsored control under occupation. Moreover, it is a modern system that has been resisted both by the Palestinians subjected to it and by tech workers who organized to stop their work from being used for systematic oppression.

By analyzing this particular system (which was part of the curriculum before the conflict in Palestine escalated in early October), we are, of course, wading into charged waters. While the analysis of these systems can be viewed as somewhat separate from what is happening now, it must be acknowledged that they are key parts of the slow violence and oppression that contributed to later acts of acute violence. The conflict in Palestine today is in part a consequence of longstanding oppressive surveillance, a form of slow violence that backs people into an ever-shrinking corner. I encourage all of you reading this, regardless of your judgement of the current conflict, to keep an open mind while reading the discussion of surveillance in Palestine, to set aside ideological baggage, and to try to cut through any deceptions along the way to see the situation clearly.

We end the week by revisiting a conversation we began this section with: designing technology that is in the interest of justice. We dive into content from Sasha Costanza-Chock’s Design Justice, re-read a chapter from D’Ignazio and Klein’s Data Feminism, and add an article from the Just Tech project by Nassim Parvin. This will conclude our “practice” section, as we will try to synthesize the content we grappled with and think about what it means to be an ethical practitioner and user of data technology, particularly in the face of surveillance logics.

Surveillance in the Occupied Palestinian Territories

This content revolves primarily around two specific systems used by the Israeli military (the Israel Defense Forces, or IDF) in the Palestinian territories it occupies: Red/Blue Wolf and Project Nimbus.

Red Wolf (formerly Blue Wolf) is a surveillance system that uses facial recognition to bolster surveillance of Palestinians in the Occupied Territories. Prior to this system, Palestinians in the Occupied Territories were already subject to movement restrictions through other means: permits, blockades, fences, and military checkpoints. While the Israeli government justified these restrictions in the name of counter-terrorism, Palestinians experience them as dehumanization, constant fear, and frustration as they are controlled at every turn. Many outside observers have argued that these systems perpetuate Israel’s project of seizing territory from Palestinians to expand its state, a process of settler colonialism.

Blue Wolf and Red Wolf were designed to add a new, digital layer to these restrictions. Facial and biometric data were systematically collected, without consent, at checkpoints and by IDF soldiers photographing Palestinians as they patrolled the Occupied Territories. Cameras were set up throughout these areas, subjecting Palestinians to constant surveillance no matter where they go. All of this data sits in a database that can be used at checkpoints and by patrolling soldiers to deny Palestinians access to certain areas, subject them to searches, and keep them “in line.”

Taken from Amnesty International’s “How Surveillance Tech is Used to Oppress Palestinians Through Apartheid”

The other part of this widespread system of surveillance is Project Nimbus, a 2021 contract between Google and the Israeli government intended to “provide the government, the defense establishment and others with an all-encompassing cloud solution.” In a nutshell, this means the IDF would additionally gain access to AI capabilities Google is developing, such as emotion detection, object detection, and the training of predictive models for virtually any purpose through the AutoML platform. It should be noted that emotion detection is a technology that has repeatedly been shown to be unreliable and to lack scientific validity.

Used in combination, these systems mean that the IDF would have a sprawling database of Palestinians’ faces and biometrics; a pervasive network of security cameras and IDF soldiers with smartphones feeding in real-time video of Palestinians; and AI algorithms evaluating these feeds to detect features of interest to the IDF: malicious intent, lying, suspicious activity. As Browne explained while discussing the “white gaze” of surveillance on Black people, these are often benign activities racialized and categorized as dangerous by a racist viewer. These racialized, inaccurate predictions can then be used by the occupying forces to further police, restrict, search, and harass Palestinians.

This set of systems exemplifies many aspects of our analysis of surveillance from the “problem” section. As securitization in the Occupied Territories moves from interactions at physical checkpoints and with soldiers to constant, background, hidden surveillance, it exemplifies the disciplinary power Foucault described in Discipline and Punish: a panoptic exertion of power that forces individuals to constantly behave in the normalized manner for fear of punishment by officers who can see them but whom they cannot see. It embodies Browne’s racializing surveillance, as distinctions between ethnicities are reified through the systems. Its justification stems from a “crisis,” framed as an act of self-defense against Hamas terrorists, but it amounts to a modern-day Lantern Law under which a whole people must be constantly identified, casting all Palestinians as potential terrorists.

The Occupied Territories have been identified as a testing site for new surveillance systems, making Israel a key technology developer whose companies profit from the occupation. Israel’s Coordinator of Government Activities in the Territories called Hebron, where these systems are deployed, a “smart city,” invoking the rhetoric of innovation and utility rather than occupation and oppression. Other Israeli surveillance systems have reportedly drawn purchasing interest from the U.S. government, and Google’s contract for Project Nimbus amounted to $1.2 billion.

Worker Resistance to Nimbus and Beyond

We also consider Google employees’ resistance to the development of Project Nimbus for the Israeli government. Workers who learned about the project began organizing within the company, forming alliances between Palestinian and Jewish workers and pushing the company to drop its contract. Even the newly formed Alphabet Workers Union rallied behind these Googlers to support their efforts. Several employees were fired for pushing back against the system’s development.

A slide from Google’s presentations about its AI systems for Project Nimbus

This comes in the context of Googlers’ earlier resistance to Project Maven, a U.S. Department of Defense contract that would have had Google building object and person recognition systems to advance drone and missile targeting for the U.S. military. To resist, thousands of Google employees signed a petition against it, staged a massive walkout, and teams of engineers threatened to stop work if the project was not dropped. Google eventually stepped away from the project (though it is unclear whether similar work continued elsewhere), developed its AI Principles as a show of goodwill, and workers continued their resistance by formalizing their union organizing efforts.

Google employees’ actions represent a strong case of worker resistance to unethical surveillance projects. As we discussed in our section on theories of change, these employees worked within the system through both structure organizing (forming solidarity groups and a union) and mass protest organizing (walking out, writing a petition) to pressure their company to drop the contract. In some ways, it was hugely successful: it drew attention to the work, generated widespread condemnation, and seems to have put some obstacles in the way of Google’s development efforts.

Our discussion of movement ecology is relevant here, as neither of these types of resistance can realistically fix the problem alone. We might critique the Googlers’ actions as useless, since Google pursued other military contracts, but they represent a step forward. Their organizing during the Maven resistance built structures that were later used to protest Nimbus. Education about these and subsequent issues has continued, tech workers writ large are now unionizing and resisting more often, and perhaps students or new workers are less likely to join these companies for fear of their work being used for harm. Making change is by no means linear, nor is it entirely predictable. But the one constant is that continued resistance, and the organization of structures to enact it, builds a foundation that can push back at certain flashpoints and helps enable change when the time is right.

Designing Just Tech

All of the considerations that we’ve gathered thus far about making change and embodying ethical technology development take us to this point. The design of ethical technology, or technology pushing towards justice, is still a very open question, one being explored by those at the boundaries of research and thought.

D’Ignazio and Klein argue that data science should challenge power, and they offer several examples of projects that collect and analyze data to fight particular injustices (racial inequities in pregnancy and birth care, femicides in Mexico, racial bias in criminal recidivism algorithms) or to bring attention to them. Costanza-Chock covers a lot of ground but centers participation in design by minoritized people and community-led development. Parvin alludes to projects that make similar interventions to those detailed by D’Ignazio and Klein (for example, mapping CCTV cameras in over-surveilled neighborhoods) and complicates the notion of ethics, concluding that socially just technology must be developed through inclusive community design that also imagines potential consequences.
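To make that kind of project concrete, here is a minimal sketch, purely illustrative and not drawn from any of the readings, of what the data layer of a community counter-mapping effort might look like. It assumes a hypothetical CSV of volunteer-recorded camera sightings (cctv_sightings.csv, with latitude, longitude, camera_type, and notes columns) and uses the pandas and folium libraries to render those sightings as an interactive map.

```python
# A minimal sketch of a community CCTV-mapping tool (hypothetical data file and columns).
# Volunteers record camera locations; this script renders them on an interactive map
# so residents can see how densely their neighborhood is watched.
import pandas as pd
import folium

# Hypothetical CSV collected by volunteers: one row per camera sighting.
# Expected columns: latitude, longitude, camera_type, notes
cameras = pd.read_csv("cctv_sightings.csv")

# Center the map on the mean of the recorded points.
center = [cameras["latitude"].mean(), cameras["longitude"].mean()]
m = folium.Map(location=center, zoom_start=15)

# One marker per camera, with its type and any notes shown in the popup.
for _, row in cameras.iterrows():
    folium.CircleMarker(
        location=[row["latitude"], row["longitude"]],
        radius=5,
        popup=f'{row["camera_type"]}: {row["notes"]}',
    ).add_to(m)

m.save("cctv_map.html")  # open in a browser to explore the map
```

The point is less the code than the design choice it illustrates: the data is collected by and for residents, and the output is a map they can read and act on, rather than a feed for an enforcement agency.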

In the end, we are left not with easy answers but with some guiding frameworks and probably several more questions. These frameworks can lend some insight into how technology development should change, though crucially only if alternative processes are available.

If those alternatives are possible, then being guided by community-centered processes and using technology to support community projects is a fantastic way to embody an ethical practice. These possibilities may exist if one is able to work with a university, community organization, nonprofit, or government body, or to support one’s own project. There are plenty of community needs all over the world; the difficulty lies only in finding organizations that are doing that work, or starting one.

But for many organizations, the goal is not to do community-centered work, or the “communities” they care about are shareholders or military occupiers. What to do in these scenarios? This territory is not covered much by those writing about just technology, but it is covered by our earlier discussions of theories of change and resistance.

In the case of Google, its contracts needed to be halted by worker resistance, which required organizing: coordinating petitions, staging walkouts, and threatening work stoppages. One ethical angle that obviously follows is that technology workers can try to shape their company’s actions through resistance. Another is that Google’s work could be ethical if it took principles of justice and liberation into account and made its systems embody them.

In the case of the surveillance systems in the Occupied Territories, there is also a need for social change that would end the settler-colonial occupation of an entire people (which, in the first place, fuels the desire for ever-stronger surveillance technologies used against Palestinians). Technologies that support social change, such as tools for sharing information, helping people connect to organize, and keeping track of growing interest so organizers can identify others who may want to get involved (a small sketch follows below), would all need to be imagined and created. Here, we have to be conscious of the ecology of change, allowing for technologies that support personal transformation, alternatives, and institutional change through structure organizing or mass protest.
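As a purely illustrative sketch of the last of those ideas, keeping track of growing interest, here is what the smallest version of such a tool might look like. Every name, field, and record in it is hypothetical; a real organizing tool would need far more care around consent, security, and data minimization, for exactly the reasons this series has covered.

```python
# A tiny, hypothetical sketch of an organizing tool: track supporters and
# surface who still needs a follow-up conversation. All data here is invented.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Supporter:
    name: str
    contact: str
    interest: str                 # e.g. "signed petition", "attended walkout"
    followed_up: bool = False
    notes: list[str] = field(default_factory=list)

def needs_follow_up(supporters: list[Supporter]) -> list[Supporter]:
    """Return everyone who has expressed interest but has not yet been contacted."""
    return [s for s in supporters if not s.followed_up]

if __name__ == "__main__":
    roster = [
        Supporter("A.", "a@example.org", "signed petition"),
        Supporter("B.", "b@example.org", "attended walkout", followed_up=True),
    ]
    for s in needs_follow_up(roster):
        print(f"{date.today()}: reach out to {s.name} ({s.contact}), {s.interest}")
```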

For next week

Readings and reflections

We have concluded our “practice” section and are going to move into creating final projects in class. But as we end this piece, we should reflect on how to enact the ethics we’ve built:

Reflect on a specific surveillance technology or system that both interests you and that you would want to see changed. When we say “change,” think of the technology itself, the way it interacts with individuals, the institutions that support it, and the overarching social and cultural dynamics it interacts with.

How do you imagine people might change the aspects of the system that you think are unjustified or harmful? What would you advocate for? Include both a vision of a better system or technology and how you might get there.

We do not have any readings for next week, as the class is transitioning to final projects. But in next week’s write-up I will describe the project and how we will bring together all of our learning. Look forward to it!

Thanks for reading along, and as a reminder, if you want to keep following these weekly updates or share them with friends, either follow my blog on Medium, or subscribe to the email list via this Google Form.


Nick Rabb

PhD candidate in Computer Science and Cognitive Science at Tufts University, organizer w/ Dissenters, MA Peace Action, formerly Sunrise Mvmt. Philosophy nerd.