After taking account of surveillance capitalism and some of the social logics that undergird it (capitalism, liberalism and neoliberalism), we turn this week to more deeply understanding surveillance itself and its effects on how society is structured.
We’re reading from Simone Browne’s Dark Matters: On the Surveillance of Blackness and listening to a summary of the work of Michel Foucault from the podcast Philosophize This!. Each provides more explanation of surveillance itself: what is it, and how does it affect us? In particular, Foucault’s work was seminal in describing disciplinary power, in which surveillance is a key instrument for disciplining society. Browne provides a much-needed intervention in surveillance theory, introducing racializing surveillance as a key aspect of social control. These two bodies of theory, combined with our knowledge of digital surveillance systems, reveal a surveillance ecosystem that is pervasive and that demands we question its legitimacy.
Surveillance and Power
Beginning with what we learn from Foucault, one of his most prominent theoretical contributions is his description of a societal power dynamic built on surveillance, examination, and normalization. He argues that this process is the dominant mode of discipline in modern society, distinct from older forms of state discipline such as the public execution.
Foucault motivates this process by examining the modern prison and commenting on Bentham’s ideal prison, the panopticon. Bentham’s panopticon is imagined as a prison with a central guard post from which every prisoner can be seen, while the prisoners, arranged in an outer ring, can see neither the guards nor one another. Because prisoners never know when they are being watched, they act according to normalized behavior constantly; they can never be sure a guard is not watching, so they can never safely drop the act.
Prisons, Foucault argues, are excellent examples of disciplinary power because prisoners are subjected to surveillance, reprimanded through rigid examination, and incentivized to act as a “model prisoner” through that process. When panoptic surveillance is thrown into the equation, Foucault puts forward that guards do not even need to examine prisoners, because they will examine themselves for fear of always being surveilled and subject to punishment. Panoptic power is thus one that leads those under its gaze to discipline themselves.
But, of course, Foucault takes this one step further and names several other sites in society where this disciplinary process occurs: the workplace, the school, the family. There are power structures everywhere in which people are subject to surveillance, examination, and normalization. This is often done by superiors, whether that be our boss, manager, teacher, principal, or parent.
In fact, Foucault takes it even further and argues that these power dynamics are at play everywhere, what he calls biopower. There are constant social pressures to conform for fear of judgment by others in all areas of society, so we self-surveil and normalize to fit whatever norms we have been given. In this way, Foucault critiques what he sees as the typical view of power in society: that it lies at the top, with a sovereign or government. Instead, he argues that because we are under such pressures to conform socially, power lies in those who shape knowledge and discourse — the very ideas that we get from society to which we mold ourselves in order to fit in.
These theories can be combined with some of the content we read from Browne, particularly the arguments she relays from surveillance scholar Gary Marx, who argues that “new surveillance” is fundamentally distinct from “old surveillance.” Old surveillance was limited by physical constraints: distance, the need for someone to actually watch with their eyes, limits on the ability to record behavior and store information, and more. New surveillance has no such restrictions. Because of digital technologies, data can be shared, stored, and aggregated easily; surveillance can go undetected because it is disguised as a consumer product; and there is far more surveillance because of the ubiquity of data-collecting devices.
The result of new surveillance, combined with Foucault’s ideas about disciplinary power, is that we live in a world where we are constantly surveilled by corporations and governments, and thus subject to normalization pressures from many angles. For example, consider the way our view of fitness is shaped by something like a Fitbit, which has its own normative view of what a fit person does, records your movements, and encourages you to conform to its standards. Or think of the Nest thermostat, which tells you whether you are being energy efficient and encourages you to modify your behavior to meet its standards. We own many such devices, and we frequently (and willingly) change our habits accordingly.
Another consequence of the new system of surveillance comes with Foucault’s assertion that those who produce knowledge have the ultimate power over us by shaping our norms, discourse, and worldviews. In his time, the media system was very different, more monolithic. Foucault argues that science, as a dominant epistemology, wields tremendous power over society. But things have shifted dramatically from Foucault’s world, as we now live under new information systems where authority is not solely with scientific consensus. Our informational world is shaped by search algorithms, news feeds, and recommendation systems.
As the media ecosystem becomes more fragmented (the overlap in media consumed by any two individuals shrinks as personalization increases), the information we discipline ourselves according to may differ from group to group. Thus, different groups may adopt different norms, worldviews, and habits. We may discipline ourselves to the norms of our in-group, but as more out-groups emerge, we ignore their norms, or even view them as wrong.
We should reflect on the consequences of this new surveillance in concert with Foucault’s theories of disciplinary power:
How much power do search, social media, and advertisement companies have over us as they shape our informational world?
Is this power legitimate in your eyes? Who should have control over the information that shapes our norms and behaviors?
Are there consequences to the amount of surveillance we are under by so many data-driven “smart” technologies?
Reification of Racial Categories
The other major aspect we cover this week is Browne’s writing on racializing surveillance. Her work is a major contribution to surveillance studies, which she argues has had a predominantly white lens, particularly ignoring the ways that black people are surveilled. Her definition of racializing surveillance is worth quoting verbatim:
“moments when enactments of surveillance reify boundaries, borders, and bodies along racial lines, and where the outcome is often discriminatory treatment of those who are negatively racialized by such surveillance” (Browne, Dark Matters)
This is our first explicit exposure to the power of surveillance to categorize, and even more so to the hierarchical power dynamics inherent in that categorization. Browne continues to note that race is “understood as operating in an interlocking manner with class, gender, sexuality, and other markers of identity and their various intersections.”
This assertion is significant because it puts forward that acts of surveillance create and strengthen social divisions. In this sense, racializing surveillance recreates racism, which is socially constructed in the first place. She gives the example of white men interacting on the street being coded as normal behavior, whereas the same activities performed by black men would be coded as dangerous and made subject to disciplinary power. This foregrounds the “seeing eye” of whoever is surveilling, bringing in their identity, biases, and prejudices.
Browne recounts the story of lantern laws, enacted in New York City after an armed insurrection, partially composed of enslaved people, burned a building and killed nine white people. The laws required that black and native people carry lanterns after nightfall so that they could be identified at all times. Gatherings of black or native people were also restricted so as to prevent conspiratorial plotting. This type of racializing surveillance, she suggests, is in the DNA of nations that privilege whiteness. Lantern laws are a prelude to modern surveillance methods: as Browne notes, while the specific racial ordering may change over time, racializing surveillance is always interested in creating hierarchies that uphold whiteness and accomplish the goals of the powerful.
Our picture of technological surveillance systems is thus further complicated by the biases inherent in their seeing eye. It is now common knowledge that digital surveillance systems such as facial recognition are racially biased, failing to recognize darker faces. But the examples extend into more socially charged technologies, like criminal recidivism systems (those used by courts to estimate how likely a defendant is to reoffend), which ProPublica showed disproportionately judge black defendants as likely to commit future crimes, influencing judges’ decisions to grant them their freedom.
Reification of Any Oppression
But the story does not end there, as there are deeper layers we can think about with the power of surveillance to categorize. Even a facial recognition system that does recognize darker faces can be used by a surveiller whose seeing eye is oppressive. If the system is used, for example, by U.S. Immigration and Customs Enforcement (ICE) to identify and deport undocumented people, this can be viewed as surveillance being used to reify an oppressive division based on documented citizenship even without the technology itself making that judgment.
In general, we can see that surveillance systems, as Browne notes, can reify oppressive categorizations across any dimension of oppression: race, class, gender, sexuality, etc. When used in the process of disciplinary action, subjecting the surveilled to examination and normalization, this represents a very scary dimension of surveillance that may not be immediately apparent.
In the same way that Browne argues that racializing surveillance recreates racist hierarchies, surveillance that classifies along class or gender lines recreates those oppressive hierarchies too. Crucially, these dynamics depend on surveillance specifically, not on other kinds of observation. Browne is careful to point out that surveillance inherently involves the more powerful gazing upon the less powerful (an oversight), rather than a mere veillance, or neutral observation. This process is crucial to how powerful actors in society maintain the status quo social arrangement.
Data for Oppression, or Liberation
With these theories in hand, we can understand some more impacts of surveillance technologies on society. With its capacity to categorize along oppressive lines and then normalize that categorization, surveillance embodied through technologies can either be used to further oppression or lessen it. On top of this, certain actors have more power to mold norms than others, particularly technology companies who control the flow of information.
We should take care to recognize the potential for surveillance technologies to lead us into oppressive norms. Through advertisements, content ordering, or search results, the information we are presented with can further oppression. To be a critical consumer of information in the age of new surveillance, we need to recognize the typical dimensions of oppression so as not to be easily swayed by oppressive information. While our reading focused on race, there are other dimensions as well, such as class, gender, sexuality, and ability, and even some that we may not yet recognize.
On the flip side, to push our imaginative capacities a bit, we should also conceptualize how technologies could be used for liberation. What would it look like for a news feed or search algorithm to present information in an anti-oppressive manner? Is that something that you feel a technology company would be justified in doing?
To end on a more controversial note, technology ideology often goes hand in hand with ideals of objectivity and neutrality. Companies that serve us information will say that they simply present the results that are most popular, or, if results are tailored to us, those that best match our profile. Both claims shift responsibility onto the consumer, whereas the company may have a responsibility to serve anti-oppressive content, or at the least, not to serve oppressive content. While this may enter tricky free-speech and censorship territory, we can at least recognize that because technology companies surveil us and then serve us certain types of content, they are making a choice about what they propagate. Their actions are not neutral, and given how much power they hold, they should be scrutinized.
For next week
None! No announcements were sent in this week.
Readings and reflections
For next week, we will read from Emily Guendelsberger’s On the Clock: What Low-Wage Work Did to Me and How It Drives America Insane; I also highly recommend reading the full book.
We will also reflect on the concepts that we interrogated in class by thinking about the following:
Our readings from Browne and listening to summaries of Foucault have hopefully made clear that surveillance is performed in many ways. Connecting to our reading of Zuboff, we see that there are many technologies that are currently engaged in the disciplinary process of surveillance, examination, and normalization.
What devices or services do you interact with that push you to a certain norm through surveillance and examination? Do you agree with the normative version of the world you are given through these technologies?