Data and Power: Deconstructing Surveillance — Week 1

Nick Rabb
Sep 3, 2023 · 7 min read

How should data scientists think about their role in society? To what ends should this specific set of skills be used? If data science is to have a positive influence on the world, what should be avoided and what should be pursued?

These questions and attempts at answering them are by no means trivial, invoking the now frequently declared but perhaps shallowly understood phrase: “data science ethics.” To first wrestle with what it means to be ethical in our world, and then subsequently act according to those ethics, is the work of a lifetime. Ethics cannot just be a buzzword thrown around performatively, but rather should be a continued dedication to identifying, critiquing, and challenging that which is harmful. It is not easy, but it is necessary for building a world that embodies our highest values.

I have the privilege of teaching a course at Tufts University this fall semester, 2023, where we will grapple with exactly these questions. Our classroom — filled with budding data scientists, computer scientists, political scientists, economists, critical theorists — will build community and practices around the shared goal of understanding how to think and act ethically in a world newly driven by data.

But as an educator concerned with our collective education, not just that of those lucky enough to attend Tufts University, I want to bring the questions and discussions we will share in the classroom to a larger space. It is not often that the learning opportunities afforded by the university make their way beyond the ivory walls, but I believe it is crucial that they do. Hence, I am committed to sharing our class more broadly by writing weekly about the content we are reading, the questions we are asking in class, and the frameworks we are using to think critically about data science ethics. This is the first iteration of that weekly series — welcome, and I hope you stick around!

If you want to keep up with these newsletters, either follow my blog on Medium or subscribe to the email list via this Google Form. Please feel free to share this with anyone else who is interested in these questions and help expand our learning community. Moreover, if this series sparks any thoughts, questions, or recommendations, please do not hesitate to reach out and share them with me via email at nicholas.rabb@tufts.edu!

Week 1: Intro to the Course & Getting Real

Everyone loves a good syllabus week; the first and easiest week of class, primarily concerned with reuniting with friends, catching up about summer adventures, and figuring out which classes or professors seem boring enough to ditch.

Our first week will dutifully follow suit, but also break convention a bit and get real right off the bat. We will, of course, take time to introduce the course and its syllabus, which I created in 2022 with my friend and colleague Desen Ozkan, who co-taught the course with me in its inaugural year and has since moved on to the University of Connecticut as an Assistant Professor. But the bulk of this introduction is not about the content and assignments. Rather, this time will be spent opening some space to be frank about why we are all here together.

Surveillance and Ethics

Before the first day, we will all watch this short documentary featuring Shoshana Zuboff, professor at the Harvard Business School, wherein she discusses her 2019 work, The Age of Surveillance Capitalism.

This documentary may contain several ideas that surprise us: that our data is being used to predict our behavior, that we are raw materials in a new form of wealth extraction, that seemingly innocuous systems or games may actually be experiments in behavioral modification. These are gateways into the type of work we will do in class, critically examining technologies and discussing why we think they are good or bad, how to redesign them, or how to stop those we deem corrupt.

Though this is a class aimed at learning critical thinking skills and a vocabulary to describe the ethics of data science systems, we focus specifically on surveillance technologies for several reasons. For one, they are increasingly pervasive given the new availability of behavioral data and the ability to process it at scale. Students will likely interact with — if not construct — data collection, processing, and prediction systems throughout their lifetime. Beyond that, surveillance is a social logic that pervades several areas where technology meets the social world. It pushes us to ask questions about power, identity, human rights, and how to govern our society. These questions and the answers we put forward will hone our ethics as we grapple with our own views of the world; critiquing, deconstructing, and reforging them continuously.

When we think critically, as Zuboff does, about technologies that are sold to us as natural, inevitable innovations, we collide the technical with the social. When we feel certain ways about technology — that it is watching us, extracting from us, manipulating us — and ask why we think these are good or bad, we enter the realm of morals. And when we finally ask if we can do something to mitigate the bad and enhance the good, we arrive at ethics.

Typically, data science education gives a shallow, if not last-minute, treatment of ethics: a utilitarian harm-versus-benefit calculus, or thinking about the “fairness” or “accountability” of algorithms. We instead ask deeper questions, interrogating our notions of harm or benefit, fair or unfair, accountable or unaccountable, and building a conceptual vocabulary that we can use to define what is good and what is bad. This process forces us to learn about systems and concepts like capitalism, democracy, oppression, liberation, and power.

But beyond that, we also concern ourselves with how we can then embody our conclusions and act so that we move towards the good. We give ourselves tools by learning about ethical and just design, about labor organization that stops companies’ predatory actions, about social movements that redefine what is politically or socially possible.

At the end of the course, it is my hope that we all leave with greater powers to critically analyze the data-driven world, strategies to change the parts that seem unjust, and the hope and imagination to use data technology to embody our highest values.

Individual and community transformation

We are all taking part in this experience, whether in the classroom or as part of the greater learning community, because we care about using our lives and skills to make the world a better place. Grappling with the ethics of data science technology is not something that comes easily. It involves really thinking, and being committed to really learning.

Sometimes the classroom is just a room of individuals, shy and unwilling to participate, simply hearing the lecture and memorizing content to pass a test later. We all know that this type of class, frankly, sucks. This course is an opportunity to break that paradigm. If you ask yourself what the best environment would look like to learn and discuss the ethics of data science — something that you will use throughout your life to try to do the right thing in a world governed by data-driven technology — what do you imagine? How do we interact with one another? How much do we participate? What type of transformation do you go through?

In my mind, this process is not possible without building a strong community where we are not afraid to ask hard questions, share honestly what we think, and be humble enough to question our preconceived notions.

Personally, I think that the stakes are so high that we owe it to ourselves to take this learning experience extremely seriously. A world governed by unethical data technologies will oppress, subjugate, rule dictatorially, take lives unnecessarily, and more. Being a part of that, or actively building it, seems unacceptable. What we will learn is training for avoiding that kind of world, and that encourages me to give my all in this space. If we all do the same, then the class will be a truly transformative experience that will change our lives.

For next week

At the end of this incredibly lighthearted class, we will take stock of some logistics for the following week.

Community announcements

Last year, we found that we really benefited from bringing the rest of our lives into the classroom community — disrupting the typical separation. One way we did this was by asking each week for community announcements. These can take the form of party invitations, events, protests, lectures to attend, astrological developments to keep an eye on, whatever.

For this week, we will forgo community announcements in this weekly update, but please send in any announcements you would like shared! Email them to nicholas.rabb@tufts.edu, and I will bring them to the classroom as well as to the next update.

Readings and reflections

For next week, we will begin reading Zuboff’s The Age of Surveillance Capitalism, starting with Chapter 1, which I will make available here. I also encourage everyone to buy the full book, because the content we do not cover is well worth reading.

We will also reflect on the documentary we watched by relating the content to our own lives. The reflection prompt is as follows:

One topic that Zuboff touches on is the intentional targeting of users with advertisements based on their internet activity (not limited to social media activity). This is undoubtedly happening to all of us as we use social media!

Take a look at your own social media feeds and streaming services (they often give you ads too!). Which ads are you shown? Why do you think you got them?

Thanks for reading along, and as a reminder, if you want to keep following these weekly updates or share them with friends, either follow my blog on Medium, or subscribe to the email list via this Google Form.

Nick Rabb

PhD candidate in Computer Science and Cognitive Science at Tufts University, organizer w/ Dissenters, MA Peace Action, formerly Sunrise Mvmt. Philosophy nerd.