Debrief: Surveillance Studies Summer Seminar

Last week I had the pleasure of attending the Surveillance Studies Summer Seminar hosted by the Surveillance Project at Queen's University. The seminar was led by world-class surveillance studies experts: David Lyon, Kevin Haggerty, and Kirstie Ball. About twenty graduate students from across the globe attended, with disciplinary backgrounds spanning sociology, communication, criminology, information studies, and political science. The seminar program can be found here. Some highlights included:

:::

One of Kevin Haggerty’s lectures discussed the limitations of relying on a Foucauldian analysis of surveillance, and criticized the tendency of academics and the public alike to rely on Panopticism as a universal descriptor of surveillance systems. I’ve struggled with this in my own work, trying to fit concerns over Google’s infrastructure of dataveillance within Foucault’s terms. I don’t quite have it worked out to my liking yet…

:::

David Skillicorn from the School of Computing discussed “The Role of Data Analysis in Surveillance” — essentially a review of data mining techniques for prediction, clustering, outlier identification, and relationship analysis. While the talk was quite informative, what bothered me most was Prof. Skillicorn’s seemingly unquestioning trust in computer algorithms to provide accurate and helpful results. At one point, while discussing the use of data analysis in processing mortgage applications online, he made a comment to the effect that, thanks to the unbiased nature of the algorithm, banks will save money on potential losses, since the code won’t allow loans to be granted in cases where a human being might be tempted to approve one. I questioned this reliance on machines over human discretion, and he seemed shocked that anyone would want to rely on human nature rather than code. The very notion of bias in computer systems seemed lost on him.
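
To make the bias point concrete, here is a minimal sketch of my own (the rule, field names, and thresholds are all invented for illustration, not anything Skillicorn presented) showing how a seemingly neutral screening algorithm can quietly encode bias through its inputs:

```python
# Hypothetical loan-screening rule: every field name and threshold here is
# invented for illustration, not drawn from any real bank's system.

def approve_loan(application: dict) -> bool:
    """Return True if the application passes the automated screen."""
    # These checks look objective: credit score and debt-to-income only.
    if application["credit_score"] < 650:
        return False
    if application["loan_amount"] > 4 * application["annual_income"]:
        return False
    # But this "risk adjustment" keyed on postal code quietly imports
    # whatever historical bias correlates with neighbourhood; the code
    # is only as neutral as the data and rules its designers built in.
    high_risk_prefixes = {"K7L", "K7M"}  # invented example values
    if application["postal_prefix"] in high_risk_prefixes:
        return application["credit_score"] >= 720  # a stricter bar
    return True

# Two applicants identical on every "objective" criterion:
a = {"credit_score": 700, "annual_income": 60000,
     "loan_amount": 200000, "postal_prefix": "K7K"}
b = dict(a, postal_prefix="K7L")
print(approve_loan(a), approve_loan(b))  # True False: same merits, different outcomes
```

The code executes flawlessly and deterministically, yet the outcome is anything but unbiased; that is the distinction that seemed lost in the talk.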

I also questioned another of Prof. Skillicorn’s claims: that the most opaque algorithms tend to be the strongest predictors, while the more transparent ones seem less effective. He acknowledged that researchers don’t know why this is the case, but suggested it might simply be fundamental to the nature of algorithms and something we might just have to accept. I won’t. Transparency in technological design — including algorithms — is crucial to ensuring fairness and justice in how they are deployed and used in society. We can’t just chalk up the increasing opacity of ever more powerful data mining algorithms to “oh well – that’s just how algorithms are.”
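
To illustrate the trade-off Skillicorn described, here is a toy sketch of my own using the scikit-learn library (the data is synthetic and the model choices arbitrary): a shallow decision tree whose entire logic can be printed and audited, next to a random forest of hundreds of trees that typically predicts better but offers no comparable account of itself.

```python
# Toy illustration (mine, not from the talk) of the transparency/accuracy
# tension: a small, auditable decision tree versus an opaque ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in data; no real records involved.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Transparent: a depth-3 tree whose full decision logic fits on one screen.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("tree accuracy:  ", tree.score(X_test, y_test))
print(export_text(tree))  # the entire model, human-readable

# Opaque: 300 deep trees voting together; no equivalent summary exists.
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print("forest accuracy:", forest.score(X_test, y_test))
```

On most runs the opaque forest wins on accuracy, which is exactly the observation I’m unwilling to accept as a brute fact about algorithms rather than a design choice we can push back on.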

:::

I was much more pleased with the presentation by Sean Reynolds from Queen’s IT department, where he detailed the efforts Queen’s is taking to reduce the amount of information about its students, faculty, and staff it retains in its databases. He noted how some data (marital status, occupation, etc.) has been removed from student files (much to the dismay of the alumni & fundraising folks), how the University purges its server logs after 5 weeks, and that it keeps very little data on usage of University computers (this is from memory – I’ve asked Reynolds for a copy of his presentation). The University’s role as an ISP raises many data retention issues, which are only complicated by the trend of many universities considering outsourcing their e-mail services to Google’s Gmail.
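
As a thought experiment, a retention rule like the one Reynolds described might look something like the sketch below; the directory, file pattern, and exact five-week window are my assumptions, not details from his presentation.

```python
# Hypothetical sketch of a five-week log-retention purge, in the spirit of
# the practice described above; paths and the window are my own guesses.
import time
from pathlib import Path

RETENTION_SECONDS = 5 * 7 * 24 * 3600  # five weeks

def purge_old_logs(log_dir: str) -> None:
    """Delete any log file last modified before the retention window."""
    base = Path(log_dir)
    if not base.is_dir():
        return
    cutoff = time.time() - RETENTION_SECONDS
    for path in base.glob("*.log"):
        if path.stat().st_mtime < cutoff:
            path.unlink()  # data never retained can never be disclosed

purge_old_logs("/var/log/webserver")  # hypothetical path
```

The privacy payoff is in the last comment: what a university never keeps, it can never be compelled to hand over.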

:::

I tried to play the role of provocateur for much of the week, and was presented with such an opportunity when a discussion emerged about “good” forms of surveillance (i.e., epidemiological surveillance). Since the discussion seemed to approach the faulty “guns don’t kill people, people kill people” argument, I countered with a more essentialist view that surveillance is always harmful: the purpose of surveillance is to view or record a person or action that would otherwise exist free from oversight. The very act of viewing or recording a person — whether for good or ill — diminishes the natural autonomy of the individual. Thus, harm occurs. Few seemed willing to accept this position, but David Lyon was willing to acknowledge that surveillance is “never neutral” and “always potentially dangerous.” That’s a position I’m willing to take as well.

:::

We viewed “The Lives of Others,” the Oscar-winning film from Germany about surveillance by the Stasi in East Berlin, and the impact it had on both the watched and the watchers. It is a stunning film and I highly recommend it, especially for those interested in state surveillance (and bureaucracy).

:::

Visits by Pippa Lawson and Valerie Steeves fostered a discussion on privacy advocacy, especially the challenge of making one’s academic scholarship more useful for privacy activists and advocates. This can be a difficult gap to bridge (more on that later).

In total, it was a productive week, filled with stimulating discussions and social networking with other young scholars of privacy and surveillance.
