In his discussion of the rise of everyday surveillance, Oscar Gandy describes systems that enable “panoptic sorting”: discriminatory technologies that surveil all information about an individual’s status and behavior in order to profile and categorize that person’s potential economic value:
The collection, processing, and sharing of information about individuals and groups that is generated through their daily lives as citizens, employees, and consumers and is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy. (The Panoptic Sort, p. 15)
David Lyon extends Gandy’s formulation from the individual realm to the social realm, resulting in “social sorting,” which represents:
dangers inherent in surveillance systems whose crucial coding mechanisms involve categories derived from stereotypical or prejudicial sources (Surveillance as Social Sorting, p. 2).
Social sorting, then, results in deeply discriminatory practices, making surveillance a matter not only of personal privacy, but also of social justice.
For example, consider this story about the use of biometrics and facial recognition systems at bars to monitor and track patrons:
BioBouncer’s camera snaps customers entering clubs and bars, and facial recognition software compares them with stored images of previously identified troublemakers. The technology alerts club security to image matches, while innocent images are automatically flushed at the end of each night, Dussich said. Various clubs can share databases through a virtual private network, so belligerent drunks might find themselves unwelcome in all their neighborhood bars.
Besides the notorious inaccuracy of facial recognition systems, such a system could easily be used to engage in discriminatory practices extending well beyond “belligerent drunks.”