Pandemic Surveillance by David Lyon

Author: David Lyon
Language: eng
Format: epub
Publisher: Wiley
Published: 2021-11-11T00:00:00+00:00


Data represents people in particular ways

It is not merely that surveillance data makes people visible, however. It also represents them – or fails to represent them – in specific ways. For instance, as many have noted, not everyone, by any means, has a smartphone. Because contact tracing apps require a smartphone, and assuming they really do work to alert people to the possibility of infection, to have no phone is to be data-less in this context, and thus potentially vulnerable to needless infection. And, as we saw in chapter 2, there are many ways in which contact tracing apps easily generate false positives and false negatives, so that, when they are in use, they are also prone to misrepresent those who do appear in the system.

Or if, as in the Philippines, errors and discrepancies in anonymized COVID-19 data produced a drop in publicly reported cases of infection, this clearly puts many more at risk. As a Philippine senator commented, “garbage data” could produce “garbage decisions” by the government.22 Such “garbage data” has negative effects, again, because of how people are represented by the data.

Linnet Taylor, whose scheme we are following – loosely – here, uses the example of Aadhaar, the registration and identification system used in India, to show how the poor in particular may be grossly under-represented, a process that has been magnified during COVID-19. The system completely misses the materiality of poverty – meaning, for example, fingerprints cannot even be gathered from those whose fingers have become worn through a lifetime of hard toil, and the iris scans of the elderly are unusable due to malnutrition.

Her other example, in this context, is the proposal by a consulting company, working with the European Space Agency, to monitor the progress of migrants traveling toward the southern borders of the European Union. The plan was to watch groups congregating on beaches, monitoring things like their social media posts to predict who was heading for specific destinations, and who would cross when. The company would then sell the data to border enforcement and migration authorities, so that migrants could be algorithmically sorted prior to arrival, according to their “desirability.” From those calculations, the authorities could pre-sort migrants’ chances of obtaining asylum.23 Migrants, too, are especially at risk during the pandemic.24

But how people are represented by data is not a simple matter. For the data is also processed through algorithms, the codes that are used to translate the basic information into more usable forms for specific purposes. Such algorithms are the crucial determinants, within each system, of how people are represented. And a basic obstacle to our understanding of this is that algorithms, as Frank Pasquale observes so eloquently, create a Black Box Society.25 They just can’t be “seen.”

Algorithms are not just “technical” and “neutral.” They are powerful means of organizing data for particular purposes. Black boxes are found in planes and other transportation, to record technical data, but the image also suggests the obscurity to most people of data-processing. The black


