It is often difficult to tease apart cause and effect: whether social practices and expectations drive the development of technology, or vice-versa. Some observers have noted that the relationship between social constructs and technology is better described as co-evolution. Latour talks of “socio-technological hybrids,” indivisible structures encompassing technology as well as culture—norms, social practices, and perceptions [193]. Latour claims that these hybrids should be studied as a whole. This viewpoint is reflected in HCI research, including by proponents of participatory design [92, 256] and researchers of social computing [85]. Iachello et al. even go as far as claiming that, in the domain of privacy, adoption patterns should be “designed” as part of the application and can be influenced to maximize the chances of successful acceptance [158].

The reader should note that in some cases, technologies that affect privacy are developed without much public debate. For example, Geographic Information Systems (GIS) classify geographic units based on census, credit, and consumer information. Curry and Philips note that GIS had a strong impact on the concepts of community and individual, but were introduced almost silently, over the course of several decades, by a combination of government action, developments in IT, and private enterprises, without spurring much public debate [78].

Understanding these changes is not a straightforward task, because technical development often has contradictory effects on social practice. The same artifact may produce apparently opposite consequences for privacy, strengthening some aspects of it while weakening others. For example, cell phones increase social connectedness, by enabling distant friends and acquaintances to talk more often and in a less scheduled way than previously possible, but they also raise barriers between physically co-present individuals, creating “bubbles” of private space in very public and crowded settings such as a train compartment [29].

From this standpoint, privacy-sensitive IT design becomes an exercise in systematically reconciling the potentially conflicting effects of new devices and services. For example, interruption management systems based on sensing networks (such as those prototyped by Nagel et al. [218]) aim at increasing personal and environmental privacy by reducing unwanted phone calls, but can degrade information privacy because they collect additional information through activity sensors. We highlight this issue of how expectations of privacy change over time as an ongoing research challenge in Section 4.5.

2.3.2 Changes in Privacy Methodologies

The discourse on human-computer interaction and that on privacy in information technology (IT) share a similar history over the past forty years. Reflection on the implications of IT for privacy surged in the late 1960s with the proposal of a National Data Center in the United States [88] and culminated with the publication of the 1973 report Records, Computers and the Rights of Citizens [288], which introduced the Fair Information Practices (FIPS). By the early 1970s, the accumulation of large amounts of personal data had prompted several industrialized countries to enact laws regulating the collection, use, and disclosure of personal information.

The FIPS reflect the top-down and systems approach typical of IT at the time. Systems were relatively few, carefully planned, developed for a specific purpose, centrally