before actually deploying a system. A typical process is to develop a full system (or new feature), deploy it, and then wait for negative responses from the public or the media, fixing or canceling the system in response.17 However, it is well known that modifying an existing system late in the design cycle is expensive. There is a strong need for better methods and tools for quickly and accurately assessing both potential privacy risks and end-user privacy perceptions. To illustrate this argument, we consider the acceptance history of ubiquitous computing technologies, whose effects on privacy have been hotly debated for the past 15 years.

4.5.1 A Story of Rejection and Acceptance: The Importance of Value Propositions

Xerox PARC’s initial foray into ubiquitous computing in the late 1980s provides an instructive case study on privacy. While groundbreaking research was being conducted at PARC, researchers in other labs (and even at PARC itself) had visceral, highly negative responses to the entire research program. Harper quotes one colleague, external to the research team that developed Active Badges, as saying:

“Do I wear badges? No way. I am completely against wearing badges. I don’t want management to know where I am. No. I think the people who made them should be taken out and shot... it is stupid to think that they should research badges because it is technologically interesting. They (badges) will be used to track me around. They will be used to track me around in my private life. They make me furious.” [140]

The media amplified the potential privacy risks posed by these technologies, publishing headlines such as “Big Brother, Pinned to Your Chest” [68] and “Orwellian Dream Come True: A Badge That Pinpoints You” [266]. Ubiquitous computing was seen not as an aid for people in their everyday lives, but as a pervasive surveillance system that would further cement existing power structures. Similar concerns were also voiced in the IT community. For example, Stephen Doheny-Farina published an essay entitled “Default =

17 Part of the reason for this casual approach is that many developers do not expect such negative reactions to their work. For example, in September 2006, Facebook, a social networking site targeted at college students, added two new features to its site, News Feed and Mini-Feed [179]. News Feed was a content module that showed what recent changes had occurred with friends and when. For example, News Feed would show that a friend had recently joined a group or added another person as a friend. Similarly, Mini-Feed was a separate content module that let others see what recent changes an individual had made to their own profile. What is interesting is that, although all of this information was already publicly available through a person’s Facebook profile, these fairly innocuous features generated a tremendous amount of resentment from Facebook users, who raised concerns about being stalked and about the lack of appropriate privacy controls over joining or leaving social groups.

Facebook’s experience is far from exceptional. Many other projects have faced similar concerns. For example, in 1990, Lotus proposed to sell a Housing Marketplace CD that provided directory information on the buying habits of 120 million people in the US [19]. That project was canceled due to privacy concerns. In 1999, Intel proposed adding unique IDs to each of its processors to facilitate asset management and provide hardware-based certificates [207]. Intel quickly backed down, disabling the feature by default.
