
Human-computer interaction is uniquely suited to help design teams manage the challenges of protecting privacy and personal information. First, HCI can help understand the many notions of privacy that people have. Westin describes four states of privacy: solitude, intimacy, anonymity, and reserve [307]. As practical examples, Murphy lists the following as expressions of privacy: “to be free from physical invasion of one’s home or person,” “the right to make certain personal and intimate decisions free from government interference,” “the right to prevent commercial publicity of one’s own name and image,” and “the control of information concerning an individual’s person” [216]. These perspectives represent different and sometimes conflicting worldviews on privacy. For example, while some scholars argue that privacy is a fundamental right, Moor claims that privacy is not a “core value” on par with life, security, and freedom, and asserts that privacy is merely instrumental for protecting personal security [213].

Second, a concept of tradeoff is implicit in most discussions of privacy. In 1890, Warren and Brandeis pointed out that privacy should be limited by the public interest, a position that has been supported by a long history of court rulings and legal analysis [298]. Tradeoffs must also be made between competing interests in system design. For example, the developer of a retail web site may have security or business requirements that compete with end-user privacy requirements, creating a tension that must be resolved through tradeoffs. Because HCI practitioners possess a holistic view of users’ interaction with technology, they are well positioned to work through and resolve these tradeoffs.

Third, privacy interacts with other social concerns, such as control, authority, appropriateness, and appearance. For example, while parents may view location-tracking phones as a way of ensuring safety and maintaining peace of mind, their children may perceive the same technology as smothering and an obstacle to establishing their identity. These relationships are compellingly exemplified in Goffman’s description of the behavior of individuals in small social groups [122]. For instance, closing one’s office door not only protects an individual’s privacy, but also asserts that individual’s ability to do so and emphasizes the difference from colleagues who do not have private offices. Here, the discriminating application of HCI tools can vastly improve the accuracy and quality of the assumptions and requirements feeding into system design.

Fourth, privacy can be hard to rationalize. Multiple studies have demonstrated that there is a difference between stated privacy preferences and actual behavior [14, 44]. Many people are also unable to accurately evaluate low-probability but high-impact risks [260], especially those related to events far removed in time and place from the initial cause [132]. For example, a hastily written blog entry or an impulsive photograph on MySpace may cause unintended embarrassment years later. Furthermore, privacy is fraught with exceptions arising from contingent situations and historical context. The need for flexibility in these constructs is reflected in the many exceptions present in data protection legislation and in social science literature that describes privacy as a continuous interpersonal “boundary-definition process” rather than a static condition [23]. The use of modern “behavioral” inquiry techniques in HCI can help explicate these behaviors and exceptions.

