
information is seldom available, and the topic is often too complex for the typical user to understand [14]. For these reasons, basing system design on the results of surveys can be misleading. Because of the difficulty of probing behavior, techniques that only probe attitudes toward privacy should be used with great care, and the results should be interpreted accordingly.

Third, privacy can be a difficult topic to investigate from a procedural standpoint. First, Iachello et al.’s and Hudson and Bruckman’s experience shows that IRB informed consent requirements may impede achieving the immediacy required for authentic collection of privacy preferences. Second, participant privacy may be violated when following certain protocol designs, even when these protocols are approved by the IRB. We believe that an open discussion of an IRB’s role in HCI research on privacy would help evolve current guidelines, often developed for medical research, toward the dynamic, short-term, participant-based research typical of our field.

3.3 Prototyping, Building, and Deploying Privacy-Sensitive Applications

In this section, we focus on privacy with respect to prototyping, building, and deploying applications. We consider both research on methods (i.e., what processes to use to uncover privacy issues during design) and practical solutions (i.e., what design solutions help protect privacy). Cranor, Hong, and Reiter have sketched out three general approaches to improve user interfaces for usable privacy and security [74]:

Make it invisible.

Make it understandable, through better awareness, usability, and metaphors.

Train users.

These three themes come up repeatedly in the subsections below. It is also worth pointing out the user interface advice of Chris Nodder, who was responsible for the user experience of Windows XP Service Pack 2: “Present choices, not dilemmas.” User interfaces should help people make good choices rather than confusing them about what their options are and obscuring the implications of those decisions.
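To make this principle concrete, consider the following minimal sketch, which is our illustration rather than Nodder’s or Microsoft’s design: a prompt whose options each state their consequence in plain language, so the user decides between understood outcomes rather than answering an opaque yes/no question. The application name, wording, and structure are hypothetical.

from dataclasses import dataclass

@dataclass
class Choice:
    label: str        # the action the user can take
    consequence: str  # its implication, stated in plain language

def prompt(question: str, choices: list[Choice]) -> Choice:
    # Render each option together with its consequence, so the
    # implications of the decision are visible before choosing.
    print(question)
    for i, c in enumerate(choices, 1):
        print(f"  {i}. {c.label} - {c.consequence}")
    return choices[int(input("Choose an option: ")) - 1]

# A location request framed as choices rather than an "Allow? yes/no" dilemma.
selected = prompt(
    "MapApp wants to use your location.",
    [
        Choice("Share precise location",
               "MapApp sees your exact position while you use it"),
        Choice("Share approximate location",
               "MapApp sees only your city-level position"),
        Choice("Do not share",
               "MapApp works without location features"),
    ],
)

The point of the sketch is that every option, including refusal, is framed as a workable outcome, so no choice is presented as a dead end.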

Work on privacy-enhancing interaction techniques is extensive, and we present it here in several subsections. Early Privacy Enhancing Technologies (PETs) were developed with the intent of “empowering users,” giving them the ability to determine their own preferences [312]. More recent work has taken a more holistic and nuanced approach, encompassing architectural and cognitive constraints as well as the user interface. For example, work on identity management and plausible deniability demands that the whole system architecture and user interface be designed with those end-user concerns in mind [236]. Finally, the reader will note that the literature on interaction techniques for privacy is intertwined with that of usable security, because security mechanisms are the basic tools of privacy protection. We limit our discussion to interaction techniques specifically targeted at privacy, ignoring work on topics such as biometrics and authentication when it is not directly connected with privacy.
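As a rough illustration of what “determining one’s own preferences” can mean operationally, the sketch below models user-set disclosure rules consulted before any data release. The structure, field names, and default-deny policy are our own assumptions, not the design of any system cited above.

from dataclasses import dataclass

@dataclass(frozen=True)
class DisclosureRule:
    data_type: str       # e.g., "location"
    recipient: str       # who may receive the data
    granularity: str     # e.g., "precise", "city"
    retention_days: int  # how long the recipient may keep it

def find_rule(rules: list[DisclosureRule], data_type: str, recipient: str):
    # Return the user's rule governing this request; absent a match,
    # the system falls back to denying disclosure by default.
    for rule in rules:
        if rule.data_type == data_type and rule.recipient == recipient:
            return rule
    return None

rules = [DisclosureRule("location", "friends", "city", 1)]
print(find_rule(rules, "location", "friends"))      # the user's rule applies
print(find_rule(rules, "location", "advertisers"))  # None: denied by default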
