
not manage to send mail at all within the time limit.11

Friedman et al. have studied the user interfaces for web browsers’ cookie handling in depth. Millett, Friedman, and Felten, for example, studied how the notification interfaces for cookies changed between 1995 and 2000 in both Netscape’s and Microsoft’s web browsers [211]. Expert analysis of UI metrics, including the depth of menu items for configuration and the richness of configuration options, showed that significant changes occurred over this five-year period. Configuration options were expanded, which Millett et al. considered a positive development. Further enhancements included better wording for configuration options and more refined cookie management (e.g., allowing users to delete individual cookies). Providing users with more choice and better tools to express informed consent clearly comports with Value Sensitive Design [113]. However, the evaluation of PGP, discussed above, suggests that UI complexity is a fundamental drawback of these technologies and that PETs might be more effective with fewer, rather than more, choices. As noted in Section 3.2, systems should present meaningful choices rather than dilemmas.

In related research, Whalen and Inkpen analyzed the usage of security user interfaces in web browsers, including the padlock icon that signals an HTTPS connection with a valid certificate [308]. Using eyetracker data, they found that while participants did view the lock icon, they did not view the corresponding certificate data. In fact, participants rarely pulled up certificate information, and they stopped looking for security cues after they had signed in to a site. Complexity may again be the culprit here, considering that web browser certificate information dialogs are typically difficult to interpret for all but the most security-savvy users.
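The certificate details behind the padlock icon are programmatically accessible. As a minimal sketch (the hostname and helper name below are illustrative, not drawn from the study), Python’s standard ssl module can retrieve the same fields—issuer, subject, expiry—that a browser’s certificate dialog presents and that participants rarely opened:

```python
import socket
import ssl

def peer_certificate(host: str, port: int = 443) -> dict:
    """Fetch the validated peer certificate of an HTTPS endpoint."""
    # The default context verifies the certificate chain and hostname,
    # mirroring the checks summarized by the browser's padlock icon.
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

# Usage (requires network access), e.g.:
#   cert = peer_certificate("example.org")
#   cert["subject"] and cert["notAfter"] are among the fields a
#   certificate dialog would display to the user.
```

Even in this stripped-down form, the returned structure (nested tuples of distinguished-name components, validity dates, subjectAltName entries) hints at why certificate dialogs overwhelm non-expert users.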

The same theme of configuration complexity emerges from Good et al.’s work on the privacy implications of KaZaA, a popular file-sharing network [127]. Good et al. performed a cognitive walkthrough of the KaZaA client as well as a laboratory user study of its user interface. Results showed that a majority of participants were unable to tell which files they were sharing, and some even thought that they were not sharing any files.

11 While low usability certainly contributed to PGP’s lackluster adoption, a reverse network effect, where few people could decrypt email, coupled with a perceived lack of need, may also be responsible. For example, it is worth noting that the competing S/MIME standard, already integrated into popular email applications like Outlook and Thunderbird, has also not been widely adopted, even though it is arguably simpler to use (although not necessarily to configure).

Generally speaking, email encryption systems have been most successful when a service organization was present to configure and set up the clients. However, Gaw et al. found that even in organizations where email encryption technology is used, decisions about encrypting emails were driven not just by technical merit but also by social factors [119]. They found that “users saw universal, routine use of encryption as paranoid. Encryption flagged a message not only as confidential but also as urgent, so users found the encryption of mundane messages annoying.” Interestingly, this result is paralleled by research by Weirich and Sasse on compliance with security rules: users who follow them are viewed as paranoid and exceedingly strict [301].

end-user-privacy-in-human-computer-interaction-v57.docPage 47 of 85
