
whether the personalized interface runs on a fixed terminal or a portable device, but also on its location and on its purpose of use.

In summary, research in this area suggests that the issue of personalization and privacy is highly contextual and depends heavily on trust, interpersonal relations, and organizational setting. The evidence also suggests that users and marketers alike appreciate customized services. Finally, it is not clear whether sophisticated PETs are commercially viable. Consequently, a normative approach to preventing the misuse of personal information might be better advised.

3.4 Evaluation

In this section, we outline significant work either evaluating PETs or specifically probing the privacy characteristics of applications.10 Most PETs require advanced knowledge to use, are complex to configure and operate correctly, and ultimately fail to meet end-user needs. However, it is worth pointing out that there are also many examples of IT applications that successfully integrate privacy-enhancing functions, for example, instant messaging clients and mobile person finders.

While some researchers have pointed out the importance of user-centered design in security technology [317], only recently have the security and privacy communities started moving down this path. Unfortunately, since many security applications are developed commercially, the results of in-house usability tests, interviews, and heuristic evaluations are not available. User testing of the privacy-related aspects of applications is difficult for various reasons, including their non-functional nature and their prolonged appropriation curves. As a result, few reports are available describing summative evaluation work on PETs and privacy-sensitive technologies.

3.4.1 Evaluation of User Interfaces

One of the earliest and most renowned papers discussing HCI issues and PETs was Whitten and Tygar’s “Why Johnny Can’t Encrypt” [310]. Whitten and Tygar reported on the usability of Pretty Good Privacy (PGP), a popular email encryption application [315]. They conducted a cognitive walkthrough and a lab-based usability test on PGP. In the usability test, experienced email users were asked to perform basic tasks, for example, generating keys and encrypting and decrypting emails. Results showed that a majority of users did not form a correct mental model of the public-key encryption process. Some users also made significant mistakes such as sending unencrypted email, while others did

10 We are aware that the distinction between design and evaluation is, to a certain degree, artificial in an iterative development model. However, we feel that the techniques that are discussed here specifically apply to already-developed products, i.e. are more appropriate for summative evaluation.
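The mental model that participants in the PGP study failed to form is the asymmetry of public-key cryptography: a message is encrypted with the recipient's public key and can only be decrypted with the matching private key. As a purely illustrative aid (not PGP's actual implementation), the following toy RSA sketch with deliberately tiny, insecure parameters shows that relationship:

```python
# Toy RSA sketch of the public-key mental model (illustrative only; the
# fixed small primes make this trivially breakable and unlike real PGP).

def generate_keys():
    # Real RSA uses large random primes; these are fixed for illustration.
    p, q = 61, 53
    n = p * q                 # modulus shared by both keys
    phi = (p - 1) * (q - 1)
    e = 17                    # public exponent
    d = pow(e, -1, phi)       # private exponent (modular inverse of e)
    return (e, n), (d, n)     # (public key, private key)

def encrypt(message, public_key):
    # Sender uses the RECIPIENT'S public key.
    e, n = public_key
    return [pow(ord(ch), e, n) for ch in message]

def decrypt(ciphertext, private_key):
    # Only the recipient's private key undoes the encryption.
    d, n = private_key
    return "".join(chr(pow(c, d, n)) for c in ciphertext)

public, private = generate_keys()
ciphertext = encrypt("hi", public)
plaintext = decrypt(ciphertext, private)
```

Sending mail encrypted with one's own public key, or not encrypted at all, are exactly the failure modes that an incorrect mental model of this key asymmetry produces.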

