
while in fact all files on their hard drive were shared. Good et al. also probed the KaZaA network, finding that a large number of users “appeared to be unwittingly sharing personal and private files, and that some users were […] downloading files containing ostensibly private information.” In summary, the findings of Whitten and Tygar, Whalen et al., and Good et al. all indicate that privacy-affecting technologies are easily misunderstood and that their safe use is not obvious.

Difficulties in comprehension affect not only PETs but also privacy policies. Jensen and Potts analyzed sixty-four privacy policies of both high-traffic web sites and web sites of American health-related organizations (thus subject to HIPAA) [168]. They analyzed policy features including accessibility, writing, content, and evolution over time. The results portray a rather dismal situation. While policies are generally easy to find, they are difficult to understand: the surveyed policies were in general too complex, from a readability standpoint, to be usable by a large part of the population, which, Jensen and Potts note, also calls their legal validity into question. Furthermore, the user herself is typically responsible for tracking any changes to policies, thus curtailing effective notification. The policies of some web sites were very old, exposing both users and site operators to potential risks (respectively, unauthorized uses of personal information and legal liability). Finally, Jensen and Potts note that users typically do not have the choice to decline the terms of a policy if they want to use the service. In short, the resulting picture is not encouraging. Users may well be responsible for not reading privacy policies [126], but even if they did read them, they would find it difficult to understand them, track them over time, and resist accepting them.
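The readability claim can be made concrete with a standard formula such as the Flesch Reading Ease score, which penalizes long sentences and polysyllabic words. The following is a minimal sketch of how such a score is computed, not the tooling used in the study; the naive vowel-group syllable counter and the example sentence are assumptions of this illustration.

```python
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as the number of vowel groups (naive heuristic)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: higher scores mean easier text.
    Scores below roughly 30 are generally considered very difficult."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

# Illustrative example: legalistic policy prose scores far lower (harder)
# than plain conversational text.
policy = ("We may disclose personally identifiable information to "
          "affiliated entities in accordance with applicable regulations.")
print(round(flesch_reading_ease(policy), 1))
```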

Evaluation of privacy-sensitive IT applications has also extended to off-the-desktop interaction. For example, Beckwith discusses the challenges of evaluating ubiquitous sensing technologies in assisted living facilities [38]. Beckwith deployed an activity-sensing and location-tracking system in a facility for elderly care, and evaluated it using semiformal observation and unstructured interviews with caregivers, patients, and their relatives. One question that arose was how users can express informed consent when they do not understand how the technology operates, or are not even aware of it. His observations highlight the users’ lack of understanding of who receives the data and for what purpose it is used. Beckwith proposed renewing informed consent on a regular basis, through “jack-in-the-box” procedures, an approach that resembles the Just-In-Time Click-Through Agreements of Patrick and Kenny [235].
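Neither Beckwith nor Patrick and Kenny prescribe an implementation, but the renewal idea can be sketched as an expiring-consent check: consent is recorded with a timestamp, and the user is re-prompted once it goes stale. All names below (ConsentRecord, needs_renewal) and the 30-day interval are illustrative assumptions, not details from the cited work.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RENEWAL_INTERVAL = timedelta(days=30)  # assumed policy, not from the source

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str          # what the data is used for
    recipient: str        # who receives the data
    granted_at: datetime  # when consent was last given or renewed

def needs_renewal(record: ConsentRecord, now: datetime) -> bool:
    """Treat consent as stale once RENEWAL_INTERVAL has elapsed."""
    return now - record.granted_at >= RENEWAL_INTERVAL

def check_consent(record: ConsentRecord, now: datetime) -> bool:
    """Re-prompt ('jack-in-the-box') when consent has gone stale;
    otherwise proceed without interrupting the user."""
    if needs_renewal(record, now):
        answer = input(f"Still OK to share {record.purpose} data with "
                       f"{record.recipient}? [y/n] ")
        if answer.strip().lower() != "y":
            return False
        record.granted_at = now  # renew the consent timestamp
    return True
```

The design choice is the same as in just-in-time click-through agreements: consent is requested at the moment of use, in small digestible pieces, rather than once in a monolithic up-front agreement.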

In conclusion, existing work evaluating privacy-affecting technologies shows that these technologies place excessive demands on users [95]. Besides establishing common practices and safe defaults, we need to define appropriate metrics of user understanding and of the ability to express consent, and to improve on those metrics consistently over time.

3.4.2 Holistic Evaluation

In addition to basic usability, applications must also be evaluated in their overall context of use. One key aspect of holistic evaluation is understanding the social and organizational context in which an application is deployed, because it can affect acceptance and skew the results of an evaluation (e.g., Keller’s analysis of privacy issues of electronic voting machines [177]). This kind of analysis is often done with retrospective case studies and controlled deployments of prototypes [53], but is
