preferences, letting end-users simply manage how cookies are sent. Some solutions to this roadblock are discussed in the following section.

The second roadblock is that users may not be sufficiently motivated to use these technologies. Many users do not understand the issues involved in disclosing personal information, and may simply decide to use a service based on factors such as the benefit the service offers, branding, and social navigation. We believe that there are many research opportunities here in the area of understanding user motivation with respect to privacy.

The third roadblock is that many web site owners may not have strong economic, market, or legal incentives to deploy these technologies. For example, they may feel that a standard text-based privacy policy is sufficient for their needs. Web site owners may also not want machine-readable privacy policies, because such policies eliminate ambiguity, and thus potential flexibility, in how user data may be used.

Privacy Agents

From a data protection viewpoint, a privacy decision is made every time a user, or a device under her control, discloses personal information. The increasing ubiquity and frequency of information exchanges have made attending to all such decisions unmanageable. User interfaces for privacy were developed in part to help users cope with the complexity and sheer volume of these disclosures.

Early work focused on storing user privacy preferences and automating exchanges of personal data, excluding the user from the loop. An example of this is APPEL, a privacy preferences specification language developed by Cranor et al., which can be used to describe and exchange personal privacy preferences [75]. When this model was not widely adopted, researchers started investigating the causes. Ackerman et al. noted that users want to be in control of every relevant data exchange [9]. The concept of Privacy Critics brings the user back into the loop. Critics are agents that help guide the user in making good privacy choices [10]; they were introduced by Fischer et al. in the context of software design [107]. Rather than automating decisions, Privacy Critics warn the user when an exchange of personal data is about to happen. It should be noted that modern web browsers have incorporated the concept of a critic for other kinds of transactions, e.g., warning before displaying non-secure pages or accepting dubious PKI certificates. However, it is also worth pointing out that these types of dialogs tend to be ignored by users. This issue is discussed in Section 4.2 as an open challenge for future work.
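To make the critic pattern concrete, the following is a minimal sketch, in Python, of an agent that interrupts a disclosure and asks the user before any personal data leaves the device, rather than automating the decision. The function names, data fields, and prompt wording are hypothetical illustrations, not part of any of the cited systems.

```python
# Hypothetical sketch of the Privacy Critic pattern: the critic does not decide
# for the user, it warns and asks before personal data is disclosed.

def critic_prompt(recipient: str, fields: list[str]) -> bool:
    """Warn the user about an imminent disclosure and let them decide."""
    print(f"About to send {', '.join(fields)} to {recipient}.")
    answer = input("Allow this disclosure? [y/N] ")
    return answer.strip().lower() == "y"

def submit_form(recipient: str, data: dict[str, str]) -> None:
    # The critic only intervenes when fields it considers personal are involved.
    personal_fields = [k for k in data if k in {"email", "phone", "address"}]
    if personal_fields and not critic_prompt(recipient, personal_fields):
        print("Disclosure cancelled by the user.")
        return
    print(f"Data sent to {recipient}.")  # placeholder for the actual exchange

submit_form("shop.example", {"email": "alice@example.org", "item": "book"})
```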

Following this line of research, Cranor et al. developed an agent called Privacy Bird [71]. Privacy Bird compares a web site’s P3P policy with a user’s privacy preferences and alerts the user to any mismatches. In designing Privacy Bird, precautions were taken to increase the comprehensibility of the privacy preferences user interface: keeping only the relevant elements of P3P, removing jargon, and grouping items based on end-user categories rather than on P3P structure. Cranor et al. evaluated Privacy Bird according to Bellotti and Sellen’s feedback and control criteria [43], and found that users of Internet Explorer with Privacy Bird were more aware of the privacy policies of web sites than those without it.
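The matching step at the heart of such an agent can be sketched as follows. This is an illustrative simplification, not the actual P3P vocabulary or Privacy Bird implementation: the data categories, purposes, and class names are assumptions made for the example.

```python
# Hypothetical sketch of a Privacy Bird-style check: a simplified site policy
# is compared against the user's preferences, and mismatches become warnings
# that the agent would surface (e.g., as a warning icon).

from dataclasses import dataclass, field

@dataclass
class SitePolicy:
    site: str
    # data categories the site collects, mapped to its declared purposes
    collected: dict[str, set[str]] = field(default_factory=dict)

@dataclass
class UserPreferences:
    # purposes the user refuses, per data category
    disallowed: dict[str, set[str]] = field(default_factory=dict)

def find_mismatches(policy: SitePolicy, prefs: UserPreferences) -> list[str]:
    """Return a human-readable warning for every policy statement the user objects to."""
    warnings = []
    for category, purposes in policy.collected.items():
        objectionable = purposes & prefs.disallowed.get(category, set())
        for purpose in sorted(objectionable):
            warnings.append(
                f"{policy.site} uses your {category} for {purpose}, "
                "which conflicts with your preferences."
            )
    return warnings

policy = SitePolicy("example.com", {"contact-info": {"telemarketing", "delivery"}})
prefs = UserPreferences({"contact-info": {"telemarketing"}})
for warning in find_mismatches(policy, prefs):
    print(warning)
```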
