
The third design choice is specifying the default privacy policies. For example, Palen found that 81% of corporate users of a shared calendar kept the default access settings, and that these defaults had a strong influence on the social practices that evolved around the application [231]. Agre and Rotenberg observe a similar issue with Caller ID [19], noting that “if CNID [i.e., Caller ID] is blocked by default then most subscribers may never turn it on, thus lessening the value of CNID capture systems to marketing organizations; if CNID is unblocked by default and the blocking option is inconvenient or little-known, callers' privacy may not be adequately protected.” In short, while default settings may seem like a trivial design decision, they can have a significant impact on whether people adopt a technology and how they use it.

There is currently no consensus in the research community on which situations call for coarse-grained versus fine-grained controls, or on what the defaults should be. It is likely that users will need a mixture of controls, one that balances flexibility against simplicity for the application at hand.

3.3.3 Machine-Readable Privacy Preferences and Policies

Given that most users may not be interested in specifying their privacy policy, another line of research has attempted to automate the delivery and verification of policies for web sites. The most prominent work in this area is the Platform for Privacy Preferences Protocol (P3P), which lets web sites transmit policy information to web browsers in a machine-readable format. Users can then view these policies in a standard format and decide whether to share personal information [72]; they can also configure their web browser to automate this decision.
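To make the protocol mechanics concrete, the sketch below (in Python, with a placeholder host name) shows how a user agent might locate a site’s machine-readable policy. The two discovery mechanisms it tries, the P3P HTTP response header and the policy reference file at the well-known location /w3c/p3p.xml, are both defined in the W3C P3P 1.0 specification; everything else is illustrative scaffolding, not part of the standard.

    # A sketch of P3P policy discovery (Python 3). The placeholder host
    # below is not a site known to publish a policy.
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    def discover_p3p(host: str) -> None:
        # 1. Check the response headers for a P3P header, which may carry
        #    a compact policy (CP="...") and/or a policy reference URL.
        try:
            with urlopen(Request(f"http://{host}/", method="HEAD")) as resp:
                print("P3P header:", resp.headers.get("P3P") or "none")
        except (HTTPError, URLError) as err:
            print("Header check failed:", err)

        # 2. Fall back to the policy reference file at the well-known
        #    location, which points to the site's full XML policies.
        try:
            with urlopen(f"http://{host}/w3c/p3p.xml") as resp:
                print(resp.read(300).decode("utf-8", errors="replace"))
        except (HTTPError, URLError):
            print("No policy reference file at /w3c/p3p.xml")

    discover_p3p("example.com")

A conforming user agent would go further, parsing the referenced XML policy and comparing its PURPOSE, RECIPIENT, and RETENTION elements against the user’s stated preferences before releasing any personal data.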

It is worth noting that the idea of a machine-readable privacy policy has been extended to other domains. For example, both Ackerman and Langheinrich proposed using labeling protocols similar to P3P for data collected in ubiquitous computing environments, to communicate, for instance, what location data about individuals is available and what kinds of data the environment records [8, 190].

Although P3P was developed with feedback from various industrial stakeholders, it has been a hotly contested technology (see Hochheiser for an extensive discussion of the history of P3P [147]). One principled criticism is that automating privacy negotiations may work against users’ interests and lead to loss of control. Ackerman notes that “most users do not want complete automaticity of any private data exchange. Users want to okay any transfer of private data.” [9]

In practice, P3P has not yet been widely adopted. Egelman et al. report that, of a sample of e-commerce web sites obtained in 2006 through Google’s Froogle service (froogle.google.com), only 21% contained a P3P policy [90]. Reasons may include lack of enforcement [93], lack of motivation among commercial players to adopt stringent policy automation [147], and the lack of appropriate user interfaces for delivering P3P policies to users and involving them in the decision process [10].

In our view, there are three main roadblocks to the adoption of P3P. The first issue relates to the ability of users to define and control their preferences intuitively. This difficulty could be addressed through enhancements to the user interface of web browsers. For example, Microsoft Internet Explorer 6.0 only has rudimentary support for P3P privacy
