
Trusted proxies are another example of a third-party organization that can help manage privacy. For instance, MedicAlert is a paid service that stores personal medical records and forwards them to first responders in the event of a medical emergency. Such organizations, whether not-for-profit (like MedicAlert) or for-profit (regulated by a service contract), could include:

Evaluation clearinghouses that indicate which products and services to trust. For example, SiteAdvisor [265] evaluates web sites’ spam, popup, and virus risks, and provides ratings via a web browser plug-in.

Services that hold users’ location information and disclose it only in case of emergency or under subpoena, much as mobile telecom operators do today.

Services that seed suspected privacy violators with fake personal data and track how that data is used and shared (a sketch of this idea follows the list).

Services that check whether an individual reveals too much personal information in her resume, putting her at risk of identity theft [280].
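
As an illustration of the data-seeding service above, the following is a minimal sketch, not taken from the surveyed literature, of how uniquely tagged fake records ("honeytokens") could be minted for each recipient, so that a later sighting of the data can be attributed to whoever shared it. All names, domains, and functions are hypothetical.

```python
import secrets
import uuid
from dataclasses import dataclass


@dataclass
class Honeytoken:
    """A fake personal record seeded to one specific recipient."""
    recipient: str     # organization the record was shared with
    token_id: str      # unique marker embedded in the record
    fake_name: str
    fake_email: str    # tagged alias; a sighting reveals the leaker


def mint_honeytoken(recipient: str, alias_domain: str = "example.org") -> Honeytoken:
    """Create a uniquely tagged fake identity for one recipient."""
    token_id = uuid.uuid4().hex[:12]
    fake_name = f"Alex {secrets.choice(['Reed', 'Lane', 'Hart', 'Cole'])}"
    # The alias encodes the token, so mail sent to it can be traced back.
    fake_email = f"alex.{token_id}@{alias_domain}"
    return Honeytoken(recipient, token_id, fake_name, fake_email)


def attribute_sighting(sighted_email: str, seeded: list[Honeytoken]) -> str | None:
    """Given an address that turns up 'in the wild', find who it was seeded to."""
    for token in seeded:
        if token.fake_email == sighted_email:
            return token.recipient
    return None


if __name__ == "__main__":
    seeded = [mint_honeytoken(org) for org in ("broker-a.test", "broker-b.test")]
    leaked = seeded[1].fake_email                 # pretend this address received spam
    print(attribute_sighting(leaked, seeded))     # -> "broker-b.test"
```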

In summary, privacy protection is a “systemic property” that requires support at all levels. However, special care should be taken to allocate responsibility and oversight correctly, because the business goals of many organizations may not be aligned with those of their users, as suggested by recent controversies over security leaks at large personal data brokerage firms [297, 300].

4.2 A Deeper Understanding of People’s Attitudes and Behaviors towards Privacy

The second challenge is gaining a deeper understanding of how individuals behave towards privacy-affecting systems, at all levels of interaction.

One area where research is sorely needed is developing better ways of presenting warnings and notifications to people. There are many competing forces to balance in creating an effective warning system. Warnings must be visible, comprehensible, and plausible to end-users [70, 311]. Cranor has also argued that warnings need to be tied to clear actions,16 and be designed so that users keep doing the right thing (rather than ignoring the warnings or turning them off). A counterexample to almost all of the above is the standard warning dialog, most of which are simply swatted away because they get in the way of the user’s primary goal.
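
To illustrate the point about tying warnings to clear actions, here is a minimal sketch of our own (not drawn from Cranor’s work or the cited studies) of a warning presented as a set of actionable choices rather than a bare OK/Cancel dialog. The WarningChoice and show_warning names, and the console-style interaction, are purely hypothetical.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class WarningChoice:
    """One concrete action a user can take in response to a warning."""
    label: str
    action: Callable[[], None]


def show_warning(message: str, choices: list[WarningChoice]) -> None:
    """Present a warning as a set of actionable choices, not a dismissable dialog."""
    print(f"WARNING: {message}")
    for i, choice in enumerate(choices, start=1):
        print(f"  [{i}] {choice.label}")
    picked = int(input("Choose an option: ")) - 1
    choices[picked].action()


# Example: a location-sharing warning that offers clear choices
# instead of a dialog the user simply swats away.
show_warning(
    "This application wants to share your location with third parties.",
    [
        WarningChoice("Share my precise location", lambda: print("Sharing enabled.")),
        WarningChoice("Share only my city", lambda: print("Coarse location only.")),
        WarningChoice("Don't share, and don't ask again", lambda: print("Sharing blocked.")),
    ],
)
```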

Another needed line of research is understanding how attitudes and behaviors towards privacy-affecting systems evolve and are reconciled over time. For example, recent research has shown that people’s privacy-related behavior often differs from their stated preferences, for a variety of reasons [272]. Acquisti and Gross have also shown that, on the Facebook social networking site, people perceived others as revealing too much information, even though they disclosed a great deal of information about themselves.

A third needed line of work is understanding how to influence the behavior of users. For example, Jagatic et al. provide a striking instance of how publicly available

16 Echoing the UI design advice in Section 3.2: “Present choices, not dilemmas”
