
al. believe this discomfort was due to a social desire to appear approachable. Overall, this result suggests that people prefer the flexibility of ambiguity over a clear message that offers no such latitude.

It is also worth noting that plausible deniability is at odds with a traditional view of security, defined as “confidentiality, integrity, and availability” [98]. Integrity and availability stand in contrast to the idea that individuals should be granted a certain degree of unaccountability within information systems. Social science suggests, however, that plausible deniability is a fundamental element of social relations. Thus, plausible deniability should be viewed as a possible requirement for information technology, especially for artifacts meant to support communication between individuals and organizations.

A related issue is that plausible deniability may inhibit social translucency, which has been touted as one of the characteristics that make computer-mediated communication effective and efficient. Erickson and Kellogg define socially translucent systems as IT that supports “coherent behavior by making participants and their activities visible to one another” [94]. Plausible deniability may make it difficult to hold other people accountable for their actions in such systems. A similar tension is explicitly acknowledged in the context of CSCW research by Kling [180] and was debated as early as 1992 at the CSCW conference [21]. It is currently unclear how best to balance these two properties. Social translucency is also discussed with respect to evaluation in Section 3.3.3.

Finally, one must take into consideration the fact that users of computer-mediated communication systems often perceive more privacy than the technology actually provides. For example, Hudson and Bruckman show that people have a far greater expectation of privacy in Internet Relay Chat (IRC) than can realistically be met given the design and implementation of IRC [151]. Thus, in addition to balancing plausible deniability with social translucency, designers must also consider users’ expectations of those properties. We concur with Hudson and Bruckman that more research is needed in this area. This point is raised again in the final part of this article.

3.3.9 Fostering Trust in Deployed Systems

The issue of trust in IT is a complex and vast topic, involving credibility, acceptance, and adoption patterns. Clearly, respecting the privacy of the user can increase trust in the system. The relationship also works in the opposite direction: if an application or web site is trusted by the user (e.g., due to a reputable brand), privacy concerns may be assuaged. In this section, we provide a brief overview of HCI research on technology and trust with respect to information privacy, both as a social construct and as a technical feature.

Trust is a fundamental component of any privacy-affecting technology. Many PETs have been developed under the assumption that, once adopted, they would let users engage with IT services with increased trust [239]. One particularly interesting concept is that of trust distribution, in which information processing is split among independent, non-colluding parties [60]. Trust distribution can also be applied to human systems, e.g., requiring two managers, each holding a separate key, to jointly open a safe.
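As an illustration (not taken from the cited works), trust distribution can be approximated in software with a simple two-party XOR secret split: each party holds a share that is individually indistinguishable from random bytes, and the original data can be reconstructed only when both parties cooperate. The following Python sketch is a minimal example of the idea; the function names and sample data are hypothetical, not a production scheme.

    import secrets

    def split_secret(secret: bytes) -> tuple[bytes, bytes]:
        # Share A is a uniformly random pad; share B is the secret XORed
        # with that pad. Each share alone reveals nothing about the secret.
        share_a = secrets.token_bytes(len(secret))
        share_b = bytes(s ^ a for s, a in zip(secret, share_a))
        return share_a, share_b

    def reconstruct(share_a: bytes, share_b: bytes) -> bytes:
        # The secret is recoverable only if both (non-colluding) parties
        # contribute their shares.
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    # Hypothetical example: a user's location is recoverable only if the
    # two independent services holding the shares collude.
    secret = b"lat=47.62,lon=-122.35"
    a, b = split_secret(secret)
    assert reconstruct(a, b) == secret

Schemes used in practice, such as Shamir's secret sharing and threshold cryptography, generalize this two-party construction to k-of-n parties.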

Social context is another factor affecting trust and privacy. Shneiderman discusses the generation of trust in CSCW systems [263], claiming that just as a handshake is a

