
identifies three techniques to achieve this goal. First, Palen proposes limiting calendar “surfing,” that is, accessing others’ calendar information without a specific need or personal knowledge of the calendar owner. Second, privacy controls should be reciprocal, meaning that social subgroups share the same type of information in a symmetric way. Finally, social anonymity helps prevent systematic misuse. Palen notes that calendars were retrieved by searching for a specific employee’s name. Consequently, while any employee could in theory access any other employee’s calendar, this rarely happened in practice, since any given employee would typically know the names of only a limited number of people in the company.
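The interplay of these techniques can be made concrete in code. The sketch below is purely illustrative: the class, method names, and visibility levels are assumptions, not features of Palen's studied system. It combines reciprocity (a viewer sees at most the level of detail they grant in return) with name-based lookup (no browsable directory, which is what preserved social anonymity in Palen's observations).

```python
from dataclasses import dataclass, field

@dataclass
class CalendarDirectory:
    # Hypothetical sketch of Palen's techniques; all names here are
    # illustrative assumptions. Maps (owner, viewer) -> granted level.
    grants: dict = field(default_factory=dict)

    # Ordered from least to most revealing.
    LEVELS = ["none", "free-busy", "full"]

    def set_grant(self, owner: str, viewer: str, level: str) -> None:
        self.grants[(owner, viewer)] = level

    def effective_access(self, viewer: str, owner: str) -> str:
        """Reciprocity: the weaker of the two mutual grants wins,
        so information sharing within a subgroup stays symmetric."""
        granted = self.grants.get((owner, viewer), "none")
        returned = self.grants.get((viewer, owner), "none")
        idx = min(self.LEVELS.index(granted), self.LEVELS.index(returned))
        return self.LEVELS[idx]

    def lookup(self, viewer: str, owner_name: str) -> str:
        """Retrieval only by a specific employee name -- there is no
        enumerable directory, which discourages calendar 'surfing'
        and preserves social anonymity."""
        return self.effective_access(viewer, owner_name)
```

For example, if Alice grants Bob full access but Bob grants Alice only free/busy visibility, the reciprocity rule caps both parties at free/busy, mirroring the symmetric sharing Palen advocates.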

Tullio discusses a groupware calendar used to predict other users’ availability, for the purpose of initiating in-person or mediated communication [287]. In addition to a qualitative analysis, Tullio performed an expert analysis of his groupware calendaring application using Jensen’s STRAP method and identified several potential privacy vulnerabilities, including prediction accuracy, consent, and notification. Tullio also notes that in these kinds of systems, concerns arise for “both […] controlling access as well as presenting a desired impression to others.” These dynamics are related to Goffman’s work on the presentation of self and to the concept of personal privacy we outlined in Section 2.2.2.

An explicit analysis of varying degrees of social transparency is encompassed in Erickson et al.’s work on socially translucent systems [94]. In socially translucent systems, the overall goal is to increase awareness and communication opportunities by presenting information about others’ activities. These systems are translucent¹² in that they present only select aspects of activity, as opposed to being “transparent” and presenting all aspects [51]. Erickson et al. developed Babble, a chat system that allows one-to-one and group communication. Babble stores a persistent, topic-threaded copy of the chats, and offers a graphical representation of users that provides awareness of their activity within the chat system. The system was used for over two years within the authors’ research organization. Thus, observations of Babble’s use were grounded in an extensive deployment that saw adoption succeed in some groups and fail in others. The authors report that the system was often used to initiate opportunistic interactions, and that it contributed to increasing group awareness while preserving a sufficient degree of privacy for the involved parties.
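The core idea of translucence, disclosing a coarse cue derived from activity rather than the activity itself, can be sketched briefly. This is not Babble's actual mechanism; the function name, states, and time thresholds below are assumptions chosen for illustration only.

```python
from datetime import datetime, timedelta

def translucent_presence(last_activity_at: datetime, now: datetime) -> str:
    """Map a precise activity timestamp to one of a few coarse states.

    Others gain awareness (is this person around?) without seeing
    exactly what, or how much, the person is doing -- the selective
    disclosure that makes a system translucent rather than transparent.
    Thresholds are arbitrary assumptions for this sketch.
    """
    idle = now - last_activity_at
    if idle < timedelta(minutes=5):
        return "active in conversation"
    if idle < timedelta(hours=1):
        return "recently active"
    return "idle"
```

The design choice is that the coarse state is computed system-side from detailed data that is never exposed, so the privacy boundary is enforced by what the interface can express, not by per-item access decisions.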

One interesting aspect of Erickson et al.’s work is that they claim to have willfully refrained from building norms and social conventions into the UI and system architecture. For example, Babble did not provide specific tools for protecting privacy, the expectation being that users would develop their own acceptable behaviors and norms around the system. They argue that this did indeed happen. In fact, Erickson et al. go as far as stating that building such privacy-protecting mechanisms would have prevented users from showing one another that they could be trusted in their use of the system, a process that strengthened rather than weakened the social bonds within the organization [94]. Clearly, such an approach is possible only in specific contexts, which should be carefully evaluated by the designer.

¹² The concept of translucency has also been used in other HCI domains with different meanings, for example in the design of user interfaces for mobile systems [89]: Ebling, M.R., B.E. John, and M. Satyanarayanan, “The Importance of Translucence in Mobile Computing Systems.” ACM Transactions on Computer-Human Interaction (TOCHI), 9(1), 2002, pp. 42–67.

