link between remote co-workers. Users’ experiences were studied through interviews, transcripts of communications, usage logs, and direct observation. Ackerman et al. report the gradual emergence of social norms regulating the use of the space by group members. Users started ignoring disclosures by other users that were clearly personal in nature and had been transmitted through the system by mistake, perhaps because one party had forgotten to turn off the media space before a sensitive conversation.
Cool et al. also discuss the long-term evaluation of a videoconferencing system developed at Bellcore during the 1990s. The system started out as an always-on link between public spaces and evolved into a personal videoconferencing system on personal workstations. Cool et al. observed four issues with their videoconferencing systems: system drift (system use and norms evolve over time), conflicting social goals of individual users within the social system, concerns of social appropriateness and evaluation, and reaching a critical mass of users. Cool et al. point out that test implementations should be as complete and robust as possible, i.e., real products, if credible observations of social behavior are sought. Studies should also extend over a long timeframe to support conclusions about the system’s acceptance. Finally, technology must be evaluated in the context of planned use rather than in a laboratory.
Cool et al.’s work leads to a final aspect of holistic evaluation, namely that it can be difficult to gather data on the privacy-sensitive aspects of IT applications. First, privacy and security are non-functional properties that may not be apparent to the user or reflected in the UI. Second, case studies on privacy and security are often hampered by the lack of public knowledge about failures or successes. Third, concerns of social appropriateness can affect perceptions as well as cause tensions in collaborative environments, all of which can skew observations. These factors suggest that, to interpret observations correctly, researchers must take a broad view of the application and its perceived properties. Only through careful observation will user privacy concerns and perceptions emerge from product evaluations.
3.4.3 The Tension between Transparency and Privacy
In Section 3.2.8, we briefly touched on the tension between privacy and social transparency. One of the goals of CSCW research is to increase communication opportunities through technology. However, increased transparency, e.g., in the form of awareness of others’ activities, can conflict with an individual’s need for autonomy and solitude, with detrimental effects on organizational effectiveness. To a degree, these tensions have always existed, but Grudin points out that electronically collecting and distributing data about individuals significantly increases the risk of undesired uses. The point of this section is to show that the tension between transparency and privacy is subtle and that simple design features can often make the difference between acceptance and rejection of a system.
Groupware calendars provide a prime example of this tension. Two obvious advantages of group calendars are more effective planning and better access to colleagues. However, these advantages also impinge on users’ personal space and work time. Palen describes the prolonged use of a groupware calendar system within a large organization, based on observations and expert analysis. She points out that technological infrastructure can curb risks by making misuse more costly than the potential gains. She