challenging due to the extended timeframe of the evaluation and the complexity of the data collection methods.

One interesting example of how social context affects the acceptance of privacy-sensitive IT is provided by the “office memory” project developed at the Laboratory of Design for Cognition at Electricité de France [189], discussed in Section 3.2.9. Here, the social context was essential for acceptance: the users were, by and large, the builders of the application. Acceptance would likely have been much lower in another setting. For example, as noted in Section 3.3.9, there was much resistance to the deployment of the Active Badge system [296] outside of the group that developed it [140]. Perceptions of individual autonomy, political structures, and group tensions all contributed to the rejection of a technology that was perceived as invasive.

Similarly, in hospitals, locator badges are used to facilitate coordination and to protect nurses from spurious patient claims. However, in many cases these badges have increased friction between workers and employers, because nurses perceived them as a surreptitious surveillance system [22]. In at least two separate cases, nurses outright refused to wear the locator badges [22, 59]. Where the value proposition was clear to the nurses using the system, and where management respected the nurses, the system was accepted. Where the value proposition was unclear, or the system was seen as not directly helping the nurses, it tended to exacerbate existing tensions between staff and management.

A second contentious social issue with respect to privacy-invasive systems is adjudication, that is, whose preferences should prevail when part of the user base favors a technology and part opposes it. Although a general discussion is beyond the scope of this paper, one interesting observation is made by Jancke et al. in the context of a video awareness system [165]. Jancke et al. note that what is commonly considered a public space is not one-dimensionally so. A vocal minority of their users were unsettled by an always-on system linking two public spaces. These users felt that many private activities took place in that “public space,” such as personal phone calls, eating lunch, and occasional meetings, and that the private nature of this “public space” was being subverted. Before the video awareness system was deployed, the space afforded a degree of privacy based on its proxemics. When the computer-mediated communication technology was introduced, that privacy was lost because individuals could not easily see who was present at the other end of the system. This shows that a legal or technical definition of public space often does not align with people’s expectations.

A third key aspect of holistic evaluation stems from the observation that privacy and security features are often appropriated late in the learning curve of an application [157], typically after some unexpected security or privacy “incident.” Forcing participants to use privacy-related features can speed up the evaluation, but may be detrimental because the participants’ attention is focused on a specific feature rather than on the application as a whole. Thus, evaluating privacy and security through test deployments requires researchers to observe prolonged and continued use.

For example, Ackerman et al. performed a field study of an “audio media space” over the course of two months [12]. Their system provided an always-on audio communication
