
For further reading, we suggest Brunk's overview of privacy and security awareness systems [54] and Lederer's examples of feedback systems for privacy events in ubiquitous computing [195].

3.3.6 Interpersonal Awareness

An alternate use of the term "awareness" relates to the sharing of information about individuals in social groups to facilitate communication or collaboration. This type of sharing occurs, for example, in communication media such as videoconferencing [118, 269], group calendars [39, 287], and synchronous communications [40, 233].

One example of an awareness system is RAVE, developed in the late 1980s at EuroPARC [118]. RAVE was an "always on" audio/video teleconferencing and awareness system. Based on the RAVE experience, Bellotti and Sellen wrote an influential paper presenting a framework for personal privacy in audio-video media spaces [43] (see Section 3.5.2). RAVE provided visible signals of the operation of the video camera to the people being observed, to compensate for the disembodiment of the observer-observed relationship. Bellotti and Sellen also suggested leveraging symmetric communication to overcome privacy concerns. Symmetric communication is defined as the concurrent exchange of the same information in both directions between two individuals (e.g., both parties are simultaneously observers and observed).
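To make the symmetry principle concrete, the sketch below models a two-way media-space link that transmits only while both endpoints are observable. This is a minimal illustration of the concept under assumed names; it is not RAVE's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Endpoint:
    """One participant in a two-way media-space link (hypothetical)."""
    name: str
    camera_on: bool = False

class SymmetricLink:
    """Enforces symmetric communication: media flows in either
    direction only while *both* endpoints are transmitting, so
    every observer is also observed."""

    def __init__(self, a: Endpoint, b: Endpoint):
        self.a, self.b = a, b

    def is_active(self) -> bool:
        # Symmetry invariant: no one-way observation is possible.
        return self.a.camera_on and self.b.camera_on

# Example: Alice cannot watch Bob until Bob's camera is also on.
alice, bob = Endpoint("Alice"), Endpoint("Bob")
link = SymmetricLink(alice, bob)
alice.camera_on = True
assert not link.is_active()   # one-way request: blocked
bob.camera_on = True
assert link.is_active()       # both observe and are observed
```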

Providing feedback about information flows and allowing users to control them is a complex problem. Neustaedter and Greenberg's media space showcases a variety of interaction techniques for this purpose. To minimize potential privacy risks, they used motion sensors near a doorway to detect the presence of other people, weight sensors in chairs to detect the primary user, physical sliders to control volume, and a large physical button to easily turn the system on and off [222].
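As a rough sketch of how such sensor inputs could gate capture, the following policy function maps readings from a doorway motion sensor, a chair weight sensor, and a master button to a capture state. The rules and names are illustrative assumptions, not the logic of Neustaedter and Greenberg's system.

```python
def capture_state(master_on: bool, chair_occupied: bool,
                  doorway_motion: bool) -> str:
    """Map physical-sensor readings to a media-space capture state.
    Hypothetical policy, loosely inspired by the controls above."""
    if not master_on:
        return "off"       # large physical button: hard kill switch
    if doorway_motion:
        return "blocked"   # a visitor is present: suspend capture
    if chair_occupied:
        return "live"      # primary user at desk: transmit normally
    return "idle"          # nobody detected: nothing to transmit

assert capture_state(False, True, False) == "off"
assert capture_state(True, True, True) == "blocked"
assert capture_state(True, True, False) == "live"
```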

Hudson and Smith proposed obfuscating media feeds by applying filters to the video and audio [152]. These filters include artificial "shadows" in the video image and muffled audio. While they did not evaluate these privacy-enhancing techniques, Hudson and Smith posited that privacy and usefulness had to be traded off to achieve an optimal balance. Boyle et al. also proposed video obfuscation to protect the privacy of home webcam users [49, 223]. However, an evaluation by Neustaedter et al. showed that obfuscation increased neither users' confidence in the technology nor their comfort level [224]. It is thus unclear whether obfuscation techniques, which rest on an "information-theoretic" view of privacy (i.e., disclosing less information increases privacy), actually succeed in assuring users that their privacy is better protected.
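The information-theoretic intuition behind such filters can be sketched with a simple pixelation pass over a grayscale frame. The block-averaging method and parameters below are illustrative assumptions, not the specific shadow or blur filters used by Hudson and Smith or Boyle et al.

```python
import numpy as np

def pixelate(frame: np.ndarray, block: int = 16) -> np.ndarray:
    """Obfuscate a grayscale video frame by averaging over
    block x block tiles, reducing the information disclosed.
    Larger blocks trade more usefulness for more privacy."""
    h, w = frame.shape
    out = frame.astype(float).copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[:] = tile.mean()   # every pixel in the tile -> tile average
    return out.astype(frame.dtype)

# Example: a 64x64 frame collapses to at most 16 distinct values.
frame = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
blurred = pixelate(frame, block=16)
assert len(np.unique(blurred)) <= 16
```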

The idea of "blurring" information was also proposed in the domain of location information [87, 242]. However, Neustaedter et al.'s results for video are paralleled by Consolvo et al.'s results for location systems [65]. Consolvo et al. found that users disclosing their location seldom make use of "blurring" (i.e., disclosing an imprecise location, such as the city instead of a street address), in part because they see little need for it and in part because of the added usability burden.
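The precision-modulation idea itself is simple. A hedged sketch follows, with coordinate rounding standing in for the "blurring" step; the granularity labels and decimal places are purely illustrative assumptions.

```python
def blur_location(lat: float, lon: float, granularity: str) -> tuple:
    """Disclose a location at reduced precision by rounding
    coordinates to a coarser grid. The mapping below is illustrative:
    ~4 decimal places is roughly street level, ~1 is roughly city level."""
    decimals = {"exact": 6, "street": 4, "neighborhood": 2, "city": 1}
    d = decimals[granularity]
    return round(lat, d), round(lon, d)

# Example: the same position disclosed at street vs. city granularity.
print(blur_location(47.606209, -122.332071, "street"))  # (47.6062, -122.3321)
print(blur_location(47.606209, -122.332071, "city"))    # (47.6, -122.3)
```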

Tang et al. suggest "Hitchhiking" as an alternative approach: rather than modulating the precision of location disclosures, the identity of the disclosing party, along with any sensed data, is anonymized [281]. This approach can still support a useful class of applications.
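For contrast with blurring, here is a minimal sketch of the anonymization idea under assumed field names: the report keeps the location and sensed value at full precision, but is stripped of anything linking it to the disclosing party. This illustrates the general principle, not the Hitchhiking system's actual data format.

```python
import secrets

def anonymize_report(report: dict) -> dict:
    """Anonymize a sensor report in the spirit of Hitchhiking:
    preserve the (precise) location and sensed value, but strip the
    identity of the disclosing party. Field names are hypothetical."""
    return {
        "location": report["location"],     # precision preserved
        "reading": report["reading"],       # sensed data preserved
        "report_id": secrets.token_hex(8),  # unlinkable random ID
        # deliberately omitted: user_id, device_id
    }

raw = {"user_id": "alice", "device_id": "phone-42",
       "location": (47.6062, -122.3321), "reading": "wifi-busy"}
anon = anonymize_report(raw)
assert "user_id" not in anon and "device_id" not in anon
```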
