
adoption. We advance a similar argument here, adopting these three models with respect to privacy (see Figure 2).

The binary evaluation model suggests that the acceptance or rejection of a technology is affected by whether it is perceived as trustworthy (or not) in protecting the user's privacy. This strategy is adopted by users who lack the time, interest, or knowledge to make a more nuanced decision.

[Figure 2: three panels, Binary Evaluation, Threshold Evaluation, and Spectral Evaluation, each relating user acceptance or rejection to the degree to which the product protects privacy.]

Figure 2. Three models of privacy concerns impacting adoption. A simple view of the domain leads to a binary evaluation model. An increasingly sophisticated understanding allows users to employ more refined evaluation models (Threshold Evaluation and Spectral Evaluation). Figure adapted from [110].

The threshold evaluation model is adopted by users with moderate interest or knowledge in a particular technology. It suggests that a product is accepted if its perceived trustworthiness is above an upper threshold and rejected if it falls below a lower threshold. Between these thresholds, the user forms a more nuanced opinion and brings other considerations to bear, which may affect the acceptance judgment.

The spectral evaluation model is adopted by users with the resources and knowledge to form a sophisticated view of a system; it does not necessarily imply a flat-out rejection or acceptance of the system, whatever its privacy qualities.
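To make the distinction between the three models concrete, they can be summarized as simple decision rules over a perceived trustworthiness score. The sketch below is a minimal illustration under our own assumptions: the numeric trust score in [0, 1] and the specific threshold values are illustrative choices, not quantities defined in the models or in [110].

```python
# Hedged sketch of the three evaluation models described above.
# The numeric trust score in [0, 1] and the threshold values (0.3, 0.7)
# are illustrative assumptions, not part of the original models.

def binary_evaluation(protects_privacy: bool) -> str:
    """Accept the product if and only if it is perceived as protecting privacy."""
    return "accept" if protects_privacy else "reject"

def threshold_evaluation(trust: float, low: float = 0.3, high: float = 0.7) -> str:
    """Accept above an upper threshold, reject below a lower one, and defer to
    other considerations (usefulness, cost, social factors) in between."""
    if trust >= high:
        return "accept"
    if trust <= low:
        return "reject"
    return "deliberate"  # nuanced judgment informed by other factors

def spectral_evaluation(trust: float) -> float:
    """Return a graded acceptance level rather than a binary outcome;
    here simply the trust score itself, clamped to [0, 1]."""
    return max(0.0, min(1.0, trust))

if __name__ == "__main__":
    print(binary_evaluation(True))      # accept
    print(threshold_evaluation(0.5))    # deliberate
    print(spectral_evaluation(0.82))    # 0.82
```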

While these models are only informed speculation, we believe that there is value in studying acceptance in the context of HCI and privacy. MIS literature on technological acceptance informs us that adoption hinges on several factors, including usability, usefulness, and social influences. Social influences also include social appropriateness and the user's comfort level, specifically in relation to privacy concerns [292].

Patrick, Briggs, and Marsh emphasize the issue of trust as an important factor in people's acceptance of systems [234]. They provide an overview of different layered kinds of trust. These include dispositional trust, based on one's personality; learned trust, based on one's personal experiences; and situational trust, based on one's current circumstances. They also outline a number of models of trust, which take into account factors such as familiarity, willingness to transact, customer loyalty, uncertainty, credibility, and ease of use. There is currently not a great deal of work examining trust with respect to privacy, but the reader should be convinced that there is a strong link between the two.

One complication of these theories is that the cultural context affects acceptance. Themes that are hotly debated by a nation's media can significantly impact the perception of privacy risks. For example, a 2003 poll in the European Union showed that privacy concerns vary by national context based on media attention on the subject [102]. However, it is not clear how to reliably predict such concerns when moving from country to country. Perhaps a general survey administered prior to deployment could be useful in these situations. Finally, other factors, such as education, socio-economic status, and labor relations, can affect privacy concerns, but we are not aware of any work in these areas in the HCI community. Clearly, there needs to be more work focusing on cultural and social context to gain a more refined understanding of how the phenomenon of acceptance unfolds within a given user base.
