In summary, a variety of factors influence end-users' trust in a system. In our opinion, however, strong brands and a positive direct experience remain the most effective ways of assuring users that sound organizational information privacy practices are being followed.

3.3.10 Personalization and Adaptation

Personalization and adaptation technologies can have strong effects on privacy. The tension here is between improving the user experience (e.g., recommendations) and collecting large amounts of data about user behavior (e.g., online navigation patterns). For example, Kobsa points out that personalization technologies “may be in conflict with privacy concerns of computer users, and with privacy laws that are in effect in many countries” [183].9 Furthermore, Kobsa and Schreck note that users with strong privacy concerns often take actions that can undermine personalization, such as providing false registration information on web sites [184]. Trewin even claims that control of privacy should take precedence over the use of personal information for personalization purposes, but acknowledges that such control may increase the complexity of the user interface [286].

Several solutions have been developed to protect users while offering personalized services. For example, Kobsa and Schreck propose anonymous personalization services [184]. However, Cranor points out that these strong anonymization techniques may be too complex for commercial adoption [69]. Cranor also observes that privacy risks can be reduced by employing pseudonyms (i.e., associating the interaction with a persona that is only indirectly bound to a real identity), client-side data stores (i.e., leveraging the user's increased control over local data), and task-based personalization (i.e., personalization limited to a single session or work task).
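To make these three mitigations concrete, the following minimal sketch (ours, not drawn from Cranor's or Kobsa and Schreck's work) shows how a client might combine a random pseudonym, a local data store, and session-scoped personalization state; all class and method names here are hypothetical.

    # Illustrative sketch: pseudonymous, client-side, task-scoped personalization.
    import json
    import secrets
    from pathlib import Path

    class PseudonymousProfile:
        """Personalization data keyed by a random pseudonym, stored locally."""

        def __init__(self, store_path: Path):
            self.store_path = store_path
            if store_path.exists():
                self.data = json.loads(store_path.read_text())
            else:
                # A random pseudonym stands in for the user's real identity.
                self.data = {"pseudonym": secrets.token_hex(16), "preferences": {}}
            # Task-based state lives only in memory and is discarded with the session.
            self.session_signals = []

        def record_signal(self, item_id, persist=False):
            """Record an interaction; persist it locally only if the user opts in."""
            self.session_signals.append(item_id)
            if persist:
                prefs = self.data["preferences"]
                prefs[item_id] = prefs.get(item_id, 0) + 1
                self.store_path.write_text(json.dumps(self.data))

        def recommendation_query(self):
            """Build a server query that exposes the pseudonym, never a real identity."""
            return {
                "user": self.data["pseudonym"],
                "recent_items": self.session_signals[-10:],
            }

In this arrangement the server only ever sees the pseudonym and the current session's signals, while the long-term preference profile remains on the client under the user's control.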

Notwithstanding Kobsa and Schreck’s and Cranor’s work, real-world experience tells us that many users are willing to give up privacy for the benefits of personalization. One need only look at the success of Amazon.com’s recommender system as an example. Awad and Krishnan provide another perspective on this argument. Their survey probed users’ views on the benefits of personalization and their preferences regarding data transparency (i.e., giving users access to the data that organizations store about them and to how that data is processed) [33]. Awad and Krishnan concluded that users with the highest privacy concerns (“fundamentalists”) would be unwilling to use personalization functions even with increased data transparency. They suggested focusing instead on providing personalization benefits to users who are unconcerned or pragmatists, and ignoring concerned individuals. Awad and Krishnan’s article also includes a brief overview of privacy literature in the MIS community.

Trevor et al. discuss the issue of personalization in ubicomp environments [285]. They note that in these environments, an increasing number of devices are shared between multiple users and this can cause incidental privacy issues. In their evaluation, Trevor et al. probed the personalization preferences of users of a ubiquitous document sharing system in an office setting. They discovered that privacy preferences depend not only on

9 For an overview of work in this area, we refer the reader to the edited volume: Brusilovsky, P., A. Kobsa, and W. Nejdl (eds.), The Adaptive Web: Methods and Strategies of Web Personalization. Springer Verlag: Heidelberg, Germany, 2007 [55].
