low effectiveness of privacy policies on web sites [293]. Acquisti explains why Privacy-Enhancing Technologies (PETs) have not enjoyed widespread adoption, by modeling the costs and expected benefits of using a PET versus not using it, treating users as rational economic agents [13]. Acquisti also argues that economics can help the design of privacy in IT by identifying situations in which all economic actors have incentives to “participate” in the system (e.g., in systems that require the collaboration of multiple parties, such as anonymizing networks). He further contends that economics can help in identifying what information should be protected and what should not, for example, identifying situations in which the cost of breaching privacy is lower than the expected return (a basic risk analysis exercise).
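Acquisti's rational-agent framing can be sketched as a simple expected-value comparison: the user adopts a PET only when its expected net benefit exceeds that of disclosing data unprotected. The following sketch is illustrative only; the function name and all numeric values are hypothetical assumptions, not figures from [13].

```python
# Illustrative sketch of the rational-agent model of PET adoption:
# expected utility = direct benefit - adoption cost - expected privacy loss.
# All values below are hypothetical.

def expected_utility(benefit, adoption_cost, breach_probability, breach_loss):
    """Expected utility of a disclosure decision for a rational agent."""
    return benefit - adoption_cost - breach_probability * breach_loss

# Without a PET: no usage cost, but a high chance the data is misused.
u_without = expected_utility(benefit=10.0, adoption_cost=0.0,
                             breach_probability=0.6, breach_loss=20.0)

# With a PET: a usage/learning cost, but a much lower residual breach risk.
u_with = expected_utility(benefit=10.0, adoption_cost=5.0,
                          breach_probability=0.05, breach_loss=20.0)

print(f"utility without PET: {u_without:.2f}")  # 10 - 0 - 12 = -2.00
print(f"utility with PET:    {u_with:.2f}")     # 10 - 5 - 1  =  4.00
print("rational agent adopts PET:", u_with > u_without)
```

Under these (made-up) parameters adoption is rational; Acquisti's point is that in practice the adoption cost is salient and immediate while the breach loss is probabilistic and deferred, which tips real users the other way.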

The main limitation of economic models is that their assumptions do not always hold. Individuals do not have unlimited resources (in particular, they lack sufficient information for making rational decisions), and their decisions are often affected by non-rational factors such as peer pressure and social navigation [14]. One explanatory theory, discussed by Acquisti and Großklags, is bounded rationality: individuals cannot fully process the complex set of risk assessments, economic constraints, and consequences that a disclosure of personal data entails.

Acquisti and Großklags’ research casts serious doubt on whether individuals are capable of expressing meaningful preferences in relation to data protection (i.e., the collection of data by organizations). While in interpersonal relations individuals have a refined set of expectations and norms that support decision-making and a fine-grained process of disclosing or withholding information, the same is not true of disclosures to data-collecting organizations.

The Approximate Information Flows (AIF) framework proposed by Jiang et al. [172] combines ideas from economics and information theory. In AIF, Jiang et al. state the Principle of Minimum Asymmetry:

“A privacy-aware system should minimize the asymmetry of information between data owners and data collectors and data users, by decreasing the flow of information from data owners to data collectors and users and increasing the [reverse] flow of information…” [172]

To implement this principle, the authors propose a three-pronged strategy. First, personal information should be managed by modulating and enforcing limits on the persistency (retention time), accuracy (a measure of how precise the data is), and confidence (a probability measure that the data is correct) of information within an information system. Second, the personal information lifecycle should be analyzed according to the categories of collection, access, and second use. Third, at each of these stages, the system should provide ways to prevent, avoid, and detect the collection, access, and further use of personal information.
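The first prong of this strategy can be sketched as a data record that carries its own persistency, accuracy, and confidence limits, which the system enforces on access. This is a hypothetical illustration of the idea, not code from Jiang et al.; the class and method names are invented for this sketch.

```python
# Hypothetical sketch of AIF-style data management: each datum carries
# limits on persistency (retention), accuracy, and confidence, and the
# system enforces them when the datum is accessed or released.
from dataclasses import dataclass

@dataclass
class PersonalDatum:
    value: float            # e.g., a sensor or location reading
    collected_at: float     # timestamp of collection (seconds)
    retention_s: float      # persistency: lifetime before mandatory deletion
    accuracy: float         # precision of the value, e.g., radius in meters
    confidence: float       # probability that the value is correct

    def readable(self, now):
        """Persistency limit: the datum is usable only within retention."""
        return (now - self.collected_at) <= self.retention_s

    def degraded(self, factor):
        """Release a coarser, less certain copy to reduce asymmetry."""
        return PersonalDatum(self.value, self.collected_at, self.retention_s,
                             self.accuracy * factor, self.confidence / factor)

d = PersonalDatum(value=48.85, collected_at=0.0, retention_s=3600,
                  accuracy=10.0, confidence=0.95)
print(d.readable(now=1800))   # True: within the retention window
print(d.readable(now=7200))   # False: past retention, must be deleted
coarse = d.degraded(2.0)
print(coarse.accuracy, round(coarse.confidence, 3))  # 20.0 0.475
```

Degrading accuracy and confidence before release corresponds to AIF's goal of decreasing the information flow from data owners to data collectors and users.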

The authors used AIF to analyze several technologies and applications, such as P3P and feedback-and-control systems, to show how these fit within the framework. However, the model has some limitations. First, the authors used AIF only as an analytic tool; it has not been employed as a design model. Second, the model assumes that all data users comply with it and respect the constraints on the use and interpretation of personal data. Finally, there is a potential conflict between this approach and data protection legislation in certain jurisdictions, because such legislation requires data controllers to guarantee the integrity and correctness of the data they are entrusted with,
