Table 6. Questions and evaluation criteria for video media spaces.

Questions (feedback about, and control over, each area):

Capture
  Feedback about: When and what information about me gets into the system.
  Control over: When and when not to give out what information. I can enforce my own preferences for system behaviours with respect to each type of information I convey.

Construction
  Feedback about: What happens to information about me once it gets inside the system.
  Control over: What happens to information about me. I can set automatic default behaviours and permissions.

Accessibility
  Feedback about: Which people and what software (e.g., daemons or servers) have access to information about me, and what information they see or use.
  Control over: Who and what has access to what information about me. I can set automatic default behaviours and permissions.

Purposes
  Feedback about: What people want information about me for. Since this is outside of the system, it may only be possible to infer purpose from construction and access behaviours.
  Control over: It is infeasible for me to have technical control over purposes. With appropriate feedback, however, I can exercise social control to restrict intrusive, unethical, and illegal usage.

Evaluation criteria:

  Trustworthiness: Systems must be technically reliable and instill confidence in users.
  Appropriate timing: Feedback should be provided at a time when control is most likely to be required.
  Perceptibility: Feedback should be noticeable.
  Unobtrusiveness: Feedback should not distract or annoy.
  Minimal intrusiveness: Feedback should not involve information which compromises the privacy of others.
  Fail-safety: The system should minimise information capture, construction, and access by default.
  Flexibility: Mechanisms of control over user and system behaviours may need to be tailorable.
  Low effort: Design solutions must be lightweight to use.
  Meaningfulness: Feedback and control must incorporate meaningful representations.
  Learnability: Proposed designs should not require a complex model of how the system works.
  Low cost: Naturally, we wish to keep the costs of design solutions down.
Bellotti and Sellen developed a framework for addressing personal privacy in media spaces. According to this framework, media spaces should provide appropriate feedback and control structures to users in four areas (Table 6). Feedback and control are described by Norman as basic structures in the use of artifacts, and they underlie the Openness and Participation principles of the FIPS.
Bellotti and Sellen adapted MacLean et al.’s Questions, Options, Criteria framework to guide their privacy analysis process. They proposed evaluating alternative design options based on eight questions and eleven criteria, derived from their own experience and from other sources (see Table 6). Some criteria are closely related to security evaluation (such as trustworthiness), while others address the human cost of security mechanisms. Bellotti and Sellen’s criteria are similar to those of Heuristic Evaluation, a well-known discount usability technique for evaluating user interfaces.
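To make the feedback and control pairing concrete, the "control over" column of Table 6 can be read as fail-safe, per-information-type permissions, and the "feedback about" column as a record of capture, construction, and access events that the user can inspect. The following sketch is purely illustrative and is not Bellotti and Sellen's own software; the class, field, and requester names are assumptions. Purposes is omitted because the framework holds that purposes admit only social, not technical, control.

```python
# Hypothetical sketch: user-set defaults per (area, information type) with
# deny-by-default semantics (the fail-safety criterion), plus an event log
# standing in for feedback. All names here are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum

class Area(Enum):
    CAPTURE = "capture"              # information getting into the system
    CONSTRUCTION = "construction"    # what happens to it inside the system
    ACCESSIBILITY = "accessibility"  # who and what can see or use it
    # "Purposes" is omitted: it admits only social, not technical, control.

@dataclass
class PrivacyPolicy:
    # Anything not explicitly granted is denied, reflecting fail-safety
    # (minimise information capture, construction, and access by default).
    defaults: dict = field(default_factory=dict)
    log: list = field(default_factory=list)  # feedback: inspectable event record

    def allow(self, area: Area, info_type: str, requester: str) -> bool:
        decision = self.defaults.get((area, info_type), False)
        self.log.append((area.value, info_type, requester, decision))
        return decision

policy = PrivacyPolicy()
policy.defaults[(Area.CAPTURE, "video")] = True  # user opts in to video capture
print(policy.allow(Area.CAPTURE, "video", "camera-daemon"))        # True
print(policy.allow(Area.ACCESSIBILITY, "video", "unknown-server")) # False
```

The deny-by-default lookup is the design choice that carries the framework's fail-safety criterion: a user who takes no explicit action leaks nothing, while the log gives the noticeable, appropriately timed feedback the criteria call for.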
The evaluation of alternatives is common to several privacy frameworks, and is characteristic of design methods targeted at tough design problems that do not enjoy