Jason Hong, Jennifer D. Ng, Scott Lederer and James A. Landay present their framework for modelling privacy risks in ubiquitous computing environments.
The privacy risk models framework consists of two parts: a privacy risk analysis, which poses a set of questions that help define the context of use of the future application, and privacy risk management, a cost-benefit analysis used to prioritise the identified privacy risks and guide the design of the system.
Privacy Risk Analysis
The privacy risk analysis starts by posing the following questions, grouped into the categories Social and Organizational Context and Technology:
Social and Organizational Context
- Who are the users of the system? Who are the data sharers, the people sharing personal information? Who are the data observers, the people that see that personal information?
- What kinds of personal information are shared? Under what circumstances?
- What is the value proposition for sharing personal information?
- What are the relationships between data sharers and data observers? What is the relevant level, nature, and symmetry of trust? What incentives do data observers have to protect data sharers’ personal information (or not, as the case may be)?
- Is there the potential for malicious data observers (e.g., spammers and stalkers)? What kinds of personal information are they interested in?
- Are there other stakeholders or third parties that might be directly or indirectly impacted by the system?
Technology
- How is personal information collected? Who has control over the computers and sensors used to collect information?
- How is personal information shared? Is it opt-in or is it opt-out (or do data sharers even have a choice at all)? Do data sharers push personal information to data observers? Or do data observers pull personal information from data sharers?
- How much information is shared? Is it discrete and one-time? Is it continuous?
- What is the quality of the information shared? With respect to space, is the data at the room, building, street, or neighborhood level? With respect to time, is it real-time, or is it several hours or even days old? With respect to identity, is it a specific person, a pseudonym, or anonymous?
- How long is personal data retained? Where is it stored? Who has access to it?
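As a rough sketch, the answers to these questions can be captured in a structured record, so that each identified risk can later be fed into the risk management step. The field names below are illustrative assumptions, not part of the original framework:

```python
from dataclasses import dataclass

# Hypothetical sketch: one record per analysed application.
# Field names are assumptions chosen to mirror the questions above.
@dataclass
class PrivacyRiskAnalysis:
    data_sharers: list[str]        # who shares personal information
    data_observers: list[str]      # who sees that information
    information_kinds: list[str]   # what kinds of information are shared
    value_proposition: str         # why sharers would share at all
    malicious_observers: list[str] # e.g. spammers, stalkers
    collection_method: str         # how information is collected
    sharing_model: str             # "opt-in", "opt-out", or "no choice"
    granularity: str               # spatial / temporal / identity precision
    retention: str                 # how long data is kept, and where
```

Filling in such a record for a concrete system (say, an indoor location service) makes gaps in the analysis visible before any cost-benefit reasoning starts.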
Privacy Risk Management
This part consists of prioritising the privacy risks by applying the inequality known as Hand's rule: a risk is worth mitigating when
C < L×D
- L: The likelihood that an unwanted disclosure of personal information occurs
- D: The damage that would result from such a disclosure
- C: The cost of adequately protecting against that disclosure
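The inequality can be applied directly as a yes/no test per risk. A minimal sketch, with purely hypothetical numbers:

```python
def should_mitigate(cost: float, likelihood: float, damage: float) -> bool:
    """Hand's rule as used for privacy risk management: mitigate a
    risk when the cost of adequate protection (C) is less than the
    expected harm (L * D)."""
    return cost < likelihood * damage

# Hypothetical example: a location-disclosure risk with a 10%
# likelihood and an estimated damage of 50,000 is worth mitigating
# as long as protection costs less than 5,000.
print(should_mitigate(cost=3000, likelihood=0.10, damage=50000))  # prints True
```

Computing L * D for each risk identified in the analysis step, and sorting by that expected harm, gives the prioritisation the framework calls for.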
J. I. Hong, J. D. Ng, S. Lederer, and J. A. Landay, “Privacy risk models for designing privacy-sensitive ubiquitous computing systems,” in Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques, 2004, pp. 91–100.