Rachel L. Finn, David Wright and Michael Friedewald propose a taxonomy of seven types of privacy affected by developments in different kinds of technology:
- The Physical Person: Aspects of the human body, for example nudity, biometric data, electronic implants and sensing devices, brain-signal monitoring, and any other information related to the physical body.
- Behaviour and Action: Any information that reflects aspects of a person’s lifestyle, for example sexuality, religion, political beliefs or habits.
- Personal Communications: From traditional wiretapping to more advanced email interception, or the capture and analysis of text from messaging apps such as WhatsApp or Facebook Messenger.
- Data and Image: Problems arising from the proliferation of surveillance cameras and from the massive amounts of images and videos on social networks, combined with the possibility of applying automated face-recognition techniques.
- Thoughts and Feelings: Technology can be used to estimate people’s mental state through face, voice or gesture analysis.
- Location and Space: Information about a person’s location, whether obtained from GPS tracking, camera surveillance, or Wi-Fi/Bluetooth tracking.
- Association and Group Membership: Privacy issues arising from belonging to a specific community, or from following certain groups, individuals or initiatives on social networks.
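When this taxonomy is used inside a requirements or audit tool, the seven categories could be encoded as a simple enumeration for tagging reported issues. This is my own illustrative sketch; the taxonomy paper does not propose any such encoding:

```python
from enum import Enum

class PrivacyType(Enum):
    """The seven types of privacy identified by Finn, Wright and Friedewald."""
    PHYSICAL_PERSON = "physical person"
    BEHAVIOUR_AND_ACTION = "behaviour and action"
    PERSONAL_COMMUNICATIONS = "personal communications"
    DATA_AND_IMAGE = "data and image"
    THOUGHTS_AND_FEELINGS = "thoughts and feelings"
    LOCATION_AND_SPACE = "location and space"
    ASSOCIATION_AND_GROUP_MEMBERSHIP = "association and group membership"

# Hypothetical usage: tag a reported privacy issue with the categories it touches.
issue = {
    "description": "CCTV footage analysed with automated face recognition",
    "categories": {PrivacyType.DATA_AND_IMAGE, PrivacyType.LOCATION_AND_SPACE},
}
```

A single issue can touch several categories at once, which is why a set of tags fits better than a single label.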
Keerthi Thomas, Arosha K. Bandara, Blaine A. Price and Bashar Nuseibeh propose a process (requirements distillation) and a framework (Privacy Facets, PriF) to capture the privacy-related requirements of a mobile application during development.
Requirements distillation process
The requirements distillation process consists of three main phases: “Structuring of the Qualitative Data”, “Information Flow Modelling” and “Privacy Problem Analysis”.
Privacy Facets framework
- Structuring Qualitative Data: The Privacy Facets (PriF) framework is used to structure the qualitative data through a set of predefined codes suited to identifying privacy-sensitive contexts. The outcome of this phase is a set of privacy threats and concerns raised by the users.
- Information Flow Modelling: In the second phase, problem models of the information flows are developed, based on the information-flow problem patterns provided by the PriF framework. These models capture how information is created and disseminated to other users.
- Privacy Problem Analysis: To elaborate the list of privacy requirements, each privacy-sensitive context and its privacy threats or concerns are analysed against the information-flow models.
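The three phases above can be pictured as a pipeline from raw qualitative data to privacy requirements. The following sketch is my own illustration; the function names and data shapes are invented, since the paper describes the phases informally rather than as code:

```python
def structure_qualitative_data(raw_observations):
    """Phase 1: apply PriF-style codes to raw data, keeping privacy-sensitive items."""
    return [obs for obs in raw_observations if obs.get("privacy_sensitive")]

def model_information_flows(concerns):
    """Phase 2: build information-flow models (who creates data, who receives it)."""
    return [{"concern": c, "flow": (c["source"], c["recipient"])} for c in concerns]

def analyse_privacy_problems(flow_models):
    """Phase 3: derive one privacy requirement per problematic flow."""
    return [f"restrict flow {m['flow'][0]} -> {m['flow'][1]}" for m in flow_models]

observations = [
    {"privacy_sensitive": True, "source": "user location", "recipient": "advertiser"},
    {"privacy_sensitive": False, "source": "app version", "recipient": "developer"},
]
requirements = analyse_privacy_problems(
    model_information_flows(structure_qualitative_data(observations))
)
```

The point of the pipeline shape is that each phase consumes exactly the output of the previous one, which is how the distillation process is organised.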
The Privacy Facets (PriF) framework provides:
- analytical tools such as thematic codes, heuristics, facet questions and extraction rules to structure qualitative data
- information-flow problem patterns and a privacy-arguments language to model privacy requirements.
Using these tools, the system analyst structures the qualitative data gathered in the first phase of the process according to heuristic-based categories, for example:

- Negative Behaviour Patterns (NBP): Situations in which the user chooses not to use an application because of privacy concerns.
- Negative Emotional Indicators (NEI): Keywords indicating that the user might have privacy concerns when using the application.

K. Thomas, A. K. Bandara, B. A. Price, and B. Nuseibeh, “Distilling privacy requirements for mobile applications,” in Proceedings of the 36th International Conference on Software Engineering, 2014, pp. 871–882.
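To make the NBP/NEI categories concrete, here is a toy keyword-based coder for transcript snippets. The keyword lists are placeholders of my own; PriF's actual extraction rules and heuristics are considerably richer than simple substring matching:

```python
# Hypothetical keyword heuristics for the two example codes.
NBP_KEYWORDS = {"stopped using", "uninstalled", "refused", "turned off"}
NEI_KEYWORDS = {"worried", "uncomfortable", "creepy", "scared"}

def code_snippet(snippet):
    """Assign PriF-style thematic codes (NBP/NEI) to a transcript snippet."""
    text = snippet.lower()
    codes = set()
    if any(keyword in text for keyword in NBP_KEYWORDS):
        codes.add("NBP")
    if any(keyword in text for keyword in NEI_KEYWORDS):
        codes.add("NEI")
    return codes

codes = code_snippet("I felt uncomfortable and turned off location sharing")
```

A real analyst would code such snippets manually or with richer rules; the sketch only shows how the codes attach to raw qualitative data.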
Xiaodong Jiang, Jason I. Hong and James A. Landay apply concepts from economics and information theory to model the exchange of information among the different actors (data owners, data collectors and data users), with the goal of minimising the asymmetry of information flow among them.
Data Owner, Data Collector and Data User
After identifying the main actors they propose the principle of minimum asymmetry:
Principle of Minimum Asymmetry
A privacy-aware system should minimise the asymmetry of information among data owners, data collectors and data users by:
- Decreasing the flow of information from data owners to data collectors and users
- Increasing the flow of information from data collectors and users back to data owners
To support this principle of minimum asymmetry, they define a design space of privacy solutions in ubiquitous computing.
X. Jiang, J. I. Hong, and J. A. Landay, “Approximate information flows: Socially-based modeling of privacy in ubiquitous computing,” in Proceedings of the 4th International Conference on Ubiquitous Computing (UbiComp), 2002.
Space of Privacy Solutions of Ubiquitous Computing
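The principle can be illustrated with a toy asymmetry metric: count information items flowing in each direction and take the difference. This metric is my own construction for illustration; the paper states the principle qualitatively rather than numerically:

```python
def asymmetry(owner_to_collector, collector_to_owner):
    """Toy information asymmetry: how much more the collector learns about
    the owner than the owner learns about the collection itself."""
    return owner_to_collector - collector_to_owner

# Baseline: the collector receives 10 data items; the owner gets no feedback.
before = asymmetry(10, 0)

# Applying the principle: collect less (decrease the owner->collector flow)
# and feed notices/access logs back to the owner (increase the reverse flow).
after = asymmetry(6, 4)
```

Both levers named in the principle (decreasing one flow, increasing the other) reduce the same quantity, which is why they are stated as a pair.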
The Fair Information Practice Principles (FIPPs), proposed by the Federal Trade Commission (FTC), are the result of an inquiry into how personal information should be handled in information systems:
1. Notice/Awareness: Subjects should be given notice of the collection of personal information from them before it takes place.
2. Choice/Consent: Subjects should be given the choice to opt out of the collection of their personal information.
3. Access/Participation: Subjects should be allowed to access their personal information that has been collected.
4. Integrity/Security: Information collectors should ensure that the information they collect is accurate and secure.
5. Enforcement/Redress: In order to ensure that the Fair Information Practice Principles are applied, there must be enforcement measures available to the Subjects.
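As a sketch of how the first three principles might constrain a data store, consider a class that refuses collection without prior notice and consent and lets subjects inspect their own records. The class and method names are my own; the FTC report states principles, not an API:

```python
class PersonalDataStore:
    """Illustrative store enforcing Notice/Awareness, Choice/Consent and
    Access/Participation (a sketch, not a complete FIPPs implementation)."""

    def __init__(self):
        self._records = {}
        self._consented = set()

    def give_notice_and_consent(self, subject):
        """Record that the subject was notified and agreed before collection."""
        self._consented.add(subject)

    def collect(self, subject, data):
        """Refuse collection unless notice was given and consent obtained."""
        if subject not in self._consented:
            return False
        self._records.setdefault(subject, []).append(data)
        return True

    def access(self, subject):
        """Subjects can see what has been collected about them."""
        return list(self._records.get(subject, []))

store = PersonalDataStore()
store.give_notice_and_consent("alice")
store.collect("alice", "email")
```

Integrity/Security and Enforcement/Redress are deliberately out of scope here, since they concern data quality guarantees and external enforcement rather than the collection interface.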
The STRAP framework is an iterative process that aims to identify privacy vulnerabilities throughout all stages of the software development process.
- Design Analysis: The process starts with a goal-oriented analysis of the whole system, identifying the main actors, goals and major system components. These are represented in a tree diagram following a dependency hierarchy (Figure 1): goals are represented as circles, and the actors involved in each goal are indicated by colours. For each goal, the following analytical questions are asked: “What information is captured/accessed for this goal?”, “Who are the actors involved in the capture and access?”, “What knowledge is derived from this information?” and “What is done with the information afterward?” The answers reveal the privacy vulnerabilities associated with each goal.
- Design Refinement: Once all the vulnerabilities are identified, we iterate over them and decide which can be eliminated and which can only be mitigated. For example, if one vulnerability is that personal information stored on a server could be stolen, a mitigation is to keep that information encrypted.
- Evaluation: Elaborate a set of alternative designs that address the goals of the system and evaluate them, choosing the alternative with the least impact on privacy.
- Iteration: As the conceptualisation of the project evolves, repeat the previous steps to ensure that all vulnerabilities remain identified and documented. Along with the vulnerabilities, the assumptions made in the design must also be documented. Before new features are added to the system, they must be evaluated and the goal tree updated, adding new goals and actors as needed.
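The design-analysis phase can be sketched as a walk over the goal tree that asks the four analytical questions at every goal. The four questions are quoted from the framework; the goal-tree data model and the `ask` callback are my own illustrative assumptions:

```python
# The four analytical questions from STRAP's design-analysis phase.
QUESTIONS = [
    "What information is captured/accessed for this goal?",
    "Who are the actors involved in the capture and access?",
    "What knowledge is derived from this information?",
    "What is done with the information afterward?",
]

class Goal:
    """A node in the goal tree: a goal, the actors involved, and subgoals."""
    def __init__(self, name, actors, subgoals=()):
        self.name = name
        self.actors = set(actors)
        self.subgoals = list(subgoals)

def analyse(goal, ask):
    """Depth-first walk asking every question at every goal.
    `ask(goal, question)` returns a vulnerability description or None."""
    findings = []
    for question in QUESTIONS:
        answer = ask(goal, question)
        if answer:
            findings.append((goal.name, answer))
    for sub in goal.subgoals:
        findings.extend(analyse(sub, ask))
    return findings

tree = Goal("share location", {"user", "server"},
            [Goal("store history", {"server"})])
# A hypothetical analyst flags what happens to server-held data afterward.
vulns = analyse(tree, lambda g, q: "unencrypted storage"
                if "server" in g.actors and "afterward" in q else None)
```

In practice `ask` is a human analyst answering the questions; the walk only guarantees that no goal in the dependency hierarchy is skipped, which supports the iteration step above.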
Diagram of the STRAP framework process
C. Jensen, J. Tullio, C. Potts, and E. D. Mynatt, “STRAP: a structured analysis framework for privacy,” 2005.