Monthly Archives: January 2018

Exploring Relationships Between Interaction Attributes and Experience

Eva Lenz, Sarah Diefenbach and Marc Hassenzahl proposed the Interaction Vocabulary, obtained after evaluating different interactions through the why, what and how framework (a simplified form of Goal-Oriented Analysis).


Each pair of interaction attributes is listed with the experiences associated with its two poles:

slow – fast

  • slow: Esteem, focus on the interaction itself, significance of the present moment, relaxing, calming, accuracy, care, appreciation of interaction/product
  • fast: Animating, stimulating, activating, efficiency, focus on the instrumental goal of the interaction, expression of willpower

stepwise – fluent

  • stepwise: Ritualization, every step is meaningful, rewarding, emphasis on progress and advance of the process, approaching a goal step by step, clear structure, being guided through the process
  • fluent: Autonomy, continuous influence, power and right to change what’s happening at any time of the process, no barriers, fluent integration in the running process, spurring instead of interrupting

instant – delayed

  • instant: Instant feedback makes one’s own effect experiential, competence, the feeling of one’s own impact creates a feeling of security, you see what you do, makes immediate correction possible, nothing in between, you experience what you do, increase of competence, instant feedback creates a feeling of recognition
  • delayed: Emphasizing the moment of interaction, creating awareness, centering on the interaction itself rather than its instrumental effect

uniform – diverging

  • uniform: Influence by intuition, control
  • diverging: Unusual, unnatural, amplified, grasping for attention

constant – inconstant

  • constant: Creates a feeling of security
  • inconstant: Liveliness, suspense, you can’t adapt yourself to it, unreliable, chance as an idea generator

mediated – direct

  • mediated: Uncertainty, ambiguity, magic, handing over the responsibility (the interaction happens somewhere else), you don’t put much of yourself in it
  • direct: Significance of your own doing, face-to-face contact, experiencing affinity, self-made, close relation to the product, feeling of constant control

spatial separation – spatial proximity

  • spatial separation: Not feeling as a part of it, feeling of distance
  • spatial proximity: Personal contact, feeling of relatedness, safety (you know exactly what you did), being a part of it, intensive examination of details

approximate – precise

  • approximate: Deeper analysis is needed, room for variation = room for competence, room for new ideas, exploration
  • precise: Safety, no changes = room to concentrate on something else/competence in other fields, exact idea of the result, always exactly the same

gentle – powerful

  • gentle: Carefulness, awareness, appreciation, making a relationship with the thing (being gentle with it), being a part of it, revaluation of the action, raises the quality, allows one to perform a loving gesture
  • powerful: Archaic interaction, sign of strength, power, effectiveness

incidental – targeted

  • incidental: Low challenge, no room to experience competence, no room for improvement, becomes a side issue, doesn’t matter
  • targeted: Appreciation, significance of the interaction, worthy of attention, high challenge, high concentration, room for competence

apparent – covered

  • apparent: Conscious of the significance of your own doing, assurance, security, goal-mode, seeing what is going on, expressive, very easy
  • covered: Magic, excitement, exploration, action-mode, witchcraft, deeply impressing somebody


E. Lenz, S. Diefenbach, and M. Hassenzahl, “Exploring relationships between interaction attributes and experience,” in Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces, 2013, pp. 126–135.

Proportionality Design Method

The principle of Data Quality from the Fair Information Practices implies that the information obtained from users should be applied to their benefit:

“Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.”

Giovanni Iachello and Gregory D. Abowd use this as a starting point and elaborate the principle of proportionality:

“Any application, system, tool or process should balance its utility with the rights to privacy (personal, informational, etc.) of the involved individuals”

Based on this principle, they propose the Proportionality design method:

Proportionality Design Method


Throughout the whole development cycle of the application, the stakeholders need to assess its legitimacy, appropriateness and adequacy:

  • Legitimacy: Verify that the application is useful to the user. What function does the application cover?
  • Appropriateness: Analyse whether alternative implementations with different technologies satisfy the goal of the application without posing a risk to the users’ privacy.
  • Adequacy: Analyse whether the different alternative technologies are correctly implemented.
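As a minimal sketch (the criteria flags and design alternatives below are invented, not from the paper), the three assessments can be applied as a gate over candidate implementations:

```python
# Minimal sketch (criteria flags and alternatives are invented) of the
# proportionality method: each design alternative is assessed for
# legitimacy, appropriateness and adequacy before it is accepted.

from dataclasses import dataclass

@dataclass
class Alternative:
    name: str
    legitimate: bool   # does the application serve a useful purpose for the user?
    appropriate: bool  # does this technology meet the goal without undue privacy risk?
    adequate: bool     # is the chosen technology correctly implemented?

def acceptable(alt: Alternative) -> bool:
    # A design is proportionate only if all three assessments pass.
    return alt.legitimate and alt.appropriate and alt.adequate

options = [
    Alternative("raw GPS coordinates", True, False, True),
    Alternative("coarse cell-level location", True, True, True),
]
chosen = [alt.name for alt in options if acceptable(alt)]
print(chosen)
```

In this invented example, only the coarse-location alternative passes all three checks.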


G. Iachello and G. D. Abowd, “Privacy and proportionality: adapting legal evaluation techniques to inform design in ubiquitous computing,” in Proceedings of the SIGCHI conference on Human factors in computing systems, 2005, pp. 91–100.

Privacy Risk Models

Jason Hong, Jennifer D. Ng, Scott Lederer and James A. Landay present their framework for modelling privacy risks in ubiquitous computing environments.

The privacy risk models framework consists of two parts: privacy risk analysis, which proposes a list of questions that help define the context of use of the future application, and privacy risk management, a cost-benefit analysis used to prioritise the privacy risks and develop the system.

Privacy risk analysis

The privacy risk analysis starts with the formulation of the following questions grouped in the categories Social and Organisational Context and Technology:

Social and Organizational Context

  • Who are the users of the system? Who are the data sharers, the people sharing personal information? Who are the data observers, the people that see that personal information?
  • What kinds of personal information are shared? Under what circumstances?
  • What is the value proposition for sharing personal information?
  • What are the relationships between data sharers and data observers? What is the relevant level, nature, and symmetry of trust? What incentives do data observers have to protect data sharers’ personal information (or not, as the case may be)?
  • Is there the potential for malicious data observers (e.g., spammers and stalkers)? What kinds of personal information are they interested in?
  • Are there other stakeholders or third parties that might be directly or indirectly impacted by the system?


Technology

  • How is personal information collected? Who has control over the computers and sensors used to collect information?
  • How is personal information shared? Is it opt-in or is it opt-out (or do data sharers even have a choice at all)? Do data sharers push personal information to data observers? Or do data observers pull personal information from data sharers?
  • How much information is shared? Is it discrete and one-time? Is it continuous?
  • What is the quality of the information shared? With respect to space, is the data at the room, building, street, or neighborhood level? With respect to time, is it real-time, or is it several hours or even days old? With respect to identity, is it a specific person, a pseudonym, or anonymous?
  • How long is personal data retained? Where is it stored? Who has access to it?

Privacy Risk Management

This part consists of prioritising the privacy risks by applying the inequality known as Hand’s rule, under which a protection measure is worth implementing when its cost is lower than the expected harm:

C < L×D


  • L: The likelihood that an unwanted disclosure of personal information occurs
  • D: The damage that such a disclosure would cause
  • C: The cost of adequately protecting privacy
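A rough illustration of this prioritisation (risk names, likelihoods, damages and costs are all invented for the example): risks are ranked by expected harm L×D, and a protection is flagged for implementation when its cost C is lower:

```python
# Hypothetical illustration of Hand's rule for privacy risk management.
# A protection is worth implementing when its cost C is lower than the
# expected harm L * D. All risk names and numbers below are invented.

risks = [
    # (risk, likelihood L in [0, 1], damage D, protection cost C)
    ("location trace leaked to advertisers", 0.30, 1000.0, 50.0),
    ("one-time room-level location shown",   0.05,   20.0, 80.0),
    ("identity linked across databases",     0.10,  500.0, 30.0),
]

# Rank risks by expected harm L * D, highest first.
for risk, L, D, C in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    mitigate = C < L * D  # Hand's rule: protect when cost < expected harm
    print(f"{risk}: L*D = {L * D:.1f}, C = {C:.1f}, mitigate = {mitigate}")
```

Under these invented numbers the advertiser leak tops the list and justifies its mitigation cost, while the one-time room-level disclosure does not.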


J. I. Hong, J. D. Ng, S. Lederer, and J. A. Landay, “Privacy risk models for designing privacy-sensitive ubiquitous computing systems,” in Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques, 2004, pp. 91–100.

Three-Layer Privacy Responsibility Framework

Sarah Spiekermann and Lorrie Faith Cranor, in their work “Engineering Privacy”, state that software engineers have a major responsibility when it comes to developing privacy-friendly systems “because they are the ones devising the technical architecture and creating the code”. They present the three-layer model of user privacy concerns and its responsibility framework. Based on this model, they elaborate a set of guidelines, categorising them as “privacy-by-policy” or “privacy-by-architecture”.


The authors distinguish three spheres of privacy: the User Sphere (constrained to the user’s environment, i.e. laptop, mobile phone, integrated systems, etc.), the Recipient Sphere (a company-centric sphere involving the back-end infrastructure) and the Joint Sphere (related to companies that host users’ information, like email providers or Facebook). For each privacy sphere, the following table describes where the data is stored, what the engineer’s responsibility is and what issues they need to face.


User Sphere

Where data is stored: users’ desktop personal computers, laptops, mobile phones, RFID chips.

Engineer’s responsibility: give users control over access to themselves (in terms of access to data and attention).

Engineering issues:

  • What data is transferred from the client to a data recipient?
  • Is the user explicitly involved in the transfer?
  • Is the user aware of remote and/or local applications storing data on his system?
  • Is data storage transient or persistent?

Joint Sphere

Where data is stored: web service provider’s servers and databases.

Engineer’s responsibility: give users some control over access to themselves (in terms of access to data and attention); minimize users’ future privacy risks.

Engineering issues:

  • Is the user fully aware of how his data is used and can he control this?

Recipient Sphere

Where data is stored: any data recipient’s servers and databases: network providers, service providers or other parties with whom the data recipient shares data.

Engineer’s responsibility: minimize users’ future privacy risks.

Engineering issues:

  • What data is being shared by the data recipient with other parties?
  • Can the user expect or anticipate a transfer of his data by the recipient?
  • Is personal data adequately secured?
  • Is data storage transient or persistent?
  • Can the processing of personal data be foreseen by the user?
  • Are there secondary uses of data that may not be foreseen by the user?
  • Is there a way to minimize processing (e.g. by delegating some pre-processing to the User Sphere)?
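The three spheres above can be encoded as a simple lookup (the keys and abbreviated labels are my own, not the authors’) so a design review can recall the engineer’s duties for a given storage location:

```python
# Minimal sketch (my own encoding) of the three-layer responsibility model:
# map each privacy sphere to where data lives and the engineer's duties.

SPHERES = {
    "user": {
        "storage": "user's devices (PCs, laptops, phones, RFID chips)",
        "duties": ["give users control over access to themselves"],
    },
    "joint": {
        "storage": "web service provider's servers and databases",
        "duties": [
            "give users some control over access to themselves",
            "minimize users' future privacy risks",
        ],
    },
    "recipient": {
        "storage": "any data recipient's servers and databases",
        "duties": ["minimize users' future privacy risks"],
    },
}

def duties_for(sphere: str) -> list[str]:
    # Look up the engineer's responsibilities for a given sphere.
    return SPHERES[sphere]["duties"]

print(duties_for("joint"))
```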

Framework for Privacy-Friendly System Design

Spiekermann and Cranor propose a framework to develop privacy-friendly systems. Privacy levels are ranked from lowest to highest according to the degree of identifiability of the user (identified, pseudonymous, anonymous). Where the user is fully identified, privacy needs to be provided by policy, while where users are anonymous or pseudonymous, privacy can also be provided by architecture. The following table matches these attributes with the characteristics of the corresponding systems.


Stage 0 – identified

Approach to privacy protection: privacy by policy (notice and choice)
Linkability of data to personal identifiers: linked

  • unique identifiers across databases
  • contact information stored with profile information

Stage 1 – pseudonymous

Approach to privacy protection: privacy by policy (notice and choice)
Linkability of data to personal identifiers: linkable with reasonable & automatable effort

  • no unique identifiers across databases
  • common attributes across databases
  • contact information stored separately from profile or transaction information

Stage 2 – pseudonymous

Approach to privacy protection: privacy by architecture
Linkability of data to personal identifiers: not linkable with reasonable effort

  • no unique identifiers across databases
  • no common attributes across databases
  • random identifiers
  • contact information stored separately from profile or transaction information
  • collection of long term person characteristics on a low level of granularity
  • technically enforced deletion of profile details at regular intervals

Stage 3 – anonymous

Approach to privacy protection: privacy by architecture
Linkability of data to personal identifiers: unlinkable

  • no collection of contact information
  • no collection of long term person characteristics
  • k-anonymity with large value of k
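The k-anonymity property mentioned above can be checked mechanically: a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records. A minimal sketch (the toy records are invented):

```python
# Illustrative sketch (records are invented toy data): compute the largest
# k for which a table is k-anonymous over a set of quasi-identifiers.

from collections import Counter

def k_of(records, quasi_identifiers):
    """Return the largest k for which the records are k-anonymous."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"zip": "130**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "130**", "age": "20-29", "diagnosis": "cold"},
    {"zip": "148**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "148**", "age": "30-39", "diagnosis": "asthma"},
]

# Each (zip, age) combination covers two records, so k = 2.
print(k_of(records, ["zip", "age"]))
```

Adding the sensitive attribute to the quasi-identifiers would drop k to 1, which is why generalising values (like the masked zip codes here) is used to keep k large.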


S. Spiekermann and L. F. Cranor, “Engineering privacy,” IEEE Transactions on software engineering, vol. 35, no. 1, pp. 67–82, 2009.