Interaction
This module covers week 2 of the course.
Note that this material is subject to ongoing refinements and updates!
Interaction design is a process
We will return to processes in IxD later in the course, but for now, know that there are four stages in IxD:
- Discovering requirements
- Designing alternatives
- Prototyping
- Evaluating
Conceptual design
At the early stages of the IxD process, it’s important to conceptualise what a proposed design solution will actually do. This helps to:
- Evaluate whether ideas and assumptions are actually feasible.
- Determine whether the design is realistic to develop.
- Assess whether it’s desirable or useful.
We can make claims and assumptions about how users might engage with a design. According to Sharp et al.:
“by an assumption, we mean taking something for granted that requires further investigation; for example, people now want an entertainment and navigation system in their cars. By a claim, we mean stating something to be true when it is still open to question” (2019, p. 71).
We need to consider unknowns during the early stages of a project. Assumptions should be questioned and reconsidered early on in the process.
Analysing the problem space involves asking questions such as:
- Are there existing problems, and if so, what are they?
- Why do these problems exist?
- How might the design you are proposing help overcome these problems?
- How will your design improve the current way of doing things?
Conceptual model
Understanding the problem space is necessary before moving on to the "design space". Before deciding upon elements like the style of interface, user behaviour, and functionality, a conceptual model should be developed.
A conceptual model is:
“...a high-level description of how a system is organized and operates”
(Johnson and Henderson, 2002, p. 26)
A conceptual model enables:
“...designers to straighten out their thinking before they start laying out their widgets”
(Johnson and Henderson, 2002, p. 28)
The key parts of a conceptual model include:
- Metaphors and analogies that convey to people how to understand what a product is used for and how to use it for an activity
- The concepts to which people are exposed through the product, including the task-domain objects they create and manipulate, their attributes, and the operations that can be performed on them
- The relationships between those concepts
- The mappings between the concepts and the user experience the product is designed to support or invoke (Sharp et al., 2019, pp. 74-75).
According to Sharp et al., there are five types of interaction:
- Instructing: Where users issue instructions to a system...
- Conversing: Where users have a dialog with a system...
- Manipulating: Where users interact with objects in a virtual or physical space by manipulating them...
- Exploring: Where users move through a virtual environment or a physical space...
- Responding: Where the system initiates the interaction and the user chooses whether to respond (2019, p. 81).
It’s important to remember that interaction type differs from interface style. An interaction type describes how a user engages with a system, such as conversing, exploring, or responding. An interface style is a particular kind of interface, such as menus, gestures, touch, or voice, that supports one or more interaction types.
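As a loose illustration (not from the source text), the distinction between interaction types and interface styles could be sketched as a simple mapping. The specific pairings below are illustrative assumptions, not an exhaustive taxonomy:

```python
# Hypothetical sketch: the five interaction types (Sharp et al., 2019)
# mapped to example interface styles that can support them.
# The pairings are illustrative, not definitive.

INTERACTION_TYPES = {
    "instructing": ["menus", "command line", "voice commands"],
    "conversing": ["chatbots", "voice assistants"],
    "manipulating": ["touch", "gestures", "drag-and-drop"],
    "exploring": ["virtual reality", "3D environments"],
    "responding": ["notifications", "prompts"],
}

def styles_for(interaction_type: str) -> list[str]:
    """Return example interface styles that support a given interaction type."""
    return INTERACTION_TYPES.get(interaction_type, [])

print(styles_for("conversing"))  # ['chatbots', 'voice assistants']
```

Note that one interface style (e.g. voice) can appear under more than one interaction type, which is exactly why the two concepts shouldn't be conflated.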
Prototyping with wireframes
We will return to prototyping later in the course, but for now, we will look at using wireframes to mock up our interaction/interface ideas. What is a wireframe?
“Initial conceptual models may be captured in wireframes—a set of documents that show structure, content, and controls. Wireframes may be constructed at varying levels of abstraction, and they may show part of the product or a complete overview” (Sharp et al., 2019, p. 445).
Wireframe basics (note that we won't need to create interactive wireframes in this course):
Justinmind (2021). The ultimate guide to wireframe design [Video file]. https://www.youtube.com/watch?v=7rw1tZwrccU.
There are a number of online apps that allow you to create wireframes of varying levels of fidelity, but we will be using Lunacy in this course to create medium-fidelity wireframes.
Emotional interaction
This term is used to encompass two separate but related ideas:
Emotional AI/affective computing: “Designing technology to detect and recognize someone’s emotions automatically from sensing aspects of their facial expressions, body movements, gestures, and so forth” (Sharp et al., 2019, p. 165).
Emotional design
- “How to design interactive products to evoke certain kinds of emotional responses in people” (Sharp et al., 2019, p. 165).
- How easy is it to design an interface to match or change how we are feeling? Should we be doing this at all? What if our mood changes?
- The relationship between our behaviour and our emotions isn't simply explained as cause-and-effect (Baumeister et al., 2007).
Emotional design (Norman, 2005) occurs at three levels:
- Visceral
- Behavioural
- Reflective
Listen to Norman himself explain what these three levels mean.
CNN (2015). Don Norman and his theory on emotional design [Video file]. https://www.youtube.com/watch?v=G7MeRkDkRN4
Expressive interfaces
- Interfaces can give feedback to us that is informative and fun.
- Interfaces can be annoying and frustrating too (which are obviously things to avoid in design!).
- Error messages that are vague or condemning can be annoying.
- A system shouldn’t ask a user to take too many steps.
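To make the point about error messages concrete, here is a hypothetical sketch contrasting a vague, condemning message with an informative one. The function names and wording are invented for illustration:

```python
# Hypothetical illustration: vague vs. informative error feedback.
# Both functions validate the same input; only the feedback differs.

def save_file_vague(filename: str) -> str:
    if not filename.endswith(".txt"):
        return "Error: operation failed."  # vague and unhelpful
    return "Saved."

def save_file_informative(filename: str) -> str:
    if not filename.endswith(".txt"):
        # Says what went wrong and how to fix it, without blaming the user.
        return (f"'{filename}' could not be saved: only .txt files are "
                "supported. Try renaming the file with a .txt extension.")
    return f"'{filename}' saved successfully."

print(save_file_vague("notes.pdf"))
print(save_file_informative("notes.pdf"))
```

The informative version names the problem, the cause, and a next step, which is the kind of expressive feedback the bullet points above describe.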
Technologies used in emotion detection include:
- Cameras for measuring facial expressions
- Biosensors placed on fingers or palms to measure galvanic skin response (which is used to infer how anxious or nervous someone is as indicated by an increase in their sweat)
- Affective expression in speech (voice quality, intonation, pitch, loudness, and rhythm)
- Body movement and gestures, as detected by motion capture systems or accelerometer sensors placed on various parts of the body (Sharp et al., 2019, p. 179).
Six core expressions typically measured:
Sadness, disgust, fear, anger, contempt, and joy
The type of facial expression is chosen by the AI through detecting the presence or absence of:
Smiling, eye widening, brow raising, brow furrowing, raising a cheek, mouth opening, upper-lip raising, and wrinkling of the nose (Sharp et al., 2019, p. 180).
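As a loose illustration (not from the source text), the feature-to-expression mapping described above could be sketched as a simple rule-based classifier. The feature names and rules here are hypothetical; real affective-computing systems use trained models, not hand-written rules:

```python
# Hypothetical sketch: classify a facial expression from detected
# facial-action features (presence/absence), loosely following the
# idea described by Sharp et al. The rules are illustrative
# assumptions, not a real affective-computing model.

def classify_expression(features: set[str]) -> str:
    """Map a set of detected facial actions to a likely expression."""
    if "smiling" in features and "raising_cheek" in features:
        return "joy"
    if "nose_wrinkling" in features and "upper_lip_raising" in features:
        return "disgust"
    if "brow_raising" in features and "eye_widening" in features:
        return "fear"
    if "brow_furrowing" in features and "mouth_opening" in features:
        return "anger"
    return "neutral"  # no confident match

print(classify_expression({"smiling", "raising_cheek"}))  # joy
```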
Persuasive technologies
- Systems designed to change people’s attitudes and behaviours (Fogg, 2003).
- Fogg is known for referring to this as persuasive design.
- Includes technologies like pop-up ads, warning messages, prompts, recommendations, and so on.
- Sometimes referred to as "digital nudging".
Anthropomorphism
Anthropomorphism is the tendency to attribute human qualities to animals and inanimate objects.
Social robots are a recent development in emotional design, with many notable examples originating in Japan.
© Paul Haimes at Ritsumeikan University