About Face Book Summary: A Deep Dive into Goal-Directed Interaction Design

Quick Orientation

For decades, “About Face: The Essentials of Interaction Design” has been the definitive guide for creating digital products that people love. Authored by software pioneer Alan Cooper and a team of leading practitioners—Robert Reimann, David Cronin, and Christopher Noessel—the book introduces a revolutionary methodology called Goal-Directed Design. This isn’t just about making interfaces pretty; it’s a rigorous process for understanding what users truly want to achieve and designing products that help them do it efficiently and elegantly.

This book rejects the old “implementation-centric” approach, where interfaces merely exposed the underlying technology, and it moves beyond simple “metaphoric” design, which can be limiting. Instead, it offers a practical, humanistic framework for designing product behavior. Whether you’re a designer, developer, or product manager, this summary will walk you through every critical concept, principle, and process from the book. We’ll cover everything from conducting effective research and creating powerful user personas to designing for different platforms and eliminating the frustrating “excise” work that plagues so many digital products. Nothing significant has been left out—prepare to absorb the complete wisdom of a design classic.

PART I: Goal-Directed Design

Chapter 1: A Design Process for Digital Products

This chapter makes a powerful argument: too many digital products fail because they are designed without a deep understanding of human needs. It introduces Goal-Directed Design as the solution, a systematic process for creating successful, user-centric products.

The Consequences of Rude and Confusing Products

The book opens by diagnosing a common problem: digital products often make users feel stupid. This happens because these products are designed without a real design process, leading to several negative consequences.

Poorly designed products are often rude, blaming users for mistakes that aren’t their fault with unhelpful error messages. They forget information we’ve already given them and require us to think like computers, forcing us to understand their internal logic (the implementation model) rather than working the way we do. They are also filled with “excise” work—unnecessary tasks like window management or navigating complex file systems that don’t help us achieve our goals. These problems stem from a development process that prioritizes technology and features over user satisfaction.

Why Digital Products Fail

Products don’t become frustrating by accident; they fail for predictable reasons. The authors identify four main causes for this failure:

  • Misplaced priorities: Development teams often focus on technical challenges or a long list of marketing features, rather than on creating a cohesive and pleasant user experience.
  • Ignorance about real users: Teams often operate with a vague, stereotyped idea of “the user.” This “elastic user” can be stretched to justify any design decision, whether for a “power user” or a “novice,” depending on what’s convenient for the developer.
  • Conflicts of interest: The people who build the product (developers) are often the same people who design it. This creates a conflict between what is easy to code and what is easy to use.
  • Lack of a design process: Most organizations lack a rigorous, repeatable process for translating user needs into a concrete design solution.

Understanding Models: The Key to Better Design

To fix these problems, we need to understand the gap between how a product is built and how users perceive it. The book outlines three critical models:

  1. Implementation Model: This is how the system is actually built—the code, algorithms, and database structures. This model is logical to engineers but often incomprehensible to users.
  2. Mental Model: This is how a user imagines the product works. It’s a simplified, non-technical understanding that helps them get things done. For example, we think of dragging a file to the trash can, not of changing a pointer in a file allocation table.
  3. Represented Model: This is the way the designer chooses to present the product’s behavior to the user. It’s the actual interface. The single most important goal of an interaction designer is to make the represented model match the user’s mental model, not the implementation model.
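The gap between these models can be made concrete with a small sketch. The following Python snippet (not from the book; the class and method names are invented for illustration) contrasts an implementation model, where a file is an entry in an allocation table, with a represented model that matches the user's mental model of a recoverable trash can:

```python
from dataclasses import dataclass, field

@dataclass
class FileSystem:
    # Implementation model: files are raw entries in an allocation table.
    allocation_table: dict = field(default_factory=dict)
    trash: list = field(default_factory=list)

    def save(self, name: str, data: bytes) -> None:
        self.allocation_table[name] = data

    # Represented model: "move to trash" mirrors the user's mental model
    # of a recoverable wastebasket, hiding the table manipulation below.
    def move_to_trash(self, name: str) -> None:
        self.trash.append((name, self.allocation_table.pop(name)))

    def restore(self, name: str) -> None:
        for i, (n, data) in enumerate(self.trash):
            if n == name:
                self.allocation_table[n] = data
                del self.trash[i]
                return
```

The user never needs to know that `move_to_trash` is really popping a table entry; the interface speaks the language of the mental model.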

An Overview of Goal-Directed Design

Goal-Directed Design is a process for bridging the gap between user research and a final product design. It ensures that product behavior is defined before a single line of code is written. It is built on the fundamental idea that if you want a product to be successful, you must first understand the user’s goals—their motivations and what they ultimately want to accomplish.

Goals are not the same as tasks. A task is an intermediate step (like entering a password), while a goal is the end condition (like securely accessing bank information). Designing for tasks can lead to incremental improvements, but designing for goals can lead to breakthrough innovations that eliminate unnecessary tasks altogether.

The Goal-Directed Design process consists of six phases:

  • Research: Using ethnographic methods to understand users and the domain.
  • Modeling: Creating detailed user models (personas) and domain models based on research.
  • Requirements Definition: Using personas and scenarios to define the product’s requirements.
  • Framework Definition: Creating the high-level structure for the product’s behavior and visual design.
  • Refinement: Fleshing out the design with detail and nuance.
  • Support: Assisting developers as they build the product.

Chapter 2: Understanding the Problem: Design Research

This chapter explains how to gather the user and domain knowledge that forms the foundation of Goal-Directed Design. It emphasizes the power of qualitative research to uncover the “why” behind user behavior, something quantitative data often misses.

Qualitative vs. Quantitative Research

Most business and engineering cultures prefer quantitative data—numbers, statistics, and metrics. This type of research can tell you “how many” or “how much,” but it often fails to explain why people behave the way they do.

Qualitative research, on the other hand, is about understanding behaviors, attitudes, and contexts in rich detail. Through techniques like interviews and observation, designers can uncover user goals, mental models, and pain points. This approach is not about statistical significance; it’s about gaining deep, empathic insight. While quantitative data (like market segmentation or web analytics) can be useful for identifying a business opportunity or prioritizing features, qualitative research is essential for designing the right product in the first place.

The Goal-Directed Design Research Process

Effective design research is a structured process. It begins with understanding the business and technical landscape and moves outward to understand users in their own environment.

  1. Kickoff Meeting: The design team meets with key stakeholders to understand the product vision, business goals, technical constraints, and perceived user base. This is the first chance to ask “Who are the users?” and “What do they need?”
  2. Literature Review: Designers should review any existing documents—marketing plans, brand strategy, prior usability studies, user surveys, technical papers, and competitor analysis. This provides critical domain knowledge.
  3. Product and Competitive Audits: The team should analyze the existing product (if any) and its main competitors to understand the current state of the art and identify strengths and weaknesses.
  4. Stakeholder Interviews: Interviewing stakeholders one-on-one (from marketing, engineering, sales, and management) is crucial for understanding different perspectives on the product vision and identifying potential organizational misalignments.
  5. Subject Matter Expert (SME) Interviews: In complex domains like medicine or finance, SMEs provide invaluable knowledge about industry practices and regulations. However, designers must remember that SMEs are often expert users and may not represent the needs of intermediates.
  6. User and Customer Interviews: This is the heart of design research. It’s critical to distinguish between customers (who buy the product) and users (who use it daily), as their goals are often different.

Ethnographic Interviews: Observing and Interviewing Users

The most effective way to gather user data is through ethnographic interviews, which combine observation and directed conversation in the user’s own environment (a practice sometimes called contextual inquiry).

The key principles of this approach are:

  • Interview where the work happens: Observing users in their natural context reveals critical details about their workflow, environment, and the artifacts they use (like sticky notes pasted on a monitor—a sure sign of an unmet need).
  • Focus on goals, not just tasks: The primary objective is to understand why users do what they do, not just what they do. This allows designers to streamline or eliminate tasks that don’t serve a goal.
  • Avoid making the user a designer: Users are great at describing their problems, but not at envisioning solutions. When a user suggests a feature, the designer should ask, “How would that help you?” to get to the root problem.
  • Encourage storytelling: Specific stories about past experiences reveal more about a user’s motivations and frustrations than abstract questions.
  • Use open-ended questions: Instead of a fixed script, use questions that begin with “Why,” “How,” and “What” to encourage detailed responses.

Chapter 3: Modeling Users: Personas and Goals

After conducting research, you’re left with a mountain of data. This chapter explains how to synthesize that data into a powerful design tool: personas. Personas are detailed, composite archetypes that represent a specific group of users, allowing designers to focus on a tangible target.

The Power of Personas

Designing for a vague “user” is a recipe for failure, as it allows for the elastic user—a shapeless concept that can be stretched to fit any designer’s or developer’s opinion. Personas solve this problem by providing a precise and unforgettable design target.

Personas are not stereotypes, and they are not “made up.” They are rigorously synthesized from real-world research and observation. A persona encapsulates a distinct set of behaviors, goals, and motivations observed across multiple real people, and its name, photo, and story engage the empathy of the entire product team.

Using personas helps avoid common design pitfalls:

  • It prevents designing for the elastic user.
  • It prevents self-referential design, where designers build products for themselves.
  • It helps prioritize features and avoid focusing on obscure edge cases.

Understanding User Goals (Visceral, Behavioral, Reflective)

To make personas truly powerful, they must have goals. Goals are the driving force behind behavior. The book identifies three types of user goals, which map to Don Norman’s three levels of cognitive processing:

  1. Experience Goals (Visceral): These describe how a user wants to feel while using a product. Do they want to feel smart, in control, or have fun? These goals guide the product’s visual style, tone, and microinteractions.
  2. End Goals (Behavioral): These are the user’s motivations for performing tasks. What does the user want to do? Examples include “stay connected with friends” or “find the music I’ll love.” End goals are the primary driver of a product’s functions and interaction design.
  3. Life Goals (Reflective): These are the user’s deep, personal aspirations. Who does the user want to be? Examples include “be a good parent” or “be respected by my peers.” Products that help users achieve life goals create fiercely loyal customers.
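One way to keep all three goal levels visible during design is to record them explicitly alongside the persona. This is a minimal sketch, not a prescription from the book; the persona name and goals are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    experience_goals: list = field(default_factory=list)  # how they want to feel
    end_goals: list = field(default_factory=list)         # what they want to do
    life_goals: list = field(default_factory=list)        # who they want to be

# Hypothetical persona for a personal-finance product.
frank = Persona(
    "Frank",
    experience_goals=["feel in control of my money"],
    end_goals=["pay bills on time"],
    life_goals=["retire comfortably"],
)
```

Keeping the three levels separate makes it harder to design only for tasks while forgetting the visceral and reflective layers.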

The 8-Step Process for Constructing Personas

Creating effective personas is a systematic process, not an informal brainstorming session.

  1. Group interview subjects by role: Look for role-based patterns in your research data (e.g., “receptionist,” “traveling salesperson”).
  2. Identify behavioral variables: For each role, list the distinct aspects of observed behavior (e.g., frequency of use, technical aptitude, motivations).
  3. Map interview subjects to behavioral variables: Place each interviewee along a spectrum for each variable to see where they cluster.
  4. Identify significant behavior patterns: Look for clusters of subjects across multiple variables. A group that clusters together on 6-8 variables likely represents a core persona.
  5. Synthesize characteristics and define goals: For each pattern, synthesize the behaviors, environment, frustrations, and goals. Give the persona a name and basic demographic details.
  6. Check for completeness and redundancy: Ensure your set of personas represents the full range of observed behaviors and that each persona is meaningfully distinct.
  7. Designate persona types: Prioritize the personas to create a clear design target. The main types are:
    • Primary Persona: The single, main target for the design of a specific interface. The product is designed to satisfy the primary persona completely.
    • Secondary Persona: Mostly satisfied by the primary’s interface but has a few unique needs that can be accommodated without compromising the primary’s experience.
    • Supplemental, Customer, Served, and Negative Personas: These other types help address the needs of other stakeholders or explicitly state who the product is not for.
  8. Expand the description of attributes and behaviors: Write a 1-2 page narrative that brings the persona to life, describing a “day in the life” and focusing on details relevant to the product.
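Steps 2-4 above (identifying behavioral variables, mapping subjects, and spotting patterns) are done by eye in practice, typically on a whiteboard, but the underlying idea can be sketched in code. Subject names, variables, and scores below are invented for illustration:

```python
# Each interview subject is mapped onto behavioral variables,
# scored here on a 0-1 spectrum.
subjects = {
    "Ann":  {"frequency_of_use": 0.9,  "tech_aptitude": 0.8,  "price_sensitivity": 0.2},
    "Ben":  {"frequency_of_use": 0.85, "tech_aptitude": 0.75, "price_sensitivity": 0.3},
    "Cara": {"frequency_of_use": 0.2,  "tech_aptitude": 0.3,  "price_sensitivity": 0.9},
}

def cluster_score(a: dict, b: dict, threshold: float = 0.2) -> int:
    """Count the behavioral variables on which two subjects sit close together."""
    return sum(1 for v in a if abs(a[v] - b[v]) <= threshold)

# Pairs that agree on most variables suggest one significant behavior
# pattern, and hence one candidate persona.
pairs = {(x, y): cluster_score(subjects[x], subjects[y])
         for x in subjects for y in subjects if x < y}
```

Here Ann and Ben cluster on all three variables and would likely be synthesized into a single persona, while Cara represents a distinct pattern.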

Chapter 4: Setting the Vision: Scenarios and Design Requirements

This chapter shows how to bridge the gap between your research-driven personas and a concrete product design. The key is using narrative—creating stories about your personas—to imagine an ideal user experience.

Scenarios: Narrative as a Design Tool

Scenarios are the primary tool for envisioning a product’s design. A persona-based scenario is a concise story describing a persona using a future product to achieve their goals. Scenarios are powerful because they focus on human activity rather than just features or technology. They allow designers to “role-play” as the persona, which helps them imagine a more fluid and goal-directed interaction.

This approach is different from use cases or user stories, which are often just lists of functional requirements. A scenario is a true narrative that explores the context and motivation behind a persona’s actions.

The book outlines three types of scenarios used at different stages:

  • Context Scenarios: High-level stories created before any design begins. They focus on the ideal experience from the persona’s perspective.
  • Key Path Scenarios: More detailed scenarios that describe the primary pathways a user takes through the interface.
  • Validation Scenarios: “What-if” scenarios used to test the design against less common situations or edge cases.

The Requirements Definition Process

To avoid jumping straight to a solution without defining the problem, the authors outline a five-step process for defining design requirements.

  1. Create Problem and Vision Statements: A problem statement concisely frames the situation that needs changing for both the user and the business (e.g., “Customer satisfaction is low because users lack tools to achieve goal G”). A vision statement inverts this to create a high-level design mandate.
  2. Explore and Brainstorm: This step is for getting all preconceived ideas about the solution out on the table so they don’t pollute the scenario creation process.
  3. Identify Persona Expectations: Analyze your research to understand the persona’s mental model. How do they think about the basic elements and actions in the domain? This ensures the design’s represented model will align with how they think.
  4. Construct Context Scenarios: Write a “day in the life” story for your primary persona. This narrative should be broad and focus on the ideal experience, treating the product like a “magic” black box that helps the persona achieve their goals with minimal effort. This is where the core of the design vision is born.
  5. Identify Design Requirements: After drafting the scenario, analyze it to extract the user’s needs. These requirements are not features; they are a formal definition of the information and capabilities the persona needs to achieve their goals. These needs are categorized as:
    • Data requirements: The objects and information that must be represented.
    • Functional requirements: The operations that can be performed on the objects.
    • Contextual requirements: The relationships between objects and environmental considerations.
    • Other requirements: Business, brand, technical, and customer needs.
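The output of step 5 can be thought of as a structured list rather than a feature backlog. The following sketch shows one way to capture categorized requirements extracted from a context scenario; the persona name and the specific needs are illustrative, not the book's:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    DATA = "data"
    FUNCTIONAL = "functional"
    CONTEXTUAL = "contextual"
    OTHER = "other"

@dataclass
class Requirement:
    persona: str
    need: str          # a need, phrased from the persona's perspective
    category: Category

requirements = [
    Requirement("Vivien", "a prioritized list of today's appointments", Category.DATA),
    Requirement("Vivien", "call a contact with a single action", Category.FUNCTIONAL),
    Requirement("Vivien", "usable one-handed while walking", Category.CONTEXTUAL),
]

def by_category(reqs: list, cat: Category) -> list:
    return [r.need for r in reqs if r.category is cat]
```

Note that each entry states a need (“call a contact with a single action”), not a feature (“quick-dial button”); translating needs into interface elements happens later, during framework definition.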

PART II: Making Well-Behaved Products

Chapter 5: Designing the Product: Framework and Refinement

This chapter details the process of translating your scenarios and requirements into a concrete design. It is broken into two main phases: defining the high-level Design Framework and then filling in the details during Refinement.

Creating the Design Framework

The Design Framework defines the overall structure of the user’s experience. This isn’t about pixel-perfect details; it’s about defining the underlying organizing principles, the arrangement of functional elements, and the visual and form language.

The process for defining the interaction framework involves six steps, which are often iterative:

  1. Define form factor, posture, and input methods: Is it a desktop app, a mobile device, or a kiosk? How much attention will the user devote to it? The book defines different product postures, such as sovereign (an application that monopolizes the user’s attention, like a word processor) and transient (an application used for brief, intermittent tasks, like a calculator).
  2. Define functional and data elements: This step translates the abstract requirements from the previous chapter into concrete interface elements. For example, the requirement to “call a contact” might translate into functional elements like “voice activation,” “quick-dial buttons,” and “selecting a contact from a list.”
  3. Determine functional groups and hierarchy: Group related elements to support the user’s workflow. What needs to be on the screen at the same time? What is most important? This is where you begin organizing elements into panes and other top-level containers.
  4. Sketch the interaction framework: Using a whiteboard or simple drawing tool, create low-fidelity sketches of the interface. This “rectangles phase” focuses on the high-level layout of panes and controls, not on detail. This allows for rapid iteration without getting tied to a flawed idea.
  5. Construct key path scenarios: Refine the context scenarios from the previous chapter into key path scenarios. These stories describe how the persona interacts with the product using the specific vocabulary of your new interaction framework. These are often storyboarded alongside the sketches.
  6. Check designs with validation scenarios: Test the framework against less common situations (alternative scenarios), infrequent but necessary tasks (necessary-use scenarios), and error conditions (edge-case scenarios) to ensure the design is robust.

Refining the Form and Behavior

Once the framework is stable, the Refinement phase begins. This is where designers fill in the details, translating the low-fidelity sketches into full-resolution screens. This process follows the same steps as the framework definition, but at a more granular level, focusing on the design of individual controls, information displays, and microinteractions.

During this phase, designers create a form and behavior specification (often called a design blueprint), which is a detailed document that developers can use to build the product.

Validating and Testing the Design

While personas and scenarios are powerful tools for “testing” a design throughout the process, it’s often desirable to get feedback from real users. The book distinguishes between research and testing: research happens before design to inform it, while usability testing happens after a design concept exists to validate it.

Usability testing is most effective for:

  • Refining details like button labels and task flow.
  • Identifying major problems with the interaction framework.
  • Assessing first-time use and discoverability.

The book recommends formative evaluations—quick, qualitative tests conducted during the Refinement phase. This allows designers to observe how users respond to the design and make adjustments before development is complete.

Chapter 6: Creative Teamwork

This chapter focuses on the practice of design—how to work effectively in teams to create great products. It introduces the idea of thought partnership, a collaborative model that balances different creative styles to produce better outcomes.

Generators and Synthesizers

The authors identify two complementary creative roles that are essential for effective teamwork:

  • Generators: These are the people who instinctively grab a marker and head to the whiteboard. They are concrete thinkers who excel at ideating and exploring new solutions. When working alone, they can sometimes zoom in on an incomplete idea too quickly.
  • Synthesizers: These individuals ask questions to guide and focus the conversation. They are skilled at clarifying, finding connections, and ensuring the proposed solution addresses the user’s goals. When working alone, they can sometimes get stuck in analysis.

A successful thought partnership balances these two styles. The Generator is responsible for driving the concept direction, while the Synthesizer is responsible for ensuring the concept is coherent and consistent. This dialog between ideation and evaluation allows teams to quickly identify the promise in new ideas and discard those that are dead ends.

Working Across Design and Development Teams

Great products are created by extended teams that include designers, developers, marketers, and business leaders. For this to work, each group must have clear responsibilities and authority.

  • Design is responsible for the user’s experience and how the product looks, feels, and behaves.
  • Engineering is responsible for construction and must have authority over the development platform and process.
  • Marketing is responsible for defining the market opportunity and inspiring adoption.
  • Business is responsible for profitability and driving decisions about product priorities.

The book offers specific advice on collaborating with agile development teams. In an agile context, designers must be able to define the core elements of the user experience before development sprints begin, ensuring that fast work is not aimless work. The most valuable outcome of agile processes for designers is the ability to get frequent feedback on the user experience as it’s being built.

Chapter 7: A Basis for Good Product Behavior

This chapter introduces the fundamental concepts that form the basis for well-designed product behavior: design values, principles, and patterns. These tools help designers translate user needs into effective and desirable solutions.

Design Values

Design values are ethical and practical imperatives that guide the practice of design. A design solution should be:

  • Ethical: It should do no harm and improve the human situation. This includes considering everything from a user’s dignity to the product’s environmental impact.
  • Purposeful: It must be useful and usable, helping users achieve their goals while accommodating their contexts and abilities.
  • Pragmatic: It must be viable for the business and technically feasible to build.
  • Elegant: It should be the simplest complete solution, possess internal coherence, and appropriately stimulate cognition and emotion.

Interaction Design Principles

Design principles are generalizable guidelines for creating good product behavior and form. They operate at different levels of detail:

  • Conceptual principles define what a product should be and how it fits into the user’s life (covered in Chapters 8-13).
  • Behavioral principles describe how a product should behave in general and in specific contexts (covered in Chapters 14-17).
  • Interface-level principles describe effective strategies for organization, navigation, and communication (covered in Chapters 18-21).

One of the primary goals of these principles is to minimize the cognitive, memory, visual, and physical work required of the user.

Interaction Design Patterns

Design patterns are exemplary, reusable solutions to common design problems. By capturing best practices, patterns help designers:

  • Reduce design time and effort.
  • Improve the quality of solutions.
  • Facilitate communication between designers and developers.
  • Educate new designers.

The book identifies several types of patterns, including postural patterns (like sovereign or transient), structural patterns (like the Organizer-Workspace layout in desktop apps), and behavioral patterns (like specific widget behaviors).

Chapter 8: Digital Etiquette

This chapter presents a foundational principle: since we unconsciously treat computers like people, software should behave like a considerate human being. This means designing products that are respectful, generous, and helpful.

Designing Considerate Products

A considerate product understands and respects the user’s goals and needs. The authors outline several characteristics of considerate behavior:

  • Take an interest: A considerate product remembers a user’s actions and preferences.
  • Are deferential: Software should submit to the user, not the other way around. It should never pass judgment or limit a user’s actions unnecessarily.
  • Are forthcoming: It should volunteer useful, related information without being asked.
  • Use common sense: It shouldn’t offer inappropriate functions in inappropriate places.
  • Anticipate needs: A smart product can use idle time to perform tasks it anticipates the user will need soon, like pre-loading links on a web page.
  • Are conscientious: It should take initiative to help the user, like helping to disambiguate two files with the same name.
  • Don’t burden you with their problems: A product shouldn’t whine with error messages or demand confirmation for routine tasks.
  • Keep you informed: It should provide rich, modeless feedback about its status.
  • Are perceptive: It should observe user behavior to offer relevant information.
  • Are self-confident: A product shouldn’t second-guess the user with “Are you sure?” messages.
  • Don’t ask a lot of questions: It should provide choices rather than interrogating the user.
  • Know when to bend the rules: It should allow for “fudgeability,” letting users perform actions out of sequence or with incomplete information, just as people do in the real world.
  • Take responsibility: It shouldn’t blame other parts of the system for its failures.
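Several of these traits point the same direction: act confidently, keep the user informed, and make actions reversible instead of interrogating the user. A small invented sketch of “self-confident” behavior, replacing an “Are you sure?” dialog with immediate action plus undo:

```python
class DocumentList:
    """Illustrative only: deletion acts at once and offers undo,
    rather than demanding confirmation up front."""

    def __init__(self, docs):
        self.docs = list(docs)
        self._undo = []

    def delete(self, name: str) -> str:
        # No confirmation dialog; the action happens immediately...
        self.docs.remove(name)
        self._undo.append(name)
        # ...and modeless feedback offers a way back.
        return f"Deleted {name}. Undo?"

    def undo(self) -> None:
        if self._undo:
            self.docs.append(self._undo.pop())
```

The confirmation dialog punishes every routine deletion to guard against a rare mistake; undo guards against the mistake without taxing the routine case.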

Designing Social Products

When software mediates communication between people, it must also adhere to social norms. It should understand the difference between social norms (the unspoken rules between friends) and market norms (the rules of business). It should allow users to present their best side, facilitate easy collaboration, and respect the complexity of social circles. Importantly, it must respect user privacy and provide tools for dealing with anti-social behavior like griefing.

Chapter 9: Platform and Posture

This chapter explores how a product’s hardware platform and its behavioral posture are the first critical decisions in defining an interaction framework. A product’s posture is its behavioral stance—how it presents itself based on how much attention a user will devote to it.

Postures for the Desktop

Desktop applications typically fall into one of three postures:

  • Sovereign Posture: These are applications that monopolize a user’s attention for long periods, like word processors or spreadsheets. They should be designed for full-screen use, support rich input and feedback, and be optimized for intermediate users.
  • Transient Posture: These applications are used for brief, intermittent tasks, like a calculator or a volume control. They should be simple, clear, and to the point, with bold graphics and built-in instructions. They must remember their previous position and configuration.
  • Daemonic Posture: These are applications that run invisibly in the background, like a printer driver or network connection. They should only surface an interface when they need to be configured, and that interface should follow the principles of a transient application.

Postures for the Web

Websites also have postures, which are often a blend of sovereign and transient.

  • Informational Websites: These sites, like Wikipedia or corporate marketing sites, must balance the need to display dense information (a sovereign attribute) with the need for easy navigation for infrequent users (a transient attribute).
  • Transactional Websites: Sites like Amazon or online banking services also blend postures. Users may spend significant time researching products (sovereign), but the act of making a purchase is often a transient task.
  • Web Applications: Rich Internet Applications (RIAs) like Google Docs or Basecamp behave much more like desktop applications. Sovereign web applications should be treated like their desktop counterparts, with rich, modeless interaction.

Postures for Mobile and Other Devices

The context of mobile use dictates a predominantly transient posture. Even though apps are full-screen, interactions are typically brief and task-focused.

  • Smartphones and Handhelds: Early devices had a satellite posture, acting as extensions of a desktop computer. Modern smartphones have a standalone posture, functioning as powerful, general-purpose computers in their own right.
  • Tablets: Larger tablets can support true sovereign-posture apps, especially for productivity and creative tasks. However, they must still account for touch-based input.
  • Kiosks: Kiosks are almost always transient. They must be optimized for first-time use, with simple navigation and large, clear controls.
  • Ten-Foot Interfaces (TVs): These interfaces, controlled by a remote, must have simple navigation (often mapping to a five-way D-pad) and be easily readable from across a room.
  • Automotive Interfaces: Safety is the primary concern. These interfaces must minimize driver distraction, use direct control mappings, and provide clear audible and visual feedback.

Chapter 10: Optimizing for Intermediates

This chapter tackles a core dilemma in design: how to serve beginners, experts, and everyone in between. The solution is to reject the idea that users are either novices or power users. The vast majority of users, for the majority of the time, are perpetual intermediates.

The Myth of Beginners and Experts

The user population for any activity follows a bell curve. There are a few beginners on one end and a few experts on the other, but the vast majority of people are in the middle. Furthermore, this is a dynamic process. Nobody wants to remain a beginner. Users either quickly learn enough to become intermediates, or they abandon the product. At the same time, becoming a true expert requires a significant time commitment that most people are unwilling or unable to make. Therefore, the most stable and largest group of users is the perpetual intermediates.

The most important design principle is to optimize for intermediates. This doesn’t mean ignoring beginners or experts, but it means the design should not be compromised for their sake. You shouldn’t “weld on training wheels” that get in the way of intermediates.

Inflecting the Interface for Intermediates

The key to serving intermediates while still accommodating beginners and experts is to inflect the interface: organize it so that the most commonly used tools sit closest at hand, while the rest recede until needed.

  • Place the most frequently used functions in the most immediate and convenient locations (like toolbars).
  • Push less-frequently used functions deeper into the interface (like menus or dialogs).

This approach is guided by the principle of commensurate effort: people will willingly work harder for something that is more valuable to them. A complex feature is acceptable if it delivers a powerful result, but a simple task must have a simple interaction.

Designing for the Three Experience Levels

A well-balanced product provides different pathways for users at different levels.

  • For Beginners: The goal is to get them to intermediacy as quickly and painlessly as possible. This means providing a clear conceptual model and pedagogic tools like menus and guided tours.
  • For Experts: Experts demand speed and access. They want invisible commands like keyboard shortcuts and gestures for everything they do frequently.
  • For Perpetual Intermediates: Intermediates need fast access to their working set of commonly used tools. They appreciate immediate commands like toolbar buttons and benefit from memorization vectors—cues in the interface (like showing keyboard shortcuts in a menu) that help them learn faster ways to do things.

Chapter 11: Orchestration and Flow

This chapter focuses on designing interactions that help users achieve a state of deep, productive concentration known as flow. When a product is well-orchestrated, the interface becomes transparent, allowing the user to focus on their goals, not on the tool itself.

The Principles of Harmonious Interaction

Creating flow requires harmonious organization, where all elements of the interface work together coherently.

  • Follow users’ mental models: Design the product to work the way users think, not the way it’s built.
  • Less is more: Strive to reduce the number of elements on the screen without reducing the product’s capabilities.
  • Let users direct, not discuss: Users want to feel like they are in control of a tool, not engaged in a conversation with a machine.
  • Provide choices, not questions: Toolbars offer choices; dialog boxes ask questions. The former is empowering, the latter feels like an interrogation.
  • Keep necessary tools close at hand: Frequently used tools should be immediately accessible, not buried in menus.
  • Provide modeless feedback: Information about the product’s status should be integrated into the main interface, not delivered via disruptive dialogs.
  • Design for the probable, but anticipate the possible: Don’t bother users with options for rare edge cases. Instead, design for the most common path and provide a robust Undo function.
  • Contextualize information: Present data in a way that helps users understand it. A pie chart showing disk space is more helpful than a raw number of bytes.
  • Reflect object and application status: An object should visually indicate its current state (e.g., a “read” email should look different from an “unread” one).
  • Avoid unnecessary reporting: Don’t stop the proceedings to announce normal operations.
  • Avoid blank slates: Most users find a blank document intimidating. Provide templates or smart defaults to give them a starting point.
  • Differentiate between command and configuration: Invoking a function (like Print) should be a separate, more immediate action than configuring it (like Print Setup).
  • Hide the ejector seat levers: Dangerous or irreversible functions should not be easily accessible.

Motion, Timing, and Transitions

Judicious use of animation and motion can significantly enhance flow. It can focus user attention, show relationships between objects, maintain context during transitions, and create a more immersive experience. Animations should be short, simple, meaningful, and feel natural.

Chapter 12: Reducing Work and Eliminating Excise

This chapter identifies one of the biggest sources of user frustration: excise. Excise is any work the user is forced to do that does not directly contribute to achieving their goals. It’s the “tax” a product imposes on its users. To make products better, we must eliminate excise wherever possible.

Types of Excise

Excise comes in many forms, each representing a different kind of unnecessary work for the user.

  • Navigational Excise: This is the effort required to move around an interface—between windows, panes, pages, or tools. It is one of the most common and disruptive forms of excise.
  • Skeuomorphic Excise: This occurs when an interface is forced to conform to a real-world metaphor, inheriting the limitations of the physical object. A digital calendar that only shows one month at a time because paper calendars do is a classic example.
  • Modal Excise: This is the excise of being stopped by a dialog box. Error messages, alerts, and confirmations are the biggest culprits. They stop the user’s flow, often to report something obvious or to ask a question the application should be able to answer itself.
  • Stylistic Excise: This is the visual work required to decode an overly stylized or cluttered interface, where it’s hard to distinguish between controls, content, and decoration.

How to Eliminate Excise

Eliminating excise requires a goal-directed approach to design.

  • Reduce the number of places to go: Minimize the number of windows, panes, and pages. Combine related functions into a single, cohesive view.
  • Provide signposts and overviews: Use persistent navigation elements, breadcrumbs, and overview panes (like the Navigator palette in Photoshop) to help users stay oriented.
  • Properly map controls to functions: Ensure that a control’s appearance and location clearly relate to what it does. Avoid the “four burners, four knobs in a line” problem.
  • Avoid deep hierarchies: Most people think in terms of monocline grouping (a single layer of organization, like folders in a file cabinet). Deeply nested hierarchies are difficult for users to navigate.
  • Don’t replicate Mechanical-Age models: Rethink how a digital product can improve on its physical counterpart instead of just mimicking its form. A digital calendar doesn’t need to be bound by the limitations of paper.

The core principle is to make the computer do the work. The user should be focused on their goals, not on managing the tool.

Chapter 13: Metaphors, Idioms, and Affordances

This chapter argues that the era of designing interfaces around metaphors is over. While metaphors were once seen as the key to making computers “intuitive,” they are often limiting and inefficient. A better approach is to design using learnable idioms.

The Problem with Metaphoric Interfaces

A visual metaphor relies on the user making a connection between an image and a real-world object to understand its function (e.g., a trash can icon for deleting files). While this can help a first-time user, metaphors have serious drawbacks:

  • They are limiting: Tying an interface to a physical object means inheriting its limitations. A digital file doesn’t need to be in only one “folder.”
  • They don’t scale well: A desktop metaphor with a few file icons worked fine on a 20 MB hard drive, but it’s useless for managing thousands of files.
  • There aren’t enough good ones: It’s hard to find good metaphors for abstract processes, which is what most software does.
  • They are culturally specific: An image can have different meanings in different cultures.

The Power of Idiomatic Interfaces

An idiomatic interface is based on the idea that people are incredibly good at learning and remembering simple, unique conventions, or idioms. We don’t understand an idiom like “kick the bucket” by thinking about its literal meaning; we understand it because we’ve learned it.

Most of what we consider “intuitive” in graphical interfaces is actually idiomatic. We learn what a window, a menu, a hyperlink, or a scrollbar does, and then we apply that knowledge everywhere. The key is that good idioms need to be learned only once.

The power of graphical interfaces comes from a restricted vocabulary of interaction primitives: point, click, and drag. These primitives are combined to form compounds (like buttons and text fields), which are then combined to form a rich language of application-specific idioms.

Manual Affordances and Direct Manipulation

  • Affordance refers to the perceived properties of an object that suggest how it can be used. A flat plate on a door affords pushing; a handle affords pulling. In interfaces, we create virtual manual affordances (like the 3D look of a button) to suggest interaction.
  • Direct Manipulation is the ability to act on visual objects directly, with immediate, visible results. It’s a powerful and engaging idiom, but it’s not always appropriate. It requires the user to do the task directly, and sometimes the user may not be very good at it.
  • Pliancy is the term for how an object visually communicates that it can be manipulated. This can be done with static hinting (like a button’s 3D rendering), dynamic hinting (like a rollover effect), or cursor hinting (where the cursor changes shape).

Chapter 14: Rethinking Data Entry, Storage, and Retrieval

This chapter tackles one of the most frustrating aspects of software: how it handles data. Traditional systems force users to think like a database, with rigid data entry rules and confusing file systems. A better approach is to design systems that are more flexible and human-centered.

Rethinking Data Entry

The traditional approach to data entry is focused on data integrity, which means preventing “bad” data from ever entering the system. This leads to rigid forms and frequent, frustrating error messages. A better approach is data immunity, where the system is smart enough to handle incomplete or imperfect data.

  • Audit, don’t edit: Instead of rejecting user input, the application should accept it and provide modeless feedback (like the red wavy underline for a typo in a word processor) to let the user know there might be a problem.
  • Accommodate “fudgeability”: In the real world, people often work with incomplete information. The system should allow users to save a transaction even if a non-critical field is missing, with the expectation that it can be fixed later.
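The "audit, don't edit" idea can be made concrete with a small sketch. This is a hypothetical form (the `OrderForm` and `FieldAudit` names are invented for illustration) that always accepts a submission and attaches modeless warnings instead of rejecting input:

```python
from dataclasses import dataclass, field

@dataclass
class FieldAudit:
    """A non-blocking warning attached to a field, shown modelessly."""
    field_name: str
    message: str

@dataclass
class OrderForm:
    """Hypothetical form practicing data immunity: imperfect data is accepted."""
    customer: str = ""
    phone: str = ""
    audits: list = field(default_factory=list)

    def submit(self):
        """Audit, don't edit: save whatever we have, but flag possible problems."""
        self.audits.clear()
        if not self.phone:
            # Non-critical field missing: warn, but still allow the save.
            self.audits.append(FieldAudit("phone", "Phone number missing; fill in later"))
        elif not self.phone.replace("-", "").isdigit():
            self.audits.append(FieldAudit("phone", "Phone number looks unusual"))
        return True  # The transaction always goes through.

form = OrderForm(customer="Widgetco")
assert form.submit()                      # saved despite the missing phone number
print([a.message for a in form.audits])   # → ['Phone number missing; fill in later']
```

The design choice to note: validation produces advisory audits rather than blocking errors, so the user never loses work to a rigid form.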

Rethinking Data Storage: The Unified File Model

The traditional model of data storage, with its separate “copy in memory” and “copy on disk,” is an implementation model that is confusing to users. This leads to the dreaded “Do you want to save changes?” dialog, which is almost always answered with “Yes.”

The solution is a unified file model, where the user perceives only a single, persistent document.

  • Save automatically: The application should save the user’s work automatically and continuously in the background. The manual “Save” command becomes unnecessary for most users.
  • Provide explicit functions for user goals: Instead of a confusing “Save As” dialog, provide separate, explicit functions for what the user actually wants to do: Rename, Move, Create a Copy, or Create a Version.
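A minimal sketch of the unified file model might look like the following. The class and method names are hypothetical, and a dictionary stands in for disk storage; the point is that every edit is written through automatically, and "Save As" is replaced by explicit, goal-directed functions:

```python
class Document:
    """Sketch of a unified file model: one persistent document,
    saved automatically on every change."""

    def __init__(self, name, store):
        self.name = name
        self.store = store          # dict standing in for disk storage
        self.content = ""
        self.versions = []
        self._autosave()

    def _autosave(self):
        # Every edit is written through immediately; no manual "Save" command.
        self.store[self.name] = self.content

    def edit(self, text):
        self.content = text
        self._autosave()

    # Explicit, goal-directed functions replace the "Save As" dialog:
    def rename(self, new_name):
        del self.store[self.name]
        self.name = new_name
        self._autosave()

    def create_copy(self, copy_name):
        self.store[copy_name] = self.content

    def create_version(self):
        self.versions.append(self.content)

    def revert(self, index):
        self.edit(self.versions[index])

store = {}
doc = Document("report.txt", store)
doc.edit("Draft 1")
doc.create_version()
doc.edit("Draft 2")
doc.revert(0)
print(store["report.txt"])  # → Draft 1
```

Because persistence is continuous, the "Do you want to save changes?" question never needs to be asked.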

Rethinking Data Retrieval

The traditional file system is a coupled storage and retrieval system based on location. To find a file, you have to remember where you put it. This is a Mechanical-Age model that ignores the power of computers.

A better approach is an attribute-based retrieval system. This separates storage from retrieval and allows users to find documents based on their inherent qualities (e.g., “Show me the Word documents related to ‘Widgetco’ that I modified yesterday”). This is the principle behind search tools like Apple’s Spotlight.

When designing a query interface for such a system, avoid asking users to learn complex Boolean logic. Instead, use constrained natural-language output, where users construct a search sentence by choosing from a series of drop-down menus, which guarantees a valid query.
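The constrained-query idea can be sketched as follows. Each "drop-down" offers only valid choices, so any combination the user picks yields a well-formed query; the categories and mappings here are invented for illustration:

```python
# Each drop-down offers only valid choices, so there is no Boolean syntax
# for the user to get wrong.
KINDS = {"Word documents": "docx", "spreadsheets": "xlsx"}
WHEN = {"yesterday": 1, "last week": 7}

def build_query(kind_choice, topic, when_choice):
    """Translate the user's sentence choices into concrete filter criteria."""
    return {
        "extension": KINDS[kind_choice],
        "topic": topic,
        "modified_within_days": WHEN[when_choice],
    }

def sentence(kind_choice, topic, when_choice):
    """The natural-language sentence the user assembles from the drop-downs."""
    return f"Show me the {kind_choice} related to '{topic}' that I modified {when_choice}"

q = build_query("Word documents", "Widgetco", "yesterday")
print(sentence("Word documents", "Widgetco", "yesterday"))
print(q)
```

The user reads a sentence; the system receives structured, guaranteed-valid criteria.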

Chapter 15: Preventing Errors and Informing Decisions

This chapter outlines three powerful strategies for creating more forgiving and supportive interfaces: using rich modeless feedback, providing a robust Undo function, and letting users compare and preview changes.

Rich Modeless Feedback

Instead of stopping the proceedings with an error dialog, applications should integrate feedback directly into the main interface.

  • Rich Visual Modeless Feedback (RVMF): This involves using visual cues to provide in-depth information about the status of an object or process. For example, the icon for a printer could show the progress of a print job, or a file icon could show how full a drive is.
  • Positive Audible Feedback: Most applications use sound only for negative feedback (an error beep), which is a public announcement of failure. A better approach is to use positive audible feedback—subtle sounds that confirm successful actions. The absence of sound then becomes a powerful, non-confrontational indicator that something is amiss.

Undo, Redo, and Reversible Histories

Undo is a primary tool for supporting user exploration. It gives users the confidence to try things because they know they can always go back. A good Undo facility must be designed around the user’s mental model, not the implementation model.

The book explores several types of Undo:

  • Single and Multiple Undo: Single Undo reverses only the last action. Multiple Undo allows reversing a sequence of actions, but the standard implementation requires undoing them in strict reverse order, which can be clumsy.
  • Discontiguous Multiple Undo: An ideal Undo system would allow the user to select and reverse a specific past action without undoing the valid actions that came after it.
  • Category-Specific Undo: Just as the Backspace key only undoes typing, an application could have separate Undo functions for different categories of actions, like formatting or object manipulation.
  • Deleted Data Buffers: This is a repository where all deleted text or objects are stored, allowing the user to browse and recover items without having to use the Undo command.
  • Versioning and Reversion: This allows users to save explicit “snapshots” of a document and revert to any previous version.
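The standard multiple-Undo model described above can be sketched with two stacks. This is a minimal illustration (not any particular toolkit's API): actions are undone in strict reverse order, and Redo replays them, while any new action invalidates the redo chain:

```python
class UndoManager:
    """Minimal multiple-Undo sketch using paired action/inverse functions."""

    def __init__(self):
        self.undo_stack = []
        self.redo_stack = []

    def do(self, action, inverse):
        action()
        self.undo_stack.append((action, inverse))
        self.redo_stack.clear()   # a new action invalidates the redo chain

    def undo(self):
        action, inverse = self.undo_stack.pop()
        inverse()
        self.redo_stack.append((action, inverse))

    def redo(self):
        action, inverse = self.redo_stack.pop()
        action()
        self.undo_stack.append((action, inverse))

text = []
mgr = UndoManager()
mgr.do(lambda: text.append("a"), lambda: text.pop())
mgr.do(lambda: text.append("b"), lambda: text.pop())
mgr.undo()            # reverses only the most recent action
print("".join(text))  # → a
mgr.redo()
print("".join(text))  # → ab
```

The strict LIFO ordering is exactly the clumsiness the book points out: this structure cannot reverse an early action without first undoing every valid action after it, which is why discontiguous Undo requires a different model.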

What If: Compare and Preview

Toggling between Undo and Redo is often used as a way to compare a change. A better approach is to provide an explicit compare or what-if function. Even better is to provide a preview that shows the result of an action before the user commits to it. Photo editing apps do this well by showing a gallery of thumbnails, each with a different filter applied.

Chapter 16: Designing for Different Needs

This chapter discusses how to design for a diverse set of user needs, focusing on learnability and help, customizability, localization, and accessibility.

Learnability and Help

An interface can support users with different levels of experience by providing multiple command modalities.

  • Pedagogic Commands: These are commands that teach their use, like menus and descriptive dialog boxes. They are ideal for beginners.
  • Immediate Commands: These are direct-manipulation controls like toolbar buttons and sliders that have an immediate effect. They are ideal for intermediates.
  • Invisible Commands: These are commands that must be memorized, like keyboard shortcuts and gestures. They are ideal for experts.

A well-designed interface provides memorization vectors—cues that help users transition from pedagogic to immediate and invisible commands (e.g., showing the keyboard shortcut next to a menu item).

Other ways to support learnability include:

  • Guided Tours and Overlays: These are excellent for orienting first-time users, especially on mobile devices where pedagogic commands are rare.
  • Galleries and Templates: Providing a gallery of ready-to-use templates is much less intimidating for most users than starting with a blank slate.
  • Traditional Online Help: This should be treated as a reference tool for intermediates, with a strong index and full-text search.

Customizability, Localization, and Accessibility

  • Customizability: A distinction should be made between personalization (decorating the interface, like changing colors) and configuration (moving, adding, or deleting persistent objects, like rearranging a toolbar). Personalization is low-risk and good for all users; configuration is more powerful but should be approached with caution, as it can disrupt navigation.
  • Localization and Globalization: Designing for different cultures requires careful consideration of language, text length, date formats, and cultural meaning of symbols. Immediate, idiomatic interfaces (like icon buttons) are generally easier to globalize than text-heavy pedagogic interfaces.
  • Accessibility: This means designing products that can be used by people with cognitive, sensory, or motor impairments. Key principles include leveraging OS accessibility tools, providing options for display (like high-contrast mode), enabling keyboard access, using clear language, and providing text equivalents for visual elements.

Chapter 17: Integrating Visual Design

This chapter explains how to use visual interface design to clearly communicate a product’s behavior and information. Good visual design is not just decoration; it’s a critical tool for creating a useful, usable, and desirable product.

The Elements of Visual Interface Design

Visual designers use a palette of core elements to create meaning and structure.

  • Shape, Size, and Value (light/dark): These are used to create contrast and establish a clear information hierarchy.
  • Color (Hue and Saturation): Color is a powerful tool for conveying meaning and brand, but it must be used judiciously to avoid creating a “carnival effect.” It should never be the only means of conveying information, as many people are color-blind.
  • Texture and Position: Texture can hint at affordance (e.g., a bumpy surface seems “grippable”). Position is used to group related items and guide the user’s eye.
  • Typography: Text should be easy to read. Use high-contrast text, choose an appropriate typeface, and be succinct.
  • Motion: Animation can be used to direct attention, show relationships, and maintain context during transitions.

Key Visual Interface Design Principles

  • Convey a tone and communicate the brand: The interface should embody the brand promise and feel appropriate for the user’s goals.
  • Lead the user with a clear visual hierarchy: Use visual properties to make it instantly clear what’s most important on a screen and how elements are related. Use the squint test to check your hierarchy.
  • Provide visual structure and flow: Use an alignment grid to create a sense of order and consistency. This makes interfaces easier to scan and more aesthetically pleasing.
  • Signal what the user can do: Use clear affordances, icons, and symbols to communicate what is interactive. When possible, pre-visualize the results of an action.
  • Respond to commands and draw attention to important events: Provide visual feedback for actions and use contrast to draw attention to critical information without being disruptive.
  • Minimize visual work and keep it simple: Eliminate visual noise and unnecessary variation. A good rule of thumb is to take things away until the design breaks, then put the last thing back in.

Chapter 18: Designing for the Desktop

This chapter provides a detailed guide to the primary building blocks of desktop graphical user interfaces (GUIs), focusing on windows, menus, toolbars, and pointer-based interactions.

The Anatomy of a Desktop App

Most sovereign desktop applications are structured around a primary window, which contains the main content and controls. This window is often subdivided into panes:

  • Content Pane: The primary work area.
  • Index Pane: Provides navigation to content objects (like an email inbox).
  • Menu Bar, Toolbars, Palettes, and Sidebars: Collections of controls for accessing functions.

Secondary windows, like dialog boxes, support the primary window and are used for less-frequently used functions. The book argues that many functions currently relegated to dialogs would be better served in a modeless sidebar or task pane within the primary window.

Windows, Menus, and Toolbars

  • Windows: The chapter discusses the pros and cons of overlapping windows (the traditional desktop metaphor) versus tiled windows. It concludes that for sovereign applications, a single, maximized, multipaned window is often the best approach, as it minimizes window management excise.
  • Menus: Menus are primarily a pedagogic tool for beginners. They teach the application’s functionality through their verbose, textual nature. They should be used to provide a complete map of an application’s features and to offer memorization vectors (like icons and keyboard shortcuts) to help users graduate to more immediate commands. Cascading menus are difficult to use and should be avoided.
  • Toolbars, Palettes, and Sidebars: These are collections of immediate controls for perpetual intermediates. They give users fast access to frequently used functions. Icon buttons are the primary idiom here, and their meaning should be clarified with ToolTips. Modern applications have evolved beyond simple toolbars to include customizable toolbars, contextual (pop-up) toolbars, and the Microsoft ribbon control. Sidebars or task panes are an excellent idiom for providing modeless access to functions that would otherwise require a dialog.

Pointing, Selection, and Direct Manipulation

This section covers the mechanics of mouse-based interaction.

  • Mouse Actions: It breaks down the primary mouse actions: point, click, right-click, drag, double-click, and chord-click. A key principle is that mouse-down should be used for selecting an object, while mouse-up should be used for committing to an action on a control. This allows users to gracefully back out of an inadvertent click.
  • Selection: The chapter distinguishes between selecting discrete objects (like icons) and contiguous data (like text). Additive selection (selecting multiple items) should be supported with modifier keys (like Ctrl or Shift). Any selection must be visually evident and unambiguous.
  • Drag and Drop: This is one of the most powerful direct-manipulation idioms. For it to work well, drop candidates must visually indicate their receptivity. The drag cursor must visually identify the source object. Any scrollable drag-and-drop target must also auto-scroll. A critical detail is to debounce all drags by implementing a drag threshold, which prevents an accidental nudge of the mouse from being misinterpreted as a drag.
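The drag-threshold rule can be sketched directly. The threshold value and class below are illustrative (real toolkits choose their own small pixel values): a mouse move only becomes a drag once the pointer travels beyond a small radius from the mouse-down point:

```python
import math

DRAG_THRESHOLD_PX = 4   # illustrative value; real toolkits pick similarly small ones

class DragDetector:
    """Debounce drags: small pointer nudges after mouse-down stay clicks."""

    def __init__(self):
        self.origin = None
        self.dragging = False

    def mouse_down(self, x, y):
        self.origin = (x, y)
        self.dragging = False

    def mouse_move(self, x, y):
        if self.origin and not self.dragging:
            dx, dy = x - self.origin[0], y - self.origin[1]
            if math.hypot(dx, dy) > DRAG_THRESHOLD_PX:
                self.dragging = True   # threshold exceeded: commit to the drag
        return self.dragging

    def mouse_up(self):
        was_click = not self.dragging
        self.origin, self.dragging = None, False
        return was_click   # an accidental 1-2 px nudge still counts as a click

d = DragDetector()
d.mouse_down(100, 100)
print(d.mouse_move(101, 101))  # → False (within threshold: still a click)
print(d.mouse_move(110, 108))  # → True  (threshold exceeded: a drag)
```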

Chapter 19: Designing for Mobile and Other Devices

This chapter focuses on the unique design challenges and patterns of mobile devices and other non-desktop platforms. The rise of multi-touch, sensor-laden devices has created a new set of powerful interaction idioms.

The Anatomy of a Mobile App

The on-the-go, context-driven nature of mobile use dictates that most mobile apps have a transient posture. Even though they are full-screen, interactions are typically brief and task-focused. The form factor also has a huge impact on the design.

  • Handhelds (Smartphones): The tall, narrow screen leads to stack-based layouts (vertical lists and grids).
  • Tablets: The larger screen allows for more desktop-like layouts, such as placing an index pane next to a content pane. Complex authoring apps can adopt more sophisticated layouts, but they must still accommodate finger-sized controls.
  • Mini-Tablets: These devices live in an awkward space between phones and full-sized tablets and require careful layout considerations to avoid feeling either cramped or out of proportion.

Mobile Navigation, Content, and Control Idioms

Mobile apps have evolved a unique set of patterns for navigating and displaying content.

  • Browse Controls: Since browsing is a primary mobile activity, several patterns have emerged:
    • Lists and Grids: The fundamental patterns for displaying content.
    • Carousels and Swimlanes: Horizontally scrolling rows of content that allow users to browse multiple categories on a single screen.
    • Cards: A popular idiom for presenting self-contained chunks of rich media content with associated social actions.
  • Navigation and Tool Bars: These are the primary mechanism for navigating between functional areas. They can be implemented as tab bars (often at the bottom in iOS, top in Android), nav bars (at the top), or a combination. The More… control and tab carousels are clever ways to handle more navigation items than can fit on the screen.
  • Drawers: The “hamburger menu” has become a popular (though controversial) idiom for stowing primary navigation off-screen, freeing up more room for content.
  • Searching, Sorting, and Filtering: Searching is a critical mobile activity. Effective mobile search should use auto-complete, auto-suggest, and categorized suggestions to help users form effective queries with minimal typing.
  • Welcome and Help Screens: Since mobile apps lack the pedagogic tools of a desktop menu, they must use guided tours or overlays to orient first-time users and explain gestural idioms.
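A bare-bones sketch of categorized auto-suggest shows the idea of helping users form queries with minimal typing. The catalog and categories here are invented for illustration; production systems would rank by relevance and match beyond simple prefixes:

```python
# Hypothetical searchable catalog, grouped by category.
CATALOG = {
    "contacts": ["alice chen", "alan cooper"],
    "apps": ["calendar", "calculator", "camera"],
}

def suggest(query, limit=3):
    """Return up to `limit` (category, item) matches for a typed prefix."""
    q = query.lower()
    hits = [(cat, item)
            for cat, items in CATALOG.items()
            for item in items
            if item.startswith(q)]
    return hits[:limit]

print(suggest("cal"))   # → [('apps', 'calendar'), ('apps', 'calculator')]
print(suggest("al"))    # → [('contacts', 'alice chen'), ('contacts', 'alan cooper')]
```

Grouping suggestions by category lets the results screen signpost where each match comes from, which matters on a small display.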

Multi-Touch Gestures

The core gesture vocabulary is small and should be kept that way to ensure learnability.

  • Tap: Used to select, activate, or toggle.
  • Drag: Used to scroll or move objects.
  • Swipe: A flicking gesture used for faster scrolling or navigation.
  • Pinch: Used to zoom in or out.
  • Rotate: Used to rotate objects.

Other Devices

The chapter also touches on design considerations for other platforms:

  • Kiosks: Must be optimized for first-time use, with simple navigation and large, clear controls.
  • Ten-Foot Interfaces (TVs): Must be readable from across a room and navigable with a simple five-way remote control.
  • Automotive Interfaces: Must minimize driver distraction above all else.
  • Audible Interfaces: Must provide clear signposting and easy ways to go back or speak to a human.

Chapter 20: Designing for the Web

This chapter dives into the design considerations unique to the web. While web technologies have become incredibly powerful, allowing for rich, application-like experiences, the fundamental, page-based nature of the medium still shapes many important conventions.

Page-Based Interactions

The page is the core structural unit of the web, and this has significant implications for navigation and information architecture.

  • Navigation and Wayfinding: The effort required to move between pages is a form of navigational excise. Good web design seeks to minimize this.
    • Primary navigation (the main links to different sections) is most effective when placed in a persistent header at the top of the page.
    • For deeper hierarchies, secondary navigation can be provided in a left-hand column or through fat navigation (large, expanding menus).
    • Breadcrumbs are a crucial tool for helping users understand where they are in a site’s hierarchy.
  • Searching: Since most users are not skilled at forming search queries, a good search experience is critical. This involves using auto-complete, auto-suggest, faceted search, and categorized suggestions to help users quickly find what they’re looking for.
  • Scrolling: With the rise of touch interaction, long, scrolling pages have become more common and effective. The key is to make scrolling an engaging experience by creating a good visual rhythm and providing clear orientation cues. It’s important to remember that infinite scrolling and site footers are mutually exclusive idioms. An infinite scroll is appropriate for feeds where recent content is most important, but not for interfaces where users need to get to the end of a list.
  • The Header and Footer: The header is the place for branding, primary navigation, and search. The footer is an excellent place to put links to less-frequently visited areas (like legal notices) or a “fat footer” that serves as a condensed sitemap.

The Mobile Web

Designing for the web today means designing for a huge variety of screen sizes. Responsive design is the contemporary approach for handling this. It involves creating a single, flexible layout that adapts to different screen widths at key breakpoints. While this is a powerful technique, the authors caution that sometimes a separate, dedicated mobile version of a site can be a better choice, as it allows for an experience that is more optimized for the unique context of mobile use (touch, sensors, and challenging lighting conditions).
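The breakpoint logic behind responsive design can be reduced to a small sketch. The widths and layout names below are illustrative only (real sites choose breakpoints from their own content, typically via CSS media queries):

```python
# Viewport width selects a layout at key breakpoints (values are illustrative).
BREAKPOINTS = [
    (1024, "three-column desktop layout"),
    (768,  "two-pane tablet layout"),
    (0,    "single-column phone layout"),
]

def layout_for(viewport_width):
    """Pick the widest layout whose minimum width the viewport satisfies."""
    for min_width, layout in BREAKPOINTS:
        if viewport_width >= min_width:
            return layout

print(layout_for(1280))  # → three-column desktop layout
print(layout_for(800))   # → two-pane tablet layout
print(layout_for(375))   # → single-column phone layout
```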

Chapter 21: Design Details: Controls and Dialogs

This final chapter dives into the nitty-gritty details of the most common building blocks of GUIs: controls and dialogs. While these elements are often provided by standard development libraries, their misuse can lead to significant user frustration.

Controls

Controls can be categorized by their purpose:

  • Imperative Controls (Verbs): These initiate an action. The most common are buttons, icon buttons, and hyperlinks. A key principle is to use links for navigation and buttons for action.
  • Selection Controls (Nouns/Adjectives): These allow users to choose from a set of options. They include check boxes, radio buttons, toggle buttons, switches, list controls, and combo boxes. A powerful idiom for multiple selection in a scrolling list is earmarking—using check boxes next to each item instead of the standard selection highlight.
  • Entry Controls (Nouns): These allow users to enter data. They can be bounded (like a slider or spinner) or unbounded (like a text edit field). A core principle is to use bounded controls for bounded input to make errors impossible.
  • Display Controls: These are used to display information or manage the presentation, like scrollbars, splitters, and drawers.
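The "bounded controls for bounded input" principle can be illustrated with a spinner sketch. The class below is hypothetical: because the control clamps every value to its valid range, out-of-range input is impossible and no error dialog is ever needed:

```python
class Spinner:
    """Bounded entry control sketch: values are clamped, never rejected."""

    def __init__(self, low, high, value):
        self.low, self.high = low, high
        self.value = self._clamp(value)

    def _clamp(self, v):
        return max(self.low, min(self.high, v))

    def set(self, v):
        self.value = self._clamp(v)   # silently constrain; no error message

    def increment(self, step=1):
        self.set(self.value + step)

copies = Spinner(1, 99, 1)
copies.set(150)
print(copies.value)  # → 99 (clamped: the user cannot enter invalid input)
```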

Dialogs

Dialogs are pop-up windows that engage the user in a conversation. The most important principle is that primary interactions belong in the primary window. Dialogs are secondary spaces, appropriate for functions that are out of the main interaction flow, like configuring infrequently used settings.

The book identifies several types of dialogs:

  • Property Dialogs: For viewing and changing settings.
  • Function Dialogs: For controlling a single function, like Print.
  • Process Dialogs: For showing that the application is busy.
  • Notification Dialogs: For reporting events or messages.
  • Bulletin Dialogs: These include errors, alerts, and confirmations.

Eliminating Errors, Alerts, and Confirmations

Bulletin dialogs are one of the most misused and frustrating idioms in software. They stop the proceedings, often to report something trivial or to ask a question the user shouldn’t have to answer.

  • Error Dialogs: Most error dialogs are the result of the application failing to be flexible. The solution is not to write better error messages, but to make errors impossible by using bounded controls and designing more forgiving interactions. Remember, an error may not be your application’s fault, but it is its responsibility.
  • Alerts: These usually announce the obvious (“File Saved!”). They should be eliminated in favor of rich modeless feedback.
  • Confirmations: The “Are you sure?” dialog is the dialog that cried wolf. Users quickly learn to dismiss it without reading, so it fails to protect them when a real danger arises. The solution is to do, don’t ask, and provide a robust Undo function.

By focusing on the details and applying goal-directed thinking to every control and dialog, designers can create products that are not just functional, but truly pleasant and efficient to use.

Key Takeaways

Core Lessons

  • Design for Goals, Not Tasks: The most powerful way to create a better user experience is to understand the user’s ultimate goals and design to meet them. This often allows you to eliminate unnecessary tasks altogether.
  • Design for Perpetual Intermediates: The vast majority of your users are not beginners or experts; they are intermediates. Optimize your design for them by providing immediate access to frequently used functions, while still offering pathways for beginners to learn and experts to become more efficient.
  • Eliminate Excise: Ruthlessly identify and remove any work the user is forced to do that doesn’t directly help them achieve their goals. This includes navigating complex interfaces, managing windows, and dismissing unnecessary dialogs.
  • Make the Interface Idiomatic: Instead of relying on limiting real-world metaphors, design a system of simple, learnable visual and behavioral idioms. Good idioms need to be learned only once.
  • Products Should Be Considerate and Smart: Design your product to behave like a polite, intelligent, and helpful human being. It should remember user preferences, anticipate needs, and take responsibility for its own problems.

Next Actions

  • Create Personas from Real Research: Don’t just make up user profiles. Conduct ethnographic interviews to understand real user behaviors, goals, and motivations, and use that data to synthesize a small set of specific, believable personas. Make one your primary design target.
  • Write Scenarios Before You Design: Before you sketch a single screen, write a “day in the life” story for your primary persona. Use this narrative to envision the ideal user experience and extract the core design requirements.
  • Adopt a Unified File Model: Eliminate the confusing “Save” and “Save As” commands. Design your application to save automatically and provide explicit, goal-directed functions like “Rename,” “Move,” and “Create a Version.”
  • Replace Errors, Alerts, and Confirmations: Stop interrupting users. Use rich modeless feedback to provide status information, make all actions reversible with a robust Undo function, and design interfaces that make errors impossible in the first place.
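The unified file model in the third action above can also be sketched in code. This is an illustrative outline (the `Document` class and its method names are assumptions, not an API from the book): the document persists itself on every change, and explicit goal-directed commands replace the implementation-centric "Save" and "Save As".

```typescript
// Hypothetical sketch of a unified file model: autosave on every edit,
// with goal-directed commands (rename, createVersion) instead of
// "Save" / "Save As".
class Document {
  private versions: string[] = [];

  constructor(public name: string, private content = "") {}

  edit(newContent: string): void {
    this.content = newContent;
    this.autosave(); // persist on every change -- no Save button
  }

  private autosave(): void {
    // In a real application this would write to disk or a server.
  }

  rename(newName: string): void { this.name = newName; }

  createVersion(): void {
    this.versions.push(this.content); // explicit snapshot replaces "Save As"
  }

  versionCount(): number { return this.versions.length; }
}

const doc = new Document("report");
doc.edit("first draft");
doc.createVersion(); // user deliberately keeps this milestone
doc.edit("second draft");
doc.rename("Q3 report");
console.log(doc.versionCount()); // 1
```

The point of the sketch is the shift in vocabulary: the user never manages the application's memory-versus-disk distinction (an implementation-model concern); they only express goals like "keep this version" or "call it something else".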

Reflection Prompts

  • What is the primary goal of the user of my product? What are the intermediate tasks they perform, and can any of those tasks be eliminated or streamlined?
  • Am I designing for a stereotyped “user,” or do I have a clear, specific primary persona in mind? Does every design decision I make serve that persona’s goals?
  • Where does my product ask the user to do excise work? What unnecessary navigation, window management, or modal dialogs can I eliminate?
  • If my product were a person, would it be a considerate and helpful colleague, or a rude and demanding micromanager?