HCI Design Process and User Considerations

The document outlines the design process of Human-Computer Interaction (HCI), detailing phases such as requirement gathering, design, implementation, testing, and deployment. It emphasizes the importance of understanding human characteristics and considerations in HCI to create user-friendly interfaces. Additionally, it discusses the significance of organizing screen elements and content for effective navigation and usability.

Uploaded by Suvarna kumar

Design Process – Human Interaction with Computers (HCI)

Human-Computer Interaction (HCI) is the study and design of how people interact with
computers and systems. The design process in HCI focuses on creating interfaces that are
efficient, intuitive, user-friendly, and meet the needs of users.

Phases of the HCI Design Process

1. Requirement Gathering and Analysis

o Objective: Understand the users, their tasks, and the environment.

o Activities:

▪ User interviews, questionnaires, observations.

▪ Task analysis.

▪ Identifying user goals and challenges.

o Output: User requirements and system specifications.

2. Design

o Objective: Create a prototype or conceptual model of the interface.

o Activities:

▪ Sketching wireframes.

▪ Creating mockups and interactive prototypes.

▪ Applying design principles (consistency, feedback, affordance).

▪ Defining user flows and interaction paths.

o Output: Interface design (low- and high-fidelity prototypes).

3. Implementation

o Objective: Develop the actual user interface using appropriate tools and
technologies.

o Activities:

▪ Front-end coding (HTML, CSS, JavaScript, etc.).

▪ Integration with back-end systems.

▪ Ensuring accessibility, responsiveness, and performance.

o Output: A functional system or interface.


4. Testing and Evaluation

o Objective: Ensure the system meets user needs and is easy to use.

o Activities:

▪ Usability testing (lab-based, remote, A/B testing).

▪ Heuristic evaluation.

▪ Collecting feedback through user testing.

o Output: Evaluation report with usability issues and suggestions for improvement.

5. Deployment and Maintenance

o Objective: Release the system to users and maintain it.

o Activities:

▪ User training and documentation.

▪ Monitoring system usage.

▪ Updating and refining the interface based on user feedback.

o Output: A deployed and user-accepted system.


+------------------------------+
| 1. Requirement Analysis      |
|    - Understand users        |
|    - Task analysis           |
+------------------------------+
               |
               v
+------------------------------+
| 2. Design                    |
|    - Wireframes              |
|    - Mockups/Prototypes      |
+------------------------------+
               |
               v
+------------------------------+
| 3. Implementation            |
|    - Develop interface       |
|    - Integrate systems       |
+------------------------------+
               |
               v
+------------------------------+
| 4. Testing & Evaluation      |
|    - Usability testing       |
|    - Feedback & fixes        |
+------------------------------+
               |
               v
+------------------------------+
| 5. Deployment & Maintenance  |
|    - Release system          |
|    - Monitor & update        |
+------------------------------+
Importance of Human Characteristics & Considerations in HCI (Human-Computer Interaction)

Designing any computer system or interface without considering human characteristics can lead to
poor usability, user frustration, and inefficiency. To create systems that are effective, efficient, and
satisfying, it is essential to understand the physical, cognitive, and emotional attributes of users.

1. Importance of Human Characteristics in HCI

| Human Characteristic | Importance in HCI Design |
|----------------------|--------------------------|
| Perception (Vision, Hearing) | Interfaces should match human sensory limits (e.g., font size, color contrast, audio alerts). |
| Memory | Users can't remember too much – interfaces should reduce cognitive load and offer cues. |
| Attention Span | Limited attention requires clear navigation and avoidance of distractions. |
| Learning Ability | Interfaces should be intuitive and support learnability with simple tutorials or tooltips. |
| Reaction Time | Systems should respond quickly to user actions to match expected feedback time. |
| Physical Abilities | Buttons and interactive elements must be appropriately sized and placed for easy access. |
| Cognitive Load | Complex tasks should be broken into smaller steps to avoid overloading the user's brain. |
| Cultural Factors | Language, symbols, and colors must align with the cultural norms of the target users. |

2. Key Human Considerations in HCI Design

1. User-Centered Design

o Focuses on users' needs, preferences, and limitations throughout the design process.

2. Accessibility

o Design should support users with disabilities (visual, auditory, motor impairments,
etc.).

3. Ergonomics

o Ensures physical comfort while interacting with devices (e.g., screen height,
mouse/keyboard placement).
4. Emotional Experience

o Design must account for how the user feels while using the system—positive
experiences boost usability.

5. Age Group

o Younger and older users may require different design strategies (e.g., larger icons for
seniors).

6. User Expertise

o Novice and expert users need different levels of guidance and interface complexity.

Human Interaction Speeds in HCI (Human-Computer Interaction)

Understanding how fast humans can interact with computers is essential for designing efficient,
responsive, and user-friendly systems. Human interaction speeds refer to how quickly users can
perform tasks like typing, clicking, reading, reacting, and interpreting visual/audio cues.

Common Human Interaction Speeds

| Action | Typical Speed / Time | Design Implication |
|--------|----------------------|--------------------|
| Keystroke | ~200 milliseconds per key (5 keys/sec) | Keyboard shortcuts should be optimized for quick access. |
| Mouse Click | ~150–250 milliseconds | Buttons must be clearly distinguishable and easy to target. |
| Double Click | <500 milliseconds (OS threshold for double-click) | Must allow enough time to differentiate from two single clicks. |
| Eye Fixation (reading text) | ~200–300 milliseconds per word | Text should be simple and easily scannable. |
| Pointing (Fitts's Law) | Time depends on distance and size of the target | Larger and closer targets are easier and faster to click. |
| Reaction Time (to visual stimulus) | ~250 milliseconds | Systems should not require ultra-fast reactions unless designed for trained users. |
| Speech Command Response | ~1–2 seconds (user speaking) | Speech interfaces must account for natural pauses and delays. |
| System Response Expectation | <1000 milliseconds (1 second) | Systems should respond within 1 sec for smooth interaction; delays cause frustration. |
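Fitts's Law can be made concrete with its common Shannon formulation, MT = a + b · log2(D/W + 1), where D is the distance to the target and W its width. A minimal sketch (the coefficients `a` and `b` below are illustrative placeholders; in practice they are fit from measured pointing data for a given device and user population):

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predict pointing time in seconds via MT = a + b * log2(D/W + 1).
    a and b are device/user constants; the defaults here are
    illustrative assumptions, not measured values."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A large, close target is predicted to be faster than a small, distant one.
near_large = fitts_movement_time(distance=100, width=50)
far_small = fitts_movement_time(distance=800, width=10)
assert near_large < far_small
```

The design implication in the table falls out directly: enlarging a button or moving it closer to the pointer lowers the index of difficulty and thus the predicted movement time.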

Factors Affecting Interaction Speeds

• Age and motor skills

• Device type (touchscreen vs mouse)

• Task complexity

• User expertise

• Disability or impairment
+--------------------------------+
|    Human Interaction Speeds    |
+--------------------------------+
 Keystroke (~200 ms/key)        -> quick, efficient input
 Mouse Click (150–250 ms)       -> clearly sized, targetable buttons
 Double Click (<500 ms)         -> differentiate two single clicks
 Eye Fixation (200–300 ms/word) -> easy-to-read text layouts
 Pointing (Fitts's Law)         -> optimize target size & placement
 Reaction Time (~250 ms)        -> clear, immediate feedback
 Speech Command (~1–2 s)        -> account for natural pauses
 System Response (<1 s)         -> maintain fluid interaction

Understanding Business Functions (sometimes called Business "Junctions" in a contextual sense)

Business functions (often mistakenly referred to as "business junctions") are the core activities that help an organization operate efficiently and achieve its goals. These functions work together like junction points in a system to keep the business running smoothly.

Major Business Functions


| Function | Purpose |
|----------|---------|
| 1. Marketing | Identifies customer needs, promotes products/services, builds brand image. |
| 2. Finance | Manages money, budgeting, investments, and financial planning. |
| 3. Human Resources | Recruits, trains, and manages employee relations and development. |
| 4. Operations | Oversees production, logistics, supply chain, and process management. |
| 5. Sales | Converts leads into customers, manages customer relationships. |
| 6. Research & Development (R&D) | Innovates new products/services and improves existing ones. |
| 7. IT/Technology | Maintains business systems, networks, cybersecurity, and tech infrastructure. |
| 8. Customer Service | Handles client issues, support, and post-sale satisfaction. |
| 9. Legal & Compliance | Ensures adherence to laws, industry standards, and internal policies. |
| 10. Administration | Supports the entire business through documentation, scheduling, and general office tasks. |

Screen Designing – Design Goals in HCI (Human-Computer Interaction)

In Human-Computer Interaction, screen design plays a crucial role in shaping how users interact with
software or digital systems. Good screen design ensures that users can achieve their goals efficiently,
effectively, and with satisfaction.
Key Design Goals in Screen Design

| Design Goal | Explanation |
|-------------|-------------|
| 1. Clarity | The content on the screen should be clear, readable, and easy to understand. |
| 2. Consistency | Use uniform design elements (colors, fonts, icons, layouts) across all screens. |
| 3. Simplicity | Avoid clutter. Keep the interface simple and focused on core tasks. |
| 4. Efficiency | Minimize the number of steps to complete a task. Reduce user effort. |
| 5. Feedback | Provide clear and immediate responses to user actions (e.g., loading indicators). |
| 6. Error Prevention | Design the screen to minimize user errors (e.g., disable wrong options). |
| 7. Flexibility | Allow users to customize or control how they interact (e.g., shortcuts, themes). |
| 8. Accessibility | Ensure the design works for users with disabilities (e.g., screen readers, contrast). |
| 9. Aesthetics | The design should be visually appealing without sacrificing usability. |
| 10. Learnability | New users should be able to learn the interface quickly. |

Example: Good Screen Design Features

• Clean layout with proper spacing.

• Clear call-to-action buttons.

• Labels next to input fields.

• Use of color to highlight important information (without overuse).

• Visual hierarchy: headings > subheadings > body text.

Screen Planning and Purpose in HCI (Human-Computer Interaction)

Screen planning is a critical phase in interface and system design. It involves organizing and
structuring the content, elements, and layout of each screen so that users can interact with the
system efficiently and intuitively.
Purpose of Screen Planning

| Purpose | Explanation |
|---------|-------------|
| 1. Enhance Usability | Helps design screens that are easy to use and navigate. |
| 2. Support Task Flow | Ensures that screens follow the logical sequence of the user's tasks. |
| 3. Improve User Satisfaction | A well-planned screen reduces frustration and increases satisfaction. |
| 4. Maintain Consistency | Promotes uniform design, improving recognition and reducing learning time. |
| 5. Reduce Errors | Proper layout and design minimize chances of mistakes during interaction. |
| 6. Optimize Information Presentation | Ensures the right information is displayed at the right time in the right place. |
| 7. Ensure Accessibility | Screens can be planned to meet accessibility needs (contrast, font size, etc.). |
| 8. Facilitate Responsiveness | Planning ensures design works across devices (desktop, mobile, tablet). |

Organizing Screen Elements in HCI

Organizing screen elements refers to the strategic placement of visual and interactive components
(like buttons, menus, icons, text, images) to create a clear, efficient, and user-friendly interface.

Goals of Organizing Screen Elements

• Improve readability and scannability

• Enhance user navigation

• Minimize cognitive load

• Increase task efficiency

• Support consistency and aesthetics


Key Principles for Organizing Screen Elements

| Principle | Explanation |
|-----------|-------------|
| 1. Visual Hierarchy | Use size, color, contrast, and position to show importance (e.g., headings > subheadings > text). |
| 2. Grouping (Gestalt Laws) | Group related items together using spacing, borders, or background color. |
| 3. Alignment | Align text, buttons, and images to a consistent grid (left, center, right) to create order. |
| 4. Consistency | Keep similar elements in the same location across screens (e.g., "Submit" button always at the bottom). |
| 5. Proximity | Place related elements close together to indicate their relationship. |
| 6. Balance | Distribute visual elements evenly across the screen to avoid a crowded or empty look. |
| 7. White Space | Use blank space around elements to reduce clutter and increase readability. |
| 8. Navigation Placement | Menus and key navigation should be in familiar places (e.g., top or left side). |
| 9. Accessibility | Ensure elements are distinguishable (e.g., high contrast, readable font size). |

Typical Layout Zones in Screen Design

+-------------------------------------------------------------------+
| Header: App Name / Branding / Main Navigation |
+-------------------------------------------------------------------+
| Sidebar (Optional): Secondary Menu |
| |
| Main Content Area: |
| - Title |
| - Input Fields / Text / Buttons |
| - Images / Charts / Tables |
| |
| Footer: Status / Legal Info / Help / Contacts |
+-------------------------------------------------------------------+
Tips for Organizing Elements

• Use grid systems for alignment.

• Highlight calls-to-action with size and color.

• Stick to 2–3 font styles only.


• Provide visual feedback (e.g., button highlights on hover).

• Avoid overcrowding — aim for clarity over complexity.

Ordering of Screen Data and Content in HCI

The ordering of screen data and content is the logical and visual arrangement of information on a
screen to support user goals, task flow, and readability. A well-ordered screen helps users
understand what to do, where to look, and how to interact without confusion.

Purpose of Ordering Screen Data

• To present information in a meaningful sequence

• To guide user attention to important content first

• To support task completion and decision-making

• To reduce cognitive effort and avoid overload

• To create a logical flow from one step or idea to another

Common Strategies for Ordering Content

| Strategy | Explanation |
|----------|-------------|
| 1. Top-to-Bottom Flow | Start with the most important or frequently used information at the top. |
| 2. Left-to-Right (Z-pattern or F-pattern) | Users naturally scan left to right, especially in Western languages. |
| 3. Priority-Based Ordering | Display critical data or tasks first (e.g., alerts, errors, totals). |
| 4. Task Sequence | Order content as per task flow (e.g., Step 1 → Step 2 → Step 3). |
| 5. Logical Grouping | Group related information together using boxes, color, or headers. |
| 6. Progressive Disclosure | Show only essential info at first; reveal more as needed (helps avoid clutter). |
| 7. Visual Hierarchy | Use font size, color, and boldness to show importance or sequence. |
Screen Navigation and Flow in HCI (Human-Computer Interaction)

Screen navigation and flow refer to how users move between screens in an application or website,
and how the screens are organized to support smooth, logical, and intuitive task completion.

Purpose of Screen Navigation and Flow

• Enable users to move between screens and features easily

• Support logical progression through tasks or processes

• Reduce confusion and frustration

• Ensure that users always know where they are and how to go back or forward

• Improve efficiency and user satisfaction

Types of Navigation

| Type | Description | Example |
|------|-------------|---------|
| Linear Navigation | Step-by-step screen progression in a fixed sequence. | Registration wizards, onboarding steps |
| Hierarchical Navigation | Movement from general screens to detailed ones (parent-child). | Home → Category → Product → Product Details |
| Global Navigation | Top-level links that are always accessible from any screen. | Navigation bar at the top (Home, Profile) |
| Local Navigation | Navigation within a section or module. | Tabs within a profile section |
| Breadcrumb Navigation | Shows user's location in the hierarchy and lets them go back. | Home > Products > Electronics > Mobile |
| Search-Based Navigation | Users jump directly to a screen/content via search. | Search bar on e-commerce apps |

Screen Flow

Screen flow is the path users follow to complete a task across multiple screens. It must be:

• Logical – Screens should appear in the order users expect

• Predictable – Transitions between screens should make sense

• Flexible – Users should be able to go back, skip, or redo steps


Best Practices for Navigation & Flow

| Aspect | Guideline |
|--------|-----------|
| Clarity | Use labels/icons that clearly indicate destination (e.g., "Settings" not "Stuff") |
| Feedback | Highlight current location (e.g., highlight active menu/tab) |
| Consistency | Keep navigation structure the same across screens |
| Minimize Depth | Avoid too many nested levels; users shouldn't click 5 times to find something |
| Back/Forward Controls | Provide clear options to go back, cancel, or continue |
| Avoid Dead Ends | Every screen should lead somewhere; don't trap the user |
| Visual Cues | Use arrows, tabs, or progress bars to show steps |

Simple Screen Flow Diagram Example

[Home Screen]
      ↓
[Category List]
      ↓
[Item List]
      ↓
[Item Details]
      ↓
[Add to Cart] → [Cart Summary] → [Checkout] → [Confirmation]
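A screen flow like the one above can be modeled as a directed graph and checked mechanically for dead ends, one of the pitfalls listed in the best-practices table. A small illustrative sketch (the screen names follow the diagram; the `dead_ends` helper is a hypothetical check, not part of any framework):

```python
# Screen flow as a directed graph: each screen maps to the screens
# reachable from it. "Confirmation" is a deliberate terminal screen.
flow = {
    "Home Screen": ["Category List"],
    "Category List": ["Item List", "Home Screen"],
    "Item List": ["Item Details", "Category List"],
    "Item Details": ["Add to Cart", "Item List"],
    "Add to Cart": ["Cart Summary"],
    "Cart Summary": ["Checkout", "Item List"],
    "Checkout": ["Confirmation", "Cart Summary"],
    "Confirmation": [],  # intended end of the task
}

def dead_ends(graph, terminals):
    """Screens with no outgoing transition that are not meant to be final."""
    return [s for s, nxt in graph.items() if not nxt and s not in terminals]

# The flow above traps nobody: every non-terminal screen leads somewhere.
assert dead_ends(flow, terminals={"Confirmation"}) == []
```

Encoding the flow as data also makes it easy to verify the "Back/Forward Controls" guideline: most screens above list their predecessor as a reachable target.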


### **Visually Pleasing Composition in Screen Design (HCI)**

A **visually pleasing composition** refers to the balanced and attractive arrangement of elements
on a screen that enhances **user experience**, **clarity**, and **aesthetic satisfaction** without
compromising functionality.

### **Goals of a Visually Pleasing Composition**

- Create a **positive emotional response**

- Increase **user engagement and retention**

- Make information **easy to process**

- Improve **brand perception**

- Support **usability and readability**

### **Key Principles for Visually Pleasing Screen Composition**

| **Principle** | **Explanation** |
|---------------|-----------------|
| **1. Alignment** | Elements should be visually connected through consistent edge or center alignment. |
| **2. Balance** | Distribute visual weight evenly (symmetrical or asymmetrical) for harmony. |
| **3. Contrast** | Use color, size, and shape to make important items stand out. |
| **4. Proximity** | Group related elements close together to form logical sections. |
| **5. Repetition** | Repeat colors, fonts, shapes for consistency and recognition. |
| **6. White Space (Negative Space)** | Leave enough space between elements to prevent clutter and improve focus. |
| **7. Hierarchy** | Guide the eye by making key elements bigger, bolder, or more colorful. |
| **8. Grid System** | Use a structured layout to organize elements proportionally and predictably. |
| **9. Typography** | Use readable and attractive fonts; vary size and weight for headings, subheadings, and body. |
| **10. Color Harmony** | Choose a balanced color palette that's pleasing and aligned with the brand. |

---

### **Example: Good vs. Bad Visual Composition**

| Good Design | Bad Design |
|-------------|------------|
| Balanced layout with white space | Cluttered and overcrowded screen |
| High contrast for readability | Low contrast text on background |
| Clear visual hierarchy | All text looks the same |
| Consistent button styles | Randomly styled buttons |
| Calming, coordinated color palette | Excessive colors without logic |
### **Example Visual Layout (Wireframe Style)**

+------------------------------------------------+

| Logo | Navigation Menu |

+------------------------------------------------+

| Header Image or Title |

+--------------------+ +-----------------------+

| Feature Box 1 | | Feature Box 2 |

+--------------------+ +-----------------------+

| Main Content Area (Text, Forms, etc.) |

+------------------------------------------------+

| Footer (Links, Contact Info, Legal) |

+------------------------------------------------+

Amount of Information in Screen Design (HCI)

The amount of information presented on a screen refers to how much data, text, images, and
controls are shown to the user at once. Striking the right balance is crucial for clarity, efficiency, and
usability.

Why Managing the Amount of Information Is Important

• Too much information = clutter, confusion, cognitive overload

• Too little information = under-informing the user, increasing navigation steps

• Right amount = clarity, focus, smooth task flow

Goals When Controlling Information Amount

| Goal | Explanation |
|------|-------------|
| Reduce Cognitive Load | Avoid overwhelming the user with too many choices or details at once |
| Improve Comprehension | Present information in digestible chunks |
| Support Decision-Making | Show only what's necessary for the current step |
| Enhance Navigation | Prevent unnecessary scrolling or hunting for information |


Techniques to Manage the Amount of Information

| Technique | Description |
|-----------|-------------|
| Progressive Disclosure | Show only essential info first, reveal more as needed (e.g., "Read more" links) |
| Chunking | Break info into small, grouped sections (like bullet points or cards) |
| Tabbed Views or Accordions | Use collapsible sections or tabs to organize content |
| Minimalism | Keep only essential elements; avoid distractions |
| Clear Hierarchy | Use headings, bold text, and spacing to prioritize key information |
| Consistent Terminology | Helps users quickly understand and remember concepts |

Miller’s Law in HCI

“The average person can hold 7 ± 2 items in working memory.”


Design implication: Avoid more than 5–9 interactive items or choices per screen.
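Miller's guideline can be applied mechanically when laying out choices, e.g. splitting a long option list into groups of at most seven items. A minimal sketch (the menu entries are made-up sample data):

```python
def chunk(items, size=7):
    """Split a flat list into groups of at most `size` items,
    following Miller's 7 +/- 2 guideline for working memory."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# 16 options at once would exceed the 5-9 guideline; chunking yields
# three smaller groups that can become sections, tabs, or menu columns.
menu = [f"Option {n}" for n in range(1, 17)]
groups = chunk(menu, size=7)
assert [len(g) for g in groups] == [7, 7, 2]
```

The same idea underlies the chunking and accordion techniques in the table above: the total amount of information is unchanged, but no single view asks the user to hold more than a handful of items in mind.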

Example Use Case: Payment Form

| Good Design | Poor Design |
|-------------|-------------|
| Shows only: Name, Card Info, Expiry, Pay Now | Shows terms, policies, input help, alternate plans |
| Optional info hidden under tooltips | All info shown at once, cluttered look |
| Visual hierarchy used | Equal weight for all text |

### **Focus and Emphasis in Screen Design (HCI)**

**Focus and emphasis** are visual design techniques used to **guide the user's attention** toward
the most important elements on the screen. They ensure that users don’t get distracted and can
easily identify key actions, messages, or data.
### **Purpose of Focus and Emphasis**

| **Goal** | **Explanation** |
|----------|-----------------|
| Draw attention to key content | Help users identify what to read, click, or act on first. |
| Guide the user journey | Lead users naturally from one element to the next. |
| Support task completion | Highlight primary buttons to prompt action. |
| Reduce confusion | De-emphasize secondary or irrelevant content. |
| Improve user experience | Makes the interface feel intuitive and well-structured. |

### **Techniques to Create Focus and Emphasis**

| **Technique** | **Effect** |
|---------------|------------|
| Color Contrast | Bright or bold colors on dull backgrounds draw attention. |
| Size and Scale | Bigger elements appear more important than smaller ones. |
| Position (Layout) | Elements at the top, center, or top-left are naturally more visible. |
| Whitespace (Negative Space) | Isolating elements creates visual focus. |
| Typography | Bold, italic, or large fonts draw attention to headings or keywords. |
| Animation or Movement | Subtle animation draws the eye. |
| Visual Hierarchy | Structure content so that the most important elements stand out first. |
| Borders and Highlights | Boxes, shadows, or highlights help group and emphasize content. |

### **Example: Login Screen**

| **Focused Element** | **Why** |
|---------------------|---------|
| "Login" button (in blue) | Draws attention with color and size |
| Username/Password fields | Placed centrally with labels and spacing |
| "Forgot Password?" (faded text) | Still visible but less emphasized – not the main action |

---
### **Presenting Information Simply and Meaningfully in HCI**

Presenting information **simply and meaningfully** is a core principle in Human-Computer Interaction (HCI). It ensures users can **easily understand, process, and act** on the content without confusion, delay, or frustration.

### **Purpose of Presenting Information Simply and Meaningfully**

- Enhance **readability** and **clarity**

- Help users **find what they need quickly**

- Minimize **cognitive load**

- Improve **decision-making** and **task completion**

- Build **trust and confidence** in the system

### **Principles for Simple & Meaningful Information Presentation**

| **Principle** | **Explanation** |
|---------------|-----------------|
| **1. Clarity** | Use plain language, familiar terms, and avoid technical jargon. |
| **2. Relevance** | Show only what is needed for the task; hide or minimize irrelevant data. |
| **3. Brevity** | Keep text and content concise: short sentences, clear labels. |
| **4. Structure** | Organize content into logical sections with headings, subheadings, and bullets. |
| **5. Visual Hierarchy** | Use font size, boldness, and color to show importance and order. |
| **6. Icons & Visual Aids** | Support text with helpful icons, infographics, or images. |
| **7. Consistency** | Use consistent labels, terminology, and visual styles across screens. |
| **8. Feedback and Help** | Offer hints, tooltips, or confirmation messages to support user understanding. |

---
### **Techniques for Effective Information Presentation**

| **Technique** | **Usage Example** |
|---------------|-------------------|
| **Chunking** | Break long information into smaller groups (e.g., paragraphs, cards, or steps). |
| **Tables & Lists** | Use structured formats for comparisons, steps, or multiple items. |
| **Color Coding** | Highlight categories, status, or urgency (e.g., red = error, green = success). |
| **Progressive Disclosure** | Show basic info first; expand for details (e.g., "Show more" link). |
| **Tooltips & Labels** | Add brief explanations without cluttering the main layout. |

---

### **Example: Meaningful vs Cluttered Presentation**

| **Good Presentation** | **Poor Presentation** |
|-----------------------|-----------------------|
| "Enter your name" label above input field | "Enter details" (vague and unclear) |
| Icons with tooltips ("What is CVV?") | Long paragraphs explaining every term on screen |
| Step-by-step progress bar for checkout | All checkout steps on one page without grouping |
Information Retrieval on the Web

Information retrieval (IR) on the web refers to the process of searching for and obtaining relevant
data from the vast resources available online. It involves locating, extracting, and presenting
meaningful information based on a user’s query.

Purpose of Web Information Retrieval

• To help users find specific answers or documents efficiently

• To filter and rank results based on relevance

• To support decision-making, learning, or research

• To enhance access to digital knowledge in various formats (text, video, images, etc.)

Key Components of Web Information Retrieval

| Component | Description |
|-----------|-------------|
| 1. User Query | The keywords or phrases entered by the user to find information |
| 2. Search Engine | A system (e.g., Google, Bing) that processes the query and finds matching content |
| 3. Web Crawler | Bots that browse the web and index pages for the search engine |
| 4. Indexing | Organizing and storing web content by keywords and metadata for faster searching |
| 5. Ranking Algorithm | Determines which pages are most relevant to the query (e.g., Google PageRank) |
| 6. Results Presentation | Displays the ranked list of web pages or snippets to the user |

How Web Information Retrieval Works

[User Input]
      ↓
[Query Parsing] → [Search Engine]
      ↓
[Index Matching] → [Ranking]
      ↓
[Search Results Displayed]
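The index-matching and ranking steps can be sketched end to end with a toy inverted index and a simple term-overlap ranking. This is a deliberately simplified illustration (the pages and their text are made up), not how a production search engine works internally:

```python
from collections import defaultdict

# Toy corpus standing in for crawled pages.
pages = {
    "page1": "solar energy panels and solar power",
    "page2": "wind energy turbines",
    "page3": "solar system planets",
}

# Indexing: map each term to the set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for term in text.lower().split():
        index[term].add(url)

def search(query):
    """Match query terms against the index; rank pages by how many
    distinct query terms they contain (a crude relevance score)."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for url in index.get(term, set()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

results = search("solar energy")
assert results[0] == "page1"  # the only page matching both terms
```

Real engines replace each step with far more sophisticated machinery (tokenization, TF-IDF or learned scoring, link analysis), but the crawl → index → match → rank pipeline is the same shape as the diagram above.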


Key Techniques Used in Web IR

| Technique | Function |
|-----------|----------|
| Boolean Search | Uses AND, OR, NOT to refine queries (e.g., "solar AND energy") |
| Natural Language Processing (NLP) | Understands the meaning of queries in natural human language |
| Keyword Matching | Matches query words with indexed content |
| Relevance Feedback | Improves search results based on user behavior (e.g., clicks, time on page) |
| Semantic Search | Understands the context/intent behind a query, not just exact keywords |
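The Boolean operators in the table map directly onto set operations over an inverted index. A self-contained toy example (the index contents are made up for illustration):

```python
# Pages containing each term: a toy inverted index.
index = {
    "solar":  {"page1", "page3"},
    "energy": {"page1", "page2"},
    "wind":   {"page2"},
}

# AND narrows (intersection), OR widens (union), NOT excludes (difference).
solar_and_energy = index["solar"] & index["energy"]   # "solar AND energy"
solar_or_wind    = index["solar"] | index["wind"]     # "solar OR wind"
energy_not_wind  = index["energy"] - index["wind"]    # "energy NOT wind"

assert solar_and_energy == {"page1"}
assert solar_or_wind == {"page1", "page2", "page3"}
assert energy_not_wind == {"page1"}
```

This is why Boolean retrieval was historically cheap to implement: once the index exists, each operator is a single set operation.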

Challenges in Web Information Retrieval

• Information overload

• Irrelevant or outdated results

• Ambiguous queries (e.g., "apple" = fruit or company?)

• Content duplicity or spam

• Privacy and bias in search results

Example: Searching for “climate change effects”

| Step | What Happens |
|------|--------------|
| You enter the query | Google processes your keywords |
| NLP analyzes it | Understands you want factual, up-to-date impacts |
| Indexes are searched | Finds pages containing relevant info on "climate change" + "effects" |
| Results are ranked | Based on page quality, relevance, authority, freshness |
| You get a list | With snippets, links, and images/videos on the topic |
Statistical Graphics

Statistical graphics are visual representations of data that help communicate complex quantitative
information in an easy-to-understand, insightful, and meaningful way. They are key tools in statistics,
research, data analysis, and presentations.

Purpose of Statistical Graphics

• To summarize large datasets visually

• To identify patterns, trends, and outliers

• To compare variables and relationships

• To support decision-making and communication of findings

Common Types of Statistical Graphics

| Type | Description | Example Use Case |
|------|-------------|------------------|
| 1. Bar Chart | Displays data using rectangular bars to show quantities | Comparing sales of products or exam scores |
| 2. Pie Chart | Circular chart divided into sectors representing proportions | Showing percentage share of different departments |
| 3. Line Graph | Connects data points with lines to show trends over time | Tracking temperature, stock prices, or growth |
| 4. Histogram | Shows frequency distribution of a variable | Distribution of student marks |
| 5. Scatter Plot | Plots data points to show relationships between two variables | Relationship between hours studied and exam scores |
| 6. Box Plot | Summarizes data using median, quartiles, and outliers | Comparing spread and skewness of multiple datasets |
| 7. Area Chart | Like a line graph, but the area under the line is filled | Visualizing volume or cumulative values |
| 8. Heat Map | Color-coded matrix to represent values | Correlation matrix or sales across locations |
| 9. Stem-and-Leaf Plot | Shows distribution while retaining actual data values | Small datasets in stats classes |
Example Scenario

Data: Marks of 100 students

• Use a histogram to show score distribution

• Use a box plot to compare performance of different sections

• Use a bar chart to show subject-wise averages
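The histogram step can be sketched in plain Python: bin each mark into a 10-mark range and count, which is exactly the data a histogram plots as bars. The marks here are randomly generated for illustration:

```python
# Bin 100 students' marks into ranges for a histogram.
# The marks are random; real data would come from the gradebook.
import random

random.seed(42)
marks = [random.randint(0, 100) for _ in range(100)]

def histogram(values, bin_width=10):
    """Count how many values fall into each bin: 0-9, 10-19, ..."""
    bins = {}
    for v in values:
        low = (v // bin_width) * bin_width
        label = f"{low}-{low + bin_width - 1}"
        bins[label] = bins.get(label, 0) + 1
    return bins

counts = histogram(marks)
assert sum(counts.values()) == 100  # every mark lands in exactly one bin

# Crude text rendering of the bar heights
for label in sorted(counts, key=lambda s: int(s.split("-")[0])):
    print(f"{label:>7}: {'#' * counts[label]}")
```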

### **Technological Considerations in Interface Design (HCI)**

In Human-Computer Interaction (HCI), **technological considerations** are the various **hardware, software, and platform-related factors** that influence how an interface is designed, built, and functions. Ignoring them can lead to poor usability, system incompatibility, or performance issues.

---

### **Why Technological Considerations Matter**

- Ensure **system compatibility**

- Provide **smooth performance**

- Enhance **user experience**

- Optimize for **different devices and platforms**

- Maintain **scalability and maintainability**

---

### **Key Technological Considerations**


1. Hardware
   - Device type (PC, mobile, tablet, kiosk, etc.): design must adapt to screen size, resolution, touch/mouse input, etc.
   - Input/output devices: interfaces should work with keyboards, touchscreens, speech, sensors, etc.
   - Processing power & memory: avoid heavy UI loads for low-end devices

2. Software
   - Operating systems (Windows, Android, iOS, Linux): UI behavior and layout must be OS-compatible
   - Browser or platform support: ensure the UI works across Chrome, Firefox, Safari, etc.
   - Backend integration: UI must work well with APIs, databases, and cloud systems

3. Network
   - Internet connectivity: design for offline and low-bandwidth usage if needed
   - Response time & loading: interfaces must respond quickly to avoid user frustration

4. Security
   - Authentication & encryption: UI must support secure logins, alerts for invalid actions, and session timeouts
   - Data privacy: design for consent, access control, and visibility of privacy-related features

5. Accessibility
   - Support for assistive technologies: design should support screen readers, high-contrast modes, and keyboard access

6. Scalability & Flexibility
   - Future-proofing the interface to adapt to growth or changes: responsive design, modular UI components

7. Development Tools
   - Frameworks and libraries: the choice of React, Angular, Flutter, etc. influences design capabilities

8. Internationalization
   - Multilingual support: the interface should support different languages, date formats, and RTL layouts
### **Example Scenario: Mobile Banking App UI**

Quick reference: choosing a statistical graphic

Bar chart: Comparison
Pie chart: Proportions
Line graph: Trends over time
Histogram: Frequency distribution
Scatter plot: Relationships between variables

Unit-3

In Human-Computer Interaction (HCI) and GUI design, windows play a central role in managing how information is presented and how users interact with applications. The topic "Windows – New and Navigation Schemes, Selection of Window" breaks down as follows:

1. What is a Window?

A window is a rectangular area on the screen that displays information and allows interaction with
the user. It can contain text, graphics, controls (buttons, input boxes), and even other windows.

2. Types of Windows:

• Primary window (Main window): The main interface of an application (e.g., Microsoft
Word’s main document window).

• Secondary window (Dialog box): Temporary pop-ups that request input or give information
(e.g., Save dialog).

• Modal window: Prevents interaction with other windows until closed.

• Modeless window: Allows switching between windows without closing the current one.

• Popup window: Used for tooltips, dropdowns, or context menus.

3. New Window Schemes:

When a user performs an action (like clicking a link or opening a file), the system must decide how
and where to open a new window.

Common schemes include:

• Replace current window: Useful for seamless transitions (e.g., a browser link).

• Open in a new window/tab: Good for multitasking or referencing multiple items.

• Open in a dialog box: For focused tasks (e.g., input form).

• Split view or pane window: Part of the current screen is reused (e.g., email client with inbox
and message preview).

4. Navigation Schemes in Windows:

Navigation refers to how users move through content or switch between windows.

Types of navigation schemes:


Navigation Type Description Example

Sequential Linear flow Installation wizards

Hierarchical Tree-like structure File Explorer

Direct (Random) Access any screen directly Web browsers (with tabs)

Modal navigation Controlled steps requiring input Settings dialogs

Tabbed navigation Use of tabs to manage multiple windows Web browsers, settings pages

5. Selection of Window:

When designing or using a system, choosing the correct window type depends on:

• Task complexity (simple vs. complex interactions)

• User control needed (modal vs. modeless)

• Space on screen (small devices vs. desktops)

• Multitasking needs (support for multiple windows)

• Consistency in navigation and look & feel

Principles for choosing windows:

• Use modal windows for critical tasks that need user focus.

• Use modeless windows when multitasking is necessary.

• Avoid clutter—don't open unnecessary new windows.

• Ensure clear navigation cues (back, close, cancel, etc.).
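These selection principles can be sketched as a small decision rule. The categories and their priority order here are an illustrative simplification, not a standard:

```python
# Sketch of the window-selection principles above as a decision rule.
# The inputs and their priority ordering are illustrative assumptions.

def choose_window(critical_task, multitasking, short_focused_input):
    if critical_task:
        return "modal dialog"          # force user focus until resolved
    if short_focused_input:
        return "dialog box"            # collect input for a focused task
    if multitasking:
        return "new window/tab"        # let users keep both contexts open
    return "replace current window"    # seamless transition, no clutter

print(choose_window(True, False, False))   # modal dialog
print(choose_window(False, True, False))   # new window/tab
print(choose_window(False, False, False))  # replace current window
```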

Diagram – Navigation & Window Types Overview:


1. What are Controls in UI Design?

Controls are interface elements that allow users to input information, make choices, or manipulate
content in a software system. Controls can be divided into:

• Device-Based Controls

• Screen-Based Controls

2. Device-Based Controls

These rely on hardware devices for user interaction.

Types of Device-Based Controls:

Device Control Description Example

Keyboard For textual input and shortcuts Typing, Ctrl + S to save

Mouse For pointing, clicking, dragging File selection

Joystick For directional movement Video games

Touchscreen Direct interaction with content Smartphones

Stylus Pen-like input device Drawing on tablets

Trackpad/Touchpad Gesture-based navigation Laptop cursor control

Selection Criteria for Device-Based Controls:

• Nature of the task (typing, pointing, drawing)

• User environment (mobile, desktop, fieldwork)

• Speed and precision requirements

• Physical abilities of users (e.g., for accessibility)

3. Screen-Based Controls

These are graphical elements on the screen that users interact with using a device (like a mouse or
touch).

Common Screen-Based Controls:

Control Type Description Example

Text Box Input area for text Username, password fields

Radio Buttons Select one from many Gender selection

Check Boxes Select multiple options Interests selection

Dropdown Lists Select one option from a list Country selection


Sliders Adjust a range or value Volume control

Buttons Trigger actions Submit, Cancel

Icons Represent commands or files Trash bin, file icons

Menus Organize commands File > Save

Tooltips Brief help on hover "Save this file" popup

4. Factors for Selection of Controls

Task Type
  Device-based: Typing → keyboard; pointing → mouse
  Screen-based: Choices → radio buttons/checkboxes; form entry → text box

User Experience
  Device-based: Advanced users may prefer keyboard shortcuts
  Screen-based: Beginners prefer visual buttons and menus

Frequency
  Device-based: Frequent tasks → use shortcuts
  Screen-based: Rare tasks → use labeled controls

Speed vs. Accuracy
  Device-based: Devices like the mouse offer precise selection
  Screen-based: Screen controls can simplify actions

Environment
  Device-based: Mobile users need touch-friendly controls
  Screen-based: Desktop users can use more complex menus

Accessibility
  Device-based: Use larger touch targets and alternatives
  Screen-based: Provide keyboard navigation support
1. UI Components Overview

UI Components are individual elements used in screen design to interact with users and present
information clearly and effectively. Major categories include:

• Text and Messages

• Icons

• Increases (enhancements or dynamic elements such as animations, scaling, or interactivity improvements)

2. Text and Messages

Types of Text in UI:

Type Purpose Example

Labels Identify fields or controls "Username", "Password"

Headings/Titles Indicate sections or pages "Login Page", "Settings"

Instructions Guide user on what to do "Enter your email address"

Error messages Inform user of incorrect actions "Invalid password"

Feedback messages Confirm actions "Your file has been saved"

Notifications System alerts "Battery low", "Update available"

Good Design Practices for Text:

• Clarity: Use simple and direct language.

• Consistency: Use consistent wording and style across the interface.

• Visibility: Ensure good contrast between text and background.

• Brevity: Keep messages short but informative.

• Tone: Maintain a friendly and professional tone.

3. Icons

Definition:

Icons are graphical symbols representing actions, files, apps, or features. They help users recognize
functions quickly.

Common Icon Types:


Icon Meaning

Trash bin Delete

Floppy disk Save

Envelope Mail/message

Magnifying glass Search

Gear Settings

Plus Add

Stop/Error Warning or error

Design Considerations:

• Should be easily recognizable

• Use universal symbols where possible

• Include tooltips for clarity

• Use consistent style (color, shape, line weight)

4. Increases (Enhancements in UI Elements)

Here, "Increases" refers to visual or functional enhancements that increase usability, interactivity, or engagement in UI design.

Types of Increases / Enhancements:

Enhancement: Description (Example)

Hover Effects: Changes appearance on mouse-over (button glows on hover)

Animations: Smooth transitions or feedback (loading spinner, slide-in menus)

Responsive Scaling: Auto-adjusts to screen size (mobile-friendly layout)

Tooltips: Text appears on hover for help ("Click here to upload file")

Dynamic Content: Content updates in real time (live chat, stock price updates)

Accessibility Features: Improve usability for all users (text resizing, screen reader support)

Benefits:

• Improves user engagement

• Enhances visual feedback


• Reduces user errors

• Makes interface more interactive and intuitive

Diagram Example (Text + Icons + Enhancements)

1. Multimedia in HCI

Definition:

Multimedia refers to the integration of multiple forms of media such as text, audio, video, graphics,
and animation to present information in a richer, more interactive way.

Uses of Multimedia:

Media Type Use

Text Basic information, instructions

Audio Alerts, guidance (e.g., voice commands)

Video Tutorials, demonstrations

Animation Explain processes, transitions

Graphics/Images Enhance visual appeal, support comprehension


Benefits in HCI:

• Improves user engagement

• Increases information retention

• Enhances accessibility (e.g., speech for visually impaired)

• Makes UI more interactive and appealing

2. Colors in User Interfaces

Uses of Colors in UI Design:

Purpose Color Usage Example

Attraction Bright colors for CTA (Call-To-Action) buttons

Categorization Tabs with different background colors

Feedback Green for success, red for error

Navigation Color-coded menus or highlights

Branding Consistent theme colors (e.g., Facebook Blue)

3. Problems with Using Colors

Common Issues:

Problem: Description

Overuse of colors: Can distract or overwhelm the user

Poor contrast: Makes text unreadable (e.g., yellow on white)

Cultural misunderstanding: Colors have different meanings (e.g., red = danger in the West, celebration in the East)

Color blindness: Some users may not distinguish certain colors (e.g., red-green color blindness)

Inconsistency: Using the same color for different meanings confuses users
4. Guidelines for Choosing Colors

Key Principles:

Guideline Explanation

High contrast Use light text on dark background or vice versa

Consistent color coding Same colors should mean the same thing

Use color with redundancy Combine color with shape or labels (e.g., a warning icon + red)

Test for color blindness Use simulators or safe palettes

Cultural sensitivity Consider the audience (e.g., red = warning in West)

Limit palette Use 3–5 core colors for cleaner UI
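The "high contrast" guideline can be checked numerically. The sketch below computes the WCAG 2.x contrast ratio from the standard relative-luminance formula; WCAG requires at least 4.5:1 for normal body text:

```python
# WCAG 2.x contrast-ratio check between a foreground and background
# color, following the spec's sRGB relative-luminance definition.

def channel(c):
    """Linearize one sRGB channel (0-255) per the WCAG formula."""
    c /= 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of lighter to darker luminance, offset by 0.05 each."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
# 21.0 — black on white, the maximum possible contrast

print(round(contrast_ratio((255, 255, 0), (255, 255, 255)), 2))
# ~1.07 — yellow on white, far below the 4.5:1 minimum
```

This is exactly why the table above flags yellow-on-white as a poor-contrast combination.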

Example (Login UI using multimedia + color):


Unit-4
HCI in the Software Process

(HCI = Human-Computer Interaction)

Definition:

HCI in the software process refers to how user interaction design is integrated into the software
development lifecycle (SDLC) to ensure the final product is usable, user-centered, and effective.

1. Why is HCI Important in Software Process?

Reason Explanation

User Satisfaction Makes software easy and pleasant to use

Efficiency Reduces time needed to complete tasks

Error Reduction Helps avoid mistakes through intuitive design

Goal Alignment Ensures user goals are met effectively

Accessibility Supports users with different needs and devices

2. Stages of the Software Process with HCI Integration

Stage HCI Role

1. Requirements Analysis Involve users, identify user needs, context of use

2. Design Create user-centered interfaces, wireframes, prototypes

3. Implementation Develop using usability principles (layouts, navigation, color)

4. Testing Conduct usability testing, user feedback, accessibility checks

5. Deployment Ensure onboarding, tooltips, help content for end users

6. Maintenance Collect feedback, make iterative UI/UX improvements

3. Key HCI Activities in the Software Process

Activity Description

User Research Understand who the users are and their needs
Personas & Scenarios Create user types and usage stories

Prototyping Build low/high-fidelity UI samples

Usability Testing Observe users to identify issues

Heuristic Evaluation Experts review the UI for usability problems

Iterative Design Refine designs based on real feedback

4. Role of Usability Engineering

Usability Engineering is a discipline within HCI that applies structured methods to improve user
interfaces.

Phases of Usability Engineering in SDLC:

1. Plan usability activities

2. Set usability goals (e.g., task time, error rate)

3. Test with real users

4. Evaluate and refine interfaces

5. HCI Models Used in Software Design

Model Purpose

User-Centered Design (UCD) Focus on user needs throughout the process

Participatory Design Users actively participate in design decisions

Iterative Design Continuous refinement through feedback

Gulf of Execution/Evaluation Identify gaps between user intentions and system response
+-----------------------------------+
|     1. Requirements Analysis      |
+-----------------------------------+
          ↓ (User Needs, Tasks, Context)
+-----------------------------------+
|            2. Design              |
|  - Wireframes                     |
|  - Mockups                        |
|  - User Flows                     |
+-----------------------------------+
          ↓ (User Feedback)
+-----------------------------------+
|        3. Implementation          |
|  - UI Development                 |
|  - Consistent Layouts             |
|  - Usability Principles           |
+-----------------------------------+
          ↓
+-----------------------------------+
|           4. Testing              |
|  - Usability Testing              |
|  - Accessibility Testing          |
|  - User Feedback                  |
+-----------------------------------+
          ↓
+-----------------------------------+
|          5. Deployment            |
|  - Help & Documentation           |
|  - Onboarding UI                  |
+-----------------------------------+
          ↓
+-----------------------------------+
|          6. Maintenance           |
|  - User Support                   |
|  - UI Updates                     |
|  - Iterative Improvements         |
+-----------------------------------+

Throughout the process:
  ↳ User-Centered Design Principles
  ↳ Iterative Design and Feedback Loops
  ↳ Involvement of Real Users


1. The Software Life Cycle

Also known as Software Development Life Cycle (SDLC), it includes the following phases:

1. Requirements Analysis – Understand what users need.

2. Design – Plan UI and interaction.

3. Implementation – Develop the interface.

4. Testing – Check usability, accessibility, functionality.

5. Deployment – Launch the software.

6. Maintenance – Fix issues, enhance usability over time.

HCI must be involved in all stages to ensure a user-friendly product.

2. Usability Engineering

A structured approach to improve the usability of systems.

• Involves user research, task analysis, prototyping, usability goals.

• Measurable: Time to complete a task, error rate, user satisfaction.

Focus: Efficiency, learnability, satisfaction.

3. Iterative Design and Prototyping

Iterative Design:

• Continuous cycle of design → test → evaluate → redesign.

• Uses real user feedback to refine the UI.

Prototyping:

• Low-fidelity: Sketches, paper mockups.

• High-fidelity: Interactive screen demos.

• Saves time and cost before full development.

4. Design Focus: Prototyping in Practice

• Create multiple prototypes before final design.

• Use tools like Figma, Sketch, or Adobe XD.

• Evaluate with users and iterate.

• Helps visualize layout, flow, and interactivity.


5. Design Rationale

It explains why certain design decisions were made.

• Documents trade-offs, user needs, and goals.

• Useful for team communication and future revisions.

Example:
“We used a hamburger menu to save space on mobile screens.”

6. Design Rules

Rules to ensure UI is usable, consistent, and effective.

Type Description

Principles High-level design goals

Standards Industry or platform-specific rules (e.g., iOS/Android)

Guidelines Best practices (e.g., WCAG for accessibility)

Heuristics Rules of thumb for good UI

7. Principles to Support Usability

From Ben Shneiderman, Nielsen, and Norman:

Principle Focus

Learnability Easy to learn

Efficiency Fast task completion

Memorability Easy to remember after a break

Error Prevention Avoid mistakes or help users recover

Satisfaction Pleasant user experience

8. Standards

Predefined norms set by:

• ISO (e.g., ISO 9241 for usability)

• WCAG (Web Content Accessibility Guidelines)

• Platform guidelines (Google Material Design, Apple Human Interface Guidelines)

Helps maintain consistency and quality across platforms.


9. Golden Rules and Heuristics

Shneiderman’s Eight Golden Rules:

1. Strive for consistency

2. Enable frequent users to use shortcuts

3. Offer informative feedback

4. Design dialog to yield closure

5. Offer error prevention and simple error handling

6. Permit easy reversal of actions

7. Support internal locus of control

8. Reduce short-term memory load

Nielsen’s 10 Heuristics:

• Visibility of system status

• Match between system and real world

• User control and freedom

• Consistency and standards

• Error prevention

• Recognition over recall

• Flexibility and efficiency of use

• Aesthetic and minimalist design

• Help users recognize and recover from errors

• Help and documentation

10. HCI Patterns

Reusable solutions to common usability problems. Like design patterns in software engineering.

Pattern Use

Breadcrumb Navigational trail

Wizard Step-by-step guidance

Undo/Redo Task reversal



Infinite Scroll Continuous content load

Modal Dialog Force action before proceeding

11. Evaluation Techniques

Formative Evaluation (During development):

• Cognitive walkthrough

• Heuristic evaluation

• User testing (early prototypes)

Summative Evaluation (After development):

• Usability testing with metrics

• A/B testing

• Surveys and feedback forms

Technique Purpose

Heuristic Evaluation Expert reviews using known rules

User Testing Real users try the interface

Surveys & Interviews Gather user opinions

Analytics Tools Track real-time usage data

Summary Diagram: HCI in Design & Evaluation Process

[User Research]
        ↓
[Prototype Design]  ← iterative feedback loop →
        ↓
[Usability Goals & Heuristics]
        ↓
[Testing & Evaluation]
        ↓
[Refinement & Final Design]


Goals of Evaluation in HCI

Evaluation in Human-Computer Interaction (HCI) is the process of assessing how well a system or
interface supports users in achieving their goals. It is a critical phase in the design and development
of interactive systems.

Key Goals of Evaluation:

1. Assess Usability

o To determine if the system is easy to learn, efficient, and satisfying for users.

o Involves testing learnability, efficiency, memorability, error handling, and user


satisfaction.

2. Identify Usability Problems

o To uncover issues that may hinder user interaction or cause errors.

o Helps improve navigation, layout, feedback mechanisms, etc.

3. Ensure User Needs are Met

o Confirms that the system supports user tasks, expectations, and preferences.

o Aligns design with real-world use cases.

4. Improve the User Interface Design

o Provides feedback for refining interface elements like buttons, menus, layouts, and
workflows.

5. Validate Design Decisions

o To justify and confirm that the design choices (e.g., color, icon placement, interaction
flow) are effective.

6. Support Iterative Development

o Helps implement a cyclical design-evaluation-redesign process.

o Continuous feedback leads to a more refined final product.

7. Ensure Accessibility

o Verifies that the interface is usable by people with disabilities or impairments (e.g.,
visual, motor, cognitive).

8. Measure User Performance

o Quantitatively evaluate metrics such as:

▪ Time taken to complete a task


▪ Number of errors

▪ Success rate

▪ User satisfaction ratings

9. Compare Design Alternatives

o Used to evaluate different design options or versions of a prototype (e.g., A/B


testing).

10. Enhance User Satisfaction

• Ultimately aims to create a system that users find pleasurable, intuitive, and productive.

## **Evaluation through Expert Analysis**

**Expert analysis** is a **usability evaluation method** in which **HCI experts or usability professionals** examine the user interface to identify potential usability issues **without involving end users directly**.

### **Definition:**

> Evaluation through expert analysis is a method in which trained usability experts analyze an
interface based on established principles, guidelines, or heuristics to identify design flaws.

## **Types of Expert Evaluation Methods**

Method: Description

1. Heuristic Evaluation: Experts evaluate the interface using predefined usability heuristics (e.g., Nielsen's 10 heuristics).

2. Cognitive Walkthrough: Experts simulate the user's thought process while performing tasks, step by step, to find usability issues.

3. Guidelines Review: Experts check whether the design adheres to usability standards or style guides.

4. Consistency Inspection: Analysts ensure the interface is consistent in terms of layout, colors, actions, etc.

5. Formal Usability Inspection: A team of experts follows a structured procedure to review the interface and document issues.

## **Goals of Expert Analysis**

- Identify potential usability problems early in the design phase

- Reduce cost and time by finding issues without user testing


- Validate adherence to design principles and standards

- Improve user interface before releasing for actual use

## **Advantages of Expert Analysis**

- Fast and cost-effective

- Requires fewer resources than user testing

- Detects **violations of usability principles**

- Good for **early-stage prototypes**

Evaluation through User Participation

Definition:

Evaluation through user participation is a method in which real users of a system are directly
involved in testing and evaluating the interface to assess its usability, effectiveness, and user
satisfaction.

This type of evaluation provides first-hand insights into how actual users interact with the system.

Objectives:

• Identify real user problems and confusion areas

• Measure task success rate, time taken, error frequency

• Gather user feedback for improvements

• Understand user behavior and expectations

• Improve the overall user experience (UX)


Common User-Based Evaluation Techniques:

Technique: Description

Usability Testing: Users perform typical tasks while observers record difficulties and errors.

Field Observation: Users are observed in their natural working environment.

A/B Testing: Two interface versions are tested with users to compare performance.

Surveys and Questionnaires: Collect subjective feedback (e.g., satisfaction, clarity).

Interviews: One-on-one discussions to understand user experiences in depth.

Think-Aloud Protocol: Users speak their thoughts while performing tasks to reveal reasoning and confusion.

Key Metrics Evaluated:

• Task completion rate

• Time on task

• Number of errors

• User satisfaction (via rating scales)

• Navigation issues

Example:

In a usability test for an e-learning platform, students are asked to:

• Register for a course

• Find a lesson

• Take a quiz

The observer notes:

• Time taken for each task

• Mistakes (e.g., clicking the wrong button)

• User comments and confusion points

Afterward, a feedback form is given to collect their thoughts.
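The metrics from a test like this can be summarized directly from session records. The session data below is invented for illustration:

```python
# Summarize the key usability metrics listed above from per-user
# session records of a usability test. All numbers are invented.

sessions = [
    {"completed": True,  "seconds": 42, "errors": 0, "satisfaction": 5},
    {"completed": True,  "seconds": 65, "errors": 2, "satisfaction": 3},
    {"completed": False, "seconds": 90, "errors": 4, "satisfaction": 2},
    {"completed": True,  "seconds": 50, "errors": 1, "satisfaction": 4},
]

n = len(sessions)
completion_rate = sum(s["completed"] for s in sessions) / n
avg_time = sum(s["seconds"] for s in sessions) / n
total_errors = sum(s["errors"] for s in sessions)
avg_satisfaction = sum(s["satisfaction"] for s in sessions) / n

print(f"Task completion rate: {completion_rate:.0%}")   # 75%
print(f"Average time on task: {avg_time:.2f} s")        # 61.75 s
print(f"Total errors observed: {total_errors}")         # 7
print(f"Mean satisfaction: {avg_satisfaction:.2f}/5")   # 3.50/5
```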


Expert Analysis vs. User Participation

Who evaluates?
  Expert analysis: HCI experts or usability professionals
  User participation: Real end users of the system

Purpose
  Expert analysis: Detect usability problems using design principles
  User participation: Understand how real users interact with the system

Cost
  Expert analysis: Usually lower
  User participation: Higher (logistics, user recruitment, compensation)

Time
  Expert analysis: Faster
  User participation: Slower and more time-consuming

Stage used
  Expert analysis: Early in design (low/high-fidelity prototypes)
  User participation: Mid or late design stages, near a functional system

Tools used
  Expert analysis: Heuristics, guidelines, checklists
  User participation: Usability testing, think-aloud, surveys

Type of issues found
  Expert analysis: Theoretical usability violations
  User participation: Practical, real-world usage problems

Output
  Expert analysis: Expert opinion and reports
  User participation: User feedback, performance data, satisfaction ratings

Accuracy
  Expert analysis: Depends on expert knowledge
  User participation: Reflects real user experience

[Define Objectives]
        ↓
[Select Target Users]
        ↓
[Choose Evaluation Method]
      ↓               ↓
 Usability Test   Interviews/Surveys
      ↓               ↓
[Observe User Behavior]
        ↓
[Collect Data (errors, time, feedback)]
        ↓
[Analyze Results]
        ↓
[Identify Usability Issues]
        ↓
[Refine Interface Design]

Choosing an Evaluation Method in HCI
Evaluation in HCI helps assess whether an interface is usable, effective, and meets user needs.
Selecting the right method depends on project goals, development stage, time, resources, and user
availability.

Factors to Consider When Choosing an Evaluation Method


Factor: Description

Development Stage: Early stage = expert review or walkthrough; later stage = user testing

Time and Budget: Limited time = heuristic evaluation; more time = full user studies

Availability of Users: If real users are available, use usability testing; if not, rely on expert analysis

Purpose of Evaluation: Finding usability flaws = heuristic evaluation/cognitive walkthrough; measuring performance = usability testing

Type of Interface: Mobile, web, desktop, and embedded systems may require different tools and techniques

Data Required: Quantitative (time, error rate) or qualitative (opinions, preferences)

Types of Evaluation Methods and When to Choose Them


1. Heuristic Evaluation
• Use when: Experts are available, system is in early stages
• Pros: Quick, cost-effective
• Cons: May not reflect real user behavior
2. Cognitive Walkthrough
• Use when: Analyzing task flow and learnability for new users
• Pros: Good for early design evaluation
• Cons: Limited to specific tasks
3. Usability Testing (User Testing)
• Use when: Real users are available, system is functional
• Pros: Gives real-world insights
• Cons: Time-consuming, needs user recruitment
4. Surveys and Interviews
• Use when: You want user opinions, preferences, or feedback after use
• Pros: Easy to collect broad feedback
• Cons: May not reveal usability problems directly
5. Field Studies / Observations
• Use when: Studying users in their natural work environment
• Pros: High ecological validity
• Cons: Can be hard to control variables
6. A/B Testing
• Use when: Comparing two versions of a design
• Pros: Measures real-world performance
• Cons: Needs large user base and deployed product
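A minimal sketch of how A/B results are compared, using a two-proportion z-test on invented conversion counts:

```python
# Toy A/B comparison: conversion rates of two interface versions,
# checked with a two-proportion z-test. All counts are invented.
from math import sqrt

a_users, a_conversions = 1000, 120   # version A
b_users, b_conversions = 1000, 160   # version B

p_a = a_conversions / a_users
p_b = b_conversions / b_users

# Pooled proportion and standard error under the null hypothesis
p_pool = (a_conversions + b_conversions) / (a_users + b_users)
se = sqrt(p_pool * (1 - p_pool) * (1 / a_users + 1 / b_users))
z = (p_b - p_a) / se

print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 5% level;
# here z ≈ 2.58, so version B's improvement is unlikely to be chance.
```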
Decision Tree for Choosing an Evaluation Method

Universal Design in HCI


Definition:
Universal Design is the process of creating interfaces, products, and environments that are usable by
all people, to the greatest extent possible, without the need for adaptation or specialized design.
It aims to make systems inclusive for users of all ages, abilities, and backgrounds.
Objectives of Universal Design:
• Ensure accessibility for users with disabilities
• Make systems usable by both novice and expert users
• Design for diverse devices and contexts (e.g., mobile, desktop, kiosks)
• Support global usability, regardless of culture, language, or skill level
Principles of Universal Design (as applied to HCI):
1. Equitable Use: Useful to people with diverse abilities (HCI example: screen reader support, captions in videos)
2. Flexibility in Use: Accommodates a wide range of preferences (keyboard + mouse + touch input options)
3. Simple and Intuitive: Easy to understand, regardless of experience (clear navigation, self-explanatory icons)
4. Perceptible Information: Communicates effectively to all users (visual + audio alerts, good contrast)
5. Tolerance for Error: Minimizes hazards and consequences (undo buttons, confirmation prompts)
6. Low Physical Effort: Used comfortably with minimal effort (one-click actions, voice control)
7. Size and Space for Approach: Appropriate for interaction regardless of body size/mobility (touch-friendly buttons, scalable UI)
Why Universal Design is Important in HCI:
• Promotes equal access for users with disabilities
• Complies with accessibility standards (e.g., WCAG, ADA)
• Enhances usability for all users, not just people with disabilities
• Reduces the need for multiple versions or adaptations
• Future-proofs systems for aging populations and technology changes
Difference: Universal Design vs. Accessibility
Universal Design Accessibility
Inclusive design for everyone Focused on users with disabilities
Built-in from the start Often added later as adaptation
Proactive Reactive
Example in HCI:
• A mobile app that supports:
o Voice commands
o Large font options
o Colorblind-friendly mode
o Easy undo options
Multi-modal Interaction
Definition:
Multi-modal interaction refers to the use of multiple modes of input and output to communicate
with a system.
It enables users to interact using speech, touch, gestures, eye movement, or even facial expressions,
often simultaneously or interchangeably.
Examples of Modalities:
Mode Input Example Output Example
Speech Voice commands Voice responses (e.g., Siri, Alexa)
Touch Tapping/swiping on screen Vibrations (haptic feedback)
Gesture Hand movements (e.g., Kinect) Visual signals (animations)
Keyboard/Mouse Typing or pointing Text or cursor feedback
Vision/Eye Tracking Gaze control Highlighting focused items
Facial Expression Smiles, frowns (emotion detection) Adjust screen content or behavior

Benefits of Multi-modal Interaction:


• Increases accessibility (e.g., voice for visually impaired)
• Provides flexibility for different users and contexts
• Supports natural human communication styles
• Reduces cognitive and physical load
Unit -5
Cognitive Models
Cognitive models are theoretical representations of how the human mind processes information.
They help predict how users think, learn, and interact with systems.
Common Types:
1. GOMS Model (Goals, Operators, Methods, Selection rules)
o Used to analyze user performance in task execution.
2. Human Processor Model
o Models the brain like a computer: perceptual, cognitive, and motor processors.
3. ACT-R Model (Adaptive Control of Thought-Rational)
o Simulates memory, problem-solving, and decision-making.
Purpose:
• Predict user behavior
• Improve usability
• Estimate task performance time
• Design efficient user interfaces

Goals
A goal is what the user wants to achieve during an interaction with a system.
Examples:
• Saving a file
• Booking a train ticket
• Sending an email
Goals can be:
• High-level (e.g., plan a trip)
• Low-level (e.g., click "submit" button)

Task Hierarchies
A task hierarchy breaks down a goal into subtasks, actions, and operations. It shows the
step-by-step structure of how a goal is achieved.
Example: Goal → Send an Email
Goal: Send an email
└── Task: Open email client
└── Subtask: Click on email app icon
└── Task: Compose email
└── Subtask: Click "Compose"
└── Subtask: Enter recipient, subject, and message
└── Task: Send email
└── Subtask: Click "Send"
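The hierarchy above maps naturally onto a tree data structure. A small sketch (node names are just the steps from the example) that walks the tree to list the lowest-level user actions:

```python
# Each node is (name, children); leaves are concrete user actions.
task = ("Send an email", [
    ("Open email client", [("Click on email app icon", [])]),
    ("Compose email", [
        ("Click 'Compose'", []),
        ("Enter recipient, subject, and message", []),
    ]),
    ("Send email", [("Click 'Send'", [])]),
])

def leaf_actions(node):
    """Depth-first list of the lowest-level actions under a goal."""
    name, children = node
    if not children:
        return [name]
    actions = []
    for child in children:
        actions.extend(leaf_actions(child))
    return actions

print(leaf_actions(task))
```

Flattening the tree like this is also the first step of task-analysis techniques such as GOMS, which operate on the sequence of leaf-level actions.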
Design Focus: How GOMS Saves Money in System Design

The GOMS model (Goals, Operators, Methods, Selection rules) is a cognitive modeling technique
used in HCI and usability engineering to analyze how users interact with interfaces. Here's how
GOMS helps reduce costs and improve design efficiency:

GOMS = Cost-Saving Design Tool


Benefit Area              | How GOMS Helps                                | How It Saves Money
1. Predictive Analysis    | Simulates user actions before implementation  | Reduces need for expensive usability testing
2. Design Optimization    | Identifies unnecessary steps and UI friction  | Reduces development time and effort
3. Performance Estimation | Estimates task completion time accurately     | Prevents costly redesign due to poor efficiency
4. Error Reduction        | Highlights areas prone to user mistakes       | Lowers support and maintenance costs
5. Task Comparison        | Compares multiple interface methods           | Guides cheaper, more effective design choices
6. Early Feedback         | Can be used even in mock-up/prototype stages  | Saves cost from post-launch corrections

Example: Email System (Traditional vs. GOMS-based Design)


Task                           | Traditional UI | GOMS-Based UI
Clicks to send email           | 8              | 5
Time to complete task          | 25 sec         | 15 sec
Estimated support issues/month | 40             | 10
Cost saving/month              | ₹15,000–₹30,000 (in dev/support resources)
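One concrete way GOMS-family analysis produces such time estimates is the Keystroke-Level Model (KLM), which sums standard operator times. The operator values below are the commonly cited KLM constants (Card, Moran and Newell); the two task encodings are hypothetical illustrations, not measurements of a real UI:

```python
# KLM operator times in seconds:
# K = keystroke, P = point with mouse, B = mouse button press,
# H = move hand between keyboard and mouse, M = mental preparation.
KLM = {"K": 0.2, "P": 1.1, "B": 0.1, "H": 0.4, "M": 1.35}

def estimate(ops):
    """Predicted task time: the sum of the operator times."""
    return sum(KLM[op] for op in ops)

# Hypothetical encodings of "send a short email" in two designs:
traditional = ["M", "P", "B", "M", "P", "B", "H"] + ["K"] * 20 + ["M", "P", "B"]
goms_based  = ["M", "P", "B"] + ["K"] * 20 + ["P", "B"]

print(round(estimate(traditional), 2))
print(round(estimate(goms_based), 2))
```

Comparing the two totals before any code is written is exactly the cost-saving argument: the slower design is rejected on paper rather than after user testing.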

What Are Linguistic Models?


They are models that:
• Describe the structure of language (syntax, semantics, phonology, etc.)
• Explain how humans interpret and produce language
• Help machines understand human speech or text

Types of Linguistic Models


Model Type            | Description                              | Example
Phonological Models   | Deal with sounds of language             | Speech recognition systems
Syntactic Models      | Analyze sentence structure (grammar)     | Grammar checkers
Semantic Models       | Focus on meaning of words and sentences  | Search engines, AI chatbots
Pragmatic Models      | Consider context and user intent         | Virtual assistants like Siri
Discourse Models      | Understand conversation flow and coherence | Chatbots, email summarizers
Statistical/ML Models | Use data to learn patterns in language   | GPT, BERT, ChatGPT, etc.

Applications in HCI
Area Linguistic Model Use
Speech Interfaces Understanding spoken commands
Chatbots/Assistants Parsing queries, generating replies
User Documentation Simplifying complex instructions
Error Messages Making language more natural and helpful
Multilingual Systems Adapting UI to different languages and structures
Example: Simple Linguistic Interaction
Task: User wants to book a train ticket
• User says: "Book me a train from Delhi to Jaipur on Friday."
• System uses models to:
o Parse syntax (find subject, verb, object)
o Extract semantics (what is being asked)
o Interpret intent (make a booking)
o Check pragmatics (verify Friday’s date)
o Respond appropriately ("Train booked for Friday at 6:00 AM.")
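A minimal sketch of the parse/semantics/intent stages for this request (real systems use statistical NLU models, not hand-written rules like these):

```python
import re

def understand(utterance):
    """Toy pipeline: detect intent, then extract slots (origin, destination, day)."""
    lowered = utterance.lower()
    intent = "book_train" if "book" in lowered and "train" in lowered else "unknown"
    route = re.search(r"from (\w+) to (\w+)", utterance, re.IGNORECASE)
    day = re.search(
        r"\b(Monday|Tuesday|Wednesday|Thursday|Friday|Saturday|Sunday)\b",
        utterance, re.IGNORECASE)
    return {
        "intent": intent,
        "origin": route.group(1) if route else None,
        "destination": route.group(2) if route else None,
        "day": day.group(1) if day else None,
    }

print(understand("Book me a train from Delhi to Jaipur on Friday."))
```

The extracted structure (intent plus slots) is what the system then uses to check pragmatics (which Friday?) and to respond.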

The Challenge of Display-Based Systems in HCI


Display-based systems (like GUIs, dashboards, mobile UIs) rely on visual output to communicate with
users. While powerful and user-friendly, they come with specific challenges in terms of design,
usability, and user cognition.

Key Challenges in Display-Based Systems


1. Information Overload: Too much data on screen overwhelms the user; leads to confusion or slow decision-making.
2. Visual Clutter: Poor layout, inconsistent fonts, or excessive colors distract from important content.
3. Screen Real Estate Limits: Limited screen size (especially on mobile) requires efficient design and space management.
4. Navigation Complexity: Poorly organized menus or multiple levels of navigation confuse users.
5. Inconsistent Design: Variations in button styles, icons, or workflows disrupt user expectations.
6. Accessibility Issues: Visual-only systems can exclude users with vision impairments or color blindness.
7. Context Awareness: Displays may not adapt to the user’s task, environment, or device being used.
8. Latency and Feedback: Delays in display updates or lack of visual feedback reduce user confidence.
9. Cognitive Load: Users must remember where things are or how to use features, increasing mental effort.
10. User Diversity: One design may not suit all users with different skill levels, languages, or preferences.

Example:
A poorly designed dashboard might:
• Show 10 graphs at once (overload)
• Use similar colors for all graphs (confusion)
• Require 3 clicks to reach the logout button (poor navigation)
A well-designed one:
• Prioritizes most-used data
• Uses clear labels and color codes
• Has a simple and predictable layout

Physical and Device Models in HCI


In Human-Computer Interaction (HCI), physical and device models help designers understand how
users interact with hardware devices (like a mouse, keyboard, touchscreens, etc.) and how physical
limitations affect usability.

1. Physical Models
These models focus on the physical constraints of the human body that impact interaction with
computer systems.
What They Consider:
• User’s body: hand size, reach, posture
• Motor skills: speed, accuracy, fatigue
• Ergonomics: comfort and physical effort
• Fitts’ Law: predicts time to move and select a target
Example:
Using a mouse to click a small button —
→ Fitts’ Law says: the smaller and farther a button is, the longer it takes to click it.
Design Use:
• Optimizing button sizes and placement
• Designing for hand-held or wearable devices
• Minimizing repetitive strain (RSI)
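Fitts’ Law is usually written MT = a + b * log2(D/W + 1) (the Shannon formulation), where D is the distance to the target and W its width. A small sketch with illustrative, not empirically fitted, coefficients a and b:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds for a target of size `width`
    at distance `distance` (same units). a and b are device-dependent
    constants normally fitted from experiments; these are placeholders."""
    return a + b * math.log2(distance / width + 1)

# A small, far button takes longer to hit than a large, near one:
print(round(fitts_time(distance=800, width=20), 3))   # small, far target
print(round(fitts_time(distance=200, width=100), 3))  # large, near target
```

This is why designers enlarge frequently used targets and place them close to where the pointer usually is (e.g., context menus at the cursor).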

2. Device Models
These describe how users interact with input/output devices, and how devices translate user
actions into system input.
Focus Areas:
• Input devices: mouse, touchpad, keyboard, stylus, gamepad, etc.
• Output devices: display, haptic feedback, audio
• Device behavior: precision, latency, sensitivity
• Interaction style: direct (touchscreen) vs. indirect (mouse)
Example:
• A touchscreen allows direct interaction, but may cause finger fatigue or imprecision in small
targets.
• A stylus allows precise input, useful for drawing or handwriting.
Design Use:
• Matching device to user tasks (e.g., mouse for desktop, touch for tablets)
• Designing gestures and input methods
• Adapting UI to device limitations (like screen size or touch accuracy)

Comparison Table
Feature        | Physical Models                     | Device Models
Focus          | Human body capabilities & limits    | Hardware and device interaction
Concerned with | Ergonomics, movement, muscle effort | Input/output methods, accuracy, responsiveness
Example Theory | Fitts’ Law, Hick’s Law              | Device throughput, control-display ratio
Use in Design  | Adjusting layouts and sizes         | Choosing suitable input/output devices

Cognitive Architectures

Definition
A cognitive architecture is a framework for building models that simulate human cognitive
processes like memory, perception, attention, learning, and decision-making.

Goals of Cognitive Architectures


• Understand how the human mind works
• Simulate human-like behavior in software and robots
• Improve user interface design by predicting user behavior
• Support intelligent agents and AI systems

Key Components of a Cognitive Architecture


Component Description
Perception Processes external stimuli (vision, sound, etc.)
Working Memory Holds current information (limited capacity)
Long-Term Memory Stores knowledge and experience
Decision Making Chooses actions based on rules or goals
Learning Updates knowledge through experience or feedback
Motor Control Executes physical actions (typing, clicking, etc.)
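The decision-making component in architectures such as ACT-R is typically a production system: rules fire when their conditions match the contents of working memory. A toy sketch of that control loop (the rule set and memory contents are invented for illustration):

```python
# Working memory holds the current goal and partial state.
working_memory = {"goal": "type_word", "word": "hi", "typed": ""}

# Production rules: (condition, action) pairs over working memory.
rules = [
    # Type the next character of the word.
    (lambda wm: wm["goal"] == "type_word" and len(wm["typed"]) < len(wm["word"]),
     lambda wm: wm.update(typed=wm["typed"] + wm["word"][len(wm["typed"])])),
    # Word complete: mark the goal as achieved.
    (lambda wm: wm["goal"] == "type_word" and wm["typed"] == wm["word"],
     lambda wm: wm.update(goal="done")),
]

# Recognize-act cycle: fire the first matching rule until the goal is done.
while working_memory["goal"] != "done":
    for condition, action in rules:
        if condition(working_memory):
            action(working_memory)
            break

print(working_memory)
```

Real architectures add timing, learning, and conflict resolution on top of this cycle, but the match-fire loop is the common core.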

Popular Cognitive Architectures


Architecture | Description
ACT-R (Adaptive Control of Thought - Rational) | Models human cognition with separate modules (memory, vision, etc.) and a central production system. Widely used in HCI to simulate user actions.
SOAR | Problem-solving and decision-making focused. Used in robotics and intelligent agents.
EPIC (Executive Process-Interactive Control) | Models multitasking behavior and perception-motor coordination.
CLARION | Simulates both conscious and unconscious processing.

Ubiquitous Computing and Augmented Reality in HCI


Both ubiquitous computing and augmented reality (AR) are modern interaction paradigms in
Human-Computer Interaction (HCI) that focus on making technology seamless, natural, and
embedded in everyday life.

1. Ubiquitous Computing (Ubicomp)


Definition:
Ubiquitous computing refers to the concept where computing is integrated into the environment,
allowing users to interact with technology anytime, anywhere, without being consciously aware of it.
Coined by Mark Weiser (1991) — "The most profound technologies are those that disappear."

Key Features:
• Pervasive: Computers are embedded everywhere (homes, vehicles, wearables).
• Invisible: Interactions happen naturally, without direct user input.
• Context-aware: Systems adapt based on location, time, user activity, etc.
• Seamless integration: Devices communicate and cooperate with minimal effort from users.

Examples:
• Smart homes (automated lighting, temperature)
• Fitness bands (track activity without manual input)
• Smart assistants (Google Assistant, Alexa)
• IoT-enabled environments (factories, hospitals)
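The context-aware behavior in examples like the smart home can be sketched as a rule function over sensed context. The sensor fields and thresholds below are made up for illustration, not taken from any real platform:

```python
def lighting_action(context):
    """Decide what the lights should do, given the sensed context.
    `context` is assumed to carry occupancy, ambient light (lux), and hour."""
    if not context["occupied"]:
        return "lights_off"          # nobody home: save energy
    if context["lux"] < 50 and context["hour"] >= 18:
        return "lights_warm_dim"     # dark evening: comfortable warm light
    if context["lux"] < 50:
        return "lights_on"           # dark daytime: normal light
    return "no_change"               # bright enough already

print(lighting_action({"occupied": True, "lux": 20, "hour": 21}))   # dark evening
print(lighting_action({"occupied": False, "lux": 20, "hour": 21}))  # nobody home
```

The point is that the user never issues a command: the system reads location, time, and activity and acts invisibly, which is Weiser's "disappearing" technology.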

2. Augmented Reality (AR)


Definition:
Augmented reality overlays digital content (text, images, 3D models) on the real-world environment
in real time to enhance perception and interaction.

Key Features:
• Real + Virtual: Combines real-world scenes with virtual objects.
• Interactive: Users can manipulate digital content in real space.
• Real-time: Updates and responds instantly to user actions or surroundings.

Examples:
• AR filters on Instagram or Snapchat
• Google Lens (recognize objects and give data)
• IKEA Place app (see furniture in your room)
• AR in education (anatomy apps, lab simulations)
• Heads-Up Displays in vehicles or AR glasses

Comparison Table
Feature           | Ubiquitous Computing                | Augmented Reality
Focus             | Computing embedded in environment   | Enhancing real world with digital layers
User Interface    | Often invisible or passive          | Visual, spatial, interactive
Interaction Style | Automatic, context-aware            | Real-time user interaction
Hardware          | Sensors, IoT devices, wearables     | AR glasses, smartphones, cameras
Goal              | Seamless integration of technology  | Visual enrichment of user experience

Ubiquitous Computing: Applications and Research Areas

Ubiquitous computing (Ubicomp) is a computing paradigm where technology is seamlessly
integrated into the environment, enabling smart, context-aware, and invisible interaction between
people and computers.

Key Applications of Ubiquitous Computing


1. Healthcare

• Wearable health monitors (e.g., Fitbit, ECG sensors)

• Smart pill dispensers and reminder systems

• Ambient assisted living for the elderly

• Real-time patient monitoring in hospitals

2. Smart Homes

• Automated lighting, HVAC, and appliances

• Voice-controlled assistants (e.g., Alexa, Google Home)

• Presence-aware systems (lights turn on when you enter)

• Home security systems with sensors and cameras

3. Smart Transportation

• Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication

• Real-time traffic and route optimization

• Self-driving cars using environmental sensors

• Public transport tracking apps

4. Education

• Context-aware learning environments (e.g., AR-enabled labs)

• Smart classroom systems with automatic attendance and feedback

• Location-based content delivery (e.g., museum guides)

• Interactive learning materials delivered via wearables or mobile

5. Workplace and Industry (Smart Offices)

• Energy-saving building automation

• Sensor-enabled workflow tracking

• Real-time employee location and resource usage

• Smart meeting room booking systems

6. Retail and Shopping

• Smart shelves and inventory management

• Proximity marketing (offers sent when near a store/product)

• Automated checkout systems (Amazon Go-style stores)


• Personalized product recommendations in-store

Current Research Areas in Ubiquitous Computing

1. Context Awareness

• Understanding user behavior, location, time, and environment

• Context-aware app adaptation (lighting, notifications, etc.)

2. Human-Centric AI

• AI that learns from user habits, preferences, and context

• Personalized assistants and predictive systems

3. Privacy and Security

• Securing personal data in constantly connected environments

• Preventing unauthorized tracking and data collection

4. Interoperability and Standards

• Making diverse devices and systems communicate seamlessly

• Research into common protocols (e.g., Matter, Zigbee)

5. Cognitive and Emotional Interaction

• Systems that sense and respond to mood, stress, or focus

• Emotion-aware computing (useful in health, learning, therapy)

6. Low-Power and Energy-Efficient Systems

• Developing sensors and devices that use minimal energy

• Self-powered devices (solar, motion, heat)

7. Wearable and Implantable Devices

• Integration with human body for health, AR, and communication

• Research into bio-compatible materials and miniaturization

Design Focus: Ambient Wood — Augmenting the Physical World

Ambient Wood is a research project in the field of ubiquitous computing and augmented reality,
designed to explore how digital augmentation can enhance learning and interaction with natural
environments, particularly in education.
What is Ambient Wood?

Ambient Wood was a project by the University of Sussex and the University of Nottingham (UK),
where children explored a woodland environment augmented with digital technology to enhance
science learning through discovery and interaction.

Design Goals of Ambient Wood

Design Focus Description

Enhance physical experience Use digital tools to make the natural world more informative

Support discovery learning Encourage curiosity-driven exploration and real-world interaction

Seamless augmentation Make technology blend into the environment, not distract

Context-aware interaction Sensors react to location, movement, and actions of users

Collaborative learning Allow students to explore together and discuss observations

How It Works (Key Technologies Used)

Component                | Role in Design
Environmental Sensors    | Detect light, temperature, humidity, and feed data to learners
Mobile Devices / PDAs    | Provide children with prompts, audio, or questions based on location
RFID/Location Tracking   | Help the system know where the learner is in the forest
Digital Content Overlays | Add sound, text, or video about trees, wildlife, or ecology
Ambient Displays         | Show changes or feedback based on user interaction (e.g., audio feedback when a tree is touched)
Virtual Reality (VR) vs. Augmented Reality (AR) – In HCI and Design

1. Virtual Reality (VR)

Definition:

VR is a fully immersive digital environment that replaces the real world with a computer-generated
simulation.

Key Characteristics:

• Users are completely cut off from the real world.

• Requires headsets like Oculus Rift, HTC Vive, or Google Cardboard.

• Interactions occur in 3D virtual spaces.

• Often includes controllers, motion sensors, and haptic feedback.

Applications:

Field VR Use Case

Education Virtual labs, history simulations

Training Flight simulators, medical surgery practice

Gaming Immersive 3D games

Therapy Exposure therapy for phobias

Architecture Virtual walkthroughs

2. Augmented Reality (AR)

Definition:

AR overlays digital content (text, images, 3D models) onto the real-world environment in real-time.

Key Characteristics:

• Real world is still visible and central.

• Uses smartphones, tablets, or AR glasses (e.g., Microsoft HoloLens, Google Glass).

• Combines physical space with interactive virtual elements.

• Often used for guidance, labeling, visualization, and interaction.

Applications:

Field AR Use Case


Education Anatomy visualizers, interactive books

Retail Try-before-you-buy (clothes, furniture)

Maintenance Real-time repair instructions (AR manuals)

Gaming Pokémon Go, AR treasure hunts

Healthcare Surgical overlay guidance, vein visualizers

Comparison Table: VR vs AR

Feature Virtual Reality (VR) Augmented Reality (AR)

Environment Fully virtual, immersive Real world + digital overlay

Hardware VR headsets, controllers Smartphone, AR glasses, tablets

Interaction With simulated world With real world + virtual objects

User isolation User is isolated from real world User stays connected to real environment

Typical Use Cases Training, simulation, gaming Assistance, learning, navigation, shopping

Mobility Often requires a fixed setup Highly mobile and flexible

Design Focus: Shared Experience in HCI

What is a Shared Experience?

A shared experience is when two or more users engage with a system together, experiencing the
same content, space, or task, either simultaneously or sequentially, to achieve a common
understanding or goal.

Design Goals for Shared Experience

Goal Description

Collaboration Support Enable users to work together seamlessly (e.g., co-editing, gaming)

Real-Time Interaction Ensure simultaneous updates are visible to all users instantly
Awareness Let users know what others are doing (e.g., cursors, highlights)

Equity of Experience All participants get equal access and interaction ability

Presence & Engagement Design to make users feel connected and "together"

Design Elements That Support Shared Experience

Element                  | Example
Shared Display           | Interactive whiteboards, multitouch tables
Collaborative Interfaces | Google Docs, Figma, online whiteboards
Synchronous Input        | Multiplayer games, co-browsing, shared AR sessions
Embodied Interaction     | Gestures or movement sensed in physical space
Audio/Visual Feedback    | Voice chat, video call, avatars, cursors
Mixed Reality (MR)       | Combining physical and digital space for collaboration (e.g., Hololens in design teams)
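The "Awareness" goal (letting users see what others are doing) is often implemented as shared session state that each client renders for the other participants. A minimal sketch, not modeled on any particular collaboration API:

```python
class SharedSession:
    """Tracks each participant's cursor so clients can render the others."""

    def __init__(self):
        self.cursors = {}  # user name -> (x, y) position

    def update_cursor(self, user, pos):
        self.cursors[user] = pos

    def awareness_view(self, for_user):
        """What `for_user` should see: everyone else's cursor, not their own."""
        return {u: p for u, p in self.cursors.items() if u != for_user}

s = SharedSession()
s.update_cursor("asha", (10, 4))
s.update_cursor("ben", (3, 7))
print(s.awareness_view("asha"))  # {'ben': (3, 7)}
```

In a real co-editing tool the same pattern carries selections and edits, broadcast to all clients so updates appear for everyone at once.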

Example Use Cases

Domain Shared Experience Design Example

Education Students using AR to explore science models together

Healthcare Remote surgeons collaborating using VR overlays

Design & Engineering Teams co-designing 3D models in virtual environments

Gaming Multiplayer VR games like Beat Saber or Rec Room

Museums Group exploration using shared AR devices or smart tables


Information and Data Visualization in HCI

Definition

Information visualization is the graphical representation of abstract data to reinforce human
cognition. Data visualization is the use of visual techniques to display quantitative or statistical
data.

Goals of Visualization in HCI

Goal Description

Simplify complexity Turn large, complex datasets into easy-to-understand visuals

Enhance cognition Support human memory, attention, and reasoning

Support decision-making Enable quick insight and evidence-based action

Facilitate exploration Let users interact with data for deeper understanding

Improve usability Help users comprehend system status, progress, or performance

Types of Data Visualizations

Type Use Case Example

Bar chart Comparing categories Sales by region

Line chart Showing trends over time Temperature over months

Pie chart Showing parts of a whole Market share %

Scatter plot Correlation between variables Height vs. weight

Heatmap Intensity/frequency data Website click zones

Tree map Hierarchical data File storage usage

Network graph Relationships/connections Social networks

Timeline Temporal events Project milestones

Principles of Effective Visualization

1. Clarity – Keep visuals simple and easy to read

2. Relevance – Show only the data needed for the user’s task

3. Consistency – Use consistent scales, colors, labels

4. Accuracy – Avoid misleading visual proportions or distortions

5. Interactivity – Let users explore, filter, zoom, and drill down
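The scaling step behind any bar chart (the accuracy principle: bar length proportional to value) can be shown without a plotting library. A dependency-free sketch:

```python
def ascii_bars(data, width=20):
    """Render {label: value} as text bars scaled so the largest value
    spans `width` characters; any plotting library does the same scaling."""
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:<8}{bar} {value}")
    return "\n".join(lines)

sales = {"North": 120, "South": 60, "East": 90, "West": 30}
print(ascii_bars(sales))
```

Because every bar is divided by the same peak value, the visual proportions match the data, which is what the accuracy principle demands.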


Design Focus: Getting the Size Right in HCI

In Human-Computer Interaction (HCI), getting the size right means designing interface elements
(like buttons, text, icons, menus) with the appropriate dimensions so that users can interact
efficiently, accurately, and comfortably across different devices and contexts.

Why Size Matters in Design

Reason                | Explanation
Touch accuracy        | Elements must be large enough to be tapped or clicked easily (especially on mobile)
Readability           | Fonts must be big enough to read without straining the eyes
Accessibility         | Users with disabilities or elderly users may struggle with small controls
Fitts’ Law compliance | Targets should be sized and placed for faster and easier interaction
Visual hierarchy      | Size indicates importance and guides attention (e.g., headings vs. body text)
Device responsiveness | Sizes must adapt across screen sizes and resolutions

Design Guidelines for Sizing

1. Touch Targets

• Minimum size: 44 x 44 pixels (Apple), 48 x 48 dp (Google Material Design)

• Ensure spacing between targets to avoid accidental taps

2. Font Sizes

• Body text: 16px (web standard)

• Headings: Gradually larger, e.g., 20–32px

• Use relative units (em/rem) for responsiveness

3. Icons and Buttons

• Avoid decorative icons smaller than 24px

• Buttons should have enough padding (not just visible size)

4. Interactive vs. Non-interactive elements

• Interactive elements should be larger and bolder than static ones
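The touch-target minimums above lend themselves to an automated lint check during design review. A sketch using the Apple/Material values quoted earlier; the element list and field names are hypothetical:

```python
# Platform minimum target side lengths (Apple HIG: 44pt; Material: 48dp).
MIN_SIDE = {"ios": 44, "android": 48}

def undersized(elements, platform):
    """Return names of tappable elements smaller than the platform minimum."""
    limit = MIN_SIDE[platform]
    return [e["name"] for e in elements
            if e["tappable"] and (e["w"] < limit or e["h"] < limit)]

ui = [
    {"name": "send_btn", "w": 48, "h": 48, "tappable": True},
    {"name": "close_x",  "w": 24, "h": 24, "tappable": True},
    {"name": "logo",     "w": 16, "h": 16, "tappable": False},  # not interactive
]
print(undersized(ui, "android"))  # ['close_x']
```

Note that non-interactive elements (like the logo) are exempt: the minimums apply to tap targets, not to all visuals.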

Examples in Real Design

Context Good Size Practice


Mobile apps Buttons are thumb-friendly (not too small or too close)

Websites Fonts and images resize well on tablets and desktops

Kiosks/ATMs Large on-screen buttons for quick access

Wearables Interface adapts to tiny screen with gestures or voice
