Usability Evaluation

Why?
- Organizational perspective:
  - To make a better product: Is it usable and useful? Does it improve productivity?
  - Reduce development and support costs
- Designer & developer perspective:
  - Understanding the real world: What should it do?
  - Compare designs
  - Is it good enough? Does it do what it should do?

Who?
- Who is your target population?
- Do you want expert usability advice or user feedback?
- Who do you have access to? Who are you going to select to evaluate?
- Who will conduct the evaluation?

What?
- What do you want to know? What do users want?
- Is the product usable? Is it useful?
- What does the system need to do?
- How easy is the product to learn? How powerful is the product for expert users?
- Does it do what it claims to?
- What problems occur when using the system?
- What scope of the product will you evaluate? Full functionality vs. prototype components

Where?
- Where will the study take place? Field study or lab study?

When?
- When in the design/development cycle will your evaluation take place?
- Will there be several iterations of evaluation?
- How long will it last?

Usability Methods

Three families: Inquiry, Inspections, Testing (www.best.com/~jthom/usability/)

Usability Methods - Inquiry
- Contextual Inquiry
  - Understanding the context in which a product is used
  - The user is a partner in the design process
  - The usability design process must have a focus
- Field Observation
  - Artifacts are physical objects in use at a site
  - Outcroppings are noticeable physical traits that characterize the site
- Interviews and Focus Groups
  - Ask users about their experiences and preferences with your product
  - Focus groups enable you to identify common problems that people experience

Usability Methods - Inquiry (continued)
- Surveys
  - Ad hoc interviews, where the interviewer asks a set list of questions and records the responses
- Questionnaires
  - A written list of questions that you distribute to your users
- Journaled Sessions
  - Often used as a remote inquiry method for software user interface evaluation
- Self-Reporting Logs
  - Paper-and-pencil journals where users log their actions and observations while interacting with a product
- Screen Snapshots
  - The user takes screen snapshots at different times during the execution of a task or a series of tasks

Usability Methods - Inspection
- Heuristic Evaluation
  - Usability specialists judge whether each element of a UI follows established usability principles
- Cognitive Walkthrough
  - The person conducting the walkthrough constructs scenarios from a specification or early prototype and role-plays the part of a user working with that interface
- Formal Usability Inspections
  - Adapts the software inspection methodology to usability evaluation
  - Typical roles: moderator, owner, recorder, and inspectors
- Pluralistic Walkthroughs
  - Meetings where users, developers, and usability professionals step through a scenario, discussing and evaluating each element of interaction
- Feature Inspection
  - Analyze the feature set of a product, usually given end-user scenarios
- Consistency Inspections
  - Used to ensure consistency across multiple products from the same development effort

Usability Methods - Inspection (continued)
- Standards Inspections
  - Ensure compliance with industry standards (e.g. software products for the Windows environment should have common elements, such as the same functions on the File menu, Help menu, etc.)
- Guideline Checklists
  - Usually used in conjunction with many of the usability inspection methods; the checklist gives the inspectors a basis by which to compare the product

Usability Methods - Testing
#1 Determine what you are trying to find out
- What do you want to know? Describe this goal in a few objectives
#2 Design your test
- Identify your users: they should match the expected user population as closely as possible
- Determine sample size: cost vs. precision; rule of thumb: at least 10 subjects
- Identify your variables: independent variables (controlled) and dependent variables (measured)
- Your design must also account for possible confounding factors

Usability Methods - Testing (continued)
#2 Design your test (continued)
- Experimental design: how you will order and run your experiment to eliminate non-interesting variables from the analysis
  - between-groups and within-groups designs
  - random assignment and counterbalancing
- Develop the tasks the users will perform
- Specify the experiment equipment, the methods and data collection techniques to be used, and the physical set-up
- Identify required research personnel
- Decide how you will analyze the recorded results (e.g. will you generate statistics?)
#3 Get your users
- Selection from the population: how narrow will your selection be? What is the selection procedure?
- Gather information from your subjects
#4 Set up the test
- Set up all components of the equipment and make sure everything is working
#5 Run the test and collect the data
- Follow your experimental design EXACTLY
#6 Analyze the data
- Find the big problems first
- Summarize the quantitative and qualitative data gathered
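
The counterbalancing mentioned in the design step can be sketched in code. This is a minimal illustration with hypothetical task names: a cyclic Latin square of task orders ensures each task appears in each serial position equally often across participants, so order effects do not systematically favour one task.

```python
# Minimal counterbalancing sketch (task names are hypothetical).
# A cyclic Latin square: row i starts with task i, so across the
# square each task occupies each serial position exactly once.

def latin_square(tasks):
    """Return one task order per row of a cyclic Latin square."""
    n = len(tasks)
    return [[tasks[(i + j) % n] for j in range(n)] for i in range(n)]

def order_for(participant, orders):
    """Assign participants to orders round-robin by index."""
    return orders[participant % len(orders)]

orders = latin_square(["search", "edit", "export"])
# Participant 0 runs search, edit, export; participant 1 runs
# edit, export, search; and so on.
```

With a multiple-of-3 number of participants, every order is used equally often.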

Usability Methods - Testing (continued)
- Thinking-aloud protocol: ask the participant to vocalize his/her thoughts, feelings, and opinions while interacting with the product
- Co-discovery method: have two participants perform the task together
- Question-asking protocol: prompt the user to vocalize his/her thoughts, feelings and opinions by asking direct questions about the product
- Performance measurement: a usability test designed to gather hard, quantitative data

Theory vs. Reality
- Time constraints
- Financial constraints
- Technology constraints
- Working with human subjects; ethical approval
- Video data
- Planning, planning and more planning
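
Performance measurement, as listed above, boils down to recording quantitative data per task. A minimal sketch, with a hypothetical `run_task` stand-in for a real instrumented task:

```python
import time

# Performance-measurement sketch: record task completion time and
# error count for one participant. run_task is a hypothetical
# stand-in that performs the task and returns the errors observed.

def measure(run_task):
    start = time.perf_counter()
    errors = run_task()                       # task runs here
    elapsed = time.perf_counter() - start     # completion time in seconds
    return {"time_s": elapsed, "errors": errors}

result = measure(lambda: 2)  # pretend the participant made 2 errors
```

In a real study each participant-task pair would yield one such record, which is then aggregated into the quantitative summary produced in the analysis step.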

Methodology
- Methodology: techniques used to measure phenomena, manipulate phenomena, or control the impact of various phenomena
- Methods: the tools (instruments, techniques, and procedures) used to gather and analyze information
  - Each method can do different things
  - Methods are inherently flawed, though each is flawed differently; using multiple methods adds strength by offsetting each method's weaknesses
  - Credible empirical knowledge requires consistency or convergence of evidence across studies based on different methods

Research Strategies
Research evidence in the social and behavioral sciences involves somebody doing something in some situation:
- Actors (who): humans (individuals, groups, organizations, communities) whose behaviour is being studied
- Behaviour (what): all aspects of states and actions of the humans that might be of interest
- Context (when & where): all relevant temporal, locational, and situational features of the environment

3 main criteria
A. Generalizability: how generalizable your results are to a population of actors
B. Precision: how precise your measurement of behaviours is, as well as how precise your control is over extraneous factors that are not being studied
C. Realism: how realistic the situation or context in which you gathered the evidence is, relative to the contexts to which you want your evidence to apply
Try to maximize all three criteria.

Field strategies: study natural behaviours
- Field study: make direct observations of natural, ongoing systems while intruding on and disturbing those systems as little as possible (e.g. ethnography, case studies)
- Field experiment: works within an ongoing natural system as unobtrusively as possible, except that it manipulates one major feature in order to assess its effect on other behaviours of the system; a compromise strategy: the researcher gives up some of the unobtrusiveness of the plain field study to gain more precision

Experimental strategies: involve concocted rather than natural settings
- Lab experiment: deliberately constructs a situation or context, defines the rules, and then has individuals perform tasks related to the concocted system; can study behaviours with high precision by controlling extraneous factors
- Experimental simulation: constructs a situation but makes it closely resemble a real situation (e.g. flight simulators); a fine line between precision and realism

Respondent strategies: systematic gathering of responses
- Sample survey: obtain evidence to estimate the distribution of some variables within a specific population, through careful sampling of actors from that population and systematic eliciting of responses about the material of interest (e.g. public opinion surveys)
- Judgment study: the researcher concentrates on obtaining information about the properties of a certain set of stimuli (e.g. psychophysical studies)

Theoretical strategies: non-empirical strategies that do not involve actors behaving in context
- Formal theory: the researcher focuses on formulating general relations among a number of variables of interest; these hypotheses are intended to hold over a broad range of populations
- Computer simulation: attempts to model a real-world system; the system is complete and closed, without any behaviour by any participants, so any behavioural outcomes are logical predictions from the theory the researcher built into the model

Strategic issues
- Each strategy has inherent weaknesses and potential strengths
- Since all strategies are flawed, but flawed in different ways, you need to use more than one strategy, carefully selected so they complement each other in their strengths and weaknesses

Comparison techniques
- Baserates: how often (at what rate, or what proportion of the time) does Y occur?
  - If you don't know how often Y occurs in the general case, how can you decide whether the rate of Y in some particular case is notably high or low? (e.g. a high rate of birth defects among infants born to women who worked jobs involving continual use of video display tubes)
  - Debate over baserates is common in the behavioural sciences: how much is too much?
- Differences: asks whether Y is present (or at a high value) under conditions where X is present (or at a high value), or vice versa
- Correlations: asks whether there is systematic covariation in the values of two (or more) properties
  - If X is high for some case, is it likely that Y will also be high for that case? And if X is low, is it likely that Y will also be low?
  - A high positive correlation between X and Y means that when X is high, Y is also likely to be high, and when X is low, Y is also likely to be low
  - A high negative correlation between X and Y means high X values indicate low Y values, and vice versa
  - Low or no correlation implies that knowing X gives no indication of the likely value of Y: X has no predictive power with respect to Y; X and Y do not covary
  - It is also possible to have a strong, nonlinear correlation
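
The correlation idea above can be made concrete with Pearson's r, the standard measure of linear covariation: values near +1 mean high X goes with high Y, values near -1 mean high X goes with low Y, and values near 0 mean X has no linear predictive power for Y. A minimal sketch:

```python
import math

# Pearson's r for paired observations of X and Y: covariance of the
# pairs divided by the product of their standard deviations.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfect positive covariation: r = 1.0
r_pos = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
# Perfect negative covariation: r = -1.0
r_neg = pearson_r([1, 2, 3, 4], [8, 6, 4, 2])
```

Note that r only captures linear covariation; the strong nonlinear correlations mentioned above would need other measures.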

Randomization
- You can't control all potential factors; random assignment of cases to conditions helps to control these extraneous factors and strengthens the credibility of the information
- Effectiveness depends on the number of cases being allocated
- Random allocation procedure: each case must be equally likely to end up in any given combination of conditions
- Randomization does not guarantee an equal distribution of any or all potentially extraneous factors; instead it makes an unequal distribution highly unlikely (but not impossible)
- If you do end up with an unequal distribution, this could confound your results

Sampling
- The basis for choosing the cases to include in your study out of a larger population of potential cases; it affects the credibility of your results
- Statistical reasoning requires that the cases in the study be a random sample of the population to which the results apply
- Ask not only "do I have a random sample?" but "what is the nature of the population of which I actually have a random sample?"
- You do not actually select a random sample; you select a sample using a random procedure, with no guarantee that the resulting sample will mirror the population
- Sample size is important for the credibility of experimental results
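
The random allocation procedure can be sketched in a few lines. This is an illustrative sketch, not a complete protocol: shuffling the case pool makes every case equally likely to land in any condition, and splitting the shuffled pool yields equal-sized groups; as noted, this makes unequal distributions of extraneous factors unlikely, not impossible.

```python
import random

# Random assignment sketch: shuffle the pool of cases with a random
# procedure, then deal them out into equal-sized condition groups.
# The group count and case identifiers are illustrative.

def random_assign(cases, n_conditions, seed=None):
    pool = list(cases)
    random.Random(seed).shuffle(pool)       # each ordering equally likely
    # Deal shuffled cases round-robin into the conditions.
    return [pool[i::n_conditions] for i in range(n_conditions)]

groups = random_assign(range(12), n_conditions=2, seed=42)
```

The fixed seed is only there to make the sketch reproducible; a real allocation would omit it.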

Validity
- Internal validity: the degree to which the results of a study permit you to make strong inferences about causal relations; did the effect occur by chance? Are other factors causing the effects? Depends on how well you can rule out all other plausible rival hypotheses
- Construct validity: how well defined your theoretical ideas are
- External validity: how confident you are that your results will hold upon replication, and how generalizable the results are; impacted by features of the study (size, nature, sampling, setting, procedures, etc.); no one study has external validity alone

Data collection methods
- Self-Reports (e.g. a participant fills out a questionnaire): participants always know that their behaviour is being recorded
- Observations (e.g. watching someone interact with a piece of software and recording behaviours of interest): records of behaviour made by the investigator, an assistant, or a physical instrument
  - Observations by a Visible Observer: participants know that they are being observed
  - Observations by a Hidden Observer: participants do not know that they are being observed

Data collection methods (2)
- Archival records: analyze material in existing archives; the records were gathered external to the research activities (e.g. public records of births, deaths, marriages, etc.)
  - Records of Public Behaviour: participants were aware that the behaviour was likely to be recorded and used (e.g. political speeches)
  - Records of Private Behaviour: the participant may not know their behaviour will be used later for research (e.g. a diary), or the results are not affected by the participant's awareness that the results will become public record (e.g. statistical data)

Data collection methods (3)
- Trace measures: records of behaviour made by the behaviour itself, but without the participants being aware that they are making the record (e.g. dog-eared pages)
  - Like self-reports in that the participants do the recording
  - Unlike self-reports in that participants are normally not aware that there will be a record of their behaviour and that it will be used for research purposes

Self-reports
- Questionnaire responses, interview protocols, rating scales, paper-and-pencil tests, etc.; the most frequently used type of measure
- Advantages: versatile; low in set-up cost and subsequent cost-per-case; low dross rates (little information gets discarded)
- Disadvantages: potentially reactive; the participants are aware their behaviour is being recorded for the researcher's purposes, which may influence how they respond; they may try to make a good impression, give socially desirable answers, or help (or hinder) the researchers in getting the answers they are looking for, whether consciously or unconsciously; as a result, self-reports don't tend to be very useful forms of evidence on their own

Observations
- Watching or recording a participant's behaviour
- Advantages: versatile; the data recorded can be very rich
- Disadvantages: also potentially reactive; the participants are aware their behaviour is being observed, which may influence their behaviour; observer error; usable only for overt behaviour, not for thoughts, feelings or expectations; costly in time and resources, with a high dross rate; the use of hidden observers raises ethical concerns

Trace measures
- Physical evidence of behaviour left behind as unintended residue of past behaviour
- Advantages: unobtrusive and therefore not reactive
- Disadvantages: not versatile, and not available for many types of data we would like to study; difficult to pinpoint the exact cause of the evidence; time consuming to gather and process; costly, with high dross rates
- While this is a potentially strong method, it has been used very little in the social and behavioral sciences

Archival records
- Census data, court proceedings, diaries, material from newspapers, magazines, radio, and TV, administrative documents, contracts, etc.
- Advantages: lower cost; may be the only way to gain evidence on past behaviours or on very large social units
- Disadvantages: reactive (public behaviour); limited versatility; high dross rates; only a loose linkage between the record and the concept it represents; no opportunity to cross-validate your findings

Techniques for manipulating variables
- Selection: make sure all cases of a condition are alike on a certain variable (e.g. all six-year-olds; all male or all female)
  - Advantage: convenient
  - Disadvantage: not a true experiment; you can't determine why the groups are different, just that the difference may cause an effect

Techniques for manipulating variables (2)
- Manipulation of variables of the system: manipulate a given variable and randomly assign participants to each condition
  - Advantages: not likely to be costly or time consuming; low dross rates
  - Disadvantages: applicable only to overt and tangible variables; unintended experimental demands (hints as to what the researcher really wants); the Hawthorne effect

Techniques for manipulating variables (3)
- Induction: manipulation by less direct interventions
  - #1 misleading instructions to the participant
  - #2 use of false feedback
  - #3 use of experimental confederates
  - Advantage: if done correctly, can minimize reactive behaviour
  - Disadvantages: involves deception, which raises ethical issues; can backfire if the participant figures out the deception

Observational techniques
- Thinking-aloud protocol: ask the participant to vocalize his/her thoughts, feelings, and opinions while interacting with the product
- Co-discovery method: have two participants perform the task together
- Question-asking protocol: prompt the user to vocalize his/her thoughts, feelings and opinions by asking direct questions about the product
- Performance measurement: a usability test designed to gather hard, quantitative data

Protocol analysis
- Field notes (paper and pencil): cheap, flexible; hard to get detailed information; the researcher must be present, which is obtrusive
- Audio recordings: can be unobtrusive; useful when the user is thinking aloud; difficult to record other related information to synchronize later; time-consuming analysis

Protocol analysis (2)
- Video recordings: can be unobtrusive; can see everything within camera range, though limited to what the camera can look at; a different perspective than actually being there; time-consuming analysis
- Computer logging: cheap, easy, unobtrusive; good for longitudinal studies; difficult with off-the-shelf software; can only give the system perspective (it doesn't know what else is going on in the environment); produces a large amount of data
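
Computer logging, as described above, amounts to capturing timestamped user events from inside the system. A minimal sketch, with hypothetical event names and an in-memory log standing in for a real log file:

```python
import json
import time

# Computer-logging sketch: append timestamped UI events to a log.
# A real logger would stream each line to a file; here events are
# kept in memory for illustration. Event names are hypothetical.

class EventLog:
    def __init__(self):
        self.events = []

    def record(self, action, **details):
        """Store one event with a wall-clock timestamp."""
        self.events.append({"t": time.time(), "action": action, **details})

    def dump(self):
        """One JSON object per line, ready to write to a log file."""
        return "\n".join(json.dumps(e) for e in self.events)

log = EventLog()
log.record("click", target="save_button")
log.record("keypress", key="ctrl+z")
```

This illustrates both strengths and weaknesses noted above: the log is cheap and unobtrusive, but it records only what the system can see, and even a short session produces many lines to analyze.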

Protocol analysis (3)
- User notebooks: can gather unusual or infrequent feedback; useful for longitudinal studies; less detail (at a coarse level)
- Interpreted records