Date of Award

1-1-2016

Document Type

Dissertation

Degree Name

Ph.D.

Organizational Unit

Morgridge College of Education, Research Methods and Information Science, Research Methods and Statistics

First Advisor

Kathy E. Green, Ph.D.

Second Advisor

Donald Bacon

Third Advisor

Duan Zhang

Fourth Advisor

Krystyna Matusiak

Keywords

Measurement, Usability, User experience

Abstract

Consumers spend an increasing amount of time and money online finding information, completing tasks, or making purchases. The quality of the website experience has become a key differentiator for organizations, affecting whether visitors make a purchase and how likely they are to return and recommend a website to friends. Two instruments were created to measure the quality of the website user experience more effectively and, in turn, to help organizations improve that experience.

Three studies used Classical Test Theory (CTT) to create a new instrument that measures the quality of the website user experience from the website visitor's perspective. Data were collected over five years from more than 4,000 respondents reflecting on experiences with more than 100 websites. An eight-item questionnaire of website quality was created: the Standardized User Experience Percentile Rank Questionnaire (SUPR-Q). The SUPR-Q contains four factors: usability, trust, appearance, and loyalty. The factor structure was replicated across three studies, with data collected both during usability tests and retrospectively in surveys. There was evidence of convergent validity with existing questionnaires, including the System Usability Scale (SUS). An initial distribution of scores across the websites generated a database used to produce percentile ranks and make scores more meaningful to researchers and practitioners. In Study 4, confirmatory factor analysis (CFA) on a new set of data confirmed the factor structure and generated alternative items that work on non-e-commerce websites. The SUPR-Q can be used to generate reliable scores when benchmarking websites, and the normed scores indicate how well a website scores relative to others in the database.
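To make the percentile-rank idea concrete, the sketch below shows one way a website's mean questionnaire score could be compared against a normative database of previously benchmarked sites. The item responses, normative scores, and 1-5 response scale are illustrative assumptions, not data or code from the dissertation.

    # Minimal sketch of percentile-rank scoring against a normative database.
    # Item responses, normative scores, and the 1-5 scale are hypothetical.
    from statistics import mean
    from bisect import bisect_left

    # Hypothetical normative database: mean scores for benchmarked websites.
    NORM_DB = sorted([3.1, 3.4, 3.6, 3.7, 3.9, 4.0, 4.2, 4.4, 4.5, 4.7])

    def overall_score(item_responses):
        """Average the eight item responses (assumed 1-5 scale) into one score."""
        return mean(item_responses)

    def percentile_rank(score, norm_db=NORM_DB):
        """Percent of websites in the normative database scoring below `score`."""
        return 100.0 * bisect_left(norm_db, score) / len(norm_db)

    responses = [4, 5, 4, 4, 3, 5, 4, 4]   # one respondent's eight item ratings
    score = overall_score(responses)
    print(f"Score: {score:.2f}, percentile rank: {percentile_rank(score):.0f}")

With this hypothetical database, a mean score of about 4.13 would place a website at roughly the 60th percentile, which is the kind of relative interpretation the normed scores are meant to support.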

A fifth study was designed to develop and evaluate guidelines for judging the quality of the user experience that could be applied by expert evaluators. Study 5 established the Calibrated Evaluator's Guide (CEG), which evaluators use to review websites against a set of guidelines and predict perceived website user-experience quality. The CEG was refined from 105 to 37 items using the many-faceted Rasch model. The CEG was found to complement the SUPR-Q by providing a more detailed description of the website user experience. Suggestions for practical use and future research are discussed.
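For reference, a many-faceted Rasch model with an evaluator (judge) facet is commonly written as follows; the notation is the standard Facets formulation rather than the dissertation's own:

    \log\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k

where B_n is the measure for website n, D_i the difficulty of guideline item i, C_j the severity of evaluator j, and F_k the threshold between rating categories k-1 and k.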

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Jeff Sauro

Provenance

Received from ProQuest

File Format

application/pdf

Language

en

File Size

148 p.

Discipline

Information Science


