Date of Award

3-2024

Document Type

Dissertation

Degree Name

Ph.D.

Organizational Unit

Morgridge College of Education, Research Methods and Information Science, Research Methods and Statistics

First Advisor

Nicholas Cutforth

Second Advisor

Antonio Olmos

Third Advisor

Lilian Chimuma

Fourth Advisor

Ruth Chao

Keywords

Factor analysis, Keying and wording, Method effects, Mixed-methods explanatory sequential study, Psychometric instruments, Structural equation modeling

Abstract

This dissertation presents an innovative approach to examining keying effects, wording effects, and construct validity in psychometric instruments. Using a mixed methods explanatory sequential design, the effects of keying and wording were examined and validated in two self-report psychometric assessments: the Effortful Control assessment (Ellis & Rothbart, 2001) and the Grit assessment (Duckworth & Quinn, 2009). In the quantitative phase, structural equation modeling was used to analyze 2,104 students’ responses and to assess keying and wording method factors. Several hypothesized models were specified and evaluated; the reliability of each construct under each method was estimated using several omega coefficients, and construct validity was examined through model fit and parameter estimates. In the subsequent qualitative phase, six one-on-one interviews were conducted to gather participants’ perspectives and to gain insights that would help interpret and contextualize the quantitative results. By integrating quantitative and qualitative approaches, this study aims to enhance the validity and meaningfulness of its conclusions.

The overarching purpose of this study was to assess the keying structure and the method effects of keying and wording through a mixed methods explanatory sequential study. To that end, the study investigated different method factors that affected the construct validity of psychometric assessments among adolescents. Findings from the quantitative strand revealed detectable keying effects, although the keying structure accounted for inconsistent proportions of variance and reliability across instruments. Method effects of keying and wording were also identified, and problematic items were examined further in the qualitative phase. Results from the qualitative strand indicated that three factors influenced adolescents’ understanding of assessment questions: complex item wording, students’ characteristics, and school course plans. This study therefore addressed research gaps regarding the effects of keying and wording in psychometric instruments and contributed to understanding their impact on construct validity. The findings draw attention to keying and wording methods in self-report psychometric instruments, and to their consequences for accurately assessing the construct of interest and for subsequent parameter estimates. Based on these findings, suggestions for future research are discussed and summarized.
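As an aside on the omega coefficients mentioned in the abstract: for a single factor with standardized loadings, McDonald's omega is the squared sum of loadings divided by the squared sum of loadings plus the sum of item uniquenesses. The sketch below illustrates that formula with hypothetical loadings; it is not the dissertation's data or model.

```python
# Minimal illustration of McDonald's omega for one factor.
# Loadings here are hypothetical, not taken from the dissertation.

def omega(loadings):
    """omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of uniquenesses).
    Assumes standardized loadings, so each uniqueness is 1 - loading^2."""
    common = sum(loadings) ** 2
    unique = sum(1 - l ** 2 for l in loadings)
    return common / (common + unique)

# Five items loading moderately on a single (e.g., method) factor.
print(round(omega([0.6, 0.7, 0.65, 0.55, 0.7]), 3))  # → 0.777
```

In a full SEM analysis these loadings and uniquenesses would come from a fitted model (e.g., via lavaan or semTools in R) rather than being supplied by hand.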

Copyright Date

3-2024

Copyright Statement / License for Reuse

All Rights Reserved

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Lin Ma

Provenance

Received from ProQuest

File Format

application/pdf

Language

English (eng)

Extent

220 pages

File Size

7.5 MB

Discipline

Statistics, Educational tests and measurements


