Date of Award

1-1-2016

Document Type

Dissertation

Degree Name

Ph.D.

Organizational Unit

Morgridge College of Education, Research Methods and Information Science, Research Methods and Statistics

First Advisor

Kathy E. Green, Ph.D.

Second Advisor

Duan Zhang

Third Advisor

Antonio Olmos

Fourth Advisor

Bin Ramke

Keywords

Latent class analysis, Mathematics education, Mixture Rasch model, TIMSS-2011, Turkish educational system, Validity

Abstract

This study compares the results of latent class analysis (LCA) and mixture Rasch model (MRM) analysis using data from the Trends in International Mathematics and Science Study - 2011 (TIMSS-2011), with a focus on the 8th-grade mathematics section. LCA was conducted with Mplus version 7.31 and MRM analysis with WinMira 2011 to determine whether results differ when the assumed psychometric model differs. In addition, a log-linear analysis was conducted to examine the associations between the latent classes identified by LCA and MRM. The data set comprised four diverse countries participating in TIMSS-2011 (Turkey, the USA, Finland, and Singapore), selected because they differ in instructional practices and in historical performance. Analyses yielded latent classes associated mostly with participants' nation, which was, in turn, associated with performance level.
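For context, the two models can be contrasted using their standard formulations for dichotomous items; the equations below are the conventional textbook expressions, not reproduced from the dissertation itself. In LCA, the probability of response pattern \( \mathbf{x}_v \) for person \( v \) is

\[ P(\mathbf{x}_v) = \sum_{c=1}^{C} \pi_c \prod_{i=1}^{I} p_{ic}^{\,x_{vi}} (1 - p_{ic})^{1 - x_{vi}}, \]

where \( \pi_c \) is the proportion of class \( c \) and \( p_{ic} \) is the class-conditional probability of a correct response to item \( i \); classes differ only in these response probabilities (qualitative differences). In the MRM, the probability of a correct response to item \( i \) by person \( v \) in class \( g \) is

\[ P(X_{vi} = 1 \mid \theta_{vg}, g) = \frac{\exp(\theta_{vg} - \beta_{ig})}{1 + \exp(\theta_{vg} - \beta_{ig})}, \]

where \( \theta_{vg} \) is the ability of person \( v \) in class \( g \) and \( \beta_{ig} \) is the difficulty of item \( i \) in class \( g \); the observed-data likelihood mixes these class-specific Rasch models with mixing proportions \( \pi_g \), so classes differ in item difficulties while persons also vary quantitatively in ability within a class. The log-linear analysis of the cross-classification of LCA classes \( c \) and MRM classes \( g \) can be written in the usual form \( \log m_{cg} = \lambda + \lambda^{C}_{c} + \lambda^{G}_{g} + \lambda^{CG}_{cg} \), where a nonzero interaction term indicates that the two class solutions are associated.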

Although the class designations produced by the two approaches overlapped, the assumptions about the nature of the data and the information derived from each analysis differed. The literature review summarized the theory and application of latent class analysis and the mixture Rasch model in identifying latent classes in the social sciences. The results suggest that the TIMSS-2011 8th-grade mathematics data yield distinct subgroups based on students' ability levels.

The findings of this study do not indicate unequivocally whether a model based primarily on qualitative differences (LCA), that is, differences in solution strategies, instruction, curriculum, and the like, or a model that additionally accommodates quantitative differences within strategies (MRM) should be used with this particular dataset. Both techniques provided similar results with broadly similar interpretations, and both fit the data comparably, a result consistent with prior research. Nonetheless, for assessments similar to the TIMSS exams, the item difficulty parameters estimated by the MRM can be useful to educational researchers, which may give the MRM some priority.

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Turker Toker

Provenance

Received from ProQuest

File Format

application/pdf

Language

en

File Size

154 p.

Discipline

Statistics, Educational Tests & Measurements, Elementary Education


