Date of Award

2020

Document Type

Master's Thesis

Degree Name

M.A.

Organizational Unit

Morgridge College of Education, Research Methods and Information Science, Research Methods and Statistics

First Advisor

Denis Dumas

Second Advisor

Peter Organisciak

Third Advisor

Garrett Roberts

Fourth Advisor

Jesse Owen

Keywords

Divergent thinking, Elaboration, Originality, Reliability, Text-mining model

Abstract

The increased use of text-mining models as a scoring mechanism for divergent thinking (DT) tasks has sparked concerns about the ways in which automated Originality scores may be influenced by other dimensions of DT, especially Elaboration. The debate centers on whether too much of the variance in automated Originality scores is accounted for by the number of words a participant uses in a response (i.e., Elaboration) and, consequently, whether that influence undermines the reliability of Originality scores. Here, a partial correlation analysis, in conjunction with text-mining and psychometric modeling, is conducted to test the degree to which the reliability of Originality scores produced via a freely available text-mining system depends on the variance explained by Elaboration. Findings reveal that, when modern methodological recommendations for text-mining Originality scoring are applied, the reliability of Originality scores estimated by the GloVe 840B text-mining system is not meaningfully confounded by Elaboration. I conclude that, even when the variance attributed to Elaboration is partialled out, this method is capable of providing reliable Originality scores.
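For illustration only, the sketch below shows one plausible shape such a pipeline can take: Originality scored as cosine distance between prompt and response in a word-embedding space, Elaboration measured as response word count, and Elaboration then partialled out of the Originality scores by residualization. The toy_vectors dictionary, the prompt, and the responses are invented placeholders standing in for real GloVe 840B embeddings and participant data; this is not the thesis's actual analysis code.

```python
# Hedged sketch, not the thesis's pipeline: Originality as embedding-space distance,
# Elaboration as word count, then Elaboration partialled out via residualization.
import numpy as np

rng = np.random.default_rng(0)

# Placeholder embeddings; in practice these would be loaded from GloVe 840B.
vocab = ["brick", "build", "wall", "doorstop", "paperweight", "grind",
         "pigment", "paint", "heat", "sauna", "stone", "use"]
toy_vectors = {w: rng.normal(size=300) for w in vocab}

def embed(text: str) -> np.ndarray:
    """Average the embeddings of in-vocabulary words (a common text-mining choice)."""
    vecs = [toy_vectors[w] for w in text.lower().split() if w in toy_vectors]
    return np.mean(vecs, axis=0)

def originality(prompt: str, response: str) -> float:
    """Cosine distance between prompt and response embeddings (higher = more original)."""
    p, r = embed(prompt), embed(response)
    return 1.0 - float(p @ r / (np.linalg.norm(p) * np.linalg.norm(r)))

prompt = "brick"
responses = ["build a wall", "use as a doorstop", "grind into pigment for paint",
             "heat it as a sauna stone", "paperweight"]

orig = np.array([originality(prompt, r) for r in responses])
elab = np.array([len(r.split()) for r in responses], dtype=float)  # Elaboration = word count

# Partial out Elaboration: regress Originality on Elaboration and keep the residuals.
slope, intercept = np.polyfit(elab, orig, deg=1)
orig_residual = orig - (slope * elab + intercept)

print("raw Originality:       ", np.round(orig, 3))
print("Elaboration (words):   ", elab)
print("Originality residuals: ", np.round(orig_residual, 3))
```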

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Shannon Marie Maio

Provenance

Received from ProQuest

File Format

application/pdf

Language

en

File Size

57 p.

Discipline

Psychology


