Animated videos in assessment: a comparison study of validity evidence from and test-takers' reactions to an animated and a text-based version of a situational judgment test
Karakolidis, Anastasios (ORCID: 0000-0002-7460-7759)
(2019)
Animated videos in assessment: a comparison study of validity evidence from and test-takers' reactions to an animated and a text-based version of a situational judgment test.
PhD thesis, Dublin City University.
The majority of tests in use today rely on static text to communicate information, ideas, and concepts and to pose questions. However, the overuse of text may have consequences for the validity of the inferences drawn from test-takers' scores. This may be especially true for assessments taken by test-takers with poor reading comprehension skills or with low levels of proficiency in the language of the test. More specifically, linguistic complexity can be a source of construct-irrelevant variance, as test-takers' performance can be negatively affected by factors that are beyond the focus of the assessment itself.
This study examined the extent to which the use of animated videos, as opposed to static text, can (i) reduce construct-irrelevant variance in test scores and (ii) have a positive impact on test-takers’ reactions to the test. A true experiment was conducted with 129 native and non-native English speakers, using an animated-video and a text-based version of the same situational judgment test of practical knowledge.
The results indicated that, overall, the variance attributed to construct-irrelevant factors (i.e., native language, English proficiency, and reading comprehension in English) was 9.4% lower in the animated-video version of the test than in the text-based version. In addition, participants perceived the animated-video test to be more valid, fair, and enjoyable. Study participants also found the language used in the animated-video test less difficult to process, but no significant differences between the two formats were found with respect to the perceived difficulty of the content. Finally, the use of animated videos did not lead participants to invest significantly greater effort in the test. The implications of these and other findings, as well as recommendations for policy, practice, and future research, are discussed in the final chapter of the thesis.