The Effect of Differentiated Test Format and Question Type on Reading Comprehension Test Performance

Author(s)

Erin Byrd, SVHS

School Name

Spring Valley High School

Grade Level

10th Grade

Presentation Topic

Psychology and Sociology

Presentation Type

Non-Mentored

Written Paper Award

1st Place

Abstract

The purpose of this experiment was to examine how student performance differed between a computer-based and a paper-based format of a reading comprehension test, as well as between a multiple choice and a short answer version of that test. Forty-six high school students took a reading comprehension test that contained three reading passages and a total of twenty questions. Each student received either a paper-based test (PBT) or a computer-based test (CBT), with either multiple choice or short answer questions. It was predicted that the PBT and CBT scores would not differ, that the multiple choice scores would be higher than the short answer scores, and that there would be no interaction effect between test format and question type on the test scores. The first two hypotheses were supported, but the third was not. A two-way ANOVA showed that the PBT scores were not significantly different from the CBT scores (F(1,42) = 0.954, p = 0.334) and that students scored significantly higher on the multiple choice test than on the short answer test (F(1,42) = 2.909, p = 0.095) at α = 0.10. A significant interaction was found between test format and question type (F(1,42) = 4.346, p = 0.043), so simple effects analyses were used to determine where the significance lay. Overall, student performance was equal regardless of test format, and students performed better on multiple choice questions than on short answer questions.
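For readers who want to reproduce this kind of analysis, the sketch below shows how a 2x2 between-subjects ANOVA with an interaction term could be run in Python using statsmodels. The column names and the example scores are hypothetical placeholders, not the study's actual data; they only illustrate the structure of the test described in the abstract.

```python
# Minimal sketch of a 2x2 between-subjects ANOVA (test format x question type).
# Data and column names are hypothetical, for illustration only.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per student: test format (PBT/CBT), question type (MC/SA),
# and the score out of twenty questions.
data = pd.DataFrame({
    "format": ["PBT"] * 4 + ["CBT"] * 4,
    "question_type": ["MC", "MC", "SA", "SA"] * 2,
    "score": [17, 16, 12, 13, 15, 18, 14, 11],
})

# Two-way ANOVA with main effects and the format x question type interaction.
model = ols("score ~ C(format) * C(question_type)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)
```

If the interaction term is significant, as it was in this study, a common follow-up is a simple effects analysis, for example comparing multiple choice and short answer scores separately within the PBT group and within the CBT group.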

Location

Neville 321

Start Date

4-14-2018 1:45 PM

Presentation Format

Oral and Written
