TY - JOUR
T1 - Comparison of performance on multiple-choice questions and open-ended questions in an introductory astronomy laboratory
AU - Wooten, Michelle M.
AU - Cool, Adrienne M.
AU - Prather, Edward E.
AU - Tanner, Kimberly D.
PY - 2014/7/14
Y1 - 2014/7/14
N2 - When considering the variety of questions that can be used to measure students' learning, instructors may choose to use multiple-choice questions, which are easier to score than responses to open-ended questions. However, by design, analyses of multiple-choice responses cannot fully describe students' understanding. One method that can be used to learn more about students' learning is the analysis of the open-ended responses students provide when explaining their multiple-choice responses. In this study, we examined the extent to which introductory astronomy students' performance on multiple-choice questions was comparable to their ability to provide evidence when asked to respond to an open-ended question. We quantified students' open-ended responses by developing rubrics that allowed us to score the amount of relevant evidence students provided. A minimum rubric score was determined for each question based on two astronomy educators' perception of the minimum amount of evidence needed to substantiate a scientifically accurate multiple-choice response. The percentage of students meeting both criteria of (1) attaining the minimum rubric score and (2) selecting the correct multiple-choice response was examined at three phases of instruction: directly before lab instruction, directly after lab instruction, and at the end of the semester. Results suggested that a greater proportion of students were able to choose the correct multiple-choice response than were able to provide responses attaining the minimum rubric score at both the post-lab and post-instruction phases.
AB - When considering the variety of questions that can be used to measure students' learning, instructors may choose to use multiple-choice questions, which are easier to score than responses to open-ended questions. However, by design, analyses of multiple-choice responses cannot fully describe students' understanding. One method that can be used to learn more about students' learning is the analysis of the open-ended responses students provide when explaining their multiple-choice responses. In this study, we examined the extent to which introductory astronomy students' performance on multiple-choice questions was comparable to their ability to provide evidence when asked to respond to an open-ended question. We quantified students' open-ended responses by developing rubrics that allowed us to score the amount of relevant evidence students provided. A minimum rubric score was determined for each question based on two astronomy educators' perception of the minimum amount of evidence needed to substantiate a scientifically accurate multiple-choice response. The percentage of students meeting both criteria of (1) attaining the minimum rubric score and (2) selecting the correct multiple-choice response was examined at three phases of instruction: directly before lab instruction, directly after lab instruction, and at the end of the semester. Results suggested that a greater proportion of students were able to choose the correct multiple-choice response than were able to provide responses attaining the minimum rubric score at both the post-lab and post-instruction phases.
UR - http://www.scopus.com/inward/record.url?scp=84940218716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84940218716&partnerID=8YFLogxK
U2 - 10.1103/PhysRevSTPER.10.020103
DO - 10.1103/PhysRevSTPER.10.020103
M3 - Article
AN - SCOPUS:84940218716
SN - 1554-9178
VL - 10
JO - Physical Review Special Topics - Physics Education Research
JF - Physical Review Special Topics - Physics Education Research
IS - 2
M1 - 020103
ER -