Abstract
This study investigates whether a computer-based version of a multiple-choice cloze reading test for English-language learners is comparable to its traditional paper-based counterpart. One hundred and twenty high school ELL students were recruited for the study. The research instruments included both paper-based and computer-based versions of a locally developed reading assessment. The two tests were kept as similar as possible in content, questions, pagination, format, and layout. The design was counterbalanced so that two groups of learners took the tests in opposite order and their scores were compared, addressing concerns about practice and order effects. Results indicate that the paper-based and computer-based versions of the test are comparable. These findings help validate the cross-mode comparability of assessments beyond the traditional discrete-point multiple-choice tests that predominate in current research.