ZIB Education

National add-on study on computer-based assessment (CBA) in PISA 2012

Lead: DIPF

Computers influence our personal and professional lives, so it is a natural step to measure knowledge and skills with their help as well. A computer-based implementation of tests not only reduces the costs and effort associated with paper-based assessment, but also allows for new presentation and task formats. The German national add-on study to PISA 2012 therefore sought to answer questions about the possible uses of computers for assessment, especially for measuring reading comprehension. At the DIPF, research questions on two central topics were investigated:

1. Computer-based measurement of hypertext reading (digital reading)

In contrast to the "traditional" linear text format, hypertext has specific characteristics. Reading a typical website means not only reading continuous text passages, but also deciding which other websites to access via links and in which order. This selection should take additional criteria into account, such as the relevance and trustworthiness of the websites. In computer-based assessment, the computer is therefore not merely the medium presenting the reading material; reading such digital texts is a skill in its own right. The national add-on study examined empirically whether hypertext reading competence differs from the reading competence typically measured in PISA, and to what extent such differences can be explained by personal characteristics, such as how well someone can use a computer.
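
As an illustration of this research question (a sketch only, with illustrative variable names; this is not necessarily the model used in the study), one could regress performance in digital reading on linear reading competence and basic computer skills:

\[ Y^{\text{digital}}_{j} = \beta_0 + \beta_1\, Y^{\text{linear}}_{j} + \beta_2\, \text{ICT}_{j} + \varepsilon_j \]

If digital reading were nothing more than linear reading displayed on a screen, the computer-skill coefficient \(\beta_2\) should be close to zero and the linear reading score should account for most of the reliable variance; a substantial contribution of \(\text{ICT}_{j}\) and systematic residual variance would instead indicate that hypertext reading draws on additional abilities.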

2. Studies of the administration mode (paper versus computer)

A further set of questions addressed by the add-on study concerns effects that can arise when (linear) paper-based texts are transferred to the computer. To keep the reading of the texts comparable across modes (i.e. paper vs. computer), the closest possible correspondence between the paper-based and computer-based forms was sought. Apart from the technical transferability itself, it was of particular interest whether the psychometric characteristics of the tasks (e.g. item difficulty) change with the mode, and how any differences that occur can be explained. One of the central questions was to what extent such mode effects depend on the answer format of the reading tasks (e.g. free text responses or multiple choice).
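
One common way to formalise such a mode effect (again only an illustrative sketch, assuming a Rasch-type scaling of the reading items; the notation is not taken from the project itself) is to let the difficulty of each item shift when it is administered on the computer:

\[ P(X_{ij} = 1 \mid \theta_j) = \frac{\exp\big(\theta_j - (b_i + \delta_i\, m_j)\big)}{1 + \exp\big(\theta_j - (b_i + \delta_i\, m_j)\big)} \]

Here \(\theta_j\) is the reading proficiency of student j, \(b_i\) the difficulty of item i in the paper-based mode, \(m_j\) an indicator equal to 1 for computer-based administration, and \(\delta_i\) the item-specific mode effect. Equivalence of the two modes corresponds to \(\delta_i \approx 0\) for all items, and the question about answer formats amounts to asking whether the \(\delta_i\) are systematically larger for, say, free-text items than for multiple-choice items.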

Methodological approach

Data collection for the add-on study took place on a national second test day of the PISA 2012 main study in Germany. 77 of the 212 PISA schools were recruited to participate, so that data from almost 900 students were available for analysis. During the study, most of the students worked on tasks on the computer. In addition to a test battery assessing their computer skills, working memory capacity, and basic reading skills, they completed reading comprehension tasks as well as questions about their attitudes towards these tasks. Some students also worked on tasks in a test booklet (i.e. in paper form). To answer the research questions, results from the reading competence assessment administered in paper-based form on the first PISA test day were also included.


Project-related publications

Hahnel, C., Goldhammer, F., Kröhne, U., & Naumann, J. (2017). Reading digital text involves working memory updating based on task characteristics and reader behavior. Learning and Individual Differences, 59, 149–157. https://doi.org/10.1016/j.lindif.2017.09.001

Hahnel, C., Goldhammer, F., Kröhne, U., & Naumann, J. (2018). The role of reading skills in the evaluation of online information gathered from search engine environments. Computers in Human Behavior, 78, 223–234. https://doi.org/10.1016/j.chb.2017.10.004

Hahnel, C., Goldhammer, F., Naumann, J., & Kröhne, U. (2016). Effects of linear reading, basic computer skills, evaluating online information, and navigation on reading digital text. Computers in Human Behavior, 55, 486–500. https://doi.org/10.1016/j.chb.2015.09.042

Kroehne, U., Buerger, S., Hahnel, C., & Goldhammer, F. (2019). Construct Equivalence of PISA Reading Comprehension Measured With Paper‐Based and Computer‐Based Assessments. Educational Measurement: Issues and Practice, 38(3), 97–111. https://doi.org/10.1111/emip.12280

Kroehne, U., Hahnel, C., & Goldhammer, F. (2019). Invariance of the Response Processes Between Gender and Modes in an Assessment of Reading. Frontiers in Applied Mathematics and Statistics, 5, 1–16. https://doi.org/10.3389/fams.2019.00002

Zehner, F., Kroehne, U., Hahnel, C., & Goldhammer, F. (2020). PISA reading: Mode effects unveiled in short text responses. Psychological Test and Assessment Modeling, 62(1), 85–105.

Contact

Phone: 089 289 28274
Email: zib.edu@sot.tum.de