ZIB Education

Validation of Creative Thinking in PISA 2022
and Automated Coding of Written Responses

Current Cooperation Partner

  • Technical University Munich, TUM School of Education, Centre for International Student Assessment (ZIB), Munich (Dr. Jennifer Diedrich, Prof. Dr. Kristina Reiss)
  • University of Graz, Institute of Psychology, Graz, Austria (PD Dr. Mathias Benedek)
  • DIPF | Leibniz Institute for Research and Information in Education, Frankfurt am Main (Dr. Fabian Zehner)
Figure 1: Processes and Domains of Creative Thinking in PISA 2022 (OECD 2019, p. 17).

Creative thinking is assessed as the innovative domain in PISA 2022. Creative thinking, an important prerequisite for individual and societal success in the coming decades (Lucas & Spencer, 2017), has not yet been systematically compared at the international level; PISA is the first international large-scale assessment to take up this challenge.

The OECD contractor (ACT Next) has developed instruments that capture three processes of creative thinking across four sub-domains (Figure 1; OECD 2019) and has examined their interrater reliabilities in three pilot studies. Additionally, the following concepts will be assessed as part of the student, teacher, parent, and school context questionnaires: ideas about and openness to creativity; creative activities within and outside of school and in the family; and creativity-related self-efficacy expectations and self-concept.

Before test results can be interpreted to inform education policy, the measuring instruments must be validated, that is, it must be ensured that the test really does capture creative thinking in all its complexity. Because of the innovative nature of this new domain in PISA, the development of the instruments could not build on preliminary work from standardised international assessment, so this test places special emphasis on validity. To this end, further instruments will be implemented in the context questionnaires of a national supplementary study in Germany. For example, divergent thinking and creative achievement (Diedrich et al., 2018) will additionally be surveyed nationally to test ecological validity.

Another strand of this project is devoted to exploring the possibilities of automatic coding of open text responses (Zehner et al., 2016) to creative stimuli. For this purpose, software that can compare the semantics of the responses will be used. In addition, the international comparability of the coding of tasks for creative thinking will be investigated.
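To illustrate the general idea of semantically comparing open text responses, the following is a minimal sketch in Python: responses are turned into bag-of-words vectors, compared via cosine similarity, and greedily grouped when they exceed a similarity threshold. The tokenisation, the threshold value, and the greedy grouping are illustrative assumptions for this sketch, not the actual pipeline used in the project or in Zehner et al. (2016).

```python
# Illustrative sketch only: lexical bag-of-words cosine similarity with
# greedy grouping. The real project software uses more sophisticated
# semantic comparison; threshold and tokenisation here are assumptions.
from collections import Counter
from math import sqrt


def vectorise(text: str) -> Counter:
    """Bag-of-words vector: lowercase token counts."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(count * b[token] for token, count in a.items())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def group_responses(responses, threshold=0.5):
    """Attach each response to the first group whose seed response it
    resembles (cosine >= threshold); otherwise open a new group."""
    groups = []  # list of (seed_vector, member_responses)
    for response in responses:
        vector = vectorise(response)
        for seed, members in groups:
            if cosine(seed, vector) >= threshold:
                members.append(response)
                break
        else:
            groups.append((vector, [response]))
    return [members for _, members in groups]


answers = [
    "put the book on the shelf",
    "place the book on a shelf",
    "throw it away",
]
print(group_responses(answers))
# The two shelf answers share most tokens and land in one group;
# the third answer opens a group of its own.
```

Groups of semantically similar responses could then each receive one human-assigned code, which is the labour-saving idea behind automatic coding of short text responses.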


References

Diedrich, Jennifer; Jauk, Emanuel; Silvia, Paul J.; Gredlein, Jeffrey M.; Neubauer, Aljoscha C.; Benedek, Mathias (2018): Assessment of real-life creativity: The Inventory of Creative Activities and Achievements (ICAA). In: Psychology of Aesthetics, Creativity, and the Arts 12 (3), pp. 304–316. DOI: 10.1037/aca0000137.

Lucas, Bill; Spencer, Ellen (2017): Teaching Creative Thinking. Developing learners who generate ideas and can think critically. La Vergne: Crown House Publishing (Pedagogy for a changing world). Available online at https://www.researchgate.net/publication/320324550_Teaching_Creative_Thinking_Developing_learners_who_generate_ideas_and_can_think_critically.

OECD (2019): Framework for the Assessment of Creative Thinking in PISA 2021. Third draft.

Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank (2016): Automatic Coding of Short Text Responses via Clustering in Educational Assessment. In: Educational and Psychological Measurement 76 (2), pp. 280–303. DOI: 10.1177/0013164415590022.

Contact

Phone: 089 289 28274
Email: zib.edu@sot.tum.de