Processes and Outcomes of Educational Evaluation from a Cross-Cultural Perspective

Periodic Reporting for period 1 - POEECCP (Processes and Outcomes of Educational Evaluation from a Cross-Cultural Perspective)

Reporting period: 2018-02-01 to 2020-01-31

Evaluation, considered an integral part of the educational process in many countries, takes place at different levels with a wide range of practices. The full complexity of educational evaluation at these various levels (i.e. system, school, classroom, and student) in different countries has hardly been recognized. This constitutes a tremendous barrier to consolidating theories on evaluation, and to countries drawing on successful experience and implementing targeted educational policies for better quality and effectiveness. Important barriers include insufficient methodology for soundly comparing multiple student groups and multiple countries, and a culturally insensitive approach to unfolding evaluation processes and outcomes. This project aims to provide a comprehensive framework for the dynamics of evaluation. It delineates the pathways in the processes and outcomes of educational evaluation using data from the Programme for International Student Assessment (PISA) and a solid cross-cultural research methodology.

The substantive challenge is that evaluation research at the various levels is neither synchronized nor coordinated, and results are inconsistent across countries, presumably due to a lack of integration across levels and the overlooking of relevant moderators. In this project, a broader range of outcomes is proposed, a thorough scrutiny of evaluation practices is conducted, and various moderators from different levels are highlighted.

The methodological challenge rests on data incomparability in large-scale assessments and surveys. In multicultural contexts, measurement bias at the construct, method, and item levels can invalidate comparative results. In other words, before any meaningful comparisons are made, it must be demonstrated that, in different countries, (i) the target constructs (e.g. student motivation to learn) are understood as having the same meaning, (ii) the instruments (e.g. Likert-scale response options) are used in the same manner, and (iii) the item content has an unequivocal interpretation. Thus, to validly compare countries or cultural groups, the comparability of the data should be gauged with adapted survey designs and advanced psychometric tools.
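The kind of comparability check described above can be illustrated with a minimal, self-contained sketch: simulate two groups answering the same multi-item scale and compare the estimated item-loading patterns. Everything below (the simulated data, the chosen loadings, and the crude eigenvector-based loading estimate) is an illustrative assumption for exposition; the project itself used multigroup confirmatory factor analysis and related psychometric models on real PISA data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_group(n, loadings, intercepts):
    """Simulate n respondents on a one-factor, four-item scale."""
    factor = rng.normal(size=(n, 1))                      # latent trait
    noise = rng.normal(scale=0.5, size=(n, len(loadings)))
    return intercepts + factor @ np.atleast_2d(loadings) + noise

def estimated_loadings(data):
    """Crude loading estimate: first eigenvector of the item correlations."""
    corr = np.corrcoef(data, rowvar=False)
    _, vecs = np.linalg.eigh(corr)                        # eigenvalues ascending
    v = np.abs(vecs[:, -1])                               # dominant eigenvector
    return v / v.max()                                    # normalize for comparison

# Hypothetical example: two groups share the same loading pattern but differ
# in intercepts, i.e. item means shift while item interrelations do not
# (this would bias mean comparisons but not the loading structure).
lam = np.array([0.9, 0.8, 0.7, 0.6])
g1 = simulate_group(2000, lam, intercepts=0.0)
g2 = simulate_group(2000, lam, intercepts=0.5)

l1, l2 = estimated_loadings(g1), estimated_loadings(g2)
print(np.max(np.abs(l1 - l2)))  # small difference -> similar loading pattern
```

In the project's terminology, agreement of the loading patterns is only one necessary condition (akin to metric invariance); intercept differences like the ones simulated here are exactly what scalar-invariance tests are designed to detect before group means are compared.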

Overall, this project aims to contribute to a better understanding of the processes and outcomes of evaluation in a variety of educational systems in two studies. It is expected to consolidate theories from a cross-cultural perspective and provide efficient policy recommendations tailored to specific contexts.
The overall objective of the project is realized through two studies and various outreach activities.

Study 1 has a methodological focus: analyzing scalable measures and their cross-cultural comparability. It tests the extent to which self-report, multiple-item measures are comparable across cultural groups, to ensure proper comparisons of structural relations. I investigated several design and statistical procedures to enhance data comparability using the PISA data (the manuscript has been revised and resubmitted to the Journal of Cross-Cultural Psychology), and I worked on innovative psychological network analysis in combination with conventional multigroup confirmatory factor analysis (the paper is published in the peer-reviewed journal Studies in Educational Evaluation).

Study 2 has a substantive focus: delineating the pathways of evaluation processes in school evaluation and classroom assessment. I empirically studied the differential associations of school practices with the achievement and sense of belonging of immigrant and non-immigrant students; the paper was published in the Journal of Applied Developmental Psychology.

To promote methodological rigor in cross-cultural educational assessment, we (in collaboration with colleagues) gave four workshops on cross-cultural research methods: at Tilburg University in the Netherlands, at the Method Week at Goethe University in Frankfurt, at the 8th IEA International Research Conference, and at the 51st Congress of the German Psychological Society in Frankfurt.

We also disseminated our research findings at the 51st Congress of the German Psychological Society in Frankfurt, the Comparative and International Educational Society Congress in 2019, the 24th Congress of Cross-Cultural Psychology, the SIG meeting of the European Association for Research on Learning and Instruction, the Tilburg Cross-Cultural Research Methods Conference, and the OECD measurement invariance conference.

A few scientific papers closely related to the theme of this project are in the pipeline, including work on validly assessing meta-cognitive knowledge in the Programme for International Student Assessment, and on combining process data from computer-generated logs with self-reported questionnaire data to enhance the measurement validity of “soft skills”. These collaborations give us new directions to broaden the research scope and to apply for other research grants. The next step is to scale up the methods used in this research to other large-scale surveys in education and other domains, and to seek opportunities to collaborate with companies to transform their personnel selection and employee survey practices.