Periodic Reporting for period 4 - IMPROVE (Innovative Methods for Psychology: Reproducible, Open, Valid, and Efficient)
Reporting period: 2021-12-01 to 2022-11-30
The IMPROVE project extensively investigated the use and usefulness of preregistration as a promising methodological tool to avoid biases in the analysis and reporting of results. Team members assessed over 550 preregistrations for their quality and effectiveness in countering biases that emerge from researcher degrees of freedom in the analysis of studies. For over 450 of these preregistrations, we also examined the extent to which later publications of the studies conformed to the preregistered plans. The results of both studies showed that preregistrations do promote the use of power analyses, but are often insufficiently detailed to fully counter biases in the analysis of data. Furthermore, many articles were found not to conform to registered plans, adding, omitting, or changing hypotheses. This work contributes to improving scientific practice by pointing out how preregistration templates and practices can be improved.
In surveys covering over 1300 psychological researchers, the IMPROVE project found that poor statistical intuitions lead many researchers to underestimate the likelihood of studies yielding non-significant results, which could play a key role in the common failure to publish negative results. To address publication bias, the IMPROVE project developed, extended, and refined several meta-analytic tools that correct for publication bias and the selective reporting of (correlated) outcomes, and developed a graphical tool that highlights potential publication bias in meta-analytic results. We also contributed to a major multidisciplinary survey of nearly 7000 academics on their use of questionable research practices and responsible research practices, and the factors that play a role in such suboptimal or preferred practices. This survey pointed to the important role of publication pressure, scientific norms, and mentoring in both types of behavior, which highlights that improvements in mentoring, promotion of the norms of good science, and tackling publication pressure could help promote responsible research practices.
The IMPROVE project built a flexible, efficient, and extensible simulator called the Science Abstract Model (SAM), which can be used to study the effects of p-hacking and publication bias, and the usefulness of novel methodological and statistical tools to counter and correct for biases in the current scientific system. We used simulations to gain a better understanding of the severity of these biases and the effectiveness of the corrective tools developed in the context of meta-analyses.
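To illustrate the kind of distortion such simulations can expose, the minimal Python sketch below (not part of the SAM codebase itself; the effect size, sample size, and selection rule are all illustrative assumptions) simulates many small two-group studies, "publishes" only the significant ones, and compares the naive meta-analytic effect estimate with the true effect.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

TRUE_D = 0.2       # true standardized effect (illustrative assumption)
N_PER_GROUP = 30   # per-group sample size (illustrative assumption)
N_STUDIES = 5000   # number of simulated studies

published = []
for _ in range(N_STUDIES):
    control = rng.normal(0.0, 1.0, N_PER_GROUP)
    treatment = rng.normal(TRUE_D, 1.0, N_PER_GROUP)
    _, p = stats.ttest_ind(treatment, control)
    pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
    d = (treatment.mean() - control.mean()) / pooled_sd  # observed Cohen's d
    if p < 0.05:  # publication bias: only significant results get published
        published.append(d)

print(f"True effect:                {TRUE_D:.2f}")
print(f"Mean published effect:      {np.mean(published):.2f}")
print(f"Share of studies published: {len(published) / N_STUDIES:.0%}")
```

Under these assumed settings, only a small fraction of studies reach significance, and the mean published effect is roughly two to three times the true effect, which is exactly the pattern that bias-correction methods for meta-analysis are designed to detect and correct.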
The IMPROVE project found that the calculation of effect sizes in meta-analyses is commonly irreproducible and prone to error. In line with the high prevalence of reporting errors in the literature, we found that such errors are easily overlooked by reviewers. The openly available tool statcheck retrieves statistical results from articles and manuscripts and checks them for internal consistency, allowing researchers to detect and correct errors in the reporting of statistical results. In a large study using the updated version of statcheck, we found that its implementation during peer review at major psychology journals was associated with a steep decline in reporting errors in published articles, highlighting the usefulness of the tool in countering reporting errors.
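statcheck itself is an R package; as a language-agnostic illustration of its core logic, the following Python sketch recomputes the two-tailed p-value from a reported t statistic and its degrees of freedom and flags inconsistencies. The regular expression and rounding tolerance are simplifying assumptions; the real tool handles many more result formats.

```python
import re
from scipy import stats

# Simplified version of the APA-style pattern such a check targets,
# e.g. "t(28) = 2.20, p = .04" (the real statcheck covers many more tests).
PATTERN = re.compile(r"t\((\d+)\)\s*=\s*(-?\d+\.?\d*),\s*p\s*=\s*(\.\d+)")

def check_t_tests(text: str) -> list[dict]:
    """Flag reported two-tailed t-test p-values that do not match
    the p-value recomputed from the test statistic and df."""
    flagged = []
    for df, t, p_reported in PATTERN.findall(text):
        p_computed = 2 * stats.t.sf(abs(float(t)), int(df))
        # Allow for the reported p being rounded to two decimals (assumption).
        if abs(p_computed - float(p_reported)) > 0.005:
            flagged.append({"df": int(df), "t": float(t),
                            "p_reported": float(p_reported),
                            "p_computed": round(p_computed, 4)})
    return flagged

print(check_t_tests("The effect was significant, t(28) = 2.20, p = .01."))
# Flags this result: for t(28) = 2.20 the two-tailed p is about .036, not .01.
```

A consistent report such as "t(28) = 2.20, p = .04" would pass, since the difference from the recomputed value falls within the rounding tolerance.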
The IMPROVE project studied the performance of state-of-the-art psychometric tools that allow users to study the latent psychological variables that underlie psychological scales and tests. We applied these new tools to show that measurement characteristics often differ between experimental conditions and groups, to study interactions between latent variables in predicting important medical outcomes, and to gain a better understanding of how time limits differentially affect gender groups' performance on cognitive tests.
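A full analysis of such measurement effects relies on multi-group latent variable models; purely as a conceptual illustration (the group labels, working speeds, test length, and time limit below are all hypothetical assumptions, not project data), the following sketch shows how a time limit can open a score gap between two groups of equal ability when one group works slightly more slowly.

```python
import numpy as np

rng = np.random.default_rng(7)

N = 10_000          # examinees per group (assumption)
N_ITEMS = 40        # test length (assumption)
TIME_LIMIT = 60.0   # minutes available (assumption)
P_CORRECT = 0.7     # probability correct per attempted item, equal in both groups

def simulate_scores(mean_minutes_per_item: float) -> np.ndarray:
    """Score = number correct among items reached before the time limit."""
    minutes_per_item = rng.normal(mean_minutes_per_item, 0.3, N).clip(0.5)
    items_reached = np.minimum(N_ITEMS, (TIME_LIMIT / minutes_per_item).astype(int))
    return rng.binomial(items_reached, P_CORRECT)

# Two groups with identical ability (same P_CORRECT) but different working speed.
faster_group = simulate_scores(mean_minutes_per_item=1.4)
slower_group = simulate_scores(mean_minutes_per_item=1.6)

print(f"Mean score, faster group: {faster_group.mean():.1f} / {N_ITEMS}")
print(f"Mean score, slower group: {slower_group.mean():.1f} / {N_ITEMS}")
# Without a time limit both groups would average 0.7 * 40 = 28 items correct;
# with it, the slower group scores lower despite equal ability.
```

The point of the illustration is that the resulting score difference reflects speed rather than the latent ability the test is meant to measure, which is why examining measurement characteristics across groups matters.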
The results of the IMPROVE project were disseminated via a meta-research conference, presentations at conferences, online presentations, workshops on preregistration, peer-reviewed articles, preprints, a website, 15 open data sets, openly available computer code, and a registration template.