
Powering Up Cognitive Psychology: Comparing Statistical Methods for Detecting the SNARC Effect via Virtual Experiments / D'Atri, Federico; Hohol, Mateusz; Fantoni, Carlo; Egidi, Leonardo; Roth, Lilly; Cipora, Krzysztof. - (2026), pp. 1-43. [10.31234/osf.io/jnpgc_v1]

Powering Up Cognitive Psychology: Comparing Statistical Methods for Detecting the SNARC Effect via Virtual Experiments

Federico D'Atri; Mateusz Hohol; Carlo Fantoni; Leonardo Egidi; Lilly Roth; Krzysztof Cipora
2026-01-01

Abstract

Generalized linear mixed-effects models (GLMMs) are increasingly used to analyze reaction time (RT) data in cognitive psychology. Yet, whether they genuinely outperform linear mixed-effects models (LMMs) or t-tests on aggregated data in realistic experimental settings remains unclear. In this preregistered simulation study, we systematically compared three types of GLMMs – a Gamma GLMM, an Inverse Gaussian GLMM, and a novel Probit GLMM – to LMMs and the t-test on aggregated data. As the main performance metric, we introduced adjusted power, a measure that adjusts statistical power to equalize methods in terms of confidence interval coverage. To this end, we used virtual experiments derived from five large datasets: three from numerical cognition experiments investigating the SNARC effect (Spatial-Numerical Association of Response Codes), and two from inhibitory control tasks. Across 1,480,000 simulations covering 36 configurations and including RTs from over 2,000 participants, we found that Gamma and Probit GLMMs outperform LMMs and the t-test on average, with the Probit GLMM showing a particular advantage for detecting the SNARC effect. However, no single method dominates across all configurations: performance depends substantially on the dataset and effect considered, and some mixed-effects models require robust error estimation to provide adequate coverage at small sample sizes. To make these findings actionable, we provide the interactive Shiny app Powering Up Cognitive Psychology, which allows researchers to explore how the results depend on analytical choices and to make informed decisions about which statistical method to use for their data analysis given their own experimental context.
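The adjusted-power idea described in the abstract rests on estimating two quantities by simulation: the rejection rate (power) and the confidence interval coverage of each method. As a minimal, hypothetical sketch of that logic (not the study's actual pipeline), the snippet below runs a Monte Carlo estimate of both quantities for the simplest comparator, a one-sample t-test on per-participant effect estimates; the effect size, noise level, and sample size are illustrative placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_power_and_coverage(n_subjects=20, true_effect=15.0, sd=40.0,
                                n_sims=2000, alpha=0.05):
    """Monte Carlo estimate of power and CI coverage for a one-sample
    t-test on aggregated (per-participant) effect estimates.
    All parameter values are hypothetical, for illustration only."""
    rejections = 0
    covered = 0
    for _ in range(n_sims):
        # Simulated per-participant effect estimates (e.g. SNARC-like slopes)
        d = rng.normal(true_effect, sd, n_subjects)
        res = stats.ttest_1samp(d, 0.0)
        if res.pvalue < alpha:
            rejections += 1
        # Check whether the (1 - alpha) CI contains the true effect
        lo, hi = res.confidence_interval(1 - alpha)
        if lo <= true_effect <= hi:
            covered += 1
    return rejections / n_sims, covered / n_sims

power, coverage = simulate_power_and_coverage()
print(f"power ~ {power:.3f}, coverage ~ {coverage:.3f}")
```

In the study's framework, power estimates are only comparable across methods once coverage is equalized, which is what motivates the adjusted-power metric; this sketch only shows the raw ingredients.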

Use this identifier to cite or link to this document: https://hdl.handle.net/11368/3132241
