Research on the reliability and credibility of scientific research sparks major debate
Research co-led by Dr Eike Rinke has triggered responses from researchers at Caltech, UCL, Oxford, Stanford and Harvard.
The study, ‘Observing Many Researchers Using the Same Data and Hypothesis Reveals a Hidden Universe of Uncertainty’, set out to test a single prominent social science hypothesis – that greater immigration reduces support for social policies among the public.
161 researchers in 73 research teams were given the same data and hypothesis to independently test. Their research decisions, numerical findings and substantive conclusions varied significantly.
This research, co-led by POLIS’s Dr Eike Rinke, was published in the prestigious Proceedings of the National Academy of Sciences (PNAS) in October 2022.
The research showed that more than 95% of the total variance in numerical results remained unexplained by the major identifiable decisions in each team’s workflow.
This revealed a universe of uncertainty due to a multitude of minute analytical decisions that remain hidden when considering a single study in isolation. Read our original, in-depth article about the research on the School website.
Since its publication in October, PNAS has published multiple responses that engage with the paper’s findings:
- In the 27 December 2022 issue (Vol. 119, No. 52), Professor Colin F. Camerer of the California Institute of Technology published his commentary on the research in ‘The apparent prevalence of outcome variation from hidden “dark methods” is a challenge for social science’.
- In the 10 January 2023 issue (Vol. 120, No. 2), Dr Per Engzell of University College London and the University of Oxford published a letter, ‘A universe of uncertainty hiding in plain sight’, alongside a reply from Dr Rinke and his co-PI Dr Nate Breznau of the University of Bremen, ‘Reply to Engzell: Maybe in plain sight but out of focus’.
- Finally, the 17 January 2023 issue (Vol. 120, No. 3) includes a response letter by Stanford University’s Dr Maya M. Mathur and Harvard University’s Dr Christian Covington and Professor Tyler J. VanderWeele, ‘Variation across analysts in statistical significance, yet consistently small effect sizes’, to which Dr Rinke and his co-PIs replied in the same issue (‘Reply to Mathur et al.: Many-analyst studies should consider effect sizes and CIs’).
Reflecting on the clear impact of the original research, Dr Rinke comments:
“We have been pleased to see that our recent article on the threat to research integrity posed by often tiny decisions in scientific analysis has sparked such productive responses from esteemed colleagues in the field of meta-research.

“It is clear that this is only the beginning of an important conversation across various disciplines about how we can mitigate that threat.”