Reproducibility Project

From Wikipedia, the free encyclopedia

The Reproducibility Project: Psychology was a crowdsourced collaboration of 270 contributing authors to repeat 100 published experimental and correlational psychological studies.[1] The project was led by the Center for Open Science and its co-founder, Brian Nosek, who started it in November 2011; the results of the collaboration were published in August 2015. Reproducibility is the ability to produce the same findings, using the same methodology as the original work, but on a different dataset (for instance, one collected from a different set of participants). The project illustrated the growing problem of failed reproducibility in social science and helped start a broader movement to test the reproducibility of published work.[2]

Results

Brian Nosek of the University of Virginia and colleagues sought to replicate 100 studies that had been published in 2008 in three journals: Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition, to see whether they could obtain the same results as the original findings.[3] In their original publications, 97 of the 100 studies reported statistically significant results.

[Figure: Barriers to conducting replications of experiments in cancer research (The Reproducibility Project: Cancer Biology)]

The group took extensive measures to remain true to the original studies, including consulting the original authors. Even with these extra steps to reproduce the original conditions, only 35 of the 97 studies with originally significant results (36.1%) replicated, and when effects did replicate, they were often smaller than those in the original papers. The authors emphasized that the findings reflect a problem that affects all of science, not just psychology, and that there is room to improve reproducibility in psychology.
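As a rough illustration of how the headline rate above is derived (the denominator is the 97 studies whose original results were significant, not all 100 selected studies), the following minimal Python sketch recomputes the percentages from the figures quoted in this article; the variable names are illustrative and are not taken from the project's analysis code.

```python
# Minimal sketch: recomputing the replication rate quoted above.
# Figures are taken from this article; variable names are illustrative,
# not from the Reproducibility Project's own analysis code.

total_studies = 100              # studies selected for replication
originally_significant = 97      # studies whose original results were significant
successful_replications = 35     # replications that found a significant effect

print(f"Originally significant: {originally_significant / total_studies:.0%}")      # 97%
print(f"Replication rate: {successful_replications / originally_significant:.1%}")  # 36.1%
```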

In 2021, the project reported that of 193 experiments from 53 top cancer papers published between 2010 and 2012, only 50 experiments from 23 papers could be replicated. Moreover, the effect sizes in that subset were on average 85% smaller than the original findings. None of the papers fully described its experimental protocols, and 70% of the experiments required asking for key reagents.[4][5]
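A similar back-of-the-envelope sketch, again using only the figures quoted above with illustrative variable names (not the project's own code), expresses the cancer-biology results as rates and shows what an average effect size "85% smaller" than the original implies.

```python
# Minimal sketch: the Reproducibility Project: Cancer Biology figures quoted
# above, expressed as rates. Values come from this article; names are illustrative.

experiments_attempted = 193
experiments_replicated = 50
papers_attempted = 53
papers_replicated = 23

print(f"Experiments replicated: {experiments_replicated / experiments_attempted:.1%}")  # ~25.9%
print(f"Papers replicated: {papers_replicated / papers_attempted:.1%}")                 # ~43.4%

# "85% smaller on average" means the average replicated effect was roughly
# 15% of the size of the originally reported effect (on a normalised scale).
original_effect = 1.0                             # assumed unit-scale original effect
replicated_effect = original_effect * (1 - 0.85)
print(f"Replicated effect relative to original: {replicated_effect:.2f}")               # 0.15
```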

Impact

The project, along with broader action in response to the replication crisis, has helped spur changes in scientific culture and publishing practices.[6][7] The results of the Reproducibility Project might also affect public trust in psychology.[8][9] Lay people who learned about the low replication rate found in the Reproducibility Project subsequently reported lower trust in psychology than people who were told that a high proportion of the studies had replicated.[10][8]

See also

External links

References

  1. Open Science Collaboration (28 August 2015). "Estimating the reproducibility of psychological science". Science. 349 (6251): aac4716. doi:10.1126/science.aac4716. hdl:10722/230596. PMID 26315443. S2CID 218065162.
  2. Jarrett, Christian (27 August 2015). "This is what happened when psychologists tried to replicate 100 previously published findings". BPS Research Digest. Retrieved 8 November 2016.
  3. Weir, Kristen. "A reproducibility crisis?". American Psychological Association. Retrieved 24 November 2016.
  4. "Dozens of major cancer studies can't be replicated". Science News. 7 December 2021. Retrieved 19 January 2022.
  5. "Reproducibility Project: Cancer Biology". Center for Open Science. Retrieved 19 January 2022.
  6. Loken, Eric (8 April 2019). "The replication crisis is good for science". The Conversation. Retrieved 7 November 2023.
  7. Apple, Sam (22 January 2017). "The Young Billionaire Behind the War on Bad Science". Wired.
  8. Wingen, Tobias; Berkessel, Jana B.; Englich, Birte (24 October 2019). "No Replication, No Trust? How Low Replicability Influences Trust in Psychology". Social Psychological and Personality Science. 11 (4): 454–463. doi:10.1177/1948550619877412. ISSN 1948-5506. S2CID 210383335.
  9. Anvari, Farid; Lakens, Daniël (19 November 2019). "The replicability crisis and public trust in psychological science". Comprehensive Results in Social Psychology. 3 (3): 266–286. doi:10.1080/23743603.2019.1684822. ISSN 2374-3603.
  10. "The Replication Crisis Lowers The Public's Trust In Psychology — But Can That Trust Be Built Back Up?". BPS Research Digest. 31 October 2019. Retrieved 30 November 2019.