
Detecting Bad Science: Reviewing and Improving Social Science Research

This intensive course will make you a Bad Science Detective with a good purpose: to improve social science research.

In case you missed it: science is in a credibility crisis. At least half of researchers in the social and behavioural sciences in the Netherlands admit to having engaged in questionable research practices (Gopalakrishna et al., 2022a). Yes – for every pair of scholars you randomly choose, one is engaging in bad science. For every sixteen researchers you choose, one has even committed fraud or fabricated data. Though the number of retractions by academic journals for fraud, fabrication, plagiarism and other integrity violations is rising, most bad science still goes undetected. Peer review, commonly believed to be a quality control system, is in fact a lax one that is easy to fool (Smith, 2006). It should therefore be no surprise that half of all studies do not replicate, and that published effects shrink to half their original size upon replication (Open Science Collaboration, 2015). In sum: you cannot trust research to be valid and reliable, even when it is peer reviewed and published in the most prestigious journals.

How then can you tell the difference between good and bad science? What signals tell you something about the quality of research? As a bad science detective, you’ll be able to call bullshit on the texts that your professors require you to read – including their own work. At the same time, we will collectively improve the chances that bad science is identified. With a higher discovery rate of bad science, researchers will be more careful, and the quality of research will improve (Gopalakrishna et al., 2022b). In addition, by identifying the weaknesses in the work of others, you learn in which aspects you can improve your own research.

Professor René Bekkers

Prof. dr. René Bekkers is the director of the Center for Philanthropic Studies at the Department of Sociology of VU Amsterdam. He has been an Open Science advocate since 2012. He signs his peer reviews, publishes the research data he analyzes when possible, and posts preprints and all materials for re-use on the Open Science Framework. He preregisters research designs whenever possible. Since 2019 he has also published his grant applications upon submission, regardless of funding success. Read more about him on his blog.

Additional course information

  • Learning objectives

    At the end of this course, you will be able to use analytical tools and software to identify the weaknesses of research and evaluate the quality of research in the social and behavioural sciences. The primary analytical tool is iQUESST – identifying Questionable Social Science through Transparency. 

    The iQUESST acronym refers to the evaluation of research quality with respect to the:

    i: information on
    QU: the Question that the research answers – how informative would potential answers to the research question be for practice and theories?
    E: the Estimation method – is it able to provide an answer to the question, and is it the best choice?
    S: the Sample – is it useful to make inferences about the target population?
    S: the Stringency criterion – are the data and methods the best possible stress test of the research claims?
    T: through Transparency of the research – what does the research report tell you about the data and methods used to produce the results?

    The secondary analytical tool is the four validities framework (Vazire et al., 2022), to which iQUESST roughly corresponds as follows:

    1. Construct validity ≈ QUestion: poorly defined and badly operationalized constructs, ill-documented measures, and hypothesizing after results are known;

    2. Internal validity ≈ Estimation: selective attrition, non-causal mediation, lack of random assignment, reverse causality, incorrect chronology, omitted variable bias;

    3. External validity ≈ Sample: constraints on generality due to survivorship bias, selection bias, biased samples, or selective response;

    4. Statistical conclusion validity ≈ Stringency: problems with outliers, missing values, model misspecification, false assumptions, p-hacking, researcher degrees of freedom, the garden of forking paths, or low power.
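    The p-hacking and researcher-degrees-of-freedom problems listed above can be made concrete with a short simulation. The sketch below is hypothetical and not part of the course materials: each simulated "study" measures 20 independent outcomes for which the true effect is zero, and then reports the smallest p-value, as a researcher fishing for significance might.

    ```python
    # Hypothetical illustration of researcher degrees of freedom:
    # testing 20 null outcomes per study and reporting any p < .05.
    import math
    import random

    random.seed(42)

    def two_sided_p(successes, n, p0=0.5):
        """Normal-approximation two-sided p-value for a binomial test of rate p0."""
        se = math.sqrt(p0 * (1 - p0) / n)
        z = abs(successes / n - p0) / se
        # Two-sided tail probability from the standard normal CDF.
        return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

    def phacked_study(n_outcomes=20, n=100):
        """Every outcome is pure noise (fair coin flips); report the smallest p."""
        ps = [two_sided_p(sum(random.random() < 0.5 for _ in range(n)), n)
              for _ in range(n_outcomes)]
        return min(ps)

    false_positives = sum(phacked_study() < 0.05 for _ in range(1000))
    print(f"'Significant' findings in {false_positives} of 1000 null studies")
    ```

    With 20 tests at the 5% level, the family-wise false positive rate is roughly 1 − 0.95²⁰ ≈ 64%, so most of these studies "find" an effect that does not exist – which is why undisclosed flexibility in analysis undermines statistical conclusion validity.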

  • Forms of tuition and assessment

    The course consists of four meetings, on two days per week: Mondays and Thursdays. Course meetings take 4 hours, and are scheduled in the afternoon, from 13.00 to 17.00. Each hour is 50 minutes of class time, followed by a 10-minute break.

    Meeting 1, Monday 6 January
    We start by getting to know each other, reviewing the course design and activities, and discussing the credibility crisis. We get started with nominations for target papers and create a schedule for the presentations in the second and third meeting.

    Meeting 2, Thursday 9 January
    The second day is a workshop in which we discuss the first half of the target papers.

    Meeting 3, Monday 13 January
    On the third day we discuss the second half of the target papers.

    Meeting 4, Thursday 16 January
    We discuss improvements of the research designs and analyses of the target papers.

    Assessment

    You successfully complete the course if you’ve participated in the course meetings, demonstrated the ability to identify weaknesses in research reports discussed during the meetings and in the assignments, and submitted a sufficiently detailed and constructively critical review of a target paper. The review is sufficiently detailed if you can describe the quality of the research using iQUESST. You receive extra praise if you detect statistical anomalies, incorrect interpretations, plagiarism, data fabrication, or undisclosed deviations from preregistrations. We use the four eyes principle: you read the review composed by another participant and check if you understand the report (Bekkers, 2021).

  • Assignments

    1. Describe how your bachelor's or master's programme taught you to think about the quality of research. Which aspects of research did you learn indicate high quality, and which indicate low quality? Reflect on these criteria with reference to the readings for meeting 1. Which aspects of research that you learned about in your training do not necessarily indicate research quality, and by which criteria should they be replaced?
    2. Select a paper from the list of target papers compiled in the first meeting and read it thoroughly on your own.
    • Identify all possible weaknesses of the paper, starting from iQUESST and the examples of weaknesses given in the first meeting. Which study limitations did the authors note themselves? Which weaknesses does the paper have that the authors did not describe?
    • Search for replications of or commentaries on the target paper among the papers that cite it according to Google Scholar. What was the result of the replication? Which of the study weaknesses may have contributed to the replication showing results that differ from the original? Which weaknesses do studies citing the target paper identify?
    • Summarize the study weaknesses you have identified in a five-minute presentation. List all weaknesses and explain one of them in detail, so that participants who have not read the target paper can understand it.
    3. Study the target paper of another course participant and suggest improvements for the weaknesses you find. Did you find additional weaknesses? Explain to what extent and how the improvements can effectively repair the weaknesses you identified.
    4. Write a short report on the weaknesses of your target paper and suggest improvements. Explain how the improvements effectively repair the weaknesses. If, in your view, repairing the problems is not possible and the paper is a total loss, explain why.

We are here to help!

Feel free to contact us anytime.

Contact

  • Yota
  • Programme Coordinator
  • Celia
  • Summer and Winter School Officer
  • Esther
  • Summer and Winter School Officer
