Research – OSIRIS Work Packages (WP)

While over the past decade many interventions to improve reproducibility have been introduced, targeted at funders, publishers or individual researchers, only a few of them have been empirically tested. OSIRIS will do just that, testing existing and newly developed interventions, including Open Science practices, through controlled trials. The underlying drivers, barriers and incentives of reproducibility will also be studied in a systematic way. The aim is to deliver and disseminate guidance about evidence-based interventions that can improve the reproducibility of scientific findings, and to develop and test training to implement these.

OSIRIS has the following objectives:

To increase reproducibility

To understand the underlying drivers and effective interventions that increase reproducibility at the funding, publishing, university and researcher level, using systematic literature review, evidence mapping, policy audits, and interviews and focus group discussions with stakeholders. Results will be distributed through an open knowledge base and Open Access (OA) publications to reach academia worldwide.

To develop and test solutions

To develop and test effective, evidence-based solutions for the reproducibility crisis across various stakeholders in policy and research practice by conducting well-controlled Randomised Controlled Trials (RCTs) rather than mere pilots, developing dashboards of indicators of reproducible research practices, and providing funders, publishers, researchers and peer reviewers with guidance for judging reproducibility.

To create a collaborative community

To create a community of stakeholders that will help educate researchers and implement better reproducible research practices. Using our results, we will create guidelines and training on how researchers can embed reproducibility in the design of their research, and disseminate these widely, thereby increasing the reproducibility of scientific research. Additionally, we will perform quality audits at project and output levels to test these novel practices.

Reproducibility in research projects

To embed reproducibility in the strategy and design of research projects by informing researchers and convincing funders and journals to include measures and preconditions on reproducibility in their assessment of project proposals and articles.

Open Science Framework (OSF)

Open science is at the heart of the OSIRIS project. We use the Open Science Framework (OSF), a free and open-source project management tool, to develop, collaborate on, document, share and disseminate all our protocols, study materials and methodology documents. This makes the OSIRIS research transparent and contributes to the reproducibility of our research and the FAIRness of our data.

OSIRIS on OSF:

WP1: Coordination and Project Management

Here we will conduct the overall project coordination and risk management, and will monitor and report on the overall activities, progress, resources, quality, results and milestones in compliance with the Grant Agreement and, if necessary, propose and implement modifications. Specifically, we will:

  • Ensure constant dialogue, efficient knowledge exchange and communication across the consortium for a successful collaboration. Seek consensus and optimize interactions between parties and WPs.
  • Monitor, prevent, mitigate and report on project risks, installing mitigation strategies and managing them to minimise their impact on the project.
  • Coordinate compliance with all ethics requirements and oversee the budget.

WP1 will facilitate the smooth and timely implementation of the OSIRIS project work plan to achieve its objectives. This WP will result in the OSIRIS Project Handbook, including templates, procedures, meetings and planning of activities, as well as a Data Management Plan.

OSF links WP1:

WP2: Drivers, barriers and facilitators for reproducibility of research

Here we investigate the underlying drivers and effective interventions that increase reproducibility at the funding, publishing, university and researcher level, and compile and map evidence on reproducibility practice and what it means to different stakeholders, thus creating a strong knowledge base.

Systematic scoping review and evidence mapping on reproducibility measures

Many interventions, especially those linked to open science, have been proposed to combat the reproducibility crisis. To what extent these proposals are based on scientific evidence from empirical evaluations is not clear.

We aim to identify interventions that have been formally investigated regarding their influence on reproducibility and replicability. A secondary aim is to list any facilitators or barriers reported, and to identify gaps in the evidence.

We will search broadly, using electronic bibliographic databases, broad internet searches and contacts with experts in the fields of reproducibility, replicability and open science. We will include any study investigating interventions evaluated for their influence on the reproducibility of research, as well as drivers and barriers to the implementation and effectiveness of such interventions. We will analyse the existing scientific evidence using scoping review and evidence gap mapping methodologies.

Interviews with researchers

We will interview 100 researchers across Europe in order to understand their views, motivations, personal practices and barriers regarding reproducibility. This will include how they personally define and view reproducibility, what reproducibility means in their research, their experiences within their own research domain and science in general, how they execute reproducibility and the drivers, facilitators and barriers towards reproducibility.

Focus group discussions with stakeholders

During a series of focus group discussions with stakeholders we will explore their views on and needs for reproducibility in research, and the interventions they currently apply or promote or would like to see applied. As stakeholder groups we identify research funders, publishers, government agencies, reproducibility action groups such as ReproducibiliTea clubs and National Reproducibility Networks, reproducibility officers and other specialised careers that promote reproducibility and civil society organisations.

OSF links WP2:

WP3: Interventions to improve reproducibility for researchers and institutions

Here we develop and test effective, evidence-based interventions that increase transparency and reproducibility.

Intervention development

The first intervention is the set-up of institutional networks in charge of reproducibility checks in consortium partner institutions. Two different networks of early career researchers (ECRs) will be created to address two distinct and complementary facets of reproducibility: computational reproducibility and methods reproducibility. These networks will use a Delphi process to develop specific modules/checklists for computational/methods reproducibility (general and field-specific if relevant) and for the “FAIRification” of data to maximise re-use of the data by others.

The second intervention will rely on the use of existing indicators of reproducible research practices in an observatory, i.e. an online dashboard showing several reproducibility parameters at a glance and for consecutive years. The observatory will be used at both the team and institutional levels.
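
As an illustration only, a minimal sketch of how such yearly indicators could be aggregated from publication records is given below. This is not part of the OSIRIS workplan; the record fields and indicator names (data sharing, code sharing, preregistration) are hypothetical examples of the kind of parameters such a dashboard might show.

    from collections import defaultdict

    # Hypothetical publication records; the field names are illustrative only.
    publications = [
        {"year": 2022, "data_shared": True,  "code_shared": False, "preregistered": True},
        {"year": 2022, "data_shared": False, "code_shared": False, "preregistered": False},
        {"year": 2023, "data_shared": True,  "code_shared": True,  "preregistered": True},
    ]

    def yearly_indicators(records):
        """Aggregate simple reproducibility indicators per year: the share of
        outputs with shared data, shared code and a preregistration."""
        by_year = defaultdict(list)
        for rec in records:
            by_year[rec["year"]].append(rec)
        dashboard = {}
        for year, recs in sorted(by_year.items()):
            n = len(recs)
            dashboard[year] = {
                "n_outputs": n,
                "data_shared_pct": 100 * sum(r["data_shared"] for r in recs) / n,
                "code_shared_pct": 100 * sum(r["code_shared"] for r in recs) / n,
                "preregistered_pct": 100 * sum(r["preregistered"] for r in recs) / n,
            }
        return dashboard

    print(yearly_indicators(publications))

In a real observatory such indicators would be computed from institutional records and displayed per team or institution for consecutive years.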

Intervention testing

After developing the interventions, we will test them empirically to demonstrate their usefulness. Using a meta-research approach, we will show in randomised controlled trials that the proposed interventions increase transparency as well as the proportion of reproducible results in those institutions.

OSF links WP3:

WP4: Interventions to improve reproducibility for funders and journals

Here we will develop and evaluate automatic systems for checking compliance with reproducibility requirements that can be used by publishers and/or funders. We will also identify guidance and policies that can be implemented by funders.

Checklist to assess the level of reproducibility

We will develop a checklist to assess the level of reproducibility of research output at a funding/journal level and apply this checklist to research published or funded by specific journals/funders. This can be seen as a pilot study for the Observatory in WP3, and the results of the assessment will be used as input in WP3.

Observational study

We will carry out an observational study to associate reproducible outcomes with facilitating mechanisms. To this end, we will collect a set of reproducible and non-reproducible studies and compare whether and which measures were applied by the journal or the funders of each study. This information will feed into a causal diagram of possibly associated measures and allow us to control for confounding factors such as discipline, topic, gender, and whether the research is qualitative or quantitative.

Intervention for publishers

We will randomise incoming manuscripts in one or two journals in different disciplines (PLoS One and a journal in a different discipline) to be scrutinised for reproducibility or not. We will then assess the level of reproducibility in the two experimental groups, using manual checks and using SciScore (an automated machine-learning tool) at the peer review stage. In these experimental arms, the reproducibility report will be sent back together with the peer review report.
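
As an illustration only, a minimal sketch of a simple block randomisation of incoming manuscripts is given below. The arm names (no extra check, manual check, SciScore check) and the manuscript identifiers are hypothetical; the actual OSIRIS trial design is described above only at a high level.

    import random

    # Hypothetical trial arms based on the description above: no extra scrutiny,
    # a manual reproducibility check, or an automated SciScore check.
    ARMS = ["control", "manual_check", "sciscore_check"]

    def allocate(manuscript_ids, seed=42):
        """Allocate manuscripts to arms in shuffled blocks of one slot per arm,
        keeping group sizes balanced (simple block randomisation)."""
        rng = random.Random(seed)
        allocation = {}
        block = []
        for ms_id in manuscript_ids:
            if not block:
                block = ARMS[:]      # start a new block with one slot per arm
                rng.shuffle(block)
            allocation[ms_id] = block.pop()
        return allocation

    # Example with made-up manuscript identifiers.
    print(allocate([f"MS-{i:03d}" for i in range(1, 7)]))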

Intervention for funders

We will develop a checklist to assess the level of reproducibility of research proposals. This checklist will be shared with grant proposal peer reviewers, who will user-test whether the checklist can be used to assess the anticipated reproducibility of future research output. Reviewers will be randomised to receive the checklist or not, and the difference between the two groups in the perceived reproducibility of the granted proposals will be analysed.

OSF links WP4:

WP5: Training and guidance to increase reproducibility

In WP5 we will gather all the knowledge from the other work packages. Through a process of facilitated co-creation, design and user-testing, we will develop effective and impactful resources that can be used to train researchers in how they can increase the reproducibility of their research.

Researcher roundtable

We will run a guided discussion with the research team and the advisory board to reflect on the issues raised by the research on the drivers, barriers and facilitators of reproducibility, and on examples of effective and successful interventions that improve the reproducibility of research. With the output of this roundtable, we will develop a prototype of training resources for the three types of stakeholders (funders, individual researchers and journals).

Co-creation of training resources

In three workshops we will discuss the prototype training resource developed in the researcher roundtable. Through a facilitated discussion we will test the language, understanding and usefulness of the prototype material, and identify questions and concerns that will guide further development of the training materials.

Co-design user testing workshop

We will hold three workshops to capture ideas for the effective delivery of the communication materials, including the contexts in which the materials can be used.

Testing and dissemination of the training package

In this last step, we will test the effectiveness of the training package in a randomised controlled trial, after which we will develop a MOOC (massive open online course) to disseminate the training material.

OSF links WP5:

WP6: Communication and Dissemination

Here we will further highlight, showcase and ensure the evidence-based impact of OSIRIS throughout the duration of the project and beyond. We will produce a strategic Dissemination and Communication Plan (DCP) to detail the communication and dissemination activities of OSIRIS and share its successes and lessons learned. We will also raise awareness of the objectives and achievements of OSIRIS through traditional (e.g., print) and non-traditional (e.g., social media platforms) means of communication as well as the project website. In addition, we will map out, engage and maintain a strong stakeholder network through a series of events, both online and face to face, and will develop an exploitation roadmap towards further validation and sustainability of OSIRIS.

OSF links WP6:

Information for interviewees and focus group participants:

What is the project about?

Reproducibility is crucial to the progress and impact of Research and Innovation (R&I), as it confirms or corrects the outcomes of single studies, resulting in higher quality research, more reliable and implementable outcomes, and reduced research costs. There is limited evidence as to what works to improve reproducibility and how practices to improve reproducibility are implemented.

The EU-funded Horizon Europe project Open Science to Increase Reproducibility In Science (OSIRIS) aims to gather knowledge on the underlying drivers of reproducibility, test effective evidence-based solutions, identify incentives for reproducibility by stakeholders, and embed reproducibility in research design.

In order to understand the underlying drivers and inform development of effective interventions that increase reproducibility, we will carry out in-depth semi-structured interviews and focus group discussions with researchers and other stakeholders from various institutions in Europe.

How will you be involved?

If you agree to participate in this study, you will be asked to schedule an online (Microsoft Teams) or in-person interview led by a research team member from KU Leuven, the University of Oxford or UMC Utrecht. The questions will focus on your views, motivations, personal practices, and the facilitators or barriers you face regarding reproducibility in research. This will include how you define and view reproducibility in science, and what your experiences are in your own field of research and in science in general. The interview will typically be held in English; however, depending on our internal availability, we may be able to accommodate non-English interviews. We expect an interview to take about 60 minutes, though shorter or longer sessions can be accommodated during scheduling. All interviews will be audio-recorded, transcribed and anonymised. Afterwards, the transcript will be sent to you for interviewee transcript review (ITR), in which you can offer clarifications or request additional anonymisation of the transcript.

If you are participating in a focus group discussion, you will join an in-person or online focus group led by a research team member from KU Leuven, the University of Oxford or the Mario Negri Institute. The discussion will focus on views on and needs for reproducibility in research, barriers and facilitators, and the interventions different stakeholder groups currently apply or promote or would like to see applied. We expect focus group discussions to take 60-120 minutes. Discussions will be audio-recorded, transcribed and anonymised. Afterwards, the transcript will be sent to you for review.

Use of your personal data

Personal data about you collected during the interviews or focus group will be processed in accordance with the General Data Protection Regulation (GDPR). Only personal data required for the purposes of this study will be collected and processed. The data will be processed on the basis of public interest. This means that the research will lead to advances in knowledge and generate insights that (directly or indirectly) benefit society.

Digital recordings of interviews or focus groups will be stored on a secure server managed by the institution of the interviewer (KU Leuven, University of Oxford, Mario Negri Institute or UMC Utrecht, as applicable). Each interviewee will be assigned a unique, coded identifier that will be used in all records instead of actual identifiers in order to pseudonymise the information. A record that links each coded identifier to the actual interviewee name will be maintained separately in a password-protected file on a secure server. Transcripts will be anonymised and stored on a secure Teams and SharePoint platform managed by KU Leuven to facilitate collaborative analysis by the research team across KU Leuven, University of Oxford and UMC Utrecht.

After completion of the study, all audio recordings and all personal data held about interviewees and participants will be deleted.

Your rights

You have the right to request more information about the use of your data. In addition, you have the right to access, rectify or erase your data unless exercising these rights would render impossible or seriously impair the achievement of the research objectives.

If you wish to exercise one of these rights, please contact the researchers using the contact details at the bottom of this information sheet.

How will the collected information be used?

The results of this study are expected to be used in different outputs, such as publications, including reports or website content, or presentations at scientific meetings. In any sort of publicly available output, we will not include any information that will make it possible to identify you unless you give your permission.

Anonymised interview and focus group transcripts, reviewed by interviewees and participants, will be deposited and published in the KU Leuven Research Data Repository (RDR) to make the materials available for future use in research and education. After 10 years we will decide whether it is necessary to keep these data for a longer time. When further preservation is no longer necessary, the data will be deleted.

Can I withdraw from the project?

Your participation is completely voluntary and you may withdraw from the research at any time. You do not have to answer any questions you do not want to answer during the interview. If you decide to withdraw, we can remove your interview from the study up until the results are published. If you choose not to be in this study, it will not affect your current or future relations with KU Leuven or any other project consortium member.

What are the benefits and risks?

Participating in this research will allow you to reflect on your research process and your experiences with barriers and incentives towards reproducibility in science.

The results of the project will benefit the scientific community by working towards reforming the R&I system so that reproducibility is more widely accepted, practised and recognised in global scientific practice by 2026. In addition, researchers and policy-makers will get access to the wealth of data that OSIRIS will collect.

Should any of the topics discussed cause you distress, we will stop the interview.

While we will take all precautions to keep your personal information secure and confidential and to anonymise all interviews and focus group discussions, there remains a small risk that your identity could be ascertained or that a data breach could occur.

Where can I get more information?

If you have questions about this research, you may contact one of the following KU Leuven researchers: Veerle Van den Eynden, Research Coordination Office, at veerle.vandeneynden@kuleuven.be, or Magdalena Kozula, Faculty of Psychology and Educational Sciences, at magdalena.kozula@kuleuven.be.