
Evidence Synthesis: Step 1: Preparation

Staying Organized

  • Decide on a team communication strategy and space for shared work. This might look like:
    • Microsoft Teams site
    • Shared One Drive or Google Drive folders
    • Slack channel
    • Regular Zoom meetings
    • Trello or another project management tool
  • Decide on a citation manager to use and create a dedicated folder or library for the project
    • Classes are offered regularly through HSL
  • Create a Covidence account with your UNC email 
    • Classes are offered regularly through HSL
  • Begin documenting your work in a Word or Google document. You will use these details when writing your manuscript, so be as comprehensive as possible.

Questions to Ask Before Starting

What is the goal of your review?

If you are doing a review for a class, thesis, capstone, or dissertation, see the systematized review page in this guide. Evidence syntheses like systematic and scoping reviews are typically intended for publication in a journal. 

Do you have the time to do a full project? 

On average, a systematic review takes roughly 67 weeks (about 15 months) from start to publication. Keep in mind that this can vary based on how many people are on the team, how many results have to be screened, whether there is funding to support the project, and other factors. If you don't have 12-24 months to work on it, you may want to opt for a systematized review instead.

Do you have team members to help you?

You should have at least 3 people on your team. This is especially important during the screening and data extraction steps, and it helps mitigate bias in the review. 

Does your team have the proper expertise?

View the section for Creating a Team (below) to see the types of expertise needed for an evidence synthesis project.

Has a review on the topic already been done? 

If a review on your topic has been completed recently or is in progress, you'll want to change the scope of your topic. It's important to show that your work is contributing something meaningful to the existing knowledge base.

If a review on your topic exists but was done poorly, or if new evidence has been published since the review was completed, it may be justifiable to do another review on the same topic. 

Defining Your Research Question

Defining the parameters of your question is an important first step. This requires identifying what will be included and what won't be included in the review. 

Reasons to define your question:

  • Transparency: Tells readers what question you plan to answer and how you plan to answer it
  • Organization and Focus: A set of clear guidelines keeps you focused on the main question and ensures you don't waste time and energy on tangential questions that could muddy the results or overwhelm the team
  • Defining Important Concepts: Scoping your question involves defining and clarifying key concepts. This is important during the searching and screening processes. 

 

Research Question Frameworks

You might find it useful to refer to a question framework as you define your research question. There are many frameworks, and each framework presents a different set of concepts to consider in your question.

Note that some question frameworks are better suited to specific disciplines than others, and no single framework is best for all cases. If none of the frameworks below fits your question, there are at least 25 established frameworks to choose from. Sometimes a good research question doesn't fit neatly into any established framework, and that is okay. A framework is simply a guideline to help you decide which concepts are most important for your end goal.

Listed below are some popular research question frameworks for social science disciplines, along with examples.

 

PEO

  • Used for qualitative research questions

Research question: What kinds of interventions or programs can improve literacy rates for low-income elementary school students with dyslexia?

Concept | Definition | Example
Population | The specific demographic group the research question focuses on | Elementary school students from low-income households
Exposure or Experience | The broad phenomenon that the population group lives with or is affected by | Dyslexia
Outcome | The specific topic(s) or end result(s) being analyzed | Interventions for improving literacy

 

PICO

  • Used for quantitatively assessing the effectiveness of a specific intervention 
  • Works well for assessing clinical trials

Research question: Does listening to text while reading (audio-assisted reading) improve literacy rates for low-income elementary school students with dyslexia?

Concept | Definition | Example
Population | The specific demographic group the research question focuses on and the experience they are living with | Elementary school students from low-income households with dyslexia
Intervention | The specific method or approach being assessed to determine if it improves the experience of the population | Listening to text while reading (audio-assisted reading)
Comparison | The alternative method or approach that already exists or is used more widely than the intervention, and against which the success of the intervention is measured | Not listening to text while reading
Outcome | The specific end goal being assessed | Literacy rates

 

SPICE

  • Used for qualitatively assessing the effectiveness of an intervention 

Research question: How likely are low-income elementary students with dyslexia to use an audio-assisted reading strategy for their reading assignments?

Concept | Definition | Example
Setting | Where the phenomenon is taking place | Elementary school
Perspective or Population | The group whose opinions or feelings are being considered in order to assess the effectiveness of the intervention | Students with dyslexia
Intervention | The method or approach being assessed | Listening to text while reading (audio-assisted reading)
Comparison | The alternative method or approach that already exists or is used more widely than the intervention, and against which the success of the intervention is measured | Not listening to text while reading
Evaluation | The qualitative concept being used to measure the success of the intervention | Students' attitudes about audio-assisted reading

 

SPIDER

  • Used for assessing qualitative or mixed-methods research

Research question: What are the attitudes of elementary school students with dyslexia from low-income households toward using audio-assisted reading technology?

Concept | Definition | Example
Sample | The specific demographic group the research question focuses on and the experience they are living with | Elementary school students from low-income households with dyslexia
Phenomenon of Interest | The specific method or approach being assessed (i.e., the intervention) | Listening to text while reading (audio-assisted reading)
Design | The methods used (i.e., study design) to gather qualitative data about the population sample and phenomenon of interest | Surveys
Evaluation | The specific qualitative end goal being assessed (i.e., the outcome) | Attitudes
Research type | The type(s) of studies being assessed, either qualitative or mixed methods | Mixed methods

 

Evidence Synthesis Methodology

There are two types of guidelines for evidence syntheses: guidelines for conducting a review, and guidelines for reporting a review.  

  • Methodological guidelines provide guidance on planning, organizing, and conducting your review.
  • Reporting guidelines specify what information needs to be shared in your final publication, and how.

Methodological Guidelines

The methodological guidelines you choose should reflect your area of research and the type of evidence synthesis you are conducting. You may consult more than one of the handbooks listed below during your project. Here are some guidelines that are frequently used when conducting reviews across various disciplines:
 

Social Sciences

 

Environmental Sciences

 

Health Sciences and Health-Adjacent Social Sciences

Reporting Guidelines

Most evidence syntheses are reported according to the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. PRISMA guidelines provide a checklist and flow diagram template to help authors report their findings systematically and transparently.

PRISMA 2020 checklist: Word document where you will indicate the page on which each item is discussed in your final review.

PRISMA diagram: Word document template where you will report specific numbers related to your search and screening processes.

 

PRISMA Checklist Extensions: Various PRISMA extensions provide additional guidance. Some are used to supplement the main PRISMA 2020 checklist, and others are used in lieu of it. When using one of these extension checklists, you will still use the original PRISMA flow diagram template to report on your search and screening results. 

  • PRISMA for abstracts: Short checklist of items that should be in your review's abstract. Used in conjunction with the PRISMA 2020 checklist.
  • PRISMA Search (PRISMA-S): Short checklist of items that should be included when reporting your search process. Used in conjunction with the PRISMA 2020 checklist.
  • PRISMA for protocols (PRISMA-P): Checklist of items that should be included in a systematic review protocol. 
  • PRISMA for scoping reviews (PRISMA-ScR): Similar to the main PRISMA checklist, but adjusted slightly for scoping reviews. Used in lieu of the PRISMA 2020 checklist.
  • PRISMA for harms: For reviews focused on reporting harms. Used in conjunction with the PRISMA 2020 checklist.
  • PRISMA for Ecology and Evolutionary Biology (PRISMA-EcoEvo): For reviews in the ecology and evolutionary biology disciplines. Used in lieu of the PRISMA 2020 checklist.
  • PRISMA Equity (PRISMA-E): For reviews focused on health equity. Used in lieu of the PRISMA 2020 checklist.

 

PRISMA is the most well-known reporting guideline, but it is not the only one. 

  • ROSES (RepOrting standards for Systematic Evidence Syntheses in environmental research) 

 

Tips for Using Guidelines

  • When writing your manuscript, avoid saying that your review was conducted according to PRISMA guidelines. PRISMA provides guidelines for reporting a review, not conducting a review. 
  • Currently there are no guidelines for conducting a systematized review. You can reference any of the methodological guidelines listed above and say that your methodology is loosely based on it. Many systematized reviews include a PRISMA diagram as well.

Preliminary Searching (Finding Existing Reviews)

Reasons to look for existing reviews before beginning your project:

  1. Determine if a review on your topic has already been done. Novel research adds something valuable to the field, is more likely to be accepted for publication, and is a good use of time and resources.
  2. Identify existing reviews on similar topics that can help inform your approach.

It is important to search for both completed reviews and reviews in progress. Reviews in progress can be identified by searching preprint servers and protocol registries (listed below). If you identify an in-progress review that matches your topic, you can contact the lead author to ask whether it is actively in progress or whether work has been halted, which may open the possibility of proceeding with your review as planned.

If a review on your topic has already been completed by another team or is in progress, consider modifying your question to address a different aspect of the issue that hasn't been reviewed. For ideas on how to modify a research question, see the inclusion/exclusion criteria listed below.

Published Reviews

Protocols for Reviews in Progress

Creating a Team

You should have a team of at least 3 people for your review (not including a librarian/information specialist). This is important during the screening and data extraction steps, where two people will screen and extract the same information while the third person acts as a tie breaker and resolves disagreements.

It is also important to ensure that your team members possess the skills necessary for various roles. These roles can include:

  • Subject matter experts: at least two people on the team should be subject matter experts on the topic being reviewed.
  • Methods experts: at least one person should have experience doing an evidence synthesis, understand the methods, and be able to guide team members through the steps.  
  • Search expert (or librarian): at least one person who can create an advanced search strategy and translate it for all relevant databases.  
  • Statistician and/or data visualization expert: if necessary.
  • Project manager: one person should be in charge of facilitating communication, arranging meetings, and keeping team members on track.

Lastly, consider whether the members of your team bring a variety of perspectives: professional and educational backgrounds, subject matter expertise, institutional affiliations, and even geographic locations. A team with a variety of experiences and perspectives can help mitigate bias.

Tips for Building a Team

  • It's okay if some members are more active at different parts of the review process. Team members may have specialized expertise that only applies to certain parts.
  • While it's helpful to identify your team at the beginning of your project, you may want to add additional members as the review progresses. Likewise, an existing member may need to drop out of the review and be replaced after work has begun. 
  • The fewer team members you have, the more work each member will have to do and the more expertise each member will need. The more team members you have, the more important it will be to establish clear communication and training procedures.
  • If you need access to a specific subscription database or tool that is not accessible through UNC, you will want to identify a potential team member at another institution who can provide access.

Inclusion & Exclusion Criteria

Your inclusion and exclusion criteria will be based on your research question and will dictate which citations are passed through the screening stage. Criteria can pertain to the content of a publication (e.g., population demographic discussed) or can be about the publication itself (e.g., language the publication is written in). Your criteria should be defined before you start searching. They can be listed in clear, concise bullet points. 

Common criteria to consider:

  • Population demographics (age, gender, race, ethnicity, country of origin, country of residence, etc.)
  • Geographic region
  • Setting (primary schools, public schools, etc.)
  • Language of publication 
  • Study types (randomized controlled trials, narrative reviews, case studies, etc.)
  • Resource types (journal articles, dissertations, government documents, trade journal articles, editorials, etc.)
  • Date of publication
  • Intervention and/or outcome that your research question focuses on

You do not need to address every criterion listed above; only include the items that pertain to your research question.

Sources

Anderson, P. F., & Booth, A. (2022). Question Frameworks. In M. J. Foster & S. T. Jewell (Eds.), Piecing Together Systematic Reviews and Other Evidence Syntheses (pp. 45-56).

Booth, A., Noyes, J., Flemming, K., Moore, G., Tunçalp, Ö., & Shakibazadeh, E. (2019). Formulating questions to explore complex interventions within qualitative evidence synthesis. BMJ Global Health, 4(Suppl 1), e001107. https://pmc.ncbi.nlm.nih.gov/articles/PMC6350737

Borah, R., Brown, A. W., Capers, P. L., & Kaiser, K. A. (2017). Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open, 7(2), e012545.

Townsend, W. A., Capellari, E. C., & Allee, N. J. (2022). Project and Data Management. In M. J. Foster & S. T. Jewell (Eds.), Piecing Together Systematic Reviews and Other Evidence Syntheses (pp. 71-91).

Vela, K., & McCall-Wright, P. (2022). Related Reviews in Context. In M. J. Foster & S. T. Jewell (Eds.), Piecing Together Systematic Reviews and Other Evidence Syntheses (pp. 57-70).

Content on this page was developed with assistance from UNC's Health Sciences Library and the Medical University of South Carolina (MUSC) Libraries.