
Research Methodology – Types, Examples and Writing Guide

Research Methodology

Definition:

Research Methodology refers to the systematic and scientific approach used to conduct research, investigate problems, and gather data and information for a specific purpose. It involves the techniques and procedures used to identify, collect, analyze, and interpret data to answer research questions or solve research problems. It also encompasses the philosophical and theoretical frameworks that guide the research process.

Structure of Research Methodology

Research methodology formats can vary depending on the specific requirements of the research project, but the following is a basic example of a structure for a research methodology section:

I. Introduction

  • Provide an overview of the research problem and the need for a research methodology section
  • Outline the main research questions and objectives

II. Research Design

  • Explain the research design chosen and why it is appropriate for the research question(s) and objectives
  • Discuss any alternative research designs considered and why they were not chosen
  • Describe the research setting and participants (if applicable)

III. Data Collection Methods

  • Describe the methods used to collect data (e.g., surveys, interviews, observations)
  • Explain how the data collection methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or instruments used for data collection

IV. Data Analysis Methods

  • Describe the methods used to analyze the data (e.g., statistical analysis, content analysis)
  • Explain how the data analysis methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or software used for data analysis

V. Ethical Considerations

  • Discuss any ethical issues that may arise from the research and how they were addressed
  • Explain how informed consent was obtained (if applicable)
  • Detail any measures taken to ensure confidentiality and anonymity

VI. Limitations

  • Identify any potential limitations of the research methodology and how they may impact the results and conclusions

VII. Conclusion

  • Summarize the key aspects of the research methodology section
  • Explain how the research methodology addresses the research question(s) and objectives

Research Methodology Types

Types of Research Methodology are as follows:

Quantitative Research Methodology

This is a research methodology that involves the collection and analysis of numerical data using statistical methods. This type of research is often used to study cause-and-effect relationships and to make predictions.

Qualitative Research Methodology

This is a research methodology that involves the collection and analysis of non-numerical data such as words, images, and observations. This type of research is often used to explore complex phenomena, to gain an in-depth understanding of a particular topic, and to generate hypotheses.

Mixed-Methods Research Methodology

This is a research methodology that combines elements of both quantitative and qualitative research. This approach can be particularly useful for studies that aim to explore complex phenomena and to provide a more comprehensive understanding of a particular topic.

Case Study Research Methodology

This is a research methodology that involves in-depth examination of a single case or a small number of cases. Case studies are often used in psychology, sociology, and anthropology to gain a detailed understanding of a particular individual or group.

Action Research Methodology

This is a research methodology that involves a collaborative process between researchers and practitioners to identify and solve real-world problems. Action research is often used in education, healthcare, and social work.

Experimental Research Methodology

This is a research methodology that involves the manipulation of one or more independent variables to observe their effects on a dependent variable. Experimental research is often used to study cause-and-effect relationships and to make predictions.

Survey Research Methodology

This is a research methodology that involves the collection of data from a sample of individuals using questionnaires or interviews. Survey research is often used to study attitudes, opinions, and behaviors.

Grounded Theory Research Methodology

This is a research methodology that involves the development of theories based on the data collected during the research process. Grounded theory is often used in sociology and anthropology to generate theories about social phenomena.

Research Methodology Example

An Example of Research Methodology could be the following:

Research Methodology for Investigating the Effectiveness of Cognitive Behavioral Therapy in Reducing Symptoms of Depression in Adults

Introduction:

The aim of this research is to investigate the effectiveness of cognitive-behavioral therapy (CBT) in reducing symptoms of depression in adults. To achieve this objective, a randomized controlled trial (RCT) will be conducted using a mixed-methods approach.

Research Design:

The study will follow a pre-test and post-test design with two groups: an experimental group receiving CBT and a control group receiving no intervention. The study will also include a qualitative component, in which semi-structured interviews will be conducted with a subset of participants to explore their experiences of receiving CBT.

Participants:

Participants will be recruited from community mental health clinics in the local area. The sample will consist of 100 adults aged 18-65 years old who meet the diagnostic criteria for major depressive disorder. Participants will be randomly assigned to either the experimental group or the control group.
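As a rough illustration of the random assignment described above, here is a minimal Python sketch. The participant IDs, the fixed seed, and the 50/50 split are hypothetical; an actual trial would follow its own pre-registered randomization procedure.

```python
# Minimal sketch: randomly assigning 100 recruited participants to the two study arms.
# IDs, seed, and split are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=42)          # fixed seed so the allocation is reproducible
participant_ids = np.arange(1, 101)           # 100 recruited adults

# Shuffle the IDs and split them 50/50 into the two arms.
shuffled = rng.permutation(participant_ids)
experimental_group = np.sort(shuffled[:50])   # will receive CBT
control_group = np.sort(shuffled[50:])        # no intervention

print("Experimental (CBT) group:", experimental_group[:5], "...")
print("Control group:           ", control_group[:5], "...")
```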

Intervention:

The experimental group will receive 12 weekly sessions of CBT, each lasting 60 minutes. The intervention will be delivered by licensed mental health professionals who have been trained in CBT. The control group will receive no intervention during the study period.

Data Collection:

Quantitative data will be collected through the use of standardized measures such as the Beck Depression Inventory-II (BDI-II) and the Generalized Anxiety Disorder-7 (GAD-7). Data will be collected at baseline, immediately after the intervention, and at a 3-month follow-up. Qualitative data will be collected through semi-structured interviews with a subset of participants from the experimental group. The interviews will be conducted at the end of the intervention period, and will explore participants’ experiences of receiving CBT.

Data Analysis:

Quantitative data will be analyzed using descriptive statistics, t-tests, and mixed-model analyses of variance (ANOVA) to assess the effectiveness of the intervention. Qualitative data will be analyzed using thematic analysis to identify common themes and patterns in participants’ experiences of receiving CBT.
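To make one piece of this analysis plan concrete, here is a minimal Python sketch of the between-group comparison: descriptive statistics and an independent-samples t-test on post-intervention BDI-II scores. The scores below are simulated purely for illustration; the actual study would use the collected data and would also fit the mixed-model ANOVA across all three time points.

```python
# Minimal sketch of the quantitative analysis step: descriptive statistics and an
# independent-samples t-test comparing post-intervention BDI-II scores between arms.
# The data are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
bdi_post_cbt = rng.normal(loc=14, scale=6, size=50)      # hypothetical post-test scores, CBT arm
bdi_post_control = rng.normal(loc=22, scale=6, size=50)  # hypothetical post-test scores, control arm

# Descriptive statistics for each group
for label, scores in [("CBT", bdi_post_cbt), ("Control", bdi_post_control)]:
    print(f"{label}: mean={scores.mean():.1f}, sd={scores.std(ddof=1):.1f}")

# Independent-samples t-test (Welch's version, which does not assume equal variances)
t_stat, p_value = stats.ttest_ind(bdi_post_cbt, bdi_post_control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```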

Ethical Considerations:

This study will comply with ethical guidelines for research involving human subjects. Participants will provide informed consent before participating in the study, and their privacy and confidentiality will be protected throughout the study. Any adverse events or reactions will be reported and managed appropriately.

Data Management:

All data collected will be kept confidential and stored securely using password-protected databases. Identifying information will be removed from qualitative data transcripts to ensure participants’ anonymity.

Limitations:

One potential limitation of this study is that it only focuses on one type of psychotherapy, CBT, and may not generalize to other types of therapy or interventions. Another limitation is that the study will only include participants from community mental health clinics, which may not be representative of the general population.

Conclusion:

This research aims to investigate the effectiveness of CBT in reducing symptoms of depression in adults. By using a randomized controlled trial and a mixed-methods approach, the study will provide valuable insights into the mechanisms underlying the relationship between CBT and depression. The results of this study will have important implications for the development of effective treatments for depression in clinical settings.

How to Write Research Methodology

Writing a research methodology involves explaining the methods and techniques you used to conduct research, collect data, and analyze results. It’s an essential section of any research paper or thesis, as it helps readers understand the validity and reliability of your findings. Here are the steps to write a research methodology:

  • Start by explaining your research question: Begin the methodology section by restating your research question and explaining why it’s important. This helps readers understand the purpose of your research and the rationale behind your methods.
  • Describe your research design: Explain the overall approach you used to conduct research. This could be a qualitative or quantitative research design, experimental or non-experimental, case study or survey, etc. Discuss the advantages and limitations of the chosen design.
  • Discuss your sample: Describe the participants or subjects you included in your study. Include details such as their demographics, sampling method, sample size, and any exclusion criteria used.
  • Describe your data collection methods: Explain how you collected data from your participants. This could include surveys, interviews, observations, questionnaires, or experiments. Include details on how you obtained informed consent, how you administered the tools, and how you minimized the risk of bias.
  • Explain your data analysis techniques: Describe the methods you used to analyze the data you collected. This could include statistical analysis, content analysis, thematic analysis, or discourse analysis. Explain how you dealt with missing data, outliers, and any other issues that arose during the analysis.
  • Discuss the validity and reliability of your research: Explain how you ensured the validity and reliability of your study. This could include measures such as triangulation, member checking, peer review, or inter-coder reliability.
  • Acknowledge any limitations of your research: Discuss any limitations of your study, including any potential threats to validity or generalizability. This helps readers understand the scope of your findings and how they might apply to other contexts.
  • Provide a summary: End the methodology section by summarizing the methods and techniques you used to conduct your research. This provides a clear overview of your research methodology and helps readers understand the process you followed to arrive at your findings.

When to Write Research Methodology

Research methodology is typically written after the research proposal has been approved and before the actual research is conducted. It should be written prior to data collection and analysis, as it provides a clear roadmap for the research project.

The research methodology is an important section of any research paper or thesis, as it describes the methods and procedures that will be used to conduct the research. It should include details about the research design, data collection methods, data analysis techniques, and any ethical considerations.

The methodology should be written in a clear and concise manner, and it should be based on established research practices and standards. It is important to provide enough detail so that the reader can understand how the research was conducted and evaluate the validity of the results.

Applications of Research Methodology

Here are some of the applications of research methodology:

  • To identify the research problem: Research methodology is used to identify the research problem, which is the first step in conducting any research.
  • To design the research: Research methodology helps in designing the research by selecting the appropriate research method, research design, and sampling technique.
  • To collect data: Research methodology provides a systematic approach to collect data from primary and secondary sources.
  • To analyze data: Research methodology helps in analyzing the collected data using various statistical and non-statistical techniques.
  • To test hypotheses: Research methodology provides a framework for testing hypotheses and drawing conclusions based on the analysis of data.
  • To generalize findings: Research methodology helps in generalizing the findings of the research to the target population.
  • To develop theories: Research methodology is used to develop new theories and modify existing theories based on the findings of the research.
  • To evaluate programs and policies: Research methodology is used to evaluate the effectiveness of programs and policies by collecting data and analyzing it.
  • To improve decision-making: Research methodology helps in making informed decisions by providing reliable and valid data.

Purpose of Research Methodology

Research methodology serves several important purposes, including:

  • To guide the research process: Research methodology provides a systematic framework for conducting research. It helps researchers to plan their research, define their research questions, and select appropriate methods and techniques for collecting and analyzing data.
  • To ensure research quality: Research methodology helps researchers to ensure that their research is rigorous, reliable, and valid. It provides guidelines for minimizing bias and error in data collection and analysis, and for ensuring that research findings are accurate and trustworthy.
  • To replicate research: Research methodology provides a clear and detailed account of the research process, making it possible for other researchers to replicate the study and verify its findings.
  • To advance knowledge: Research methodology enables researchers to generate new knowledge and to contribute to the body of knowledge in their field. It provides a means for testing hypotheses, exploring new ideas, and discovering new insights.
  • To inform decision-making: Research methodology provides evidence-based information that can inform policy and decision-making in a variety of fields, including medicine, public health, education, and business.

Advantages of Research Methodology

Research methodology has several advantages that make it a valuable tool for conducting research in various fields. Here are some of the key advantages of research methodology:

  • Systematic and structured approach: Research methodology provides a systematic and structured approach to conducting research, which ensures that the research is conducted in a rigorous and comprehensive manner.
  • Objectivity: Research methodology aims to ensure objectivity in the research process, which means that the research findings are based on evidence and not influenced by personal bias or subjective opinions.
  • Replicability: Research methodology ensures that research can be replicated by other researchers, which is essential for validating research findings and ensuring their accuracy.
  • Reliability: Research methodology aims to ensure that the research findings are reliable, which means that they are consistent and can be depended upon.
  • Validity: Research methodology ensures that the research findings are valid, which means that they accurately reflect the research question or hypothesis being tested.
  • Efficiency: Research methodology provides a structured and efficient way of conducting research, which helps to save time and resources.
  • Flexibility: Research methodology allows researchers to choose the most appropriate research methods and techniques based on the research question, data availability, and other relevant factors.
  • Scope for innovation: Research methodology provides scope for innovation and creativity in designing research studies and developing new research techniques.

Research Methodology Vs Research Methods

Research methods are the specific tools and techniques used to collect and analyze data (e.g., surveys, interviews, statistical tests), whereas research methodology is the broader framework that guides how the research is planned, conducted, and analyzed. This distinction is discussed further in the FAQ later in this guide.



What is Research Methodology? Definition, Types, and Examples


Research methodology 1,2 is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of the research. Several aspects must be considered before selecting an appropriate research methodology, such as research limitations and ethical concerns that may affect your research.

The research methodology section in a scientific paper describes the different methodological choices made, such as the data collection and analysis methods, and why these choices were selected. The reasons should explain why the methods chosen are the most appropriate to answer the research question. A good research methodology also helps ensure the reliability and validity of the research findings. There are three types of research methodology—quantitative, qualitative, and mixed-method, which can be chosen based on the research objectives.

What is research methodology?

A research methodology describes the techniques and procedures used to identify and analyze information regarding a specific research topic. It is a process by which researchers design their study so that they can achieve their objectives using the selected research instruments. It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall framework within which the research is conducted. While these points can help you understand what is research methodology, you also need to know why it is important to pick the right methodology.

Why is research methodology important?

Having a good research methodology in place has the following advantages: 3

  • It helps other researchers who may want to replicate your research, since the detailed explanations serve as a guide for them.
  • It makes it easier to answer any questions about your research that arise at a later stage.
  • A research methodology provides a framework and guidelines for researchers to clearly define research questions, hypotheses, and objectives.
  • It helps researchers identify the most appropriate research design, sampling technique, and data collection and analysis methods.
  • A sound research methodology helps researchers ensure that their findings are valid and reliable and free from biases and errors.
  • It also helps ensure that ethical guidelines are followed while conducting research.
  • A good research methodology helps researchers in planning their research efficiently, by ensuring optimum usage of their time and resources.


Types of research methodology

There are three types of research methodology based on the type of research and the data required. 1

  • Quantitative research methodology focuses on measuring and testing numerical data. This approach is good for reaching a large number of people in a short amount of time. This type of research helps in testing the causal relationships between variables, making predictions, and generalizing results to wider populations.
  • Qualitative research methodology examines the opinions, behaviors, and experiences of people. It collects and analyzes words and textual data. This research methodology requires fewer participants but is more time consuming because considerable time is spent with each participant. This method is used in exploratory research where the research problem being investigated is not clearly defined.
  • Mixed-method research methodology uses the characteristics of both quantitative and qualitative research methodologies in the same study. This method allows researchers to validate their findings, verify if the results observed using both methods are complementary, and explain any unexpected results obtained from one method by using the other method.

What are the types of sampling designs in research methodology?

Sampling 4 is an important part of a research methodology and involves selecting a representative sample of the population to conduct the study, making statistical inferences about them, and estimating the characteristics of the whole population based on these inferences. There are two types of sampling designs in research methodology—probability and nonprobability.

  • Probability sampling

In this type of sampling design, a sample is chosen from a larger population using some form of random selection, that is, every member of the population has an equal chance of being selected. The different types of probability sampling are:

  • Systematic —sample members are chosen at regular intervals from a randomly selected starting point. Because the sampling interval is fixed in advance, this is one of the least time-consuming probability methods (see the sketch after this list).
  • Stratified —researchers divide the population into smaller groups that don’t overlap but represent the entire population. While sampling, these groups can be organized, and then a sample can be drawn from each group separately.
  • Cluster —the population is divided into clusters based on demographic parameters like age, sex, location, etc.
  • Nonprobability sampling

In this type of sampling design, participants are selected using non-random criteria, so not every member of the population has a chance of being included. The different types of nonprobability sampling are:

  • Convenience —selects participants who are most easily accessible to researchers due to geographical proximity, availability at a particular time, etc.
  • Purposive —participants are selected at the researcher’s discretion. Researchers consider the purpose of the study and the understanding of the target audience.
  • Snowball —already selected participants use their social networks to refer the researcher to other potential participants.
  • Quota —while designing the study, the researchers decide how many people with which characteristics to include as participants. The characteristics help in choosing people most likely to provide insights into the subject.
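To make the probability designs above more concrete, here is a minimal Python sketch contrasting simple random sampling with systematic sampling. The population frame, sample size, and seed are arbitrary values chosen purely for illustration.

```python
# Minimal sketch: simple random sampling vs. systematic sampling from a numbered frame.
# Frame size, sample size, and seed are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
population = np.arange(1, 1001)   # a sampling frame of 1,000 units
sample_size = 50

# Simple random sampling: every unit has an equal chance of selection.
simple_random_sample = rng.choice(population, size=sample_size, replace=False)

# Systematic sampling: pick a random start, then take every k-th unit.
k = len(population) // sample_size          # sampling interval (here, 20)
start = rng.integers(0, k)                  # random starting point within the first interval
systematic_sample = population[start::k]

print(sorted(simple_random_sample)[:10])
print(systematic_sample[:10])
```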

What are data collection methods?

During research, data are collected using various methods depending on the research methodology being followed and the research methods being undertaken. Both qualitative and quantitative research have different data collection methods, as listed below.

Qualitative research 5

  • One-on-one interviews: Helps the interviewers understand a respondent’s subjective opinion and experience pertaining to a specific topic or event
  • Document study/literature review/record keeping: Researchers’ review of already existing written materials such as archives, annual reports, research articles, guidelines, policy documents, etc.
  • Focus groups: Constructive discussions that usually include a small sample of about 6-10 people and a moderator, to understand the participants’ opinion on a given topic.
  • Qualitative observation: Researchers collect data using their five senses (sight, smell, touch, taste, and hearing).

Quantitative research 6

  • Sampling: The most common type is probability sampling.
  • Interviews: Commonly telephonic or done in-person.
  • Observations: Structured observations are most commonly used in quantitative research. In this method, researchers make observations about specific behaviors of individuals in a structured setting.
  • Document review: Reviewing existing research or documents to collect evidence for supporting the research.
  • Surveys and questionnaires: Surveys can be administered both online and offline depending on the requirement and sample size.


What are data analysis methods?

The data collected using the various methods for qualitative and quantitative research need to be analyzed to generate meaningful conclusions. These data analysis methods 7 also differ between quantitative and qualitative research.

Quantitative research involves a deductive method for data analysis where hypotheses are developed at the beginning of the research and precise measurement is required. The methods include statistical analysis applications to analyze numerical data and are grouped into two categories—descriptive and inferential.

Descriptive analysis is used to describe the basic features of the data and to present them in a way that makes patterns easier to see. The different types of descriptive analysis methods are listed below (a short sketch after the list shows how they can be computed):

  • Measures of frequency (count, percent, frequency)
  • Measures of central tendency (mean, median, mode)
  • Measures of dispersion or variation (range, variance, standard deviation)
  • Measure of position (percentile ranks, quartile ranks)
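Here is a minimal Python sketch showing how the descriptive measures listed above could be computed with pandas; the survey scores are invented for illustration.

```python
# Minimal sketch: computing frequency, central tendency, dispersion, and position
# measures on a small set of hypothetical survey scores.
import pandas as pd

scores = pd.Series([4, 7, 7, 5, 9, 6, 7, 3, 8, 6], name="survey_score")

print("Frequency counts:\n", scores.value_counts().sort_index())                  # measures of frequency
print("Mean:", scores.mean(), "Median:", scores.median(), "Mode:", scores.mode().tolist())  # central tendency
print("Range:", scores.max() - scores.min(),
      "Variance:", round(scores.var(), 2),
      "SD:", round(scores.std(), 2))                                              # dispersion or variation
print("25th/50th/75th percentiles:", scores.quantile([0.25, 0.5, 0.75]).tolist()) # position
```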

Inferential analysis is used to draw conclusions or make predictions about a larger population based on data collected from a sample of that population. This analysis is used to study the relationships between different variables. Some commonly used inferential data analysis methods are listed below, followed by a brief illustrative sketch:

  • Correlation: To understand the relationship between two or more variables.
  • Cross-tabulation: Analyze the relationship between multiple variables.
  • Regression analysis: Study the impact of independent variables on the dependent variable.
  • Frequency tables: To understand the frequency of data.
  • Analysis of variance (ANOVA): To test whether the means of two or more groups differ significantly in an experiment.
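Below is a minimal Python sketch of two of the inferential methods listed above, correlation and simple linear regression, using simulated study-hours and exam-score data; the variables and values are purely illustrative.

```python
# Minimal sketch: Pearson correlation and simple linear regression on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
study_hours = rng.uniform(0, 20, size=100)
exam_score = 50 + 2.0 * study_hours + rng.normal(0, 5, size=100)

# Correlation: strength and direction of the linear relationship
r, p_corr = stats.pearsonr(study_hours, exam_score)
print(f"Pearson r = {r:.2f} (p = {p_corr:.4f})")

# Regression: impact of the independent variable on the dependent variable
result = stats.linregress(study_hours, exam_score)
print(f"score = {result.intercept:.1f} + {result.slope:.2f} * hours (R^2 = {result.rvalue**2:.2f})")
```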

Qualitative research involves an inductive method for data analysis where hypotheses are developed after data collection. The methods include:

  • Content analysis: For analyzing documented information from text and images by determining the presence of certain words or concepts in texts (a minimal counting sketch follows this list).
  • Narrative analysis: For analyzing content obtained from sources such as interviews, field observations, and surveys. The stories and opinions shared by people are used to answer research questions.
  • Discourse analysis: For analyzing interactions with people considering the social context, that is, the lifestyle and environment, under which the interaction occurs.
  • Grounded theory: Involves building theory directly from the data, with hypotheses developed during data collection and analysis to explain why a phenomenon occurred.
  • Thematic analysis: To identify important themes or patterns in data and use these to address an issue.
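As a rough illustration of the counting step behind content analysis, here is a minimal Python sketch that tallies how often a small dictionary of concept terms appears in a few invented interview excerpts; real qualitative analysis involves far richer coding, usually supported by dedicated software.

```python
# Minimal sketch: counting occurrences of concept terms across interview transcripts.
# Transcripts and the coding dictionary are invented for illustration.
from collections import Counter
import re

transcripts = [
    "I felt supported by my therapist and less anxious after each session.",
    "The exercises helped, but I still feel anxious before work.",
    "Having support from the group made the biggest difference for me.",
]
concept_terms = {"support": ["support", "supported"], "anxiety": ["anxious", "anxiety"]}

counts = Counter()
for text in transcripts:
    words = re.findall(r"[a-z']+", text.lower())
    for concept, variants in concept_terms.items():
        counts[concept] += sum(words.count(v) for v in variants)

print(counts)  # -> Counter({'support': 2, 'anxiety': 2})
```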

How to choose a research methodology?

Here are some important factors to consider when choosing a research methodology: 8

  • Research objectives, aims, and questions —these would help structure the research design.
  • Review existing literature to identify any gaps in knowledge.
  • Check the statistical requirements —if data-driven or statistical results are needed then quantitative research is the best. If the research questions can be answered based on people’s opinions and perceptions, then qualitative research is most suitable.
  • Sample size —sample size can often determine the feasibility of a research methodology. For a large sample, less effort- and time-intensive methods are appropriate.
  • Constraints —constraints of time, geography, and resources can help define the appropriate methodology.


How to write a research methodology

A research methodology should include the following components: 3,9

  • Research design —should be selected based on the research question and the data required. Common research designs include experimental, quasi-experimental, correlational, descriptive, and exploratory.
  • Research method —this can be quantitative, qualitative, or mixed-method.
  • Reason for selecting a specific methodology —explain why this methodology is the most suitable to answer your research problem.
  • Research instruments —explain the research instruments you plan to use, mainly referring to the data collection methods such as interviews, surveys, etc. Here as well, a reason should be mentioned for selecting the particular instrument.
  • Sampling —this involves selecting a representative subset of the population being studied.
  • Data collection —involves gathering data using several data collection methods, such as surveys, interviews, etc.
  • Data analysis —describe the data analysis methods you will use once you’ve collected the data.
  • Research limitations —mention any limitations you foresee while conducting your research.
  • Validity and reliability —validity helps identify the accuracy and truthfulness of the findings; reliability refers to the consistency and stability of the results over time and across different conditions.
  • Ethical considerations —research should be conducted ethically. The considerations include obtaining consent from participants, maintaining confidentiality, and addressing conflicts of interest.

Streamline Your Research Paper Writing Process with Paperpal

The methods section is a critical part of a research paper: it allows other researchers to understand your findings and replicate your work in their own research. However, it is usually also the most difficult section to write. This is where Paperpal can help you overcome writer’s block and create the first draft in minutes with Paperpal Copilot, its secure generative AI feature suite.

With Paperpal you can get research advice, write and refine your work, rephrase and verify the writing, and ensure submission readiness, all in one place. Here’s how you can use Paperpal to develop the first draft of your methods section.  

  • Generate an outline: Input some details about your research to instantly generate an outline for your methods section 
  • Develop the section: Use the outline and suggested sentence templates to expand your ideas and develop the first draft.  
  • Paraphrase and trim: Get clear, concise academic text with paraphrasing that conveys your work effectively and word reduction to fix redundancies.
  • Choose the right words: Enhance text by choosing contextual synonyms based on how the words have been used in previously published work.  
  • Check and verify text: Make sure the generated text showcases your methods correctly, has all the right citations, and is original and authentic.

You can repeat this process to develop each section of your research manuscript, including the title, abstract and keywords. Ready to write your research papers faster, better, and without the stress? Sign up for Paperpal and start writing today!

Frequently Asked Questions

Q1. What are the key components of research methodology?

A1. A good research methodology has the following key components:

  • Research design
  • Data collection procedures
  • Data analysis methods
  • Ethical considerations

Q2. Why is ethical consideration important in research methodology?

A2. Ethical consideration is important in research methodology because it assures readers of the reliability and validity of the study. Researchers must clearly mention the ethical norms and standards followed during the conduct of the research and also state whether the research has been cleared by an institutional review board. The following 10 points are the important principles related to ethical considerations: 10

  • Participants should not be subjected to harm.
  • Respect for the dignity of participants should be prioritized.
  • Full consent should be obtained from participants before the study.
  • Participants’ privacy should be ensured.
  • Confidentiality of the research data should be ensured.
  • Anonymity of individuals and organizations participating in the research should be maintained.
  • The aims and objectives of the research should not be exaggerated.
  • Affiliations, sources of funding, and any possible conflicts of interest should be declared.
  • Communication in relation to the research should be honest and transparent.
  • Misleading information and biased representation of primary data findings should be avoided.

Q3. What is the difference between methodology and method?

A3. Research methodology is different from a research method, although both terms are often confused. Research methods are the tools used to gather data, while the research methodology provides a framework for how research is planned, conducted, and analyzed. The latter guides researchers in making decisions about the most appropriate methods for their research. Research methods refer to the specific techniques, procedures, and tools used by researchers to collect, analyze, and interpret data, for instance surveys, questionnaires, interviews, etc.

Research methodology is, thus, an integral part of a research study. It helps ensure that you stay on track to meet your research objectives and answer your research questions using the most appropriate data collection and analysis tools based on your research design.


  • Research methodologies. Pfeiffer Library website. Accessed August 15, 2023. https://library.tiffin.edu/researchmethodologies/whatareresearchmethodologies
  • Types of research methodology. Eduvoice website. Accessed August 16, 2023. https://eduvoice.in/types-research-methodology/
  • The basics of research methodology: A key to quality research. Voxco. Accessed August 16, 2023. https://www.voxco.com/blog/what-is-research-methodology/
  • Sampling methods: Types with examples. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/types-of-sampling-for-social-research/
  • What is qualitative research? Methods, types, approaches, examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-qualitative-research-methods-types-examples/
  • What is quantitative research? Definition, methods, types, and examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
  • Data analysis in research: Types & methods. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/data-analysis-in-research/#Data_analysis_in_qualitative_research
  • Factors to consider while choosing the right research methodology. PhD Monster website. Accessed August 17, 2023. https://www.phdmonster.com/factors-to-consider-while-choosing-the-right-research-methodology/
  • What is research methodology? Research and writing guides. Accessed August 14, 2023. https://paperpile.com/g/what-is-research-methodology/
  • Ethical considerations. Business research methodology website. Accessed August 17, 2023. https://research-methodology.net/research-methodology/ethical-considerations/



Grad Coach

What Is Research Methodology? A Plain-Language Explanation & Definition (With Examples)

By Derek Jansen (MBA)  and Kerryn Warren (PhD) | June 2020 (Last updated April 2023)

If you’re new to formal academic research, it’s quite likely that you’re feeling a little overwhelmed by all the technical lingo that gets thrown around. And who could blame you – “research methodology”, “research methods”, “sampling strategies”… it all seems never-ending!

In this post, we’ll demystify the landscape with plain-language explanations and loads of examples (including easy-to-follow videos), so that you can approach your dissertation, thesis or research project with confidence. Let’s get started.

Research Methodology 101

  • What exactly research methodology means
  • What qualitative , quantitative and mixed methods are
  • What sampling strategy is
  • What data collection methods are
  • What data analysis methods are
  • How to choose your research methodology
  • Example of a research methodology


What is research methodology?

Research methodology simply refers to the practical “how” of a research study. More specifically, it’s about how a researcher systematically designs a study to ensure valid and reliable results that address the research aims, objectives and research questions. In particular, it covers how the researcher went about deciding:

  • What type of data to collect (e.g., qualitative or quantitative data )
  • Who  to collect it from (i.e., the sampling strategy )
  • How to  collect  it (i.e., the data collection method )
  • How to  analyse  it (i.e., the data analysis methods )

Within any formal piece of academic research (be it a dissertation, thesis or journal article), you’ll find a research methodology chapter or section which covers the aspects mentioned above. Importantly, a good methodology chapter explains not just what methodological choices were made, but also why they were made. In other words, the methodology chapter should justify the design choices, by showing that the chosen methods and techniques are the best fit for the research aims, objectives and research questions.

So, it’s the same as research design?

Not quite. As we mentioned, research methodology refers to the collection of practical decisions regarding what data you’ll collect, from who, how you’ll collect it and how you’ll analyse it. Research design, on the other hand, is more about the overall strategy you’ll adopt in your study. For example, whether you’ll use an experimental design in which you manipulate one variable while controlling others. You can learn more about research design and the various design types here .


What are qualitative, quantitative and mixed-methods?

Qualitative, quantitative and mixed-methods are different types of methodological approaches, distinguished by their focus on words, numbers or both. This is a bit of an oversimplification, but it’s a good starting point for understanding.

Let’s take a closer look.

Qualitative research refers to research which focuses on collecting and analysing words (written or spoken) and textual or visual data, whereas quantitative research focuses on measurement and testing using numerical data . Qualitative analysis can also focus on other “softer” data points, such as body language or visual elements.

It’s quite common for a qualitative methodology to be used when the research aims and research questions are exploratory  in nature. For example, a qualitative methodology might be used to understand peoples’ perceptions about an event that took place, or a political candidate running for president. 

Contrasted to this, a quantitative methodology is typically used when the research aims and research questions are confirmatory  in nature. For example, a quantitative methodology might be used to measure the relationship between two variables (e.g. personality type and likelihood to commit a crime) or to test a set of hypotheses .

As you’ve probably guessed, the mixed-method methodology attempts to combine the best of both qualitative and quantitative methodologies to integrate perspectives and create a rich picture. If you’d like to learn more about these three methodological approaches, be sure to watch our explainer video below.

What is sampling strategy?

Simply put, sampling is about deciding who (or where) you’re going to collect your data from . Why does this matter? Well, generally it’s not possible to collect data from every single person in your group of interest (this is called the “population”), so you’ll need to engage a smaller portion of that group that’s accessible and manageable (this is called the “sample”).

How you go about selecting the sample (i.e., your sampling strategy) will have a major impact on your study.  There are many different sampling methods  you can choose from, but the two overarching categories are probability   sampling and  non-probability   sampling .

Probability sampling involves using a completely random sample from the group of people you’re interested in. This is comparable to throwing the names of all potential participants into a hat, shaking it up, and picking out the “winners”. By using a completely random sample, you’ll minimise the risk of selection bias and the results of your study will be more generalisable to the entire population.

Non-probability sampling , on the other hand,  doesn’t use a random sample . For example, it might involve using a convenience sample, which means you’d only interview or survey people that you have access to (perhaps your friends, family or work colleagues), rather than a truly random sample. With non-probability sampling, the results are typically not generalisable .

To learn more about sampling methods, be sure to check out the video below.

What are data collection methods?

As the name suggests, data collection methods simply refers to the way in which you go about collecting the data for your study. Some of the most common data collection methods include:

  • Interviews (which can be unstructured, semi-structured or structured)
  • Focus groups and group interviews
  • Surveys (online or physical surveys)
  • Observations (watching and recording activities)
  • Biophysical measurements (e.g., blood pressure, heart rate, etc.)
  • Documents and records (e.g., financial reports, court records, etc.)

The choice of which data collection method to use depends on your overall research aims and research questions , as well as practicalities and resource constraints. For example, if your research is exploratory in nature, qualitative methods such as interviews and focus groups would likely be a good fit. Conversely, if your research aims to measure specific variables or test hypotheses, large-scale surveys that produce large volumes of numerical data would likely be a better fit.

What are data analysis methods?

Data analysis methods refer to the methods and techniques that you’ll use to make sense of your data. These can be grouped according to whether the research is qualitative  (words-based) or quantitative (numbers-based).

Popular data analysis methods in qualitative research include:

  • Qualitative content analysis
  • Thematic analysis
  • Discourse analysis
  • Narrative analysis
  • Interpretative phenomenological analysis (IPA)
  • Visual analysis (of photographs, videos, art, etc.)

Qualitative data analysis all begins with data coding , after which an analysis method is applied. In some cases, more than one analysis method is used, depending on the research aims and research questions . In the video below, we explore some  common qualitative analysis methods, along with practical examples.  

Moving on to the quantitative side of things, popular data analysis methods in this type of research include:

  • Descriptive statistics (e.g. means, medians, modes )
  • Inferential statistics (e.g. correlation, regression, structural equation modelling)

Again, the choice of which data analysis method to use depends on your overall research aims and objectives, as well as practicalities and resource constraints. In the video below, we explain some core concepts central to quantitative analysis.

How do I choose a research methodology?

As you’ve probably picked up by now, your research aims and objectives have a major influence on the research methodology . So, the starting point for developing your research methodology is to take a step back and look at the big picture of your research, before you make methodology decisions. The first question you need to ask yourself is whether your research is exploratory or confirmatory in nature.

If your research aims and objectives are primarily exploratory in nature, your research will likely be qualitative and therefore you might consider qualitative data collection methods (e.g. interviews) and analysis methods (e.g. qualitative content analysis). 

Conversely, if your research aims and objectives are looking to measure or test something (i.e. they’re confirmatory), then your research will quite likely be quantitative in nature, and you might consider quantitative data collection methods (e.g. surveys) and analyses (e.g. statistical analysis).

Designing your research and working out your methodology is a large topic, which we cover extensively on the blog. For now, however, the key takeaway is that you should always start with your research aims, objectives and research questions (the golden thread). Every methodological choice you make needs to align with those three components.

Example of a research methodology chapter

In the video below, we provide a detailed walkthrough of a research methodology from an actual dissertation, as well as an overview of our free methodology template.


sheryl

Your explanation is easily understood. Thank you

Dr Christie

Very help article. Now I can go my methodology chapter in my thesis with ease

Alice W. Mbuthia

I feel guided ,Thank you

Joseph B. Smith

This simplification is very helpful. It is simple but very educative, thanks ever so much

Dr. Ukpai Ukpai Eni

The write up is informative and educative. It is an academic intellectual representation that every good researcher can find useful. Thanks

chimbini Joseph

Wow, this is wonderful long live.

Tahir

Nice initiative

Thembsie

thank you the video was helpful to me.

JesusMalick

Thank you very much for your simple and clear explanations I’m really satisfied by the way you did it By now, I think I can realize a very good article by following your fastidious indications May God bless you

G.Horizon

Thanks very much, it was very concise and informational for a beginner like me to gain an insight into what i am about to undertake. I really appreciate.

Adv Asad Ali

very informative sir, it is amazing to understand the meaning of question hidden behind that, and simple language is used other than legislature to understand easily. stay happy.

Jonas Tan

This one is really amazing. All content in your youtube channel is a very helpful guide for doing research. Thanks, GradCoach.

mahmoud ali

research methodologies

Lucas Sinyangwe

Please send me more information concerning dissertation research.

Amamten Jr.

Nice piece of knowledge shared….. #Thump_UP

Hajara Salihu

This is amazing, it has said it all. Thanks to Gradcoach

Gerald Andrew Babu

This is wonderful,very elaborate and clear.I hope to reach out for your assistance in my research very soon.

Safaa

This is the answer I am searching about…

realy thanks a lot

Ahmed Saeed

Thank you very much for this awesome, to the point and inclusive article.

Soraya Kolli

Thank you very much I need validity and reliability explanation I have exams

KuzivaKwenda

Thank you for a well explained piece. This will help me going forward.

Emmanuel Chukwuma

Very simple and well detailed Many thanks

Zeeshan Ali Khan

This is so very simple yet so very effective and comprehensive. An Excellent piece of work.

Molly Wasonga

I wish I saw this earlier on! Great insights for a beginner(researcher) like me. Thanks a mil!

Blessings Chigodo

Thank you very much, for such a simplified, clear and practical step by step both for academic students and general research work. Holistic, effective to use and easy to read step by step. One can easily apply the steps in practical terms and produce a quality document/up-to standard

Thanks for simplifying these terms for us, really appreciated.

Joseph Kyereme

Thanks for a great work. well understood .

Julien

This was very helpful. It was simple but profound and very easy to understand. Thank you so much!

Kishimbo

Great and amazing research guidelines. Best site for learning research

ankita bhatt

hello sir/ma’am, i didn’t find yet that what type of research methodology i am using. because i am writing my report on CSR and collect all my data from websites and articles so which type of methodology i should write in dissertation report. please help me. i am from India.

memory

how does this really work?

princelow presley

perfect content, thanks a lot

George Nangpaak Duut

As a researcher, I commend you for the detailed and simplified information on the topic in question. I would like to remain in touch for the sharing of research ideas on other topics. Thank you

EPHRAIM MWANSA MULENGA

Impressive. Thank you, Grad Coach 😍

Thank you Grad Coach for this piece of information. I have at least learned about the different types of research methodologies.

Varinder singh Rana

Very useful content with easy way

Mbangu Jones Kashweeka

Thank you very much for the presentation. I am an MPH student with the Adventist University of Africa. I have successfully completed my theory and starting on my research this July. My topic is “Factors associated with Dental Caries in (one District) in Botswana. I need help on how to go about this quantitative research

Carolyn Russell

I am so grateful to run across something that was sooo helpful. I have been on my doctorate journey for quite some time. Your breakdown on methodology helped me to refresh my intent. Thank you.

Indabawa Musbahu

thanks so much for this good lecture. student from university of science and technology, Wudil. Kano Nigeria.

Limpho Mphutlane

It’s profound easy to understand I appreciate

Mustafa Salimi

Thanks a lot for sharing superb information in a detailed but concise manner. It was really helpful and helped a lot in getting into my own research methodology.

Rabilu yau

Comment * thanks very much

Ari M. Hussein

This was sooo helpful for me thank you so much i didn’t even know what i had to write thank you!

You’re most welcome 🙂

Varsha Patnaik

Simple and good. Very much helpful. Thank you so much.

STARNISLUS HAAMBOKOMA

This is very good work. I have benefited.

Dr Md Asraul Hoque

Thank you so much for sharing

Nkasa lizwi

This is powerful thank you so much guys

I am nkasa lizwi doing my research proposal on honors with the university of Walter Sisulu Komani I m on part 3 now can you assist me.my topic is: transitional challenges faced by educators in intermediate phase in the Alfred Nzo District.

Atonisah Jonathan

Appreciate the presentation. Very useful step-by-step guidelines to follow.

Bello Suleiman

I appreciate sir

Titilayo

wow! This is super insightful for me. Thank you!

Emerita Guzman

Indeed this material is very helpful! Kudos writers/authors.

TSEDEKE JOHN

I want to say thank you very much, I got a lot of info and knowledge. Be blessed.

Akanji wasiu

I want present a seminar paper on Optimisation of Deep learning-based models on vulnerability detection in digital transactions.

Need assistance

Clement Lokwar

Dear Sir, I want to be assisted on my research on Sanitation and Water management in emergencies areas.

Peter Sone Kome

I am deeply grateful for the knowledge gained. I will be getting in touch shortly as I want to be assisted in my ongoing research.

Nirmala

The information shared is informative, crisp and clear. Kudos Team! And thanks a lot!

Bipin pokhrel

hello i want to study

Kassahun

Hello!! Grad coach teams. I am extremely happy in your tutorial or consultation. i am really benefited all material and briefing. Thank you very much for your generous helps. Please keep it up. If you add in your briefing, references for further reading, it will be very nice.

Ezra

All I have to say is, thank u gyz.

Work

Good, l thanks

Artak Ghonyan

thank you, it is very useful


What is research methodology?

When you’re working on your first piece of academic research, there are many different things to focus on, and it can be overwhelming to stay on top of everything. This is especially true of budding or inexperienced researchers.

If you’ve never put together a research proposal before or find yourself in a position where you need to explain your research methodology decisions, there are a few things you need to be aware of.

Once you understand the ins and outs, handling academic research in the future will be less intimidating. We break down the basics below:

A research methodology encompasses the way in which you intend to carry out your research. This includes how you plan to tackle things like collection methods, statistical analysis, participant observations, and more.

You can think of your research methodology as a formula. One part is how you plan to put your research into practice, and another is why you feel this is the best way to approach it. Your research methodology is ultimately a systematic plan for resolving your research problem.

In short, you are explaining how you will take your idea and turn it into a study, which in turn will produce valid and reliable results that are in accordance with the aims and objectives of your research. This is true whether your paper plans to make use of qualitative methods or quantitative methods.

The purpose of a research methodology is to explain the reasoning behind your approach to your research - you'll need to support your collection methods, methods of analysis, and other key points of your work.

Think of it as writing a plan or an outline for what you intend to do.

When carrying out research, it can be easy to go off-track or depart from your standard methodology.

Tip: Having a methodology keeps you accountable and on track with your original aims and objectives, and gives you a suitable and sound plan to keep your project manageable, smooth, and effective.

With all that said, how do you write out your standard approach to a research methodology?

As a general plan, your methodology should include the following information:

  • Your research method.  You need to state whether you plan to use quantitative analysis, qualitative analysis, or mixed-method research methods. This will often be determined by what you hope to achieve with your research.
  • Explain your reasoning. Why are you taking this methodological approach? Why is this particular methodology the best way to answer your research problem and achieve your objectives?
  • Explain your instruments.  This will mainly be about your collection methods. There are various instruments you can use, such as interviews, physical surveys, and questionnaires. Your methodology will need to detail your reasoning for choosing a particular instrument for your research.
  • What will you do with your results?  How are you going to analyze the data once you have gathered it?
  • Advise your reader.  If there is anything in your research methodology that your reader might be unfamiliar with, you should explain it in more detail. For example, you should give any background information to your methods that might be relevant or provide your reasoning if you are conducting your research in a non-standard way.
  • How will your sampling process go?  What will your sampling procedure be and why? For example, if you will collect data by carrying out semi-structured or unstructured interviews, how will you choose your interviewees and how will you conduct the interviews themselves?
  • Any practical limitations?  You should discuss any limitations you foresee being an issue when you’re carrying out your research.

In any dissertation, thesis, or academic journal, you will always find a chapter dedicated to explaining the research methodology of the person who carried out the study, also referred to as the methodology section of the work.

A good research methodology will explain what you are going to do and why, while a poor methodology will lead to a messy or disorganized approach.

You should also be able to justify in this section your reasoning for why you intend to carry out your research in a particular way, especially if it might be a particularly unique method.

Having a sound methodology in place can also help you with the following:

  • When another researcher at a later date wishes to try and replicate your research, they will need your explanations and guidelines.
  • In the event that you receive any criticism or questioning on the research you carried out at a later point, you will be able to refer back to it and succinctly explain the how and why of your approach.
  • It provides you with a plan to follow throughout your research. When you are drafting your methodology approach, you need to be sure that the method you are using is the right one for your goal. This will help you with both explaining and understanding your method.
  • It affords you the opportunity to document from the outset what you intend to achieve with your research, from start to finish.

A research instrument is a tool you will use to help you collect, measure and analyze the data you use as part of your research.

The choice of research instrument will usually be yours to make as the researcher and will be whichever best suits your methodology.

There are many different research instruments you can use in collecting data for your research.

Generally, they can be grouped as follows:

  • Interviews (either as a group or one-on-one). You can carry out interviews in many different ways. For example, your interview can be structured, semi-structured, or unstructured. The difference between them is how formal the set of questions is that is asked of the interviewee. In a group interview, you may choose to ask the interviewees to give you their opinions or perceptions on certain topics.
  • Surveys (online or in-person). In survey research, you are posing questions in which you ask for a response from the person taking the survey. You may wish to have either free-answer questions such as essay-style questions, or you may wish to use closed questions such as multiple choice. You may even wish to make the survey a mixture of both.
  • Focus Groups.  Similar to the group interview above, you may wish to ask a focus group to discuss a particular topic or opinion while you make a note of the answers given.
  • Observations.  This is a good research instrument to use if you are looking into human behaviors. Different ways of researching this include studying the spontaneous behavior of participants in their everyday life, or something more structured. A structured observation is research conducted at a set time and place where researchers observe behavior as planned and agreed upon with participants.

These are the most common ways of carrying out research, but it is really dependent on your needs as a researcher and what approach you think is best to take.

It is also possible to combine a number of research instruments if this is necessary and appropriate in answering your research problem.

There are three different types of methodologies, and they are distinguished by whether they focus on words, numbers, or both.

➡️ Want to learn more about the differences between qualitative and quantitative research, and how to use both methods? Check out our guide for that!

If you've done your due diligence, you'll have an idea of which methodology approach is best suited to your research.

It’s likely that you will have carried out considerable reading and homework before you reach this point and you may have taken inspiration from other similar studies that have yielded good results.

Still, it is important to consider different options before setting your research in stone. Exploring different options available will help you to explain why the choice you ultimately make is preferable to other methods.

If addressing your research problem requires you to gather large volumes of numerical data to test hypotheses, a quantitative research method is likely to provide you with the most usable results.

If instead you’re looking to try and learn more about people, and their perception of events, your methodology is more exploratory in nature and would therefore probably be better served using a qualitative research methodology.

It helps to always bring things back to the question: what do I want to achieve with my research?

Once you have conducted your research, you need to analyze it. Here are some helpful guides for qualitative data analysis:

➡️  How to do a content analysis

➡️  How to do a thematic analysis

➡️  How to do a rhetorical analysis
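To make these analysis approaches more concrete, here is a minimal sketch of a first-pass content analysis in Python: counting how often predefined codes (keywords) appear across a set of transcripts. The transcripts and the code list are purely illustrative placeholders, not a prescribed coding frame.

```python
# A minimal sketch of keyword-based content analysis.
# The transcripts and codes below are hypothetical examples.
from collections import Counter
import re

transcripts = [
    "I felt supported by my supervisor, but the workload was overwhelming.",
    "The workload made it hard to stay motivated; support from peers helped.",
]

codes = ["support", "workload", "motivat"]  # word stems standing in for a coding frame

counts = Counter()
for text in transcripts:
    tokens = re.findall(r"[a-z']+", text.lower())
    for code in codes:
        counts[code] += sum(token.startswith(code) for token in tokens)

for code, n in counts.most_common():
    print(f"{code}: {n}")
```

In a real project, the coding frame would come from your literature review or emerge from the data itself, and manual coding or dedicated qualitative analysis software is usually preferable to simple keyword matching.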

Frequently asked questions about research methodology

Research methodology refers to the techniques used to find and analyze information for a study, ensuring that the results are valid, reliable and that they address the research objective.

Data can typically be organized into four different categories or methods: observational, experimental, simulation, and derived.

Writing a methodology section is a process of introducing your methods and instruments, discussing your analysis, providing more background information, addressing your research limitations, and more.

Your research methodology section will need a clear research question and proposed research approach. You'll need to add background, introduce your research question, write your methodology, and add the works you cited during your data collection phase.

The research methodology section of your study will indicate how valid your findings are and how well-informed your paper is. It also assists future researchers planning to use the same methodology, who want to cite your study or replicate it.


Published by Nicolas on March 21st, 2024; revised on March 12, 2024

The Ultimate Guide To Research Methodology

Research methodology is a crucial aspect of any investigative process, serving as the blueprint for the entire research journey. If you are stuck on the methodology section of your research paper, this blog will guide you through what a research methodology is, its types, and how to conduct one successfully.


What Is Research Methodology?

Research methodology can be defined as the systematic framework that guides researchers in designing, conducting, and analyzing their investigations. It encompasses a structured set of processes, techniques, and tools employed to gather and interpret data, ensuring the reliability and validity of the research findings. 

Research methodology is not confined to a singular approach; rather, it encapsulates a diverse range of methods tailored to the specific requirements of the research objectives.

Here is why research methodology is important in academic and professional settings.

Facilitating Rigorous Inquiry

Research methodology forms the backbone of rigorous inquiry. It provides a structured approach that aids researchers in formulating precise thesis statements, selecting appropriate methodologies, and executing systematic investigations. This, in turn, enhances the quality and credibility of the research outcomes.

Ensuring Reproducibility And Reliability

In both academic and professional contexts, the ability to reproduce research outcomes is paramount. A well-defined research methodology establishes clear procedures, making it possible for others to replicate the study. This not only validates the findings but also contributes to the cumulative nature of knowledge.

Guiding Decision-Making Processes

In professional settings, decisions often hinge on reliable data and insights. Research methodology equips professionals with the tools to gather pertinent information, analyze it rigorously, and derive meaningful conclusions.

This informed decision-making is instrumental in achieving organizational goals and staying ahead in competitive environments.

Contributing To Academic Excellence

For academic researchers, adherence to robust research methodology is a hallmark of excellence. Institutions value research that adheres to high standards of methodology, fostering a culture of academic rigour and intellectual integrity. Furthermore, it equips students with critical skills applicable beyond academia.

Enhancing Problem-Solving Abilities

Research methodology instills a problem-solving mindset by encouraging researchers to approach challenges systematically. It equips individuals with the skills to dissect complex issues, formulate hypotheses, and devise effective strategies for investigation.

Understanding Research Methodology

In the pursuit of knowledge and discovery, understanding the fundamentals of research methodology is paramount. 

Basics Of Research

Research, in its essence, is a systematic and organized process of inquiry aimed at expanding our understanding of a particular subject or phenomenon. It involves the exploration of existing knowledge, the formulation of hypotheses, and the collection and analysis of data to draw meaningful conclusions. 

Research is a dynamic and iterative process that contributes to the continuous evolution of knowledge in various disciplines.

Types of Research

Research takes on various forms, each tailored to the nature of the inquiry. Broadly classified, research can be categorized into two main types:

  • Quantitative Research: This type involves the collection and analysis of numerical data to identify patterns, relationships, and statistical significance. It is particularly useful for testing hypotheses and making predictions.
  • Qualitative Research: Qualitative research focuses on understanding the depth and details of a phenomenon through non-numerical data. It often involves methods such as interviews, focus groups, and content analysis, providing rich insights into complex issues.

Components Of Research Methodology

To conduct effective research, one must understand the different components of research methodology. These components form the scaffolding that supports the entire research process, ensuring its coherence and validity.

Research Design

Research design serves as the blueprint for the entire research project. It outlines the overall structure and strategy for conducting the study. The three primary types of research design are:

  • Exploratory Research: Aimed at gaining insights and familiarity with the topic, often used in the early stages of research.
  • Descriptive Research: Involves portraying an accurate profile of a situation or phenomenon, answering the ‘what,’ ‘who,’ ‘where,’ and ‘when’ questions.
  • Explanatory Research: Seeks to identify the causes and effects of a phenomenon, explaining the ‘why’ and ‘how.’

Data Collection Methods

Choosing the right data collection methods is crucial for obtaining reliable and relevant information. Common methods include:

  • Surveys and Questionnaires: Employed to gather information from a large number of respondents through standardized questions.
  • Interviews: In-depth conversations with participants, offering qualitative insights.
  • Observation: Systematic watching and recording of behaviour, events, or processes in their natural setting.

Data Analysis Techniques

Once data is collected, analysis becomes imperative to derive meaningful conclusions. Different methodologies exist for quantitative and qualitative data:

  • Quantitative Data Analysis: Involves statistical techniques such as descriptive statistics, inferential statistics, and regression analysis to interpret numerical data (a minimal sketch follows this list).
  • Qualitative Data Analysis: Methods like content analysis, thematic analysis, and grounded theory are employed to extract patterns, themes, and meanings from non-numerical data.
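As a rough illustration of the quantitative side, the sketch below computes descriptive statistics and fits a simple linear regression with SciPy. The variables (study_hours, exam_score) and their values are hypothetical, chosen only to show the workflow, not drawn from any real dataset.

```python
# A minimal sketch of quantitative data analysis: descriptive statistics
# followed by a simple linear regression. All values are fabricated.
import numpy as np
from scipy import stats

study_hours = np.array([2, 4, 5, 7, 8, 10, 12, 14])
exam_score = np.array([52, 58, 61, 66, 70, 75, 80, 86])

# Descriptive statistics
print("mean:", exam_score.mean(), "sd:", exam_score.std(ddof=1))

# Inferential step: does study time predict exam score?
result = stats.linregress(study_hours, exam_score)
print(f"slope={result.slope:.2f}, r^2={result.rvalue**2:.3f}, p={result.pvalue:.4f}")
```

The same logic scales up to real datasets; the main additions in practice are data cleaning, assumption checks, and reporting confidence intervals alongside p-values.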


Choosing a Research Method

Selecting an appropriate research method is a critical decision in the research process. It determines the approach, tools, and techniques that will be used to answer the research questions. 

Quantitative Research Methods

Quantitative research involves the collection and analysis of numerical data, providing a structured and objective approach to understanding and explaining phenomena.

Experimental Research

Experimental research involves manipulating variables to observe the effect on another variable under controlled conditions. It aims to establish cause-and-effect relationships.

Key Characteristics:

  • Controlled Environment: Experiments are conducted in a controlled setting to minimize external influences.
  • Random Assignment: Participants are randomly assigned to different experimental conditions.
  • Quantitative Data: Data collected is numerical, allowing for statistical analysis.

Applications: Commonly used in scientific studies and psychology to test hypotheses and identify causal relationships.
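The sketch below illustrates the logic of random assignment and a between-groups comparison for a simple two-group experiment, assuming a hypothetical pool of 40 participants and fabricated outcome scores; it demonstrates the idea rather than any specific study design.

```python
# A minimal sketch of random assignment and an independent-samples t-test.
# Participant IDs and outcome scores are fabricated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

participants = np.arange(40)                 # hypothetical participant IDs
shuffled = rng.permutation(participants)     # random assignment
treatment_ids, control_ids = shuffled[:20], shuffled[20:]

# Fabricated outcome scores, simulated with a small treatment effect
treatment_scores = rng.normal(loc=75, scale=8, size=20)
control_scores = rng.normal(loc=70, scale=8, size=20)

t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Random assignment is what licenses a causal interpretation; the t-test itself only quantifies whether the observed group difference is larger than chance alone would suggest.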

Survey Research

Survey research gathers information from a sample of individuals through standardized questionnaires or interviews. It aims to collect data on opinions, attitudes, and behaviours.

  • Structured Instruments: Surveys use structured instruments, such as questionnaires, to collect data.
  • Large Sample Size: Surveys often target a large and diverse group of participants.
  • Quantitative Data Analysis: Responses are quantified for statistical analysis.

Applications: Widely employed in social sciences, marketing, and public opinion research to understand trends and preferences.
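For the quantitative analysis of closed-ended survey responses, a minimal pandas sketch might look like the following; the respondents and answers are fabricated purely for illustration.

```python
# A minimal sketch of tabulating closed-ended survey responses with pandas.
# The data frame below is a fabricated example.
import pandas as pd

responses = pd.DataFrame({
    "gender": ["F", "M", "F", "F", "M", "M", "F", "M"],
    "satisfaction": ["High", "Low", "High", "Medium", "Medium", "Low", "High", "High"],
})

# Frequency of each response category
print(responses["satisfaction"].value_counts())

# Cross-tabulation: satisfaction by gender, shown as row proportions
print(pd.crosstab(responses["gender"], responses["satisfaction"], normalize="index"))
```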

Descriptive Research

Descriptive research seeks to portray an accurate profile of a situation or phenomenon. It focuses on answering the ‘what,’ ‘who,’ ‘where,’ and ‘when’ questions.

  • Observation and Data Collection: This involves observing and documenting without manipulating variables.
  • Objective Description: Aim to provide an unbiased and factual account of the subject.
  • Quantitative or Qualitative Data: This can include both types of data, depending on the research focus.

Applications: Useful in situations where researchers want to understand and describe a phenomenon without altering it, common in social sciences and education.

Qualitative Research Methods

Qualitative research emphasizes exploring and understanding the depth and complexity of phenomena through non-numerical data.

Case Study

A case study is an in-depth exploration of a particular person, group, event, or situation. It involves detailed, context-rich analysis.

  • Rich Data Collection: Uses various data sources, such as interviews, observations, and documents.
  • Contextual Understanding: Aims to understand the context and unique characteristics of the case.
  • Holistic Approach: Examines the case in its entirety.

Applications: Common in social sciences, psychology, and business to investigate complex and specific instances.

Ethnography

Ethnography involves immersing the researcher in the culture or community being studied to gain a deep understanding of their behaviours, beliefs, and practices.

  • Participant Observation: Researchers actively participate in the community or setting.
  • Holistic Perspective: Focuses on the interconnectedness of cultural elements.
  • Qualitative Data: In-depth narratives and descriptions are central to ethnographic studies.

Applications: Widely used in anthropology, sociology, and cultural studies to explore and document cultural practices.

Grounded Theory

Grounded theory aims to develop theories grounded in the data itself. It involves systematic data collection and analysis to construct theories from the ground up.

  • Constant Comparison: Data is continually compared and analyzed during the research process.
  • Inductive Reasoning: Theories emerge from the data rather than being imposed on it.
  • Iterative Process: The research design evolves as the study progresses.

Applications: Commonly applied in sociology, nursing, and management studies to generate theories from empirical data.

Research Design

Research design is the structural framework that outlines the systematic process and plan for conducting a study. It serves as the blueprint, guiding researchers on how to collect, analyze, and interpret data.

Exploratory, Descriptive, And Explanatory Designs

Exploratory Design

Exploratory research design is employed when a researcher aims to explore a relatively unknown subject or gain insights into a complex phenomenon.

  • Flexibility: Allows for flexibility in data collection and analysis.
  • Open-Ended Questions: Uses open-ended questions to gather a broad range of information.
  • Preliminary Nature: Often used in the initial stages of research to formulate hypotheses.

Applications: Valuable in the early stages of investigation, especially when the researcher seeks a deeper understanding of a subject before formalizing research questions.

Descriptive Design

Descriptive research design focuses on portraying an accurate profile of a situation, group, or phenomenon.

  • Structured Data Collection: Involves systematic and structured data collection methods.
  • Objective Presentation: Aims to provide an unbiased and factual account of the subject.
  • Quantitative or Qualitative Data: Can incorporate both types of data, depending on the research objectives.

Applications: Widely used in social sciences, marketing, and educational research to provide detailed and objective descriptions.

Explanatory Design

Explanatory research design aims to identify the causes and effects of a phenomenon, explaining the ‘why’ and ‘how’ behind observed relationships.

  • Causal Relationships: Seeks to establish causal relationships between variables.
  • Controlled Variables: Often involves controlling certain variables to isolate causal factors.
  • Quantitative Analysis: Primarily relies on quantitative data analysis techniques.

Applications: Commonly employed in scientific studies and social sciences to delve into the underlying reasons behind observed patterns.

Cross-Sectional Vs. Longitudinal Designs

Cross-Sectional Design

Cross-sectional designs collect data from participants at a single point in time.

  • Snapshot View: Provides a snapshot of a population at a specific moment.
  • Efficiency: More efficient in terms of time and resources.
  • Limited Temporal Insights: Offers limited insights into changes over time.

Applications: Suitable for studying characteristics or behaviours that are stable or not expected to change rapidly.

Longitudinal Design

Longitudinal designs involve the collection of data from the same participants over an extended period.

  • Temporal Sequence: Allows for the examination of changes over time.
  • Causality Assessment: Facilitates the assessment of cause-and-effect relationships.
  • Resource-Intensive: Requires more time and resources compared to cross-sectional designs.

Applications: Ideal for studying developmental processes, trends, or the impact of interventions over time.
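To show what examining changes over time can look like in practice, here is a minimal pandas sketch with hypothetical two-wave panel data, reshaped so that within-person change can be computed; the participants, waves, and scores are invented for illustration.

```python
# A minimal sketch of longitudinal (panel) data: the same participants
# measured at two waves, with within-person change computed per participant.
import pandas as pd

panel = pd.DataFrame({
    "participant": [1, 2, 3, 1, 2, 3],
    "wave":        [1, 1, 1, 2, 2, 2],
    "wellbeing":   [6.0, 5.5, 7.0, 6.5, 5.0, 7.5],
})

# Reshape from long to wide format: one row per participant, one column per wave
wide = panel.pivot(index="participant", columns="wave", values="wellbeing")
wide["change"] = wide[2] - wide[1]
print(wide)
```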

Experimental Vs Non-experimental Designs

Experimental Design

Experimental designs involve manipulating variables under controlled conditions to observe the effect on another variable.

  • Causality Inference: Enables the inference of cause-and-effect relationships.
  • Quantitative Data: Primarily involves the collection and analysis of numerical data.

Applications: Commonly used in scientific studies, psychology, and medical research to establish causal relationships.

Non-Experimental Design

Non-experimental designs observe and describe phenomena without manipulating variables.

  • Natural Settings: Data is often collected in natural settings without intervention.
  • Descriptive or Correlational: Focuses on describing relationships or correlations between variables.
  • Quantitative or Qualitative Data: This can involve either type of data, depending on the research approach.

Applications: Suitable for studying complex phenomena in real-world settings where manipulation may not be ethical or feasible.

Effective data collection is fundamental to the success of any research endeavour. 

Designing Effective Surveys

Objective Design:

  • Clearly define the research objectives to guide the survey design.
  • Craft questions that align with the study’s goals and avoid ambiguity.

Structured Format:

  • Use a structured format with standardized questions for consistency.
  • Include a mix of closed-ended and open-ended questions for detailed insights.

Pilot Testing:

  • Conduct pilot tests to identify and rectify potential issues with survey design.
  • Ensure clarity, relevance, and appropriateness of questions.

Sampling Strategy:

  • Develop a robust sampling strategy to ensure a representative participant group.
  • Consider random sampling or stratified sampling based on the research goals (see the sampling sketch after this list).
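The following sketch contrasts simple random sampling with stratified sampling using pandas, assuming a hypothetical sampling frame in which 'faculty' is the stratification variable; the frame and proportions are invented for illustration.

```python
# A minimal sketch of simple random vs. stratified sampling with pandas.
# The sampling frame below is a fabricated example.
import pandas as pd

frame = pd.DataFrame({
    "student_id": range(1, 101),
    "faculty": ["Science"] * 60 + ["Arts"] * 30 + ["Law"] * 10,
})

# Simple random sample of 20 students
random_sample = frame.sample(n=20, random_state=1)

# Stratified sample: 20% drawn from each faculty, preserving population proportions
stratified_sample = (
    frame.groupby("faculty", group_keys=False)
         .apply(lambda g: g.sample(frac=0.2, random_state=1))
)

print(random_sample["faculty"].value_counts())
print(stratified_sample["faculty"].value_counts())
```

Stratifying keeps the sample's faculty proportions close to those of the population, which a small simple random sample cannot guarantee.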

Conducting Interviews

Establishing Rapport:

  • Build rapport with participants to create a comfortable and open environment.
  • Clearly communicate the purpose of the interview and the value of participants’ input.

Open-Ended Questions:

  • Frame open-ended questions to encourage detailed responses.
  • Allow participants to express their thoughts and perspectives freely.

Active Listening:

  • Practice active listening to fully understand participants’ responses and gather rich data.
  • Avoid interrupting and maintain a non-judgmental stance during the interview.

Ethical Considerations:

  • Obtain informed consent and assure participants of confidentiality.
  • Be transparent about the study’s purpose and potential implications.

Observation

1. Participant Observation

Immersive Participation:

  • Actively immerse yourself in the setting or group being observed.
  • Develop a deep understanding of behaviours, interactions, and context.

Field Notes:

  • Maintain detailed and reflective field notes during observations.
  • Document observed patterns, unexpected events, and participant reactions.

Ethical Awareness:

  • Be conscious of ethical considerations, ensuring respect for participants.
  • Balance the role of observer and participant to minimize bias.

2. Non-participant Observation

Objective Observation:

  • Maintain a more detached and objective stance during non-participant observation.
  • Focus on recording behaviours, events, and patterns without direct involvement.

Data Reliability:

  • Enhance the reliability of data by reducing observer bias.
  • Develop clear observation protocols and guidelines.

Contextual Understanding:

  • Strive for a thorough understanding of the observed context.
  • Consider combining non-participant observation with other methods for triangulation.

Archival Research

1. Using Existing Data

Identifying Relevant Archives:

  • Locate and access archives relevant to the research topic.
  • Collaborate with institutions or repositories holding valuable data.

Data Verification:

  • Verify the accuracy and reliability of archived data.
  • Cross-reference with other sources to ensure data integrity.

Ethical Use:

  • Adhere to ethical guidelines when using existing data.
  • Respect copyright and intellectual property rights.

2. Challenges and Considerations

Incomplete or Inaccurate Archives:

  • Address the possibility of incomplete or inaccurate archival records.
  • Acknowledge limitations and uncertainties in the data.

Temporal Bias:

  • Recognize potential temporal biases in archived data.
  • Consider the historical context and changes that may impact interpretation.

Access Limitations:

  • Address potential limitations in accessing certain archives.
  • Seek alternative sources or collaborate with institutions to overcome barriers.

Common Challenges in Research Methodology

Conducting research is a complex and dynamic process, often accompanied by a myriad of challenges. Addressing these challenges is crucial to ensure the reliability and validity of research findings.

Sampling Issues

Sampling Bias:

  • The presence of sampling bias can lead to an unrepresentative sample, affecting the generalizability of findings.
  • Employ random sampling methods and ensure the inclusion of diverse participants to reduce bias.

Sample Size Determination:

  • Determining an appropriate sample size is a delicate balance. Too small a sample may lack statistical power, while an excessively large sample may strain resources.
  • Conduct a power analysis to determine the optimal sample size based on the research objectives and expected effect size (a worked sketch follows below).
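As one example of an a priori power analysis, the sketch below uses statsmodels to estimate the required sample size per group for an independent-samples t-test; the effect size, alpha, and power values are illustrative assumptions, not recommendations for any particular study.

```python
# A minimal sketch of an a priori power analysis for a two-group comparison.
# Effect size, alpha, and power are assumed values for illustration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,   # expected medium effect (Cohen's d), an assumption
    alpha=0.05,        # significance level
    power=0.80,        # desired statistical power
)
print(f"Required sample size per group: {n_per_group:.0f}")
```

With these inputs the calculation returns roughly 64 participants per group; a smaller expected effect size pushes the requirement up sharply.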

Data Quality And Validity

Measurement Error:

  • Inaccuracies in measurement tools or data collection methods can introduce measurement errors, impacting the validity of results.
  • Pilot test instruments, calibrate equipment, and use standardized measures to enhance the reliability of data (see the reliability sketch below).
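One common check of internal-consistency reliability for multi-item instruments is Cronbach's alpha. The sketch below implements the standard formula with NumPy; the item scores are fabricated, and the often-cited 0.70 threshold is a rule of thumb rather than a strict cutoff.

```python
# A minimal sketch of Cronbach's alpha for a multi-item scale.
# Rows are respondents, columns are scale items; scores are fabricated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```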

Construct Validity:

  • Ensuring that the chosen measures accurately capture the intended constructs is a persistent challenge.
  • Use established measurement instruments and employ multiple measures to assess the same construct for triangulation.

Time And Resource Constraints

Timeline Pressures:

  • Limited timeframes can compromise the depth and thoroughness of the research process.
  • Develop a realistic timeline, prioritize tasks, and communicate expectations with stakeholders to manage time constraints effectively.

Resource Availability:

  • Inadequate resources, whether financial or human, can impede the execution of research activities.
  • Seek external funding, collaborate with other researchers, and explore alternative methods that require fewer resources.

Managing Bias in Research

Selection Bias:

  • Selecting participants in a way that systematically skews the sample can introduce selection bias.
  • Employ randomization techniques, use stratified sampling, and transparently report participant recruitment methods.

Confirmation Bias:

  • Researchers may unintentionally favour information that confirms their preconceived beliefs or hypotheses.
  • Adopt a systematic and open-minded approach, use blinded study designs, and engage in peer review to mitigate confirmation bias.

Tips On How To Write A Research Methodology

Conducting successful research relies not only on the application of sound methodologies but also on strategic planning and effective collaboration. Here are some tips to enhance the success of your research methodology:

Tip 1. Clear Research Objectives

Well-defined research objectives guide the entire research process. Clearly articulate the purpose of your study, outlining specific research questions or hypotheses.

Tip 2. Comprehensive Literature Review

A thorough literature review provides a foundation for understanding existing knowledge and identifying gaps. Invest time in reviewing relevant literature to inform your research design and methodology.

Tip 3. Detailed Research Plan

A detailed plan serves as a roadmap, ensuring all aspects of the research are systematically addressed. Develop a detailed research plan outlining timelines, milestones, and tasks.

Tip 4. Ethical Considerations

Ethical practices are fundamental to maintaining the integrity of research. Address ethical considerations early, obtain necessary approvals, and ensure participant rights are safeguarded.

Tip 5. Stay Updated On Methodologies

Research methodologies evolve, and staying updated is essential for employing the most effective techniques. Engage in continuous learning by attending workshops, conferences, and reading recent publications.

Tip 6. Adaptability In Methods

Unforeseen challenges may arise during research, necessitating adaptability in methods. Be flexible and willing to modify your approach when needed, ensuring the integrity of the study.

Tip 7. Iterative Approach

Research is often an iterative process, and refining methods based on ongoing findings enhances the study’s robustness. Regularly review and refine your research design and methods as the study progresses.

Frequently Asked Questions

What is research methodology?

Research methodology is the systematic process of planning, executing, and evaluating scientific investigation. It encompasses the techniques, tools, and procedures used to collect, analyze, and interpret data, ensuring the reliability and validity of research findings.

What are the methodologies in research?

Research methodologies include qualitative and quantitative approaches. Qualitative methods involve in-depth exploration of non-numerical data, while quantitative methods use statistical analysis to examine numerical data. Mixed methods combine both approaches for a comprehensive understanding of research questions.

How to write research methodology?

To write a research methodology, clearly outline the study’s design, data collection, and analysis procedures. Specify research tools, participants, and sampling methods. Justify choices and discuss limitations. Ensure clarity, coherence, and alignment with research objectives for a robust methodology section.

How to write the methodology section of a research paper?

In the methodology section of a research paper, describe the study’s design, data collection, and analysis methods. Detail procedures, tools, participants, and sampling. Justify choices, address ethical considerations, and explain how the methodology aligns with research objectives, ensuring clarity and rigour.

What is mixed research methodology?

Mixed research methodology combines both qualitative and quantitative research approaches within a single study. This approach aims to enhance the details and depth of research findings by providing a more comprehensive understanding of the research problem or question.

Research Methods (Quantitative, Qualitative, and More): Overview


About Research Methods

This guide provides an overview of research methods, how to choose and use them, and supports and resources at UC Berkeley. 

As Patten and Newhart note in the book Understanding Research Methods , "Research methods are the building blocks of the scientific enterprise. They are the "how" for building systematic knowledge. The accumulation of knowledge through research is by its nature a collective endeavor. Each well-designed study provides evidence that may support, amend, refute, or deepen the understanding of existing knowledge...Decisions are important throughout the practice of research and are designed to help researchers collect evidence that includes the full spectrum of the phenomenon under study, to maintain logical rules, and to mitigate or account for possible sources of bias. In many ways, learning research methods is learning how to see and make these decisions."

The choice of methods varies by discipline, by the kind of phenomenon being studied and the data being used to study it, by the technology available, and more.  This guide is an introduction, but if you don't see what you need here, always contact your subject librarian, and/or take a look to see if there's a library research guide that will answer your question. 

Suggestions for changes and additions to this guide are welcome! 

START HERE: SAGE Research Methods

Without question, the most comprehensive resource available from the library is SAGE Research Methods. The library maintains an online guide to this one-stop collection, and some helpful links are below:

  • SAGE Research Methods
  • Little Green Books  (Quantitative Methods)
  • Little Blue Books  (Qualitative Methods)
  • Dictionaries and Encyclopedias  
  • Case studies of real research projects
  • Sample datasets for hands-on practice
  • Streaming video--see methods come to life
  • Methodspace: a community for researchers
  • SAGE Research Methods Course Mapping

Library Data Services at UC Berkeley

Library Data Services Program and Digital Scholarship Services

The LDSP offers a variety of services and tools. Check out pages for each of the following topics: discovering data, managing data, collecting data, GIS data, text data mining, publishing data, digital scholarship, open science, and the Research Data Management Program.

Be sure also to check out the visual guide to where to seek assistance on campus with any research question you may have!

Library GIS Services

Other Data Services at Berkeley

  • D-Lab: Supports Berkeley faculty, staff, and graduate students with research in data-intensive social science, including a wide range of training and workshop offerings.
  • Dryad: A simple self-service tool for researchers to use in publishing their datasets. It provides tools for the effective publication of and access to research data.
  • Geospatial Innovation Facility (GIF): Provides leadership and training across a broad array of integrated mapping technologies on campus.
  • Research Data Management: A UC Berkeley guide and consulting service for research data management issues.

General Research Methods Resources

Here are some general resources for assistance:

  • Assistance from ICPSR (must create an account to access): Getting Help with Data, and Resources for Students
  • Wiley Stats Ref for background information on statistics topics
  • Survey Documentation and Analysis (SDA): a program for easy web-based analysis of survey data.

Consultants

  • D-Lab/Data Science Discovery Consultants: Request help with your research project from peer consultants.
  • Research data management (RDM) consulting: Meet with RDM consultants before designing the data security, storage, and sharing aspects of your qualitative project.
  • Statistics Department Consulting Services: A service in which advanced graduate students, under faculty supervision, are available to consult during specified hours in the Fall and Spring semesters.

Related Resources

  • IRB / CPHS: Qualitative research projects with human subjects often require that you go through an ethics review.
  • OURS (Office of Undergraduate Research and Scholarships): Supports undergraduates who want to embark on research projects and assistantships. In particular, check out their "Getting Started in Research" workshops.
  • Sponsored Projects: Works with researchers applying for major external grants.

Organizing Your Social Sciences Research Paper

6. The Methodology

The methods section describes actions taken to investigate a research problem and the rationale for the application of specific procedures or techniques used to identify, select, process, and analyze information applied to understanding the problem, thereby allowing the reader to critically evaluate a study’s overall validity and reliability. The methodology section of a research paper answers two main questions: How was the data collected or generated? And, how was it analyzed? The writing should be direct and precise and always written in the past tense.

Kallet, Richard H. "How to Write the Methods Section of a Research Paper." Respiratory Care 49 (October 2004): 1229-1232.

Importance of a Good Methodology Section

You must explain how you obtained and analyzed your results for the following reasons:

  • Readers need to know how the data was obtained because the method you chose affects the results and, by extension, how you interpreted their significance in the discussion section of your paper.
  • Methodology is crucial for any branch of scholarship because an unreliable method produces unreliable results and, as a consequence, undermines the value of your analysis of the findings.
  • In most cases, there are a variety of different methods you can choose to investigate a research problem. The methodology section of your paper should clearly articulate the reasons why you have chosen a particular procedure or technique.
  • The reader wants to know that the data was collected or generated in a way that is consistent with accepted practice in the field of study. For example, if you are using a multiple choice questionnaire, readers need to know that it offered your respondents a reasonable range of answers to choose from.
  • The method must be appropriate to fulfilling the overall aims of the study. For example, you need to ensure that you have a large enough sample size to be able to generalize and make recommendations based upon the findings.
  • The methodology should discuss the problems that were anticipated and the steps you took to prevent them from occurring. For any problems that do arise, you must describe the ways in which they were minimized or why these problems do not impact in any meaningful way your interpretation of the findings.
  • In the social and behavioral sciences, it is important to always provide sufficient information to allow other researchers to adopt or replicate your methodology. This information is particularly important when a new method has been developed or an innovative use of an existing method is utilized.

Bem, Daryl J. Writing the Empirical Journal Article. Psychology Writing Center. University of Washington; Denscombe, Martyn. The Good Research Guide: For Small-Scale Social Research Projects. 5th edition. Buckingham, UK: Open University Press, 2014; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences. Thousand Oaks, CA: Corwin Press, 2008.

Structure and Writing Style

I.  Groups of Research Methods

There are two main groups of research methods in the social sciences:

  • The empirical-analytical group approaches the study of social sciences in a similar manner to how researchers study the natural sciences. This type of research focuses on objective knowledge, research questions that can be answered yes or no, and operational definitions of variables to be measured. The empirical-analytical group employs deductive reasoning that uses existing theory as a foundation for formulating hypotheses that need to be tested. This approach is focused on explanation.
  • The interpretative group of methods is focused on understanding phenomena in a comprehensive, holistic way. Interpretive methods focus on analytically disclosing the meaning-making practices of human subjects [the why, how, or by what means people do what they do], while showing how those practices are arranged so that they can be used to generate observable outcomes. Interpretive methods allow you to recognize your connection to the phenomena under investigation. However, the interpretative group requires careful examination of variables because it focuses more on subjective knowledge.

II.  Content

The introduction to your methodology section should begin by restating the research problem and underlying assumptions underpinning your study. This is followed by situating the methods you used to gather, analyze, and process information within the overall “tradition” of your field of study and within the particular research design you have chosen to study the problem. If the method you choose lies outside of the tradition of your field [i.e., your review of the literature demonstrates that the method is not commonly used], provide a justification for how your choice of methods specifically addresses the research problem in ways that have not been utilized in prior studies.

The remainder of your methodology section should describe the following:

  • Decisions made in selecting the data you have analyzed or, in the case of qualitative research, the subjects and research setting you have examined,
  • Tools and methods used to identify and collect information, and how you identified relevant variables,
  • The ways in which you processed the data and the procedures you used to analyze that data, and
  • The specific research tools or strategies that you utilized to study the underlying hypothesis and research questions.

In addition, an effectively written methodology section should:

  • Introduce the overall methodological approach for investigating your research problem . Is your study qualitative or quantitative or a combination of both (mixed method)? Are you going to take a special approach, such as action research, or a more neutral stance?
  • Indicate how the approach fits the overall research design . Your methods for gathering data should have a clear connection to your research problem. In other words, make sure that your methods will actually address the problem. One of the most common deficiencies found in research papers is that the proposed methodology is not suitable to achieving the stated objective of your paper.
  • Describe the specific methods of data collection you are going to use, such as surveys, interviews, questionnaires, observation, or archival research. If you are analyzing existing data, such as a data set or archival documents, describe how it was originally created or gathered and by whom. Also be sure to explain how older data is still relevant to investigating the current research problem.
  • Explain how you intend to analyze your results . Will you use statistical analysis? Will you use specific theoretical perspectives to help you analyze a text or explain observed behaviors? Describe how you plan to obtain an accurate assessment of relationships, patterns, trends, distributions, and possible contradictions found in the data.
  • Provide background and a rationale for methodologies that are unfamiliar for your readers . Very often in the social sciences, research problems and the methods for investigating them require more explanation/rationale than widely accepted rules governing the natural and physical sciences. Be clear and concise in your explanation.
  • Provide a justification for subject selection and sampling procedure . For instance, if you propose to conduct interviews, how do you intend to select the sample population? If you are analyzing texts, which texts have you chosen, and why? If you are using statistics, why is this set of data being used? If other data sources exist, explain why the data you chose is most appropriate to addressing the research problem.
  • Provide a justification for case study selection . A common method of analyzing research problems in the social sciences is to analyze specific cases. These can be a person, place, event, phenomenon, or other type of subject of analysis that are either examined as a singular topic of in-depth investigation or multiple topics of investigation studied for the purpose of comparing or contrasting findings. In either method, you should explain why a case or cases were chosen and how they specifically relate to the research problem.
  • Describe potential limitations . Are there any practical limitations that could affect your data collection? How will you attempt to control for potential confounding variables and errors? If your methodology may lead to problems you can anticipate, state this openly and show why pursuing this methodology outweighs the risk of these problems cropping up.

NOTE: Once you have written all of the elements of the methods section, subsequent revisions should focus on how to present those elements as clearly and as logically as possible. The description of how you prepared to study the research problem, how you gathered the data, and the protocol for analyzing the data should be organized chronologically. For clarity, when a large amount of detail must be presented, information should be presented in sub-sections according to topic. If necessary, consider using appendices for raw data.

ANOTHER NOTE: If you are conducting a qualitative analysis of a research problem, the methodology section generally requires a more elaborate description of the methods used, as well as an explanation of the processes applied to gathering and analyzing data, than is generally required for studies using quantitative methods. Because you are the primary instrument for generating the data [e.g., through interviews or observations], the process for collecting that data has a significantly greater impact on producing the findings.

YET ANOTHER NOTE: If your study involves interviews, observations, or other qualitative techniques involving human subjects, you may be required to obtain approval from the university's Office for the Protection of Research Subjects before beginning your research. This is not a common procedure for most undergraduate level student research assignments. However, if your professor states you need approval, you must include a statement in your methods section that you received official endorsement and adequate informed consent from the office and that there was a clear assessment and minimization of risks to participants and to the university. This statement informs the reader that your study was conducted in an ethical and responsible manner. In some cases, the approval notice is included as an appendix to your paper.

III.  Problems to Avoid

Irrelevant Detail

The methodology section of your paper should be thorough but concise. Do not provide any background information that does not directly help the reader understand why a particular method was chosen, how the data was gathered or obtained, and how the data was analyzed in relation to the research problem [note: analyzed, not interpreted! Save how you interpreted the findings for the discussion section]. With this in mind, the page length of your methods section will generally be less than any other section of your paper except the conclusion.

Unnecessary Explanation of Basic Procedures

Remember that you are not writing a how-to guide about a particular method. You should make the assumption that readers possess a basic understanding of how to investigate the research problem on their own and, therefore, you do not have to go into great detail about specific methodological procedures. The focus should be on how you applied a method, not on the mechanics of doing a method. An exception to this rule is if you select an unconventional methodological approach; if this is the case, be sure to explain why this approach was chosen and how it enhances the overall process of discovery.

Problem Blindness

It is almost a given that you will encounter problems when collecting or generating your data, or that gaps will exist in existing data or archival materials. Do not ignore these problems or pretend they did not occur. Often, documenting how you overcame obstacles can form an interesting part of the methodology. It demonstrates to the reader that you can provide a cogent rationale for the decisions you made to minimize the impact of any problems that arose.

Literature Review

Just as the literature review section of your paper provides an overview of sources you have examined while researching a particular topic, the methodology section should cite any sources that informed your choice and application of a particular method [i.e., the choice of a survey should include any citations to the works you used to help construct the survey].

It’s More than Sources of Information!

A description of a research study's method should not be confused with a description of the sources of information. Such a list of sources is useful in and of itself, especially if it is accompanied by an explanation about the selection and use of the sources. The description of the project's methodology complements a list of sources in that it sets forth the organization and interpretation of information emanating from those sources.

Azevedo, L.F. et al. "How to Write a Scientific Paper: Writing the Methods Section." Revista Portuguesa de Pneumologia 17 (2011): 232-238; Blair Lorrie. “Choosing a Methodology.” In Writing a Graduate Thesis or Dissertation , Teaching Writing Series. (Rotterdam: Sense Publishers 2016), pp. 49-72; Butin, Dan W. The Education Dissertation A Guide for Practitioner Scholars . Thousand Oaks, CA: Corwin, 2010; Carter, Susan. Structuring Your Research Thesis . New York: Palgrave Macmillan, 2012; Kallet, Richard H. “How to Write the Methods Section of a Research Paper.” Respiratory Care 49 (October 2004):1229-1232; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences . Thousand Oaks, CA: Corwin Press, 2008. Methods Section. The Writer’s Handbook. Writing Center. University of Wisconsin, Madison; Rudestam, Kjell Erik and Rae R. Newton. “The Method Chapter: Describing Your Research Plan.” In Surviving Your Dissertation: A Comprehensive Guide to Content and Process . (Thousand Oaks, Sage Publications, 2015), pp. 87-115; What is Interpretive Research. Institute of Public and International Affairs, University of Utah; Writing the Experimental Report: Methods, Results, and Discussion. The Writing Lab and The OWL. Purdue University; Methods and Materials. The Structure, Format, Content, and Style of a Journal-Style Scientific Paper. Department of Biology. Bates College.

Writing Tip

Statistical Designs and Tests? Do Not Fear Them!

Don't avoid using a quantitative approach to analyzing your research problem just because you fear the idea of applying statistical designs and tests. A qualitative approach, such as conducting interviews or content analysis of archival texts, can yield exciting new insights about a research problem, but it should not be undertaken simply because you have a disdain for running a simple regression. A well-designed quantitative research study can often be accomplished in very clear and direct ways, whereas a similar study of a qualitative nature usually requires considerable time to analyze large volumes of data and carries a tremendous burden of creating new paths for analysis where previously no path associated with your research problem existed.


Another Writing Tip

Knowing the Relationship Between Theories and Methods

There can be multiple meanings associated with the term "theories" and the term "methods" in social sciences research. A helpful way to delineate between them is to understand "theories" as representing different ways of characterizing the social world when you research it and "methods" as representing different ways of generating and analyzing data about that social world. Framed in this way, all empirical social sciences research involves theories and methods, whether they are stated explicitly or not. However, while theories and methods are often related, it is important that, as a researcher, you deliberately separate them in order to avoid your theories playing a disproportionate role in shaping what outcomes your chosen methods produce.

Introspectively engage in an ongoing dialectic between the application of theories and methods so that the outcomes from your methods can be used to interrogate and develop new theories, or new ways of framing the research problem conceptually. This is how scholarship grows and branches out into new intellectual territory.

Reynolds, R. Larry. Ways of Knowing. Alternative Microeconomics . Part 1, Chapter 3. Boise State University; The Theory-Method Relationship. S-Cool Revision. United Kingdom.

Yet Another Writing Tip

Methods and the Methodology

Do not confuse the terms "methods" and "methodology." As Schneider notes, a method refers to the technical steps taken to do research . Descriptions of methods usually include defining and stating why you have chosen specific techniques to investigate a research problem, followed by an outline of the procedures you used to systematically select, gather, and process the data [remember to always save the interpretation of data for the discussion section of your paper].

The methodology refers to a discussion of the underlying reasoning why particular methods were used . This discussion includes describing the theoretical concepts that inform the choice of methods to be applied, placing the choice of methods within the more general nature of academic work, and reviewing its relevance to examining the research problem. The methodology section also includes a thorough review of the methods other scholars have used to study the topic.

Bryman, Alan. "Of Methods and Methodology." Qualitative Research in Organizations and Management: An International Journal 3 (2008): 159-168; Schneider, Florian. “What's in a Methodology: The Difference between Method, Methodology, and Theory…and How to Get the Balance Right?” PoliticsEastAsia.com. Chinese Department, University of Leiden, Netherlands.


Pfeiffer Library

Research Methodologies

What are research methodologies?

According to Dawson (2019), a research methodology is the primary principle that will guide your research. It becomes the general approach you take in conducting research on your topic and determines what research method you will use. A research methodology is different from a research method because research methods are the tools you use to gather your data (Dawson, 2019). You must consider several issues when it comes to selecting the most appropriate methodology for your topic. Issues might include research limitations and ethical dilemmas that might impact the quality of your research. Descriptions of each type of methodology are included below.

Quantitative research methodologies are meant to produce numerical data and statistics, often by using survey research to gather data (Dawson, 2019). This approach tends to reach a larger number of people in a shorter amount of time. According to Labaree (2020), there are three parts that make up a quantitative research methodology:

  • Sample population
  • How you will collect your data (this is the research method)
  • How you will analyze your data

Once you decide on a methodology, you can choose the research method you will use to apply it.

Qualitative research methodologies examine the behaviors, opinions, and experiences of individuals through methods of examination (Dawson, 2019). This type of approach typically requires fewer participants, but more time with each participant. It gives research subjects the opportunity to provide their own opinions on a certain topic.

Examples of Qualitative Research Methodologies

  • Action research:  This is when the researcher works with a group of people to improve something in a certain environment.  It is a common approach for research in organizational management, community development, education, and agriculture (Dawson, 2019).
  • Ethnography: The process of organizing and describing cultural behaviors (Dawson, 2019). Researchers may immerse themselves in another culture to get an "inside look" at the group they are studying. It is often a time-consuming process because the researcher will do this for a long period of time. This can also be called "participant observation" (Dawson, 2019).
  • Feminist research:  The goal of this methodology is to study topics that have been dominated by male test subjects.  It aims to study females and compare the results to previous studies that used male participants (Dawson, 2019).
  • Grounded theory: The process of developing a theory to describe a phenomenon strictly from the data collected in a study. It differs from other research methodologies in which the researcher attempts to prove a hypothesis created before collecting data. Popular research methods for this approach include focus groups and interviews (Dawson, 2019).

A mixed methodology allows you to implement the strengths of both qualitative and quantitative research methods.  In some cases, you may find that your research project would benefit from this.  This approach is beneficial because it allows each methodology to counteract the weaknesses of the other (Dawson, 2019).  You should consider this option carefully, as it can make your research complicated if not planned correctly.

What should you do to decide on a research methodology? The most logical way to determine your methodology is to decide whether you plan on conducting qualitative or quantitative research. You also have the option to implement a mixed methods approach. Looking back on Dawson's (2019) five "W's" on the previous page may help you with this process. You should also look for key words that indicate a specific type of research methodology in your hypothesis or proposal. Some words may lean more towards one methodology than another.

Quantitative Research Key Words

  • How satisfied

Qualitative Research Key Words

  • Experiences
  • Thoughts/Think
  • Relationship


Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes . Revised on 20 March 2023.

A research design is a strategy for answering your research question  using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Step 1: Consider your aims and approach

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Step 2: Choose a type of research design

Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and   quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common types of qualitative design include case studies, ethnographies, grounded theory, and phenomenological research. They often have similar approaches in terms of data collection, but focus on different aspects when analysing the data.

Step 3: Identify your population and sampling method

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
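
As a rough illustration of the difference between the two approaches, the sketch below (in Python, with an invented sampling frame and sample size) draws a simple random probability sample and a convenience non-probability sample from the same list:

```python
import random

# Hypothetical sampling frame: 500 students identified by ID code.
sampling_frame = [f"student_{i:03d}" for i in range(500)]

# Probability sampling: every member of the frame has a known, equal
# chance of selection, so results generalise more confidently.
random.seed(42)  # fixed seed only so the example is reproducible
probability_sample = random.sample(sampling_frame, k=50)

# Non-probability (convenience) sampling: take whoever is easiest to
# reach -- here, simply the first 50 names on the list. Cheaper, but
# the sample may systematically differ from the population.
convenience_sample = sampling_frame[:50]

print(probability_sample[:5])
print(convenience_sample[:5])
```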

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Step 4: Choose your data collection methods

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

Step 5: Plan your data collection procedures

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity has already been established.
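
For instance, "satisfaction" might be operationalised as the average of several Likert-scale items. The snippet below is a minimal sketch of that idea; the items, scale, and cut-off are hypothetical rather than a standard instrument:

```python
# One participant's responses to five hypothetical 1-5 Likert items
# intended to measure "satisfaction" (5 = strongly agree).
responses = {"item_1": 4, "item_2": 5, "item_3": 3, "item_4": 4, "item_5": 4}

# Operational definition: satisfaction score = mean of the item scores.
satisfaction_score = sum(responses.values()) / len(responses)

# An arbitrary, illustrative threshold for classifying respondents.
is_satisfied = satisfaction_score >= 3.5

print(f"Satisfaction score: {satisfaction_score:.1f}, satisfied: {is_satisfied}")
```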

Reliability and validity

Reliability means your results can be consistently reproduced , while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
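
One widely used internal-consistency check you might run on pilot data is Cronbach's alpha. The sketch below computes it for a small, made-up participants-by-items score matrix; a real pilot would use your own data and a larger sample:

```python
import numpy as np

# Made-up pilot data: 6 participants (rows) answering 4 items (columns).
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 3, 2],
    [4, 4, 5, 5],
    [3, 2, 3, 3],
])

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a participants-by-items score matrix."""
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```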

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.
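
As a minimal sketch of one small part of such a plan, anonymisation, the snippet below replaces participant names with stable pseudonymous codes; the names and the salt value are invented, and a real project would also need to store the raw data and the salt securely and separately:

```python
import hashlib

# Hypothetical participant list collected during interviews.
participants = ["Alice Example", "Bob Example", "Carol Example"]

# A project-specific secret "salt" (kept separately and securely) makes
# the codes harder to reverse by guessing names.
SALT = "replace-with-a-secret-project-salt"

def pseudonym(name: str) -> str:
    """Return a short, stable code that stands in for the participant's name."""
    digest = hashlib.sha256((SALT + name).encode("utf-8")).hexdigest()
    return f"P-{digest[:8]}"

id_map = {name: pseudonym(name) for name in participants}
print(id_map)  # store this mapping securely, separate from the analysis files
```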

Step 6: Decide on your data analysis strategies

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
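
As a small illustration, the three summaries listed above can be computed directly from a set of invented test scores:

```python
import statistics
from collections import Counter

# Invented test scores for a sample of 12 participants.
scores = [55, 67, 72, 72, 75, 78, 80, 83, 85, 85, 85, 91]

distribution = Counter(scores)      # frequency of each score
mean = statistics.mean(scores)      # central tendency
std_dev = statistics.stdev(scores)  # variability (sample standard deviation)

print(f"Distribution: {dict(distribution)}")
print(f"Mean: {mean:.1f}")
print(f"Standard deviation: {std_dev:.1f}")
```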

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
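
For example, a comparison test and an association test might look like the following sketch, using invented data and assuming the SciPy library is available:

```python
from scipy import stats

# Invented outcome scores for two independent groups.
group_a = [78, 82, 69, 91, 85, 77, 80]
group_b = [70, 74, 68, 79, 72, 75, 71]

# Comparison test: independent-samples t test for a difference in means.
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Association test: Pearson correlation between two invented variables.
hours_studied = [2, 5, 1, 8, 6, 3, 4]
exam_score = [60, 75, 55, 90, 82, 64, 70]
r, r_p = stats.pearsonr(hours_studied, exam_score)

print(f"t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"r = {r:.2f}, p = {r_p:.3f}")
```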

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

Frequently asked questions

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Cite this Scribbr article


McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 21 May 2024, from https://www.scribbr.co.uk/research-methods/research-design/


BMC Medical Research Methodology

A tutorial on methodological studies: the what, when, how and why

Lawrence Mbuagbaw

1 Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON Canada

2 Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario L8N 4A6 Canada

3 Centre for the Development of Best Practices in Health, Yaoundé, Cameroon

Daeria O. Lawson

Livia Puljak

4 Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000 Zagreb, Croatia

David B. Allison

5 Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN 47405 USA

Lehana Thabane

6 Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON Canada

7 Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON Canada

8 Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON Canada

Associated Data

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.

We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?

Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.

The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 – 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 – 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).

In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig.  1 .

Fig. 1 Trends in the number of studies that mention "methodological review" or "meta-epidemiological study" in PubMed.
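
A rough sketch of how such yearly counts could be retrieved is shown below; it assumes the public NCBI E-utilities esearch endpoint and a simplified query string, and omits details such as API keys:

```python
import json
import time
from urllib.parse import urlencode
from urllib.request import urlopen

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
QUERY = ('"methodological review"[Title/Abstract] OR '
         '"meta-epidemiological study"[Title/Abstract]')

def count_for_year(year: int) -> int:
    """Number of PubMed records matching QUERY published in a given year."""
    params = urlencode({
        "db": "pubmed",
        "term": QUERY,
        "datetype": "pdat",   # filter on publication date
        "mindate": str(year),
        "maxdate": str(year),
        "retmode": "json",
        "retmax": 0,          # we only need the count, not the record IDs
    })
    with urlopen(f"{ESEARCH}?{params}") as response:
        return int(json.load(response)["esearchresult"]["count"])

for year in range(2010, 2020):
    print(year, count_for_year(year))
    time.sleep(0.4)  # stay well under NCBI's request-rate limits
```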

The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.

The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.

What is a methodological study?

Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 – 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.

Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.

When should we conduct a methodological study?

Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.

These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].

How often are methodological studies conducted?

There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.

Why do we conduct methodological studies?

Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].

Where can we find methodological studies?

Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.

Some frequently asked questions about methodological studies

In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.

Q: How should I select research reports for my methodological study?

A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].

The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
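
A minimal sketch of the difference between a simple random sample and a stratified sample of research reports (using invented record IDs, with one deliberately small stratum) might look like this:

```python
import random
from collections import defaultdict

random.seed(7)  # fixed seed only so the example is reproducible

# Invented sampling frame: (record ID, review group) pairs, with one group
# much smaller than the other.
frame = [(f"rec_{i:03d}", "Cochrane" if i % 10 == 0 else "non-Cochrane")
         for i in range(300)]

# Simple random sample: small groups may end up under-represented.
simple_sample = random.sample(frame, k=40)

# Stratified sample: draw a fixed number of records from each group so the
# groups can be compared with similar precision.
by_group = defaultdict(list)
for record in frame:
    by_group[record[1]].append(record)
stratified_sample = [rec for group in by_group.values()
                     for rec in random.sample(group, k=20)]

print(sum(1 for _, g in simple_sample if g == "Cochrane"),
      "Cochrane records in the simple random sample")
print(sum(1 for _, g in stratified_sample if g == "Cochrane"),
      "Cochrane records in the stratified sample")
```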

Q: How many databases should I search?

A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.

Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.

Q: Should I publish a protocol for my methodological study?

A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and they help avoid duplication of effort [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.

Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages in trying to publish protocols includes delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols, and those journals mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in scholarly journals, could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).

Q: How to appraise the quality of a methodological study?

A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These biases include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.

Q: Should I justify a sample size?

A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:

  • Comparing two groups
  • Determining a proportion, mean or another quantifier
  • Determining factors associated with an outcome using regression-based analyses

For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
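
As a simple illustration of a precision-based (confidence interval) justification, the sketch below estimates how many articles would be needed to estimate a proportion, such as the proportion of trials reporting a given item, within a chosen margin of error; the planning values are invented:

```python
import math

def sample_size_for_proportion(p: float, margin: float, z: float = 1.96) -> int:
    """Articles needed to estimate a proportion p within +/- margin (95% CI by default)."""
    return math.ceil((z ** 2) * p * (1 - p) / margin ** 2)

# Invented planning values: we expect about 40% of trials to report the item
# and want to estimate that proportion to within +/- 5 percentage points.
print(sample_size_for_proportion(p=0.40, margin=0.05))  # -> 369 articles
```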

Q: What should I call my study?

A: Other terms which have been used to describe/label methodological studies include "methodological review", "methodological survey", "meta-epidemiological study", "systematic review", "systematic survey", "meta-research", "research-on-research" and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes "systematic review" – as this will likely be confused with a systematic review of a clinical question. "Systematic survey" may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using "systematic" sampling (i.e. a sampling approach using specific intervals to determine who is selected) [ 32 ]. Any of the above meanings of the word "systematic" may be true for methodological studies and could be potentially misleading. "Meta-epidemiological study" is ideal for indexing, but not very informative as it describes an entire field. The term "review" may point towards an appraisal or "review" of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term "survey" is also in line with the approaches used in many methodological studies [ 9 ], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term "methodological study" is broad enough to capture most of the scenarios of such studies.

Q: Should I account for clustering in my methodological study?

A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “ What variables are relevant to methodological studies?”

A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimation equations to account for correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p -values, unduly narrow confidence intervals, and biased estimates [ 45 ].
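
A sketch of one such analysis, a GEE logistic model with articles clustered within journals, is shown below; it assumes the statsmodels package, and the data frame and its column names (reported_item, endorses_guideline, journal) are purely illustrative:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented illustrative data: one row per article, clustered within journals.
df = pd.DataFrame({
    "reported_item": [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1],  # 1 = item reported
    "endorses_guideline": [1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0],
    "journal": ["A", "A", "A", "B", "B", "B", "C", "C", "C", "D", "D", "D"],
})

# GEE logistic model with an exchangeable working correlation, so that the
# standard errors account for articles being correlated within a journal.
model = smf.gee(
    "reported_item ~ endorses_guideline",
    groups="journal",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```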

Q: Should I extract data in duplicate?

A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect the effect estimate [ 46 ], and therefore should be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, however, this area will likely see rapid new advances, with machine learning and natural language processing technologies supporting researchers with screening and data extraction [ 47 , 48 ]. Even so, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].

Q: Should I assess the risk of bias of research reports included in my methodological study?

A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].

Q: What variables are relevant to methodological studies?

A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:

  • Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.
  • Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].
  • Source of funding and conflicts of interest: Some studies have found that funded studies report better [ 56 , 57 ], while others do not [ 53 , 58 ]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [ 59 ]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry funded studies were better [ 60 ]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [ 61 ]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [ 62 ]
  • Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 – 67 ].
  • Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].
  • Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].
  • Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].
  • Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.

Q: Should I focus only on high impact journals?

A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, the JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.

Q: Can I conduct a methodological study of qualitative research?

A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.

Q: What reporting guidelines should I use for my methodological study?

A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. However, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.

Q: What are the potential threats to validity and how can I avoid them?

A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles does not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.

Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].

With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example. Findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. However, investigators must ensure that their sample truly represents the target sample either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate and justified, randomly selected sample of research reports.

Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n  = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n  = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n  = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.

Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.

A proposed framework

In order to inform discussions about methodological studies, the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:

  • What is the aim?

A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is in the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al investigated differences in treatment effect size based on the outcome reported [ 82 ].

Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].

Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].

In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].

Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].

Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].

Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].

In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.

  • 2. What is the design?

Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].

Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].

  • 3. What is the sampling strategy?

Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n  = 103) [ 30 ].

Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, or in journals with a certain ranking or on a topic. Systematic sampling can also be used when random sampling may be challenging to implement.

  • 4. What is the unit of analysis?

Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.

Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].

This framework is outlined in Fig.  2 .

An external file that holds a picture, illustration, etc.
Object name is 12874_2020_1107_Fig2_HTML.jpg

A proposed framework for methodological studies

Conclusions

Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.

In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.

Acknowledgements

Authors’ contributions

LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.

Funding

This work did not receive any dedicated funding.


Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

A tutorial on methodological studies: the what, when, how and why

Lawrence Mbuagbaw (ORCID: orcid.org/0000-0001-5855-5461), Daeria O. Lawson, Livia Puljak, David B. Allison & Lehana Thabane

BMC Medical Research Methodology, volume 20, Article number: 226 (2020). Open access; published 07 September 2020.

Abstract

Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.

We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: Is it necessary to publish a study protocol? How should relevant research reports and databases be selected for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity, and is there a way to appraise the quality of methodological studies?

Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.

Background

The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).

In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig.  1 .

Figure 1. Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.

The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.

The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.

What is a methodological study?

Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.

Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research as a potentially useful resource for further reading on these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.

When should we conduct a methodological study?

Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.

These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].

How often are methodological studies conducted?

There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.

Why do we conduct methodological studies?

Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].

Where can we find methodological studies?

Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.

Some frequently asked questions about methodological studies

In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.

Q: How should I select research reports for my methodological study?

A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].

The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
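
To make the sampling options concrete, here is a minimal Python sketch of drawing a simple random sample and a stratified sample from a sampling frame of research reports; the data frame and column names are hypothetical, not taken from any of the studies cited above.

```python
import pandas as pd

# Hypothetical sampling frame: one row per eligible research report.
frame = pd.DataFrame({
    "pmid": range(1, 501),
    "review_type": ["Cochrane"] * 100 + ["non-Cochrane"] * 400,
})

# Simple random sample of 80 reports (may under-represent the smaller group).
simple_sample = frame.sample(n=80, random_state=2020)

# Stratified sample: 40 reports from each group, giving equal groups to compare.
stratified_sample = (
    frame.groupby("review_type", group_keys=False)
         .apply(lambda g: g.sample(n=40, random_state=2020))
)

print(simple_sample["review_type"].value_counts())
print(stratified_sample["review_type"].value_counts())
```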

Q: How many databases should I search?

A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.

Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.
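
As an illustration of a replicable, time-stamped search, the sketch below queries PubMed through the NCBI E-utilities "esearch" endpoint to build a sampling frame of PMIDs; the query string and date window are placeholder assumptions, not the search strategy of any study cited here.

```python
import requests

# NCBI E-utilities "esearch" endpoint: returns PubMed IDs (PMIDs) matching a query.
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",
    "term": ('"methodological review"[Title/Abstract] '
             'OR "meta-epidemiological study"[Title/Abstract]'),
    "datetype": "pdat",   # restrict by publication date
    "mindate": "2010",
    "maxdate": "2019",
    "retmax": 10000,      # maximum number of PMIDs to return
    "retmode": "json",
}

response = requests.get(ESEARCH, params=params, timeout=30)
result = response.json()["esearchresult"]

print("Records found:", result["count"])
pmids = result["idlist"]  # these PMIDs constitute the sampling frame
```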

Q: Should I publish a protocol for my methodological study?

A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting, the use of post hoc methodologies to embellish results, and to help avoid duplication of efforts [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.

Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols, and those journals mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in scholarly journals could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).

Q: How to appraise the quality of a methodological study?

A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These biases include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.

Q: Should I justify a sample size?

A: In all instances where one is not including the entire target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:

  • Comparing two groups
  • Determining a proportion, mean or another quantifier
  • Determining factors associated with an outcome using regression-based analyses

For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
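
A minimal sketch of a confidence-interval-based sample size calculation for estimating a proportion (with an optional finite population correction) is shown below; the planning values are illustrative assumptions and not the figures used by El Dib et al.

```python
import math

def n_for_proportion(p=0.5, margin=0.05, z=1.96, population=None):
    """Articles needed to estimate a proportion p to within +/- margin (95% CI by default)."""
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        # Finite population correction when the sampling frame is small.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(n_for_proportion())                 # about 385 reports for an effectively infinite frame
print(n_for_proportion(population=600))   # fewer reports needed for a 600-report frame
```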

Q: What should I call my study?

A: Other terms which have been used to describe/label methodological studies include “ methodological review ”, “methodological survey” , “meta-epidemiological study” , “systematic review” , “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes “ systematic review” – as this will likely be confused with a systematic review of a clinical question. “ Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using “ systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [ 32 ]. Any of the above meanings of the words “ systematic” may be true for methodological studies and could be potentially misleading. “ Meta-epidemiological study” is ideal for indexing, but not very informative as it describes an entire field. The term “ review ” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term “ survey ” is also in line with the approaches used in many methodological studies [ 9 ], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “ methodological study ” is broad enough to capture most of the scenarios of such studies.

Q: Should I account for clustering in my methodological study?

A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “ What variables are relevant to methodological studies?”

A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimation equations to account for correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p -values, unduly narrow confidence intervals, and biased estimates [ 45 ].
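
The sketch below illustrates one such approach: a logistic model fit with generalized estimating equations in statsmodels, with articles clustered within journals. The data set and variable names are hypothetical, and this is not a reconstruction of the model used by Kosa et al.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical extraction sheet: one row per article.
# adequate : 1 if the article meets the reporting criterion, else 0
# funded   : 1 if the study reports dedicated funding
# journal  : cluster identifier
df = pd.DataFrame({
    "adequate": [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
    "funded":   [1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1],
    "journal":  ["A", "A", "A", "B", "B", "B", "C", "C", "C", "D", "D", "D"],
})

# Generalized estimating equations with an exchangeable working correlation,
# so standard errors account for the clustering of articles within journals.
model = sm.GEE.from_formula(
    "adequate ~ funded",
    groups="journal",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```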

Q: Should I extract data in duplicate?

A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect the effect estimate [ 46 ], and therefore should be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid advances with machine learning and natural language processing technologies to support researchers with screening and data extraction [ 47 , 48 ]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].
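
One practical way to manage duplicate extraction is to quantify agreement between the two extractors and flag items for adjudication. A minimal sketch (with invented extraction results) using Cohen's kappa:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical judgements by two extractors on the same 10 articles
# (1 = criterion reported adequately, 0 = not reported).
extractor_1 = [1, 0, 1, 1, 0, 1, 1, 0, 0, 1]
extractor_2 = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]

kappa = cohen_kappa_score(extractor_1, extractor_2)
print(f"Cohen's kappa: {kappa:.2f}")

# Flag disagreements for adjudication by discussion or a third reviewer.
disagreements = [i for i, (a, b) in enumerate(zip(extractor_1, extractor_2)) if a != b]
print("Articles needing adjudication (index):", disagreements)
```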

Q: Should I assess the risk of bias of research reports included in my methodological study?

A : Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al., investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].

Q: What variables are relevant to methodological studies?

A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:

Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.

Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].

Source of funding and conflicts of interest: Some studies have found that funded studies report better [ 56 , 57 ], while others do not [ 53 , 58 ]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [ 59 ]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry funded studies were better [ 60 ]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [ 61 ]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [ 62 ]

Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 , 66 , 67 ].

Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].

Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].

Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].

Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.

Q: Should I focus only on high impact journals?

A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, the JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.

Q: Can I conduct a methodological study of qualitative research?

A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.

Q: What reporting guidelines should I use for my methodological study?

A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. Until then, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.

Q: What are the potential threats to validity and how can I avoid them?

A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.

Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
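
For the statistical-adjustment route, a minimal sketch is shown below: a logistic regression of reporting completeness on funding source, adjusted for journal endorsement of the relevant guideline. The data set and variable names are hypothetical and this is not Zhang et al.'s model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical one-row-per-article data set.
df = pd.DataFrame({
    "complete": [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "funded":   [1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 0],
    "endorses": [1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0],
})

# Adjusted model: the coefficient on `funded` is the association of interest,
# with journal endorsement of the guideline included as a potential confounder.
adjusted = smf.logit("complete ~ funded + endorses", data=df).fit()
print(adjusted.summary())
```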

With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. However, investigators must ensure that their sample truly represents the target population either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate and justified, randomly selected sample of research reports.

Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n  = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n  = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n  = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.

Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.

A proposed framework

In order to inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:

1. What is the aim?

Methodological studies that investigate bias

A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is in the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al investigated differences in treatment effect size based on the outcome reported [ 82 ].

Methodological studies that investigate quality (or completeness) of reporting

Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].

Methodological studies that investigate the consistency of reporting

Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].

Methodological studies that investigate factors associated with reporting

In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].

Methodological studies that investigate methods

Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].

Methodological studies that summarize other methodological studies

Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].

Methodological studies that investigate nomenclature and terminology

Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].

Other types of methodological studies

In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.

2. What is the design?

Methodological studies that are descriptive

Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
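
A short sketch of how such descriptive summaries are typically produced from an extraction sheet (hypothetical columns) follows:

```python
import pandas as pd

# Hypothetical extraction sheet: one row per included trial report.
df = pd.DataFrame({
    "registered":  [1, 0, 1, 1, 0, 1, 1, 1, 0, 1],
    "sample_size": [48, 120, 75, 310, 60, 95, 400, 150, 88, 210],
})

# Count (percent) for a binary reporting item.
n_registered = df["registered"].sum()
print(f"Registered trials: {n_registered} ({100 * n_registered / len(df):.0f}%)")

# Median (interquartile range) for a continuous item.
q1, median, q3 = df["sample_size"].quantile([0.25, 0.5, 0.75])
print(f"Sample size, median (IQR): {median:.0f} ({q1:.0f} to {q3:.0f})")
```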

Methodological studies that are analytical

Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
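
A test of that kind of null hypothesis can be run as a two-sample test of proportions; in the sketch below the counts are invented for illustration and are not Tricco et al.'s data.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: reviews with a "positive" conclusion / total reviews.
positive = [12, 30]   # [Cochrane, non-Cochrane]
totals = [50, 60]

z_stat, p_value = proportions_ztest(count=positive, nobs=totals)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```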

What is the sampling strategy?

Methodological studies that include the target population

Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n  = 103) [ 30 ].

Methodological studies that include a sample of the target population

Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, in journals with a certain ranking, or on a particular topic. Systematic sampling can also be used when random sampling would be challenging to implement.
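The sketch below illustrates, with a hypothetical sampling frame of report identifiers, how simple random sampling and systematic sampling of research reports might be implemented; it is a minimal example rather than a recommended protocol.

```python
# Minimal sketch (hypothetical report identifiers) of simple random sampling
# and systematic sampling from a sampling frame of eligible reports.
import random

frame = [f"report_{i:04d}" for i in range(1, 1001)]  # 1000 eligible reports
sample_size = 100
random.seed(2020)  # fixed seed so the example is reproducible

# Simple random sampling without replacement
random_sample = random.sample(frame, k=sample_size)

# Systematic sampling: every k-th report after a random start
k = len(frame) // sample_size
start = random.randrange(k)
systematic_sample = frame[start::k][:sample_size]

print(random_sample[:3], systematic_sample[:3])
```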

What is the unit of analysis?

Methodological studies with a research report as the unit of analysis

Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.

Methodological studies with a design, analysis or reporting item as the unit of analysis

Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
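The difference between report-level and item-level units of analysis can be made concrete with a small sketch. The data below are hypothetical and merely mimic the structure of the Paquette et al. example (several planned subgroup analyses nested within each review).

```python
# Minimal sketch (hypothetical data) contrasting a report-level unit of
# analysis with an item-level one, where items (here, planned subgroup
# analyses) can occur several times within one report.
reviews = {
    "review_A": ["age", "sex", "dose"],   # three planned subgroup analyses
    "review_B": [],                        # none planned
    "review_C": ["age", "comorbidity"],
}

# Report-level summary: how many reviews planned at least one subgroup analysis
n_reviews_with_subgroups = sum(bool(items) for items in reviews.values())

# Item-level summary: total number of planned subgroup analyses
all_items = [item for items in reviews.values() for item in items]

print(f"{n_reviews_with_subgroups} of {len(reviews)} reviews planned subgroup analyses;")
print(f"{len(all_items)} planned subgroup analyses in total (unit of analysis = item).")
```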

This framework is outlined in Fig.  2 .

Figure 2. A proposed framework for methodological studies

Conclusions

Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.

In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.

Availability of data and materials

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Abbreviations

CONSORT: Consolidated Standards of Reporting Trials

EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe

GRADE: Grading of Recommendations, Assessment, Development and Evaluations

PICOT: Participants, Intervention, Comparison, Outcome, Timeframe

PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses

SWAR: Studies Within a Review

SWAT: Studies Within a Trial

References

Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.


Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.


Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.

Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.

Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.

Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;358:j4008.

Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.

Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.

Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.

Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.

Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.

Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L. A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals. Can J Anaesth. 2018;65(11):1180–95.

Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.

Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.


Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.

Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.

Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.

Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.

The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.

Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 winter simulation conference: 2003; 2003. p. 66–70.


Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.

Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.


Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.

Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.

Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.

Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.

The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.

Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.

Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.

Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.

Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.

Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.

De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.

Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.

Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.

Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.

Porta M (ed.): A dictionary of epidemiology, 5th edn. Oxford: Oxford University Press, Inc.; 2008.

El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.

Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.

Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.


Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.

Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.

Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.

Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.

Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.

Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.

Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.

Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.

Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.

Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.

Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.

Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.

Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.

Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.

de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.

Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.

Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.

Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.

Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.

Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.

Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.

Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.

Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.

Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.

Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.

Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.

Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.

Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.

Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.

METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.

Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.

Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.

Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.

Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.

Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.

Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.

Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.

Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.

Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.

Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.

Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.

Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA. Assessing the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals. Eur Heart J Qual Care Clin Outcomes. 2019.

Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.

Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.

Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.

Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.

Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.

Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.

Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.

Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.

Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.


Acknowledgements

This work did not receive any dedicated funding.

Author information

Authors and affiliations

Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada

Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane

Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada

Lawrence Mbuagbaw & Lehana Thabane

Centre for the Development of Best Practices in Health, Yaoundé, Cameroon

Lawrence Mbuagbaw

Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia

Livia Puljak

Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA

David B. Allison

Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada

Lehana Thabane

Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada

Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada


Contributions

LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.

Corresponding author

Correspondence to Lawrence Mbuagbaw .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


Cite this article

Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20 , 226 (2020). https://doi.org/10.1186/s12874-020-01107-7


Received : 27 May 2020

Accepted : 27 August 2020

Published : 07 September 2020

DOI : https://doi.org/10.1186/s12874-020-01107-7


Keywords

  • Methodological study
  • Meta-epidemiology
  • Research methods
  • Research-on-research



  • Open access
  • Published: 16 May 2024

Promoting equality, diversity and inclusion in research and funding: reflections from a digital manufacturing research network

  • Oliver J. Fisher 1 ,
  • Debra Fearnshaw   ORCID: orcid.org/0000-0002-6498-9888 2 ,
  • Nicholas J. Watson 3 ,
  • Peter Green 4 ,
  • Fiona Charnley 5 ,
  • Duncan McFarlane 6 &
  • Sarah Sharples 2  

Research Integrity and Peer Review, volume 9, Article number: 5 (2024)


Background

Equal, diverse, and inclusive teams lead to higher productivity, creativity, and greater problem-solving ability resulting in more impactful research. However, there is a gap between equality, diversity, and inclusion (EDI) research and practices to create an inclusive research culture. Research networks are vital to the research ecosystem, creating valuable opportunities for researchers to develop their partnerships with both academics and industrialists, progress their careers, and enable new areas of scientific discovery. A feature of a network is the provision of funding to support feasibility studies – an opportunity to develop new concepts or ideas, as well as to ‘fail fast’ in a supportive environment. The work of networks can address inequalities through equitable allocation of funding and proactive consideration of inclusion in all of their activities.

Methods

This study proposes a strategy to embed EDI within research network activities and funding review processes. This paper evaluates 21 planned mitigations introduced to address known inequalities within research events and in how funding is awarded. EDI data were collected from researchers engaging in a digital manufacturing research network’s activities and funding calls to measure the impact of the proposed method.

Results

Quantitative analysis indicates that the network’s approach was successful in creating a more ethnically diverse network, engaging with early career researchers, and supporting researchers with care responsibilities. However, more work is required to create a gender balance across the network activities and to ensure the representation of academics who declare a disability. Preliminary findings suggest the network’s anonymous funding review process has helped address inequalities in funding award rates for women and those with care responsibilities; however, more data are required to validate these observations and to understand the impact of different interventions individually and in combination.

Conclusions

In summary, this study offers compelling evidence regarding the efficacy of a research network's approach in advancing EDI within research and funding. The network hopes that these findings will inform broader efforts to promote EDI in research and funding and that researchers, funders, and other stakeholders will be encouraged to adopt evidence-based strategies for advancing this important goal.


Introduction

Achieving equality, diversity, and inclusion (EDI) is an underpinning contributor to human rights, civilisation and society-wide responsibility [ 1 ]. Furthermore, promoting and embedding EDI within research environments is essential to make the advancements required to meet today’s research challenges [ 2 ]. This is evidenced by equal, diverse and inclusive teams leading to higher productivity, creativity and greater problem-solving ability [ 3 ], which increases the scientific impact of research outputs and researchers [ 4 ]. However, there remains a gap between EDI research and the everyday implementation of inclusive practices to achieve change [ 5 ]. This paper presents and reflects on the EDI measures trialled by the UK Engineering and Physical Sciences Research Council (EPSRC) funded digital manufacturing research network, Connected Everything (grant number: EP/S036113/1) [ 6 ]. The EPSRC is a UK research council that funds engineering and physical sciences research. By sharing these reflections, this work aims to contribute to the wider effort of creating an inclusive research culture. The perceptions of equality, diversity, and inclusion may vary among individuals. For the scope of this study, the following definitions are adopted:

Equality: Equality is about ensuring that every individual has an equal opportunity to make the most of their lives and talents. No one should have poorer life chances because of the way they were born, where they come from, what they believe, or whether they have a disability.

Diversity: Diversity concerns understanding that each individual is unique, recognising our differences, and exploring these differences in a safe, positive, and nurturing way to value each other as individuals.

Inclusion: Inclusion is an effort and practice in which groups or individuals with different backgrounds are culturally and socially accepted, welcomed and treated equally. This concerns treating each person as an individual, making them feel valued, and supported and being respectful of who they are.

Research networks have varied goals, but a common purpose is to create new interdisciplinary research communities, by fostering interactions between researchers and appropriate scientific, technological and industrial groups. These networks aim to offer valuable career progression opportunities for researchers, through access to research funding, forming academic and industrial collaborations at network events, personal and professional development, and research dissemination. However, feedback from a 2021 survey of 19 UK research networks, suggests that these research networks are not always diverse, and whilst on the face of it they seem inclusive, they are perceived as less inclusive by minority groups (including non-males, those with disabilities, and ethnic minority respondents) [ 7 ]. The exclusivity of these networks further exacerbates the inequality within the academic community as it prevents certain groups from being able to engage with all aspects of network activities.

Research investigating the causes of inequality and exclusivity has identified several suggestions to make research culture more inclusive, including improving diverse representation within event programmes and panels [ 8 , 9 ]; ensuring events are accessible to all [ 10 ]; providing personalised resources and training to build capacity and increase engagement [ 11 ]; educating institutions and funders to understand and address the barriers to research [ 12 ]; and increasing diversity in peer review and funding panels [ 13 ]. Universities, research institutions and research funding bodies are increasingly taking responsibility for ensuring the health of the research and innovation system and for fostering inclusion. For example, the EPSRC has set out its own ‘Expectation for EDI’ to promote the formation of a diverse and inclusive research culture [ 14 ]. To drive change, there is an emphasis on measuring diversity and linking it to measured outcomes, so that future studies of how interventions affect diversity can be benchmarked [ 5 ]. Further, collecting and sharing EDI data can also drive aspirations, provide a target for actions, and allow institutions to consider common issues. However, the lack of available data regarding the impact of EDI practices on diversity presents an obstacle, impeding the realisation of these benefits and hampering progress in addressing common issues and fostering diversity and inclusion [ 5 ].

Funding acquisition is important to an academic’s career progression, yet funding is often awarded in ways that feel unequal and/or non-transparent. Because funding matters so much to career progression, careers can be damaged if credit for obtaining funding is not recognised appropriately; and when those involved in successful research go unrecognised, funding bodies lack a complete picture of the research community and are unable to deliver the best value for money [ 15 ]. Awarding funding is often a key research network activity and an area where networks can have a positive impact on the wider research community. It is therefore important that practices are established to embed EDI considerations within the funding process and to ensure that network funding is awarded without bias. Recommendations from the literature to make the funding award process fairer include: ensuring a diverse funding panel; funders instituting reviewer anti-bias training; anonymous review; and/or automatic adjustments to correct for known biases [ 16 ]. In the UK, the government organisation UK Research and Innovation (UKRI), tasked with overseeing research and innovation funding, has pledged to publish data to enhance transparency. This initiative aims to furnish an evidence base for designing interventions and evaluating their efficacy. While the data show some positive signs (e.g., the award rates for male and female PI applicants were equal at 29% in 2020–21), Ottoline Leyser (UKRI Chief Executive) highlights the ‘persistent pernicious disparities for under-represented groups in applying for and winning research funding’ [ 17 ]. This suggests that a more radical approach to rethinking the traditional funding review process may be required.

This paper describes the approach taken by the ‘Connected Everything’ EPSRC-funded Network to embed EDI in all aspects of its research funding process, and evaluates the impact of this ambition, leading to recommendations for embedding EDI in research funding allocation.

Connected everything’s equality diversity and inclusion strategy

Connected Everything aims to create a multidisciplinary community of researchers and industrialists to address key challenges associated with the future of digital manufacturing. The network is managed by an investigator team responsible for strategic planning and, working with the network manager, for overseeing the delivery of key activities. The network was first funded between 2016 and 2019 (grant number: EP/P001246/1) and was awarded a second grant (grant number: EP/S036113/1). The network activities are based around three goals: building partnerships, developing leadership and accelerating impact.

The Connected Everything network represents a broad range of disciplines, including manufacturing, computer science, cybersecurity, engineering, human factors, business, sociology, innovation and design. Some of the subject areas, such as computer science and engineering, tend to be male-dominated (e.g., in 2021/22, a total of 185,42 higher education student enrolments in engineering & technology subjects were broken down as 20.5% female and 79.5% male [ 18 ]). The network also faces challenges in terms of accessibility for people with care responsibilities and disabilities. In 2019, Connected Everything committed to embedding EDI in all its network activities and published a guiding principle and goals for improving EDI (see Additional file 1 ). When designing the processes to deliver the second iteration of Connected Everything, the team identified several sources of potential bias/exclusion with the potential to affect engagement with the network. Based on these identified factors, a series of mitigating interventions were implemented, as outlined in Table  1 .

Connected everything anonymous review process

A key Connected Everything activity is the funding of feasibility studies to enable cross-disciplinary, foresight, speculative and risky early-stage research, with a focus on low technology-readiness levels. Awards are made via a short, written application followed by a pitch to a multidisciplinary diverse panel including representatives from industry. Six- to twelve-month-long projects are funded to a maximum value of £60,000.

The current peer-review process used by funders may reveal the applicants’ identities to the reviewer. This can introduce dilemmas to the reviewer regarding (a) deciding whether to rely exclusively on information present within the application or search for additional information about the applicants and (b) whether or not to account for institutional prestige [ 34 ]. Knowing an applicant’s identity can bias the assessment of the proposal, but by focusing the assessment on the science rather than the researcher, equality is more frequently achieved between award rates (i.e., the proportion of successful applications) [ 15 ]. To progress Connected Everything’s commitment to EDI, the project team created a 2-stage review process, where the applicants’ identity was kept anonymous during the peer review stage. This anonymous process, which is outlined in Fig.  1 , was created for the feasibility study funding calls in 2019 and used for subsequent funding calls.

Figure 1. Connected Everything’s anonymous review process [EDI: equality, diversity, and inclusion]

To facilitate the anonymous review process, the proposal was submitted in two parts: part A, the research idea, and part B, the capability-to-deliver statement. All proposals were first anonymously reviewed by a random selection of two members from the Connected Everything executive group, a diverse group of digital manufacturing experts and peers from academia, industry and research institutions that provides guidance and leadership on Connected Everything activities. The reviewers rated the proposals against the selection criteria (see Additional file 1 , Table 1) and provided overall comments alongside a recommendation on whether or not the applicant should be invited to the panel pitch. This information was summarised and shared with a moderation sift panel, made up of a minimum of two Connected Everything investigators and at least one member of the executive group, which tensioned the reviewers’ comments (i.e., weighed the reviewers’ comments and evaluations against each other) and ultimately decided which proposals to invite to the panel. This tensioning process included using the identifying information to check that the applicants had the capability to deliver the project. Where this remained unclear, the applicants were asked to confirm expertise in an area the moderation sift panel considered key, or to bring additional expertise into the project team for the panel pitch.

During stage two the applicants were invited to pitch their research idea to a panel of experts who were selected to reflect the diversity of the community. The proposals, including applicants’ identities, were shared with the panel at least two weeks ahead of the panel. Individual panel members completed a summary sheet at the end of the pitch session to record how well the proposal met the selection criteria (see Additional file 1 , Table 1). Panel members did not discuss their funding decision until all the pitches had been completed. A panel chair oversaw the process but did not declare their opinion on a specific feasibility study unless the panel could not agree on an outcome. The panel and panel chair were reminded to consider ways to manage their unconscious bias during the selection process.

Due to the positive response received regarding the anonymous review process, Connected Everything extended its use when reviewing other funded activities. As these awards were for smaller grant values (~ £5,000), it was decided that no panel pitch was required, and the researcher’s identity was kept anonymous for the entire process.
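The stage-1 step described above can be pictured with a short sketch. The structures and names below are hypothetical and are not Connected Everything's actual tooling; they simply show one way to split a proposal into an anonymous part A and an identifying part B and to allocate two randomly selected executive-group reviewers to the anonymous part only.

```python
# Minimal sketch (hypothetical names and structures) of the stage-1 allocation:
# only the anonymous part A of each proposal is circulated to two randomly
# selected members of the executive group.
import random

executive_group = ["reviewer_1", "reviewer_2", "reviewer_3", "reviewer_4", "reviewer_5"]

proposals = [
    {"id": "P01", "part_a": "research idea ...", "part_b": "capability to deliver ..."},
    {"id": "P02", "part_a": "research idea ...", "part_b": "capability to deliver ..."},
]

random.seed(42)  # reproducible allocation for the sketch
stage_one_assignments = {}
for proposal in proposals:
    reviewers = random.sample(executive_group, k=2)     # two reviewers per proposal
    stage_one_assignments[proposal["id"]] = {
        "reviewers": reviewers,
        "document": proposal["part_a"],                  # part B withheld at stage 1
    }

print(stage_one_assignments)
```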

Data collection and analysis methods

Data collection

Equality, diversity and inclusion data were voluntarily collected from applicants for Connected Everything research funding and from participants who won scholarships to attend Connected Everything funded activities. Responses to the EDI data requests were collected from nine Connected Everything coordinated activities between 2019 and 2022. Data requests were sent after the applicant had applied for Connected Everything funding or had attended a Connected Everything funded activity. All data requests were completed voluntarily, with reassurance given that completing the request would in no way affect their application. In total, 260 responses were received, of which the three feasibility study calls accounted for 56.2%. Overall, the response rate was 73.8%.

To understand the diversity of participants engaging with Connected Everything activities and funding, the data requests asked for details of specific diversity characteristics: gender, transgender, disability, ethnicity, age, and care responsibilities. Although sex and gender are terms that are often used interchangeably, they are two different concepts. To clarify, the definitions used by the UK government describe sex as a set of biological attributes that is generally limited to male or female, and typically attributed to individuals at birth. In contrast, gender identity is a social construction related to behaviours and attributes, and is self-determined based on a person’s internal perception, identification and experience. Transgender is a term used to describe people whose gender identity is not the same as the sex they were registered at birth. Respondents were first asked to identify their gender and then whether their gender was different from their birth sex.

For this study, respondents were asked to (voluntarily) self-declare whether they consider themselves to be disabled or not. Ethnicity within the data requests was based on the 2011 census classification system. When reporting ethnicity data, this study followed the AdvanceHE example of aggregating the census categories into six groups to enable benchmarking against the available academic ethnicity data (a simple mapping along these lines is sketched after the list below). AdvanceHE is a UK charity that works to improve the higher education system for staff, students and society. However, it was acknowledged that there were limitations with this grouping, including the assumption that minority ethnic staff or students are a homogenous group [ 16 ]. Therefore, this study made sure to break down these groups during the discussion of the results. The six groups are:

Asian: Asian/Asian British: Indian, Pakistani, Bangladeshi, and any other Asian background;

Black: Black/African/Caribbean/Black British: African, Caribbean, and any other Black/African/Caribbean background;

Chinese: Chinese ethnic group;

Mixed: mixed or multiple ethnic backgrounds;

Other: other ethnic backgrounds, including Arab; and

White: all white ethnic groups.
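A minimal sketch of the aggregation step is shown below. The census category labels are paraphrased and the mapping is an assumption based on the six groups listed above, not AdvanceHE's published lookup table.

```python
# Minimal sketch (paraphrased, assumed category labels) of aggregating 2011
# census ethnicity categories into the six broader benchmarking groups.
CENSUS_TO_GROUP = {
    "Asian/Asian British: Indian": "Asian",
    "Asian/Asian British: Pakistani": "Asian",
    "Asian/Asian British: Bangladeshi": "Asian",
    "Asian/Asian British: Any other Asian background": "Asian",
    "Black/African/Caribbean/Black British: African": "Black",
    "Black/African/Caribbean/Black British: Caribbean": "Black",
    "Chinese": "Chinese",
    "Mixed: White and Black Caribbean": "Mixed",
    "Mixed: White and Asian": "Mixed",
    "Arab": "Other",
    "Any other ethnic group": "Other",
    "White: British": "White",
    "White: Any other White background": "White",
}

def aggregate(responses):
    """Count responses per aggregated group; unknown labels fall into 'Other'."""
    counts = {}
    for label in responses:
        group = CENSUS_TO_GROUP.get(label, "Other")
        counts[group] = counts.get(group, 0) + 1
    return counts

print(aggregate(["Chinese", "Arab", "Asian/Asian British: Indian"]))
```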

Benchmarking data

Published data from the Higher Education Statistics Agency [ 26 ] (a UK organisation responsible for collecting, analysing, and disseminating data related to higher education institutions and students), UKRI funding data [ 19 , 35 ] and 2011 census data [ 36 ] were used to benchmark the EDI data collected within this study. The responses collected were compared to the engineering and technology cluster of academic disciplines, as this cluster is most representative of Connected Everything’s main funder, the EPSRC. The Higher Education Statistics Agency defines the engineering and technology cluster as including the following subject areas: general engineering; chemical engineering; mineral, metallurgy & materials engineering; civil engineering; electrical, electronic & computer engineering; mechanical, aero & production engineering; and IT, systems sciences & computer software engineering [ 37 ].

When assessing the equality in funding award rates, previous studies have focused on analysing the success rates of only the principal investigators [ 15 , 16 , 38 ]; however, Connected Everything recognised that writing research proposals is a collaborative task, so requested diversity data from the whole research team. The average of the last six years of published principal investigator and co-investigator diversity data for UKRI and EPSRC funding awards (2015–2021) was used to benchmark the Connected Everything funding data [ 35 ]. The UKRI and EPSRC funding review process includes a peer review stage followed by panel pitch and assessment stage; however, the applicant's track record is assessed during the peer review stage, unlike the Connected Everything review process.

The data collected have been used to evaluate the success of the planned mitigations to address EDI factors affecting the higher education research ecosystem, as outlined in Table  1 (" Connected Everything’s Equality Diversity and Inclusion Strategy " Section).

Dominance of small number of research-intensive universities receiving funding from network

The dominance of a small number of research-intensive universities receiving funding from a network can have implications for the field of research, including the unequal distribution of resources, a lack of diversity in research, limited collaboration opportunities, and impacts on innovation and progress. Analysis of published EPSRC funding data between 2015 and 2021 [ 19 ] shows that funding has been predominantly (74.1%, 95% CI [71.%, 76.9%] out of £3.98 billion) awarded to Russell Group universities. The Russell Group is a self-selected association of 24 research-intensive universities (out of the 174 universities in the UK), established in 1994. Evaluation of the universities that received Connected Everything feasibility study funding between 2016 and 2019 shows that Connected Everything awarded just over half (54.6%, 95% CI [25.1%, 84.0%] out of 11 awards) to Russell Group universities. Figure  2 shows that the Connected Everything funding awarded to Russell Group universities reduced to 44.4%, 95% CI [12.0%, 76.9%] of 9 awards between 2019 and 2022.

Figure 2. A comparison of funding awarded by EPSRC (total = £3.98 billion) across Russell Group universities and non-Russell Group universities, alongside the allocations for Connected Everything I (total = £660 k) and Connected Everything II (total = £540 k)

Dominance of successful applications from men

The percentage point differences between the award rates of researchers who identified as female, declared a disability, identified as an ethnic minority, or reported care responsibilities and the award rates of their respective counterparts are plotted in Fig.  3 . Bars to the right of the axis indicate that the award rate of female/declared-disability/ethnic-minority/carer applicants is greater than that of male/non-disabled/white/non-carer applicants.

Figure 3. Percentage point (PP) differences in award rate by funding provider for gender, disability status, ethnicity and care responsibilities (data not collected by UKRI and EPSRC [ 35 ]). The total number of applicants for each funder is as follows: Connected Everything = 146, EPSRC = 37,960, and UKRI = 140,135. *The numbers of applicants were too small (< 5) to enable a meaningful discussion

Figure  3 (A) shows that, between 2015 and 2021, research team applicants who identified as male had a higher award rate than those who identified as female when applying for EPSRC and wider UKRI research council funding. Connected Everything funding applicants who identified as female achieved a higher award rate (19.4%, 95% CI [6.5%, 32.4%] out of 146) than male applicants (15.6%, 95% CI [8.8%, 22.4%] out of 146). These data suggest that biases have been reduced by the Connected Everything review process and other mitigation strategies (e.g., visible gender diversity among panel pitch members and publishing the Connected Everything guiding principle and goals to demonstrate commitment to equality and fairness). This finding aligns with an earlier study that found gender bias during the peer review process, resulting in female investigators receiving less favourable evaluations than their male counterparts [ 15 ].
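For transparency about how figures such as "19.4%, 95% CI [6.5%, 32.4%]" can be obtained, the sketch below computes an award rate with a normal-approximation (Wald) confidence interval and a percentage-point difference. The counts are hypothetical and the paper does not state which interval method was used.

```python
# Minimal sketch (hypothetical counts) of an award rate with a Wald 95%
# confidence interval and a percentage-point difference between two groups.
from math import sqrt

def award_rate_ci(successes, applicants, z=1.96):
    """Proportion with a normal-approximation (Wald) CI, as percentages."""
    p = successes / applicants
    half_width = z * sqrt(p * (1 - p) / applicants)
    return 100 * p, 100 * (p - half_width), 100 * (p + half_width)

# Hypothetical counts for two applicant groups
f_rate, f_lo, f_hi = award_rate_ci(successes=7, applicants=36)
m_rate, m_lo, m_hi = award_rate_ci(successes=17, applicants=110)

print(f"Group 1 award rate {f_rate:.1f}% (95% CI {f_lo:.1f}-{f_hi:.1f}%)")
print(f"Group 2 award rate {m_rate:.1f}% (95% CI {m_lo:.1f}-{m_hi:.1f}%)")
print(f"Percentage point difference: {f_rate - m_rate:+.1f} pp")
```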

Over-representation of people identifying as male in engineering and technology academic community

Figure  4 shows the responses to the gender question, with 24.2%, 95% CI [19.0%, 29.4%] of 260 responses identifying as female. This aligns with the average for the engineering and technology cluster (21.4%, 95% CI [20.9%, 21.9%] female of 27,740 academic staff), which includes subject areas representative of our main funder, the EPSRC [ 22 ]. We also sought to understand the representation of transgender researchers within the network. However, following the rounding policy outlined by UK Government statistics policies and procedures [ 39 ], the number of respondents who identified as a gender different from their sex registered at birth was too low (< 5) to enable a meaningful discussion.

Figure 4. Gender question responses from a total of 260 respondents

Dominance of successful applications from white academics

Figure  3 (C) shows that researchers with a minority ethnicity consistently have a lower award rate than white researchers when applying for EPSRC and UKRI funding. Similarly, the results in Fig.  3 (C) indicate that white researchers are more successful (by 8.0 percentage points, 95% CI [-8.6%, 24.6%]) when applying for Connected Everything funding. These results indicate that more measures should be implemented to support ethnic minority researchers applying for Connected Everything funding, as well as to check that there is no unconscious bias in any of the Connected Everything funding processes. The breakdown of the ethnic diversity of applicants at different stages of the Connected Everything review process (i.e. all applications, applicants invited to panel pitch and awarded feasibility studies) is plotted in Fig.  5 to help identify where more support is needed. Figure  5 shows an increase in the proportion of white researchers from 54%, 95% CI [45.4%, 61.8%] of all 146 applicants to 66%, 95% CI [52.8%, 79.1%] of the 50 researchers invited to the panel pitch. This suggests that stage 1 of the Connected Everything review process (anonymous review of written applications) may favour white applicants and/or introduce unconscious bias into the process.

Figure 5. Ethnicity question responses from different stages during the Connected Everything anonymous review process. The total number of applicants is 146, with 50 at the panel stage and 23 ultimately awarded

Under-representation of those from black or minority ethnic backgrounds

Connected Everything appears to engage a wide range of ethnic diversity, as shown in Fig.  6 . The ethnic groups Asian (18.3%, 95% CI [13.6%, 23.0%]), Black (5.1%, 95% CI [2.4%, 7.7%]), Chinese (12.5%, 95% CI [8.4%, 16.5%]), mixed (3.5%, 95% CI [1.3%, 5.7%]) and other (7.8%, 95% CI [4.5%, 11.1%]) have a higher representation among the 260 individuals engaging with the network’s activities than in both the engineering and technology academic community and the wider UK population. When these groups are separated into the original ethnic diversity answers, it becomes apparent that there is no engagement with ‘Black or Black British: Caribbean’, ‘Mixed: White and Black Caribbean’ or ‘Mixed: White and Asian’ researchers within Connected Everything activities. The lack of engagement with researchers from a Caribbean heritage is symptomatic of a lack of representation within the UK research landscape [ 25 ].

Figure 6. Ethnicity question responses from a total of 260 respondents compared to the distribution of the 13,085 UK engineering and technology (E&T) academic staff [ 22 ] and 56 million people recorded in the UK 2011 census data [ 36 ]

Under-representation of disabilities, chronic conditions, invisible illnesses and neurodiversity in funded activities and events

Figure  7 (A) shows that 5.7%, 95% CI [2.4%, 8.9%] of 194 responses declared a disability. This is higher than the average of engineering and technology academics that identify as disabled (3.4%, 95% CI [3.2%, 3.7%] of 27,730 academics). Between Jan-March 2022, 9.0 million people of working age (16–64) within the UK were identified as disabled by the Office for National Statistics [ 40 ], which is 21% of the working age population [ 27 ]. Considering these statistics, there is a stark under-representation of disabilities, chronic conditions, invisible illnesses and neurodiversity amongst engineering and technology academic staff and those engaging in Connected Everything activities.

Figure 7. Responses to A Disability and B Care responsibilities questions, collected from a total of 194 respondents

Between 2015 and 2020, academics who declared a disability were less successful than academics without a disability in attracting UKRI and EPSRC funding, as shown in Fig.  3 (B). While Fig.  3 (B) shows that those who declare a disability have a higher Connected Everything funding award rate, the number of applicants who declared a disability was too small (< 5) to enable a meaningful discussion of this result.

Under-representation of those with care responsibilities in funded activities and events

In response to the care responsibilities question, Fig.  7 (B) shows that 27.3%, 95% CI [21.1%, 33.6%] of 194 respondents identified as carers, which is higher than the 6% of adults estimated to be providing informal care across the UK in a UK Government survey of the 2020/2021 financial year [ 41 ]. However, the ‘informal care’ definition used by the 2021 survey includes unpaid care to a friend or family member needing support, perhaps due to illness, older age, disability, a mental health condition or addiction [ 41 ]. The Connected Everything survey included care responsibilities across the spectrum of care that includes partners, children, other relatives, pets, friends and kin. It is important to consider a wide spectrum of care responsibilities, as key academic events, such as conferences, have previously been demonstrably exclusionary sites for academics with care responsibilities [ 42 ]. Breakdown analysis of the responses to care responsibilities by gender in Fig.  8 reveals that 37.8%, 95% CI [25.3%, 50.3%] of 58 women respondents reported care responsibilities, compared to 22.6%, 95% CI [61.1%, 76.7%] of 136 men respondents. Our findings reinforce similar studies that conclude the burden of care falls disproportionately on female academics [ 43 ].

Figure 8. Responses to care responsibilities when grouped by A 136 males and B 58 females

Figure  3 (D) shows that researchers with care responsibilities applying for Connected Everything funding have a higher award rate than applicants without care responsibilities. These results suggest that the Connected Everything review process is supportive of researchers with care responsibilities, who have faced barriers in other areas of academia.

Reduced opportunities for ECRs

Early-career researchers (ECRs) represent the transition stage between starting a PhD and senior academic positions. EPSRC defines an ECR as someone who is either within eight years of their PhD award, or equivalent professional training or within six years of their first academic appointment [ 44 ]. These periods exclude any career break, for example, due to family care; health reasons; and reasons related to COVID-19 such as home schooling or increased teaching load. The median age for starting a PhD in the UK is 24 to 25, while PhDs usually last between three and four years [ 45 ]. Therefore, these data would imply that the EPSRC median age of ECRs is between 27 and 37 years. It should be noted, however, that this definition is not ideal and excludes ECRs who may have started their research career later in life.
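
To illustrate how the EPSRC definition above might be operationalised when screening applicants, the hypothetical helper below checks both routes (years since PhD award and years since first academic appointment) net of any declared career breaks. The function name and inputs are illustrative assumptions, not part of EPSRC guidance.

```python
def is_ecr(years_since_phd=None, years_since_first_appointment=None,
           career_break_years=0.0):
    """Hypothetical check against the EPSRC ECR definition: within 8 years of
    the PhD award (or equivalent training) OR within 6 years of the first
    academic appointment, excluding declared career breaks."""
    phd_route = (years_since_phd is not None
                 and years_since_phd - career_break_years <= 8)
    appointment_route = (years_since_first_appointment is not None
                         and years_since_first_appointment - career_break_years <= 6)
    return phd_route or appointment_route

# e.g. nine years post-PhD with a two-year career break still qualifies
print(is_ecr(years_since_phd=9, career_break_years=2))  # True
```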

Connected Everything aims to support ECRs via measures that include mentoring support, workshops, summer schools and podcasts. Figure 9 shows a greater representation of researchers aged 30–44 engaging with Connected Everything activities (62.4%, 95% CI [55.6%, 69.2%] of 194 respondents) when compared to the wider engineering and technology academic community (43.7%, 95% CI [43.1%, 44.3%] of 27,780 academics) and the UK population (26.9%, 95% CI [26.9%, 26.9%]).

Fig. 9 Age question responses from a total of 194 respondents compared to distribution of the 27,780 UK engineering and technology (E&T) academic staff [ 22 ] and 56 million people recorded in the UK 2011 census data [ 36 ]

High competition for funding has a greater impact on ECRs

Figure 10 shows that the largest age bracket applying for and winning Connected Everything funding is 31–45, whereas 72%, 95% CI [70.1%, 74.5%] of 12,075 researchers awarded EPSRC grants between 2015 and 2021 were 40 years or older. These results suggest that the measures introduced by Connected Everything have been successful at providing funding opportunities for researchers who are likely to be at an early-to-mid career stage.

Fig. 10 Age of researchers at applicant and awarded funding stages for A Connected Everything between 2019–2022 (total of 146 applicants and 23 awarded) and B EPSRC funding between 2015–2021 [ 35 ] (total of 35,780 applicants and 12,075 awarded)

The results of this paper provide insights into the impact that Connected Everything’s planned mitigations have had on promoting equality, diversity, and inclusion (EDI) in research and funding. Collecting EDI data from individuals who engage with network activities and apply for research funding enabled an evaluation of whether these mitigations have been successful in achieving the intended outcomes outlined at the start of the study, as summarised in Table  2 .

The results in Table 2 indicate that Connected Everything’s approach to EDI has helped achieve the intended outcome of improving representation of women, ECRs, those with a declared disability and those from Black and minority ethnic backgrounds engaging with network events, when compared to the engineering and technology academic community. In addition, the network has helped raise awareness of the high presence of researchers with care responsibilities at network events, which can help to track progress towards making future events inclusive and accessible to these carers. The data highlight two areas for improvement: (1) ensuring a gender balance; and (2) increasing representation of those with declared disabilities. Both these discrepancies are indicative of the wider imbalances and under-representation of these groups in the engineering and technology academic community [ 26 ], yet represent areas where networks can strive to make a difference. Possible strategies include: using targeted outreach; promoting greater representation of these groups among event speakers; and going further to create a welcoming and inclusive environment. One barrier that can disproportionately affect women researchers is the need to balance care responsibilities with attending network events [ 46 ]. This was reflected in the Connected Everything data, which reported that 37.8%, 95% CI [25.3%, 50.3%] of women engaging with network activities had care responsibilities, compared to 22.6%, 95% CI [61.1%, 76.7%] of men. Providing accommodations such as on-site childcare, flexible scheduling, or virtual attendance options can therefore help to promote inclusivity and allow more women researchers to attend.

Only 5.7%, 95% CI [2.4%, 8.9%] of responses from those engaging with Connected Everything declared a disability, which is higher than the engineering and technology academic community (3.4%, 95% CI [3.2%, 3.7%]) [ 26 ], but unrepresentative of the wider UK population. It has been suggested that academics can be uncomfortable declaring disabilities because scholarly contributions and institutional citizenship are so prized that they feel they cannot be honest about their health concerns and instead keep them secret [ 47 ]. In research networks, it is important to be mindful of this hidden group within higher education and ensure that measures are put in place to make the network’s activities inclusive to all. Future accommodations to improve the inclusivity of research events include: improving the physical accessibility of venues; providing assistive technology, such as screen readers, audio descriptions and captioning, to help individuals with visual or hearing impairments access and participate; providing sign language interpreters; offering flexible scheduling options; and providing quiet rooms, written materials in accessible formats, and support staff trained to work with individuals with cognitive disabilities.

Connected Everything introduced measures (e.g., an anonymised reviewing process, Q&A sessions before funding calls, inclusive design of the panel pitch) to help address inequalities in how funding is awarded. Table 2 shows success in reducing the dominance of researchers who identify as male and of research-intensive universities in winning research funding, and shows that researchers with care responsibilities were more successful at winning funding than those without care responsibilities. The data revealed that the proposed measures were unable to address the inequality in award rates between white and ethnic minority researchers, which remains an area for improvement. The inequality appears to occur during the anonymous review stage, with a greater proportion of white researchers being invited to panel. Recommendations to make the review process fairer include: ensuring greater diversity of reviewers; reviewer anti-bias training; and automatic adjustments to correct for known biases in writing style [ 16 , 32 ].

When reflecting on the development of a strategy to embed EDI throughout the network, Connected Everything has learned several key lessons that may benefit other networks undergoing a similar activity. These include:

EDI is never ‘done’: There is a constant need to review approaches to EDI to ensure they remain relevant to the network community. Connected Everything could review its principles to include the concept of justice in its approach to diversity and inclusion. The concept of justice concerning EDI refers to the removal of systematic barriers that stop fair and equitable distribution of resources and opportunities among all members of society, regardless of their individual characteristics or backgrounds. The principles and subsequent actions could be reviewed against the EDI expectations [ 14 ], paying particular attention to areas where barriers may still be present. For example, shifting from welcoming people into existing structures and culture to creating new structures and culture together, with specific emphasis on decision or advisory mechanisms within the network. This activity could lend itself to focusing more on tailored support to overcome barriers, thus achieving equity, if it is not within the control of the network to remove the barrier itself (justice).

Widen diversity categories: By collecting data on a broad range of characteristics, we can identify and address disparities and biases that might otherwise be overlooked. A weakness of this dataset is that it ignores the experiences of those with intersectional identities, across race, ethnicity, gender, class, disability and/or LGBTQI identity. The Wellcome Trust noted how little was known about the socio-economic background of scientists and researchers [ 48 ].

Collect data on whole research teams: For the first two calls for feasibility study funding, Connected Everything only asked the Principal Investigator to voluntarily provide their data. We realised that this was a limited approach and, in the third call, asked for the data regarding the whole research team to be shared anonymously. Furthermore, we do not currently measure the diversity of our event speakers, panellists or reviewers. Collecting these data in the future will help to ensure the network is accountable and will ensure that all groups are represented during our activities and in the funding decision-making process.

High response rate: Previous surveys measuring network diversity (e.g., [ 7 ]) have struggled to get responses when surveying their memberships, whereas this study achieved a response rate of 73.8%. We attribute this high response rate to sending EDI data requests at the point of contact with the network (e.g., on submitting funding proposals or after attending network events), rather than trying to survey the entire network membership at any one point in time.

Improve administration: The administration associated with collecting EDI data requires a commitment to transparency, inclusivity, and continuous improvement. For example, during the first feasibility funding call, Connected Everything made it clear that the review process would be anonymous, but the application form was not split into separate documents. This made anonymising the application forms extremely time-consuming. For the subsequent calls, separate documents were created: Part A for identifying information (Principal Investigator contact details, Project Team and Industry collaborators) and Part B for the research idea.

Accepting that this can be uncomfortable: Trying to improve EDI can be uncomfortable because it often requires challenging our assumptions, biases, and existing systems and structures. However, it is essential if we want to make real progress towards equity and inclusivity. Creating processes to support embedding EDI takes time and Connected Everything has found it is rare to get it right the first time. Connected Everything is sharing its learning as widely as possible both to support others in their approaches and continue our learning as we reflect on how to continually improve, even when it is challenging.

Enabling individual engagement with EDI: During this work, Connected Everything recognised that methods for engaging with such EDI issues in research design and delivery are lacking. Connected Everything, with support from the Future Food Beacon of Excellence at the University of Nottingham, set out to develop a card-based tool [ 49 ] to help researchers and stakeholders identify questions around how their work may promote equity and increase inclusion, or may have a negative impact on one or more protected groups, and how this can be overcome. The results of this have been shared in conference presentations [ 50 ] and will be published later.

While this study provides insights into how EDI can be improved in research network activities and funding processes, it is essential to acknowledge several limitations that may impact the interpretation of the findings.

Sample size and generalisability: A total of 260 responses were received, which may not be representative of our overall network of 500+ members. Nevertheless, these data provide a sense of the current diversity engaging in Connected Everything activities and funding opportunities, which we can compare with other available data to steer action to further diversify the network.

Handling of missing data: Out of the 260 responses, 66 data points were missing for questions regarding age, disability, and caring responsibilities. These questions were mistakenly omitted from a Connected Everything summer school survey, contributing 62 of the missing data points. While we assumed the remainder of the missing data to be missing at random during analysis, it is important to acknowledge that it could be related to other factors, potentially introducing bias into our results.
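
As a minimal sketch of how the analysis could treat such gaps under the stated missing-at-random assumption, proportions can be computed over non-missing answers only; the column name and example values below are illustrative, not the study data.

```python
import pandas as pd

# Illustrative responses; None marks a question that was not answered.
responses = pd.DataFrame({
    "care_responsibilities": ["Yes", "No", None, "No", "Yes", None, "No"],
})

# Drop missing answers before computing shares, i.e. treat them as
# missing at random rather than as a separate response category.
shares = responses["care_responsibilities"].dropna().value_counts(normalize=True)
print(shares.mul(100).round(1))
```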

Emphasis on quantitative data: The study relies on using quantitative data to evaluate the impact of the EDI measures introduced by Connected Everything. However, relying solely on quantitative metrics may overlook nuanced aspects of EDI that cannot be easily quantified. For example, EDI encompasses multifaceted issues influenced by historical, cultural, and contextual factors. These nuances may not be fully captured by numbers alone. In addition, some EDI efforts may not yield immediate measurable outcomes but still contribute to a more inclusive environment.

Diversity and inclusion are not synonymous: The study proposes 21 measures to contribute towards creating an equal, diverse and inclusive research culture and collects diversity data to measure the impact of these measures. However, while diversity is simpler to monitor, increasing diversity alone does not guarantee equality or inclusion. Even with diverse research groups, individuals from underrepresented groups may still face barriers, microaggressions, or exclusion.

Balancing anonymity and rigour in grant reviews: The anonymous review process proposed by Connected Everything removes personal and organisational details from the research ideas under reviewer evaluation. However, there remains a possibility that a reviewer could discern the identity of the grant applicant from the research idea. Reviewers are expected to be subject matter experts in the field relevant to the grant proposal they are evaluating. Given the specialised nature of scientific research, it is conceivable that a well-known applicant could be identified through the specifics of the work, the methodologies employed, and even the writing style.

Expanding gender identity options: A limitation of this study emerged from the restricted gender options (male, female, other, prefer not to say) provided to respondents when answering the gender identity question. This limitation reflects the context of data collection in 2018, a time when diversity monitoring guidance was still limited. As our understanding of gender identity evolves beyond binary definitions, future data collection efforts should embrace a more expansive and inclusive approach, recognising the diverse spectrum of gender identities.

In conclusion, this study provides evidence of the effectiveness of a research network's approach to promoting equality, diversity, and inclusion (EDI) in research and funding. By collecting EDI data from individuals who engage with network activities and apply for research funding, this study has shown that the network's initiatives have had a positive impact on representation and fairness in the funding process. Specifically, the analysis reveals that the network is successful at engaging with ECRs and those with care responsibilities, and has a diverse range of ethnicities represented at Connected Everything events. Additionally, the network activities have a more equal gender balance and greater representation of researchers with disabilities when compared to the engineering and technology academic community, though there is still an under-representation of these groups compared to the national population.

Connected Everything introduced measures to help address inequalities in how funding is awarded. The measures introduced helped reduce the dominance of researchers who identified as male and research-intensive universities in winning research funding. Additionally, researchers with care responsibilities were more successful at winning funding than those without care responsibilities. However, inequality persisted with white researchers achieving higher award rates than those from ethnic minority backgrounds. Recommendations to make the review process fairer include: ensuring greater diversity of reviewers; reviewer anti-bias training; and automatic adjustments to correct for known biases in writing style.

Connected Everything’s approach to embedding EDI in network activities has already been shared widely with other EPSRC-funded networks and Hubs (e.g. the UKRI Circular Economy Hub and the UK Acoustics Network Plus). The network hopes that these findings will inform broader efforts to promote EDI in research and funding and that researchers, funders, and other stakeholders will be encouraged to adopt evidence-based strategies for advancing this important goal.

Availability of data and materials

The data were collected anonymously; however, it may be possible to identify an individual by combining specific records from the data request forms. Therefore, the study data have been presented in aggregate form to protect the confidentiality of individuals, and the data utilised in this study cannot be made openly accessible due to ethical obligations to protect the privacy and confidentiality of the data providers.

Abbreviations

ECR: Early career researcher

EDI: Equality, diversity and inclusion

EPSRC: Engineering and Physical Sciences Research Council

UKRI: UK Research and Innovation

Xuan J, Ocone R. The equality, diversity and inclusion in energy and AI: call for actions. Energy AI. 2022;8:100152.


Guyan K, Oloyede FD. Equality, diversity and inclusion in research and innovation: UK review. Advance HE; 2019.  https://www.ukri.org/wp-content/uploads/2020/10/UKRI-020920-EDI-EvidenceReviewUK.pdf .

Cooke A, Kemeny T. Cities, immigrant diversity, and complex problem solving. Res Policy. 2017;46:1175–85.

AlShebli BK, Rahwan T, Woon WL. The preeminence of ethnic diversity in scientific collaboration. Nat Commun. 2018;9:5163.

Gagnon S, Augustin T, Cukier W. Interplay for change in equality, diversity and inclusion studies: Hum Relations. Epub ahead of print 23 April 2021. https://doi.org/10.1177/00187267211002239 .

Connected Everything. https://connectedeverything.ac.uk/. Accessed 27 Feb 2023.

Chandler-Wilde S, Kanza S, Fisher O, Fearnshaw D, Jones E. Reflections on an EDI Survey of UK-Government-Funded Research Networks in the UK. In: The 51st International Congress and Exposition on Noise Control Engineering. St. Albans: Institute of Acoustics; 2022. p. 9.0–940.


Prathivadi Bhayankaram K, Prathivadi Bhayankaram N. Conference panels: do they reflect the diversity of the NHS workforce? BMJ Lead. 2022;6:57–59.

Goodman SW, Pepinsky TB. Gender representation and strategies for panel diversity: Lessons from the APSA Annual Meeting. PS Polit Sci Polit 2019;52:669–676.

Olsen J, Griffiths M, Soorenian A, et al. Reporting from the margins: disabled academics reflections on higher education. Scand J Disabil Res. 2020;22:265–74.

Baldie D, Dickson CAW, Sixsmith J. Building an Inclusive Research Culture. In: Knowledge, Innovation, and Impact. 2021, pp. 149–157.

Sato S, Gygax PM, Randall J, et al. The leaky pipeline in research grant peer review and funding decisions: challenges and future directions. High Educ. 2020;82:145–62.

Recio-Saucedo A, Crane K, Meadmore K, et al. What works for peer review and decision-making in research funding: a realist synthesis. Res Integr Peer Rev. 2022;7:1–28.

EPSRC. Expectations for equality, diversity and inclusion – UKRI, https://www.ukri.org/about-us/epsrc/our-policies-and-standards/equality-diversity-and-inclusion/expectations-for-equality-diversity-and-inclusion/ (2022, Accessed 26 Apr 2022).

Witteman HO, Hendricks M, Straus S, et al. Are gender gaps due to evaluations of the applicant or the science? A natural experiment at a national funding agency. Lancet. 2019;393:531–40.

Li YL, Bretscher H, Oliver R, et al. Racism, equity and inclusion in research funding. Sci Parliam. 2020;76:17–9.

UKRI. UKRI publishes latest diversity data for research funding. https://www.ukri.org/news/ukri-publishes-latest-diversity-data-for-research-funding/ (Accessed 28 July 2022).

Higher Education Statistics Agency. What do HE students study? https://www.hesa.ac.uk/data-and-analysis/students/what-study (2023, Accessed 25 March 2023).

UKRI. Competitive funding decisions, https://www.ukri.org/what-we-offer/what-we-have-funded/competitive-funding-decisions/ (2023, Accessed 2 April 2023).

Santos G, Van Phu SD. Gender and academic rank in the UK. Sustain. 2019;11:3171.

Jebsen JM, Nicoll Baines K, Oliver RA, et al. Dismantling barriers faced by women in STEM. Nat Chem. 2022;14:1203–6.

Advance HE. Equality in higher education: staff statistical report 2021 | Advance HE, https://www.advance-he.ac.uk/knowledge-hub/equality-higher-education-statistical-report-2021 (28 October 2021, Accessed 26 April 2022).

EngineeringUK. Engineering in Higher Education, https://www.engineeringuk.com/media/318874/engineering-in-higher-education_report_engineeringuk_march23_fv.pdf (2023, Accessed 25 March 2023).

Bhopal K. Academics of colour in elite universities in the UK and the USA: the ‘unspoken system of exclusion’. Stud High Educ. 2022;47:2127–37.

Williams P, Bath S, Arday J, et al. The Broken Pipeline: Barriers to Black PhD Students Accessing Research Council Funding. 2019.

HESA. Who’s working in HE? Personal characteristics, https://www.hesa.ac.uk/data-and-analysis/staff/working-in-he/characteristics (2023, Accessed 1 April 2023).

Office for National Statistics. Principal projection - UK population in age groups, https://www.ons.gov.uk/peoplepopulationandcommunity/populationandmigration/populationprojections/datasets/tablea21principalprojectionukpopulationinagegroups (2022, Accessed 3 August 2022).

HESA. Who’s studying in HE? Personal characteristics, https://www.hesa.ac.uk/data-and-analysis/students/whos-in-he/characteristics (2023, Accessed 1 April 2023).

Herman E, Nicholas D, Watkinson A et al. The impact of the pandemic on early career researchers: what we already know from the internationally published literature. Prof la Inf ; 30. Epub ahead of print 11 March 2021. https://doi.org/10.3145/epi.2021.mar.08 .

Moreau M-P, Robertson M. ‘Care-free at the top’? Exploring the experiences of senior academic staff who are caregivers, https://srhe.ac.uk/wp-content/uploads/2020/03/Moreau-Robertson-SRHE-Research-Report.pdf (2019).

Shillington AM, Gehlert S, Nurius PS, et al. COVID-19 and long-term impacts on tenure-line careers. J Soc Social Work Res. 2020;11:499–507.

de Winde CM, Sarabipour S, Carignano H et al. Towards inclusive funding practices for early career researchers. J Sci Policy Gov; 18. Epub ahead of print 24 March 2021. https://doi.org/10.38126/JSPG180105 .

Wellcome Trust. Grant funding data report 2018/19, https://wellcome.org/sites/default/files/grant-funding-data-2018-2019.pdf (2020).

Vallée-Tourangeau G, Wheelock A, Vandrevala T, et al. Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences. Humanit Soc Sci Commun. 2022;9:1–11.

Diversity data – UKRI. https://www.ukri.org/what-we-offer/supporting-healthy-research-and-innovation-culture/equality-diversity-and-inclusion/diversity-data/ (accessed 30 September 2022).

2011 Census - Office for National Statistics. https://www.ons.gov.uk/census/2011census (Accessed 2 August 2022).

Cost centres. (2012/13 onwards) | HESA, https://www.hesa.ac.uk/support/documentation/cost-centres/2012-13-onwards (Accessed 28 July 2022).

Viner N, Powell P, Green R. Institutionalized biases in the award of research grants: a preliminary analysis revisiting the principle of accumulative advantage. Res Policy. 2004;33:443–54.

ofqual. Rounding policy - GOV.UK, https://www.gov.uk/government/publications/ofquals-statistics-policies-and-procedures/rounding-policy (2023, Accessed 2 April 2023).

Office for National Statistics. Labour market status of disabled people, https://www.ons.gov.uk/employmentandlabourmarket/peopleinwork/employmentandemployeetypes/datasets/labourmarketstatusofdisabledpeoplea08 (2022, Accessed 3 August 2022).

Family Resources Survey: financial year 2020 to 2021 - GOV.UK, https://www.gov.uk/government/statistics/family-resources-survey-financial-year-2020-to-2021 (Accessed 10 Aug 2022).

Henderson E. Academics in two places at once: (not) managing caring responsibilities at conferences. 2018, p. 218.

Jolly S, Griffith KA, DeCastro R, et al. Gender differences in time spent on parenting and domestic responsibilities by high-achieving young physician-researchers. Ann Intern Med. 2014;160:344–53.

UKRI. Early career researchers, https://www.ukri.org/what-we-offer/developing-people-and-skills/esrc/early-career-researchers/ (2022, Accessed 2 April 2023).

Cornell B. PhD Life: The UK student experience , www.hepi.ac.uk (2019, Accessed 2 April 2023).

Kibbe MR, Kapadia MR. Underrepresentation of women at academic medical conferences—manels must stop. JAMA Netw Open 2020; 3:e2018676–e2018676.

Brown N, Leigh J. Ableism in academia: where are the disabled and ill academics? Disabil Soc. 2018;33:985–989. https://doi.org/10.1080/09687599.2018.1455627

Bridge Group. Diversity in Grant Awarding and Recruitment at Wellcome Summary Report. 2017.

Craigon P, Fisher O, Fearnshaw D, et al. VERSION 1 - The Equality Diversity and Inclusion cards. 2022. https://doi.org/10.6084/m9.figshare.21222212.v3.

Connected Everything II. EDI ideation cards for research - YouTube, https://www.youtube.com/watch?v=GdJjL6AaBbc&ab_channel=ConnectedEverythingII (2022, Accessed 7 June 2023).


Acknowledgements

The authors would like to acknowledge the support of the Engineering and Physical Sciences Research Council (EPSRC) [grant number EP/S036113/1], Connected Everything II: Accelerating Digital Manufacturing Research Collaboration and Innovation. The authors would also like to gratefully acknowledge the Connected Everything Executive Group for their contribution towards developing Connected Everything’s equality, diversity and inclusion strategy.

This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) [grant number EP/S036113/1].

Author information

Authors and affiliations

Food, Water, Waste Research Group, Faculty of Engineering, University of Nottingham, University Park, Nottingham, UK

Oliver J. Fisher

Human Factors Research Group, Faculty of Engineering, University of Nottingham, University Park, Nottingham, UK

Debra Fearnshaw & Sarah Sharples

School of Food Science and Nutrition, University of Leeds, Leeds, UK

Nicholas J. Watson

School of Engineering, University of Liverpool, Liverpool, UK

Peter Green

Centre for Circular Economy, University of Exeter, Exeter, UK

Fiona Charnley

Institute for Manufacturing, University of Cambridge, Cambridge, UK

Duncan McFarlane


Contributions

OJF analysed and interpreted the data, and was the lead author in writing and revising the manuscript. DF led the data acquisition and supported the interpretation of the data. DF was also a major contributor to the design of the equality, diversity and inclusion (EDI) strategy proposed in this work. NJW supported the design of the EDI strategy and was a major contributor in reviewing and revising the manuscript. PG supported the design of the EDI strategy and was a major contributor in reviewing and revising the manuscript. FC supported the design of the EDI strategy and the interpretation of the data. DM supported the design of the EDI strategy. SS led the development of the EDI strategy proposed in this work, and was a major contributor in data interpretation and in reviewing and revising the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Debra Fearnshaw .

Ethics declarations

Ethics approval and consent to participate

The research was considered exempt from requiring ethical approval as it uses completely anonymised survey results that are routinely collected as part of the administration of the Network Plus, and informed consent was obtained at the time of original data collection.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Fisher, O.J., Fearnshaw, D., Watson, N.J. et al. Promoting equality, diversity and inclusion in research and funding: reflections from a digital manufacturing research network. Res Integr Peer Rev 9 , 5 (2024). https://doi.org/10.1186/s41073-024-00144-w


Received : 12 October 2023

Accepted : 09 April 2024

Published : 16 May 2024

DOI : https://doi.org/10.1186/s41073-024-00144-w


  • Research integrity
  • Network policy
  • Funding reviewing
  • EDI interventions

Research Integrity and Peer Review

ISSN: 2058-8615


EULAR recommendations for the management of psoriatic arthritis with pharmacological therapies: 2023 update

  • http://orcid.org/0000-0002-4528-310X Laure Gossec 1 , 2 ,
  • http://orcid.org/0000-0002-6685-8873 Andreas Kerschbaumer 3 ,
  • http://orcid.org/0000-0002-2517-0247 Ricardo J O Ferreira 4 , 5 ,
  • http://orcid.org/0000-0003-2108-0030 Daniel Aletaha 3 ,
  • http://orcid.org/0000-0002-9475-9362 Xenofon Baraliakos 6 ,
  • Heidi Bertheussen 7 ,
  • Wolf-Henning Boehncke 8 ,
  • http://orcid.org/0000-0001-5331-8221 Bente Appel Esbensen 9 , 10 ,
  • Iain B McInnes 11 ,
  • Dennis McGonagle 12 , 13 ,
  • http://orcid.org/0000-0002-3892-6947 Kevin L Winthrop 14 ,
  • Andra Balanescu 15 ,
  • Peter V Balint 16 ,
  • http://orcid.org/0000-0001-7518-1131 Gerd R Burmester 17 ,
  • http://orcid.org/0000-0003-2606-0573 Juan D Cañete 18 , 19 ,
  • Pascal Claudepierre 20 , 21 ,
  • http://orcid.org/0000-0002-1473-1715 Lihi Eder 22 ,
  • http://orcid.org/0000-0003-4229-6818 Merete Lund Hetland 23 , 24 ,
  • http://orcid.org/0000-0001-5592-724X Annamaria Iagnocco 25 ,
  • Lars Erik Kristensen 26 , 27 ,
  • Rik Lories 28 , 29 ,
  • http://orcid.org/0000-0002-8418-7145 Rubén Queiro 30 , 31 ,
  • http://orcid.org/0000-0002-9022-8863 Daniele Mauro 32 ,
  • http://orcid.org/0000-0002-9683-3407 Helena Marzo-Ortega 12 , 13 ,
  • http://orcid.org/0000-0002-6620-0457 Philip J Mease 33 , 34 ,
  • http://orcid.org/0000-0002-2571-788X Peter Nash 35 ,
  • Wendy Wagenaar 36 , 37 ,
  • Laura Savage 38 ,
  • http://orcid.org/0000-0001-8740-9615 Georg Schett 39 ,
  • http://orcid.org/0000-0002-9441-5535 Stephanie J W Shoop-Worrall 40 ,
  • http://orcid.org/0000-0002-0807-7139 Yoshiya Tanaka 41 ,
  • http://orcid.org/0000-0002-3561-5932 Filip E Van den Bosch 42 ,
  • Annette van der Helm-van Mil 43 ,
  • http://orcid.org/0000-0002-0573-464X Alen Zabotti 44 ,
  • http://orcid.org/0000-0002-5781-158X Désirée van der Heijde 43 ,
  • Josef S Smolen 3
  • 1 INSERM, Institut Pierre Louis d'Epidémiologie et de Santé Publique , Sorbonne Universite , Paris , France
  • 2 APHP, Rheumatology Department , Hopital Universitaire Pitie Salpetriere , Paris , France
  • 3 Division of Rheumatology, Department of Medicine 3 , Medical University of Vienna , Vienna , Austria
  • 4 Nursing Research, Innovation and Development Centre of Lisbon (CIDNUR) , Higher School of Nursing of Lisbon , Lisbon , Portugal
  • 5 Rheumatology Department , Centro Hospitalar e Universitário de Coimbra EPE , Coimbra , Portugal
  • 6 Rheumazentrum Ruhrgebiet , Ruhr University Bochum , Herne , Germany
  • 7 EULAR Patient Research Partner , EULAR , Oslo , Norway
  • 8 Dermatology and Venereology , Geneva University Hospitals , Geneva , Switzerland
  • 9 Copenhagen Center for Arthritis Research, Center for Rheumatology and Spine Diseases, Centre for Head and Orthopaedics , Rigshospitalet , Glostrup , Denmark
  • 10 Department of Clinical Medicine , University of Copenhagen , Copenhagen , Denmark
  • 11 College of Medical Veterinary and Life Sciences , University of Glasgow , Glasgow , UK
  • 12 LTHT , NIHR Leeds Biomedical Research Centre , Leeds , UK
  • 13 Leeds Institute of Rheumatic and Musculoskeletal Medicine , University of Leeds , Leeds , UK
  • 14 Division of Infectious Diseases, School of Medicine, School of Public Health , Oregon Health & Science University , Portland , Oregon , USA
  • 15 Sf Maria Hospital , University of Medicine and Pharmacy Carol Davila Bucharest , Bucharest , Romania
  • 16 Medical Imaging Centre, Semmelweis University, 3rd Rheumatology Department, National Institute of Musculoskeletal Diseases , Budapest , Hungary
  • 17 Department of Rheumatology and Clinical Immunology, Freie Universität Berlin and Humboldt-Universität zu Berlin , Charité Universitätsmedizin Berlin , Berlin , Germany
  • 18 Arthritis Unit, Department of Rheumatology , Hospital Clínic Barcelona , Barcelona , Spain
  • 19 FCRB , IDIBAPS , Barcelona , Spain
  • 20 Rheumatology , AP-HP, Henri Mondor University Hospital , Creteil , France
  • 21 EA Epiderme , UPEC , Creteil , France
  • 22 Department of Medicine, University of Toronto , Women's College Hospital , Toronto , Toronto , Canada
  • 23 The Copenhagen Center for Arthritis Research, Center for Rheumatology and Spine Diseases, Centre of Head and Orthopedics , Rigshospitalet Glostrup , Glostrup , Denmark
  • 24 Department of Clinical Medicine, Faculty of Health and Medical Sciences , University of Copenhagen , Copenhagen , Denmark
  • 25 Academic Rheumatology Centre, Dipartimento Scienze Cliniche Biologiche , Università di Torino - AO Mauriziano Torino , Turin , Italy
  • 26 The Parker Institute , Bispebjerg , Denmark
  • 27 Frederiksberg Hospital , Copenhagen University , Copenhagen , Denmark
  • 28 Laboratory of Tissue Homeostasis and Disease, Skeletal Biology and Engineering Research Center , KU Leuven , Leuven , Belgium
  • 29 Division of Rheumatology , University Hospitals Leuven , Leuven , Belgium
  • 30 Rheumatology , Hospital Universitario Central de Asturias , Oviedo , Spain
  • 31 Translational Immunology Division, Biohealth Research Institute of the Principality of Asturias , Oviedo University School of Medicine , Oviedo , Spain
  • 32 Department of Precision Medicine , University of Campania Luigi Vanvitelli , Naples , Italy
  • 33 Rheumatology Research , Providence Swedish , Seattle , Washington , USA
  • 34 University of Washington School of Medicine , Seattle , Washington , USA
  • 35 School of Medicine , Griffith University , Brisbane , Queensland , Australia
  • 36 Tranzo, Tilburg School of Social and Behavioral Sciences , Tilburg University , Tilburg , The Netherlands
  • 37 Young PARE Patient Research Partner , EULAR , Zurich , Switzerland
  • 38 School of Medicine and Dermatology, Leeds Teaching Hospitals NHS Trust , University of Leeds , Leeds , UK
  • 39 Department of Internal Medicine 3, Rheumatology and Immunology and Universitätsklinikum Erlangen , Friedrich-Alexander-Universität Erlangen-Nürnberg , Erlangen , Germany
  • 40 Children and Young Person’s Rheumatology Research Programme, Centre for Musculoskeletal Research , The University of Manchester , Manchester , UK
  • 41 First Department of Internal Medicine , University of Occupational and Environmental Health, Japan , Kitakyushu , Japan
  • 42 Department of Internal Medicine and Pediatrics, VIB Center for Inflammation Research , Ghent University , Gent , Belgium
  • 43 Rheumatology , Leiden University Medical Center , Leiden , The Netherlands
  • 44 Department of Medical and Biological Sciences , Azienda sanitaria universitaria Friuli Centrale , Udine , Italy
  • Correspondence to Laure Gossec, INSERM, Institut Pierre Louis d'Epidémiologie et de Santé Publique, Sorbonne Universite, Paris, France; laure.gossec@aphp.fr

Objective New modes of action and more data on the efficacy and safety of existing drugs in psoriatic arthritis (PsA) required an update of the EULAR 2019 recommendations for the pharmacological treatment of PsA.

Methods Following EULAR standardised operating procedures, the process included a systematic literature review and a consensus meeting of 36 international experts in April 2023. Levels of evidence and grades of recommendations were determined.

Results The updated recommendations comprise 7 overarching principles and 11 recommendations, and provide a treatment strategy for pharmacological therapies. Non-steroidal anti-inflammatory drugs should be used in monotherapy only for mild PsA and in the short term; oral glucocorticoids are not recommended. In patients with peripheral arthritis, rapid initiation of conventional synthetic disease-modifying antirheumatic drugs is recommended and methotrexate preferred. If the treatment target is not achieved with this strategy, a biological disease-modifying antirheumatic drug (bDMARD) should be initiated, without preference among modes of action. Relevant skin psoriasis should orient towards bDMARDs targeting interleukin (IL)-23p40, IL-23p19, IL-17A and IL-17A/F inhibitors. In case of predominant axial or entheseal disease, an algorithm is also proposed. Use of Janus kinase inhibitors is proposed primarily after bDMARD failure, taking relevant risk factors into account, or in case bDMARDs are not an appropriate choice. Inflammatory bowel disease and uveitis, if present, should influence drug choices, with monoclonal tumour necrosis factor inhibitors proposed. Drug switches and tapering in sustained remission are also addressed.

Conclusion These updated recommendations integrate all currently available drugs in a practical and progressive approach, which will be helpful in the pharmacological management of PsA.

  • Psoriatic Arthritis
  • Biological Therapy
  • Biosimilar Pharmaceuticals

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/ .

https://doi.org/10.1136/ard-2024-225531


Introduction

Psoriatic arthritis (PsA) is a disease which has benefited from notable progress over recent years. Concepts have evolved, such as very early diagnosis and pre-PsA, as well as defining treatment targets and applying a holistic approach to comorbidity management. 1–4 Pharmacological options have extended, with the approval of new agents targeting various modes of action for PsA (as well as skin psoriasis). Drugs licensed for PsA now include (1) conventional synthetic (cs) disease-modifying antirheumatic drugs (DMARDs), such as methotrexate (MTX), sulfasalazine and leflunomide; (2) biological (b) DMARDs targeting tumour necrosis factor (TNF), the interleukin (IL)-12/23 or IL-23 pathway, and the IL-17A and IL-17A/F pathway; and (3) targeted synthetic (ts) DMARDs that inhibit Janus kinases (JAKs) or phosphodiesterase 4 (PDE4) ( table 1 ). 5 New safety data have emerged in inflammatory arthritis, particularly a worldwide cautionary comment regarding JAK inhibitors (JAKis), following a large randomised controlled trial (RCT) of tofacitinib in rheumatoid arthritis (RA). 6–8 Since the last EULAR recommendations for the pharmacological management of PsA in 2019, the field has changed significantly. 9–12 An update of the EULAR PsA management recommendations was therefore timely. 9

Table 1 Disease-modifying treatment options for psoriatic arthritis in 2023

This update addresses the non-topical, pharmacological management of PsA, with a specific focus on musculoskeletal (MSK) manifestations, while also addressing the spectrum of PsA, including how skin psoriasis, extra-MSK manifestations and comorbidities should influence treatment choices.

In accordance with the EULAR updated standardised operating procedures, 13 the process leading to this update included a data-driven approach and expert opinion.

After approval for an update by the EULAR Council in September 2022, taskforce members were selected by the convenor (JSS) and the methodologist (LG), to include more than one-third of new members, as well as country and gender representation. For the first time, experts from Australia, Japan and North America participated. Representatives from the health professionals in rheumatology (HPR) committee, patient research partners from PARE (People with Arthritis/Rheumatism) and young colleagues from the EMEUNET (EMerging EUlar NETwork) were included. Five members were recruited through an open call to EULAR countries via a competitive application process.

In October 2022, the steering group had its first meeting. The steering group consisted of seven rheumatologists (including the convenor, the methodologist and the fellow: JSS, LG, AK, DA, XB, IBM and DGM), a dermatologist (W-HB), an infectious disease specialist (KLW), an experienced fellow rheumatologist (AK), a patient research partner (HB) and two health professionals (BAE and RJOF, the latter acting in the capacity of a junior methodologist). Questions were then defined and addressed through a systematic literature review (SLR), performed by the fellow (AK) between November 2022 and April 2023, for the literature pertaining to pharmacological treatments of PsA and published since the previous SLR (ie, since the end of 2018). 5

The taskforce comprised the steering group and 23 other experts; members came from 19 different countries (of which 15 were EULAR countries), and included 27 rheumatology specialists, 2 dermatologists, 1 infectious disease specialist, 2 people affected with PsA acting as patient research partners, 2 HPRs and 3 rheumatology/epidemiology fellows/trainees. Overall, 47% of the taskforce members had not participated in the previous update in 2019. In April 2023, the taskforce met for a physical meeting to develop the updated bullet points. Each point was discussed in detail both in smaller (breakout) groups and in plenary sessions until consensus was reached. Group approval was sought through votes (by raised hands) for each bullet point; the limit for acceptance of individual recommendations was set at ≥75% majority among the taskforce for the first voting round; then (after discussions and potential reformulations) at ≥67% majority; and finally, if required, the last round of votes was accepted with >50% acceptance or else a proposal was rejected. 13
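
As a compact restatement of the staged acceptance thresholds described above (≥75% in the first voting round, ≥67% after discussion and reformulation, then >50% in a final round), the snippet below is purely illustrative of the stated rule and is not software used by the taskforce.

```python
def accepted(votes_for: int, votes_cast: int, voting_round: int) -> bool:
    """Staged consensus thresholds: >=75% (round 1), >=67% (round 2), >50% (round 3)."""
    share = votes_for / votes_cast
    if voting_round == 1:
        return share >= 0.75
    if voting_round == 2:
        return share >= 0.67
    return share > 0.50

print(accepted(25, 36, 1))  # ~69% support: not accepted in round 1
print(accepted(25, 36, 2))  # but would meet the round-2 threshold
```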

Although the SLR was a strong component of the discussions, the process was not only evidence-based but also experience-based and consensus-based, and included consideration of safety, efficacy, cost and long-term data. The levels of evidence (LoE) and grades of recommendation (GoR) were determined for each recommendation based on the Oxford Evidence Based System. 13 14 In May 2023, an anonymised email-based voting on the level of agreement (LoA) among the taskforce members was performed on a 0–10 scale (with 10 meaning full agreement) allowing calculation of mean LoA.

These recommendations address non-topical pharmacological treatments with a main focus on MSK manifestations. These recommendations concern stakeholders, such as experts involved in the care of patients with PsA, particularly rheumatologists and other health professionals (such as rheumatology nurses), general practitioners, dermatologists and other specialists; and also people with PsA as well as other stakeholders, for example, government and hospital officials, patient organisations, regulatory agencies and reimbursement institutions.

The overarching principles (OAPs) and recommendations are shown in table 2 , with LoE, GoR and LoA. The updated recommendations include 7 OAPs (vs 6 in 2019) and 11 recommendations (vs 12 in 2019, due to merges). Of the 11 recommendations, only 4 are unchanged compared with 2019 (the modifications compared with the 2019 recommendations are represented in table 3 ).

Table 2 2023 updated EULAR recommendations for the pharmacological management of psoriatic arthritis

Table 3 Comparison of the 2019 and 2023 EULAR recommendations for the management of psoriatic arthritis

Overarching principles

Of the seven OAPs, three remain unchanged, three were reworded and one has been added (overarching principle G). For more information on the thought process leading to the OAPs (unchanged or slightly changed), please refer to the 2015 and 2019 recommendations manuscripts. 9 15 Key points from the discussion of the OAPs are addressed in the following:

A. Psoriatic arthritis is a heterogeneous and potentially severe disease, which may require multidisciplinary treatment (unchanged) .

Although PsA is potentially severe, not all patients will develop severe forms. 16 17 Multidisciplinary management is helpful for many patients, through collaboration between physicians of different specialties and HPRs with the appropriate expertise. 18 19

B. Treatment of psoriatic arthritis patients should aim at the best care and must be based on a shared decision between the patient and the rheumatologist, considering efficacy, safety, patient preferences and costs.

This OAP was modified from 2019 to add patient preferences as an element to be considered and emphasise the importance of shared decision-making to maximise treatment adherence and efficacy while at the same time minimise complications driven by uncontrolled (active) disease as well as potential side effects of pharmacological drugs. 20 21

C. Rheumatologists are the specialists who should primarily care for the musculoskeletal manifestations of patients with psoriatic arthritis; in the presence of clinically relevant skin involvement, a rheumatologist and a dermatologist should collaborate in diagnosis and management.

We consider that rheumatology experts provide the best care for patients with PsA, given their experience with the many drugs used to treat these and other rheumatic and musculoskeletal diseases (RMDs), including the important aspects of safety and comorbidities. Consultation with dermatologists and sometimes other specialists may be helpful in individual clinical scenarios (see also overarching principles F and G). A very slight rewording was performed to discuss skin involvement as ‘clinically relevant’ rather than ‘clinically significant’ for more homogeneity with other bullet points. This bullet point does not address the role of HPRs, who are usually not prescribers in EULAR countries.

D. The primary goal of treating patients with psoriatic arthritis is to maximise health-related quality of life, through control of symptoms, prevention of structural damage, normalisation of function and social participation; abrogation of inflammation is an important component to achieve these goals (unchanged).

For more details, please see the 2019 update of these recommendations. 9

E. In managing patients with psoriatic arthritis, consideration should be given to each musculoskeletal manifestation and treatment decisions made accordingly (unchanged).

For more details, please refer to the 2019 update. 9

F. When managing patients with psoriatic arthritis, non-musculoskeletal manifestations (skin, eye and gastrointestinal tract) should be taken into account; comorbidities such as obesity, metabolic syndrome, cardiovascular disease or depression should also be considered.

The wording ‘such as obesity’ was added, since obesity is frequent in PsA and can influence outcomes. 22 23 Obesity concerns excess body fat, while metabolic syndrome is a collection of risk factors that increase the likelihood of developing cardiovascular disease and type 2 diabetes. Obesity is a significant contributor to the development of metabolic syndrome. The taskforce members discussed if other comorbidities should be added, but it was felt that the term ‘such as’ entails that comorbidities overall should be considered, without a need to list them. Depression and potentially other mental health issues may influence treatment choice. Central sensitisation to pain perception is frequent in PsA and also influences outcomes; this may lead to difficulties in disease management. 24 25 Bone health and malignancies were also specifically highlighted. The management of comorbidities poses specific issues, in particular as to who is responsible for managing distinct disease domains. Solutions need to be applied according to the individual patient, each country’s specific setting and healthcare system organisation.

G. The choice of treatment should take account of safety considerations regarding individual modes of action to optimise the benefit–risk profile (new).

Given new data on the safety of different modes of action, the taskforce proposed this new OAP to emphasise the importance of taking into account safety considerations for each patient. 6 The taskforce was aware that this item is somewhat redundant with overarching principle B but wished to emphasise the importance of benefit–risk assessment when considering the use of specific agents.

Recommendations

Of note, these recommendations are centred on non-topical pharmacological treatments; topical and non-pharmacological treatments are also important in PsA but are outside our scope. Figure 1 shows a summarised algorithm of the treatment proposals.


Figure 1 2023 EULAR recommendations algorithm for the management of PsA. bDMARD, biological disease-modifying antirheumatic drug; csDMARD, conventional synthetic disease-modifying antirheumatic drug; IBD, inflammatory bowel disease; IL, interleukin; JAK, Janus kinase; JAKi, Janus kinase inhibitor; NSAID, non-steroidal anti-inflammatory drug; TNF, tumour necrosis factor; TNFI, tumour necrosis factor inhibitor.

Some safety issues will be briefly addressed, but for a full picture of the adverse event profile of different drugs the package inserts should be consulted.

Recommendation 1

Treatment should be aimed at reaching the target of remission or, alternatively, low disease activity, by regular disease activity assessment and appropriate adjustment of therapy.

This (unchanged) recommendation is in keeping with the principles of treating-to-target. 26 27 Given the lack of new data to support treat-to-target in PsA, the LoE and GoR are also unchanged. The use of instruments to assess disease activity has been addressed in the treat-to-target recommendations. 26 The definition of remission in PsA remains a subject of debate. 28–30 For the context of these recommendations, remission should be seen as an abrogation of inflammation.

The taskforce members emphasised that disease activity should be regularly assessed across individual involved manifestations (eg, joints, skin, enthesitis, dactylitis, axial disease), and that treatment adjustments will depend on the predominant manifestation of the disease at a given moment. 31

Recommendation 2

Non-steroidal anti-inflammatory drugs may be used to relieve musculoskeletal signs and symptoms; local injections of glucocorticoids may be considered as adjunctive therapy.

This recommendation deals with the short-term use of symptomatic treatment. It was developed by merging the two previous recommendations 2 and 3, which dealt separately with non-steroidal anti-inflammatory drugs (NSAIDs) and glucocorticoids, as both only serve to relieve symptoms in the short term. It was decided to no longer allude to systemic glucocorticoids in a bullet point, since the data underlying the prescription of systemic glucocorticoids in PsA are scarce. Moreover, glucocorticoids harbour many potential safety issues, in particular when taking into account the high prevalence of comorbidities and cardiovascular risk factors in PsA. 3 32 However, the taskforce members agreed that, in some selected cases, systemic glucocorticoid therapy may be helpful for some patients, especially for polyarticular forms and/or as bridging therapy.

NSAIDs offer symptomatic relief to patients with MSK involvement, but have not shown any efficacy in psoriasis. NSAIDs and local glucocorticoid injections are useful to relieve symptoms and local inflammation temporarily, and may be used combined with DMARDs as needed (please see recommendation 3). However, the safety aspects of (potentially long-term) NSAID use have to be taken into account.

The taskforce emphasised that the vast majority of patients should not be treated with NSAIDs alone (without DMARDs), in keeping with a proactive treat-to-target approach to PsA. Only patients with very mild peripheral disease, or with predominant entheseal or axial disease, may sufficiently benefit from NSAIDs as monotherapy. Even in these cases, it is proposed that the use of symptomatic treatments alone should usually be short term, for example, limited to 4 weeks or so. In peripheral arthritis, this duration is based on the opinion of the group; in predominant axial disease, it is in keeping with the Assessment of SpondyloArthritis international Society (ASAS)/EULAR recommendations for axial spondyloarthritis (axSpA) whereby persistent disease after 4 weeks of treatment is considered a failure of NSAIDs. 33 On the other hand, for patients with predominant axial disease who experience significant improvement in clinical symptoms, continuous NSAID use may be proposed if needed to control symptoms, always taking the risks and benefits into account. Of note, data regarding the efficacy of NSAIDs in enthesitis are limited.

Recommendation 3

In patients with polyarthritis or those with monoarthritis/oligoarthritis and poor prognostic factors (eg, structural damage, elevated acute phase reactants, dactylitis or nail involvement), a csDMARD should be initiated rapidly, with methotrexate preferred in those with clinically relevant skin involvement.

Among patients with peripheral arthritis, 34 35 a distinction is made according to the number of swollen joints and according to prognostic factors. 36 In 2019, polyarthritis and monoarthritis/oligoarthritis with poor prognostic markers were addressed in separate bullet points, which were merged for clarity in this update ( table 3 ). Oligoarticular disease is defined as arthritis (swollen joints) of up to four (included) joints. 9 This definition applies to clinical detection (rather than imaging). The prognostic factors have also been previously defined 9 17 and are unchanged.

We recommend a rapid csDMARD start, concomitant with (or close to) the initiation of symptomatic therapy, for both patients with polyarticular disease and patients with oligoarticular disease and poor prognostic factors. Patients with oligoarticular disease and no poor prognostic factors should also receive a csDMARD, but there is less urgency for these patients given the more favourable long-term prognosis. The latter may receive csDMARDs after a longer delay, and potentially after a period of symptomatic treatment alone (see recommendation 2). Since there is a lack of strong evidence to support this approach of rapid treatment introduction, this recommendation was mainly based on expert opinion.

Of note, there is no specific recommendation for dactylitis. We consider dactylitis as an association of (oligo)synovitis, tenosynovitis and enthesitis. Patients with isolated dactylitis should be treated similarly to patients with oligoarthritis; this includes the use of joint glucocorticoid injections and csDMARDs, which have shown efficacy in relieving dactylitis. 37

The first DMARD should be a csDMARD (meaning MTX, leflunomide or sulfasalazine). The decision concerning the first-line DMARD is important and led to much taskforce discussion, and has been put as an element for further research in the research agenda ( table 4 ). The continued prioritisation of csDMARDs reflects consensual expert opinion within the taskforce that favoured the benefit–risk–cost balance of csDMARDs, and in particular MTX, over targeted drugs. The absence of new data indicating the superiority of a b/tsDMARD as first line, together with new data on MTX, was seen as confirming the efficacy of this drug in PsA. 5 37–39

Research agenda indicating priorities for future research in PsA

Since the EULAR recommendations adhere to a treat-to-target (T2T) approach which implies a reduction of disease activity by at least 50% within 3 months and reaching the treatment target within 6 months, a csDMARD should not be continued if these therapeutic goals are not attained. On csDMARD inefficacy, another DMARD, such as a bDMARD (see recommendation 4), can be rapidly instituted. Generally speaking, we recommend assessing the efficacy of the csDMARD and deciding if it should be pursued as monotherapy or not, after 12 weeks, in line with the T2T recommendations. 26 Although MTX use in PsA has typically been founded on evidence from other immune-mediated diseases such as RA and psoriasis, 40 there is also evidence for its efficacy in PsA, with recent confirmatory data both from observational data sources and from a randomised trial indicating that a proportion of patients will respond to escalation of doses of MTX. 39 41–43 The efficacy–safety balance of MTX should be assessed regularly, given the general metabolic profile of patients with PsA which can put them at a higher risk for adverse events such as hepatotoxicity. 42–44 The MTX dose should be sufficient, that is, usually between 20 mg and 25 mg weekly (about 0.3 mg/kg), and use of folate supplementation is recommended to reduce the adverse effects of MTX. 45
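As a worked illustration of the treat-to-target timelines and the usual MTX dose quoted above, a minimal sketch in Python follows; the 50% and 3/6-month thresholds and the 0.3 mg/kg figure come from the text, while the function names and the decision structure are our own simplification, not a clinical tool:

# Hypothetical treat-to-target check: at least a 50% reduction in disease activity
# within 3 months, and the treatment target reached within 6 months; otherwise the
# csDMARD should not simply be continued unchanged.
def continue_csdmard_monotherapy(baseline_activity: float,
                                 month3_activity: float,
                                 target_reached_month6: bool) -> bool:
    improved_50_percent = month3_activity <= 0.5 * baseline_activity
    return improved_50_percent and target_reached_month6

# Weight-based MTX dose of roughly 0.3 mg/kg, which for most adults falls within the
# usual 20-25 mg weekly range; eg, 0.3 * 70 kg is about 21 mg weekly.
def approx_weekly_mtx_dose_mg(weight_kg: float) -> float:
    return round(0.3 * weight_kg, 1)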

Other csDMARDs (ie, leflunomide and sulfasalazine) are potential treatment options and have demonstrated efficacy in PsA peripheral arthritis. 15 A recent trial of the combination of MTX with leflunomide indicated a low efficacy to safety ratio; thus, this combination is not recommended. 38

Recommendation 4

In patients with peripheral arthritis and an inadequate response to at least one csDMARD, therapy with a bDMARD should be commenced.

This recommendation is relevant to patients with peripheral arthritis and therefore is meant to include both those with monoarticular/oligoarticular and those with polyarticular disease. However, where peripheral involvement is limited and without poor prognostic factors, it is not unreasonable to apply a second csDMARD course before initiating a bDMARD/tsDMARD, when this decision is agreed by the prescriber and the patient.

After failure of at least one csDMARD, the taskforce proposed as next step one of the many available bDMARDs ( table 1 ). 5

JAKis are efficacious in PsA, but the taskforce decided that, at present, the efficacy–safety balance, costs and long-term experience with many bDMARDs clearly favour recommending them over JAKis. Relevant comorbidities in many patients with PsA also favour bDMARD selection.

Regarding bDMARDs, no order of preference is given since no bDMARD has demonstrated superiority for joint involvement over other bDMARDs ( table 1 ). 46–48 Herein they are listed in numeric order of the targeted cytokine, and not in order of preference. However, in the context of the present recommendation, CTLA4 (cytotoxic T-lymphocyte–associated antigen 4) inhibition is not considered a good option due to its limited efficacy in clinical trials. 49 The GoR is high for this bullet point, reflecting robust accrued data. 50

Unlike the MSK manifestations, the non-MSK domains of PsA do allow a differential order of preference among bDMARDs (see recommendation 9). 5 Two head-to-head trials of bDMARDs in PsA, both comparing an IL-17A inhibitor with adalimumab, showed similar efficacy of IL-17A inhibition and TNF inhibition on the joints, while skin responses were better with the former. 46 47 We also note that there is evidence of better efficacy of bDMARDs compared with MTX in skin psoriasis (and evidence for differences between bDMARDs; please see recommendation 9). 51 52

All bDMARDs and JAKi showed efficacy regarding inhibition of radiographic progression; such data are lacking for apremilast.

The safety of the different available categories of bDMARDs appears acceptable in our SLR. 5 All bDMARDs increase the risk of infections. 5 The risks of TNF inhibitors (TNFis) are well known. Candidiasis (usually mucocutaneous) is more frequent with IL-17A and IL-17A/F inhibition, particularly the latter. 53 54 While IL-23-p19i is a more recent addition to the armamentarium, its safety appears satisfactory, in line with ustekinumab, which also interferes with IL-23 (via the p40 chain) and whose adverse event profile is well known and appears satisfactory. 5

As a general rule, safety and comorbidities need to be taken into account when a decision to start a new drug is taken. More complete information regarding the safety aspects of bDMARDs is provided in the individual drug’s product information. Costs should also be taken into account, but these may vary at the country level; cost savings will occur in many countries due to the availability of biosimilar TNF blockers and potentially other biosimilars in due course. Personalised medicine, to facilitate an optimal choice of the first bDMARD, is currently difficult due to the lack of individualised predictors of response to treatment. 55 As previously discussed, it is of key importance to take into account the patient phenotype and potential extra-MSK features ( figure 1 ). Comorbidities are also to be considered. 23 56 More research is needed on the predictors of drug response, including the effect of sex. 57 58

Combination of a bDMARD with a csDMARD

First-line bDMARDs are often given in combination with csDMARDs, such as MTX. 41 59 However, there are conflicting data regarding the added benefit of concomitant MTX with targeted DMARDs in patients with peripheral disease and no evidence of a benefit of MTX in patients with axial symptoms. 33 60 61

MTX combination with bDMARDs has been explored mainly for TNFi; studies have generally found similar efficacy with or without concomitant MTX, although with increased drug survival when using MTX, in some studies. 41 59 62 A recent large study reported increased remission rates with TNFi plus MTX combination therapy. 59 With other modes of action, there is a lack of data to support comedication. Overall, the taskforce proposed to combine a first bDMARD with the previously prescribed csDMARD, in all cases where such a treatment has already been tolerated by the patient and in particular when the first bDMARD is a TNFi. For other modes of action, given the lack of data, we cannot recommend comedication, although the usual practice would be to continue a csDMARD when initiating a bDMARD (doses of the csDMARD can be diminished if needed).

Recommendation 5

In patients with peripheral arthritis and an inadequate response to at least one bDMARD, or when a bDMARD is not appropriate, a JAKi may be considered, taking safety considerations into account.

This recommendation elicited much debate. On the one hand, since 2019, new data have accrued on JAKis in terms of efficacy, such as the publication of positive trials on upadacitinib in PsA. 63 On the other hand, there is currently a worldwide cautionary statement issued by both the Food and Drug Administration and the European Medicines Agency restricting the use of JAKis in all diseases including PsA, based on an increased risk of cardiovascular and malignancy events observed with tofacitinib in older patients with RA with cardiovascular risk factors. 6–8 JAKis lead to increased general infection rates of similar magnitude to bDMARDs, but higher rates of herpes zoster infections. 5 Drug safety for the JAKis tofacitinib and upadacitinib in the specific context of PsA was recently reported and appeared reassuring; however, follow-up was short and further data are warranted. 64 65 While current long-term extension data do not show increased cardiovascular/cancer risk related to JAKi use in PsA, there are no RCTs similar to the ORAL-Surveillance trial available at present in PsA. Therefore, the taskforce felt that the precautions related to RA also have to be taken for PsA, especially since various comorbidities important for the JAKi risk profile may be more prevalent in PsA than in RA (eg, obesity and cardiovascular risk factors). On the other hand, controlling inflammation is important to decrease cardiovascular risk.

Safety of JAKis should be carefully considered 66 ; we propose in table 2 and figure 1 a shortened version of the EMA warning/limitation of use, which includes age, smoking status and other cardiovascular/venous/cancer risk factors. 7 8

After much discussion, we considered that the efficacy–safety balance of JAKis did not justify putting JAKis on the same level as bDMARDs for order of choice (ie, proposing JAKis as usual treatment after insufficient response and/or intolerance to csDMARD treatment).

Therefore, JAKis are proposed usually as second-line targeted therapies (or third-line DMARDs). Of note, we recognise that, for some patients, JAKis may be a relevant option after a csDMARD; this is reflected in the wording of the bullet point (‘when a bDMARD is not appropriate’). This ‘non-appropriateness’ may include contraindications to bDMARDs, practical issues leading to a strong preference for oral administration (eg, lack of proper storage at regulated temperatures) and patient preferences, including the risk of non-adherence to injections (in accordance with the first OAP concerning shared decision-making). Nevertheless, patients will have to weigh their preferences against potential risks.

The GoR was low for this recommendation, in particular regarding safety considerations, since the data are sparse in PsA and we had to rely on data taken from RA. The taskforce suggests using JAKi after bDMARDs have failed because several new bDMARDs with excellent effects on skin involvement and relatively good safety data are now available (IL-23, IL-17 inhibitors) and more long-term data on JAKi efficacy and safety are needed in PsA. The efficacy to safety ratio of JAKis was also put into the research agenda ( table 4 ).

Currently, drugs inhibiting the tyrosine kinase 2 (TYK2) pathway are being assessed in PsA 5 ; they are not currently licensed for this use, and the data are at this point limited, in particular for safety (including in psoriasis, where such therapy is licensed). Thus, we did not include TYK2 inhibition in the current recommendations.

Recommendation 6

In patients with mild disease and an inadequate response to at least one csDMARD, in whom neither a bDMARD nor a JAKi is appropriate, a PDE4 inhibitor may be considered.

This recommendation is unchanged from 2019, with unchanged LoE. ‘Mild disease’ is defined as oligoarticular or entheseal disease without poor prognostic factors and limited skin involvement. 9 67 The FOREMOST trial recently confirmed the efficacy of apremilast compared with placebo in oligoarticular PsA. 67 Nevertheless, the reason to place apremilast differently from bDMARDs or other tsDMARDs is not only based on its consistently relatively low efficacy, but also on the lack of structural efficacy data (thus putting the term ‘DMARD’ at risk since there are no data on inhibition of damage progression).

This recommendation received the lowest LoA within the taskforce, reflecting that more than a quarter of the taskforce participants were in favour of only discussing apremilast in the text without a specific bullet point.

The use of apremilast in combination with a TNFi is off-label; it is a more costly drug combination with no supporting data and cannot be recommended.

Recommendation 7

In patients with unequivocal enthesitis and an insufficient response to NSAIDs or local glucocorticoid injections, therapy with a bDMARD should be considered.

This bullet point remains unchanged. Unequivocal enthesitis refers (as in 2019) to definite entheseal inflammation (which might need additional diagnostic imaging) to avoid overtreatment of entheseal pain not related to PsA (eg, in the context of widespread pain syndrome or repetitive mechanical stress). 68 69 In terms of treatment options, the taskforce discussed the recent data indicating indirectly some efficacy for MTX in enthesitis. 5 38 39 However, it was felt that the data for MTX were not sufficiently strong to propose MTX in the bullet point. We do acknowledge that, for some patients with enthesitis, MTX may be an option ( figure 1 ).

For unequivocal predominant enthesitis, the proposal is to introduce a bDMARD (without a preference for a specific mode of action) since all currently approved bDMARDs have demonstrated efficacy on enthesitis, with similar magnitudes of response, although head-to-head trials are missing ( figure 1 ). 5 Here, costs may be important, but other manifestations will also have to be taken into account (see recommendations 8 and 9). Of note, although tsDMARDs are not mentioned specifically in the bullet point, they are an option in some cases of enthesitis (always considering benefit to risk ratios, in particular for JAKis). 7 8

Recommendation 8

In patients with clinically relevant axial disease with an insufficient response to NSAIDs, therapy with an IL-17Ai, a TNFi, an IL-17 A/Fi or a JAKi should be considered.

The formulation for axial disease was modified from ‘predominant’ to ‘clinically relevant’. For axial disease, in agreement also with the recently updated ASAS/EULAR axSpA recommendations, 33 we continue to judge csDMARDs as not relevant. bDMARDs targeting TNF, IL-17A and IL-17A/F, as well as tsDMARDs targeting JAK, are recommended. For JAKis, safety issues should be considered. Of note, we propose a choice between the drugs, not a combination of the drugs.

For this recommendation, the order of the drugs listed is of relevance, meaning that IL-17A inhibition has been put first due to the availability of currently only one trial specifically investigating axial PsA and using secukinumab (the MAXIMISE trial), 70 with the other drugs listed thereafter. Thus, the LoE is stronger for IL-17A inhibition than for the other drugs, where the data are derived from axial SpA. 33

The other drugs are listed with TNF inhibition first, due to long-term safety data, then IL-17A/F inhibition, which has recently been licensed for axial SpA, and JAK inhibition as an option taking safety into account. JAKis are here proposed in the same recommendation as bDMARDs, also reflecting that the comorbidity profile of patients with predominant or isolated axial PsA may be more comparable to that of patients with axial SpA, so that these patients may have a more favourable safety profile with respect to cardiovascular and cancer risks than many patients with predominant peripheral arthritis. The taskforce discussed the circumstantial evidence that IL-23 inhibition may be efficacious for axial PsA; however, given negative trials for IL-12/23 inhibition in axSpA, the IL-23 pathway is not recommended here. 33 71–73 Axial PsA remains a challenging form of PsA in terms of definition and differences with axial SpA; thus, this phenotype is part of the research agenda ( table 4 ).

Recommendation 9

The choice of the mode of action should reflect non-musculoskeletal manifestations related to PsA; with clinically relevant skin involvement, preference should be given to an IL-17A or IL-17A/F or IL-23 or IL-12/23 inhibitor; with uveitis to an anti-TNF monoclonal antibody; and with IBD to an anti-TNF monoclonal antibody or an IL-23 inhibitor or IL-12/23 inhibitor or a JAKi.

This is a new recommendation to clarify more visibly than in 2019 ( table 3 ) that the choice of drug should take into account not only the MSK PsA phenotype but also extra-MSK manifestations.

The first extra-MSK manifestation of interest in PsA is skin psoriasis. Although most patients with PsA present with skin psoriasis or have a personal history of skin psoriasis, registry data indicate that many patients with PsA have mild skin involvement. 74 However, even limited skin psoriasis can be troublesome: relevant skin involvement is defined either as extensive (body surface area involvement >10%) or as important to the patient, that is, negatively impacting their quality of life (as can be the case with face or genital involvement). 9 For these patients, we recommend preferentially considering drugs targeting the IL-17A, IL-17A/F or IL-23 pathway (here, the drugs are cited in order of the numbered cytokine, not in order of preference). There are strong data, including head-to-head trials, in the field of skin psoriasis showing that drugs targeting the IL-23 and IL-17 pathways are superior to TNFis and to JAKis for skin psoriasis. 51 52 75–78 This justified proposing these modes of action preferentially in case of relevant skin involvement. This is in keeping with psoriasis recommendations. 79

Uveitis is not as frequent in PsA as it is in axial SpA; the prevalence is reported around 5%. 80 However, uveitis can be severe and should influence treatment decisions. Currently, the only mode of action with direct proof of efficacy on uveitis is TNF inhibition through monoclonal antibodies (ie, adalimumab and infliximab). Thus, for patients with uveitis, an anti-TNF monoclonal antibody is preferred.

Inflammatory bowel disease (IBD) concerns 2%–4% of patients with PsA. 80 The armamentarium for IBD has widened recently, and this recommendation reflects this fact, proposing that one of the modes of action currently licensed for IBD should be prescribed when it coexists with PsA. No order of preference is given here, and prescribers are urged to adhere to EMA authorisations for IBD and to take safety into account. For informative purposes, as of mid-2023, drugs authorised for IBD include anti-TNF monoclonal antibodies (ie, adalimumab and infliximab), the IL-12/23i ustekinumab, the IL-23i risankizumab (for Crohn’s disease) and two JAKis (one of which, tofacitinib, only for ulcerative colitis). 81–85 IL-17is (both A and A/F) are not recommended in case of active IBD, given indications of a heightened risk of flares. 86–88

Decisions for patients presenting with major skin involvement, with uveitis or with IBD should be discussed with the relevant specialist colleagues, as needed.

In all cases, the prescriber must refer to current drug authorisations and take into account safety and comorbidities.

To present an order for choosing drugs, we propose that the first element to take into account is the PsA subtype and, as a second element, the extra-MSK manifestations (always considering safety and comorbidities).
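The preferences stated in recommendation 9 lend themselves to a simple lookup; the sketch below (Python, illustrative only, with the structure and naming ours rather than the taskforce's) summarises the stated modes of action per extra-MSK manifestation and the stated caution for IL-17 inhibition in active IBD:

# Simplified summary of recommendation 9: preferred modes of action by extra-MSK
# manifestation. Within each list, drugs appear in order of the targeted cytokine
# number, not in order of preference, as in the text.
PREFERRED_MODES_OF_ACTION = {
    "clinically relevant skin involvement": [
        "IL-17A inhibitor", "IL-17A/F inhibitor", "IL-23 inhibitor", "IL-12/23 inhibitor",
    ],
    "uveitis": ["anti-TNF monoclonal antibody"],
    "IBD": [
        "anti-TNF monoclonal antibody", "IL-23 inhibitor", "IL-12/23 inhibitor", "JAK inhibitor",
    ],
}

# Per the text, IL-17 inhibitors (both A and A/F) are not recommended with active IBD.
AVOID_WITH_ACTIVE_IBD = ["IL-17A inhibitor", "IL-17A/F inhibitor"]

def preferred_options(manifestation: str) -> list[str]:
    # Returns the preferred modes of action for a given extra-MSK manifestation;
    # the PsA MSK subtype remains the first element to consider (see text).
    return PREFERRED_MODES_OF_ACTION.get(manifestation, [])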

Recommendation 10

In patients with an inadequate response or intolerance to a bDMARD or a JAKi, switching to another bDMARD or JAKi should be considered, including one switch within a class.

This recommendation is unchanged from 2019, with unchanged LoE. 9 After failing one targeted drug, it is logical to switch to another targeted drug; there are currently no strong data to prefer a switch with a change in mode of action to a switch within the same mode of action. Of note, this recommendation does not limit the total number of switches for a given patient. It also does not necessarily mean that more switches within a class could not be done, but the taskforce felt that a switch should not necessarily be done after one drug of a class has failed. Switches can be made, as appropriate, between bDMARDs, or between bDMARDs and JAKis. We include abatacept as a treatment option ( table 1 ), 49 but note that it demonstrated modest efficacy and hence this is an option to be used only after failing one or more other targeted drugs. The efficacy of bimekizumab, the dual IL-17 A/F inhibitor, appeared similar in TNF-naïve and TNF-experienced populations; this will warrant confirmation. 53 54 Finally, a combination of bDMARDs is being explored, but cannot be recommended at this time.

Recommendation 11

In patients in sustained remission, tapering of DMARDs may be considered.

This bullet point is unchanged. However, more data have accrued on tapering, leading to a higher grade of recommendation. 89–91 By tapering we mean ‘dose reduction’, not drug discontinuation, since the latter usually leads to flares. Drug tapering is a logical step when patients are doing well over time, from both a safety and a cost perspective (tapering is often performed by patients themselves). On the other hand, long-term data are missing and drug tapering is currently off-label. For all of these reasons, the taskforce kept the tentative wording of ‘may be considered’ (to ensure it is not made mandatory), and tapering should of course occur in the context of a shared decision with the patient (as is the case for the other treatment decisions).

Research agenda

The taskforce felt that many issues needed more data, and an extensive research agenda was developed ( table 4 ).

This paper presents updated recommendations for the management of PsA, a treatment algorithm and a research agenda. This update addresses all currently available drugs and modes of action, and recommends an order to their use, taking into account the phenotype of the MSK and the non-MSK manifestations.

These elements should be helpful in the management of individual patients, but also in the advocacy for better access to care and for research.

This 2023 update is a major update, since most of the recommendations were modified substantially. The EULAR standardised operating procedures propose a voting system for updates which discourages minor modifications made merely for rewording. 13 Since 2019, many new drugs have become available in PsA; the choice of which drug to prescribe to which patient rests on data related to efficacy, clinical phenotype, adverse event risk profile, tolerance, long-term data, cost and access. While laboratory biomarkers for stratified treatment approaches are lacking, the taskforce used clinical markers to develop clinical phenotypic preferences for specific drugs. In these updated recommendations, the taskforce applied expert opinion to the available data to propose a pragmatic, logical order of a step-up approach to targeted treatments of PsA. The taskforce felt that proposing an order is helpful both for clinicians and to advocate for access to drugs for patients with PsA.

The drug options considered in these recommendations are currently licensed for PsA. We are aware that other drugs are being tested, or are available in other related conditions, especially skin psoriasis; however, these drugs are considered out of the scope of the present recommendations. Brodalumab was, at the time of these recommendations, only approved for psoriasis; TYK2 inhibitors such as deucravacitinib and brepocitinib have been developed or are in development for skin psoriasis and PsA; izokibep, a novel antibody mimetic and small IL-17i, is currently undergoing testing; and an oral IL-23i is also in development. 5

The taskforce had extensive discussions on the positioning of JAKis in the recommendations. 63 92 We as a group feel that it is important to make haste slowly, and to uphold high safety standards when promoting drugs with only short-to-medium-term experience and for which long-term data are lacking—not least in PsA. In fact, this cautious attitude was also adhered to in the 2019 recommendations, and subsequent safety developments have confirmed that this attitude was appropriate. 7 8 It is of key importance to continue monitoring the drugs and, ideally, perform controlled trials, as only hard and high-level data can be reassuring.

Costs are also an important aspect of patient management, and it is generally recommended to prescribe the cheaper drug if two agents have similar efficacy and safety. Of note, even if one mode of action may have somewhat better efficacy on certain manifestations, a less expensive agent could still be preferred as long as it does not have much lower efficacy in that disease domain. Biosimilars are available for several TNFis and have led to significant reductions in expenditure and greater use in many countries, although their price is not much lower than that of the originators in many others. Tofacitinib will soon become generic, and the same is true for apremilast, which should also lower the costs of these agents and allow wider application, especially in less affluent countries. Thus, overall, the taskforce felt that the prescription of drugs should take into account the relationships between efficacy, safety and cost, in line with the OAPs and the 11 recommendations, which are summarised in the algorithm ( figure 1 ). Many points are still to be confirmed in the management of PsA, leading to an extensive research agenda. 93

In conclusion, the updated 2023 recommendations should be helpful to clinicians but also to health professionals and patients when discussing treatment options. They can also be helpful to promote access to optimal care. As new data become available and new drugs are authorised in PsA, these recommendations should be again updated.

Ethics statements

Patient consent for publication.

Not required.

Handling editor Dimitrios T Boumpas

X @LGossec, @FerreiraRJO, @lihi_eder, @dranielmar, @drpnash, @sshoopworrall

Contributors All authors have contributed to this work and approved the final version.

Funding Supported by EULAR (QoC016).

Competing interests No support to any author for the present work. Outside the submitted work: LG: research grants: AbbVie, Biogen, Lilly, Novartis, UCB; consulting fees: AbbVie, Amgen, BMS, Celltrion, Janssen, Lilly, MSD, Novartis, Pfizer, UCB; non-financial support: AbbVie, Amgen, Galapagos, Janssen, MSD, Novartis, Pfizer, UCB; membership on an entity’s Board of Directors or advisory committees: EULAR Treasurer. AK: speakers bureau, consultancy: AbbVie, Amgen, Galapagos, Janssen, Eli Lilly, MSD, Novartis, Pfizer, UCB. RJOF: research grants: Medac, Lilly; consulting fees: Sanofi. DA: research grants: Galapagos, Lilly; consulting fees: AbbVie, Gilead, Janssen, Lilly, Merck, Novartis, Sanofi. XB: research grants: AbbVie, MSD, Novartis; consultancies: AbbVie, Amgen, Celltrion, Chugai, Eli Lilly, Galapagos, Janssen, MSD, Novartis, Pfizer, Roche, Sandoz, UCB; membership on an entity’s Board of Directors or advisory committees: ASAS President, EULAR President Elect. W-HB: honoraria: AbbVie, Almirall, BMS, Janssen, Leo, Eli Lilly, Novartis, UCB; expert testimony: Novartis; participation on a Data Safety Monitoring Board or Advisory Board: AbbVie, Almirall, BMS, Janssen, Leo, Eli Lilly, Novartis, UCB. IBM: honoraria/consultation fees non-exec roles: NHS GGC Board Member, Evelo Board of Directors, Versus Arthritis Trustee Status; stock or stock options: Evelo, Cabaletta, Compugen, Causeway Therapeutics, Dextera. DGM: research grants: Janssen, AbbVie, Lilly, Novartis, UCB, BMS, Moonlake; consulting fees: Janssen, AbbVie, Lilly, Novartis, UCB, BMS, Moonlake, Celgene; honoraria: Janssen, AbbVie, Lilly, Novartis, UCB, BMS, Moonlake. KLW: research grants: BMS, Pfizer; consulting: Pfizer, AbbVie, AstraZeneca, BMS, Eli Lilly, Galapagos, GlaxoSmithKline (GSK), Gilead, Novartis, Moderna, Regeneron, Roche, Sanofi, UCB Pharma. AB: speakers fees: AbbVie, Amgen, AlphaSigma, AstraZeneca, Angelini, Biogen, BMS, Berlin-Chemie, Boehringer Ingelheim, Janssen, Lilly, MSD, Novartis, Pfizer, Roche, Sandoz, Teva, UCB, Zentiva; consultancies: Akros, AbbVie, Amgen, AlphaSigma, Biogen, Boehringer Ingelheim, Lilly, Mylan, MSD, Novartis, Pfizer, Roche, Sandoz, Sobi, UCB. PVB: consulting fees: AbbVie, Janssen-Cilag, Pfizer; honoraria: AbbVie, Bausch Health, Celltrion Healthcare, Eli Lilly, Gedeon Richter, IBSA Pharma, Infomed, Janssen-Cilag, Novartis, Pfizer, Sandoz; payment for expert testimony: Gedeon Richter; other: President, Hungarian Association of Rheumatologists. GRB: honoraria and/or speaker fees: AbbVie, BMS, Janssen, Lilly, Novartis, Pfizer. JDC: honoraria: UCB. PC: research grants: AbbVie, Amgen, Biogen, Jansen, Lilly, Novartis, UCB; consulting fees: AbbVie, Amgen, Celltrion, Janssen, Lilly, MSD, Novartis, Pfizer, UCB. LE: consultation fee/advisory board: AbbVie, Novartis, Janssen, UCB, BMS, Eli Lilly; research/educational grants: AbbVie, Fresenius Kabi, Janssen, Amgen, UCB, Novartis, Eli Lilly, Sandoz, Pfizer. MLH: grant support: AbbVie, Biogen, BMS, Celltrion, Eli Lilly, Janssen Biologics BV, Lundbeck Foundation, MSD, Pfizer, Roche, Samsung Bioepis, Sandoz, Novartis, Nordforsk; honoraria: Pfizer, Medac, Sandoz; advisory board: AbbVie; past-chair of the steering committee of the Danish Rheumatology Quality Registry (DANBIO, DRQ), which receives public funding from the hospital owners and funding from pharmaceutical companies; cochair of EuroSpA, partly funded by Novartis. 
AI: research grants from AbbVie, Pfizer, Novartis; honoraria for lectures, presentations, speakers bureaus from AbbVie, Alfasigma, BMS, Celgene, Celltrion, Eli Lilly, Galapagos, Gilead, Janssen, MSD, Novartis, Pfizer, Sanofi Genzyme, Sobi; EULAR Board Member; EULAR Congress Committee, Education Committee and Advocacy Committee Advisor; EULAR Past President. LEK: consultancies: AbbVie, Amgen, Biogen, BMS, Celgene, Eli Lilly, Pfizer, UCB, Sanofi, GSK, Galapagos, Forward Pharma, MSD, Novartis, Janssen; has been representing rheumatology FOREUM scientific chair. RQ: consultancy and/or speaker’s honoraria from and/or participated in clinical trials and/or research projects sponsored by AbbVie, Amgen-Celgene, Eli Lilly, Novartis, Janssen, Pfizer, MSD, UCB. DM: honoraria: UCB, Janssen, GSK, AstraZeneca, AbbVie; support to meetings: Janssen. HM-O: grant support: Janssen, Novartis, UCB; honoraria and/or speaker fees: AbbVie, Biogen, Eli Lilly, Janssen, Moonlake, Novartis, Pfizer, Takeda, UCB. PJM: grant support: AbbVie, Acelyrin, Amgen, Bristol Myers Squibb, Eli Lilly, Genascence, Janssen, Novartis, Pfizer, UCB; consulting fees: AbbVie, Acelyrin, Aclaris, Alumis, Amgen, Boehringer Ingelheim, Bristol Myers Squibb, Eli Lilly, Genascence, Inmagene, Janssen, Moonlake, Novartis, Pfizer, Takeda, UCB, Ventyx, Xinthera; honoraria: AbbVie, Amgen, Eli Lilly, Janssen, Novartis, Pfizer, UCB. PN: consulting fees and honoraria: AbbVie, Amgen, BMS, Lilly, Janssen, GSK, Novartis, UCB, Servatus; boards: Amgen, BMS, Janssen, GSK, Novartis, UCB; GRAPPA Steering Committee, Chair ASMPOC, ARA. LS: consulting fees: AbbVie, Almirall, Novartis, Janssen, Lilly, UCB, Pfizer, Bristol Myers Squibb, Boehringer Ingelheim; honoraria: AbbVie, Almirall, Novartis, Janssen, UCB, Pfizer, Takeda, Galderma, Biogen, Celgene, Celltrion, Lilly, Sanofi, Bristol Myers Squibb, Boehringer Ingelheim; support to attending meetings: AbbVie, Janssen, Lilly, Novartis, UCB, Galderma, Bristol Myers Squibb, Boehringer Ingelheim; participation in boards: AbbVie, Almirall, Novartis, Janssen, UCB, Pfizer, Galderma, Biogen, Lilly, Sanofi, Bristol Myers Squibb, Boehringer Ingelheim; GRAPPA Executive Board (elected), British Society for Medical Dermatology (BSMD) Committee. GS: honoraria: Novartis, Janssen. SJWS-W: grant support: Medical Research Council (MR/W027151/1). YT: research grants from Mitsubishi Tanabe, Eisai, Chugai, Taisho; speaking fees and/or honoraria from Eli Lilly, AstraZeneca, AbbVie, Gilead, Chugai, Boehringer Ingelheim, GlaxoSmithKline, Eisai, Taisho, Bristol Myers, Pfizer, Taiho. FEVdB: consultancy honoraria from AbbVie, Amgen, Eli Lilly, Galapagos, Janssen, Novartis, Pfizer, UCB. AZ: speakers bureau: AbbVie, Novartis, Janssen, Lilly, UCB, Amgen; paid instructor for AbbVie, Novartis, UCB. DvdH: consulting fees AbbVie, Argenx, Bayer, BMS, Galapagos, Gilead, GlaxoSmithKline, Janssen, Lilly, Novartis, Pfizer, Takeda, UCB Pharma; Director of Imaging Rheumatology bv; Associate Editor for Annals of the Rheumatic Diseases ; Editorial Board Member for Journal of Rheumatology and RMD Open ; Advisor Assessment Axial Spondyloarthritis International Society. JSS: research grants from AbbVie, AstraZeneca, Lilly, Galapagos; royalties from Elsevier (textbook); consulting fees from AbbVie, Galapagos/Gilead, Novartis-Sandoz, BMS, Samsung, Sanofi, Chugai, R-Pharma, Lilly; honoraria from Samsung, Lilly, R-Pharma, Chugai, MSD, Janssen, Novartis-Sandoz; participation in advisory board from AstraZeneca.

Provenance and peer review Not commissioned; externally peer reviewed.




Medications for Opioid Use Disorder (MOUD) Study

  • In 2022, an estimated 6.1 million people ages 12 or older reported having an opioid use disorder (OUD).
  • Common treatment options for OUD include medications for opioid use disorder (MOUD) (including methadone, buprenorphine, naltrexone) and counseling without medication.
  • CDC's National Center for Injury Prevention and Control conducted a study of Medications for Opioid Use Disorder (MOUD) to better understand treatment engagement and factors that may influence treatment experiences and outcomes.


MOUD study objectives

The objectives of this 18-month observational cohort study, conducted in outpatient settings without random assignment to OUD treatment, were to:

  • Identify factors that influence the type of OUD treatment offered to patients and the patient's treatment choice.
  • Better understand patient and outpatient treatment facility factors associated with key OUD treatment outcomes.
  • Identify relevant antecedents or co-occurring conditions that could be addressed through primary prevention.
  • Inform evidence-based practices and OUD treatment policies.

Types of data

The collected data are available for the public to access. The publicly available data from the MOUD Study include:

  • Type of OUD treatment
  • Substance use
  • Drug overdose
  • Health-related quality of life
  • Socioeconomic status
  • Side effects (e.g., patients reported experiencing an adverse reaction to the current or most recent MOUD treatment)
  • Resource utilization (e.g., patients reported receiving peer-to-peer recovery support services)
  • COVID-related information (e.g., patients reported completing the survey during the COVID-19 pandemic)
  • Distance to treatment facility
  • Treatment facility ID
  • Region of treatment facility
  • Mortality outcomes

Patients from around the United States were included

All sites were outpatient treatment facilities. 1 Information about individual treatment sites will not be released to protect patient and staff privacy.

Cities included in the study:

  • Birmingham, Alabama
  • Boston, Massachusetts
  • Chicago, Illinois
  • Cincinnati, Ohio
  • Dallas, Texas
  • Denver, Colorado
  • Huntington, West Virginia
  • Los Angeles, California
  • New York, New York
  • Phoenix, Arizona
  • Raleigh-Durham, North Carolina
  • Salt Lake City, Utah
  • San Francisco, California
  • Seattle, Washington
  • Washington, DC Metro Area

Data collection methods


The MOUD Study was a longitudinal, observational cohort study. The study sample included 1,974 adults with opioid use disorder (OUD) receiving various types of OUD treatment at 62 outpatient facilities over 18 months.

  • Participants were followed from March 2018 through May 2021.
  • The types of OUD treatment included MOUD (i.e., methadone, buprenorphine, naltrexone), or counseling without MOUD.
  • Patients in this study were asked to complete five web-based, self-administered questionnaires—a baseline questionnaire and follow-ups at 3, 6, 12, and 18 months post-baseline.

The response rate was 100% at baseline, 72% at 3 months, 68% at 6 months, 52% at 12 months, and 53% at 18 months. 1

Researchers might consider imputation methods to deal with missing data. Researchers can use treatment facility ID to address the correlation between observations within the same treatment facility.
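As a minimal sketch of how those suggestions might look in practice (assuming hypothetical column names such as outcome, moud_type and facility_id, and a hypothetical file name; this is illustrative only, not an official CDC analysis):

# Illustrative analysis sketch: simple imputation of a missing outcome plus
# cluster-robust standard errors by treatment facility, as suggested above.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("moud_18month.csv")
df = df.dropna(subset=["moud_type", "facility_id"])

# Naive single imputation shown for brevity; multiple imputation would usually be preferable.
df["outcome"] = df["outcome"].fillna(df["outcome"].median())

# Ordinary least squares with standard errors clustered on treatment facility ID,
# to account for correlation between observations from the same facility.
model = smf.ols("outcome ~ C(moud_type)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["facility_id"]}
)
print(model.summary())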

How the data is interpreted

Please note, CDC staff are unable to provide analytic or technical assistance to researchers who use this publicly available data.


Published manuscripts using the MOUD Study Data:

  • Dever, Jill A., et al. "The Medications for Opioid Use Disorder Study: Methods and Initial Outcomes From an 18-Month Study of Patients in Treatment for Opioid Use Disorder." Public Health Reports (2023): 00333549231222479. https://journals.sagepub.com/doi/epub/10.1177/00333549231222479
  • Villamil, Vanessa I., et al. "Barriers to retention in medications for opioid use disorder treatment in real-world practice." Journal of Substance Use and Addiction Treatment (2024): 209310. https://www.sciencedirect.com/science/article/pii/S2949875924000225
  • Nataraj, Nisha., et al. "Public Health Interventions and Overdose-Related Outcomes Among Persons With Opioid Use Disorder." JAMA Netw Open. 2024;7(4):e244617. DOI: 10.1001/jamanetworkopen.2024.4617

Data reporting

There are no costs for accessing the publicly available MOUD Study data.

Download Data Files, Patient Questionnaires, and Codebooks:

The MOUD Study Codebook

  • Contains a list of variables, their labels and distributions from baseline to 18 months post-baseline.

Patient questionnaires

  • Contains questions the enrolled patients were asked from baseline to 18 months post-baseline.

Publicly available MOUD Study Data:

  • Patient Baseline Data
  • Patient 3-Month Data
  • Patient 6-Month Data
  • Patient 12-Month Data
  • Patient 18-Month Data

Data quality

Data limitations can be found in the MOUD Study publications listed above.

More information on:

  • Medications for Opioid Use Disorder (MOUD)
  • Find Help & Treatment
  • Preventing Overdose
  • Preventing Opioid Use Disorder
  • Treating Opioid Use Disorder
  • Linking People to Opioid Use Disorder Treatment
  • Addiction Medicine Toolkit
  • Substance Abuse and Mental Health Services Administration. (2023). Key substance use and mental health indicators in the United States: Results from the 2022 National Survey on Drug Use and Health (HHS Publication No. PEP23-07-01-006, NSDUH Series H-58). Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration. https://www.samhsa.gov/data/report/2022-nsduh-annual-national-report



  • Open access
  • Published: 14 May 2024

Developing a survey to measure nursing students’ knowledge, attitudes and beliefs, influences, and willingness to be involved in Medical Assistance in Dying (MAiD): a mixed method modified e-Delphi study

  • Jocelyn Schroeder 1 ,
  • Barbara Pesut 1 , 2 ,
  • Lise Olsen 2 ,
  • Nelly D. Oelke 2 &
  • Helen Sharp 2  

BMC Nursing volume 23, Article number: 326 (2024)


Medical Assistance in Dying (MAiD) was legalized in Canada in 2016. Canada’s legislation is the first to permit Nurse Practitioners (NPs) to serve as independent MAiD assessors and providers. Registered Nurses (RNs) also have important roles in MAiD that include MAiD care coordination; client and family teaching and support; MAiD procedural quality; healthcare provider and public education; and bereavement care for family. Nurses have a right under the law to conscientious objection to participating in MAiD. Therefore, it is essential to prepare nurses in their entry-level education for the practice implications and moral complexities inherent in this practice. Knowing what nursing students think about MAiD is a critical first step. Therefore, the purpose of this study was to develop a survey to measure nursing students’ knowledge, attitudes and beliefs, influences, and willingness to be involved in MAiD in the Canadian context.

The design was a mixed-method, modified e-Delphi method that entailed item generation from the literature, item refinement through a 2 round survey of an expert faculty panel, and item validation through a cognitive focus group interview with nursing students. The settings were a University located in an urban area and a College located in a rural area in Western Canada.

During phase 1, a 56-item survey was developed from existing literature that included demographic items and items designed to measure experience with death and dying (including MAiD), education and preparation, attitudes and beliefs, influences on those beliefs, and anticipated future involvement. During phase 2, an expert faculty panel reviewed, modified, and prioritized the items yielding 51 items. During phase 3, a sample of nursing students further evaluated and modified the language in the survey to aid readability and comprehension. The final survey consists of 45 items including 4 case studies.

Systematic evaluation of knowledge-to-date coupled with stakeholder perspectives supports robust survey design. This study yielded a survey to assess nursing students’ attitudes toward MAiD in a Canadian context.

The survey is appropriate for use in education and research to measure knowledge and attitudes about MAiD among nurse trainees and can be a helpful step in preparing nursing students for entry-level practice.


Medical Assistance in Dying (MAiD) is permitted under an amendment to Canada’s Criminal Code which was passed in 2016 [ 1 ]. MAiD is defined in the legislation as both self-administered and clinician-administered medication for the purpose of causing death. In the 2016 Bill C-14 legislation, one of the eligibility criteria was that an applicant for MAiD must have a reasonably foreseeable natural death, although this term was not defined. It was left to the clinical judgement of MAiD assessors and providers to determine the time frame that constitutes reasonably foreseeable [ 2 ]. However, in 2021 under Bill C-7, the eligibility criteria for MAiD were changed to allow individuals with irreversible medical conditions, declining health, and suffering, but whose natural death was not reasonably foreseeable, to receive MAiD [ 3 ]. This population of MAiD applicants is referred to as Track 2 MAiD (those whose natural death is reasonably foreseeable are referred to as Track 1). Track 2 applicants are subject to additional safeguards under the 2021 C-7 legislation.

Three additional proposed changes to the legislation have been extensively studied by Canadian Expert Panels (Council of Canadian Academies [CCA]) [ 4 , 5 , 6 ]. First, under the legislation that defines Track 2, individuals with mental disease as their sole underlying medical condition may apply for MAiD, but implementation of this practice is embargoed until March 2027 [ 4 ]. Second, there is consideration of allowing MAiD to be implemented through advanced consent. This would make it possible for persons living with dementia to receive MAiD after they have lost the capacity to consent to the procedure [ 5 ]. Third, there is consideration of extending MAiD to mature minors. A mature minor is defined as “a person under the age of majority…and who has the capacity to understand and appreciate the nature and consequences of a decision” ([ 6 ] p. 5). In summary, since the legalization of MAiD in 2016 the eligibility criteria and safeguards have evolved significantly, with consequent implications for nurses and nursing care. Further, the number of Canadians who access MAiD has increased steadily since 2016 [ 7 ], and it is expected that these increases will continue in the foreseeable future.

Nurses have been integral to MAiD care in the Canadian context. While other countries such as Belgium and the Netherlands also permit euthanasia, Canada is the first country to allow Nurse Practitioners (Registered Nurses with additional preparation typically achieved at the graduate level) to act independently as assessors and providers of MAiD [ 1 ]. Although the role of Registered Nurses (RNs) in MAiD is not defined in federal legislation, it has been addressed at the provincial/territorial-level with variability in scope of practice by region [ 8 , 9 ]. For example, there are differences with respect to the obligation of the nurse to provide information to patients about MAiD, and to the degree that nurses are expected to ensure that patient eligibility criteria and safeguards are met prior to their participation [ 10 ]. Studies conducted in the Canadian context indicate that RNs perform essential roles in MAiD care coordination; client and family teaching and support; MAiD procedural quality; healthcare provider and public education; and bereavement care for family [ 9 , 11 ]. Nurse practitioners and RNs are integral to a robust MAiD care system in Canada and hence need to be well-prepared for their role [ 12 ].

Previous studies have found that end of life care, and MAiD specifically, raise complex moral and ethical issues for nurses [ 13 , 14 , 15 , 16 ]. The knowledge, attitudes, and beliefs of nurses are important across practice settings because nurses have consistent, ongoing, and direct contact with patients who experience chronic or life-limiting health conditions. Canadian studies exploring nurses’ moral and ethical decision-making in relation to MAiD reveal that although some nurses are clear in their support for, or opposition to, MAiD, others are unclear on what they believe to be good and right [ 14 ]. Empirical findings suggest that nurses go through a period of moral sense-making that is often informed by their family, peers, and initial experiences with MAiD [ 17 , 18 ]. Canadian legislation and policy specify that nurses are not required to participate in MAiD and may recuse themselves as conscientious objectors, with appropriate steps to ensure ongoing and safe care of patients [ 1 , 19 ]. However, with so many nurses having to reflect on and make sense of their moral position, it is essential that they are given adequate time and preparation to make an informed and thoughtful decision before they participate in a MAiD death [ 20 , 21 ].

It is well established that nursing students receive inconsistent exposure to end of life care issues [ 22 ] and little or no training related to MAiD [ 23 ]. Without such education and reflection time in pre-entry nursing preparation, nurses are at significant risk for moral harm. An important first step in providing this preparation is to be able to assess the knowledge, values, and beliefs of nursing students regarding MAiD and end of life care. As demand for MAiD increases, along with the complexities of MAiD, it is critical to understand the knowledge, attitudes, and likelihood of engagement with MAiD among nursing students as a baseline upon which to build curriculum and as a means to track these variables over time.

Aim, design, and setting

The aim of this study was to develop a survey to measure nursing students’ knowledge, attitudes and beliefs, influences, and willingness to be involved in MAiD in the Canadian context. We sought to explore both their willingness to be involved in the registered nursing role and in the nurse practitioner role, should they choose to prepare themselves to that level of education. The design was a mixed-method, modified e-Delphi method that entailed item generation, item refinement through an expert faculty panel [ 24 , 25 , 26 ], and initial item validation through a cognitive focus group interview with nursing students [ 27 ]. The settings were a University located in an urban area and a College located in a rural area in Western Canada.

Participants

A panel of 10 faculty from the two nursing education programs was recruited for Phase 2 of the e-Delphi. To be included, faculty were required to have a minimum of three years of experience in nurse education, be employed as nursing faculty, and self-identify as having experience with MAiD. A convenience sample of 5 fourth-year nursing students was recruited to participate in Phase 3. Students had to be in good standing in the nursing program and be willing to share their experiences of the survey in an online group interview format.

The modified e-Delphi was conducted in 3 phases: Phase 1 entailed item generation through literature and existing survey review. Phase 2 entailed item refinement through a faculty expert panel review with focus on content validity, prioritization, and revision of item wording [ 25 ]. Phase 3 entailed an assessment of face validity through focus group-based cognitive interview with nursing students.

Phase 1. Item generation through literature review

The goal of phase 1 was to develop a bank of survey items that would represent the variables of interest and which could be provided to expert faculty in Phase 2. Initial survey items were generated through a literature review of similar surveys designed to assess knowledge and attitudes toward MAiD/euthanasia in healthcare providers; Canadian empirical studies on nurses’ roles and/or experiences with MAiD; and legislative and expert panel documents that outlined proposed changes to the legislative eligibility criteria and safeguards. The literature review was conducted in three online databases: CINAHL, PsycINFO, and Medline. Key words for the search included nurses, nursing students, medical students, NPs, MAiD, euthanasia, assisted death, and end-of-life care. Only articles written in English were reviewed. Because the legalization and legislation of MAiD is new in many countries, studies that were more than twenty years old were excluded; no further exclusion criteria were set for country.

Items from surveys designed to measure similar variables in other health care providers and geographic contexts were placed in a table, and similar items were collated and revised into a single item. Then key variables were identified from the empirical literature on nurses and MAiD in Canada and checked against the items derived from the surveys to ensure that each of the key variables was represented. For example, conscientious objection has figured prominently in the Canadian literature, but there were few items that assessed knowledge of conscientious objection in other surveys, and so items were added [ 15 , 21 , 28 , 29 ]. Finally, four case studies were added to the survey to address the anticipated changes to the Canadian legislation. The case studies were based upon the inclusion of mature minors, advanced consent, and mental disorder as the sole underlying medical condition. The intention was to assess nurses’ beliefs and comfort with these potential legislative changes.

Phase 2. Item refinement through expert panel review

The goal of Phase 2 was to refine and prioritize the proposed survey items identified in Phase 1 using a modified e-Delphi approach to achieve consensus among an expert panel [26]. Items from Phase 1 were presented to an expert faculty panel using a Qualtrics (Provo, UT) online survey. Panel members were asked to review each item to determine whether it should be included, excluded, or adapted for the survey. When adapted was selected, faculty experts were asked to provide a rationale and suggestions for adaptation through an open text box. Items that reached 75% consensus for either inclusion or adaptation were retained [25, 26]. New items were categorized and added, and a revised survey was presented to the panel of experts in round 2. Panel members were again asked to review each item, including new items, to determine whether it should be included, excluded, or adapted for the survey. Round 2 of the modified e-Delphi also included an item prioritization activity in which participants rated the importance of each item on a 5-point Likert scale (low to high importance), which De Vaus [30] states is helpful for increasing the reliability of responses. Items that reached 75% consensus on inclusion were then considered in relation to the importance assigned by the expert panel. Quantitative data were managed using SPSS (IBM Corp).
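To make the 75% consensus rule concrete, a minimal sketch of one way it could be computed follows. This is illustrative only and not the study's code (the authors managed these data in Qualtrics and SPSS); the function name, threshold constant, and example votes are assumptions for demonstration.

```python
# Minimal sketch of the 75% consensus rule described above (illustrative only;
# the study managed these data in Qualtrics and SPSS). Names are invented.
from collections import Counter

CONSENSUS_THRESHOLD = 0.75  # proportion of panelists required for consensus


def consensus_decision(votes):
    """Return (decision, proportion) for one item's include/adapt/exclude votes."""
    counts = Counter(votes)
    n = len(votes)
    # Items reaching 75% agreement on either "include" or "adapt" are retained.
    for choice in ("include", "adapt"):
        if counts[choice] / n >= CONSENSUS_THRESHOLD:
            return choice, counts[choice] / n
    return "no consensus", max(counts.values()) / n


# Hypothetical votes from a 10-member panel for a single item
example_votes = ["include"] * 8 + ["adapt", "exclude"]
print(consensus_decision(example_votes))  # -> ('include', 0.8)
```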

Phase 3. Face validity through cognitive interviews with nursing students

The goal of Phase 3 was to obtain initial face validity of the proposed survey using a sample of nursing student informants. More specifically, student participants were asked to discuss how items were interpreted, to identify confusing wording or other problematic construction of items, and to provide feedback about the survey as a whole, including readability and organization [31, 32, 33]. The focus group was held online and audio recorded. Data were obtained using a semi-structured interview guide developed for this study, which focused on the clarity, meaning, order, and wording of questions; the emotions evoked by the questions; and overall survey cohesion and length (see Supplementary Material 2 for the interview guide). A prompt to “think aloud” was used to limit interviewer-imposed bias and to encourage participants to describe their thoughts and responses to a given item as they reviewed the survey items [27]. Where needed, verbal probes such as “could you expand on that” were used to encourage participants to elaborate on their responses [27]. Student participants’ feedback was collated verbatim and presented to the research team, where potential survey modifications were negotiated and finalized among team members. Conventional content analysis [34] of the focus group data was conducted to identify key themes that emerged through discussion with students. Themes were derived by grouping common responses, which were then used to modify survey items.
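For readers who want to see the grouping step in miniature, a small sketch follows. It is purely illustrative; the study's content analysis was conducted manually on verbatim comments, and the codes and comments below are invented.

```python
# Illustrative sketch only: grouping coded focus-group comments by theme,
# in the spirit of the conventional content analysis described above.
# The codes and comments are invented; the study coded verbatim data manually.
from collections import defaultdict

coded_comments = [
    ("unclear or ambiguous wording", "'Sufficient' felt subjective to me"),
    ("difficult to answer", "I was not sure what a mature minor is"),
    ("response options", "'Unsure' and 'unable to say' read the same to me"),
    ("unclear or ambiguous wording", "What does 'safeguards' mean here?"),
]

themes = defaultdict(list)
for code, comment in coded_comments:
    themes[code].append(comment)

# Summarise how many comments support each emerging theme
for theme, comments in themes.items():
    print(f"{theme}: {len(comments)} comment(s)")
```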

Ten nursing faculty participated in the expert panel. Eight of the 10 faculty self-identified as female. No faculty panel members reported conscientious objector status, and 90% reported general agreement with MAiD, with one respondent indicating their view as “unsure.” Six of the 10 faculty experts had 16 or more years of experience working as a nurse educator.

Five nursing students participated in the cognitive interview focus group, which lasted 2.5 h. All participants reported being born in Canada, all but one self-identified as female (one preferred not to say), and all reported having received some instruction about MAiD as part of their nursing curriculum. See Tables 1 and 2 for the demographic descriptors of the study sample. Study results are reported in accordance with the study phases. See Fig. 1 for an overview of the results from each phase.

Fig. 1 Overview of survey development findings

Phase 1: survey item generation

Review of the literature identified that no existing survey was available for use with nursing students in the Canadian context. However, an analysis of themes across qualitative and quantitative studies of physicians, medical students, nurses, and nursing students provided sufficient data to develop a preliminary set of items suitable for adaptation to a population of nursing students.

Four major themes and factors that influence knowledge, attitudes, and beliefs about MAiD were evident from the literature: (i) endogenous or individual factors such as age, gender, personally held values, religion, religiosity, and/or spirituality [ 35 , 36 , 37 , 38 , 39 , 40 , 41 , 42 ], (ii) experience with death and dying in personal and/or professional life [ 35 , 40 , 41 , 43 , 44 , 45 ], (iii) training including curricular instruction about clinical role, scope of practice, or the law [ 23 , 36 , 39 ], and (iv) exogenous or social factors such as the influence of key leaders, colleagues, friends and/or family, professional and licensure organizations, support within professional settings, and/or engagement in MAiD in an interdisciplinary team context [ 9 , 35 , 46 ].

Studies of nursing students also suggest overlap across these categories. For example, value for patient autonomy [23] and the moral complexity of decision-making [37] are important factors that contribute to attitudes about MAiD and may stem from a blend of personally held values coupled with curricular content, professional training and norms, and clinical exposure. Similarly, students report that participation in end of life care allows for personal growth, shifts in perception, and opportunities to build therapeutic relationships with their clients [44, 47, 48].

Preliminary item generation from the literature resulted in 56 questions drawn from 11 published sources (see Table 3). These items were constructed across four main categories: (i) socio-demographic questions; (ii) end of life care questions; (iii) knowledge about MAiD; and (iv) comfort and willingness to participate in MAiD. Knowledge questions were refined to reflect current MAiD legislation, policies, and regulatory frameworks. The Falconer [39] and Freeman [45] studies were foundational sources for item selection. Additionally, four case studies were written to reflect the most recent anticipated changes to MAiD legislation; all used the same open-ended core questions to address respondents’ perspectives about the patient’s right to make the decision, comfort in assisting a physician or NP to administer MAiD in that scenario, and hypothesized comfort with serving as a primary provider if qualified as an NP in the future. Response options for the survey were also constructed during this stage and included open text, categorical, yes/no, and Likert scales.

Phase 2: faculty expert panel review

Of the 56 items presented to the faculty panel, 54 questions reached 75% consensus. However, based upon the qualitative responses, 9 items were removed, largely because they were felt to be repetitive. The items that generated the most controversy related to measuring religion and spirituality in the Canadian context, defining end of life care when there is no agreed-upon time frame (e.g., last days, months, or years), and predicting willingness to be involved in future events, thus asking respondents to predict their future selves. Phase 2, round 1 resulted in an initial set of 47 items, which were then presented back to the faculty panel in round 2.

Of the 47 questions presented to the panel in round 2, 45 reached a level of consensus of 75% or greater, and 34 of these reached 100% consensus, meaning that all participants chose to include them without any adaptations. For each question, level of importance was rated on a 5-point Likert scale (1 = very unimportant, 2 = somewhat unimportant, 3 = neutral, 4 = somewhat important, and 5 = very important). Figure 2 provides an overview of the level of importance assigned to each item.
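A brief sketch of how importance ratings of this kind might be summarised per item is shown below. It is not the study's analysis (which used SPSS); the item labels and scores are invented for demonstration.

```python
# Illustrative sketch only (not the study's analysis, which used SPSS):
# summarising 5-point importance ratings per item. Item labels and scores
# below are invented for demonstration.
import statistics

importance_ratings = {
    "Knowledge of eligibility criteria": [5, 5, 4, 5, 4, 5, 5, 4, 5, 5],
    "Religion/spirituality": [3, 2, 4, 3, 3, 2, 3, 4, 2, 3],
}

for item, scores in importance_ratings.items():
    mean = statistics.mean(scores)
    share_important = sum(s >= 4 for s in scores) / len(scores)
    print(f"{item}: mean importance = {mean:.1f}; "
          f"rated important by {share_important:.0%} of panel")
```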

Fig. 2 Ranking level of importance for survey items

After round 2, a careful analysis of participant comments and level of importance was completed by the research team. While the main driver of survey item development was participants’ responses to the Delphi consensus ratings in round 1, level of importance was used to assist in deciding whether to keep or modify questions that created controversy or that rated lower in the include/exclude/adapt portion of the Delphi. Survey items that rated low in level of importance included questions about future roles, sex and gender, and religion/spirituality. After deliberation by the research team, these questions were retained in the survey based upon the importance of these variables in the scientific literature.

Of the 47 questions remaining after Phase 2, round 2, four were revised. In addition, the two questions that did not meet the 75% cut-off for consensus were reviewed by the research team. The first question reviewed was, “What is your comfort level with providing a MAiD death in the future if you were a qualified NP?” Based on a review of participant comments, it was decided to retain this question for the cognitive interviews with students in the final phase of testing. The second question asked about impacts on respondents’ views of MAiD and was changed from one item with 4 subcategories into 4 separate items, resulting in a total of 51 items for Phase 3. The revised survey was then brought forward to the cognitive interviews with student participants in Phase 3 (see Supplementary Material 1 for a complete description of item modifications during round 2).

Phase 3. Outcomes of cognitive interview focus group

Of the 51 items reviewed by student participants, 29 were identified as clear with little or no discussion. Participant comments for the remaining 22 questions were noted and verified against the audio recording. Following content analysis of the comments, four key themes emerged through the student discussion: unclear or ambiguous wording; difficult to answer questions; need for additional response options; and emotional response evoked by questions. An example of unclear or ambiguous wording was a request for clarity in the use of the word “sufficient” in the context of assessing an item that read “My nursing education has provided sufficient content about the nursing role in MAiD.” “Sufficient” was viewed as subjective and “laden with…complexity that distracted me from the question.” The group recommended rewording the item to read “My nursing education has provided enough content for me to care for a patient considering or requesting MAiD.”

An example of difficulty answering questions related to limited knowledge of terms used in the legislation, such as safeguards, mature minor, eligibility criteria, and conscientious objection. Students were unclear about what these words meant relative to the legislation and indicated that this lack of clarity would hamper appropriate responses to the survey. To ensure that respondents are able to answer relevant questions, student participants recommended that the final survey include an explanation of key terms, such as mature minor and conscientious objection, and an overview of the current legislation.

Response options were also a point of discussion. Participants noted a lack of distinction between the response options unsure and unable to say. Additionally, scaling of attitudes was noted as important, since perspectives about MAiD are dynamic rather than dichotomous “agree or disagree” responses. Although the faculty expert panel recommended that the demographic variables of religion and/or spirituality remain a single item, the student group stated a preference for religion and spirituality to appear as separate items. The student focus group also took issue with separate items for the variables of sex and gender, specifically that non-binary respondents might feel othered or “outed,” particularly when asked to identify their sex. These variables had been created based upon best practices in health research, but students did not feel they were appropriate in this context [49]. Finally, students agreed with the faculty expert panel about the complexity of projecting their future involvement as a nurse practitioner. One participant stated: “I certainly had to like, whoa, whoa, whoa. Now let me finish this degree first, please.” Another stated, “I'm still imagining myself, my future career as an RN.”

Finally, student participants acknowledged the array of emotions that some of the items produced for them. For example, one student described positive feelings when interacting with the survey: “Brought me a little bit of feeling of joy. Like it reminded me that this is the last piece of independence that people grab on to.” Another participant described the freedom that the idea of an advance request gave her: “The advance request gives the most comfort for me, just with early onset Alzheimer’s and knowing what it can do.” Other participants described less positive feelings. For example, the mature minor case study yielded the comment: “This whole scenario just made my heart hurt with the idea of a child requesting that.”

Based on the data gathered from the cognitive interview focus group of nursing students, revisions were made to 11 closed-ended questions (see Table 4) and 3 items were excluded. In the four case studies, the open-ended question related to respondents’ hypothesized actions in a future role as an NP was removed. The final survey consists of 45 items, including 4 case studies (see Supplementary Material 3).

The aim of this study was to develop and validate a survey that can be used to track the growth of knowledge about MAiD among nursing students over time, inform training programs about curricular needs, and evaluate attitudes and willingness to participate in MAiD at time-points during training or across nursing programs over time.

The faculty expert panel and student participants in the cognitive interview focus group identified a need to establish core knowledge of the terminology and legislative rules related to MAiD. For example, within the cognitive interview group of student participants, several acknowledged lack of clear understanding of specific terms such as “conscientious objector” and “safeguards.” Participants acknowledged discomfort with the uncertainty of not knowing and their inclination to look up these terms to assist with answering the questions. This survey can be administered to nursing or pre-nursing students at any phase of their training within a program or across training programs. However, in doing so it is important to acknowledge that their baseline knowledge of MAiD will vary. A response option of “not sure” is important and provides a means for respondents to convey uncertainty. If this survey is used to inform curricular needs, respondents should be given explicit instructions not to conduct online searches to inform their responses, but rather to provide an honest appraisal of their current knowledge and these instructions are included in the survey (see Supplementary Material 3 ).

Some provincial regulatory bodies have established core competencies for entry-level nurses that include MAiD. For example, the BC College of Nurses and Midwives (BCCNM) requires “knowledge about ethical, legal, and regulatory implications of medical assistance in dying (MAiD) when providing nursing care” [10, p. 6]. However, across Canada curricular content and coverage related to end of life care and MAiD is variable [23]. Given the dynamic nature of the legislation, which includes portions of the law that are embargoed until 2024, it is important to ensure that respondents are guided by current and accurate information. As the law, nursing curricula, and public attitudes continue to evolve, inclusion of core knowledge and content is essential so that investigators can interpret the portions of the survey focused on attitudes and beliefs about MAiD. Content knowledge portions of the survey may need to be modified over time as legislation and training change and to meet the specific purposes of the investigator.

Given the sensitive nature of the topic, it is strongly recommended that surveys be conducted anonymously and that students be provided with an opportunity to discuss their responses to the survey. A majority of feedback from both the expert panel of faculty and the student participants related to the wording and inclusion of demographic variables, in particular religion, religiosity, gender identity, and sex assigned at birth. These and other demographic variables have the potential to be highly identifying in small samples. In any instance in which the survey could be expected to yield demographic group sizes of less than 5, users should eliminate those demographic variables from the survey. For example, the profession of nursing is highly dominated by females, with over 90% of nurses identifying as female [50]. Thus, a survey within a single class of students, or even across classes in a single institution, is likely to yield a small number of male respondents and/or respondents who report a difference between sex assigned at birth and gender identity. When variables that serve to identify respondents are included, respondents are less likely to complete or submit the survey and more likely to obscure their responses so as not to be identifiable, or to be influenced by social desirability bias rather than conveying their attitudes accurately [51]. Further, small samples do not allow for conclusive analyses or interpretation of apparent group differences. Although these variables are often included in surveys, such demographics should be included only when anonymity can be sustained. In small and/or known samples, highly identifying variables should be omitted.
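One way a survey user might operationalise the “group sizes of less than 5” guidance above is sketched below. This is an assumption about implementation, not part of the published survey or the authors' analysis; the function, threshold constant, and response data are invented.

```python
# Illustrative sketch (assumption, not part of the published survey): flag
# demographic variables whose smallest response category falls below a
# minimum cell size, so they can be omitted before administration or analysis.
from collections import Counter

MIN_CELL_SIZE = 5  # threshold suggested in the discussion above


def identifying_variables(responses, demographic_vars):
    """Return demographic variables with any category smaller than MIN_CELL_SIZE."""
    flagged = []
    for var in demographic_vars:
        counts = Counter(r[var] for r in responses if r.get(var) is not None)
        if counts and min(counts.values()) < MIN_CELL_SIZE:
            flagged.append(var)
    return flagged


# Hypothetical class of 30 respondents
responses = (
    [{"gender": "female"}] * 27 + [{"gender": "male"}] * 2 + [{"gender": "non-binary"}]
)
print(identifying_variables(responses, ["gender"]))  # -> ['gender']: omit this item
```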

There are several limitations associated with the development of this survey. The expert panel was composed of faculty who teach nursing students and are knowledgeable about MAiD and curricular content; however, none identified as a conscientious objector to MAiD. Ideally, our expert panel would have included one or more conscientious objectors to MAiD to provide a broader perspective. Review by practitioners who participate in MAiD, those who are neutral or undecided, and practitioners who are conscientious objectors would help ensure broad applicability of the survey. This study included one student cognitive interview focus group with 5 self-selected participants. All student participants had held discussions about end of life care with at least one patient, 4 of 5 participants had worked with a patient who requested MAiD, and one had been present for a MAiD death. It is not clear that these participants are representative of nursing students demographically or by experience with end of life care. It is possible that the students who elected to participate hold perspectives and reflections on patient care and MAiD that differ from those of students with little or no exposure to end of life care and/or MAiD. However, previous studies find that most nursing students have been involved with end of life care, including meaningful discussions about patients’ preferences and care needs, during their education [40, 44, 47, 48, 52]. Data collection with additional focus groups, including students early in their training and students drawn from other training contexts, would contribute to further validation of survey items.

Future studies should incorporate pilot testing with a small sample of nursing students, followed by a larger cross-program sample, to allow evaluation of the psychometric properties of specific items and further refinement of the survey tool. Consistent with the literature about the importance of leadership in the context of MAiD [12, 53, 54], a study of faculty knowledge, beliefs, and attitudes toward MAiD would provide context for understanding student perspectives within and across programs. Additional research is also needed to understand the timing and content coverage of MAiD across the curricula of Canadian nurse training programs.

The implementation of MAiD is complex and requires understanding of the perspectives of multiple stakeholders. Within the field of nursing, this includes clinical providers, educators, and students who will deliver clinical care. A survey to assess nursing students’ attitudes toward and willingness to participate in MAiD in the Canadian context is timely, given the legislation enacted in 2016, the subsequent modifications to the law in 2021, and the portions of the law to be enacted in 2027. Further development of this survey could be undertaken to allow for use in settings with practicing nurses or to allow longitudinal follow-up with students as they enter practice. As the Canadian landscape changes, ongoing assessment of the perspectives and needs of health professionals and students in the health professions is needed to inform policy makers, leaders in practice, and curricular needs, and to monitor changes in attitudes and practice patterns over time.

Availability of data and materials

The datasets used and/or analysed during the current study are not publicly available due to small sample sizes, but are available from the corresponding author on reasonable request.

Abbreviations

BCCNM: British Columbia College of Nurses and Midwives

MAiD: Medical assistance in dying

NP: Nurse practitioner

RN: Registered nurse

UBCO: University of British Columbia Okanagan

Nicol J, Tiedemann M. Legislative Summary: Bill C-14: An Act to amend the Criminal Code and to make related amendments to other Acts (medical assistance in dying). Available from: https://lop.parl.ca/staticfiles/PublicWebsite/Home/ResearchPublications/LegislativeSummaries/PDF/42-1/c14-e.pdf .

Downie J, Scallion K. Foreseeably unclear. The meaning of the “reasonably foreseeable” criterion for access to medical assistance in dying in Canada. Dalhousie Law J. 2018;41(1):23–57.

Nicol J, Tiedeman M. Legislative summary of Bill C-7: an act to amend the criminal code (medical assistance in dying). Ottawa: Government of Canada; 2021.

Council of Canadian Academies. The state of knowledge on medical assistance in dying where a mental disorder is the sole underlying medical condition. Ottawa; 2018. Available from: https://cca-reports.ca/wp-content/uploads/2018/12/The-State-of-Knowledge-on-Medical-Assistance-in-Dying-Where-a-Mental-Disorder-is-the-Sole-Underlying-Medical-Condition.pdf .

Council of Canadian Academies. The state of knowledge on advance requests for medical assistance in dying. Ottawa; 2018. Available from: https://cca-reports.ca/wp-content/uploads/2019/02/The-State-of-Knowledge-on-Advance-Requests-for-Medical-Assistance-in-Dying.pdf .

Council of Canadian Academies. The state of knowledge on medical assistance in dying for mature minors. Ottawa; 2018. Available from: https://cca-reports.ca/wp-content/uploads/2018/12/The-State-of-Knowledge-on-Medical-Assistance-in-Dying-for-Mature-Minors.pdf .

Health Canada. Third annual report on medical assistance in dying in Canada 2021. Ottawa; 2022. [cited 2023 Oct 23]. Available from: https://www.canada.ca/en/health-canada/services/medical-assistance-dying/annual-report-2021.html .

Banner D, Schiller CJ, Freeman S. Medical assistance in dying: a political issue for nurses and nursing in Canada. Nurs Philos. 2019;20(4): e12281.

Pesut B, Thorne S, Stager ML, Schiller CJ, Penney C, Hoffman C, et al. Medical assistance in dying: a review of Canadian nursing regulatory documents. Policy Polit Nurs Pract. 2019;20(3):113–30.

College of Registered Nurses of British Columbia. Scope of practice for registered nurses [Internet]. Vancouver; 2018. Available from: https://www.bccnm.ca/Documents/standards_practice/rn/RN_ScopeofPractice.pdf .

Pesut B, Thorne S, Schiller C, Greig M, Roussel J, Tishelman C. Constructing good nursing practice for medical assistance in dying in Canada: an interpretive descriptive study. Global Qual Nurs Res. 2020;7:2333393620938686. https://doi.org/10.1177/2333393620938686 .

Pesut B, Thorne S, Schiller CJ, Greig M, Roussel J. The rocks and hard places of MAiD: a qualitative study of nursing practice in the context of legislated assisted death. BMC Nurs. 2020;19:12. https://doi.org/10.1186/s12912-020-0404-5 .

Pesut B, Greig M, Thorne S, Burgess M, Storch JL, Tishelman C, et al. Nursing and euthanasia: a narrative review of the nursing ethics literature. Nurs Ethics. 2020;27(1):152–67.

Pesut B, Thorne S, Storch J, Chambaere K, Greig M, Burgess M. Riding an elephant: a qualitative study of nurses’ moral journeys in the context of Medical Assistance in Dying (MAiD). Journal Clin Nurs. 2020;29(19–20):3870–81.

Lamb C, Babenko-Mould Y, Evans M, Wong CA, Kirkwood KW. Conscientious objection and nurses: results of an interpretive phenomenological study. Nurs Ethics. 2018;26(5):1337–49.

Wright DK, Chan LS, Fishman JR, Macdonald ME. “Reflection and soul searching:” Negotiating nursing identity at the fault lines of palliative care and medical assistance in dying. Social Sci & Med. 2021;289: 114366.

Beuthin R, Bruce A, Scaia M. Medical assistance in dying (MAiD): Canadian nurses’ experiences. Nurs Forum. 2018;54(4):511–20.

Bruce A, Beuthin R. Medically assisted dying in Canada: "Beautiful Death" is transforming nurses' experiences of suffering. The Canadian J Nurs Res | Revue Canadienne de Recherche en Sci Infirmieres. 2020;52(4):268–77. https://doi.org/10.1177/0844562119856234 .

Canadian Nurses Association. Code of ethics for registered nurses. Ottawa; 2017. Available from: https://www.cna-aiic.ca/en/nursing/regulated-nursing-in-canada/nursing-ethics .

Canadian Nurses Association. National nursing framework on Medical Assistance in Dying in Canada. Ottawa: 2017. Available from: https://www.virtualhospice.ca/Assets/cna-national-nursing-framework-on-maidEng_20170216155827.pdf .

Pesut B, Thorne S, Greig M. Shades of gray: conscientious objection in medical assistance in dying. Nursing Inq. 2020;27(1): e12308.

Durojaiye A, Ryan R, Doody O. Student nurse education and preparation for palliative care: a scoping review. PLoS ONE. 2023. https://doi.org/10.1371/journal.pone.0286678 .

McMechan C, Bruce A, Beuthin R. Canadian nursing students’ experiences with medical assistance in dying | Les expériences d’étudiantes en sciences infirmières au regard de l’aide médicale à mourir. Qual Adv Nurs Educ - Avancées en Formation Infirmière. 2019;5(1). https://doi.org/10.17483/2368-6669.1179 .

Adler M, Ziglio E. Gazing into the oracle. The Delphi method and its application to social policy and public health. London: Jessica Kingsley Publishers; 1996

Keeney S, Hasson F, McKenna H. Consulting the oracle: ten lessons from using the Delphi technique in nursing research. J Adv Nurs. 2006;53(2):205–12.

Keeney S, Hasson F, McKenna H. The Delphi technique in nursing and health research. 1st ed. Wiley; 2011.

Willis GB. Cognitive interviewing: a tool for improving questionnaire design. 1st ed. Thousand Oaks, Calif: Sage; 2005. ISBN: 9780761928041

Lamb C, Evans M, Babenko-Mould Y, Wong CA, Kirkwood EW. Conscience, conscientious objection, and nursing: a concept analysis. Nurs Ethics. 2017;26(1):37–49.

Lamb C, Evans M, Babenko-Mould Y, Wong CA, Kirkwood K. Nurses’ use of conscientious objection and the implications of conscience. J Adv Nurs. 2018;75(3):594–602.

de Vaus D. Surveys in social research. 6th ed. Abingdon, Oxon: Routledge; 2014.

Boateng GO, Neilands TB, Frongillo EA, Melgar-Quiñonez HR, Young SL. Best practices for developing and validating scales for health, social, and behavioral research: A primer. Front Public Health. 2018;6:149. https://doi.org/10.3389/fpubh.2018.00149 .

Puchta C, Potter J. Focus group practice. 1st ed. London: Sage; 2004.

Streiner DL, Norman GR, Cairney J. Health measurement scales: a practical guide to their development and use. 5th ed. Oxford: Oxford University Press; 2015.

Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

Adesina O, DeBellis A, Zannettino L. Third-year Australian nursing students’ attitudes, experiences, knowledge, and education concerning end-of-life care. Int J of Palliative Nurs. 2014;20(8):395–401.

Bator EX, Philpott B, Costa AP. This moral coil: a cross-sectional survey of Canadian medical student attitudes toward medical assistance in dying. BMC Med Ethics. 2017;18(1):58.

Beuthin R, Bruce A, Scaia M. Medical assistance in dying (MAiD): Canadian nurses’ experiences. Nurs Forum. 2018;53(4):511–20.

Brown J, Goodridge D, Thorpe L, Crizzle A. What is right for me, is not necessarily right for you: the endogenous factors influencing nonparticipation in medical assistance in dying. Qual Health Res. 2021;31(10):1786–1800.

Falconer J, Couture F, Demir KK, Lang M, Shefman Z, Woo M. Perceptions and intentions toward medical assistance in dying among Canadian medical students. BMC Med Ethics. 2019;20(1):22.

Green G, Reicher S, Herman M, Raspaolo A, Spero T, Blau A. Attitudes toward euthanasia—dual view: Nursing students and nurses. Death Stud. 2022;46(1):124–31.

Hosseinzadeh K, Rafiei H. Nursing student attitudes toward euthanasia: a cross-sectional study. Nurs Ethics. 2019;26(2):496–503.

Ozcelik H, Tekir O, Samancioglu S, Fadiloglu C, Ozkara E. Nursing students’ approaches toward euthanasia. Omega (Westport). 2014;69(1):93–103.

Canning SE, Drew C. Canadian nursing students’ understanding, and comfort levels related to medical assistance in dying. Qual Adv Nurs Educ - Avancées en Formation Infirmière. 2022;8(2). https://doi.org/10.17483/2368-6669.1326 .

Edo-Gual M, Tomás-Sábado J, Bardallo-Porras D, Monforte-Royo C. The impact of death and dying on nursing students: an explanatory model. J Clin Nurs. 2014;23(23–24):3501–12.

Freeman LA, Pfaff KA, Kopchek L, Liebman J. Investigating palliative care nurse attitudes towards medical assistance in dying: an exploratory cross-sectional study. J Adv Nurs. 2020;76(2):535–45.

Brown J, Goodridge D, Thorpe L, Crizzle A. “I am okay with it, but I am not going to do it:” the exogenous factors influencing non-participation in medical assistance in dying. Qual Health Res. 2021;31(12):2274–89.

Dimoula M, Kotronoulas G, Katsaragakis S, Christou M, Sgourou S, Patiraki E. Undergraduate nursing students’ knowledge about palliative care and attitudes towards end-of-life care: A three-cohort, cross-sectional survey. Nurs Educ Today. 2019;74:7–14.

Matchim Y, Raetong P. Thai nursing students’ experiences of caring for patients at the end of life: a phenomenological study. Int J Palliative Nurs. 2018;24(5):220–9.

Canadian Institute for Health Research. Sex and gender in health research [Internet]. Ottawa: CIHR; 2021 [cited 2023 Oct 23]. Available from: https://cihr-irsc.gc.ca/e/50833.html .

Canadian Nurses’ Association. Nursing statistics. Ottawa: CNA; 2023 [cited 2023 Oct 23]. Available from: https://www.cna-aiic.ca/en/nursing/regulated-nursing-in-canada/nursing-statistics .

Krumpal I. Determinants of social desirability bias in sensitive surveys: a literature review. Qual Quant. 2013;47(4):2025–47. https://doi.org/10.1007/s11135-011-9640-9 .

Ferri P, Di Lorenzo R, Stifani S, Morotti E, Vagnini M, Jiménez Herrera MF, et al. Nursing student attitudes toward dying patient care: a European multicenter cross-sectional study. Acta Bio Medica Atenei Parmensis. 2021;92(S2): e2021018.

Beuthin R, Bruce A. Medical assistance in dying (MAiD): Ten things leaders need to know. Nurs Leadership. 2018;31(4):74–81.

Thiele T, Dunsford J. Nurse leaders’ role in medical assistance in dying: a relational ethics approach. Nurs Ethics. 2019;26(4):993–9.

Acknowledgements

We would like to acknowledge the faculty and students who generously contributed their time to this work.

JS received a student traineeship through the Principal Research Chairs program at the University of British Columbia Okanagan.

Author information

Authors and Affiliations

School of Health and Human Services, Selkirk College, Castlegar, BC, Canada

Jocelyn Schroeder & Barbara Pesut

School of Nursing, University of British Columbia Okanagan, Kelowna, BC, Canada

Barbara Pesut, Lise Olsen, Nelly D. Oelke & Helen Sharp

Contributions

JS made substantial contributions to the conception of the work; data acquisition, analysis, and interpretation; and drafting and substantively revising the work. JS has approved the submitted version and agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. BP made substantial contributions to the conception of the work; data acquisition, analysis, and interpretation; and drafting and substantively revising the work. BP has approved the submitted version and agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. LO made substantial contributions to the conception of the work; data acquisition, analysis, and interpretation; and substantively revising the work. LO has approved the submitted version and agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. NDO made substantial contributions to the conception of the work; data acquisition, analysis, and interpretation; and substantively revising the work. NDO has approved the submitted version and agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. HS made substantial contributions to drafting and substantively revising the work. HS has approved the submitted version and agreed to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Authors’ information

JS conducted this study as part of their graduate requirements in the School of Nursing, University of British Columbia Okanagan.

Corresponding author

Correspondence to Barbara Pesut .

Ethics declarations

Ethics approval and consent to participate.

The research was approved by the Selkirk College Research Ethics Board (REB) ID # 2021–011 and the University of British Columbia Behavioral Research Ethics Board ID # H21-01181.

All participants provided written and informed consent through approved consent processes. Research was conducted in accordance with the Declaration of Helsinki.

Consent for publication

Not applicable.

Competing interests

The authors declare they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Supplementary Material 2.

Supplementary Material 3.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article.

Schroeder, J., Pesut, B., Olsen, L. et al. Developing a survey to measure nursing students’ knowledge, attitudes and beliefs, influences, and willingness to be involved in Medical Assistance in Dying (MAiD): a mixed method modified e-Delphi study. BMC Nurs 23 , 326 (2024). https://doi.org/10.1186/s12912-024-01984-z

Received : 24 October 2023

Accepted : 28 April 2024

Published : 14 May 2024

DOI : https://doi.org/10.1186/s12912-024-01984-z

Keywords

  • Medical assistance in dying (MAiD)
  • End of life care
  • Student nurses
  • Nursing education
