
Writing implementation research grant proposals: ten key ingredients

  • Enola K. Proctor,
  • Byron J. Powell,
  • Ana A. Baumann,
  • Ashley M. Hamilton &
  • Ryan L. Santens

Implementation Science, volume 7, Article number: 96 (2012). Published: 12 October 2012. Open access.


Abstract

All investigators seeking funding to conduct implementation research face the challenges of preparing a high-quality proposal and demonstrating their capacity to conduct the proposed study. Applicants need to demonstrate the progressive nature of their research agenda and their ability to build cumulatively upon the literature and their own preliminary studies. Because implementation science is an emerging field involving complex and multilevel processes, many investigators may not feel equipped to write competitive proposals, and this concern is pronounced among early-stage implementation researchers.

This article addresses the challenges of preparing grant applications that succeed in the emerging field of dissemination and implementation. We summarize ten ingredients that are important in implementation research grants. For each, we provide examples of how preliminary data, background literature, and narrative detail in the application can strengthen the application.

Every investigator struggles with the challenge of fitting into a page-limited application the research background, methodological detail, and information that can convey the project’s feasibility and likelihood of success. While no application can include a high level of detail about every ingredient, addressing the ten ingredients summarized in this article can help assure reviewers of the significance, feasibility, and impact of the proposed research.

Background

Investigators seeking funding to conduct implementation research face the challenges of preparing a high-quality proposal and demonstrating their capacity to conduct the proposed study. Researchers need to demonstrate the progressive nature of their research agenda and their ability to build cumulatively upon the literature and their own preliminary studies. Because implementation science is an emerging field involving complex and multilevel processes, most investigators may feel ‘new to the field.’ Furthermore, young investigators may have less preliminary data, and the path to successful proposal writing may seem less clear.

This article identifies ten of the important ingredients in well-crafted implementation proposals; in particular, it addresses how investigators can set the stage for proposed work through pilot data and a well-crafted, well-rationalized study approach. It addresses questions such as: What preliminary work is important in grant applications, and how can implementation researchers meet this challenge? How can investigators balance scientific impact with feasibility? Where in an implementation research proposal can investigators demonstrate their capacity to conduct a study as proposed?

The importance of the question

A significant and innovative research question is the first and primary ingredient in a successful proposal. A competitive implementation research application needs to pursue scientific questions that remain unanswered, questions whose answers advance knowledge of implementation with generalizability beyond a given setting. By definition, implementation research in health focuses on a health condition or disease, healthcare settings, and particular evidence-based interventions and programs with promise of reducing a gap in quality of care. It is conducted in usual care settings with practical quality gaps that stakeholders want to reduce. However, to make a compelling argument for scientific innovation and public health significance, a research grant application must have potential beyond reducing a quality gap and implementing a particular evidence-based healthcare practice. The application must have potential to advance the science of implementation by yielding generalizable knowledge. With only one journal devoted solely to implementation science [1], researchers must be aware of implementation literature that is scattered across a host of discipline-specific journals. Implementation researchers—akin to students with multiple majors—must demonstrate their grounding in implementation science; in the relevant diseases, disorders, and their treatments; and in real-world healthcare delivery.

Although implementation science is often characterized as an emerging field, its bar for scientifically important questions is rising rapidly. Descriptive studies of barriers have dominated implementation science for too long, and the field is urged to ‘move on’ to questions of how and why implementation processes are effective. Accordingly, the Institute of Medicine [ 2 ] has identified studies comparing the effectiveness of alternative dissemination and implementation strategies as a top-quartile priority for comparative effectiveness research. But experimental studies testing implementation strategies need to be informed by systematic background research on the contexts and processes of implementation. While investigators must demonstrate their understanding of these complexities, their grant proposals must balance feasibility with scientific impact. This paper addresses the challenges of preparing grant applications that succeed on these fronts. Though this article focuses on U.S. funding sources and grant mechanisms, the principles that are discussed should be relevant to implementation researchers internationally.

Guidance from grant program announcements

Grant review focuses on the significance of proposed aims, impact and innovation, investigator capacity to conduct the study as proposed, and support for the study hypotheses and research design. The entire application should address these issues. Investigators early in their research careers or new to implementation science often struggle to demonstrate their capacity to conduct the proposed study and the feasibility of the proposed methods. Not all National Institutes of Health (NIH) program announcements require preliminary data. However, those that do are clear that applications must convey investigator training and experience, capacity to conduct the study as proposed, and support for the study hypotheses and research design [ 3 ]. The more complex the project, the more important it is to provide evidence of capacity and feasibility [ 4 ].

The R01 grant mechanism is typically large in scope compared to the R03, R21, and R34 mechanisms [a]. Program announcements for grant mechanisms that are preliminary to R01 studies give important clues as to how to set the stage for an R01 and demonstrate feasibility. Investigator capacity can be demonstrated by describing prior work, experience, and training relevant to the application's setting, substantive issues, and methodology—drawing on prior employment and research experience. For example, the NIH R03 small grant mechanism is often used to establish the feasibility of procedures, pilot test instruments, and refine data management procedures to be employed in a subsequent R01. The NIH R21 and R34 mechanisms support the development of new tools or technologies; proof-of-concept studies; early phases of research that evaluate the feasibility, tolerability, acceptability, and safety of novel treatments; demonstrations of the feasibility of recruitment protocols; and the development of assessment protocols and manuals for programs and treatments to be tested in subsequent R01 studies. These exploratory grants do not require extensive background material or preliminary information, but rather serve as vehicles for gathering data for subsequent R01 studies. These program announcements provide a long list of ways pre-R01 mechanisms can be used, and no single application can or should provide all the stage-setting work exemplified in these descriptions.

Review criteria, typically available on funding agency websites or within program announcements, may vary slightly by funding mechanism. However, grants are typically reviewed and scored according to criteria such as significance, approach (feasibility, appropriateness, robustness), impact, innovation, investigator team, and research environment. Table 1 summarizes the ten ingredients, provides a checklist for reviewing applications prior to submission, and ties each ingredient to one or more of the typical grant review criteria.

The literature does not provide '...a comprehensive, prescriptive, and robust (yet practical) model to help...researchers understand [what] factors need to be considered and addressed' in an R01 study [5]. Therefore, we examined a variety of sources to identify recommendations and examples of background work that can strengthen implementation research proposals. This paper reflects our team's experience with early-career implementation researchers, specifically through training programs in implementation science and our work providing technical assistance in implementation research through our university's Clinical and Translational Science Award (CTSA) program. We also studied grant program announcements, notably the R03, R21, R18, and R01 program announcements in implementation science [6–9], and examined how successful implementation research R01 applications 'set the stage' for the proposed study in various sections of the proposal. We conducted a literature search using combinations of the following key words: 'implementation research,' 'implementation studies,' 'preliminary studies,' 'preliminary data,' 'pilot studies,' 'pilot data,' 'pilot,' 'implementation stages,' 'implementation phases,' and 'feasibility.' We also drew on published studies describing the introduction and testing of implementation strategies and those that characterize key elements and phases of implementation research [10, 11].

From these reviews, we identified ten ingredients that are important in all implementation research grants:

1. The gap between usual care and evidence-based care;
2. The background of the evidence-based treatment to be implemented, its empirical base, and its requisites;
3. The theoretical framework for implementation, with explicit theoretical justification for the choice of implementation strategies;
4. Information about stakeholders' (providers', consumers', policymakers') treatment priorities;
5. The setting's (and providers') readiness to adopt new treatments;
6. The implementation strategies planned or considered to implement evidence-based care;
7. The study team's experience with the setting, treatment, or implementation process, and the research environment;
8. The feasibility and requisites of the proposed methods;
9. The measurement and analysis of study variables; and
10. The health delivery setting's policy/funding environment, and its leverage or support for sustaining change.

Given the sparse literature on the importance of preliminary studies for implementation science grant applications, we 'vetted' our list of grant application components with a convenience sample of experts. Ultimately, nine experts responded to our request, including six members of the Implementation Science editorial board. We asked the experts to rate the importance of each of the ten elements, within the context of demonstrating investigator capacity and study feasibility, as '1: Very important to address in the application,' '2: Helpful but not necessary to the application,' or '3: Not very important to address.' Respondents were also asked whether there were any additional factors that were not listed.

While all ten ingredients below were considered important for a successful application, several experts noted that their importance varies according to the aims of the application. For example, one expert affirmed the importance of the setting's readiness to change, but noted that it may not be crucial to address in a given proposal: 'the setting's readiness may be unimportant to establish or report prior to the study, because the study purpose may be to establish an answer to this question.' However, another maintained, 'in a good grant application, you have to dot all the 'I's' and cross all the 'T's.' I consider all these important.' One expert noted that applications might need to argue the importance of implementation research itself, including the importance of closing or reducing gaps in the quality of care. This was viewed as particularly important when the study section reviewing the grant may not understand or appreciate implementation research. In these cases, it may be important to define and differentiate implementation research from other types of clinical and health services research. For example, it may be useful to situate one's proposal within the Institute of Medicine's 'prevention research cycle,' which shows the progression from pre-intervention, efficacy, and effectiveness research to dissemination and implementation studies that focus on the adoption, sustainability, and scale-up of interventions [12]. It may also be important to convey that implementation research is complex, necessitating the use of multiple methods, a high degree of stakeholder involvement, and a fair amount of flexibility to ensure that implementers can respond appropriately to unforeseen barriers.

Ten key ingredients of a competitive implementation research grant application

As emphasized at the beginning of this article, the essential ingredient in a successful implementation science proposal is a research question that is innovative and, when answered, can advance the field of implementation science. Assuming that an important question has been posed, we propose that the following ten ingredients can help investigators demonstrate both their capacity to conduct the study and the feasibility of completing it as proposed. For each ingredient, we provide examples of how preliminary data, background literature, and narrative detail can strengthen the application.

The care gap, or quality gap, addressed in the application

The primary rationale for all implementation efforts, and thus a key driver in implementation science, is discovering how to reduce gaps in healthcare access or quality or, from a public health perspective, how to reduce the gap between Healthy People 2020 goals [13] and current health status. Accordingly, implementation research proposals should provide clear evidence that gaps exist and that there is room for improvement and impact through the proposed implementation effort. This is a primary way of demonstrating the public health significance of the proposed work.

Gaps in the quality of programs, services, and healthcare can be measured and documented at the population, organization, and provider levels [14]. Several kinds of preliminary data can demonstrate the quality gap to be reduced through the proposed implementation effort. For example, investigators can emphasize the burden of disease through data on its morbidity, mortality, quality-of-life impact, and cost [14]. An implementation research grant can cite service system research demonstrating unmet need [15], wide variation in the use of evidence-based treatments in usual care [16–19], or an association between the burden of disease and variations in the use of guidelines [20]. Investigators can also document that few providers adopt evidence-based treatments [21, 22], that evidence-based treatments or programs have limited reach [23], or that their penetration [24] into a system of care is low and can be addressed by the implementation study. Regardless of the specific approach to documenting a quality gap, investigators should use rigorous methods and involve all relevant stakeholders [14]. In fact, stakeholders can demonstrate their involvement and corroborate quality gaps through letters of support attesting to the lack of evidence-based services in usual care.

The evidence-based treatment to be implemented

A second key ingredient in implementation research proposals is the evidence-based program, treatment, policy, or set of services whose implementation will be studied in the proposed research [25–27]. The research 'pipeline' [28–30] contains a backlog of effective programs and treatments waiting to be implemented, and many health settings experience strong demand for better care. An appropriate evidence-based treatment contributes to the project's public health significance and practical impact, presuming, of course, that it will be studied in a way that contributes to implementation science.

Implementation research proposals must demonstrate that the evidence-based service is ready for implementation. The strength of the empirical evidence for a given guideline or treatment [31, 32], a key part of 'readiness,' can be demonstrated in a variety of ways; in some fields, specific thresholds must be met before an intervention is deemed 'evidence-based' or 'empirically supported' [33–35]. For example, Chambless et al. [35] suggest that interventions should demonstrate efficacy by being shown to be superior to placebo or to another treatment in at least two between-group design experiments, or by showing efficacy in a large series of single-case design experiments. Further, Chambless et al. [35] note that the experiments must have been conducted with treatment manuals, the characteristics of the samples must have been clearly specified, and the effects must have been demonstrated by at least two different investigators or investigative teams.

The strength of evidence for a given treatment can also be classified using the Cochrane Effective Practice and Organisation of Care (EPOC) group's criteria for levels of evidence, which consider randomized controlled trials, controlled clinical trials, time series designs, and controlled before-and-after studies as appropriate sources of evidence [36]. Researchers who come to implementation research as effectiveness researchers or as program or treatment developers are well positioned, because they can point to their prior research as part of their own background work. Other researchers can establish readiness for implementation by reviewing evidence for the treatment or program as part of the background literature review, preferably relying on well-conducted systematic reviews and meta-analyses of randomized controlled trials, if available. At a minimum, an 'evaluability assessment' [37] can help determine what changes or improvements are needed to optimize effectiveness given the context of the implementation effort.

Conceptual model and theoretical justification

Any research striving for generalizable knowledge should be guided by, and propose to test, conceptual frameworks, models, and theories [38]. Yet theory has been drastically underutilized and underspecified in implementation research [38–40]. For example, in a review of 235 implementation studies, fewer than 25% of the studies employed theory in any way, and only 6% were explicitly theory-based [39]. While translating theory into research design is not an easy task [36], the absence of theory in implementation research has limited our ability to specify key contextual variables and to identify the precise mechanisms by which implementation strategies exert their effects.

McDonald et al. [41] present a useful hierarchy of theories and models, which serves to organize the different levels of theory and specify the ways in which they can be useful in implementation research. They differentiate between conceptual models, frameworks, and systems, which are used to represent global ideas about a phenomenon, and theory, which is an 'organized, heuristic, coherent, and systematic set of statements related to significant questions that are communicated in a meaningful whole' [41]. Within the realm of theory, they differentiate between grand or macro theories (e.g., Rogers' Diffusion of Innovations theory [26]), mid-range theories (e.g., the transtheoretical model of change [42]), and micro theories (e.g., feedback intervention theory [43]). Though models, frameworks, and systems are generally at a higher level of abstraction than theories, it is important to note that the level of abstraction varies both between and within the categories of the hierarchy. The thoughtful integration of both conceptual models and theories can substantially strengthen an application.

Conceptual models, frameworks, and systems can play a critical role in anchoring a research study theoretically by portraying the key variables and relationships to be tested. Even studies that address only a subset of variables within a conceptual model need to be framed conceptually, so that reviewers perceive the larger context (and body of literature) that a particular study proposes to inform. Given the confusion surrounding definitions and terminology within the still-evolving field of dissemination and implementation [ 44 , 45 ], grant proposals need to employ consistent language, clear definitions for constructs, and the most valid and reliable measures for the constructs that correspond to the guiding conceptual framework or theoretical model. Proposal writers should be cautioned that the theory or conceptual model used to frame the study must be used within the application. A mere mention will not suffice. A conceptual model can help frame study questions and hypotheses, anchor the background literature, clarify the constructs to be measured, and illustrate the relationships to be evaluated or tested. The application must also spell out how potential findings will inform the theory or model.

Numerous models and frameworks can inform implementation research. For example, Glasgow et al.'s [23] RE-AIM framework can inform evaluation efforts in implementation science. Similarly, Proctor et al. [46] have proposed a model that informs evaluation by differentiating implementation, service system, and clinical outcomes, and by identifying a range of implementation outcomes that can be assessed [24]. Damschroder et al.'s [10] Consolidated Framework for Implementation Research identifies five domains that are critical to successful implementation: intervention characteristics (evidentiary support, relative advantage, adaptability, trialability, and complexity); the outer setting (patient needs and resources, organizational connectedness, peer pressure, external policy and incentives); the inner setting (structural characteristics, networks and communications, culture, climate, readiness for implementation); the characteristics of the individuals involved (knowledge, self-efficacy, stage of change, identification with the organization, etc.); and the process of implementation (planning, engaging, executing, reflecting, evaluating). Others have published stage or phase models of implementation. For example, the Department of Veterans Affairs' QUERI initiative [47] specifies a four-phase model spanning pilot projects, small clinical trials, regional implementation, and implementation on the national scale; and Aarons, Hurlburt, and Horwitz [48] developed a four-phase model of exploration, adoption/preparation, active implementation, and sustainment. Magnabosco [49] delineates pre-implementation, initial implementation, and sustainability planning phases.

McDonald et al. [41] note that grand theories are similar to conceptual models and generally represent theories of change. They differentiate between classical models of change, which emphasize natural or passive change processes, such as Rogers' diffusion of innovations theory [26], and planned models of change, which specify central elements of active implementation efforts. Investigators may find it more helpful to draw from mid-range theories, because they address the mechanisms of change at various levels of the implementation context [26]. For example, social psychological theories, organizational theories, cognitive psychology theories, educational theories, and a host of others may be relevant to a proposed project. While conceptual models are useful for framing a study theoretically and providing a 'big picture' of the hypothesized relationships between variables, mid-range theories can be more helpful in justifying the selection of specific implementation strategies and in specifying the mechanisms by which they may exert their effects. Given the different roles that theory can play in implementation research, investigators would be wise to consider relevant theories at multiple levels of the theoretical hierarchy when preparing their proposals. It is beyond the scope of this article to review conceptual models and theories in detail; however, several authors have produced invaluable syntheses of conceptual models and theories that investigators may find useful [10, 41, 50–56].

Stakeholder priorities and engagement in change

Successful implementation of evidence-based interventions largely depends on their fit with the preferences and priorities of those who shape, deliver, and participate in healthcare. Stakeholders in implementation, and thus in implementation research, include treatment or guideline developers, researchers, administrators, providers, funders, community-based organizations, consumers, families, and perhaps legislators who shape reimbursement policies (see Mendel et al.'s article [57] for a framework that outlines different levels of stakeholders). These stakeholders are likely to vary in their knowledge, perceptions, and preferences for healthcare. Their perspectives contribute substantially to the context of implementation and must be understood and addressed if the implementation effort is to succeed. A National Institute of Mental Health Council workgroup report [58] calls for the engagement of multiple stakeholder perspectives, from concept development to implementation, in order to improve the sustainability of evidence-based services in real-world practice. The engagement of key stakeholders affects the impact of proposed implementation efforts, the sustainability of the proposed change, and the feasibility and ultimate success of the proposed research project. Thus, implementation research grant proposals should convey the extent and manner in which key stakeholders are engaged in the project.

Stakeholders and researchers can forge different types of collaborative relationships. Lindamer et al. [59] describe three approaches that vary in the level of stakeholder and community participation in decisions about the research. In the 'community-targeted' approach, stakeholders are involved in recruitment and in the dissemination of results. In the 'community-based' approach, stakeholders participate in the selection of research topics, but the researcher makes the final decisions on study design, methodology, and data analysis. Finally, the 'community-driven,' or community-based participatory research (CBPR), approach entails participation of stakeholders in all aspects of the research. Some authors advocate the CBPR model as a strategy to decrease the gap between research and practice, because it addresses some of the barriers to implementation and dissemination [60–62] by enhancing the external validity of the research and promoting the sustainability of the intervention. Kerner et al. [62] note:

‘When community-based organizations are involved as full partners in study design, implementation, and evaluation of study findings, these organizations may be more amenable to adopting the approaches identified as being effective, as their tacit knowledge about ‘what works’ would have been evaluated explicitly through research.’

Stakeholder analysis can be carried out to evaluate and understand stakeholders’ interests, interrelations, influences, preferences, and priorities. The information gathered from stakeholder analysis can then be used to develop strategies for collaborating with stakeholders, to facilitate the implementation of decisions or organizational objectives, or to understand the future of policy directions [ 63 , 64 ].

Implementation research grant applications are stronger when preliminary data, qualitative or quantitative, reflect stakeholder preferences regarding the proposed change. Engagement is also reflected in publications co-authored by the principal investigator (PI) and key stakeholders, and in methodological details that reflect stakeholder priorities. Letters of support are a minimal reflection of stakeholder investment in the proposed implementation project.

Context: setting's readiness to adopt new services/treatments/programs

Implementation research proposals are strengthened by information that reflects the setting’s readiness, capacity, or appetite for change, specifically around adoption of the proposed evidence-based treatment. This is not to say that all implementation research should be conducted in settings with high appetite for change. Implementation research is often criticized for disproportionate focus on settings that are eager and ready for change. ‘Cherry picking’ sites, where change is virtually guaranteed, or studying implementation only with eager and early adopters, does not produce knowledge that can generalize to usual care, where change is often challenging. The field of implementation science needs information about the process of change where readiness varies, including settings where change is resisted.

Preliminary data on the organizational and policy context and its readiness for change can strengthen an application. Typically viewed as 'nuisance' variance to be controlled in efficacy and effectiveness research, contextual factors are key in implementation research [65–67]. The primacy of context is reflected in the choice of 'it's all about context' as a theme at the 2011 NIH Training Institute in Dissemination and Implementation Research in Health [68]. Because the organizational, policy, and funding context may be among the strongest influences on implementation outcomes, context needs to be examined front and center in implementation research [69]. A number of scales are available to capture one key aspect of context, the setting's readiness or capacity for change. Weiner et al.'s [70] extensive review of the conceptualization and measurement of organizational readiness for change identified 43 different instruments, though the authors acknowledged substantial problems with the reliability and validity of many of the measures. Due in part to these issues, work in this area is ongoing [71, 72].

Other approaches to assessing readiness have focused on organizational culture, climate, and work attitudes [ 73 ], and on providers’ attitudes towards evidence-based practices [ 21 , 22 , 74 ]. Furthermore, a prospective identification of implementation barriers and facilitators can be helpful in demonstrating readiness to change, increasing reviewers’ confidence that the PI has thoroughly assessed the implementation context, and informing the selection of implementation strategies (discussed in the following section) [ 75 – 77 ]. An evaluation of barriers and facilitators can be conducted through qualitative [ 78 – 80 ] or survey [ 81 , 82 ] methodology. In fact, a number of scales for measuring implementation barriers have been developed [ 74 , 83 , 84 ]. Letters from agency partners or policy makers, while weaker than data, can also be used to convey the setting’s readiness and capacity for change. Letters are stronger when they address the alignment of the implementation effort to setting or organizational priorities or to current or emergent policies.

Implementation strategy/process

Though the assessment of implementation barriers can play an important role in implementation research, the ‘rising bar’ in the field demands that investigators move beyond the study of barriers to research that generates knowledge about the implementation processes and strategies that can overcome them. Accordingly, the NIH has prioritized efforts to ‘identify, develop, and refine effective and efficient methods, structures, and strategies to disseminate and implement’ innovations in healthcare [ 7 ].

A number of implementation strategies have been identified and discussed in the literature [36, 85–87]. However, as the Improved Clinical Effectiveness through Behavioural Research Group notes [38], the most consistent finding from systematic reviews of implementation strategies is that most are effective some, but not all, of the time, and produce effect sizes ranging from no effect to a large effect. Our inability to determine how, why, when, and for whom these strategies are effective is hampered in large part by the absence of detailed descriptions of implementation strategies [40], the use of inconsistent language [44], and the lack of clear theoretical justification for the selection of specific strategies [39]. Thus, investigators should take great care to provide detailed descriptions of the implementation strategies to be observed or empirically tested. Implementation Science has endorsed [40] the use of the WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions [88] as a means of improving the conduct and reporting of implementation research, and these recommendations will be useful to investigators whose proposals employ implementation strategies. Investigators may also find the Standards for Quality Improvement Reporting Excellence (SQUIRE) helpful [89]; additional design-specific reporting guidelines can be found on the Equator Network website [90]. The selection of strategies should be justified conceptually by drawing upon models and frameworks that outline critical implementation elements [10]. Theory should be used to explain the mechanisms through which implementation strategies are proposed to exert their effects [39], and it may be helpful to clarify the proposed mechanisms of change by developing a logic model and illustrating it with a figure [91].

According to Brian Mittman, in addition to being theory-based, implementation strategies should be: multifaceted or multilevel (if appropriate); robust or readily adaptable; feasible and acceptable to stakeholders; compelling, saleable, trialable, and observable; sustainable; and scalable [92, 93]. We therefore emphasize taking stock of the budget impact of implementation strategies [94], as well as any cost and cost-effectiveness data related to them [95]. Although budget impact is a key concern of administrators, and some funding agencies require budget impact analysis, implementation science to date suffers from a dearth of economic evaluations from which to draw [96, 97].

The empirical evidence for the effectiveness of multifaceted strategies has been mixed: early research touted the benefits of multifaceted strategies [98, 99], while a systematic review of 235 implementation trials by Grimshaw et al. found no relationship between the number of component interventions and the effects of multifaceted interventions [100]. However, Wensing et al. [101] note that while multifaceted interventions are assumed to address multiple barriers to change, many focus on only one barrier. For example, providing training and consultation is a multifaceted implementation strategy, yet it primarily serves to increase provider knowledge and does not address other implementation barriers. Thus, Wensing et al. [101] argue that multifaceted interventions could be more effective if they addressed different types of implementation barriers (e.g., provider knowledge and the organizational context). While methods for tailoring clinical interventions and implementation strategies to local contexts need to be improved [102], intervention mapping [103] and the recently developed 'behaviour change wheel' [104] are two promising approaches.

Proposals that employ multifaceted and multilevel strategies addressing prospectively identified implementation barriers [102] may be more compelling to review committees, but mounting complex experiments may be beyond the reach of many early-stage investigators and many grant mechanisms. However, it is within the scope of R03-, R21-, and R34-supported research to develop implementation strategies and to conduct pilot tests of their feasibility and acceptability—work that can strengthen the case for sustainability and scalability. Proposal writers should provide preliminary work for implementation strategies in much the same way that intervention developers do, such as by providing manuals or protocols to guide their use and methods to gauge their fidelity. Such work is illustrated in the pilot study conducted by Kauth et al. [105], which demonstrated that an external facilitation strategy intended to increase the use of cognitive behavioral therapy within Veterans Affairs clinics was a promising and low-cost strategy; such pilot data would likely bolster reviewers' confidence that the strategy is feasible, scalable, and, ultimately, sustainable. Investigators should also plan to document any modifications to the intervention and, if possible, incorporate adaptation models into the implementation process, because interventions are rarely implemented without being modified [67, 106].

While providing detailed specification of theory-based implementation strategies is critical, it is also imperative that investigators acknowledge the complexity of implementation processes. Aarons and Palinkas [ 107 ] comment:

‘It is unrealistic to assume that implementation is a simple process, that one can identify all of the salient concerns, be completely prepared, and then implement effectively without adjustments. It is becoming increasingly clear that being prepared to implement EBP means being prepared to evaluate, adjust, and adapt in a continuing process that includes give and take between intervention developers, service system researchers, organizations, providers, and consumers.’

Ultimately, proposals that reflect the PI’s understanding of the complexity of the process of implementing evidence-based practices and that provide supporting detail about strategies and processes will be perceived as more feasible to complete through the proposed methods.

Team experience with the setting, treatment, implementation process, and research environment

Grant reviewers are asked to specifically assess a PI’s capacity to successfully complete a proposed study. Grant applications that convey the team’s experience with the study setting, the treatment whose implementation is being studied, and implementation processes help convey capacity and feasibility to complete an implementation research project [ 108 ].

Note that NIH scores the investigator team and the research environment separately (http://grants.nih.gov/grants/writing_application.htm), but the purpose of both sections is to demonstrate capacity to carry out the study as proposed. Investigators can convey capacity in a variety of ways. Chief among them is building a strong research team whose members bring depth and experience in areas the PI does not yet have. Implementation research exemplifies multidisciplinary team science, informed by a diverse range of substantive and methodological fields [96, 109]. A team that brings the needed disciplines and skill sets directly to the project enhances its likelihood of success. Early-stage implementation researchers who collaborate or partner with senior investigators reassure reviewers that the proposed work will benefit from the senior team members' experience and expertise. Similarly, collaborators play important roles in complementing, or rounding out, the PI's disciplinary perspective and methodological skill set. Early-career investigators, therefore, should surround themselves with more established colleagues who bring knowledge and experience in areas key to the study aims and methods. The narrative should cite team members' relevant work, and their prior work can be addressed in a discussion of preliminary studies. Additionally, the new formats for NIH biosketches and budget justifications enable a clear portrayal of what each team member brings to the proposed study.

For NIH applications, the research environment is detailed in the resources and environment section of the application. Here, an investigator can describe the setting's track record in implementation research; the research centers, labs, and offices the PI can draw on; and structural and historic ties to healthcare settings. For example, a PI can describe how the project will draw upon the university's CTSA program [110], statistics or design labs, established pools of research staff, and health services research centers. Preliminary studies and biosketches provide additional ways to convey the strengths of the environment and context within which the proposed study will be launched.

In summary, researchers need to detail the strengths of the research environment, emphasizing in particular the resources, senior investigators, and research infrastructure that can contribute to the success of the proposed study. A strong research environment is especially important for implementation research, which is typically team-based, requires expertise of multiple disciplines, and requires strong relationships between researchers and community based health settings. Investigators who are surrounded by experienced implementation researchers, working in a setting with strong community ties, and drawing on experienced research staff can inspire greater confidence in the proposed study’s likelihood of success.

Feasibility of proposed research design and methods

One of the most important functions of preliminary work is to demonstrate the feasibility of the proposed research design and methods. Landsverk [ 108 ] urges PIs to consider every possible question reviewers might raise, and to explicitly address those issues in the application. Data from small feasibility studies or pilot work around referral flow; participant entry into the study; participant retention; and the extent to which key measures are understood by participants, acceptable for use, and capture variability can demonstrate that the proposed methods are likely to work. The methods section should contain as much detail as possible, as well as lay out possible choice junctures and contingencies, should methods not work as planned. It is not only important to justify methodological choices, but also to discuss why potential alternatives were not selected. For example, if randomization is not feasible or acceptable to stakeholders, investigators should make that clear. Letters from study site collaborators can support, but should not replace, the narrative’s detail on study methods. For example, letters attesting the willingness of study sites to be randomized or to support recruitment for the proposed timeframe can help offset reviewer concerns about some of the real-world challenges of launching implementation studies.

Measurement and analysis

A grant application must specify a measurement plan for each construct in the study's overarching conceptual model or guiding theory, whether those constructs pertain to implementation strategies, the context of implementation, stakeholder preferences and priorities, or implementation outcomes [111]. Yet crafting the study approach section is complicated by the current lack of consensus on methodological approaches to studying implementation processes, measuring implementation context and outcomes, and testing implementation strategies [112, 113]. Measurement is a particularly important aspect of study methods, because it determines the quality of data. Unlike efficacy and effectiveness studies, implementation research often involves some customization of an intervention to fit the local context; accordingly, measurement plans need to address the intervention's degree of customization versus fidelity [97]. Moreover, implementation science encompasses a broad range of constructs from a variety of disciplines, with little standardization of measures or agreement on definitions of constructs across studies, fields, authors, or research groups, further compounding the burden of presenting a clear and robust measurement plan along with its rationale. Two current initiatives seek to advance the harmonization, standardization, and rigor of measurement in implementation science: the U.S. National Cancer Institute's (NCI) Grid-Enabled Measures (GEM) portal [114] and the Comprehensive Review of Dissemination and Implementation Science Instruments effort supported by the Seattle Implementation Research Conference (SIRC) at the University of Washington [115]. Both initiatives engage the implementation science research community to enhance the quality and harmonization of measures, and their respective websites are being populated with measures and ratings, affording grant writers an invaluable resource for addressing a key methodological challenge.

Key challenges in crafting the analysis plan for implementation studies include: determining the unit of analysis, given the 'action' at the individual, team, organizational, and policy levels; shaping mediational analyses, given the role of contextual variables; and developing and using appropriate methods for characterizing the speed, quality, and degree of implementation. The proposed study's design, assessment tools, analytic strategies, and analytic tools must address these challenges in some manner [113]. Grant applications that propose to test implementation strategies or processes often provide preliminary data from small-scale pilot studies to examine feasibility and assess sources of variation. However, the effect size targeted in a full-scale trial should be grounded in clinical relevance rather than in the magnitude of effects observed in small pilots [113], given the uncertainty of power calculations based on small-scale studies [116].
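A brief worked example (our own sketch, using the standard normal-approximation sample size formula rather than any calculation from the cited studies) illustrates why pilot-based power calculations are fragile. For a two-arm comparison with standardized effect size d, the approximate per-group sample size is

n ≈ 2 (z_{1-α/2} + z_{1-β})² / d²,

which, for 80% power at a two-sided α = 0.05, reduces to n ≈ 2(1.96 + 0.84)² / d² ≈ 15.7 / d². A pilot estimate of d = 0.5 would imply roughly 63 participants per group, whereas a true effect of d = 0.3 would require roughly 174 per group; a pilot of 15 to 30 participants cannot reliably distinguish between these scenarios, which is why target effect sizes are better anchored in clinical relevance than in pilot estimates.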

Policy/funding environment; leverage or support for sustaining change

PIs should ensure that grant applications reflect their understanding of the policy and funding context of the implementation effort. Health policies differ in many ways that impact quality [ 117 ], and legal, reimbursement, and regulatory factors affect the adoption and sustainability of evidence-based treatments [ 118 ]. Raghavan et al. [ 119 ] discuss the policy ecology of implementation, and emphasize that greater attention should be paid to marginal costs associated with implementing evidence-based treatments, including expenses for provider training, supervision, and consultation. Glasgow et al. [ 120 ] recently extended their heretofore behaviorally focused RE-AIM framework for public health interventions to health policies, revealing the challenges associated with policy as a practice-change lever.

PIs can address the policy context of the implementation initiative through the narrative, background literature, letters of support, and the resource and environment section. Proposals that address how the implementation initiative aligns with policy trends enhance their likelihood of being viewed as having high public health significance, as well as greater practical impact, feasibility, and sustainability. It is important to note that it may behoove investigators to address the policy context within a proposal even if it is not likely to be facilitative of implementation, because it demonstrates to reviewers that the investigator is not naïve to the challenges and barriers that exist at this level.

Summary

We have identified and discussed ten key ingredients of implementation research grant proposals. The paper reflects our team's experience and expertise in writing for federal funding agencies in the United States; we acknowledge that this focus will be a strength for some readers and a limitation for international readers, whom we encourage to contribute additional perspectives. Setting the stage with careful background detail and preliminary data may be especially important for implementation research, which poses a unique set of challenges that investigators should anticipate and demonstrate their capacity to manage. Data to set the stage for implementation research may be collected by the study team through preliminary, feasibility, or pilot studies, or the team may draw on others' work, citing background literature to establish readiness for the proposed research.

Every PI struggles with the challenge of fitting into a page-limited application the research background, methodological detail, and information that can convey the project’s feasibility and likelihood of success. The relative emphasis on, and thus length of text addressing, the various sections of a grant proposal varies with the program mechanism, application ‘call,’ and funding source. For NIH applications, most attention and detail should be allocated to the study method because the ‘approach’ section is typically weighted most heavily in scoring. Moreover, the under-specification or lack of detail in study methodology usually receives the bulk of reviewer criticism. Well-constructed, parsimonious tables, logic models, and figures reflecting key concepts and the analytic plan for testing their relationships all help add clarity, focus reviewers, and prevent misperceptions. All implementation research grants need to propose aims, study questions, or hypotheses whose answers will advance implementation science. Beyond this fundamental grounding, proposed implementation studies should address most, if not all, of the ingredients identified here. While no application can include a high level of detail about every ingredient, addressing these components can help assure reviewers of the significance, feasibility, and impact of the proposed research.

[a] For more information regarding different grant mechanisms, please see: http://grants.nih.gov/grants/funding/funding_program.htm.

Authors’ information

EKP directs the Center for Mental Health Services Research at Washington University in St. Louis (NIMH P30 MH085979), the Dissemination and Implementation Research Core (DIRC) of the Washington University Institute of Clinical and Translational Sciences (NCRR UL1RR024992), and the Implementation Research Institute (NIMH R25 MH080916).

Implementation Science. http://www.implementationscience.com ,

Institute of Medicine: Initial national priorities for comparative effectiveness research. 2009, Washington, DC: The National Academies Press

Google Scholar  

Agency for Health Care Research and Quality's Essentials of the Research Plan. http://www.ahrq.gov/fund/esstplan.htm#Preliminary ,

National Institutes of Health Grant Cycle. http://www.niaid.nih.gov/researchfunding/grant/cycle/Pages/part05.aspx ,

Feldstein AC, Glasgow RE: A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Joint Commission on Accreditation of Healthcare Organizations. 2008, 34: 228-243.

Researching Implementation and Change while Improving Quality (R18). http://grants.nih.gov/grants/guide/pa-files/PAR-08-136.html ,

Dissemination and Implementation Research in Health (R01). http://grants.nih.gov/grants/guide/pa-files/PAR-10-038.html ,

Dissemination and Implementation Research in Health (R03). http://grants.nih.gov/grants/guide/pa-files/PAR-10-039.html ,

Dissemination and Implementation Research in Health (R21). http://grants.nih.gov/grants/guide/pa-files/PAR-10-040.html ,

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009, 4 (50): 1-15.

Stetler CB, Mittman BS, Francis J: Overview of the VA quality enhancement research inititative (QUERI) and QUERI theme articles: QUERI series. Implementation Science. 2008, 3: 1-9. 10.1186/1748-5908-3-1.

Institute of Medicine: Preventing mental, emotional, and behavioral disorders among young people: Progress and possibilities. 2009, Washington, DC: National Academies Press

Healthy People. 2020, http://www.healthypeople.gov/2020/default.aspx ,

Kitson A, Straus SE: Identifying the knowledge-to-action gaps. Knowledge Translation in Health Care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 60-72.

Burns BJ, Phillips SD, Wagner HR, Barth RP, Kolko DJ, Campbell Y, Landsverk J: Mental health need and access to mental health services by youths involved with child welfare: a national survey. J Am Acad Child Adolesc Psychiatry. 2004, 43: 960-970. 10.1097/01.chi.0000127590.95585.65.

PubMed   Google Scholar  

McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA: The quality of health care delivered to adults in the United States. N Engl J Med. 2003, 348: 2635-2645. 10.1056/NEJMsa022615.

Raghavan R, Inoue M, Ettner SL, Hamilton BH: A preliminary analysis of the receipt of mental health services consistent with national standards among children in the child welfare system. Am J Public Health. 2010, 100: 742-749. 10.2105/AJPH.2008.151472.

PubMed   PubMed Central   Google Scholar  

Wang PS, Berglund P, Kessler RC: Recent care of common mental disorders in the United States. J Gen Intern Med. 2000, 15: 284-292. 10.1046/j.1525-1497.2000.9908044.x.

CAS   PubMed   PubMed Central   Google Scholar  

Zima BT, Hurlburt MS, Knapp P, Ladd H, Tang L, Duan N, Wallace P, Rosenblatt A, Landsverk J, Wells KB: Quality of publicly-funded outpatient specialty mental health care for common childhood psychiatric disorders in California. J Am Acad Child Adolesc Psychiatry. 2005, 44: 130-144. 10.1097/00004583-200502000-00005.

Brook BS, Dominici F, Pronovost PJ, Makary MA, Schneider E, Pawlik TM: Variations in surgical outcomes associated with hospital compliance with safety. Surgery. 2012, 151: 651-659. 10.1016/j.surg.2011.12.001.

Aarons GA: Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004, 6: 61-74.

Aarons GA, Cafri G, Lugo L, Sawitzky A: Expanding the domains of attitudes towards evidence-based practice: The Evidence Based Attitudes Scale-50. Administration and Policy in Mental Health and Mental Health Services Research. 2012, 5: 331-340.

Glasgow RE, Vogt TM, Boles SM: Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Public Health. 1999, 89: 1322-1327. 10.2105/AJPH.89.9.1322.

Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M: Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2010, 38: 65-76.

PubMed Central   Google Scholar  

Bond GR, Drake R, Becker D: Beyond evidence-based practice: Nine ideal features of a mental health intervention. Research on Social Work Practice. 2010, 20: 493-501. 10.1177/1049731509358085.

Rogers EM: Diffusion of Innovations. 2003, New York: Free Press, 5

Grol R, Wensing M: Characteristics of successful innovations. Improving patient care: The implementation of change in clinical practice. Edited by: Grol R, Wensing M, Eccles M. 2005, Edinburgh: Elsevier, 60-70.

Diner BM, Carpenter CR, O'Connell T, Pang P, Brown MD, Seupaul RA, Celentano JJ, Mayer D: Graduate medical education and knowledge translation: Role models, information pipelines, and practice change thresholds. Acad Emerg Med. 2007, 14: 1008-1014.

Westfall JM, Mold J, Fagnan L: Practice-based research: ‘Blue Highways’ on the NIH roadmap. JAMA. 2007, 297: 403-406. 10.1001/jama.297.4.403.

CAS   PubMed   Google Scholar  

Kleinman MS, Mold JW: Defining the components of the research pipeline. Clin Transl Sci. 2009, 2: 312-314. 10.1111/j.1752-8062.2009.00119.x.

Oxman AD: Grading quality of evidence and strength of recommendations. BMJ. 2004, 328: 1490-1494.

Ebell MH, Siwek J, Weiss BD, Woolf SH, Susman J, Ewigman B, Bowman M: Strength of recommendation taxonomy (SORT): A patient-centered approach to grading evidence in the medical literature. J Am Board Fam Pract. 2004, 17: 59-67. 10.3122/jabfm.17.1.59.

Roth A, Fonagy P: What works for whom? A critical review of psychotherapy research. 2005, New York: Guilford

Weissman MM, Verdeli H, Gameroff MJ, Bledsoe SE, Betts K, Mufson L, Fitterling H, Wickramaratne P: National survey of psychotherapy training in psychiatry, psychology, and social work. Arch Gen Psychiatry. 2006, 63: 925-934. 10.1001/archpsyc.63.8.925.

Chambless DL, Baker MJ, Baucom DH, Beutler LE, Calhoun KS, Crits-Christoph P, Daiuto A, DeRubeis R, Detweiler J, Haaga DAF: Update on empirically validated therapies, II. The Clinical Psychologist. 1998, 51: 3-16.

Cochrane Effective Practice and Organisation of Care group: Data collection checklist. 2002, EPOC measures for review authors

Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D: Evaluability assessment to improve public health policies, programs, and practices. Annu Rev Public Health. 2010, 31: 213-233. 10.1146/annurev.publhealth.012809.103625.

The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implementation Science. 2006, 1 (4): 1-8.

Davies P, Walker AE, Grimshaw JM: A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implementation Science. 2010, 5: 1-6. 10.1186/1748-5908-5-1.

Michie S, Fixsen D, Grimshaw JM, Eccles MP: Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implementation Science. 2009, 4 (40): 1-6.

McDonald KM, Graham ID, Grimshaw J: Toward a theoretical basis for quality improvement interventions. Closing the quality gap: A critical analysis of quality improvement strategies. Edited by: Shojania KG, McDonald KM, Wachter RM, Owens DK. 2004, Rockville, MD: Agency for Healthcare Research and Quality, 27-40.

Prochaska JO, Velicer WF: The transtheoretical model of health behavior change. Am J Health Promot. 1997, 12: 38-48. 10.4278/0890-1171-12.1.38.

Kluger AN, DeNisi A: The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996, 119: 254-284.

McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus SE: A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: A Tower of Babel?. Implementation Science. 2010, 5: 1-11. 10.1186/1748-5908-5-1.

Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL: A glossary for dissemination and implementation research in health. Journal of Public Health Management and Practice. 2008, 14: 117-123.

Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B: Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009, 36: 24-34. 10.1007/s10488-008-0197-4.

Stetler CB, McQueen L, Demakis J, Mittman BS: An organizational framework and strategic implementation for systems-level change to enhance research-based practice: QUERI series. Implementation Science. 2008, 3: 1-11. 10.1186/1748-5908-3-1.

Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011, 38: 4-23. 10.1007/s10488-010-0327-7.

Magnabosco JL: Innovations in mental health services implementation: A report on state-level data from the U.S. evidence-based practices project. Implementation Science. 2006, 1: 1-11. 10.1186/1748-5908-1-1.

Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A: Making psychological theory useful for implementing evidence based practice: A consensus approach. Qual Saf Health Care. 2005, 14: 26-33. 10.1136/qshc.2004.011155.

Grol R, Wensing M, Hulscher M, Eccles M: Theories on implementation of change in healthcare. Improving patient care: The implementation of change in clinical practice. Edited by: Grol R, Wensing M, Eccles M. 2005, Edinburgh: Elsevier, 15-40.

Grol R, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M: Planning and studying improvement in patient care: The use of theoretical perspectives. Milbank Q. 2007, 85: 93-138. 10.1111/j.1468-0009.2007.00478.x.

Denis J-L, Lehoux P: Organizational theory. Knowledge translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 215-225.

Graham ID, Tetroe J, KT Theories Group: Planned action theories. Knowledge translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 185-195.

Hutchinson A, Estabrooks CA: Cognitive psychology theories of change. Knowledge translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 196-205.

Hutchinson A, Estabrooks CA: Educational theories. Knowledge translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 206-214.

Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB: Interventions in organizational and community context: A framework for building evidence on dissemination and implementation research. Adm Policy Ment Health. 2008, 35: 21-37. 10.1007/s10488-007-0144-9.

National Advisory Mental Health Council's Services Research and Clinical Epidemiology Workgroup: The road ahead: Research partnerships to transform services. 2006, Bethesda, Maryland: National Institute of Mental Health

Lindamer LA, Lebowitz B, Hough RL, Garcia P, Aguirre A, Halpain MC: Establishing an implementation network: Lessons learned from community-based participatory research. Implementation Science. 2009, 4 (17): 1-7.

Chen PG, Diaz N, Lucas G, Rosenthal MS: Dissemination of results in community-based participatory research. Am J Prev Med. 2010, 39: 372-378. 10.1016/j.amepre.2010.05.021.

Wallerstein N, Duran B: Community-based participatory research contributions to intervention research: The intersection of science and practice to improve health equity. Am J Public Health. 2010, 100: S40-S46. 10.2105/AJPH.2009.184036.

Kerner J, Rimer B, Emmons K: Dissemination research and research dissemination: How can we close the gap?. Health Psychol. 2005, 24: 443-446.

Brugha R, Varvasovszky Z: Stakeholder analysis: A review. Health Policy Plan. 2000, 15: 239-246. 10.1093/heapol/15.3.239.

Varvasovszky Z, Brugha R: How to do (or not to do) a stakeholder analysis. Health Policy Plan. 2000, 15: 338-345. 10.1093/heapol/15.3.338.

Chambers DA: Advancing the science of implementation: A workshop summary. Administration and Policy in Mental Health and Mental Health Services Research. 2008, 35: 3-10. 10.1007/s10488-007-0146-7.

Glasgow RE, Klesges LM, Dzewaltowski DA, Bull SS, Estabrooks P: The future of health behavior change research: What is needed to improve translation of research into health promotion practice. Ann Behav Med. 2004, 27: 3-12. 10.1207/s15324796abm2701_2.

Schoenwald SK, Hoagwood K: Effectiveness, transportability, and dissemination of interventions: What matters when?. Psychiatr Serv. 2001, 52: 1190-1197. 10.1176/appi.ps.52.9.1190.

Training institute for dissemination and implementation research in health. http://conferences.thehillgroup.com/OBSSRinstitutes/TIDIRH2011/index.html

Dearing J: Evolution of diffusion and dissemination theory. J Public Health Manag Pract. 2008, 14: 99-108.

Weiner BJ, Amick H, Lee S-YD: Conceptualization and measurement of organizational readiness for change: A review of the literature in health services research and other fields. Medical Care Research and Review. 2008, 65: 379-436. 10.1177/1077558708317802.

Stamatakis K: Measurement properties of a novel survey to assess stages of organizational readiness for evidence-based practice in community prevention programs. 4th Annual National Institutes of Health Conference on the Science of Dissemination and Implementation. 2011, Bethesda, Maryland

Gagnon M-P, Labarthe J, Legare F, Ouimet M, Estabrooks CA, Roch G, Ghandour EK, Grimshaw J: Measuring organizational readiness for knowledge translation in chronic care. Implementation Science. 2011, 6 (72): 1-10.

Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, Green P: Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Health. 2008, 35: 98-113. 10.1007/s10488-007-0148-5.

Larson E: A tool to assess barriers to adherence to hand hygiene guideline. Am J Infect Control. 2004, 32: 48-51. 10.1016/j.ajic.2003.05.005.

Grol R, Wensing M: What drives change? Barriers to and incentives for achieving evidence-based practice. Medical Journal of Australia. 2004, 180: S57-S60.

Légaré F: Assessing barriers and facilitators to knowledge use. Knowledge translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Hoboken, NJ: Wiley-Blackwell, 83-93.

Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud P-AC, Rubin HR: Why don't physicians follow clinical practice guidelines?. JAMA. 1999, 282: 1458-1465. 10.1001/jama.282.15.1458.

Forsner T, Hansson J, Brommels M, Wistedt AA, Forsell Y: Implementing clinical guidelines in psychiatry: A qualitative study of perceived facilitators and barriers. BMC Psychiatry. 2010, 10: 1-10. 10.1186/1471-244X-10-1.

Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, Callaghan J, Holter M: Barriers to evidence-based practice implementation: Results of a qualitative study. Community Ment Health J. 2010, 46: 112-118. 10.1007/s10597-009-9238-z.

Manuel JI, Mullen EJ, Fang L, Bellamy JL, Bledsoe SE: Preparing social work practitioners to use evidence-based practice: A comparison of experiences from an implementation project. Research on Social Work Practice. 2009, 19: 613-627. 10.1177/1049731509335547.

Chenot J-F, Scherer M, Becker A, Donner-Banzhoff N, Baum E, Leonhardt C, Keller S, Pfingsten M, Hildebrandt J, Basler H-D, Kochen MM: Acceptance and perceived barriers of implementing a guideline for managing low back pain in general practice. Implementation Science. 2008, 3: 1-6. 10.1186/1748-5908-3-1.

Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC: Barriers to evidence-based decision making in public health: A national survey of chronic disease practitioners. Public Health Rep. 2010, 125: 736-742.

Wensing M, Grol R: Methods to identify implementation problems. Improving Patient Care: The implementation of change in clinical practice. Edited by: Grol R, Wensing M, Eccles M. 2005, Edinburgh: Elsevier, 109-120.

Funk SG, Champagne MT, Wiese RA, Tornquist EM: BARRIERS: The barriers to research utilization scale. Applied Nursing Research. 1991, 4: 39-45.

Grol R, Wensing M, Eccles M: Improving patient care: The implementation of change in clinical practice. 2005, Edinburgh: Elsevier

Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL: A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012, 69: 123-157. 10.1177/1077558711430690.

Straus S, Tetroe J, Graham ID: Knowledge translation in health care: Moving from evidence to practice. 2009, Hoboken, NJ: Wiley-Blackwell

Recommendations to improve reporting of the content of behaviour change interventions. http://interventiondesign.co.uk/

Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S: Publication guidelines for quality improvement in health care: Evolution of the SQUIRE project. Qual Saf Health Care. 2008, 17: i3-i9. 10.1136/qshc.2008.029066.

Equator Network. http://www.equator-network.org/

Goeschel CA, Weiss WM, Pronovost PJ: Using a logic model to design and evaluate quality and patient safety improvement programs. Int J Qual Health Care. 2012, 24: 330-337.

Implementation Research Institute. http://cmhsr.wustl.edu/Training/IRI/Pages/ImplementationResearchTraining.aspx

Criteria for peer review of D/I funding applications. Implementation Research Institute. Edited by: Mittman BS. 2010, St. Louis, Missouri

Mauskopf JA, Sullivan SD, Annemans L, Caro J, Mullins CD, Nuijten M, Orlewska E, Watkins J, Trueman P: Principles of good practice for budget impact analysis: Report of the ISPOR task force on good research practices: Budget impact analysis. Value in Health. 2007, 10: 336-347. 10.1111/j.1524-4733.2007.00187.x.

Raghavan R: The role of economic evaluation in dissemination and implementation research. Dissemination and implementation research in health: Translating science to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 94-113.

Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, Gasziou P, Ilott I, Kinmonth A-L, Leng G: An implementation research agenda. Implementation Science. 2009, 4: 1-7. 10.1186/1748-5908-4-1.

Glasgow RE: Critical measurement issues in translational research. Research on Social Work Practice. 2009, 19: 560-568. 10.1177/1049731509335497.

Wensing M, Weijden TVD, Grol R: Implementing guidelines and innovations in general practice: Which interventions are effective?. Br J Gen Pract. 1998, 48: 991-997.

Solberg LI, Brekke ML, Fazio CJ, Fowles J, Jacobsen DN, Kottke TE, Mosser G, O'Connor PJ, Ohnsorg KA, Rolnick SJ: Lessons from experienced guideline implementers: Attend to many factors and use multiple strategies. Joint Commission Journal on Quality Improvement. 2000, 26: 171-188.

Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L: Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004, 8 (6): 1-72.

Wensing M, Bosch M, Grol R: Selecting, tailoring, and implementing knowledge translation interventions. Knowledge Translation in health care: Moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Oxford, UK: Wiley-Blackwell, 94-113.

Baker R, Camosso-Stefanovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N: Tailored interventions to overcome identified barriers to change: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010, 3: CD005470.

Bartholomew LK, Parcel GS, Kok G, Gottlieb NH: Planning health promotion programs: An intervention mapping approach. 2011, San Francisco: Jossey-Bass

Michie S, van Stralen MM, West R: The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science. 2011, 6 (42): 1-11.

Kauth MR, Sullivan G, Blevins D, Cully JA, Landes RD, Said Q, Teasdale TA: Employing external facilitation to implement cognitive behavioral therapy in VA clinics: A pilot study. Implementation Science. 2010, 5 (75): 1-11.

Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, Silovsky JF, Hecht DB, Chaffin MJ: Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science. 2012, 7 (32): 1-9.

Aarons GA, Palinkas LA: Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007, 34: 411-419. 10.1007/s10488-007-0121-3.

Landsverk J: Creating interdisciplinary research teams and using consultants. The field research survivors guide. Edited by: Stiffman AR. 2009, New York: Oxford University Press, 127-145.

Institute of Medicine: The state of quality improvement and implementation research: Workshop summary. 2007, Washington, DC: The National Academies Press

Zerhouni EA, Alving B: Clinical and Translational Science Awards: A framework for a national research agenda. Transl Res. 2006, 148: 4-5. 10.1016/j.lab.2006.05.001.

Proctor EK, Brownson RC: Measurement issues in dissemination and implementation research. Dissemination and implementation research in health: Translating research to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 261-280.

Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M: Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011, 38: 65-76. 10.1007/s10488-010-0319-7.

Landsverk J, Brown CH, Chamberlain P, Palinkas LA, Ogihara M, Czaja S, Goldhaber-Fiebert JD, Rolls-Reutz JA, Horwitz SM: Design and analysis in dissemination and implementation research. Dissemination and implementation research in health: Translating research to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 225-260.

Grid-enabled measures database. https://www.gem-beta.org/Public/Home.aspx

Instrument review project: A comprehensive review of dissemination and implementation science instruments. http://www.seattleimplementation.org/sirc-projects/sirc-instrument-project/

Kraemer HC, Mintz J, Noda A, Tinklenberg J, Yesavage JA: Caution regarding the use of pilot studies to guide power calculations for study proposals. Arch Gen Psychiatry. 2006, 63: 484-489. 10.1001/archpsyc.63.5.484.

Institute of Medicine: Improving the quality of health care for mental and substance-use conditions. 2006, Washington, DC: National Academy Press

Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B: Implementation of evidence-based practice in behavioral health: Agency director perspectives. Adm Policy Ment Health. 2007, 34: 479-488. 10.1007/s10488-007-0129-8.

Raghavan R, Bright CL, Shadoin AL: Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science. 2008, 3: 1-9. 10.1186/1748-5908-3-1.

Jilcott S, Ammerman A, Sommers J, Glasgow RE: Applying the RE-AIM framework to assess the public health impact of policy change. Ann Behav Med. 2007, 34: 105-114. 10.1007/BF02872666.

Acknowledgements

Preparation of this paper was supported in part by National Center for Research Resources through the Dissemination and Implementation Research Core of Washington University in St. Louis’ Institute of Clinical and Translational Sciences (NCRR UL1 RR024992) and the National Institute of Mental Health through the Center for Mental Health Services Research (NIMH P30 MH068579), the Implementation Research Institute (NIMH R25 MH080916), and a Ruth L. Kirschstein National Research Service Award (NIMH T32 RR024992). An earlier version of this paper was an invited presentation at an early investigator workshop, held at the 4th Annual National Institutes of Health Conference on Advancing the Science of Dissemination and Implementation on March 22, 2011 in Bethesda, Maryland.

Author information

Authors and affiliations

Center for Mental Health Services Research, George Warren Brown School of Social Work, Washington University in St. Louis, Campus Box 1196, One Brookings Drive, St. Louis, MO, 63130, USA

Enola K Proctor, Byron J Powell, Ana A Baumann, Ashley M Hamilton & Ryan L Santens

Corresponding author

Correspondence to Enola K Proctor.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

EKP conceived the idea for this paper and led the writing. BJP, AAB, AMH, and RLS contributed to the conceptualization, literature review, and the writing of this manuscript. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Proctor, E.K., Powell, B.J., Baumann, A.A. et al. Writing implementation research grant proposals: ten key ingredients. Implementation Sci 7, 96 (2012). https://doi.org/10.1186/1748-5908-7-96

Received: 21 February 2012

Accepted: 04 October 2012

Published: 12 October 2012

DOI: https://doi.org/10.1186/1748-5908-7-96

Keywords

  • Implementation research
  • Grant writing
  • Preliminary studies

Although implementation science is often characterized as an emerging field, its bar for scientifically important questions is rising rapidly. Descriptive studies of barriers have dominated implementation science for too long, and the field is urged to 'move on' to questions of how and why implementation processes are effective. Accordingly, the Institute of Medicine has identified studies comparing the effectiveness of alternative dissemination and implementation strategies as a top-quartile priority for comparative effectiveness research. But experimental studies testing implementation strategies need to be informed by systematic background research on the contexts and processes of implementation. While investigators must demonstrate their understanding of these complexities, their grant proposals must balance feasibility with scientific impact. This paper addresses the challenges of preparing grant applications that succeed on these fronts. Though this article focuses on U.S. funding sources and grant mechanisms, the principles that are discussed should be relevant to implementation researchers internationally.

Guidance from Grant Program Announcements

Grant review focuses on the significance of proposed aims, impact and innovation, investigator capacity to conduct the study as proposed, and support for the study hypotheses and research design. The entire application should address these issues. Investigators early in their research careers or new to implementation science often struggle to demonstrate their capacity to conduct the proposed study and the feasibility of the proposed methods. Not all National Institutes of Health (NIH) program announcements require preliminary data. However, those that do are clear that applications must convey investigator training and experience, capacity to conduct the study as proposed, and support for the study hypotheses and research design. The more complex the project, the more important it is to provide evidence of capacity and feasibility.

The R01 grant mechanism is typically large in scope compared to the R03, R21, and R34 mechanisms. Program announcements for grant mechanisms that are preliminary to R01 studies give important clues as to how to set the stage for an R01 and demonstrate feasibility. Investigator capacity can be demonstrated by describing prior work, experience, and training relevant to the application's setting, substantive issues, and methodology, drawing on prior employment and research experience. For example, the NIH R03 small grant mechanism is often used to establish the feasibility of procedures, pilot test instruments, and refine data management procedures to be employed in a subsequent R01.

The NIH R21 and R34 mechanisms support the development of new tools or technologies; proof-of-concept studies; early-phase research that evaluates the feasibility, tolerability, acceptability, and safety of novel treatments; demonstrations of the feasibility of recruitment protocols; and the development of assessment protocols and manuals for programs and treatments to be tested in subsequent R01 studies. These exploratory grants do not require extensive background material or preliminary information; rather, they serve as vehicles for gathering the data needed to support subsequent R01 studies. These grant program announcements provide a long list of ways pre-R01 mechanisms can be used, and no single application can or should provide all the stage-setting work exemplified in these descriptions.

Review criteria, typically available on funding agency web sites or within program announcements, may vary slightly by funding mechanism. However, grants are typically reviewed and scored according to criteria such as significance, approach (feasibility, appropriateness, robustness), impact, innovation, the investigative team, and the research environment.

Table 1 summarizes the ten ingredients, provides a checklist for reviewing applications prior to submission, and ties each ingredient to one or more of the typical grant review criteria.

Table 1 Ten key ingredients for implementation research proposals

The literature does not provide a '…comprehensive, prescriptive, and robust, yet practical, model to help…researchers understand [what] factors need to be considered and addressed' in an R01 study. Therefore, we examined a variety of sources to identify recommendations and examples of background work that can strengthen implementation research proposals. This paper reflects our team's experience with early-career implementation researchers, specifically through training programs in implementation science and our work providing technical assistance in implementation research through our university's Clinical and Translational Science Award (CTSA) program. We also studied grant program announcements, notably the R03, R21, R18, and R01 program announcements in implementation science, and examined how successful implementation research R01 grant applications 'set the stage' for the proposed study in various sections of the proposal. We conducted a literature search using combinations of the following key words: 'implementation research', 'implementation studies', 'preliminary studies', 'preliminary data', 'pilot studies', 'pilot data', 'pilot', 'implementation stages', 'implementation phases', and 'feasibility'. We also drew on published studies describing the introduction and testing of implementation strategies and those that characterize key elements and phases of implementation research.
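
A search of this kind can be made reproducible by generating the keyword combinations programmatically. The sketch below is a minimal, hypothetical illustration: the term list is taken from the paragraph above, but the pairwise AND pairing and the quoting convention are our illustrative assumptions, not the authors' documented search protocol.

```python
# Hypothetical sketch of generating two-term search queries from the key
# words listed above; the pairing logic and query format are assumptions.
from itertools import combinations

terms = [
    "implementation research", "implementation studies",
    "preliminary studies", "preliminary data",
    "pilot studies", "pilot data", "pilot",
    "implementation stages", "implementation phases", "feasibility",
]

# Pair every term with every other term to form candidate queries.
queries = [f'"{a}" AND "{b}"' for a, b in combinations(terms, 2)]

for query in queries[:3]:  # print a few examples
    print(query)
print(f"{len(queries)} two-term combinations in total")  # 45
```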

From these reviews, we identified ten ingredients that are important in all implementation research grants:

1. The gap between usual care and evidence-based care;
2. The background of the evidence-based treatment to be implemented, its empirical base, and requisites;
3. The theoretical framework for implementation, with explicit theoretical justification for the choice of implementation strategies;
4. Information about stakeholders' (providers, consumers, policymakers) treatment priorities;
5. The setting's (and providers') readiness to adopt new treatments;
6. The implementation strategies planned or considered in order to implement evidence-based care;
7. The study team's experience with the setting, treatment, or implementation process, and the research environment;
8. The feasibility and requisites of the proposed methods;
9. The measurement and analysis of study variables; and
10. The health delivery setting's policy/funding environment and its leverage or support for sustaining change.

Given the sparse literature on the importance of preliminary studies for implementation science grant applications, we 'vetted' our list of grant application components with a convenience sample of experts. Ultimately, nine experts responded to our request, including six members of the Implementation Science editorial board. We asked the experts to rate the importance of each of the ten elements as '1: Very important to address this in the application', '2: Helpful but not necessary to the application', or '3: Not very important to address' within the context of demonstrating investigator capacity and study feasibility. Respondents were also asked whether there were any additional factors not listed.

While all ten ingredients below were considered important for a successful application, several experts noted that their importance varies according to the aims of the application. For example, one expert affirmed the importance of the setting's readiness to change but noted that it may not be crucial to address in a given proposal: 'the setting's readiness may be unimportant to establish or report prior to the study, because the study purpose may be to establish an answer to this question'. However, another maintained: 'in a good grant application, you have to dot all the I's and cross all the T's. I consider all these important'.

One expert noted that applications might need to argue the importance of implementation research itself, including the importance of closing or reducing gaps in the quality of care. This was viewed as particularly important when the study section reviewing the grant may not understand or appreciate implementation research. In these cases, it may be important to define implementation research and differentiate it from other types of clinical and health services research.

For example, it may be useful to situate one's proposal within the Institute of Medicine's 'prevention research cycle', which demonstrates the progression from pre-intervention, efficacy, and effectiveness research to dissemination and implementation studies that focus on the adoption, sustainability, and scale-up of interventions. It may also be important to convey that implementation research is very complex, necessitating the use of multiple methods, a high degree of stakeholder involvement, and a fair amount of flexibility in order to ensure that implementers will be able to respond appropriately to unforeseen barriers.

Ten Key Ingredients of a Competitive Implementation Research Grant Application

As emphasized at the beginning of this article, the essential ingredient in a successful implementation science proposal is a research question that is innovative and, when answered, can advance the field of implementation science. Assuming that an important question has been conveyed to potential reviewers, we propose that the following ten ingredients can help investigators demonstrate both their capacity to conduct the study and the feasibility of completing it as proposed. For each ingredient, we provide examples of how preliminary data, background literature, and narrative detail in the application can strengthen the application.

1. The Care Gap, or Quality Gap, Addressed in the Application

The primary rationale for all implementation efforts, and thus a key driver in implementation science, is discovering how to reduce gaps in healthcare access and quality or, from a public health perspective, how to reduce the gap between Healthy People 2020 goals and current health status. Accordingly, implementation research proposals should provide clear evidence that gaps exist and that there is room for improvement and impact through the proposed implementation effort. This is a primary way of demonstrating the public health significance of the proposed work.

Gaps in the quality of programs, services, and healthcare can be measured and documented at the population, organization, and provider levels. Several kinds of preliminary data can demonstrate the quality gap to be reduced through the proposed implementation effort. For example, investigators can emphasize the burden of disease through data that reflect its morbidity, mortality, quality of life, and cost. An implementation research grant should cite service system research that demonstrates unmet need, the wide variation in the use of evidence-based treatments in usual care, or the association between the burden of disease and variations in the use of guidelines.
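
To make the arithmetic of a quality gap concrete, the sketch below expresses it as the shortfall between a benchmark rate of evidence-based care and the rate observed in usual care. The function, parameter names, and all numbers are hypothetical illustrations, not data from any study cited here.

```python
# Illustrative only: a care gap expressed as the difference between a
# benchmark rate of evidence-based care and the observed rate in usual care.

def quality_gap(n_receiving_ebp: int, n_eligible: int, benchmark: float) -> float:
    """Return the shortfall between the benchmark and the observed rate."""
    observed_rate = n_receiving_ebp / n_eligible
    return benchmark - observed_rate

# Hypothetical numbers: 120 of 400 eligible patients received the
# evidence-based treatment, against a goal of 80% receipt.
gap = quality_gap(n_receiving_ebp=120, n_eligible=400, benchmark=0.80)
print(f"Observed rate: {120 / 400:.0%}; gap to benchmark: {gap:.0%}")
```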

Investigators can also document that few providers adopt evidence-based treatments, that evidence-based treatments or programs have limited reach, or that their penetration into a system of care is low, a gap the implementation study can address. Regardless of the specific approach to documenting a quality gap, investigators should use rigorous methods and involve all relevant stakeholders. In fact, stakeholders can demonstrate their involvement and endorse quality gaps through letters of support attesting to the lack of evidence-based services in usual care.

2. The Evidence-Based Treatment to be Implemented

A second key ingredient in implementation research proposals is the evidence-based program, treatment, policy, or set of services whose implementation will be studied in the proposed research. The research 'pipeline' contains a backlog of many effective programs and treatments waiting to be implemented. Moreover, many health settings experience a huge demand for better care. An appropriate evidence-based treatment contributes to the project's public health significance and practical impact, presuming, of course, that it will be studied in a way that contributes to implementation science.

Implementation research proposals must demonstrate that the evidence-based service is ready for implementation. The strength of the empirical evidence for a given guideline or treatment, a key part of 'readiness', can be demonstrated in a variety of ways; in some fields, specific thresholds must be met before an intervention is deemed 'evidence-based' or 'empirically supported'. For example, Chambless et al. suggest that interventions should demonstrate efficacy by being shown to be superior to placebo or to another treatment in at least two between-group design experiments, or by showing efficacy in a large series of single-case design experiments. Further, Chambless et al. note that the experiments must have been conducted with treatment manuals, the characteristics of the samples must have been clearly specified, and the effects must have been demonstrated by at least two different investigators or investigative teams.
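
These criteria amount to a conjunction of an efficacy condition and several rigor conditions, which the sketch below encodes as a simple decision rule. The field names and the single-case series threshold are our illustrative assumptions, not a validated instrument.

```python
# A hedged sketch encoding the Chambless et al. criteria quoted above as a
# decision rule; field names and the series threshold are assumptions.
from dataclasses import dataclass

@dataclass
class EvidenceBase:
    superior_between_group_trials: int  # trials beating placebo or a comparator
    single_case_series_n: int           # size of a single-case design series
    used_treatment_manuals: bool
    samples_clearly_specified: bool
    independent_teams: int              # distinct teams demonstrating the effects

def meets_chambless_criteria(e: EvidenceBase, large_series: int = 9) -> bool:
    """Efficacy via two between-group trials or a large single-case series,
    plus manuals, specified samples, and two independent teams."""
    efficacy = (e.superior_between_group_trials >= 2
                or e.single_case_series_n >= large_series)
    rigor = (e.used_treatment_manuals
             and e.samples_clearly_specified
             and e.independent_teams >= 2)
    return efficacy and rigor

# e.g., two superior trials, manuals, specified samples, two teams -> True
print(meets_chambless_criteria(EvidenceBase(2, 0, True, True, 2)))
```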

The strength of evidence for a given treatment can also be classified using the Cochrane EPOC criteria for levels of evidence, which consider randomized controlled trials, controlled clinical trials, time series designs, and controlled before-and-after studies as appropriate designs. Researchers who come to implementation research as effectiveness researchers or as program or treatment developers are well positioned, because they can point to their prior research as part of their own background work. Other researchers can establish readiness for implementation by reviewing evidence for the treatment or program as part of the background literature review, preferably relying on well-conducted systematic reviews and meta-analyses of randomized controlled trials (if available). At a minimum, an 'evaluability assessment' can help clarify what changes or improvements are needed to optimize effectiveness given the context of the implementation effort.

3. Conceptual Model and Theoretical Justification

Any research striving for generalizable knowledge should be guided by, and propose to test, conceptual frameworks, models, and theories. Yet theory has been drastically underutilized and underspecified in implementation research. For example, in a review of 235 implementation studies, fewer than 25% of the studies employed theory in any way, and only 6% were explicitly theory-based. While translating theory into research design is not an easy task, the absence of theory in implementation research has limited our ability to specify key contextual variables and to identify the precise mechanisms by which implementation strategies exert their effects.

McDonald et al. present a useful hierarchy of theories and models, which serves to organize the different levels of theory and specify the ways in which they can be useful in implementation research. They differentiate between conceptual models, frameworks, and systems, which are used to represent global ideas about a phenomenon, and theory, which is an 'organized, heuristic, coherent, and systematic set of statements related to significant questions that are communicated in a meaningful whole'. Within the realm of theory, they differentiate between grand or macro theories (e.g., Rogers' diffusion of innovations theory), mid-range theories (e.g., the transtheoretical model of change), and micro theories (e.g., feedback intervention theory). Though models, frameworks, and systems are generally at a higher level of abstraction than theories, it is important to note that the level of abstraction varies both between and within the categories of the hierarchy. The thoughtful integration of both conceptual models and theories can substantially strengthen an application.

Conceptual models, frameworks, and systems can play a critical role in anchoring a research study theoretically by portraying the key variables and relationships to be tested. Even studies that address only a subset of variables within a conceptual model need to be framed conceptually, so that reviewers perceive the larger context (and body of literature) that a particular study proposes to inform. Given the confusion surrounding definitions and terminology within the still-evolving field of dissemination and implementation, grant proposals need to employ consistent language, clear definitions for constructs, and the most valid and reliable measures for the constructs that correspond to the guiding conceptual framework or theoretical model.

Proposal writers should be cautioned that the theory or conceptual model used to frame the study must be used within the application. A mere mention will not suffice. A conceptual model can help frame study questions and hypotheses, anchor the background literature, clarify the constructs to be measured, and illustrate the relationships to be evaluated or tested. The application must also spell out how potential findings will inform the theory or model.

Numerous models and frameworks can inform implementation research. For example, Glasgow et al.'s RE-AIM framework can inform evaluation efforts in the area of implementation science. Similarly, Proctor et al. have proposed a model that informs evaluation by differentiating implementation, service system, and clinical outcomes, and identifying a range of implementation outcomes that can be assessed. Damschroder et al.'s Consolidated Framework for Implementation Research identifies five domains that are critical to successful implementation:

  • Intervention characteristics (evidentiary support, relative advantage, adaptability, trialability, and complexity);
  • The outer setting (patient needs and resources, organizational connectedness, peer pressure, external policy and incentives);
  • The inner setting (structural characteristics, networks and communications, culture, climate, readiness for implementation);
  • The characteristics of the individuals involved (knowledge, self-efficacy, stage of change, identification with organization, etc.); and
  • The process of implementation (planning, engaging, executing, reflecting, evaluating).

Others have published stage or phase models of implementation. For example, the Department of Veterans Affairs' QUERI initiative specifies a four-phase model spanning pilot projects, small clinical trials, regional implementation, and implementation on the national scale; and Aarons, Hurlburt, and Horwitz developed a four-phase model of exploration, adoption/preparation, active implementation, and sustainment. Magnabosco distinguishes between pre-implementation, initial implementation, and sustainability planning phases.

McDonald et al. note that grand theories are similar to conceptual models, and that they generally represent theories of change. They differentiate between classical models of change, which emphasize natural or passive change processes, such as Rogers' diffusion of innovations theory, and planned models of change, which specify central elements of active implementation efforts. Investigators may find it more helpful to draw from mid-range theories because they specify the mechanisms of change at various levels of the implementation context.

For example, social psychological theories, organizational theories, cognitive psychology theories, educational theories, and a host of others may be relevant to the proposed project. While conceptual models are useful in framing a study theoretically and providing a 'big picture' of the hypothesized relationships between variables, mid-range theories can be more helpful in justifying the selection of specific implementation strategies and in specifying the mechanisms by which they may exert their effects. Given the different roles that theory can play in implementation research, investigators would be wise to consider relevant theories at multiple levels of the theoretical hierarchy when preparing their proposals. It is far beyond the scope of this article to review conceptual models and theories in detail; however, several authors have produced invaluable syntheses of conceptual models and theories that investigators may find useful.

4. Stakeholder Priorities and Engagement in Change

Successful implementation of evidence-based interventions largely depends on their fit with the preferences and priorities of those who shape, deliver, and participate in healthcare. Stakeholders in implementation, and thus in implementation research, include treatment or guideline developers, researchers, administrators, providers, funders, community-based organizations, consumers, families, and perhaps legislators who shape reimbursement policies (see Mendel et al.'s article for a framework that outlines different levels of stakeholders). These stakeholders are likely to vary in their knowledge, perceptions, and preferences for healthcare. Their perspectives contribute substantially to the context of implementation and must be understood and addressed if the implementation effort is to succeed.

A National Institute of Mental Health Council workgroup report calls for the engagement of multiple stakeholder perspectives, from concept development to implementation, in order to improve the sustainability of evidence-based services in real-world practice. The engagement of key stakeholders in implementation research affects the impact of proposed implementation efforts, the sustainability of the proposed change, and the feasibility and ultimate success of the proposed research project. Thus, implementation research grant proposals should convey the extent and manner in which key stakeholders are engaged in the project.

Stakeholders and researchers can forge different types of collaborative relationships. Lindamer et al. describe three approaches that vary in the level of stakeholder and community participation in decisions about the research. In the 'community-targeted' approach, stakeholders are involved in recruitment and in the dissemination of the results. In the 'community-based' approach, stakeholders participate in the selection of research topics, but the researcher makes the final decision on the study design, methodology, and analysis of data. Finally, the 'community-driven' or community-based participatory research (CBPR) approach entails participation of the stakeholders in all aspects of the research. Some authors advocate for the CBPR model as a strategy to decrease the gap between research and practice, because it addresses some of the barriers to implementation and dissemination by enhancing the external validity of the research and promoting the sustainability of the intervention. Kerner et al. note:

'When community-based organizations are involved as full partners in study design, implementation, and evaluation of study findings, these organizations may be more amenable to adopting the approaches identified as being effective, as their tacit knowledge about 'what works' would have been evaluated explicitly through research'.

Stakeholder analysis can be carried out to evaluate and understand stakeholders' interests, interrelations, influences, preferences, and priorities. The information gathered from stakeholder analysis can then be used to develop strategies for collaborating with stakeholders, to facilitate the implementation of decisions or organizational objectives, or to understand the future of policy directions.
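
To illustrate, a stakeholder analysis is often summarized as a power/interest (or influence/interest) grid that maps each stakeholder to an engagement approach. The following is a minimal sketch in Python; the stakeholder names, scores, and thresholds are hypothetical illustrations, not values from the literature:

```python
# Minimal power/interest grid for a stakeholder analysis.
# Names, scores (1-5), and thresholds are hypothetical illustrations.
stakeholders = {
    "clinic director":      (5, 4),  # (influence, interest)
    "frontline providers":  (3, 5),
    "payer representative": (4, 2),
    "consumer advocate":    (2, 5),
}

def engagement_strategy(influence: int, interest: int) -> str:
    """Classify a stakeholder into a classic power/interest quadrant."""
    if influence >= 3 and interest >= 3:
        return "manage closely (engage as a full partner)"
    if influence >= 3:
        return "keep satisfied"
    if interest >= 3:
        return "keep informed"
    return "monitor"

for name, (influence, interest) in stakeholders.items():
    print(f"{name}: {engagement_strategy(influence, interest)}")
```

In practice, the scores would come from interviews or surveys, and the resulting quadrants would feed directly into the engagement plan described in the proposal.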

Implementation research grant applications are stronger when preliminary data, qualitative or quantitative, reflect stakeholder preferences around the proposed change. Engagement is also reflected in publications that the principal investigator (PI) and key stakeholders have shared in authorship, or in methodological details that reflect stakeholder priorities. Letters of support are a minimal reflection of stakeholder investment in the proposed implementation project.

5. Context: Setting's Readiness to Adopt New Services, Treatments, Programs

Implementation research proposals are strengthened by information that reflects the setting's readiness, capacity, or appetite for change, specifically around adoption of the proposed evidence-based treatment. This is not to say that all implementation research should be conducted in settings with a high appetite for change. Implementation research is often criticized for a disproportionate focus on settings that are eager and ready for change. 'Cherry picking' sites where change is virtually guaranteed, or studying implementation only with eager and early adopters, does not produce knowledge that generalizes to usual care, where change is often challenging. The field of implementation science needs information about the process of change where readiness varies, including settings where change is resisted.

Preliminary data on the organizational and policy context and its readiness for change can strengthen an application. Typically viewed as 'nuisance' variance to be controlled in efficacy and effectiveness research, contextual factors are key in implementation research. The primacy of context is reflected in the choice of 'it's all about context' as a theme at the 2011 NIH Training Institute in Dissemination and Implementation Research in Health. Because organization, policy, and funding context may be among the strongest influences on implementation outcomes, context needs to be examined front and center in implementation research. A number of scales are available to capture one key aspect of context, the setting's readiness or capacity for change. Weiner et al.'s extensive review of the conceptualization and measurement of organizational readiness for change identified 43 different instruments, though the authors acknowledged substantial problems with the reliability and validity of many of the measures. Due in part to these issues, work in this area is ongoing.
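
Given these reliability concerns, investigators who adopt a readiness scale may want to report its internal consistency in their own pilot sample. Below is a minimal sketch of a Cronbach's alpha computation; the item responses are fabricated for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) response matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Fabricated data: 6 staff members answering a 4-item readiness scale (1-5).
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```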

Other approaches to assessing readiness have focused on organizational culture, climate, and work attitudes, and on providers' attitudes towards evidence-based practices. Furthermore, a prospective identification of implementation barriers and facilitators can be helpful in demonstrating readiness for change, increasing reviewers' confidence that the PI has thoroughly assessed the implementation context, and informing the selection of implementation strategies (discussed in the following section). An evaluation of barriers and facilitators can be conducted through qualitative or survey methodology. In fact, a number of scales for measuring implementation barriers have been developed. Letters from agency partners or policy makers, while weaker than data, can also be used to convey the setting's readiness and capacity for change. Letters are stronger when they address the alignment of the implementation effort with setting or organizational priorities or with current or emergent policies.

6. Implementation Strategy/Process

Though the assessment of implementation barriers can play an important role in implementation research, the 'rising bar' in the field demands that investigators move beyond the study of barriers to research that generates knowledge about the implementation processes and strategies that can overcome them. Accordingly, the NIH has prioritized efforts to 'identify, develop, and refine effective and efficient methods, structures, and strategies to disseminate and implement' innovations in healthcare.

A number of implementation strategies have been identified and discussed in the literature. However, as the Improved Clinical Effectiveness through Behavioural Research Group notes, the most consistent finding from systematic reviews of implementation strategies is that most are effective some, but not all, of the time, and produce effect sizes ranging from no effect to a large effect. Our inability to determine how, why, when, and for whom these strategies are effective is hampered in large part by the absence of detailed descriptions of implementation strategies, the use of inconsistent language, and the lack of clear theoretical justification for the selection of specific strategies.

Thus, investigators should take great care in providing detailed descriptions of implementation strategies to be observed or empirically tested. Implementation Science has endorsed the use of the WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions as a means of improving the conduct and reporting of implementation research, and these recommendations will undoubtedly be useful to investigators whose proposals employ implementation strategies.

Investigators may also find the Standards for Quality Improvement Reporting Excellence (SQUIRE) helpful. Additional design-specific reporting guidelines can be found on the Equator Network website. The selection of strategies must be justified conceptually by drawing upon models and frameworks that outline critical implementation elements. Theory should be used to explain the mechanisms through which implementation strategies are proposed to exert their effects, and it may be helpful to clarify the proposed mechanisms of change by developing a logic model and illustrating the model in a figure.
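
One lightweight way to produce such a figure is to script the logic model so it can be revised as the conceptual model evolves. The sketch below uses the Python graphviz package (it assumes the Graphviz binaries are installed); the strategy, mechanism, and outcome labels are generic placeholders rather than a prescribed model:

```python
import graphviz  # pip install graphviz; also requires the Graphviz binaries

# Placeholder logic model: strategy -> mechanism -> implementation/clinical outcomes.
dot = graphviz.Digraph("logic_model", graph_attr={"rankdir": "LR"})
dot.node("S", "Implementation strategy\n(e.g., external facilitation)")
dot.node("M", "Hypothesized mechanism\n(e.g., provider self-efficacy)")
dot.node("I", "Implementation outcomes\n(e.g., adoption, fidelity)")
dot.node("C", "Clinical outcomes")
dot.edge("S", "M", label="theory-specified path")
dot.edge("M", "I")
dot.edge("I", "C")
dot.render("logic_model", format="png", cleanup=True)  # writes logic_model.png
```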

According to Brian Mittman, in addition to being theory-based, implementation strategies should be: multifaceted or multilevel (if appropriate); robust or readily adaptable; feasible and acceptable to stakeholders; compelling, saleable, trialable, and observable; sustainable; and scalable. We therefore emphasize taking stock of the budget impact of implementation strategies, as well as any cost and cost-effectiveness data related to them. Although budget impact is a key concern to administrators and some funding agencies require budget impact analysis, implementation science to date suffers from a dearth of economic evaluations from which to draw.

The empirical evidence for the effectiveness of multifaceted strategies has been mixed: early research touted the benefits of multifaceted strategies, while a systematic review of 235 implementation trials by Grimshaw et al. found no relationship between the number of component interventions and the effects of multifaceted interventions. However, Wensing et al. note that while multifaceted interventions are assumed to address multiple barriers to change, many focus on only one barrier.

For example, providing training and consultation is a multifaceted implementation strategy; however, it primarily serves to increase provider knowledge, and does not address other implementation barriers. Thus, Wensing et al. argue that multifaceted interventions could be more effective if they addressed different types of implementation barriers (e.g., provider knowledge and the organizational context). While the methods for tailoring clinical interventions and implementation strategies to local contexts need to be improved, intervention mapping and a recently developed 'behaviour change wheel' are two promising approaches.

Proposals that employ multifaceted and multilevel strategies that address prospectively identified implementation barriers may be more compelling to review committees, but mounting complex experiments may be beyond the reach of many early-stage investigators and many grant mechanisms. However, it is within the scope of R03-, R21-, and R34-supported research to develop implementation strategies and to conduct pilot tests of their feasibility and acceptability, work that can strengthen the case for sustainability and scalability. Proposal writers should provide preliminary work for implementation strategies in much the same way that intervention developers do, such as by providing manuals or protocols to guide their use and methods to gauge their fidelity. Such work is illustrated in the pilot study conducted by Kauth et al., which demonstrated that an external facilitation strategy intended to increase the use of cognitive behavioral therapy within Veterans Affairs clinics was a promising and low-cost strategy; such pilot data would likely bolster reviewers' confidence that the strategy is feasible, scalable, and ultimately, sustainable. Investigators should also plan to document any modifications to the intervention and, if possible, incorporate adaptation models into the implementation process, because interventions are rarely implemented without being modified.

While providing detailed specification of theory-based implementation strategies is critical, it is also imperative that investigators acknowledge the complexity of implementation processes. Aarons and Palinkas comment:

'It is unrealistic to assume that implementation is a simple process, that one can identify all of the salient concerns, be completely prepared, and then implement effectively without adjustments. It is becoming increasingly clear that being prepared to implement EBP means being prepared to evaluate, adjust, and adapt in a continuing process that includes give and take between intervention developers, service system researchers, organizations, providers, and consumers'.

Ultimately, proposals that reflect the PI's understanding of the complexity of the process of implementing evidence-based practices and that provide supporting detail about strategies and processes will be perceived as more feasible to complete through the proposed methods.

7. Team Experience with the Setting, Treatment, Implementation Process, and Research Environment

Grant reviewers are asked to specifically assess a PI's capacity to successfully complete a proposed study. Grant applications that convey the team's experience with the study setting, the treatment whose implementation is being studied, and implementation processes help convey capacity and feasibility to complete an implementation research project.

The reader should note that NIH scores the team's experience and the research environment separately (http://grants.nih.gov/grants/writing_application.htm), but the purpose of both sections is to demonstrate the capacity to successfully carry out the study as proposed. Investigators can convey capacity in a variety of ways. Chief among them is building a strong research team whose members bring depth and experience in areas the PI does not yet have. Implementation research exemplifies multidisciplinary team science, informed by a diverse range of substantive and methodological fields.

A team that brings the needed disciplines and skill sets directly to the project enhances the project's likelihood of success. Early-stage implementation researchers who collaborate or partner with senior investigators reassure reviewers that the proposed work will benefit from the senior team members' experience and expertise. Similarly, collaborators play important roles in complementing, or rounding out, the PI's disciplinary perspective and methodological skill set. Early-career investigators, therefore, should surround themselves with more established colleagues who bring knowledge and experience in areas key to the study aims and methods. The narrative should cite team members' relevant work, and their prior work can be addressed in a discussion of preliminary studies. Additionally, the new formats for NIH biosketches and budget justifications enable a clear portrayal of what each team member brings to the proposed study.

For NIH applications, the research environment is detailed in the resources and environment section of a grant application. Here, an investigator can describe the setting's track record in implementation research; the research centers, labs, and offices that the PI can draw on; and structural and historic ties to healthcare settings. For example, a PI can describe how the project will draw upon the university's CTSA program, statistics or design labs, established pools of research staff, and health services research centers. Preliminary studies and biosketches provide additional ways to convey the strengths of the environment and context within which an investigator will launch a proposed study.

In summary, researchers need to detail the strengths of the research environment, emphasizing in particular the resources, senior investigators, and research infrastructure that can contribute to the success of the proposed study. A strong research environment is especially important for implementation research, which is typically team-based, requires expertise of multiple disciplines, and requires strong relationships between researchers and community based health settings. Investigators who are surrounded by experienced implementation researchers, working in a setting with strong community ties, and drawing on experienced research staff can inspire greater confidence in the proposed study's likelihood of success.

8. Feasibility of Proposed Research Design and Methods

One of the most important functions of preliminary work is to demonstrate the feasibility of the proposed research design and methods. Landsverk urges PIs to consider every possible question reviewers might raise and to explicitly address those issues in the application. Data from small feasibility studies or pilot work around referral flow; participant entry into the study; participant retention; and the extent to which key measures are understood by participants, acceptable for use, and capture variability can demonstrate that the proposed methods are likely to work.

The methods section should contain as much detail as possible and lay out possible choice junctures and contingencies should methods not work as planned. It is important not only to justify methodological choices, but also to discuss why potential alternatives were not selected. For example, if randomization is not feasible or acceptable to stakeholders, investigators should make that clear. Letters from study site collaborators can support, but should not replace, the narrative's detail on study methods. For example, letters attesting to the willingness of study sites to be randomized or to support recruitment for the proposed timeframe can help offset reviewer concerns about some of the real-world challenges of launching implementation studies.

9. Measurement and Analysis

A grant application must specify a measurement plan for each construct in the study's overarching conceptual model or guiding theory, whether those constructs pertain to implementation strategies, the context of implementation, stakeholder preferences and priorities, or implementation outcomes. Yet crafting the study approach section is complicated by the current lack of consensus on methodological approaches to studying implementation processes, measuring implementation context and outcomes, and testing implementation strategies.
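
One simple safeguard is to maintain an explicit construct-to-measure map and check it against the conceptual model before submission. A minimal sketch follows; the construct and measure names are placeholders, not prescribed instruments:

```python
# Placeholder constructs and measures; substitute those from your own model.
model_constructs = {"readiness for change", "fidelity", "adoption", "leadership support"}

measurement_plan = {
    "readiness for change": "staff survey at baseline",
    "fidelity": "observer-rated checklist, monthly",
    "adoption": "administrative records, quarterly",
}

# Any construct in the model without a planned measure is flagged for the team.
unmeasured = model_constructs - measurement_plan.keys()
if unmeasured:
    print("Constructs lacking a measure:", ", ".join(sorted(unmeasured)))
```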

Measurement is a particularly important aspect of study methods, because it determines the quality of data. Unlike efficacy and effectiveness studies, implementation research often involves some customization of an intervention to fit the local context; accordingly, measurement plans need to address the intervention's degree of customization versus fidelity. Moreover, implementation science encompasses a broad range of constructs from a variety of disciplines, with little standardization of measures or agreement on construct definitions across studies, fields, authors, or research groups, which further compounds the burden of presenting a clear and robust measurement plan along with its rationale.

Two current initiatives seek to advance the harmonization, standardization, and rigor of measurement in implementation science: the U.S. National Cancer Institute's (NCI) Grid-Enabled Measures (GEM) portal and the Comprehensive Review of Dissemination and Implementation Science Instruments effort supported by the Seattle Implementation Research Conference (SIRC) at the University of Washington. Both initiatives engage the implementation science research community to enhance the quality and harmonization of measures. Their respective websites are being populated with measures and ratings, affording grant writers an invaluable resource for addressing a key methodological challenge.

Key challenges in crafting the analysis plan for implementation studies include: determining the unit of analysis, given the 'action' at individual, team, organizational, and policy levels; shaping mediational analyses, given the role of contextual variables; and developing and using appropriate methods for characterizing the speed, quality, and degree of implementation. The proposed study's design, assessment tools, analytic strategies, and analytic tools must address these challenges in some manner. Grant applications that propose to test implementation strategies or processes often provide preliminary data from small-scale pilot studies to examine feasibility and assess sources of variation. However, effect size estimates for power calculations should be grounded in clinical relevance rather than in the magnitude of effects observed in small pilots, given the uncertainty of estimates from small-scale studies.
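
For instance, when providers or patients are clustered within sites, a conventional power calculation can be anchored to a clinically meaningful effect size and then inflated by a design effect, rather than relying on a pilot's observed effect. A minimal sketch using statsmodels, with all numeric values illustrative:

```python
from statsmodels.stats.power import TTestIndPower

# Effect size chosen for clinical relevance, not estimated from a small pilot.
effect_size = 0.40  # Cohen's d judged clinically meaningful (illustrative)
n_per_arm = TTestIndPower().solve_power(effect_size=effect_size, alpha=0.05, power=0.80)

# Inflate for clustering of participants within sites (illustrative values).
cluster_size, icc = 10, 0.05
design_effect = 1 + (cluster_size - 1) * icc  # standard design-effect formula
print(f"Unadjusted n per arm: {n_per_arm:.0f}")
print(f"Cluster-adjusted n per arm: {n_per_arm * design_effect:.0f}")
```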

10. Policy/Funding Environment; Leverage or Support for Sustaining Change

PIs should ensure that grant applications reflect their understanding of the policy and funding context of the implementation effort. Health policies differ in many ways that impact quality, and legal, reimbursement, and regulatory factors affect the adoption and sustainability of evidence-based treatments. Raghavan et al. discuss the policy ecology of implementation, and emphasize that greater attention should be paid to marginal costs associated with implementing evidence-based treatments, including expenses for provider training, supervision, and consultation. Glasgow et al. recently extended their heretofore behaviorally focused RE-AIM framework for public health interventions to health policies, revealing the challenges associated with policy as a practice-change lever.

PIs can address the policy context of the implementation initiative through the narrative, the background literature, letters of support, and the resources and environment section. Proposals that address how the implementation initiative aligns with policy trends enhance their likelihood of being viewed as having high public health significance, as well as greater practical impact, feasibility, and sustainability. It may behoove investigators to address the policy context even when it is unlikely to facilitate implementation, because doing so demonstrates to reviewers that the investigator is not naïve to the challenges and barriers that exist at this level.

Summary

We have identified and discussed ten key ingredients in implementation research grant proposals. The paper reflects the team's experience and expertise in writing for federal funding agencies in the United States; we acknowledge that this focus will be a strength for some readers and a limitation for international readers, whom we encourage to contribute additional perspectives. Setting the stage with careful background detail and preliminary data may be especially important for implementation research, which poses a unique set of challenges that investigators should anticipate and demonstrate their capacity to manage. Data to set the stage for implementation research may be collected by the study team through preliminary, feasibility, or pilot studies, or the team may draw on others' work, citing background literature to establish readiness for the proposed research.

Every PI struggles with the challenge of fitting into a page-limited application the research background, methodological detail, and information that can convey the project's feasibility and likelihood of success. The relative emphasis on, and thus length of text addressing, the various sections of a grant proposal varies with the program mechanism, application 'call', and funding source. For NIH applications, most attention and detail should be allocated to the study method because the 'approach' section is typically weighted most heavily in scoring.

Moreover, the under-specification or lack of detail in study methodology usually receives the bulk of reviewer criticism. Well-constructed, parsimonious tables, logic models, and figures reflecting key concepts and the analytic plan for testing their relationships all help add clarity, focus reviewers, and prevent misperceptions. All implementation research grants need to propose aims, study questions, or hypotheses whose answers will advance implementation science. Beyond this fundamental grounding, proposed implementation studies should address most, if not all, of the ingredients identified here. While no application can include a high level of detail about every ingredient, addressing these components can help assure reviewers of the significance, feasibility, and impact of the proposed research.

Grant-writing resources

Concocting that Magic Elixir: Successful Grant Application Writing in Dissemination and Implementation Research by Ross Brownson, Graham Colditz, Maureen Dobbins, Karen Emmons, Jon Kerner, Margaret Padek, Enola Proctor and Kurt Stange.

This article explores the core competencies of dissemination and implementation (D&I) grant writing. It also includes a compilation of tips for writing a successful D&I proposal. While geared toward the US NIH funding processes, the competencies and tips should prove useful for other funding agencies & mechanisms.

Tips for Getting Funded from the Users’ Guide to Dissemination and Implementation in Health for Researchers and Practitioners

Checklists, tools, self-ratings and guides for successful proposals in dissemination and implementation (D&I) research developed by Enola Proctor, Ross Brownson, Byron Powell, Ana Baumann, Ashley Hamilton and Ryan Santens; and adapted for the internet by Amy Huebschmann, Chase Cameron, Demetria McNeal and Russell Glasgow.

Writing Implementation Research Grant Proposals: Ten Key Ingredients by Enola Proctor, Byron Powell, Ana Baumann, Ashley Hamilton and Ryan Santens

This article summarizes the key ingredients of an implementation research grant application with examples of how preliminary data, background literature and narrative details can strengthen the application.

Dissemination and Implementation Research at the National Cancer Institute: A Review of Funded Studies (2006–2019) and Opportunities to Advance the Field presents an analysis of NCI-funded DIRH grants and highlights areas in need of more research.

A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007-2014 by Jonathan Purtle, Rachel Peters and Ross C. Brownson.

Implementation science in cancer prevention and control: a decade of grant funding by the National Cancer Institute and future directions by Gila Neta, Michael A. Sanchez, David A. Chambers, Siobhan M. Phillips, Bryan Leyva, Laurie Cynkin, Margaret M. Farrell, Suzanne Heurtin-Roberts and Cynthia Vinson.

The Science of Implementation in Health and Healthcare (SIHH) Study Section reviews applications that identify, develop, and evaluate dissemination and implementation theories, strategies and methods designed to integrate evidence-based health interventions into public health, clinical, and community settings. Applications reviewed in SIHH should have a major methods, strategy, or theoretical development component in implementation science in order to understand how interventions are implemented and measure implementation outcomes in public health, clinical, and community settings.


TDR Implementation research toolkit

This module is designed as an aid to the development of a high-quality implementation research (IR) proposal by a research team. It draws extensively upon, and builds on, the content of the proposal development module in the first edition of this toolkit.

Although certain elements are common to various types of research proposals, some aspects are emphasized in this module to guide the development of a proposal that addresses the barriers to optimizing the effectiveness of a given health intervention, policy, or strategy, barriers that form the basis of an IR ‘problem’.

Writing implementation research grant proposals

If your team is embarking on the development of an IR proposal and is unsure where to begin, rest assured that you are not alone! Even defining the research question can seem overwhelming at the outset. The purpose of this module is to help team members understand the process and take each of the individual steps involved in writing an IR proposal.

The content and activities in this module are organized into a series of sections, each addressing a specific element of an IR proposal in a step-wise process. Respective sections comprise the following elements:

  • Identifying what you will accomplish by the end of each section.
  • Essential information to help you understand the specific steps in proposal writing.
  • Exercises to facilitate your understanding and put ideas into practice.
  • Reflection opportunities for you to consider specific issues in relation to your project, and explore how successive ideas should be incorporated into your team’s evolving proposal and thinking.

Overall, the module provides harmonized guidelines for proposal development, recognizing that an IR team includes members from diverse backgrounds. Many users are likely to be seasoned researchers or at least have some research experience.



Implementation science grant writing

Implementation science grant proposals require specific elements that are not common features in other grant types. Thankfully for those new to implementation science, to grant writing, or both, Dr. Enola Proctor and colleagues published an incredibly useful resource.

✪ Writing implementation research grant proposals: Ten key ingredients was published in the journal Implementation Science in 2012, and is a rich source of expert knowledge on writing successful grants in this young field.

Below you will find a high-level overview of this article, to serve as an introduction to thinking about this type of grant. The full article linked above is open access - be sure to read it!

Don’t miss the video adaptation of “10 Key Ingredients”, produced by the Colorado Clinical and Translational Sciences Institute.


Access/quality gap: It is not enough to simply document the burden of disease. Proposals need to demonstrate an unmet need: underuse or poor reach in the targeted population, or low adoption, wide variation, or poor penetration of evidence-based practices in usual care settings.

Evidence-based practice: The proposal must clearly establish that the evidence-based practice is ready to implement. The strength of the empirical evidence is a key part of determining readiness, yet how much evidence is necessary is socially determined. Note the importance of evidence of effectiveness in the targeted settings and populations. How much adaptation will be needed?

Conceptual model: Active and intentional use of theory is needed to produce generalizable knowledge and to situate the study within a larger body of knowledge. This is particularly needed for specifying the key contextual variables that influence implementation and the mechanisms by which strategies have their effects. A theoretically based conceptual model helps to frame study questions and hypotheses, clarify the constructs to be measured, and specify the relationships to be evaluated or tested.

[Figure: Key questions for implementation science grant applications.]

Stakeholder priorities and engagement: Engagement affects the impact of the proposed implementation effort, the sustainability of the proposed change, and the feasibility and ultimate success of the proposed project. Varying levels and forms of engagement are possible; the key is to demonstrate that engagement, and its impact, in the proposal.

Setting readiness: Stronger proposals include information that reflects the setting’s readiness for change, especially regarding the adoption and implementation of the proposed evidence-based practice. Readiness can be conveyed through a formal readiness assessment, a formative evaluation of implementation facilitators and barriers, and letters of support from the setting that address the alignment of the proposed implementation effort with organizational priorities and the policy and funding context.

Implementation strategy: Describe the strategy in detail (see our page Identify Implementation Strategies for more on this topic). Strategy selection should be justified conceptually and matched to known barriers (determinants) in the setting. Theory should be used to explain the mechanisms of action. The strategy should be robust, adaptable, feasible, acceptable, scalable and sustainable. It is important to know the cost and budget impact of selecting a particular strategy.

Team experience: Does the research team possess all of the relevant expertise? Does the team have experience with the study setting, the evidence-based practice, and the strategy?

Feasibility of the proposed research design and methods: Is there preliminary or pilot data? Is the approach well described? Have potential problems and alternative approaches been described?

Measurement and analysis: Does the conceptual framework inform the measures? Does the analysis plan align with the research questions? How well is the speed, quality, or degree of implementation captured? Is there appropriate attention to implementation outcomes not just clinical effectiveness outcomes?

Policy/funding environment: Proposals that address how the implementation effort aligns with policy trends enhance their likelihood of being perceived as having high public health significance, as well as greater practical impact, feasibility, and sustainability.

Open Access articles are marked with ✪. Please note that some journals require subscriptions to access a linked article.

Learn more:

An Evidence-Based Guide to Writing Grant Proposals for Clinical Research ( Annals of Internal Medicine , 2005)

✪ How research funding agencies support science integration into policy and practice: An international overview ( Implementation Science , 2014)

✪ Implementation science in cancer prevention and control: A decade of grant funding by the National Cancer Institute and future directions ( Implementation Science , 2015)

✪ Concocting that Magic Elixir: Successful Grant Application Writing in Dissemination and Implementation Research ( Clinical and Translational Science , 2015)

✪ A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007–2014 ( Implementation Science , 2016)

✪ A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops ( Implementation Science , 2016)

✪ Standardizing an approach to the evaluation of implementation science proposals ( Implementation Science , 2018)

💻 Webinars on Grant Development & Funding

What new investigators need to know about dissemination and implementation research

Implementation Science Funding Announcements

National Institutes of Health

Dissemination and Implementation Research in Health (R01 Clinical Trial Optional)

Dissemination and Implementation Research in Health (R21 Clinical Trial Optional)

Dissemination and Implementation Research in Health (R03)

Targeted Implementation Science to Achieve 90/90/90 Goals for HIV/AIDS Prevention and Treatment (R21 Clinical Trial Optional)

Strengthening the HIV Pre-Exposure Prophylaxis (PrEP) Care Continuum through Behavioral, Social, and Implementation Science (R01 Clinical Trial Optional)

Multi-Site Studies for System-Level Implementation of Substance Use Prevention and Treatment Services (R01 Clinical Trial Optional)

Agency for Healthcare Research and Quality

Funding Announcements Overview

Improving Management of Opioids and Opioid Use Disorder (OUD) in Older Adults (R18)

Patient-Centered Outcomes Research Institute

PCORI: Funding Opportunities

Example Funded Grants

Selection of NCI-Funded Implementation Science Grants


The Ultimate Grant Writing Guide (and How to Find and Apply for Grants)

Securing grants requires strategic planning. Identifying relevant opportunities, building collaborations, and crafting a comprehensive grant proposal are crucial steps. Read our ultimate guide on grant writing, finding grants, and applying for grants to get the funding for your research.

Updated on February 22, 2024


Embarking on a journey of groundbreaking research and innovation always requires more than just passion and dedication; it demands financial support. In the academic and research domains, securing grants is a pivotal factor in transforming these ideas into tangible outcomes.

Grant awards not only offer the backing needed for ambitious projects but also stand as a testament to the importance and potential impact of your work. The process of identifying, pursuing, and securing grants, however, is riddled with nuances that necessitate careful exploration. 

Whether you're a seasoned researcher or a budding academic, navigating this complex world of grants can be challenging, but we’re here to help. In this comprehensive guide, we'll walk you through the essential steps of applying for grants, providing expert tips and insights along the way.

Finding grant opportunities 

Prior to diving into the application phase, the process of finding grants involves researching and identifying those that are relevant and realistic for your project. While the initial step may seem as simple as entering a few keywords into a search engine, the full search phase requires a more thorough investigation.

By focusing efforts solely on the grants that align with your goals, this pre-application preparation streamlines the process while also increasing the likelihood of meeting all the requirements. In fact, having a well-thought-out plan and a clear understanding of the grants you seek both simplifies the entire activity and sets you and your team up for success.

Apply these steps when searching for appropriate grant opportunities:

1. Determine your need

Before embarking on the grant-seeking journey, clearly articulate why you need the funds and how they will be utilized. Understanding your financial requirements is crucial for effective grant research.

2. Know when you need the money

Grants operate on specific timelines with set award dates. Align your grant-seeking efforts with these timelines to enhance your chances of success.

3. Search strategically

Build a checklist of your most important, non-negotiable search criteria for quickly weeding out grant options that absolutely do not fit your project. Then, utilize the following resources to identify potential grants:

  • Online directories
  • Small Business Administration (SBA)
  • Foundations

4. Develop a tracking tool

After familiarizing yourself with the criteria of each grant, including paperwork, deadlines, and award amounts, make a spreadsheet or use a project management tool to stay organized. Share this with your team to ensure that everyone can contribute to the grant cycle.

Here are a few popular grant management tools to try: 

  • Jotform: spreadsheet template
  • Airtable: table template
  • Instrumentl: software
  • Submit: software
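
If your team prefers a homegrown tracker over these tools, a few lines of Python can generate a shareable spreadsheet; the columns and sample rows below are purely illustrative:

```python
import pandas as pd  # writing .xlsx also requires openpyxl

# Illustrative tracker; adapt the columns to your team's grant cycle.
tracker = pd.DataFrame([
    {"grant": "Example Foundation RFP", "deadline": "2025-03-01",
     "amount": 50000, "status": "drafting", "owner": "PI"},
    {"grant": "Example Agency R21", "deadline": "2025-02-16",
     "amount": 275000, "status": "collecting letters", "owner": "Co-I"},
])
tracker["deadline"] = pd.to_datetime(tracker["deadline"])
tracker = tracker.sort_values("deadline")  # soonest deadline first
tracker.to_excel("grant_tracker.xlsx", index=False)  # share with the team
print(tracker)
```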

Tips for Finding Research Grants

Do:

Consider large funding sources: Explore major agencies like NSF and NIH.

Reach out to experts: Consult experienced researchers and your institution's grant office.

Stay informed: Regularly check news in your field for novel funding sources.

Know agency requirements: Research and align your proposal with their requisites.

Ask questions: Use the available resources to get insights into the process.

Demonstrate expertise: Showcase your team's knowledge and background.

Don't:

Neglect lesser-known sources: Cast a wide net to diversify opportunities.

Name-drop reviewers: Prevent potential conflicts of interest.

Miss your chance: Find field-specific grant options.

Forget refinement: Improve proposal language, grammar, and clarity.

Ignore grant support services: Enhance the quality of your proposal.

Overlook co-investigators: Enhance your application by adding experience.

Grant collaboration 

Now that you’ve taken the initial step of identifying potential grant opportunities, it’s time to find collaborators. The application process is lengthy and arduous, and it requires a diverse set of skills, which makes this phase crucial for success.

With their valuable expertise and unique perspectives, these collaborators play instrumental roles in navigating the complexities of grant writing. In exploring the judgment that goes into building these partnerships, we will underscore why collaboration is both advantageous and indispensable to the pursuit of securing grants.

Why is collaboration important to the grant process?

Some grant funding agencies outline collaboration as an outright requirement for acceptable applications; with others, the expectation is more implicit. Funders may simply favor or seek out applications that represent multidisciplinary and multinational projects.

To get an idea of the types of collaboration major funders prefer, try searching “collaborative research grants” to uncover countless possibilities, such as:

  • National Endowment for the Humanities
  • American Brain Tumor Association

For exploring grants specifically for international collaboration, check out this blog:

  • 30+ Research Funding Agencies That Support International Collaboration

Either way, proposing an interdisciplinary research project substantially increases your funding opportunities. Teaming up with multiple collaborators who offer diverse backgrounds and skill sets enhances the robustness of your research project and increases credibility.

This is especially true for early career researchers, who can leverage collaboration with industry, international, or community partners to boost their research profile. The key lies in recognizing the multifaceted advantages of collaboration in the context of obtaining funding and maximizing the impact of your research efforts.

How can I find collaborators?

Before embarking on the search for a collaborative partner, it's essential to crystallize your objectives for the grant proposal and identify the type of support needed. Ask yourself these questions: 

1) Which facet of the grant process do I need assistance with?

2) Is my knowledge lacking in a specific area, such as a population?

3) Do I have access to the necessary resources?

Use these questions to compile a detailed list of your needs, and prioritize them based on magnitude and ramification. These preliminary steps ensure that the search for an ideal collaborator is focused and effective.

Once you identify targeted criteria for the most appropriate partners, it’s time to make your approach. While a practical starting point involves reaching out to peers, mentors, and other colleagues with shared interests and research goals, we encourage you to go outside your comfort zone.

Beyond the first line of potential collaborators exists a world of opportunities to expand your network. Uncover partnership possibilities by engaging with speakers and attendees at events, workshops, webinars, and conferences related to grant writing or your field.

Also, consider joining online communities that facilitate connections among grant writers and researchers. These communities offer a space to exchange ideas and information. Sites like Collaboratory, NIH RePORTER, and Upwork provide channels for canvassing and engaging with potential collaborators who are good fits for your project.

Like any other partnership, carefully weigh your vetted options before committing to a collaboration. Talk with individuals about their qualifications and experience, availability and work style, and terms for grant writing collaborations.

Transparency on both sides of this partnership is imperative to forging a positive work environment where goals, values, and expectations align for a strong grant proposal.

Putting together a winning grant proposal

It’s time to assemble the bulk of your grant application packet – the proposal itself. Each funder is unique in outlining the details for specific grants, but here are several elements fundamental to every proposal:

  • Executive Summary
  • Needs assessment
  • Project description
  • Evaluation plan
  • Team introduction
  • Sustainability plan 

This list of multi-faceted components may seem daunting, but careful research and planning will make it manageable. 

Start by reading about the grant funder to learn:

  • What their mission and goals are,
  • Which types of projects they have funded in the past, and
  • How they evaluate and score applications.

Next, view sample applications to get a feel for the length, flow, and tone the evaluators are looking for. Many funders offer samples to peruse, like these from the NIH, while others are curated by online platforms such as GrantStation.

Also, closely evaluate the grant application’s requirements. They vary between funding organizations and opportunities, and also from one grant cycle to the next. Take notes and make a checklist of these requirements to add to an Excel spreadsheet, Google Sheet, or management system for organizing and tracking your grant process.

Finally, understand how you will submit the final grant application. Many funders use online portals with character or word limits for each section. Be aware of these limits beforehand. Simplify the editing process by first writing each section in a Word document to be copy and pasted into the corresponding submission fields.
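
A short script can verify each section against the portal's limits before you paste, sparing last-minute cuts. The section names, file names, and character limits below are hypothetical:

```python
# Hypothetical character limits for an online submission portal.
limits = {"abstract": 2000, "narrative": 15000, "budget_justification": 5000}

for section, limit in limits.items():
    with open(f"{section}.txt", encoding="utf-8") as f:  # one file per section
        length = len(f.read())
    over = length - limit
    status = f"OVER by {over}" if over > 0 else f"{-over} to spare"
    print(f"{section}: {length}/{limit} characters ({status})")
```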

If there is no online application platform, the funder will usually offer a comprehensive Request for Proposal (RFP) to guide the structure of your grant proposal. The RFP: 

  • Specifies page constraints
  • Delineates specific sections
  • Outlines additional attachments
  • Provides other pertinent details

Components of a grant proposal

Cover letter

Though not always explicitly requested, including a cover letter is a strategic maneuver that could be the factor determining whether or not grant funders engage with your proposal. It’s an opportunity to give your best first impression by grabbing the reviewer’s attention and compelling them to read further. 

Cover letters are not the place for excessive emotion or detail; keep yours brief and direct, stating your financial needs and purpose confidently from the outset. Also, try to clearly demonstrate the connection between your project and the funder’s mission to create additional value beyond the formal proposal.

Executive summary

Like an abstract for your research manuscript, the executive summary is a brief synopsis that encapsulates the overarching topics and key points of your grant proposal. It must set the tone for the main body of the proposal while providing enough information to stand alone if necessary.

Refer to How to Write an Executive Summary for a Grant Proposal for detailed guidance like the following.

Do:

  • Give a clear and concise account of your identity, funding needs, and project roadmap.
  • Write in an instructive manner, aiming for an objective and persuasive tone.
  • Be convincing and pragmatic about your research team's ability.
  • Follow the logical flow of main points in your proposal.
  • Use subheadings and bulleted lists for clarity.
  • Write the executive summary at the end of the proposal process.
  • Reference detailed information explained in the proposal body.
  • Address the funder directly.

Don't:

  • Provide excessive details about your project's accomplishments or management plans.
  • Write in the first person.
  • Disclose confidential information that could be accessed by competitors.
  • Focus excessively on problems rather than proposed solutions.
  • Deviate from the logical flow of the main proposal.
  • Forget to align with evaluation criteria, if specified.

Project narrative

After the executive summary is the project narrative . This is the main body of your grant proposal and encompasses several distinct elements that work together to tell the story of your project and justify the need for funding. 

Include these primary components:

Introduction of the project team

Briefly outline the names, positions, and credentials of the project’s directors, key personnel, contributors, and advisors in a format that clearly defines their roles and responsibilities. Showing your team’s capacity and ability to meet all deliverables builds confidence and trust with the reviewers.

Needs assessment or problem statement

A compelling needs assessment (or problem statement) clearly articulates a problem that must be urgently addressed. It also offers a well-defined project idea as a possible solution. This statement emphasizes the pressing situation and highlights existing gaps and their consequences to illustrate how your project will make a difference.

To begin, ask yourself these questions:

  • What urgent need are we focusing on with this project?
  • Which unique solution does our project offer to this urgent need? 
  • How will this project positively impact the world once completed?

Here are some helpful examples and templates.

Goals and objectives

Goals are broad statements that are fairly abstract and intangible. Objectives are narrower statements that are concrete and measurable. For example:

  • Goal: “To explore the impact of sleep deprivation on cognitive performance in college students.”
  • Objective: “To compare cognitive test scores of students with less than six hours of sleep and those with eight or more hours of sleep.”

Focus on outcomes, not processes, when crafting goals and objectives. Use the SMART acronym (Specific, Measurable, Achievable, Relevant, Time-bound) to align them with the proposal's mission while emphasizing their impact on the target audience.

Methods and strategies

It is vitally important to explain how you intend to use the grant funds to fulfill the project’s objectives. Detail the resources and activities that will be employed. Methods and strategies are the bridge between idea and action. They must prove to reviewers the plausibility of your project and the significance of their possible funding.

Here are some useful guidelines for writing your methods section, as outlined in “Winning Grants: Step by Step”:

  • Firmly tie your methods to the proposed project's objectives and needs assessment.
  • Clearly link them to the resources you are requesting in the proposal budget.
  • Thoroughly explain why you chose these methods by including research, expert opinion, and your experience.
  • Precisely list the facilities and capital equipment that you will use in the project.
  • Carefully structure activities so that the program moves toward the desired results in a time-bound manner.

Evaluation plan

A comprehensive evaluation plan underscores the effectiveness and accountability of a project for both the funders and your team. The evaluation shows how you will determine the success of your project and measure the impact of the grant award by systematically gauging and analyzing each phase of the project against the stated objectives.

Evaluations typically fall into two standard categories:

1. Formative evaluation : extending from project development through implementation, continuously provides feedback for necessary adjustments and improvements. 

2. Summative evaluation : conducted post-project completion, critically assesses overall success and impact by compiling information on activities and outcomes.

Creating a conceptual model of your project is helpful when identifying these key evaluation points. Then, you must consider exactly who will do the evaluations, what specific skills and resources they need, how long it will take, and how much it will cost.

Sustainability

Presenting a solid plan that illustrates exactly how your project will continue to thrive after the grant money is gone builds the funder's confidence in the project’s longevity and significance. In this sustainability section, it is vital to demonstrate a diversified funding strategy for securing the long-term viability of your program.

There are three common long-term trajectories for projects, each with corresponding sustainability options:

  • Short-term projects: though implemented only once, these still carry ongoing maintenance costs, such as monitoring, training, and updates (e.g., digitizing records, cleaning up after an oil spill).

  • Projects that will generate income at some point in the future: these must be funded until the product or service can cover operating costs, with an alternative plan in place for deficits (e.g., a medical device, technology, or farming method).

  • Ongoing projects: these will eventually need a continuous stream of funding from a government entity or large organization (e.g., space exploration, hurricane tracking).

Along with strategies for funding your program beyond the initial grant,  reference your access to institutional infrastructure and resources that will reduce costs.

Also, submit multi-year budgets that reflect how sustainability factors are integrated into the project’s design.

Budget

The budget section of your grant proposal, comprising both a spreadsheet and a narrative, is the most influential component. It should be able to stand independently as a suitable representation of the entire endeavor. Providing a detailed plan that outlines how grant funds will be utilized is crucial for illustrating cost-effectiveness and careful consideration of project expenses.

A comprehensive grant budget offers numerous benefits to both the grantor, the entity funding the grant, and the grantee, those receiving the funding, such as:

  • Grantor : The budget facilitates objective evaluation and comparison between multiple proposals by conveying a project's story through responsible fund management and financial transparency.
  • Grantee : The budget serves as a tracking tool for monitoring and adjusting expenses throughout the project and cultivates trust with funders by answering questions before they arise.

Because the grant proposal budget is all-encompassing and integral to your efforts to secure funding, it can seem overwhelming. Start by listing all anticipated expenditures within two broad categories, direct and indirect expenses:

  • Direct expenses: measurable, project-associated costs that are essential for successful implementation, such as salaries, equipment, supplies, travel, and external consultants. These are itemized in various categories within the grant budget.
  • Indirect expenses: administrative costs that are not directly or exclusively tied to your project but are necessary for its completion, such as rent, utilities, and insurance. Think of lab or meeting spaces shared by multiple project teams, or directors who oversee several ongoing projects.

After compiling your list, review sample budgets to understand the typical layout and complexity. Focus closely on the budget narratives, where you have the opportunity to justify each aspect of the spreadsheet to ensure clarity and validity.

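To make the direct/indirect arithmetic concrete, here is a minimal sketch in Python. Every line item, amount, and the 30% indirect rate are invented placeholders (actual indirect rates are negotiated with each institution and funder), so treat this as an illustration of the layout rather than a template.

    # Hypothetical budget sketch: itemize direct costs, then estimate
    # indirect costs as a negotiated percentage of the direct subtotal.
    direct_costs = {  # all amounts are made-up placeholders
        "Personnel (PI + 2 research assistants)": 98_000,
        "Equipment": 12_500,
        "Supplies": 4_200,
        "Travel": 3_800,
        "External consultants": 6_000,
    }
    INDIRECT_RATE = 0.30  # placeholder; real rates vary by institution and funder

    direct_subtotal = sum(direct_costs.values())
    indirect_total = direct_subtotal * INDIRECT_RATE

    for item, amount in direct_costs.items():
        print(f"{item:<40} ${amount:>12,.2f}")
    print(f"{'Direct subtotal':<40} ${direct_subtotal:>12,.2f}")
    print(f"{'Indirect (30% of direct)':<40} ${indirect_total:>12,.2f}")
    print(f"{'Total request':<40} ${direct_subtotal + indirect_total:>12,.2f}")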

Appendices

While not always needed, the appendices consist of relevant supplementary materials that are clearly referenced within your grant application. These might include:

  • Updated resumes that emphasize staff members' current positions and accomplishments. 
  • Letters of support from people or organizations that have authority in the field of your research, or community members that may benefit from the project.
  • Visual aids like charts, graphs, and maps that contribute directly to your project’s story and are referred to previously in the application. 

Finalizing your grant application

Now that your grant application is finished, make sure it's not just another document in the stack. Aim for a grant proposal that captivates the evaluator. It should stand out not only for presenting an excellent project, but for being engaging and easily comprehended.

Keep the language simple. Avoid jargon. Prioritize accuracy and conciseness. Opt for reader-friendly formatting with white space, headings, standard fonts, and illustrations to enhance readability.

Always take time for thorough proofreading and editing. You can even set your proposal aside for a few days before revisiting it for additional edits and improvements. At this stage, it is helpful to seek outside feedback from those familiar with the subject matter as well as novices to catch unnoticed mistakes and improve clarity.

If you want to be absolutely sure your grant proposal is polished, consider getting it edited by AJE.

How can AI help the grant process?

When used efficiently, AI is a powerful tool for streamlining and enhancing various aspects of the grant process.

Do:

  • Use AI algorithms to review related studies and identify knowledge gaps.
  • Employ AI for quick analysis of complex datasets to identify patterns and trends.
  • Leverage AI algorithms to match your project with relevant grant opportunities (a toy matching sketch appears at the end of this section).
  • Apply natural language processing to analyze grant guidelines and tailor proposals accordingly.
  • Utilize AI-powered tools for efficient project planning and execution.
  • Employ AI for tracking project progress and generating reports.
  • Take advantage of AI tools for improving the clarity, coherence, and quality of your proposal.

Don’t:

  • Rely solely on manual efforts that are less comprehensive and more time consuming.
  • Overlook the fact that AI is designed to find patterns and trends within large datasets.
  • Minimize AI’s ability to use set parameters for sifting through vast amounts of data quickly.
  • Forget that the strength of AI lies in its capacity to follow your prompts without divergence.
  • Neglect tools that assist with scheduling, resource allocation, and milestone tracking.
  • Settle for software that is not intuitive or lacks automated reminders and updates.
  • Hesitate to use AI tools for improving grammar, spelling, and composition throughout the writing process.

Remember that AI provides a diverse array of tools; there is no universal solution. Identify the most suitable tool for your specific task. Also, like a screwdriver or a hammer, AI needs informed human direction and control to work effectively.
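To make the grant-matching bullet concrete, the toy sketch below ranks a few invented funding-call summaries against a project abstract using plain TF-IDF text similarity from scikit-learn. This is only a sketch under stated assumptions: the call titles and texts are fabricated, and real matching tools rely on far richer models and live opportunity databases.

    # Toy grant-opportunity matcher: rank calls by text similarity to an abstract.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    project_abstract = (
        "We will test strategies for implementing an evidence-based "
        "smoking-cessation program in safety net primary care clinics."
    )
    funding_calls = {  # invented examples for illustration only
        "Implementation science pilot awards":
            "Pilot studies of strategies to implement evidence-based care in health systems.",
        "Basic tobacco biology program":
            "Mechanistic research on nicotine receptors and cell signaling.",
        "Community health services research":
            "Health services research in safety net and community clinic settings.",
    }

    vectorizer = TfidfVectorizer(stop_words="english")
    tfidf = vectorizer.fit_transform([project_abstract, *funding_calls.values()])
    scores = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()

    # Print candidate calls, best match first.
    for score, title in sorted(zip(scores, funding_calls), reverse=True):
        print(f"{score:.2f}  {title}")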

Looking for tips when writing your grant application? 

Check out these resources:

  • 4 Tips for Writing a Persuasive Grant Proposal
  • Writing Effective Grant Applications
  • 7 Tips for Writing an Effective Grant Proposal
  • The best-kept secrets to winning grants
  • The Best Grant Writing Books for Beginner Grant Writers
  • Research Grant Proposal Funding: How I got $1 Million

Final thoughts

The bottom line – applying for grants is challenging. It requires passion, dedication, and a set of diverse skills rarely found within one human being.

Therefore, collaboration is key to a successful grant process. It allows everyone’s strengths to shine. Be honest and ask yourself, “Which elements of this grant application do I really need help with?” Then seek out experts in those areas.

Keep this guide on hand to reference as you work your way through this funding journey. Use the resources contained within. Seek out answers to all the questions that will inevitably arise throughout the process.

The grants are out there just waiting for the right project to present itself – one that shares the funder’s mission and is a benefit to our communities. Find grants that align with your project goals, tell your story through a compelling proposal, and get ready to make the world a better place with your research.

The AJE Team


Standardizing an approach to the evaluation of implementation science proposals

Erika L. Crable

1 Evans Center for Implementation and Improvement Sciences, Boston University School of Medicine, 88 East Newton Street, Vose 216, Boston, MA 02118 USA

2 Department of Health Law, Policy & Management, Boston University School of Public Health, Boston, MA USA

Dea Biancarelli

Allan J. Walkey

3 Section of Pulmonary, Allergy, and Critical Care Medicine, Department of Medicine, Boston University School of Medicine, Boston, MA USA

Caitlin G. Allen

4 Behavioral Sciences and Health Education Department, Rollins School of Public Health, Emory University, Atlanta, GA USA

Enola K. Proctor

5 Center for Mental Health Services Research, The Brown School at Washington University in St. Louis, St. Louis, MO USA

Mari-Lynn Drainoni

6 Section of Infectious Diseases, Department of Medicine, Boston University School of Medicine, Boston, MA USA

7 Center for Healthcare Organization and Implementation Research, Edith Nourse Rogers Memorial VA Hospital, Bedford, MA USA

Associated Data

The datasets generated and/or analyzed during the current study are not publicly available because they represent study proposals prepared by individual investigators. Proposal scoring data are available from the corresponding author on reasonable request.

Background

The fields of implementation and improvement sciences have experienced rapid growth in recent years. However, research that seeks to inform health care change may have difficulty translating core components of implementation and improvement sciences within the traditional paradigms used to evaluate efficacy and effectiveness research. A review of implementation and improvement sciences grant proposals within an academic medical center using a traditional National Institutes of Health framework highlighted the need for tools that could assist investigators and reviewers in describing and evaluating proposed implementation and improvement sciences research.

Methods

We operationalized existing recommendations for writing implementation science proposals as the ImplemeNtation and Improvement Science Proposals Evaluation CriTeria (INSPECT) scoring system. The resulting system was applied to pilot grants submitted to a call for implementation and improvement science proposals at an academic medical center. We evaluated the reliability of the INSPECT system using Krippendorff’s alpha coefficients and explored the utility of the INSPECT system to characterize common deficiencies in implementation research proposals.

Results

We scored 30 research proposals using the INSPECT system. Proposals received a median cumulative score of 7 out of a possible 30. Across individual elements of INSPECT, proposals scored highest on the criterion rating evidence of a care or quality gap. Proposals generally performed poorly on all other criteria. Most proposals received scores of 0 for the criteria identifying an evidence-based practice or treatment (50%), conceptual model and theoretical justification (70%), setting’s readiness to adopt new services/treatment/programs (53%), implementation strategy/process (67%), and measurement and analysis (70%). Inter-coder reliability testing showed excellent reliability (Krippendorff’s alpha coefficient 0.88) for the application of the scoring system overall and demonstrated reliability scores ranging from 0.77 to 0.99 for individual elements.

Conclusions

The INSPECT scoring system presents new scoring criteria with a high degree of inter-rater reliability and utility for evaluating the quality of implementation and improvement sciences grant proposals.

The recognition that experimental efficacy studies alone are insufficient to improve public health [ 1 ] has led to the rapid expansion of the fields of implementation and improvement sciences [ 2 – 5 ]. However, studies that aim to identify strategies that facilitate adoption, sustainability, and scalability of evidence may not translate well within traditional efficacy and effectiveness research paradigms [ 6 ].

The need for new tools to aid investigators and research stakeholders in implementation science became clear during evaluation of grant submissions to the Evans Center for Implementation and Improvement Sciences (CIIS) at Boston University. CIIS was established in 2016 to promote scientific rigor in new and ongoing projects aimed at increasing the use of evidence and improving patient outcomes within an urban, academic, safety net medical center. As part of CIIS’s goal to foster rigorous implementation and improvement methods, CIIS established a call for pilot grant applications for implementation and improvement sciences [ 7 ]. Proposals were peer-reviewed using traditional National Institutes of Health (NIH) scoring criteria [ 8 ]. Through two cycles of grant applications, proposal reviewers identified a need for improved evaluation criteria capable of identifying specific strengths and weaknesses in order to rate the potential impact of implementation and/or improvement study designs.

We describe the development and evaluation of the ImplemeNtation and Improvement Science Proposal Evaluation CriTeria (INSPECT): a tool for the standardized evaluation of implementation and improvement research proposals. The INSPECT tool seeks to operationalize the criteria proposed by Proctor et al. as “key ingredients” that constitute a well-crafted implementation science proposal, which operate within the NIH proposal scoring framework [ 6 ].

Assessment of need

CIIS released requests for pilot grant applications focused on implementation and improvement sciences in April 2016 and April 2017 [ 7 ]. The request for applications described an opportunity for investigators to receive up to $15,000 for innovative implementation and improvement sciences research on any topic related to improving the processes and outcomes of health care delivery in safety net settings. CIIS funds pilot grants with the goal of providing investigators with the opportunity to obtain preliminary data for further research. Proposals were required to include a specific aims page and a three-page research plan structured within the traditional NIH framework, with subheadings for significance, innovation, approach, environment, and research team; this framework was required because it mirrors the proposal structure required by the NIH. A study budget and justification, as well as research team biographical sketches, were required with no page limit restrictions. CIIS received 30 pilot grant applications covering a broad array of content areas, such as smoking cessation, hepatitis C, diabetes, cancer, and neonatal abstinence syndrome.

Six researchers with experience in implementation and improvement sciences served as grant reviewers. Four reviewers scored each proposal. Reviewers evaluated the quality of pilot study proposals, assigning numerical scores from 1 to 9 (1 = exceptional, 9 = poor) for each of the NIH criteria (significance, innovation, investigators, approach, environment, overall impact) [ 8 ]. CIIS elected to use the NIH criteria to evaluate the pilot grant applications because they are the criteria used by NIH peer review systems to evaluate the scientific and technical merit of grant proposals. The CIIS grant review team held a “study section” to review and discuss the proposals. During that meeting, however, reviewers provided feedback that the NIH evaluation criteria, rooted in the traditional efficacy and effectiveness research paradigm, did not offer sufficient guidance for evaluating implementation and improvement science proposals, nor did they provide enough specificity for proposal writers who are less experienced in implementation research. Grant reviewers requested new proposal evaluation criteria that would better inform score decisions and give proposal writers feedback on specific aspects of implementation science, including the strength of the implementation study design, strategy, feasibility, and relevance.

Despite the challenges of using the traditional NIH evaluation criteria, the review panel used those criteria to score all of the grants received during the first 2 years of proposal requests. CIIS pilot grant funding was awarded to applications that received the lowest (best) scores under the NIH criteria and received positive feedback from the review panel.

The request for more explicit implementation science evaluation criteria prompted the CIIS research team to conduct a qualitative needs assessment of all 30 pilot study applications in order to determine how the proposals described study designs, implementation strategies, and other aspects of proposed implementation and improvement research. Three members of the CIIS research team (MLD, AJW, DB) independently open-coded pilot proposals to identify properties related to core implementation science concepts or efficacy and effectiveness research [ 9 ]. The team identified common themes in the proposals, including an emphasis on efficacy hypotheses, descriptions of untested interventions, and the absence of implementation strategies and conceptual frameworks. The consistent lack of features identified as important aspects of implementation science reinforced the need for criteria that specifically addressed implementation science approaches to guide both proposal preparation and evaluation.

Operationalizing scoring criteria

We identified Proctor et al.’s “ten key ingredients” for writing implementation research proposals [ 6 ] as an appropriate framework to guide and evaluate proposals. We operationalized the “ingredients” into a scoring system. To construct the scoring system, a four-point scale (0–3) was created for each element. In general, a score of 3 was given for an element if all of the criteria requirements for the element were fully met; a score of 2 was given if the criteria were somewhat, but not fully, addressed; a score of 1 was given if the ingredient was mentioned but not operationalized in the proposal or linked to the rest of the study; and a score of 0 was given if the element was not addressed at all in the proposal. Table 1 illustrates the INSPECT scoring system for the 10 items, in which proposals receive one score for each of the 10 ingredients, for a cumulative score between 0 and 30.

Table 1. Implementation and Improvement Science Proposal Evaluation Criteria
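To illustrate the scoring arithmetic only (not the official instrument), the following minimal Python sketch sums ten 0–3 item scores into a 0–30 cumulative score; the ingredient labels are paraphrased from Proctor et al.’s list, and the example scores are invented.

    # Sketch of INSPECT-style scoring: ten ingredients, each scored 0-3,
    # summed into a cumulative score between 0 and 30.
    INGREDIENTS = [
        "care/quality gap",
        "evidence-based treatment to be implemented",
        "conceptual model and theoretical justification",
        "stakeholder priorities and engagement",
        "setting's readiness to adopt",
        "implementation strategy/process",
        "team experience with setting, treatment, and process",
        "feasibility of proposed methods",
        "measurement and analysis",
        "policy/funding environment and sustainability",
    ]

    def cumulative_score(item_scores):
        """Sum ten 0-3 item scores into a 0-30 cumulative score."""
        assert set(item_scores) == set(INGREDIENTS), "score every ingredient"
        assert all(s in (0, 1, 2, 3) for s in item_scores.values())
        return sum(item_scores.values())

    example = dict.fromkeys(INGREDIENTS, 0)           # 0 = not addressed at all
    example["care/quality gap"] = 3                   # criteria fully met
    example["implementation strategy/process"] = 1    # mentioned, not operationalized
    print(cumulative_score(example))                  # -> 4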

Testing INSPECT

We used the pilot study proposals submitted to CIIS to develop and evaluate the utility and reliability of the INSPECT scoring system. Initially, two research team members (ELC, DB) independently applied the 10-element criteria to 7 of the 30 pilot grant proposals. Four team members (MLD, AJW, ELC, DB) then met to discuss these initial results and achieve consensus on the scoring criteria. Two team members (ELC, DB) then independently scored the remaining 23 pilot study applications using the revised scoring system. Both reviewers recorded brief justifications for each of the ten scores assigned to individual study proposals. The two coders (ELC, DB) then met to compare scores, share scoring justifications, and determine the final item-specific scores for each proposal using group consensus.

Inter-coder reliability with the scoring protocol was measured using Krippendorff’s alpha to assess observed and expected disagreement between the two coders’ initial individual item scores [ 10 , 11 ]. An alpha coefficient of 0.70 was deemed a priori as the lowest acceptable level of agreement to establish reliability of the new scoring protocol [ 10 , 11 ]. Frequency analyses were conducted to determine the distribution of final element-specific scores (0–3) across all proposals. We calculated a correlation coefficient to assess the association between proposal scores assigned using the NIH framework and scores assigned using INSPECT. All calculations were performed in R version 3.3.2 [ 12 ].
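The paper performed these analyses in R; as a rough Python analogue, the sketch below computes Krippendorff’s alpha (alpha = 1 - observed disagreement / expected disagreement) using the third-party krippendorff package, whose alpha() function and “ordinal” level of measurement are assumed here, followed by a Pearson correlation via NumPy. All ratings shown are invented illustrations, not the study’s data.

    import numpy as np
    import krippendorff  # third-party package: pip install krippendorff

    # Rows = the two coders, columns = proposals, values = 0-3 scores for one
    # INSPECT item; np.nan would mark a missing rating. Invented data only.
    ratings = np.array([
        [0, 0, 1, 3, 2, 0, 1, 2],
        [0, 1, 1, 3, 2, 0, 0, 2],
    ], dtype=float)
    alpha = krippendorff.alpha(reliability_data=ratings,
                               level_of_measurement="ordinal")
    print(f"Krippendorff's alpha: {alpha:.2f}")

    # Correlation between NIH-framework scores (lower = better) and INSPECT
    # scores (higher = better); an inverse r is therefore expected.
    nih_scores = np.array([3.0, 5.0, 2.0, 7.0, 6.0, 4.0, 8.0, 5.0])
    inspect_scores = np.array([12.0, 7.0, 15.0, 3.0, 5.0, 9.0, 2.0, 8.0])
    r = np.corrcoef(nih_scores, inspect_scores)[0, 1]
    print(f"Pearson r: {r:.2f}")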

Results

Iterative review of the 30 research proposals using Proctor et al.’s “ten key ingredients” resulted in the development and testing of the INSPECT system for assessing implementation and improvement science proposals.

Figure 1 displays the right-skewed distribution of cumulative proposal scores, with most proposals receiving low overall scores. Out of a possible cumulative score of 30, proposals had a median score of 7 (IQR 3.3–11.8).

Fig. 1. Distribution of cumulative proposal scores assigned using ImplemeNtation and Improvement Science Proposal Evaluation CriTeria (INSPECT)

Table 2 presents the distribution of cumulative and item-specific scores assigned to proposals using the INSPECT criteria. Across individual elements, proposals scored highest on the criterion describing care/quality gaps in health services. Thirty-six percent of proposals received the maximum score of 3 for meeting all care or quality gap element requirements, including using local setting data to support the existence of a gap, explicitly describing the potential for improvement, and linking the proposed research to funding priorities (i.e., safety net settings).

Table 2. Distribution of ImplemeNtation and Improvement Science Proposal Evaluation CriTeria (INSPECT) Scores

Proposals generally scored poorly on the other criteria. As shown in Table 2, most study proposals received scores of 0 in the categories of evidence-based treatment to be implemented (50%), conceptual model and theoretical justification (70%), setting’s readiness to adopt new services/treatment/programs (53%), implementation strategy/process (67%), and measurement and analysis (70%). For example, reviewers gave scores of 0 for the “evidence-based intervention to be implemented” element when the intervention was not evidence-based and the project sought to establish efficacy rather than to examine uptake of an established evidence-based practice. Similarly, proposals that only sought to study effectiveness and did not assess any implementation outcomes [ 13 ] (e.g., adoption, fidelity) received scores of 0 for “measurement and analysis.” None of the study proposals primarily aiming to assess effectiveness outcomes expressed the dual research intent of a hybrid design. Scores of 0 for other categories were given when applications lacked any description relevant to the category, such as no conceptual model, no implementation strategy, or no research team skills relevant to implementation or improvement science.

Table 2 also displays the rates of inter-coder reliability in applying INSPECT to the 30 pilot study proposals. An overall alpha coefficient of 0.88 was observed between the coders. Rates of inter-coder reliability in applying each of the 10 items to the proposals ranged from 0.77 to 0.99, all above the 0.70 reliability threshold.

Additionally, we observed a moderate inverse correlation (r = −0.62, p < 0.01) between the proposal scores initially assigned using the NIH framework and the scores assigned using INSPECT.

Discussion

We developed a reliable proposal scoring system that operationalizes Proctor et al.’s “ten key ingredients” for writing an implementation research grant [ 6 ]. Previous research analyzing peer-review grant processes has highlighted a need to improve scoring agreement between peer reviewers [ 14 ]. High levels of disagreement in assessors’ interpretation of grant scoring criteria result in unreliable peer-review processes and funding decisions based more on chance than scientific merit [ 14 ]. Measuring inter-rater reliability is a standard approach for evaluating the utility of existing proposal scoring criteria and for assessing efforts to improve the criteria [ 15 , 16 ]. Application of the INSPECT system demonstrated high inter-rater reliability overall and within each of the 10 items. The high degree of reliability measured for INSPECT may be related to the specificity of its design as implementation and improvement science scoring criteria. A review of scoring rubrics reported in the scientific literature suggests that topic-focused criteria contribute to increased scoring reliability [ 17 ]. Additionally, the moderate correlation between scores assigned using the NIH framework and scores assigned using INSPECT suggests the validity of the INSPECT criteria in evaluating proposal quality. Proctor et al.’s “ten key ingredients” for grant writers were developed to map onto the existing NIH criteria. Our operationalized version of the ingredients as scoring criteria demonstrated that proposals that scored poorly under the NIH criteria also scored poorly under INSPECT.

Applying the INSPECT system to proposed implementation and improvement science research at an academic medical center improved proposal reviewers’ ability to identify specific strengths and weaknesses in implementation approach. Overall, proposals received high scores only for identifying the care gap or quality gap. Since efficacy and implementation or improvement research may use similar techniques to establish the significance of the study questions [ 18 ], proposals may score well on describing the quality gap even if they go on to describe efficacy hypotheses that receive overall low scores from the INSPECT system. Further studies should explore techniques for describing care and quality gaps that highlight implementation or improvement research questions.

Consistently low scores in four areas—defining the evidence-based treatment to be implemented, conceptual model and theoretical justification, setting’s readiness to adopt new programs, and measurement and analysis—suggest that many investigators seeking to conduct implementation research may have misconceptions about the fundamental goals of this field. One misconception may relate to a sole focus on evaluating an intervention’s effectiveness rather than studying the processes and outcomes of implementation strategies. The majority of study proposals evaluated using INSPECT neither aimed to improve uptake of any evidence-based practice nor included any implementation measures such as acceptability, adoption, feasibility, fidelity, penetration, or sustainability [ 19 ]. Inadequate and inconsistent descriptions of implementation strategies and outcomes represent major challenges to overall implementation study success [ 20 ]. In addition to guidance provided by the INSPECT criteria, recent efforts to develop implementation study reporting standards [ 21 ] may assist proposal writers in describing planned research.

Several proposals addressed treatments or practices with low evidence for the potential to improve healthcare. Although hybrid studies, which study both effectiveness and implementation outcomes, are practical approaches to establishing the effectiveness of evidence-informed practices while measuring implementation efforts [ 18 ], none of the study proposals expressed this dual research intent or were conceived as hybrid designs.

Our findings also suggest low familiarity with and use of resources to evaluate the strength of evidence (such as the Grading Quality of Evidence and Strength of Recommendations system [ 22 ] and the Strength of Recommendation Taxonomy grading scale [ 23 ]) for implementation science research. A more systematic evaluation of the strength of evidence [ 24 – 27 ] necessary to warrant implementation efforts may help to differentiate implementation science from efficacy or effectiveness research and improve understanding of the utility hybrid studies offer [ 28 ].

Expanding access to implementation science training in universities as part of the core health services research curriculum, and enhancing access to professional development opportunities that focus on conceptual and methodological implementation skills in a content-agnostic way, would help build capacity for the next generation of implementation science researchers. Additionally, training programs offer an opportunity to provide guidance on both writing and evaluating the quality of implementation science grant applications.

Strengths of our study include the application of INSPECT to proposals submitted by investigators with a wide range of implementation and improvement science-specific experience and covering a variety of content areas. However, our results are limited in that they characterize one academic institution’s familiarity with implementation and improvement science research, and the INSPECT system requires validation in other settings and over a broader range of proposal ratings. Additionally, we measured a high degree of inter-rater reliability for INSPECT when it was applied to a sample of low-scoring proposals. INSPECT’s inter-rater reliability may decrease when it is applied to a sample of higher-quality proposals and reviewers must discriminate between gradations of quality (i.e., scores of 1–3) rather than mostly scoring the absence of key items (i.e., scores of 0). Future research should test the validity of INSPECT by comparing INSPECT-assigned scores to ratings assigned to approved proposals by the NIH Dissemination and Implementation Research in Health study section. Future research should also assess the relationship between INSPECT score assignments and successful study completion to determine the utility of INSPECT as a mechanism for ensuring the quality and impact of funded research. To aid in these prospective research efforts, forthcoming proposal calls from CIIS will specifically use INSPECT as the proposal evaluation criteria.

Although multiple tools exist to aid researchers in writing implementation science proposals [ 6 , 29 , 30 ], few resources exist to support grant reviewers. This study identified additional functionality of Proctor et al.’s “ten key ingredients” as a guide for writers by developing it into a detailed checklist for proposal reviewers. The current research makes a substantive contribution to implementation and improvement sciences by demonstrating the utility and reliability of a new tool designed to aid grant reviewers in identifying high-quality research.

In conclusion, we operationalized an implementation and improvement research-specific scoring system to provide guidance for proposal writers and grant reviewers. We demonstrated the utility and reliability of the new INSPECT scoring system in evaluating the quality of implementation and improvement sciences research proposed at one academic medical center. The prevalence of low scores across the majority of INSPECT criteria suggests a need to promote education about the goals of implementation and improvement science, including the conceptual and methodological distinctions from efficacy and effectiveness research.

Acknowledgements

We would like to thank the investigators who submitted pilot grant applications to the Center for Implementation and Improvement Sciences Pilot Grant Program in 2016 and 2017. Creating the Implementation and Improvement Science Proposals Evaluation Criteria would not have been possible without their submissions. We also appreciate the feedback from CIIS grant application reviewers, which was instrumental in identifying the need for new scoring criteria. The CIIS team appreciates the ongoing guidance, interest, and support from David Coleman. Thanks also to Kevin Griffith for his feedback on measures of reliability.

This research was supported with funding from the Evans Medical Foundation Inc.

Authors’ contributions

DB, CGA, AJW, and MD conducted the initial thematic analysis. MD and DB created the original scoring criteria. ELC revised the scoring criteria. ELC, DB, AJW, and MD reviewed and finalized the scoring criteria. ELC and DB piloted the use of the scoring criteria and analyzed the score data. ELC drafted and revised the manuscript based on comments from coauthors. AJW, MD, DB, and EKP provided manuscript comments and revisions. All authors read and approved the final manuscript.

Ethics approval and consent to participate

This study was reviewed and determined to not qualify as human subjects research by the Boston University Medical Campus Institutional Review Board (reference number H-37709).

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Erika L. Crable, Phone: +1 617-638-7281, Email: ecrable@bu.edu.

Dea Biancarelli, Email: dealb@bu.edu.

Allan J. Walkey, Email: alwalkey@bu.edu.

Caitlin G. Allen, Email: calle27@emory.edu.

Enola K. Proctor, Email: ekp@wustl.edu.

Mari-Lynn Drainoni, Email: drainoni@bu.edu.


Writing implementation research grant proposals: ten key ingredients


The importance of the question


Although implementation science is often characterized as an emerging field, its bar for scientifically important questions is rising rapidly. Descriptive studies of barriers have dominated implementation science for too long, and the field is urged to ‘move on’ to questions of how and why implementation processes are effective. Accordingly, the Institute of Medicine [ 2 ] has identified studies comparing the effectiveness of alternative dissemination and implementation strategies as a top-quartile priority for comparative effectiveness research. But experimental studies testing implementation strategies need to be informed by systematic background research on the contexts and processes of implementation. While investigators must demonstrate their understanding of these complexities, their grant proposals must balance feasibility with scientific impact. This paper addresses the challenges of preparing grant applications that succeed on these fronts. Though this article focuses on U.S. funding sources and grant mechanisms, the principles that are discussed should be relevant to implementation researchers internationally.

Guidance from grant program announcements

Grant review focuses on the significance of proposed aims, impact and innovation, investigator capacity to conduct the study as proposed, and support for the study hypotheses and research design. The entire application should address these issues. Investigators early in their research careers or new to implementation science often struggle to demonstrate their capacity to conduct the proposed study and the feasibility of the proposed methods. Not all National Institutes of Health (NIH) program announcements require preliminary data. However, those that do are clear that applications must convey investigator training and experience, capacity to conduct the study as proposed, and support for the study hypotheses and research design [ 3 ]. The more complex the project, the more important it is to provide evidence of capacity and feasibility [ 4 ].

The R01 grant mechanism is typically large in scope compared to the R03, R21, and R34 a . Program announcements for grant mechanisms that are preliminary to R01 studies give important clues as to how to set the stage for an R01 and demonstrate feasibility. Investigator capacity can be demonstrated by describing prior work, experience, and training relevant to the application’s setting, substantive issues, and methodology, drawing on prior employment and research experience. For example, the NIH R03 small grant mechanism is often used to establish the feasibility of procedures, pilot test instruments, and refine data management procedures to be employed in a subsequent R01. The NIH R21 and R34 mechanisms support the development of new tools or technologies; proof-of-concept studies; early phases of research that evaluate the feasibility, tolerability, acceptability, and safety of novel treatments; demonstrations of the feasibility of recruitment protocols; and the development of assessment protocols and manuals for programs and treatments to be tested in subsequent R01 studies. These exploratory grants do not require extensive background material or preliminary information, but rather serve as sources for gathering data for subsequent R01 studies. These grant program announcements provide a long list of how pre-R01 mechanisms can be used, and no single application can or should provide all the stage-setting work exemplified in these descriptions.

Review criteria, typically available on funding agency web sites or within program announcements, may vary slightly by funding mechanism. However, grants are typically reviewed and scored according to such criteria as: significance, approach (feasibility, appropriateness, robustness), impact, innovation, investigator team, and research environment. Table 1 summarizes the ten ingredients, provides a checklist for reviewing applications prior to submission, and ties each ingredient to one or more of the typical grant review criteria.

Ten key ingredients for implementation research proposals

The literature does not provide ‘a comprehensive, prescriptive, and robust yet practical model to help…researchers understand [the] factors [that] need to be considered and addressed’ in an R01 study [ 5 ]. Therefore, we examined a variety of sources to identify recommendations and examples of background work that can strengthen implementation research proposals. This paper reflects our team’s experience with early career implementation researchers, specifically through training programs in implementation science and our work to provide technical assistance in implementation research through our university’s Clinical and Translational Science Award (CTSA) program. We also studied grant program announcements, notably the R03, R21, R18, and R01 program announcements in implementation science [ 6 - 9 ]. We studied how successful implementation research R01 grant applications ‘set the stage’ for the proposed study in various sections of the proposal. We conducted a literature search using combinations of the following key words: ‘implementation research,’ ‘implementation studies,’ ‘preliminary studies,’ ‘preliminary data,’ ‘pilot studies,’ ‘pilot data,’ ‘pilot,’ ‘implementation stages,’ ‘implementation phases,’ and ‘feasibility.’ We also drew on published studies describing the introduction and testing of implementation strategies and those that characterize key elements and phases of implementation research [ 10 , 11 ].

From these reviews, we identified ten ingredients that are important in all implementation research grants: the gap between usual care and evidence-based care; the background of the evidence-based treatment to be implemented, its empirical base, and requisites; the theoretical framework for implementation and explicit theoretical justification for the choice of implementation strategies; information about stakeholders’ (providers, consumers, policymakers) treatment priorities; the setting’s (and providers’) readiness to adopt new treatments; the implementation strategies planned or considered in order to implement evidence-based care; the study team’s experience with the setting, treatment, or implementation process and the research environment; the feasibility and requisites of the proposed methods; the measurement and analysis of study variables; and the health delivery setting’s policy/funding environment, leverage or support for sustaining change.

Given the sparse literature on the importance of preliminary studies for implementation science grant applications, we ‘vetted’ our list of grant application components with a convenience sample of experts. Ultimately, nine experts responded to our request, including six members of the Implementation Science editorial board. We asked the experts to rate the importance of each of the ten elements, rating them as ‘1: Very important to address this in the application,’ ‘2: Helpful but not necessary to the application,’ or ‘3: Not very important to address’ within the context of demonstrating investigator capacity and study feasibility. Respondents were also asked whether there are any additional factors that were not listed.

While all ten ingredients below were considered important for a successful application, several experts noted that their importance varies according to the aims of the application. For example, one expert affirmed the importance of the setting’s readiness to change, but noted that it may not be crucial to address in a given proposal: ‘the setting’s readiness may be unimportant to establish or report prior to the study, because the study purpose may be to establish an answer to this question.’ However, another maintained, ‘in a good grant application, you have to dot all the ‘I’s’ and cross all the ‘T’s.’ I consider all these important.’ One expert noted that applications might need to argue the importance of implementation research itself, including the importance of closing or reducing gaps in the quality of care. This was viewed as particularly important when the study section reviewing the grant may not understand or appreciate implementation research. In these cases, it may be important to define and differentiate implementation research from other types of clinical and health services research. For example, it may be useful to situate one’s proposal within the Institute of Medicine’s ‘prevention research cycle,’ which demonstrates the progression from pre-intervention, efficacy, and effectiveness research to dissemination and implementation studies that focus on the adoption, sustainability, and scale-up of interventions [ 12 ]. It may also be important to convey that implementation research is very complex, necessitating the use of multiple methods, a high degree of stakeholder involvement, and a fair amount of flexibility in order to ensure that implementers will be able to respond appropriately to unforeseen barriers.

Ten key ingredients of a competitive implementation research grant application

As emphasized at the beginning of this article, the essential ingredient in a successful implementation science proposal is a research question that is innovative and, when answered, can advance the field of implementation science. Assuming that an important question has been established for potential reviewers, we propose that the following ten ingredients can help investigators demonstrate their capacity to conduct the study and the feasibility of completing the study as proposed. For each ingredient, we provide examples of how preliminary data, background literature, and narrative detail in the application can strengthen the application.

The care gap, or quality gap, addressed in the application

The primary rationale for all implementation efforts, and thus a key driver in implementation science, is discovering how to reduce gaps in healthcare access and quality or, from a public health perspective, the gap between Healthy People 2020 [ 13 ] goals and current health status. Accordingly, implementation research proposals should provide clear evidence that gaps exist and that there is room for improvement and impact through the proposed implementation effort. This is a primary way of demonstrating the public health significance of the proposed work.

Gaps in the quality of programs, services, and healthcare can be measured and documented at the population-, organization-, and provider-levels [ 14 ]. Several kinds of preliminary data can demonstrate the quality gap to be reduced through the proposed implementation effort. For example, investigators can emphasize the burden of disease through data that reflect its morbidity, mortality, quality of life, and cost [ 14 ]. An implementation research grant should cite service system research that demonstrates unmet need [ 15 ], the wide variation in the use of evidence-based treatments in usual care [ 16 - 19 ], or the association between the burden of disease and variations in the use of guidelines [ 20 ]. Investigators can also document that few providers adopt evidence-based treatments [ 21 , 22 ], that evidence-based treatments or programs have limited reach [ 23 ], or that penetration [ 24 ] into a system of care can be addressed by the implementation study. Regardless of the specific approach to documenting a quality gap, investigators should use rigorous methods and involve all relevant stakeholders [ 14 ]. In fact, stakeholders can demonstrate their involvement and endorse quality gaps through letters of support attesting to the lack of evidence-based services in usual care.

The evidence-based treatment to be implemented

A second key ingredient in implementation research proposals is the evidence-based program, treatment, policies, or set of services whose implementation will be studied in the proposed research [ 25 - 27 ]. The research ‘pipeline’ [ 28 - 30 ] contains many effective programs and treatments in a backlog, waiting to be implemented. Moreover, many health settings experience a huge demand for better care. An appropriate evidence-based treatment contributes to the project’s public health significance and practical impact, presuming of course that it will be studied in a way that contributes to implementation science.

Implementation research proposals must demonstrate that the evidence-based service is ready for implementation. The strength of the empirical evidence for a given guideline or treatment [ 31 , 32 ], a key part of ‘readiness,’ can be demonstrated in a variety of ways; in some fields, specific thresholds must be met before an intervention is deemed ‘evidence-based’ or ‘empirically-supported’ [ 33 - 35 ]. For example, Chambless et al. [ 35 ] suggest that interventions should demonstrate efficacy by being shown to be superior to placebos or to another treatment in at least two between group design experiments; or by showing efficacy in a large series of single case design experiments. Further, Chambless et al. [ 35 ] note that the experiments must have been conducted with treatment manuals, the characteristics of the samples must have been clearly specified, and the effects must have been demonstrated by at least two different investigators or investigative teams.

The strength of evidence for a given treatment can also be classified using the Cochrane EPOC’s criteria for levels of evidence, which considers randomized controlled trials, controlled clinical trials, time series designs, and controlled before-and-after studies as appropriate [ 36 ]. Researchers who come to implementation research as effectiveness researchers or as program or treatment developers are well positioned, because they can point to their prior research as part of their own background work. Other researchers can establish readiness for implementation by reviewing evidence for the treatment or program as part of the background literature review, preferably relying on well-conducted systematic reviews and meta-analyses of randomized-controlled trials (if available). At a minimum, ‘evaluability assessment’ [ 37 ] can help reflect what changes or improvements are needed to optimize effectiveness given the context of the implementation effort.

Conceptual model and theoretical justification

Any research striving for generalizable knowledge should be guided by and propose to test conceptual frameworks, models, and theories [ 38 ]. Yet, theory has been drastically underutilized and underspecified in implementation research [ 38 - 40 ]. For example, in a review of 235 implementation studies, less than 25% of the studies employed theory in any way, and only 6% were explicitly theory-based [ 39 ]. While translating theory into research design is not an easy task [ 36 ], the absence of theory in implementation research has limited our ability to specify key contextual variables and to identify the precise mechanisms by which implementation strategies exert their effects.

McDonald et al. [ 41 ] present a useful hierarchy of theories and models, which serves to organize the different levels of theory and specify the ways in which they can be useful in implementation research. They differentiate between conceptual models, frameworks, and systems, which are used to represent global ideas about a phenomenon and theory, which is an ‘organized, heuristic, coherent, and systematic set of statements related to significant questions that are communicated in a meaningful whole’ [ 41 ]. Within the realm of theory, they differentiate between grand or macro theories ( e.g. , Rogers’ Diffusion of Innovations theory [ 26 ]), mid-range theories ( e.g. , transtheoretical model of change [ 42 ]), and micro-theories ( e.g. , feedback intervention theory [ 43 ]). Though models, frameworks, and systems are generally at a higher level of abstraction than theories, it is important to note that the level of abstraction varies both between and within the categories of the hierarchy. The thoughtful integration of both conceptual models and theories can substantially strengthen an application.

Conceptual models, frameworks, and systems can play a critical role in anchoring a research study theoretically by portraying the key variables and relationships to be tested. Even studies that address only a subset of variables within a conceptual model need to be framed conceptually, so that reviewers perceive the larger context (and body of literature) that a particular study proposes to inform. Given the confusion surrounding definitions and terminology within the still-evolving field of dissemination and implementation [ 44 , 45 ], grant proposals need to employ consistent language, clear definitions for constructs, and the most valid and reliable measures for the constructs that correspond to the guiding conceptual framework or theoretical model. Proposal writers should be cautioned that the theory or conceptual model used to frame the study must be used within the application. A mere mention will not suffice. A conceptual model can help frame study questions and hypotheses, anchor the background literature, clarify the constructs to be measured, and illustrate the relationships to be evaluated or tested. The application must also spell out how potential findings will inform the theory or model.

Numerous models and frameworks can inform implementation research. For example, Glasgow et al.'s RE-AIM framework [23] can inform evaluation efforts in the area of implementation science. Similarly, Proctor et al. [46] have proposed a model that informs evaluation by differentiating implementation, service system, and clinical outcomes, and by identifying a range of implementation outcomes that can be assessed [24]. Damschroder et al.'s Consolidated Framework for Implementation Research [10] identifies five domains that are critical to successful implementation: intervention characteristics (evidentiary support, relative advantage, adaptability, trialability, and complexity); the outer setting (patient needs and resources, organizational connectedness, peer pressure, external policy and incentives); the inner setting (structural characteristics, networks and communications, culture, climate, readiness for implementation); the characteristics of the individuals involved (knowledge, self-efficacy, stage of change, identification with the organization, etc.); and the process of implementation (planning, engaging, executing, reflecting, evaluating). Others have published stage or phase models of implementation. For example, the Department of Veterans Affairs' QUERI initiative [47] specifies a four-phase model spanning pilot projects, small clinical trials, regional implementation, and implementation on the national scale; Aarons, Hurlburt, and Horwitz [48] developed a four-phase model of exploration, adoption/preparation, active implementation, and sustainment; and Magnabosco [49] distinguishes between pre-implementation, initial implementation, and sustainability planning phases.

McDonald et al. [41] note that grand theories are similar to conceptual models and generally represent theories of change. They differentiate between classical models of change, which emphasize natural or passive change processes, such as Rogers' diffusion of innovations theory [26], and planned models of change, which specify central elements of active implementation efforts. Investigators may find it more helpful to draw from mid-range theories, because these address the mechanisms of change at various levels of the implementation context [26]. For example, social psychological theories, organizational theories, cognitive psychology theories, educational theories, and a host of others may be relevant to a proposed project. While conceptual models are useful in framing a study theoretically and providing a 'big picture' of the hypothesized relationships between variables, mid-range theories can be more helpful in justifying the selection of specific implementation strategies and in specifying the mechanisms by which they may exert their effects. Given the different roles that theory can play in implementation research, investigators would be wise to consider relevant theories at multiple levels of the theoretical hierarchy when preparing their proposals. A detailed review of conceptual models and theories is beyond the scope of this article; however, several authors have produced invaluable syntheses that investigators may find useful [10,41,50-56].

Stakeholder priorities and engagement in change

Successful implementation of evidence-based interventions depends largely on their fit with the preferences and priorities of those who shape, deliver, and participate in healthcare. Stakeholders in implementation, and thus in implementation research, include treatment or guideline developers, researchers, administrators, providers, funders, community-based organizations, consumers, families, and perhaps legislators who shape reimbursement policies (see Mendel et al.'s framework [57] for an outline of different levels of stakeholders). These stakeholders are likely to vary in their knowledge, perceptions, and preferences for healthcare. Their perspectives contribute substantially to the context of implementation and must be understood and addressed if the implementation effort is to succeed. A National Institute of Mental Health Council workgroup report [58] calls for the engagement of multiple stakeholder perspectives, from concept development to implementation, in order to improve the sustainability of evidence-based services in real-world practice. The engagement of key stakeholders affects the impact of proposed implementation efforts, the sustainability of the proposed change, and the feasibility and ultimate success of the proposed research project. Thus, implementation research grant proposals should convey the extent and manner in which key stakeholders are engaged in the project.

Stakeholders and researchers can forge different types of collaborative relationships. Lindamer et al. [59] describe three approaches that vary with respect to the level of stakeholder and community participation in decisions about the research. In the 'community-targeted' approach, stakeholders are involved in recruitment and in the dissemination of results. In the 'community-based' approach, stakeholders participate in the selection of research topics, but the researcher makes the final decisions on study design, methodology, and data analysis. Finally, the 'community-driven,' or community-based participatory research (CBPR), approach entails participation of stakeholders in all aspects of the research. Some authors advocate the CBPR model as a strategy to decrease the gap between research and practice, because it addresses some of the barriers to implementation and dissemination [60-62] by enhancing the external validity of the research and promoting the sustainability of the intervention. Kerner et al. [62] note:

‘When community-based organizations are involved as full partners in study design, implementation, and evaluation of study findings, these organizations may be more amenable to adopting the approaches identified as being effective, as their tacit knowledge about ‘what works’ would have been evaluated explicitly through research.’

Stakeholder analysis can be carried out to evaluate and understand stakeholders' interests, interrelations, influences, preferences, and priorities. The information gathered from stakeholder analysis can then be used to develop strategies for collaborating with stakeholders, to facilitate the implementation of decisions or organizational objectives, or to understand the future of policy directions [63,64].

Implementation research grant applications are stronger when preliminary data, qualitative or quantitative, reflect stakeholder preferences regarding the proposed change. Engagement is also reflected in publications co-authored by the principal investigator (PI) and key stakeholders, or in methodological details that reflect stakeholder priorities. Letters of support are a minimal reflection of stakeholder investment in the proposed implementation project.

Context: setting's readiness to adopt new services/treatments/programs

Implementation research proposals are strengthened by information that reflects the setting’s readiness, capacity, or appetite for change, specifically around adoption of the proposed evidence-based treatment. This is not to say that all implementation research should be conducted in settings with high appetite for change. Implementation research is often criticized for disproportionate focus on settings that are eager and ready for change. ‘Cherry picking’ sites, where change is virtually guaranteed, or studying implementation only with eager and early adopters, does not produce knowledge that can generalize to usual care, where change is often challenging. The field of implementation science needs information about the process of change where readiness varies, including settings where change is resisted.

Preliminary data on the organizational and policy context and its readiness for change can strengthen an application. Typically viewed as 'nuisance' variance to be controlled in efficacy and effectiveness research, contextual factors are key in implementation research [65-67]. The primacy of context is reflected in the choice of 'it's all about context' as a theme at the 2011 NIH Training Institute in Dissemination and Implementation Research in Health [68]. Because the organizational, policy, and funding context may be among the strongest influences on implementation outcomes, context needs to be examined front and center in implementation research [69]. A number of scales are available to capture one key aspect of context: the setting's readiness or capacity for change. Weiner et al.'s extensive review of the conceptualization and measurement of organizational readiness for change [70] identified 43 different instruments, though the authors acknowledged substantial problems with the reliability and validity of many of the measures. Due in part to these issues, work in this area is ongoing [71,72].

Other approaches to assessing readiness have focused on organizational culture, climate, and work attitudes [73], and on providers' attitudes towards evidence-based practices [21,22,74]. Furthermore, prospective identification of implementation barriers and facilitators can be helpful in demonstrating readiness to change, increasing reviewers' confidence that the PI has thoroughly assessed the implementation context, and informing the selection of implementation strategies (discussed in the following section) [75-77]. An evaluation of barriers and facilitators can be conducted through qualitative [78-80] or survey [81,82] methodology, and a number of scales for measuring implementation barriers have been developed [74,83,84]. Letters from agency partners or policy makers, while weaker than data, can also convey the setting's readiness and capacity for change. Letters are stronger when they address the alignment of the implementation effort with setting or organizational priorities or with current or emergent policies.

Implementation strategy/process

Though the assessment of implementation barriers can play an important role in implementation research, the 'rising bar' in the field demands that investigators move beyond the study of barriers to research that generates knowledge about the implementation processes and strategies that can overcome them. Accordingly, the NIH has prioritized efforts to 'identify, develop, and refine effective and efficient methods, structures, and strategies to disseminate and implement' innovations in healthcare [7].

A number of implementation strategies have been identified and discussed in the literature [36,85-87]. However, as the Improved Clinical Effectiveness through Behavioural Research Group notes [38], the most consistent finding from systematic reviews of implementation strategies is that most are effective some, but not all, of the time, with effect sizes ranging from no effect to large effects. Our ability to determine how, why, when, and for whom these strategies are effective is hampered in large part by the absence of detailed descriptions of implementation strategies [40], the use of inconsistent language [44], and the lack of clear theoretical justification for the selection of specific strategies [39]. Thus, investigators should take great care to provide detailed descriptions of the implementation strategies to be observed or empirically tested. Implementation Science has endorsed [40] the WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions [88] as a means of improving the conduct and reporting of implementation research, and these recommendations will be useful to investigators whose proposals employ implementation strategies. Investigators may also find the Standards for Quality Improvement Reporting Excellence (SQUIRE) helpful [89]; additional design-specific reporting guidelines can be found on the Equator Network website [90]. The selection of strategies must be justified conceptually by drawing upon models and frameworks that outline critical implementation elements [10]. Theory should be used to explain the mechanisms through which implementation strategies are proposed to exert their effects [39], and it may be helpful to clarify the proposed mechanisms of change by developing a logic model and illustrating it in a figure [91].

According to Brian Mittman, implementation strategies should not only be theory-based but also: multifaceted or multilevel (if appropriate); robust or readily adaptable; feasible and acceptable to stakeholders; compelling, saleable, trialable, and observable; sustainable; and scalable [92,93]. We also emphasize taking stock of the budget impact of implementation strategies [94], as well as any cost and cost-effectiveness data related to them [95]; a hypothetical tally below illustrates the idea. Although budget impact is a key concern of administrators, and some funding agencies require budget impact analysis, implementation science to date suffers from a dearth of economic evaluations to draw on [96,97].
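As a purely hypothetical sketch (every figure below is invented for illustration and is not drawn from the cited task force report [94]), the first-year marginal cost of a training-plus-consultation strategy at a single agency might be tallied as:

\[
\underbrace{5 \times \$2000}_{\text{provider training}} \;+\; \underbrace{12 \times \$500}_{\text{monthly consultation}} \;+\; \underbrace{5 \times 24 \times \$40}_{\text{provider time off caseload}} \;=\; \$20{,}800
\]

Even a rough tally of this kind, scaled across participating sites and years, speaks directly to the budget questions that administrators and funders are likely to raise.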

The empirical evidence for the effectiveness of multifaceted strategies is mixed: early research touted their benefits [98,99], while a systematic review of 235 implementation trials by Grimshaw et al. found no relationship between the number of component interventions and the effects of multifaceted interventions [100]. However, Wensing et al. [101] note that while multifaceted interventions are assumed to address multiple barriers to change, many focus on only one barrier. For example, providing training and consultation is a multifaceted implementation strategy, but it primarily serves to increase provider knowledge and does not address other implementation barriers. Thus, Wensing et al. [101] argue that multifaceted interventions could be more effective if they addressed different types of implementation barriers (e.g., provider knowledge and the organizational context). While the methods for tailoring clinical interventions and implementation strategies to local contexts need to be improved [102], intervention mapping [103] and the recently developed 'behaviour change wheel' [104] are two promising approaches.

Proposals that employ multifaceted and multilevel strategies addressing prospectively identified implementation barriers [102] may be more compelling to review committees, but mounting complex experiments may be beyond the reach of many early-stage investigators and many grant mechanisms. It is, however, within the scope of R03-, R21-, and R34-supported research to develop implementation strategies and to conduct pilot tests of their feasibility and acceptability, work that can strengthen the case for sustainability and scalability. Proposal writers should provide preliminary work for implementation strategies in much the same way that intervention developers do, for example by providing manuals or protocols to guide their use and methods to gauge their fidelity. Such work is illustrated in the pilot study by Kauth et al. [105], which demonstrated that an external facilitation strategy intended to increase the use of cognitive behavioral therapy within Veterans Affairs clinics was a promising and low-cost strategy; such pilot data would likely bolster reviewers' confidence that the strategy is feasible, scalable, and ultimately sustainable. Investigators should also plan to document any modifications to the intervention and, if possible, incorporate adaptation models into the implementation process, because interventions are rarely implemented without being modified [67,106].

While providing detailed specification of theory-based implementation strategies is critical, it is also imperative that investigators acknowledge the complexity of implementation processes. Aarons and Palinkas [107] comment:

‘It is unrealistic to assume that implementation is a simple process, that one can identify all of the salient concerns, be completely prepared, and then implement effectively without adjustments. It is becoming increasingly clear that being prepared to implement EBP means being prepared to evaluate, adjust, and adapt in a continuing process that includes give and take between intervention developers, service system researchers, organizations, providers, and consumers.’

Ultimately, proposals that reflect the PI's understanding of the complexity of implementing evidence-based practices, and that provide supporting detail about strategies and processes, will be perceived as more feasible to complete with the proposed methods.

Team experience with the setting, treatment, implementation process, and research environment

Grant reviewers are asked to specifically assess a PI's capacity to successfully complete a proposed study. Applications that convey the team's experience with the study setting, the treatment whose implementation is being studied, and the implementation process help demonstrate the capacity and feasibility needed to complete an implementation research project [108].

Note that NIH scores the investigative team and the research environment separately (http://grants.nih.gov/grants/writing_application.htm), but the purpose of both sections is to demonstrate the capacity to carry out the study as proposed. Investigators can convey capacity in a variety of ways. Chief among them is building a strong research team whose members bring depth and experience in areas the PI does not yet have. Implementation research exemplifies multidisciplinary team science, informed by a diverse range of substantive and methodological fields [96,109]. A team that brings the needed disciplines and skill sets directly to the project enhances its likelihood of success. Early-stage implementation researchers who collaborate or partner with senior investigators reassure reviewers that the proposed work will benefit from the senior team members' experience and expertise. Similarly, collaborators play important roles in complementing, or rounding out, the PI's disciplinary perspective and methodological skill set. Early-career investigators should therefore surround themselves with more established colleagues who bring knowledge and experience in areas key to the study aims and methods. The narrative should cite team members' relevant work, and their prior work can be addressed in a discussion of preliminary studies. Additionally, the new formats for NIH biosketches and budget justifications enable a clear portrayal of what each team member brings to the proposed study.

For NIH applications, the research environment is detailed in the resources and environment section. Here, an investigator can describe the setting's track record in implementation research; the research centers, labs, and offices the PI can draw on; and structural and historic ties to healthcare settings. For example, a PI can describe how the project will draw upon the university's Clinical and Translational Science Award (CTSA) program [110], statistics or design labs, established pools of research staff, and health services research centers. Preliminary studies and biosketches provide additional ways to convey the strengths of the environment and the context within which the proposed study will be launched.

In summary, researchers need to detail the strengths of the research environment, emphasizing in particular the resources, senior investigators, and research infrastructure that can contribute to the success of the proposed study. A strong research environment is especially important for implementation research, which is typically team-based, requires the expertise of multiple disciplines, and depends on strong relationships between researchers and community-based health settings. Investigators who are surrounded by experienced implementation researchers, working in a setting with strong community ties, and drawing on experienced research staff can inspire greater confidence in the proposed study's likelihood of success.

Feasibility of proposed research design and methods

One of the most important functions of preliminary work is to demonstrate the feasibility of the proposed research design and methods. Landsverk [108] urges PIs to consider every possible question reviewers might raise and to address those issues explicitly in the application. Data from small feasibility studies or pilot work on referral flow, participant entry into the study, participant retention, and the extent to which key measures are understood by participants, acceptable for use, and able to capture variability can demonstrate that the proposed methods are likely to work. The methods section should contain as much detail as possible and should lay out possible choice junctures and contingencies should methods not work as planned. It is important not only to justify methodological choices but also to discuss why potential alternatives were not selected. For example, if randomization is not feasible or acceptable to stakeholders, investigators should make that clear. Letters from study site collaborators can support, but should not replace, the narrative's detail on study methods. For example, letters attesting to study sites' willingness to be randomized or to support recruitment for the proposed timeframe can help offset reviewer concerns about some of the real-world challenges of launching implementation studies.

Measurement and analysis

A grant application must specify a measurement plan for each construct in the study's overarching conceptual model or guiding theory, whether those constructs pertain to implementation strategies, the context of implementation, stakeholder preferences and priorities, or implementation outcomes [111]. Yet crafting the study approach section is complicated by the current lack of consensus on methodological approaches to studying implementation processes, measuring implementation context and outcomes, and testing implementation strategies [112,113]. Measurement is a particularly important aspect of study methods because it determines the quality of data. Unlike efficacy and effectiveness studies, implementation research often involves some customization of an intervention to fit the local context; accordingly, measurement plans need to address the intervention's degree of customization versus fidelity [97]. Moreover, implementation science encompasses a broad range of constructs from a variety of disciplines, with little standardization of measures or agreement on definitions of constructs across studies, fields, authors, or research groups, further compounding the burden of presenting a clear and robust measurement plan along with its rationale. Two current initiatives seek to advance the harmonization, standardization, and rigor of measurement in implementation science: the U.S. National Cancer Institute's (NCI) Grid-Enabled Measures (GEM) portal [114] and the Comprehensive Review of Dissemination and Implementation Science Instruments effort supported by the Seattle Implementation Research Conference (SIRC) at the University of Washington [115]. Both initiatives engage the implementation science research community to enhance the quality and harmonization of measures, and their websites are being populated with measures and ratings, affording grant writers an invaluable resource for addressing a key methodological challenge.

Key challenges in crafting the analysis plan for implementation studies include: determining the unit of analysis, given the 'action' at the individual, team, organizational, and policy levels; shaping mediational analyses, given the role of contextual variables; and developing and using appropriate methods for characterizing the speed, quality, and degree of implementation. The proposed study's design, assessment tools, analytic strategies, and analytic tools must address these challenges in some manner [113]. Grant applications that propose to test implementation strategies or processes often provide preliminary data from small-scale pilot studies to examine feasibility and assess sources of variation. However, because power calculations based on small-scale studies are unreliable [116], the magnitude of effects targeted in a full trial should be justified by clinical relevance rather than estimated from small pilots [113]; the illustrative calculation below shows why.
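To make the caution about pilot-based power calculations concrete, consider a hypothetical two-arm pilot with ten participants per arm in which an effect of d = 0.5 is observed (the numbers are invented for illustration). Using the standard large-sample approximation for the standard error of a standardized mean difference,

\[
SE(d) \approx \sqrt{\frac{n_1 + n_2}{n_1 n_2} + \frac{d^2}{2(n_1 + n_2)}} = \sqrt{\frac{20}{100} + \frac{0.25}{40}} \approx 0.45,
\]

the approximate 95% confidence interval is 0.5 plus or minus (1.96 x 0.45), or roughly -0.39 to 1.39: compatible with anything from a harmful effect to a very large benefit, and far too unstable to anchor a power calculation. Clustering raises a related caution for the unit-of-analysis decision: with m participants per site and intraclass correlation rho, the standard design effect 1 + (m - 1)rho shrinks the effective sample size, so that, for example, m = 30 and rho = 0.05 cut the effective n by a factor of about 2.45.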

Policy/funding environment; leverage or support for sustaining change

PIs should ensure that grant applications reflect their understanding of the policy and funding context of the implementation effort. Health policies differ in many ways that affect quality [117], and legal, reimbursement, and regulatory factors affect the adoption and sustainability of evidence-based treatments [118]. Raghavan et al. [119] discuss the policy ecology of implementation and emphasize that greater attention should be paid to the marginal costs associated with implementing evidence-based treatments, including expenses for provider training, supervision, and consultation. Glasgow et al. [120] recently extended their heretofore behaviorally focused RE-AIM framework for public health interventions to health policies, revealing the challenges associated with policy as a practice-change lever.

PIs can address the policy context of the implementation initiative through the narrative, the background literature, letters of support, and the resources and environment section. Proposals that address how the implementation initiative aligns with policy trends are more likely to be viewed as having high public health significance, as well as greater practical impact, feasibility, and sustainability. Notably, it may behoove investigators to address the policy context even when it is unlikely to facilitate implementation, because doing so demonstrates to reviewers that the investigator is not naïve to the challenges and barriers that exist at this level.

Summary

We have identified and discussed ten key ingredients of implementation research grant proposals. The paper reflects the team's experience and expertise in writing for federal funding agencies in the United States; we acknowledge that this will be a strength for some readers and a limitation for international readers, whom we encourage to contribute additional perspectives. Setting the stage with careful background detail and preliminary data may be especially important for implementation research, which poses a unique set of challenges that investigators should anticipate and demonstrate their capacity to manage. Data to set the stage for implementation research may be collected by the study team through preliminary, feasibility, or pilot studies, or the team may draw on others' work, citing background literature to establish readiness for the proposed research.

Every PI struggles with the challenge of fitting the research background, methodological detail, and information that can convey the project's feasibility and likelihood of success into a page-limited application. The relative emphasis on, and thus the length of text addressing, the various sections of a grant proposal varies with the funding mechanism, the application 'call,' and the funding source. For NIH applications, the most attention and detail should be allocated to the study methods, because the 'approach' section is typically weighted most heavily in scoring and because under-specification of, or lack of detail in, the study methodology usually receives the bulk of reviewer criticism. Well-constructed, parsimonious tables, logic models, and figures reflecting key concepts and the analytic plan for testing their relationships all add clarity, focus reviewers' attention, and prevent misperceptions. All implementation research grants need to propose aims, study questions, or hypotheses whose answers will advance implementation science. Beyond this fundamental grounding, proposed implementation studies should address most, if not all, of the ingredients identified here. While no application can include a high level of detail about every ingredient, addressing these components can help assure reviewers of the significance, feasibility, and impact of the proposed research.

Endnotes

a. For more information regarding different grant mechanisms, please see http://grants.nih.gov/grants/funding/funding_program.htm.

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

EKP conceived the idea for this paper and led the writing. BJP, AAB, AMH, and RLS contributed to the conceptualization, literature review, and the writing of this manuscript. All authors read and approved the final manuscript.

Authors’ information

EKP directs the Center for Mental Health Services Research at Washington University in St. Louis (NIMH P30 MH085979), the Dissemination and Implementation Research Core (DIRC) of the Washington University Institute of Clinical and Translational Sciences (NCRR UL1RR024992), and the Implementation Research Institute (NIMH R25 MH080916).

Acknowledgements

Preparation of this paper was supported in part by the National Center for Research Resources, through the Dissemination and Implementation Research Core of Washington University in St. Louis' Institute of Clinical and Translational Sciences (NCRR UL1 RR024992), and by the National Institute of Mental Health, through the Center for Mental Health Services Research (NIMH P30 MH068579), the Implementation Research Institute (NIMH R25 MH080916), and a Ruth L. Kirschstein National Research Service Award (NIMH T32 RR024992). An earlier version of this paper was an invited presentation at an early investigator workshop held at the 4th Annual National Institutes of Health Conference on Advancing the Science of Dissemination and Implementation, March 22, 2011, Bethesda, Maryland.

References

1. Implementation Science. http://www.implementationscience.com.
2. Institute of Medicine. Initial national priorities for comparative effectiveness research. Washington, DC: The National Academies Press; 2009.
3. Agency for Healthcare Research and Quality: Essentials of the Research Plan. http://www.ahrq.gov/fund/esstplan.htm#Preliminary.
4. National Institutes of Health Grant Cycle. http://www.niaid.nih.gov/researchfunding/grant/cycle/Pages/part05.aspx.
5. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34:228–243.
6. Researching Implementation and Change while Improving Quality (R18). http://grants.nih.gov/grants/guide/pa-files/PAR-08-136.html.
7. Dissemination and Implementation Research in Health (R01). http://grants.nih.gov/grants/guide/pa-files/PAR-10-038.html.
8. Dissemination and Implementation Research in Health (R03). http://grants.nih.gov/grants/guide/pa-files/PAR-10-039.html.
9. Dissemination and Implementation Research in Health (R21). http://grants.nih.gov/grants/guide/pa-files/PAR-10-040.html.
10. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4:50.
11. Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI series. Implementation Science. 2008;3:1–9.
12. Institute of Medicine. Preventing mental, emotional, and behavioral disorders among young people: progress and possibilities. Washington, DC: The National Academies Press; 2009.
13. Healthy People 2020. http://www.healthypeople.gov/2020/default.aspx.
14. Kitson A, Straus SE. Identifying the knowledge-to-action gaps. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 60–72.
15. Burns BJ, Phillips SD, Wagner HR, Barth RP, Kolko DJ, Campbell Y, Landsverk J. Mental health need and access to mental health services by youths involved with child welfare: a national survey. J Am Acad Child Adolesc Psychiatry. 2004;43:960–970.
16. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645.
17. Raghavan R, Inoue M, Ettner SL, Hamilton BH. A preliminary analysis of the receipt of mental health services consistent with national standards among children in the child welfare system. Am J Public Health. 2010;100:742–749.
18. Wang PS, Berglund P, Kessler RC. Recent care of common mental disorders in the United States. J Gen Intern Med. 2000;15:284–292.
19. Zima BT, Hurlburt MS, Knapp P, Ladd H, Tang L, Duan N, Wallace P, Rosenblatt A, Landsverk J, Wells KB. Quality of publicly-funded outpatient specialty mental health care for common childhood psychiatric disorders in California. J Am Acad Child Adolesc Psychiatry. 2005;44:130–144.
20. Brooke BS, Dominici F, Pronovost PJ, Makary MA, Schneider E, Pawlik TM. Variations in surgical outcomes associated with hospital compliance with safety guidelines. Surgery. 2012;151:651–659.
21. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6:61–74.
22. Aarons GA, Cafri G, Lugo L, Sawitzky A. Expanding the domains of attitudes towards evidence-based practice: the Evidence-Based Practice Attitude Scale-50. Adm Policy Ment Health. 2012;39(5):331–340.
23. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–1327.
24. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.
25. Bond GR, Drake R, Becker D. Beyond evidence-based practice: nine ideal features of a mental health intervention. Research on Social Work Practice. 2010;20:493–501.
26. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
27. Grol R, Wensing M. Characteristics of successful innovations. In: Grol R, Wensing M, Eccles M, editors. Improving patient care: the implementation of change in clinical practice. Edinburgh: Elsevier; 2005. p. 60–70.
28. Diner BM, Carpenter CR, O'Connell T, Pang P, Brown MD, Seupaul RA, Celentano JJ, Mayer D. Graduate medical education and knowledge translation: role models, information pipelines, and practice change thresholds. Acad Emerg Med. 2007;14:1008–1014.
29. Westfall JM, Mold J, Fagnan L. Practice-based research: 'Blue Highways' on the NIH roadmap. JAMA. 2007;297:403–406.
30. Kleinman MS, Mold JW. Defining the components of the research pipeline. Clin Transl Sci. 2009;2:312–314.
31. Oxman AD. Grading quality of evidence and strength of recommendations. BMJ. 2004;328:1490–1494.
32. Ebell MH, Siwek J, Weiss BD, Woolf SH, Susman J, Ewigman B, Bowman M. Strength of Recommendation Taxonomy (SORT): a patient-centered approach to grading evidence in the medical literature. J Am Board Fam Pract. 2004;17:59–67.
33. Roth A, Fonagy P. What works for whom? A critical review of psychotherapy research. New York: Guilford; 2005.
34. Weissman MM, Verdeli H, Gameroff MJ, Bledsoe SE, Betts K, Mufson L, Fitterling H, Wickramaratne P. National survey of psychotherapy training in psychiatry, psychology, and social work. Arch Gen Psychiatry. 2006;63:925–934.
35. Chambless DL, Baker MJ, Baucom DH, Beutler LE, Calhoun KS, Crits-Christoph P, Daiuto A, DeRubeis R, Detweiler J, Haaga DAF, et al. Update on empirically validated therapies, II. The Clinical Psychologist. 1998;51:3–16.
36. Cochrane Effective Practice and Organisation of Care Group. Data collection checklist: EPOC measures for review authors. 2002.
37. Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D. Evaluability assessment to improve public health policies, programs, and practices. Annu Rev Public Health. 2010;31:213–233.
38. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Designing theoretically-informed implementation interventions. Implementation Science. 2006;1:4.
39. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implementation Science. 2010;5:1–6.
40. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implementation Science. 2009;4:40.
41. McDonald KM, Graham ID, Grimshaw J. Toward a theoretical basis for quality improvement interventions. In: Shojania KG, McDonald KM, Wachter RM, Owens DK, editors. Closing the quality gap: a critical analysis of quality improvement strategies. Rockville, MD: Agency for Healthcare Research and Quality; 2004. p. 27–40.
42. Prochaska JO, Velicer WF. The transtheoretical model of health behavior change. Am J Health Promot. 1997;12:38–48.
43. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996;119:254–284.
44. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus SE. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implementation Science. 2010;5:1–11.
45. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14:117–123.
46. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36:24–34.
47. Stetler CB, McQueen L, Demakis J, Mittman BS. An organizational framework and strategic implementation for systems-level change to enhance research-based practice: QUERI series. Implementation Science. 2008;3:1–11.
48. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
49. Magnabosco JL. Innovations in mental health services implementation: a report on state-level data from the U.S. evidence-based practices project. Implementation Science. 2006;1:1–11.
50. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14:26–33.
51. Grol R, Wensing M, Hulscher M, Eccles M. Theories on implementation of change in healthcare. In: Grol R, Wensing M, Eccles M, editors. Improving patient care: the implementation of change in clinical practice. Edinburgh: Elsevier; 2005. p. 15–40.
52. Grol R, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M. Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. 2007;85:93–138.
53. Denis J-L, Lehoux P. Organizational theory. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 215–225.
54. KT Theories Group. Planned action theories. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 185–195.
55. Hutchinson A, Estabrooks CA. Cognitive psychology theories of change. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 196–205.
56. Hutchinson A, Estabrooks CA. Educational theories. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 206–214.
57. Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: a framework for building evidence on dissemination and implementation research. Adm Policy Ment Health. 2008;35:21–37.
58. National Advisory Mental Health Council's Services Research and Clinical Epidemiology Workgroup. The road ahead: research partnerships to transform services. Bethesda, MD: National Institute of Mental Health; 2006.
59. Lindamer LA, Lebowitz B, Hough RL, Garcia P, Aguirre A, Halpain MC, et al. Establishing an implementation network: lessons learned from community-based participatory research. Implementation Science. 2009;4:17.
60. Chen PG, Diaz N, Lucas G, Rosenthal MS. Dissemination of results in community-based participatory research. Am J Prev Med. 2010;39:372–378.
61. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. Am J Public Health. 2010;100:S40–S46.
62. Kerner J, Rimer B, Emmons K. Dissemination research and research dissemination: how can we close the gap? Health Psychol. 2005;24:443–446.
63. Brugha R, Varvasovszky Z. Stakeholder analysis: a review. Health Policy Plan. 2000;15:239–246.
64. Varvasovszky Z, Brugha R. How to do (or not to do) a stakeholder analysis. Health Policy Plan. 2000;15:338–345.
65. Chambers DA. Advancing the science of implementation: a workshop summary. Adm Policy Ment Health. 2008;35:3–10.
66. Glasgow RE, Klesges LM, Dzewaltowski DA, Bull SS, Estabrooks P. The future of health behavior change research: what is needed to improve translation of research into health promotion practice? Ann Behav Med. 2004;27:3–12.
67. Schoenwald SK, Hoagwood K. Effectiveness, transportability, and dissemination of interventions: what matters when? Psychiatr Serv. 2001;52:1190–1197.
68. Training Institute for Dissemination and Implementation Research in Health. http://conferences.thehillgroup.com/OBSSRinstitutes/TIDIRH2011/index.html.
69. Dearing J. Evolution of diffusion and dissemination theory. J Public Health Manag Pract. 2008;14:99–108.
70. Weiner BJ, Amick H, Lee S-YD. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65:379–436.
71. Stamatakis K. Measurement properties of a novel survey to assess stages of organizational readiness for evidence-based practice in community prevention programs. Presented at the 4th Annual National Institutes of Health Conference on the Science of Dissemination and Implementation; Bethesda, MD; 2011.
72. Gagnon M-P, Labarthe J, Legare F, Ouimet M, Estabrooks CA, Roch G, Ghandour EK, Grimshaw J. Measuring organizational readiness for knowledge translation in chronic care. Implementation Science. 2011;6:72.
73. Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, Green P. Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Health. 2008;35:98–113.
74. Larson E. A tool to assess barriers to adherence to hand hygiene guideline. Am J Infect Control. 2004;32:48–51.
75. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004;180:S57–S60.
76. Légaré F. Assessing barriers and facilitators to knowledge use. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009. p. 83–93.
77. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud P-AC, Rubin HR. Why don't physicians follow clinical practice guidelines? JAMA. 1999;282:1458–1465.
78. Forsner T, Hansson J, Brommels M, Wistedt AA, Forsell Y. Implementing clinical guidelines in psychiatry: a qualitative study of perceived facilitators and barriers. BMC Psychiatry. 2010;10:1–10.
79. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, Callaghan J, Holter M. Barriers to evidence-based practice implementation: results of a qualitative study. Community Ment Health J. 2010;46:112–118.
80. Manuel JI, Mullen EJ, Fang L, Bellamy JL, Bledsoe SE. Preparing social work practitioners to use evidence-based practice: a comparison of experiences from an implementation project. Research on Social Work Practice. 2009;19:613–627.
81. Chenot J-F, Scherer M, Becker A, Donner-Banzhoff N, Baum E, Leonhardt C, Kellar S, Pfingsten M, Hildebrandt J, Basler H-D, Kochen MM. Acceptance and perceived barriers of implementing a guideline for managing low back pain in general practice. Implementation Science. 2008;3:1–6.
82. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010;125:736–742.
83. Wensing M, Grol R. Methods to identify implementation problems. In: Grol R, Wensing M, Eccles M, editors. Improving patient care: the implementation of change in clinical practice. Edinburgh: Elsevier; 2005. p. 109–120.
84. Funk SG, Champagne MT, Wiese RA, Tornquist EM. BARRIERS: the Barriers to Research Utilization Scale. Appl Nurs Res. 1991;4:39–45.
85. Grol R, Wensing M, Eccles M. Improving patient care: the implementation of change in clinical practice. Edinburgh: Elsevier; 2005.
86. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–157.
87. Straus S, Tetroe J, Graham ID. Knowledge translation in health care: moving from evidence to practice. Hoboken, NJ: Wiley-Blackwell; 2009.
88. Recommendations to improve reporting of the content of behaviour change interventions. http://interventiondesign.co.uk/.
89. Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S. Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care. 2008;17:i3–i9.
90. Equator Network. http://www.equator-network.org/.
91. Goeschel CA, Weiss WM, Pronovost PJ. Using a logic model to design and evaluate quality and patient safety improvement programs. 2012:330–337.
92. Implementation Research Institute. http://cmhsr.wustl.edu/Training/IRI/Pages/ImplementationResearchTraining.aspx.
93. Mittman BS. Criteria for peer review of D/I funding applications. Presented at the Implementation Research Institute; St. Louis, MO; 2010.
94. Mauskopf JA, Sullivan SD, Annemans L, Caro J, Mullins CD, Nuijten M, Orlewska E, Watkins J, Trueman P. Principles of good practice for budget impact analysis: report of the ISPOR Task Force on Good Research Practices: Budget Impact Analysis. Value Health. 2007;10:336–347.
95. Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. p. 94–113.
96. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, Glasziou P, Ilott I, Kinmonth A-L, Leng G, et al. An implementation research agenda. Implementation Science. 2009;4:1–7.
97. Glasgow RE. Critical measurement issues in translational research. Research on Social Work Practice. 2009;19:560–568.
98. Wensing M, van der Weijden T, Grol R. Implementing guidelines and innovations in general practice: which interventions are effective? Br J Gen Pract. 1998;48:991–997.
99. Solberg LI, Brekke ML, Fazio CJ, Fowles J, Jacobsen DN, Kottke TE, Mosser G, O'Connor PJ, Ohnsorg KA, Rolnick SJ. Lessons from experienced guideline implementers: attend to many factors and use multiple strategies. Jt Comm J Qual Improv. 2000;26:171–188.
100. Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8(6):1–72.
101. Wensing M, Bosch M, Grol R. Selecting, tailoring, and implementing knowledge translation interventions. In: Straus S, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. Oxford, UK: Wiley-Blackwell; 2009. p. 94–113.
102. Baker R, Camosso-Stefanovic J, Gilliss CL, Shaw EJ, Cheater F, Flottorp S, Robertson N. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010:CD005470.
103. Bartholomew LK, Parcel GS, Kok G, Gottlieb NH. Planning health promotion programs: an intervention mapping approach. San Francisco: Jossey-Bass; 2011.
104. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implementation Science. 2011;6:42.
105. Kauth MR, Sullivan G, Blevins D, Cully JA, Landes RD, Said Q, Teasdale TA. Employing external facilitation to implement cognitive behavioral therapy in VA clinics: a pilot study. Implementation Science. 2010;5:75.
106. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, Silovsky JF, Hecht DB, Chaffin MJ. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science. 2012;7:32.
107. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: service provider perspectives. Adm Policy Ment Health. 2007;34:411–419.
108. Landsverk J. Creating interdisciplinary research teams and using consultants. In: Stiffman AR, editor. The field research survival guide. New York: Oxford University Press; 2009. p. 127–145.
109. Institute of Medicine. The state of quality improvement and implementation research: workshop summary. Washington, DC: The National Academies Press; 2007.
110. Zerhouni EA, Alving B. Clinical and Translational Science Awards: a framework for a national research agenda. Transl Res. 2006;148:4–5.
111. Proctor EK, Brownson RC. Measurement issues in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. p. 261–280.
112. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.
113. Landsverk J, Brown CH, Chamberlain P, Palinkas LA, Ogihara M, Czaja S, Goldhaber-Fiebert JD, Rolls-Reutz JA, Horwitz SM. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. p. 225–260.
114. Grid-Enabled Measures database. https://www.gem-beta.org/Public/Home.aspx.
115. Instrument Review Project: a comprehensive review of dissemination and implementation science instruments. http://www.seattleimplementation.org/sirc-projects/sirc-instrument-project/.
116. Kraemer HC, Mintz J, Noda A, Tinklenberg J, Yesavage JA. Caution regarding the use of pilot studies to guide power calculations for study proposals. Arch Gen Psychiatry. 2006;63:484–489.
117. Institute of Medicine. Improving the quality of health care for mental and substance-use conditions. Washington, DC: The National Academies Press; 2006.
118. Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in behavioral health: agency director perspectives. Adm Policy Ment Health. 2007;34:479–488.
119. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science. 2008;3:1–9.
120. Jilcott S, Ammerman A, Sommers J, Glasgow RE. Applying the RE-AIM framework to assess the public health impact of policy change. Ann Behav Med. 2007;34:105–114.


Funding

This work was supported by NCATS NIH HHS (grant TL1 TR000449); NCRR NIH HHS (grants UL1 RR024992 and T32 RR024992); NIDDK NIH HHS (grant P30 DK092950); and NIMH NIH HHS (grants P30 MH068579 and R25 MH080916).

Grant Proposal – Example, Template and Guide

A grant proposal is a written document that outlines a request for funding from a grant-making organization, such as a government agency, foundation, or private donor. Its purpose is to present a compelling case for why an individual, organization, or project deserves financial support.

Grant Proposal Outline

While the structure and specific sections of a grant proposal vary with the funder’s requirements, here is a common outline you can use as a starting point; a short pre-submission checklist sketch follows the list.

  • Executive Summary: brief overview of the project and its significance; summary of the funding request and project goals; key highlights and anticipated outcomes.
  • Introduction: background information on the issue or problem being addressed; explanation of the project’s relevance and importance; clear statement of the project’s objectives.
  • Needs Assessment: detailed description of the problem or need to be addressed; supporting evidence and data demonstrating the extent and impact of the problem; identification of the target population or beneficiaries.
  • Goals and Objectives: broad goals that describe the desired outcomes of the project; specific, measurable, achievable, relevant, and time-bound (SMART) objectives that contribute to the goals.
  • Methods and Approach: description of the strategies, activities, and interventions to achieve the objectives; explanation of the implementation plan, timeline, and key milestones; roles and responsibilities of project staff and partners.
  • Evaluation and Monitoring: plan for assessing the project’s effectiveness and measuring its impact; description of the data collection methods, tools, and indicators used for evaluation; explanation of how results will be used to improve the project.
  • Budget: comprehensive breakdown of project expenses, including personnel, supplies, equipment, and other costs; clear justification for each budget item; information about any matching funds or in-kind contributions, if applicable.
  • Sustainability: explanation of how the project will be sustained beyond the grant period; discussion of long-term funding strategies, partnerships, and community involvement; description of how the project will continue to address the identified problem.
  • Organizational Capacity: overview of the organization’s mission, history, and track record; description of the organization’s experience and qualifications related to the proposed project; summary of key staff and their roles.
  • Conclusion: recap of the project’s goals, objectives, and anticipated outcomes; appreciation for the funder’s consideration; contact information for further inquiries.
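The outline above doubles as a pre-submission checklist. Purely as an illustration (the section names mirror the outline, and the draft dictionary is an invented stand-in for a real manuscript), a few lines of Python can flag sections that are still missing or empty:

    # Hypothetical pre-submission check: confirm each section of the
    # outline above is present and non-empty in a draft proposal.
    REQUIRED_SECTIONS = [
        "Executive Summary", "Introduction", "Needs Assessment",
        "Goals and Objectives", "Methods and Approach",
        "Evaluation and Monitoring", "Budget", "Sustainability",
        "Organizational Capacity", "Conclusion",
    ]

    def missing_sections(draft):
        """Return outline sections that are absent or still empty in the draft."""
        return [s for s in REQUIRED_SECTIONS if not draft.get(s, "").strip()]

    # Invented draft with one section left blank and one missing entirely.
    draft = {s: "..." for s in REQUIRED_SECTIONS}
    draft["Sustainability"] = ""
    del draft["Budget"]

    for section in missing_sections(draft):
        print("Still to write:", section)

Running the sketch prints the two unfinished sections, which is exactly the kind of mechanical gap a reviewer should never have to catch.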

Grant Proposal Template

Here is a template for a grant proposal that you can use as a starting point. Remember to customize and adapt it based on the specific requirements and guidelines provided by the funding organization.

Dear [Grant-making Organization Name],

Executive Summary:

I. Introduction:

II. Needs Assessment:

III. Goals and Objectives:

IV. Project Methods and Approach:

V. Evaluation and Monitoring:

VI. Budget:

VII. Sustainability:

VIII. Organizational Capacity and Expertise:

IX. Conclusion:

Thank you for considering our grant proposal. We believe that this project will make a significant impact and address an important need in our community. We look forward to the opportunity to discuss our proposal further.

Grant Proposal Example

Here is an example of a grant proposal to provide you with a better understanding of how it could be structured and written:

Executive Summary: We are pleased to submit this grant proposal on behalf of [Your Organization’s Name]. Our proposal seeks funding in the amount of [Requested Amount] to support our project titled [Project Title]. This project aims to address [Describe the problem or need being addressed] in [Target Location]. By implementing a comprehensive approach, we aim to achieve [State the project’s goals and anticipated outcomes].

I. Introduction: We express our gratitude for the opportunity to present this proposal to your esteemed organization. At [Your Organization’s Name], our mission is to [Describe your organization’s mission]. Through this project, we aim to make a significant impact on [Describe the issue or problem being addressed] by [Explain the significance and relevance of the project].

II. Needs Assessment: After conducting thorough research and needs assessments in [Target Location], we have identified a pressing need for [Describe the problem or need]. The lack of [Identify key issues or challenges] has resulted in [Explain the consequences and impact of the problem]. The [Describe the target population or beneficiaries] are particularly affected, and our project aims to address their specific needs.

III. Goals and Objectives: The primary goal of our project is to [State the broad goal]. To achieve this, we have outlined the following objectives:

  • [Objective 1]
  • [Objective 2]
  • [Objective 3] [Include additional objectives as necessary]

IV. Project Methods and Approach: To address the identified needs and accomplish our objectives, we propose the following methods and approach:

  • [Describe the activities and strategies to be implemented]
  • [Explain the timeline and key milestones]
  • [Outline the roles and responsibilities of project staff and partners]

V. Evaluation and Monitoring: We recognize the importance of assessing the effectiveness and impact of our project. Therefore, we have developed a comprehensive evaluation plan, which includes the following:

  • [Describe the data collection methods and tools]
  • [Identify the indicators and metrics to measure progress]
  • [Explain how the results will be analyzed and utilized]

VI. Budget: We have prepared a detailed budget for the project, totaling [Total Project Budget]. The budget includes the following key components:

  • Personnel: [Salary and benefits for project staff]
  • Supplies and Materials: [List necessary supplies and materials]
  • Equipment: [Include any required equipment]
  • Training and Capacity Building: [Specify any training or workshops]
  • Other Expenses: [Additional costs, such as travel, marketing, etc.]

VII. Sustainability: Ensuring the sustainability of our project beyond the grant period is of utmost importance to us. We have devised the following strategies to ensure its long-term impact:

  • [Describe plans for securing future funding]
  • [Explain partnerships and collaborations with other organizations]
  • [Outline community engagement and support]

VIII. Organizational Capacity and Expertise: [Your Organization’s Name] has a proven track record in successfully implementing projects of a similar nature. Our experienced team possesses the necessary skills and expertise to carry out this project effectively. Key personnel involved in the project include [List key staff and their qualifications].

IX. Conclusion: Thank you for considering our grant proposal. We firmly believe that [Project Title] will address a critical need in [Target Location] and contribute to the well-being of the [Target Population]. We are available to provide any additional information or clarification as required. We look forward to the opportunity to discuss our proposal further and demonstrate the potential impact of this project.

Please find attached the required supporting documents, including our detailed budget, organizational information, and any additional materials that may be helpful in evaluating our proposal.

Thank you once again for considering our grant proposal. We appreciate your dedication to supporting projects that create positive change in our community. We eagerly await your response and the possibility of partnering with your esteemed organization to make a meaningful difference.

Attachments:

  • Detailed Budget
  • Organizational Information
  • Additional Supporting Documents

Grant Proposal Writing Guide

Writing a grant proposal can be a complex process, but with careful planning and attention to detail, you can create a compelling proposal. Here’s a step-by-step guide to help you through the grant proposal writing process:

  1. Understand the funder’s requirements
     • Carefully review the grant guidelines and requirements provided by the funding organization.
     • Take note of the eligibility criteria, funding priorities, submission deadlines, and any specific instructions for the proposal.
     • Familiarize yourself with the funding organization’s mission, goals, and previously supported projects.
  2. Define the need and your project
     • Gather relevant data, statistics, and evidence to support the need for your proposed project.
     • Clearly define the problem or need your project aims to address.
     • Identify the specific goals and objectives of your project.
     • Consider how your project aligns with the mission and priorities of the funding organization.
  3. Create an outline
     • Organize your proposal by creating an outline that includes all the required sections.
     • Arrange the sections logically and ensure a clear flow of ideas.
  4. Write the executive summary
     • Start with a concise and engaging executive summary to capture the reader’s attention.
     • Provide a brief overview of your organization and the project.
  5. Make the case for the need
     • Present a clear and compelling case for the problem or need your project addresses.
     • Use relevant data, research findings, and real-life examples to demonstrate the significance of the issue.
  6. State goals and objectives
     • Clearly articulate the overarching goals of your project.
     • Define specific, measurable, achievable, relevant, and time-bound (SMART) objectives that align with the goals.
  7. Describe methods and timeline
     • Explain the strategies and activities you will implement to achieve the project objectives.
     • Describe the timeline, milestones, and resources required for each activity.
     • Highlight the uniqueness and innovation of your approach, if applicable.
  8. Plan evaluation and sustainability
     • Outline your plan for evaluating the project’s effectiveness and measuring its impact.
     • Discuss how you will collect and analyze data to assess the outcomes.
     • Explain how the project will be sustained beyond the grant period, including future funding strategies and partnerships.
  9. Build the budget (a small tally-check sketch follows this list)
     • Prepare a comprehensive budget that includes all the anticipated expenses and revenue sources.
     • Clearly justify each budget item and ensure it aligns with the project activities and goals.
     • Include a budget narrative that explains any cost assumptions or calculations.
  10. Revise and proofread
     • Review your proposal multiple times for clarity, coherence, and grammatical accuracy.
     • Ensure that the proposal follows the formatting and length requirements specified by the funder.
     • Consider seeking feedback from colleagues or experts in the field.
  11. Assemble and submit
     • Gather all necessary supporting documents, such as your organization’s background information, financial statements, resumes of key staff, and letters of support or partnership.
     • Follow the submission instructions provided by the funding organization.
     • Submit the proposal before the specified deadline, keeping in mind any additional requirements, such as online forms or hard copies.
  12. Follow up
     • If possible, send a thank-you note or email to the funding organization for considering your proposal.
     • Keep track of the notification date for the funding decision.
     • In case of rejection, politely ask for feedback to improve future proposals.
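Budget arithmetic is easy to get wrong when line items change late in the drafting process. The sketch below is purely illustrative (every category, amount, and the stated total are invented): it checks that each line item carries a justification and that the total typed into the narrative matches the sum of the parts.

    # Hypothetical budget check: flag unjustified items and total mismatches.
    budget = [
        # (category, amount in USD, justification)
        ("Personnel",              52_000, "0.5 FTE project coordinator"),
        ("Supplies and Materials",  4_500, "Survey printing and participant packets"),
        ("Equipment",               3_200, "Two tablets for field data collection"),
        ("Training",                2_800, "Facilitator workshop, two sessions"),
        ("Travel",                  1_500, ""),  # missing justification -- flagged below
    ]

    stated_total = 63_000  # stale figure left in the proposal narrative

    computed_total = sum(amount for _, amount, _ in budget)

    for category, amount, justification in budget:
        if not justification.strip():
            print("Missing justification:", category)

    if computed_total != stated_total:
        print(f"Total mismatch: narrative says ${stated_total:,}, "
              f"line items sum to ${computed_total:,}")

Here the sketch would flag the unjustified travel line and a $1,000 discrepancy between the narrative and the line items, both common last-minute errors.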

Importance of Grant Proposal

Grant proposals play a crucial role in securing funding for organizations and projects. Here are some key reasons why grant proposals are important:

  • Access to Funding: Grant proposals provide organizations with an opportunity to access financial resources that can support the implementation of projects and initiatives. Grants can provide the necessary funds for research, program development, capacity building, infrastructure improvement, and more.
  • Project Development: Writing a grant proposal requires organizations to carefully plan and develop their projects. This process involves setting clear goals and objectives, identifying target populations, designing activities and strategies, and establishing timelines and budgets. Through this comprehensive planning process, organizations can enhance the effectiveness and impact of their projects.
  • Validation and Credibility: Successfully securing a grant can enhance an organization’s credibility and reputation. It demonstrates to funders, partners, and stakeholders that the organization has a well-thought-out plan, sound management practices, and the capacity to execute projects effectively. Grant funding can provide validation for an organization’s work and attract further support.
  • Increased Impact and Sustainability: Grant funding enables organizations to expand their reach and increase their impact. With financial resources, organizations can implement projects on a larger scale, reach more beneficiaries, and make a more significant difference in their communities. Additionally, grants often require organizations to consider long-term sustainability, encouraging them to develop strategies for continued project success beyond the grant period.
  • Collaboration and Partnerships: Grant proposals often require organizations to form partnerships and collaborations with other entities, such as government agencies, nonprofit organizations, or community groups. These collaborations can lead to shared resources, expertise, and knowledge, fostering synergy and innovation in project implementation.
  • Learning and Growth: The grant proposal writing process can be a valuable learning experience for organizations. It encourages them to conduct research, analyze data, and critically evaluate their programs and initiatives. Through this process, organizations can identify areas for improvement, refine their strategies, and strengthen their overall operations.
  • Networking Opportunities: While preparing and submitting grant proposals, organizations have the opportunity to connect with funders, program officers, and other stakeholders. These connections can provide valuable networking opportunities, leading to future funding prospects, partnerships, and collaborations.

Purpose of Grant Proposal

The purpose of a grant proposal is to seek financial support from grant-making organizations or foundations for a specific project or initiative. Grant proposals serve several key purposes:

  • Funding Acquisition: The primary purpose of a grant proposal is to secure funding for a project or program. Organizations rely on grants to obtain the financial resources necessary to implement and sustain their activities. Grant proposals outline the project’s goals, objectives, activities, and budget, making a compelling case for why the funding organization should invest in the proposed initiative.
  • Project Planning and Development: Grant proposals require organizations to thoroughly plan and develop their projects before seeking funding. This includes clearly defining the problem or need the project aims to address, establishing measurable goals and objectives, and outlining the strategies and activities that will be implemented. Writing a grant proposal forces organizations to think critically about the project’s feasibility, anticipated outcomes, and impact.
  • Communication and Persuasion: Grant proposals are persuasive documents designed to convince funding organizations that the proposed project is worthy of their investment. They must effectively communicate the organization’s mission, vision, and track record, as well as the specific problem being addressed and the potential benefits and impact of the project. Grant proposals use evidence, data, and compelling narratives to make a strong case for funding support.
  • Relationship Building: Grant proposals serve as a platform for organizations to establish and strengthen relationships with funding organizations. Through the proposal, organizations introduce themselves, highlight their expertise, and demonstrate their alignment with the funding organization’s mission and priorities. A well-written grant proposal can lay the foundation for future collaborations and partnerships.
  • Accountability and Evaluation: Grant proposals outline the expected outcomes, objectives, and evaluation methods for the proposed project. They establish a framework for accountability, as organizations are expected to report on their progress and outcomes if awarded the grant. Grant proposals often include plans for project evaluation and monitoring to assess the project’s effectiveness and ensure that the funding is being used appropriately.
  • Sustainability and Long-Term Planning: Grant proposals often require organizations to consider the long-term sustainability of their projects beyond the grant period. This includes identifying strategies for continued funding, partnerships, and community involvement. By addressing sustainability in the proposal, organizations demonstrate their commitment to long-term impact and the responsible use of grant funds.

When to Write a Grant Proposal

Knowing when to write a grant proposal is crucial for maximizing your chances of success. Here are a few situations when it is appropriate to write a grant proposal:

  • When There is a Funding Opportunity: Grants become available through various sources, including government agencies, foundations, corporations, and nonprofit organizations. Keep an eye out for grant announcements, requests for proposals (RFPs), or funding cycles that align with your organization’s mission and project goals. Once you identify a relevant funding opportunity, you can begin writing the grant proposal (a small deadline-tracking sketch follows this list).
  • When You Have a Well-Defined Project or Program: Before writing a grant proposal, it’s important to have a clearly defined project or program in mind. You should be able to articulate the problem or need you are addressing, the goals and objectives of your project, and the strategies and activities you plan to implement. Having a solid project plan in place will help you write a more compelling grant proposal.
  • When You Have Conducted Research and Gathered Data: Grant proposals often require evidence and data to support the need for the project. Before writing the proposal, conduct thorough research to gather relevant statistics, studies, or community assessments that demonstrate the significance and urgency of the problem you aim to address. This data will strengthen your proposal and make it more persuasive.
  • When You Have a Strong Organizational Profile: Funding organizations often consider the credibility and capacity of the applying organization. Before writing a grant proposal, ensure that your organization has a strong profile, including a clear mission statement, track record of accomplishments, capable staff or volunteers, and financial stability. These factors contribute to the overall credibility of your proposal.
  • When You Have the Time and Resources to Dedicate to Proposal Writing: Writing a grant proposal requires time, effort, and resources. It involves conducting research, developing project plans, creating budgets, and crafting compelling narratives. Assess your organization’s capacity to commit to the grant proposal writing process. Consider the timeline, deadline, and any additional requirements specified by the funding organization before deciding to proceed.
  • When You Have Identified Potential Partnerships or Collaborators: Some grant proposals may require or benefit from partnerships or collaborations with other organizations or stakeholders. If your project can be enhanced by partnering with other entities, it’s important to identify and secure these partnerships before writing the grant proposal. This demonstrates a collaborative approach and can strengthen your proposal.
  • When You Are Committed to Project Evaluation and Accountability: Grant proposals often include requirements for project evaluation and reporting. If you are willing and able to commit to evaluating the project’s outcomes, tracking progress, and reporting on the use of funds, it is an appropriate time to write a grant proposal. This shows your dedication to transparency, accountability, and responsible use of grant funds.
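Deadlines drive everything above, so it can help to track them mechanically. A minimal sketch, with invented opportunities and dates:

    # Hypothetical deadline tracker for funding opportunities under consideration.
    from datetime import date

    deadlines = {
        "Community Health RFP": date(2025, 9, 15),
        "Youth Services Foundation grant": date(2025, 10, 1),
    }

    today = date.today()
    for name, due in sorted(deadlines.items(), key=lambda item: item[1]):
        days_left = (due - today).days
        status = f"{days_left} day(s) left" if days_left >= 0 else "deadline passed"
        print(f"{name}: due {due:%d %b %Y} ({status})")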


2023-2024 IPaT/GVU Research & Engagement Grants

Request for Proposals (application deadline: Friday, July 14, 5pm)

IPaT/GVU, with additional support from GTRI, announce the call for proposals for Research and Engagement Grants for 2023-2024. As in past years, we will support two separate types of grant proposals: Research Grants will provide seed funding for new research collaborations, and Engagement Grants will provide support for new forms of internal and external community engagement and collaboration.

RESEARCH GRANTS

Research Grants will provide seed funding to conduct interdisciplinary research. The objective of the Research Grant program is to promote research activities involving faculty and students from the many disciplines represented in IPaT/GVU. We seek bold new work that by its preliminary nature would be difficult to fund through ordinary channels. Preference will be given to early-stage research with a high probability of leading to extramural funding and with a strong interdisciplinary component. All funds must be spent by the end of the spring semester.

Research Grant proposals can be either single-semester (fall or spring) or academic-year (fall and spring) in duration. We expect most research proposals will request funding for a GRA at between one-third and one-half time for the proposal duration. Proposals can also request research faculty time; in these cases, collaboration between academic faculty and GTRI research faculty is highly encouraged but not required. Proposals from academic faculty can request other critical resources, such as materials and supplies, but cannot include academic faculty salary support.

ENGAGEMENT GRANTS

Engagement Grants are designed to foster new sorts of engagement and collaboration, whether internal or external to Georgia Tech. Examples of potential Engagement Grants include:

  • Support for an artist-in-residence (or X-in-residence) program
  • Support for new sorts of community engagement, such as installation spaces or "pop-up" displays of research
  • Support for new faculty and student workshops, seminars, or social events
  • Support for new undergraduate "hack fests" or laboratories

We do not expect most Engagement Grant proposals to include GRA support or other personnel time. In cases where such support is requested, please justify why it is essential to the activity. Travel and materials-and-supplies budgets (as required by the specific plans of the proposal) can be requested, but proposals cannot include academic faculty salary. Budget requests for travel and food should be modest and called out separately.

GRANTEE RESPONSIBILITIES

If you receive a Research or Engagement Grant, you must:

  • Present your planned work at an introductory GVU brown bag panel in the fall, present your final results at a GVU brown bag panel the following spring, present at the fall or spring IPaT Townhall, and produce a brief final report.
  • Produce an interim and a final project video to be used for IPaT/GVU and GTRI websites.
  • Acknowledge IPaT/GVU, and potentially GTRI, support for the project in any talks, papers, proposals, or other outreach based on the project.
  • Aim to acquire additional funding for parallel and subsequent research activities, and notify us about these efforts.
  • Spend all funds by the end of the spring academic semester.

PROPOSAL SUBMISSION

The proposal should be submitted as a single PDF document no more than three pages in length and should describe: (1) the problems addressed by the proposed research or engagement; (2) the methods or overall approach; (3) the benefits anticipated from the research or engagement; (4) a clear description of how the grant will enable subsequent external funding (if appropriate); and (5) an outline of the required budget (please do not include overhead or tuition remission). Please let us know in your proposal if you require administrative staff time or other resources from IPaT/GVU or GTRI. If the student who will be involved in the project has already been determined, the student and their academic unit should be identified in the proposal. Proposals must be submitted to [email protected] by July 14. Awards will be announced in the summer. Late submissions cannot be considered.

PROPOSAL REVIEW CRITERIA AND AWARD

Submissions will be reviewed on the basis of merit, originality, plans for furthering the collaboration through external funding, synergy with IPaT/GVU and GTRI themes and activities, and the degree of interaction between different disciplines and between faculty members from different academic units. For both Research and Engagement Grants, preference will be given to proposals that span at least two different academic units (e.g., computer science/psychology, or digital media/music) and/or academic and applied units, and that set the stage for new collaborations in the IPaT/GVU community. If you have questions about the process, review criteria, or program goals, please address them to IPaT/GVU Interim Director Leigh McCook ([email protected]).
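Formatting rules like the three-page limit above are mechanical and worth verifying before submission. A minimal sketch, assuming the proposal has been exported to a local file named proposal.pdf (a hypothetical name) and that the pypdf package is installed (pip install pypdf):

    # Hypothetical pre-submission check against the RFP's three-page limit.
    from pypdf import PdfReader

    PAGE_LIMIT = 3  # from the call: "no more than three pages in length"

    reader = PdfReader("proposal.pdf")  # assumed local export of the proposal
    pages = len(reader.pages)

    if pages > PAGE_LIMIT:
        print(f"Too long: {pages} pages (limit is {PAGE_LIMIT}).")
    else:
        print(f"OK: {pages} page(s), within the {PAGE_LIMIT}-page limit.")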


First-year doctoral student Ally Waters receives research grant for proposal 

Posted on May 23, 2024

by Chris Hilburn-Trenkle   

Ally Waters was not expecting her graduate student research proposal to be chosen.

She received the news in early April while driving to meet a friend for a weekend hiking trip in Virginia, and she admitted she was as surprised as she was excited.

The first-year doctoral student at the University of North Carolina at Chapel Hill School of Social Work was recently awarded the Graduate Certificate in Participatory Research 2024 Seed Grant Award. Her proposal was titled, “Using Participatory Research Methods to Identify Assets and Opportunities for Diversion and Reentry Services in Rural North Carolina.”  

The award is given annually to proposals that strengthen research activity within an action-oriented learning community, one that provides training and resources for carrying out research in collaboration with communities.

“I’m extremely appreciative,” Waters said. “I feel a lot of gratitude for someone seeing what I am hoping to do and encouraging it. I have also had such a warm reception from the Graduate Certificate in Participatory Research (community members) … I feel grateful for the funding, but also for the community around it.” 

The idea for Waters’ proposal grew out of a conversation with Woodrena Baker-Harrell, the public defender for the Orange & Chatham County Public Defender’s Office.

Waters, ‘23 (MSW), worked with Baker-Harrell during her internship in the public defender’s office. Although Waters now works as a clinical specialist for the Orange County Criminal Justice Resource Department, the two still keep in touch and discuss ways to collaborate on research and community-engaged projects.

It was during one of those conversations that Baker-Harrell mentioned the recently established Chatham County Local Reentry Council, a group that works on reentry issues and processes and advocates for people who have recently left incarceration or who face barriers because of a prior incarceration.

As Baker-Harrell shared more about the council, Waters began thinking about strategies for the organization, including research that could help the community, the council’s goals, and ways she could support it. That thinking eventually led to her proposal.

Waters will first meet with members of the reentry council to gain their perspectives on needs, learn what questions they want answered, and get their recommendations for people to work with. From there, she plans to begin conducting interviews this summer as her primary data collection method.

Although Waters plans to interview administrators of both reentry and diversion programs, she noted that the most important aspect of the project is speaking with formerly incarcerated people who can tell her which services available to them during reentry were beneficial, as well as what they would like to see changed moving forward. Some of the important needs she believes will be highlighted in interviews include housing, transportation and behavioral health treatment. 

In addition to those groups, Waters plans to speak with judges in Chatham County and members of the offices of the district attorney and public defender to help form the fullest picture. 

Once the interviews are complete, Waters aims to meet with reentry council members, interviewees, and other invested parties in the fall to discuss how to implement the information they received, hear the perspectives of members of the community and figure out the best ways to make the findings useful. 

While Waters hopes her project helps improve reentry services for the Chatham County community, she sees her main role as doing whatever she can to help add her research perspective to the efforts underway. 

“I think the big goal is to be helpful to the community in some way,” Waters said. “I have some ideas about what that might look like, but if that thing I have in mind is not actually that useful to them, pivoting so that whatever I can offer is helpful, is valuable. The goal is to support the community’s efforts at putting this local reentry council in place.” 

