This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
Evidence-Based Practice (EBP) is an approach that utilizes the best available evidence for patient care, and its importance is growing in various fields as a means of improving patient-centered care. However, the Evidence-Practice Gap (EPG) that arises in the practical application of EBP remains a significant problem. The EPG refers to the gap between research results and actual clinical practice, which can hinder the optimization of patient care and lead to inefficiencies in the healthcare system. This review introduces the concepts of EBP and the EPG and examines educational approaches such as the Sicily statements and the Core Competencies in Evidence-Based Practice. We also discuss translational research, knowledge translation, multidisciplinary collaboration, and evidence-based policymaking, which are key efforts to resolve the EPG, and emphasize the importance of setting research directions using the Evidence Gap Map (EGM) alongside national strategies to promote the spread of EBP. Finally, we discuss how strategic approaches and policy efforts to resolve the EPG can contribute to the actual clinical application of EBP and suggest future research directions.
Evidence-Based Practice (EBP) has emerged as a fundamental approach in modern healthcare, integrating the best available evidence with clinical expertise and patient preferences to enhance healthcare outcomes [1]. Despite its recognized benefits, a persistent challenge exists in translating research findings into routine clinical practice, a phenomenon known as the Evidence-Practice Gap (EPG) [2]. This gap not only impedes the adoption of scientifically validated interventions but also contributes to variations in patient care and inefficiencies within healthcare systems [3].
Several barriers contribute to the persistence of the EPG, including limited access to evidence-based resources, time constraints, the complexity of clinical guidelines, and resistance to change within healthcare institutions [4]. Addressing these challenges requires a multifaceted approach, including educational initiatives such as the Sicily statements on EBP, competency-based training, and systematic knowledge translation strategies [5].
This review explores the current status of EBP, the challenges associated with the EPG, and potential solutions for its resolution. By examining educational strategies, translational research, multidisciplinary collaboration, and evidence-based policymaking, this paper aims to provide a comprehensive understanding of how strategic efforts can facilitate the integration of EBP into routine clinical practice and ultimately improve healthcare outcomes.
Evidence-Based Practice
David Sackett, regarded as the founder of evidence-based medicine (EBM), defined EBM as “the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients” [1]. In other words, EBM means making patient-related decisions in clinical situations using the best appropriate evidence. The MeSH (Medical Subject Headings) definition of EBM is “An approach of practicing medicine with the goal to improve and evaluate patient care. It requires the judicious integration of best research evidence with the patient's values to make decisions about medical care” [6], which closely parallels Sackett's definition.
A term used alongside EBM is evidence-based practice (EBP). In the MeSH system, EBP is a broader term than EBM. The MeSH definition of EBP is “A way of providing health care that is guided by a thoughtful integration of the best available scientific knowledge with clinical expertise” [6]. This indicates that EBM is used primarily in clinical medicine, whereas EBP applies to health care as a whole, including clinical care.
In this regard, the Sicily statements on EBP recommended using the term ‘EBP’ rather than ‘EBM’ [7]. The Sicily statements include a definition of EBP, a description of the skills required to practice in an evidence-based manner, and a curriculum that outlines the minimum requirements for training health professionals in EBP.
Sicily statements
The Sicily statements were agreed upon by EBP experts who gathered in Sicily in September 2003 for the Second International Conference of Evidence-Based Health Care Teachers and Developers. The Sicily statements comprise five recommendations [7], which are as follows:
1. The professions and their colleges should incorporate the necessary knowledge, skills and attitudes of EBP into their training and registration requirements.
2. Curricula to deliver these competencies should be grounded in the “five-step model.”
3. Further research into the most effective and efficient methods for teaching each step should be fostered, and linked with ongoing systematic reviews on each step.
4. Core assessment tools for each of the steps should be developed, validated, and made freely available internationally.
5. Courses that claim to teach EBP should have effective methods for teaching and evaluating all components.
These recommendations state that health professionals should be trained in, and practice, the knowledge, skills, and attitudes of EBP; that this training should follow the five-step model proposed by David Sackett; and that core assessment tools should be developed for this training.
The five steps proposed by David Sackett are as follows [1]:
1. Translation of uncertainty to an answerable question.
2. Systematic retrieval of best evidence available.
3. Critical appraisal of evidence for validity, clinical relevance, and applicability.
4. Application of results in practice.
5. Evaluation of performance.
These five steps are often referred to as 4A, 1E (Ask, Acquire, Appraise and interpret, Apply, Evaluate).
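To make the Ask and Acquire steps more concrete, the minimal Python sketch below structures a purely hypothetical clinical question as a PICO question and converts it into a simple Boolean search string. The question, class name, and search terms are assumptions introduced for illustration; they are not an instrument proposed by Sackett or by the Sicily statements.

```python
# Minimal sketch of the "Ask" and "Acquire" steps: structure a clinical
# question as PICO and convert it into a Boolean search string.
# The example question and search terms are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class PicoQuestion:
    population: str
    intervention: str
    comparison: str
    outcome: str

    def to_search_string(self) -> str:
        """Combine the PICO elements into a simple Boolean query."""
        parts = [self.population, self.intervention, self.comparison, self.outcome]
        return " AND ".join(f'"{part}"' for part in parts if part)


if __name__ == "__main__":
    # Ask: translate an uncertainty into an answerable question.
    question = PicoQuestion(
        population="adults with type 2 diabetes",
        intervention="structured exercise program",
        comparison="usual care",
        outcome="HbA1c",
    )
    # Acquire: use the structured question to build search terms.
    print(question.to_search_string())
```

In practice, each PICO element would be expanded with synonyms and controlled vocabulary (for example, MeSH terms) before searching, which is the kind of skill the Acquire-stage competencies described in the next section address.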
Core competencies in evidence-based practice
The Sicily statements led to the development of the Core Competencies in Evidence-Based Practice. EBP is a core component of undergraduate, graduate, and continuing education curricula worldwide, yet a lack of EBP knowledge and skills remains one of the most commonly reported barriers to EBP implementation. A standardized set of EBP core competencies could therefore improve EBP teaching and learning programs and EBP knowledge [8]. Core competencies are the minimum set of attributes that an individual must possess, such as applied knowledge, skills, and attitudes, measured against appropriate standards [9]. The Core Competencies in Evidence-Based Practice were developed through the following process: (1) a draft based on a literature review of EBP education, (2) a two-round Delphi survey of experts, and (3) a final decision through a consensus meeting. The EBP core competencies are graded at three levels: M (“mentioned”), in which the competency is only mentioned as a well-known fact; E (“explained”), in which it is explained in the educational program so that the content is understood without practice; and P (“practiced with exercises”), in which practical exercises are required to ensure detailed understanding. Among the 86 core competencies, those graded “P” include, for the introductory level, practicing the five steps of EBP; for the Ask step, identifying question categories, creating a PICO (population, intervention, comparison, outcome) question, and modifying the PICO question; for the Acquire step, converting core questions into search terms and finding search sources; for the Appraise and interpret step, interpreting the uncertainty and types of measurements, critically appraising systematic reviews, identifying key elements of clinical trials and interpreting their measurements, critically appraising diagnostic studies, and distinguishing evidence-based from opinion-based treatment guidelines; and for the Apply step, involving patients in medical decision-making and understanding shared decision-making [8].
Evidence-Practice Gap
One of the most critical challenges in EBP implementation is the Evidence-Practice Gap (EPG), which has garnered significant attention from healthcare systems worldwide. Despite the availability of high-quality research findings, clinicians frequently struggle to integrate them into their clinical decision-making. This gap hinders the optimization of patient care and contributes to inefficiencies within healthcare systems [10]. There are several reasons for the evidence-practice gap: time constraints make it difficult for clinicians to access new research results [2], clinical practice guidelines may be too complex or conflicting [4], resources for accessing the medical literature may be lacking, and institutional and cultural barriers, including inadequate reimbursement, may discourage evidence-based care [3]. To bridge the evidence-practice gap, continuing medical education (CME) and clinical decision support systems (CDSS) should be introduced [11], and an evidence-based treatment culture should be established through multidisciplinary conferences, case-based learning, and similar activities [12]. In addition, patient education and shared decision-making can maximize treatment effects by explaining evidence-based treatment options to patients and making decisions collaboratively with them [13].
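As a purely illustrative sketch of how a clinical decision support system can surface evidence at the point of care, the following Python example checks a hypothetical patient record against a single hand-written rule. The rule, field names, and patient data are assumptions introduced for illustration; they do not represent any specific guideline or CDSS product.

```python
# Minimal sketch of a rule-based clinical decision support check.
# The rule, field names, and patient data are hypothetical illustrations,
# not an implementation of any particular guideline or CDSS product.

from dataclasses import dataclass, field


@dataclass
class PatientRecord:
    diagnoses: set[str] = field(default_factory=set)    # coded problem list
    medications: set[str] = field(default_factory=set)  # active prescriptions
    allergies: set[str] = field(default_factory=set)    # recorded intolerances


def evidence_reminders(record: PatientRecord) -> list[str]:
    """Return care reminders triggered by simple, hand-written rules."""
    reminders = []
    # Hypothetical rule: suggest a beta-blocker after myocardial infarction
    # unless one is already prescribed or an intolerance is recorded.
    if ("myocardial_infarction" in record.diagnoses
            and "beta_blocker" not in record.medications
            and "beta_blocker" not in record.allergies):
        reminders.append("Consider beta-blocker therapy after myocardial infarction.")
    return reminders


if __name__ == "__main__":
    patient = PatientRecord(diagnoses={"myocardial_infarction"},
                            medications={"aspirin"})
    for message in evidence_reminders(patient):
        print(message)
```

Real CDSSs encode many such rules against structured electronic health record data and, as the trials summarized by Kawamoto et al. suggest, are most effective when reminders are delivered automatically within the clinical workflow [11].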
Among various EPG-related issues, we discuss the evidence-practice time gap (EPTG), EBP-related KAP (knowledge, attitude, and practice), individual countries' efforts to resolve the EPG, and the evidence gap map (EGM), one of the methodologies proposed for resolving the EPG.
Evidence-practice time gap (EPTG)
The evidence-practice time gap refers to “the significant delay between when new research evidence is published and when it is actually implemented into routine practice” [14]. Twenty-five years ago, Balas and Boren reported that 17 years were needed for pneumococcal vaccination, thrombolytic therapy, diabetic eye examination, beta-blockers after myocardial infarction, cholesterol screening, fecal occult blood testing, and diabetic foot care to come into practical use after the publication of evidence; this has been called the 17-year time gap in much of the literature [15]. Later, Khan et al. revisited this question, calculating the average time it took for five cancer prevention methods (mammography screening, smoking cessation, colorectal screening, HPV testing, and HPV vaccination) to be implemented in 50% of actual practice; the average was 15 years (range, 13 to 21 years) [16]. In other words, even after 20 years, the gap between evidence and actual clinical practice has not narrowed.
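To illustrate how such a summary figure is obtained, the short Python sketch below computes the mean and range of publication-to-adoption lags across several interventions. The year values are invented placeholders, not the data reported by Balas and Boren or by Khan et al.

```python
# Illustrative calculation of an evidence-practice time gap summary.
# The intervention names mirror those studied by Khan et al., but the
# year values are invented placeholders, not their reported data.

from statistics import mean

# Hypothetical years from publication of evidence to 50% uptake in practice.
years_to_half_adoption = {
    "mammography screening": 13,
    "smoking cessation": 14,
    "colorectal screening": 15,
    "HPV testing": 16,
    "HPV vaccination": 17,
}

values = list(years_to_half_adoption.values())
print(f"mean lag: {mean(values):.0f} years")
print(f"range: {min(values)} to {max(values)} years")
```

The published analyses are considerably more involved; this sketch only shows the final averaging step.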
These persistent delays have prompted many countries to address the evidence-practice gap in earnest.
EBP-related KAP (Knowledge, Attitude, Practice)
"KAP" stands for "Knowledge, Attitude, and Practice," which is a framework commonly used in research, particularly in public health, to assess people's understanding (knowledge), beliefs (attitude), and actual behaviors (practice) related to a specific topic or health issue [17]. In that respect, examining KAP research in EBP helps us understand the current status of the evidence-practice gap. Currently, several systematic reviews have been published on EBP-related KAP, of which three are representative. In a systematic review of 57 studies on knowledge, attitude, and practice of graduate physicians toward evidence-based medicine (EBM), many physicians have poor EBM knowledge and skills, while the majority of them have a positive attitude toward the implication of EBM. The most significant barrier cited by respondents was lack of time [18].
In a systematic review examining the KAP of nursing students and nurses toward EBP, both groups had positive attitudes toward EBP but lacked the necessary knowledge and skills [19].
A systematic review of 61 randomized controlled trials examining the effectiveness of evidence-based healthcare (EBHC) educational interventions on healthcare professionals' EBHC knowledge, skills, attitudes, and behavior, as well as on clinical processes and care outcomes, showed improvements in knowledge, attitudes, and behavior for up to 6 months [20]. In a study of Korean nurses, attitudes toward EBP were highest, knowledge and beliefs were moderate, and implementation was lowest [21].
In summary, beliefs and attitudes toward EBM and EBP are generally positive, but the knowledge underlying those attitudes and beliefs is mixed, and implementation remains very inadequate.
Efforts by countries to resolve the EPG
To resolve the evidence-practice gap, individual countries have developed various strategies and research fields. Representative examples include translational research and implementation science in the US, Knowledge Translation in Canada, the Australian Health Translation Research and Implementation Platform (AH-TRIP) in Australia, and the Cooksey Report in the UK.
In the US, translational research and implementation science have been developed to reduce the evidence-to-practice gap. Translational research focuses on the process of connecting basic science research results to clinical applications, and is subdivided into T1 (basic science to human research), T2 (clinical research to practice application), T3 (diffusion within the health care system), and T4 (application in the public health context) [22]. Implementation Science is a research field that enables effective evidence-based interventions to be implemented in real-world settings, and plays a role in developing and evaluating strategies for practical application [23].
In Canada, the concept of Knowledge Translation (KT) has been developed to promote evidence-based practice. The Canadian Institutes of Health Research (CIHR) describes KT as the process of communicating research results to knowledge users and putting them to use more effectively, and emphasizes interactive and end-of-grant KT strategies [24].
Australia is attempting to close the evidence-practice gap through the Australian Health Translation Research and Implementation Platform (AH-TRIP). AH-TRIP is a platform that supports the rapid translation of research results into health care practice by promoting multidisciplinary approaches and collaborative research. This strengthens evidence-based policymaking and clinical application and reinforces links between government and research institutions [25].

In the UK, the Cooksey Report of 2006 analyzed the problems of translational research within the research and development (R&D) system and presented strategies for improvement. The report described the gap between research and practical application as a "dual gap" and emphasized the need to strengthen translational research by reorganizing the R&D investment structure [26]. Based on this report, the UK has established several organizations to promote translational research and has worked to improve the way research is funded.

In summary, each country has developed distinct strategies to address the evidence-practice gap based on its healthcare system and research environment. The US has enhanced the connection between research and clinical practice through implementation science and translational research, Canada has focused on Knowledge Translation to improve the dissemination of research findings, Australia has fostered multidisciplinary collaboration via AH-TRIP, and the UK has restructured its R&D system based on the recommendations of the Cooksey Report. These varied approaches provide important lessons for promoting evidence-based practice more effectively in the future.
Evidence gap map (EGM)
An evidence gap map (EGM) is a tool that systematically organizes existing research evidence on a specific topic or field and visually presents areas where research is lacking. It usually draws on systematic reviews and meta-analyses to show areas of research density and areas with gaps [27]. EGMs can also be used to describe the status of evidence-practice gaps.
EGMs play an important role in setting research directions: researchers can use them to identify areas where research is lacking and to select future research topics. They are also an important tool in the policy-making process: policymakers can use EGMs to develop evidence-based policies and to devise strategies for areas where research evidence is insufficient. Finally, EGMs contribute to optimizing resource allocation: research funding agencies and donors can identify areas where evidence is lacking and allocate resources effectively [5].
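As a simplified sketch of what an EGM represents, the following Python example arranges hypothetical intervention-outcome pairs into a matrix of study counts and flags empty cells as evidence gaps. The interventions, outcomes, and counts are assumptions introduced for illustration, not an actual map.

```python
# Minimal sketch of an evidence gap map: an intervention-by-outcome matrix
# of study counts in which empty cells indicate evidence gaps.
# All interventions, outcomes, and counts are hypothetical illustrations.

interventions = ["clinician education", "clinical decision support", "audit and feedback"]
outcomes = ["guideline adherence", "patient outcomes", "cost"]

# Number of eligible studies (e.g., systematic reviews) found for each cell.
study_counts = {
    ("clinician education", "guideline adherence"): 12,
    ("clinician education", "patient outcomes"): 4,
    ("clinical decision support", "guideline adherence"): 9,
    ("audit and feedback", "guideline adherence"): 7,
}

# Print the map and collect the cells with no evidence (the "gaps").
print("intervention".ljust(28) + "".join(o.ljust(22) for o in outcomes))
gaps = []
for i in interventions:
    row = i.ljust(28)
    for o in outcomes:
        n = study_counts.get((i, o), 0)
        row += (str(n) if n else "-").ljust(22)
        if n == 0:
            gaps.append((i, o))
    print(row)

print("\nEvidence gaps (no studies found):")
for i, o in gaps:
    print(f"  {i} -> {o}")
```

Published EGMs typically add a quality dimension (for example, confidence in the underlying systematic reviews) and are rendered as interactive visual matrices, but the underlying logic of mapping evidence density cell by cell is the same.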
Discussion
The persistence of the evidence-practice gap (EPG) is a critical challenge to integrating evidence-based practice (EBP) into health systems. Despite numerous advances in research methodology, clinical guidelines, and educational interventions, the process of translating research evidence into routine clinical practice remains slow and inconsistent. These delays, frequently exceeding a decade, have significant implications for patient outcomes, healthcare efficiency, and resource allocation [2].
One of the most pressing barriers to bridging the EPG is limited access to up-to-date, high-quality research. Particularly in resource-constrained settings, many clinicians struggle to retrieve, interpret, and apply the latest evidence due to time constraints, lack of institutional support, and financial limitations [3]. To address these barriers, a robust knowledge translation framework is needed to facilitate effective dissemination and application of research findings [4].
Educational interventions play a critical role in overcoming the EPG. Integrating EBP into undergraduate, graduate, and continuing medical education curricula is essential to fostering a culture of evidence-based decision-making [5]. Research has shown that competency-based educational programs that incorporate real-world clinical scenarios significantly improve health professionals' ability to critically evaluate and apply evidence in practice [8].
Efforts at the institutional and policy levels are also important in narrowing the EPG. Governments and health care organizations should prioritize the implementation of evidence-based policies, invest in clinical decision support systems (CDSS), and encourage interdisciplinary collaboration to accelerate knowledge translation [11].
The role of technology in the implementation of EBP cannot be overstated. Digital health innovations, including artificial intelligence-based decision support tools, electronic health records with embedded evidence-based guidelines, and online knowledge repositories, offer potential solutions to accelerate the integration of research into practice [22].
Effectively addressing the EPG requires a comprehensive and multifaceted approach. This includes strengthening the EBP capacity of health care professionals, improving access to reliable evidence, promoting institutional support, and implementing policies that facilitate the translation of research into practice. Future research should focus on evaluating the effectiveness of these interventions across a variety of health care settings to identify the most impactful strategies for sustaining evidence-based improvements in patient care [12].
Notes
Conflict of Interest
Soo Young Kim has been an editor of the Journal of Evidence-based Practice since 2025. However, he was not involved in the peer reviewer selection, evaluation, or decision process of this article. No other potential conflicts of interest relevant to this article were reported.
Funding
None.
Data Availability Statement
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Ethics Approval and Consent to Participate
Not applicable.
Author Contributions
Conceptualization: Kim SY. Funding acquisition: Kim SY. Methodology: Kim SY. Writing - original draft: Kim SY. Writing - review & editing: Kim SY.
Acknowledgments
None.
References
1. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn't. BMJ 1996; 312: 71-2.
2. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med 2011; 104: 510-20.
3. Gagliardi AR, Berta W, Kothari A, et al. Integrated knowledge translation: a review of concepts and practices. J Health Serv Res Policy 2016; 21: 161-70.
4. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PA, et al. Why don’t doctors follow clinical practice guidelines? JAMA 1999; 282: 1458-65.
5. Snilstveit B, Vojtkova M, Bhavsar A, Gaarder M. Evidence gap maps—a tool for promoting evidence-informed policy and prioritizing future research. J Clin Epidemiol 2013; 66: 466-76.
6. National Library of Medicine. Medical Subject Headings (MeSH). National Center for Biotechnology Information; [cited 2025 Feb 10]. Available from: https://www.ncbi.nlm.nih.gov/mesh/
7. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Second International Conference of Evidence-Based Health Care Teachers and Developers. Sicily statement on evidence-based practice. BMC Med Educ 2005; 5: 1.
8. Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open 2018; 1: e180281.
9. Moynihan S, Paakkari L, Välimaa R, Jourdan D, Mannix-McNamara P. Teacher competencies in health education: results of a Delphi study. PLoS One 2015; 10: e0143703.
11. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 2005; 330: 765.
13. World Health Organization. Bridging the “Know-Do” Gap: Meeting on Knowledge Translation in Global Health. 2006.
14. Rubin R. It takes an average of 17 years for evidence to change practice—the burgeoning field of implementation science seeks to speed things up. JAMA 2023; 329: 1333-6.
15. Melnyk BM. The current research to evidence-based practice time gap is now 15 instead of 17 years: urgent action is needed. Worldviews Evid Based Nurs 2021; 18: 318-9.
16. Khan S, Chambers D, Neta G. Revisiting time to translation: implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control 2021; 32: 221-30.
17. Santesso N, Akl E, Bhandari M, Busse JW, Cook DJ, Greenhalgh T, et al. A practical guide for using a survey about attitudes and behaviors to inform health care decisions. J Clin Epidemiol 2020; 128: 93-100.
18. Barzkar F, Baradaran HR, Koohpayehzadeh J. Knowledge, attitudes and practice of physicians toward evidence-based medicine: a systematic review. J Evid Based Med 2018; 11: 246-51.
19. Li H, Xu R, Gao D, Fu H, Yang Q, Chen X, et al. Evidence-based practice attitudes, knowledge and skills of nursing students and nurses, a systematic review and meta-analysis. Nurse Educ Pract 2024; 78: 104024.
20. Hill J, Gratton N, Kulkarni A, Hamer O, Harrison J, Harris C, et al. The effectiveness of evidence-based healthcare educational interventions on healthcare professionals' knowledge, skills, attitudes, professional practice and healthcare outcomes: systematic review and meta-analysis. J Eval Clin Pract 2024; 30: 909-35.
21. Yoo JY, Kim JH, Kim JS, Kim HL, Ki JS. Clinical nurses' beliefs, knowledge, organizational readiness and level of implementation of evidence-based practice: the first step to creating an evidence-based practice culture. PLoS One 2019; 14: e0226742.
22. National Center for Advancing Translational Sciences. Translational Science Spectrum. [cited 2025 Feb 10]. Available from: https://ncats.nih.gov/translation