Journal of Rheumatic Diseases and Treatment





DOI: 10.23937/2469-5726/1510039



Implementation of Evidence-Based Practice in Rheumatology: What Sociodemographic, Social Cognitive and Contextual Factors Influence Health Professionals' Use of Research in Practice?

Neher M*, Stahl C, Festin K and Nilsen P


Department of Medical and Health Sciences, Linköping University, Sweden


*Corresponding author: Margit Neher, Department of Medical and Health Sciences, Linköping University, Entrance 76, 14th floor, S-581 83 Linköping, Sweden, E-mail: margit.neher@liu.se
J Rheum Dis Treat, JRDT-2-039 (Volume 2, Issue 3), Research Article; ISSN: 2469-5726
Received: May 10, 2016 | Accepted: September 15, 2016 | Published: September 17, 2016
Citation: Neher M, Stahl C, Festin K, Nilsen P (2016) Implementation of Evidence-Based Practice in Rheumatology: What Sociodemographic, Social Cognitive and Contextual Factors Influence Health Professionals' Use of Research in Practice? J Rheum Dis Treat 2:039. 10.23937/2469-5726/1510039
Copyright: © 2016 Neher M, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.



Abstract

Objective: Research on the use of research evidence in rheumatology practice is largely lacking. This study attempts to fill this knowledge gap by exploring the degree to which evidence-based practice (EBP) is implemented in clinical rheumatology practice and by identifying individual and organizational factors that may affect research use in the clinical environment.

Methods: A web-based questionnaire was distributed to members of health professional groups in clinical rheumatology by way of publicly available e-mail addresses. Data were collected on sociodemographic, social cognitive, and contextual factors deemed to potentially influence the use of research in practice. The outcome measure was the EBP Implementation Scale.

Results: A complex range of factors was found to influence the outcome. The factors that were most clearly associated with research use were the perception of personal ability to use research knowledge, years of experience in clinical rheumatology, and experience of research activities.

Conclusions: Our results suggest large variation in the level of EBP implementation across work units and individuals. Although a generally low level was indicated (even though no gold standard exists), there was also great interest in working according to EBP principles. Potential for change is apparent, but the use of research evidence in rheumatology practice needs to be examined at the individual and work-unit levels to accommodate local and individual needs and resources. Future studies should examine contextual influences using other methods.


Background

A wealth of research findings in rheumatology promises better methods for diagnosis and has led to many changes in rheumatology practice in recent years [1]. Expectations for a more evidence-based practice (EBP) in rheumatology have led to a growing demand for practitioners to keep abreast of developments in their field and develop a high level of research literacy [2-4]. EBP involves the integration of three knowledge sources: research evidence, clinical expertise, and the patient's unique values and circumstances [5,6].

National and regional healthcare organizations in Sweden promote EBP through the dissemination of recommendations and guidelines, and clinics stress the importance of working in an evidence-based way. Despite strong endorsement of EBP, however, there are no official EBP benchmarking strategies, and continuing professional education is optional. Studies have pointed out that the actual use of research in practice tends to be low [7,8]. Furthermore, evidence-based interventions are not used routinely; many practitioners continue to use interventions that have little or no supporting evidence, and many rely more on their experience than on research [9,10]. The same may apply to rheumatology, but research has been scarce.

Barriers to implementing a more evidence-based practice in health care have predominantly been sought at the individual level [11]. However, there is increasing recognition of the relevance of the organizational context, i.e., influences beyond the individual level [12,13]. Leadership and organizational climate are described in many implementation models and frameworks as critical aspects of the context of implementation [14-16].

Research on the use of research evidence in rheumatology practice is largely lacking. This study attempts to fill this knowledge gap by exploring the degree to which EBP is implemented in clinical rheumatology practice and by identifying individual and organizational factors that may affect research use in this clinical environment.


Materials and Methods

Study setting and population

Health care in Sweden is publicly funded. All residents are insured by the state, with the explicit political goal of providing equal access to health care for the entire population. Out-of-pocket fees are low and regulated by law. The provision of health care services in Sweden, including rheumatology care, is primarily the responsibility of the 21 county councils and is carried out by both private and public providers under the same fee system. Patients requiring specialized care and medication are mainly cared for in hospital clinics.

The study population consisted of practitioners who worked in specialty rheumatology practices. The participants were all professionals with an explicit scientific base for their practice: occupational therapists, physiotherapists, social workers, physicians, and nurses. Although members of these professional groups usually have various organizational affiliations in the hospital, collaboration between team members is taken for granted based on long tradition in Swedish rheumatology units. Local variations determine the composition of teams and their allocated resources and work scope. Assistant nurses and administrative personnel were not included because their training has not traditionally included a scientific curriculum. Participants not currently working in clinical practice were also excluded. The project received approval from an ethics committee in Linköping, Sweden (Dnr 2012/142-31).


Study design and questionnaire

A cross-sectional observational design was used and data were gathered using a questionnaire. Instruments and items were selected to address the research aim. Consideration was also given to ensuring that the questionnaire was suitable for the study population, that the instruments and items had acceptable measurement properties, and that the burden on respondents was limited.

All instruments were translated into Swedish by means of a forward and backward translation process and tested for acceptability, clarity and cultural applicability in a sample of the target study population. The questionnaire was pilot tested by a sample of participants (n = 12) who were representative of all the professions included in the study. Information was gathered on the perceived time necessary for completion of the questionnaire. Several changes were made to produce a questionnaire that had linguistic and conceptual equivalence to the original instruments as well as acceptability for participants.

Demographic data were collected on the participants' age, sex, work experience and experience in rheumatology, postgraduate education, experience of research activity, their role in the clinic (e.g., clinical leadership), and the number of peers in the work group.

Five items were used to measure factors relating to the participants' perception of the use of research evidence in practice. The items were inspired by an instrument based on socio-cognitive theory intended to assess the impact of continuing professional education (CPE) activities [17]. Four of the items were scored on a scale of 1-7, with the end points defined as low (1) and high (7). For one of the items, respondents were presented with a choice of five percentage intervals. The items could be used in any order and did not constitute a final score.

The 18-item Implementation Climate Scale is a measure of the climate for EBP implementation [18]. The instrument captures six dimensions of the organizational context that indicate to employees the extent to which their organization prioritizes and values the successful implementation of EBPs: focus on EBP, educational support for EBP, recognition for EBP, rewards for EBP, selection for EBP, and selection for openness. Analyses of the instrument by the authors supported the reliability and construct-based evidence of validity for the Implementation Climate Scale, as well as the aggregation of the measure to the work group level. The instrument is scored on a scale of 0 (not at all) to 4 (to a very great extent). The total score ranges from 0 to 72 (0-12 per subscale).

The 12-item Implementation Leadership Scale measures strategic leadership for implementation [19]. The instrument demonstrated excellent internal consistency and reliability as well as convergent and discriminant validity when tested by the authors [20]. The instrument has four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. The items are scored on a scale of 0 (not at all) to 4 (to a very great extent). Responses to the 12 items are summed, resulting in a total score ranging from 0 to 48.

The 18-item EBP Implementation Scale was used to measure the extent to which EBP was implemented [21]. The instrument originated from a review of the literature on essential components and steps of EBP. The reliability and validity of the instrument were supported in a heterogeneous sample of practicing nurses, and the instrument has also been translated into Norwegian [32]. Participants were asked to respond to each of the 18 items on a 5-point frequency scale by indicating how often they had performed the item in the past 8 weeks. The scale ranges from 0 (0 times) to 4 (> 8 times). Responses to the 18 items are summed, resulting in a total score from 0 to 72.
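To make the scoring rules above concrete, the following minimal sketch sums hypothetical item columns into the three instrument totals. It is not part of the original study (the authors used SPSS), and the column names ics_1-ics_18, ils_1-ils_12 and ebpi_1-ebpi_18 are assumptions for illustration only.

# Illustrative scoring of the three instruments, assuming one row per respondent
# with item columns named ics_1..ics_18, ils_1..ils_12 and ebpi_1..ebpi_18
# (hypothetical names), each item scored 0-4 as described above.
import pandas as pd

def score_instruments(df: pd.DataFrame) -> pd.DataFrame:
    scores = pd.DataFrame(index=df.index)
    # Implementation Climate Scale: 18 items (six 3-item subscales), total 0-72
    scores["ics_total"] = df[[f"ics_{i}" for i in range(1, 19)]].sum(axis=1)
    # Implementation Leadership Scale: 12 items, total 0-48
    scores["ils_total"] = df[[f"ils_{i}" for i in range(1, 13)]].sum(axis=1)
    # EBP Implementation Scale: 18 items, total 0-72
    scores["ebpi_total"] = df[[f"ebpi_{i}" for i in range(1, 19)]].sum(axis=1)
    return scores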


Data collection

Participants were recruited through the heads of rheumatology units, managers, and other key persons. Participants were also contacted directly using publicly available work addresses, with the help of organizations for the targeted professional groups: rheumatologists, occupational therapists, nurses, physiotherapists, and social workers working in clinical rheumatology.

An invitation to participate in the project was extended to all the known rheumatology clinics in Sweden via an e-mail with information about the study aim and the scope of the questionnaire. Recipients were informed that participation in the study was voluntary and that study data concerning individuals and units would not be identifiable in the results. To recruit as broadly as possible, the invitation was extended to all employees in the targeted professional groups who worked in specialized rheumatology practice at the time of the survey, including those with a different organizational affiliation.

The questionnaire was distributed as a web-based survey by e-mail. A first reminder was sent after 2 weeks and a second after another 2 weeks. All data were saved according to university protocol, which is designed to safeguard the personal details of participants. Personal information about participants and units was anonymous to the researchers.

The survey was distributed to the work e-mail addresses of 1168 practitioners employed in rheumatology during April and May 2015. Subsequent e-mail communications from work units and participants indicated that many of the addresses were not functioning. A back-check by telephone and e-mail with the work units participating in the study showed that some addresses belonged to people who no longer worked in rheumatology or who did not belong to the professions targeted in the study. By comparing the original lists with current personnel information in several work units, we estimated that 20% (234) of the addresses were erroneous, giving an estimated total population of 933 practitioners. Of these 933 persons with a functioning work e-mail address, 306 responded; 294 belonged to the professional groups targeted in this study. One respondent was identified as an outlier (on grounds of response inconsistencies) and excluded from our analyses (Figure 1).


Figure 1: Flowchart describing the recruitment of participants to the study.

Data analysis

Mann-Whitney and chi-square tests were used to compare responders and non-responders to assess non-response bias. Information on gender, age, and level of education was gathered by telephone and e-mail.
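A minimal sketch of how these non-response checks could be carried out is shown below. It is illustrative only (the authors used SPSS); the variable names age, gender and education are assumptions.

# Non-response bias checks: Mann-Whitney U for age, chi-square for gender and
# education, comparing responders with non-responders. Column names are hypothetical.
import pandas as pd
from scipy.stats import mannwhitneyu, chi2_contingency

def nonresponse_bias_tests(responders: pd.DataFrame, nonresponders: pd.DataFrame) -> dict:
    results = {}
    # Mann-Whitney U test for an ordinal/continuous variable such as age
    _, results["age"] = mannwhitneyu(responders["age"], nonresponders["age"],
                                     alternative="two-sided")
    # Chi-square tests for categorical variables
    combined = pd.concat([responders.assign(group="responder"),
                          nonresponders.assign(group="nonresponder")])
    for var in ["gender", "education"]:
        table = pd.crosstab(combined[var], combined["group"])
        _, p, _, _ = chi2_contingency(table)
        results[var] = p
    return results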

Descriptive statistics were used for demographic factors and to report the distribution of scores on the measures. Multiple regression analyses determined the contribution of the different variables to the study outcome, i.e., the score on the EBP Implementation Scale. Groups were formed to build a model for predicting the outcome.

Multiple logistic regression analysis was carried out to investigate the variables associated with EBP implementation. The dependent variable, the level of EBP implementation, was dichotomized based on the mean of the total score, resulting in low EBP implementation (score of 0-12 on the EBP Implementation Scale) and high EBP implementation (score of 13 and over). The independent variables were age, profession, postgraduate education, experience in profession, experience in clinical rheumatology, experience of research activities, number of professional peers in work group, assessment of personal ability to use research evidence (in terms of self-efficacy), perceived value of research evidence for clinical practice, intention to use research evidence, estimation of peers' use of research evidence, estimation of respected colleagues' use of research evidence, EBP implementation climate, and EBP implementation leadership.

Dummy coding was used for all categorical independent variables. No problem with multicollinearity between the predictor variables was detected according to the variance inflation factor. Independent variables that showed a p-value ≤ 0.10 were included in a multiple logistic regression model. Possible two-way interaction terms were tested. Results are reported as odds ratios, with 95% confidence intervals and p-values. All results were considered significant at p < 0.05. Data were analyzed using SPSS (version 22).
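The modelling strategy described above can be summarized in the brief sketch below: dichotomize the outcome at the reported cut-off, dummy-code categorical predictors, screen predictors at p ≤ 0.10, check the variance inflation factor, and fit a multiple logistic regression reporting odds ratios with 95% confidence intervals. This is not the authors' code (they used SPSS); the column names are hypothetical and the data frame is assumed to contain the outcome total and numerically coded predictors.

# Sketch of the regression workflow, under the assumptions stated above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def model_ebp_implementation(df: pd.DataFrame) -> pd.DataFrame:
    # Dichotomize the EBP Implementation Scale score: 0-12 = low, 13 and over = high
    y = (df["ebpi_total"] >= 13).astype(int)

    # Dummy-code categorical predictors (reference category dropped)
    X = pd.get_dummies(
        df.drop(columns=["ebpi_total"]),
        columns=["profession", "postgrad_education", "research_activity"],
        drop_first=True,
    ).astype(float)

    # Univariate screening: keep predictors with p <= 0.10
    keep = []
    for col in X.columns:
        uni = sm.Logit(y, sm.add_constant(X[[col]])).fit(disp=False)
        if uni.pvalues[col] <= 0.10:
            keep.append(col)
    X = sm.add_constant(X[keep])

    # Multicollinearity check: variance inflation factors
    vif = {col: variance_inflation_factor(X.values, i)
           for i, col in enumerate(X.columns)}

    # Multiple logistic regression; odds ratios with 95% confidence intervals
    fit = sm.Logit(y, X).fit(disp=False)
    return pd.DataFrame({
        "OR": np.exp(fit.params),
        "CI_low": np.exp(fit.conf_int()[0]),
        "CI_high": np.exp(fit.conf_int()[1]),
        "p": fit.pvalues,
        "VIF": pd.Series(vif),
    })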


Results

All instruments showed good internal consistency, with Cronbach's alpha values of 0.936 (EBP Implementation Scale), 0.912 (Implementation Climate Scale) and 0.965 (Implementation Leadership Scale). The questionnaire was completed by 294 participants. Ultimately, due to a statistical outlier, our analyses were based on 293 responses. The overall response rate was 32%, although the rate differed considerably among the professions (from 17% of the physicians to 51% of the occupational therapists), which led to some proportional overrepresentation of allied health professional groups compared with physicians in the final results. Analysis of non-response bias showed no differences between responders and non-responders.
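For reference, the internal-consistency coefficient reported above can be computed as in the small illustrative function below (an assumption-laden sketch, not the authors' procedure), applied to an items-by-respondents table with hypothetical column names.

# Cronbach's alpha for a DataFrame with one column per item and one row per respondent.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# e.g. cronbach_alpha(df[[f"ebpi_{i}" for i in range(1, 19)]])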

The respondents were aged between 25 and 70 years; 89% were women and 11% were men. They worked in clinics of different sizes throughout the country, representing all 21 county councils in Sweden. Employees in large clinics made up nearly half of the study population.

Three-quarters of the physicians were rheumatology specialists, but only 14% of the registered nurses had a formal specialization in rheumatology. Physicians also had the most postgraduate education. In the other professional groups, more than one in five professionals had no CPE after graduation; registered nurses had both the highest percentage with no CPE and the lowest percentage with research education. Around one in six respondents had been active in research-related activities during the past year (Table 1).



Table 1: Sociodemographic characteristics of the study group (n = 293).


Respondents' perception of the value of research evidence for clinical practice was high (6 on a scale of 1-7), as was their intention to use research evidence in the future. Their personal assessment of their ability to use research in practice and their estimation of the use of research evidence by a respected colleague were somewhat lower. Only half of the respondents estimated that a respected colleague often or always used research, and 15% of the respondents believed that a respected colleague rarely used research. Less than a quarter of the respondents estimated that a high percentage (81-100%) of their peers used research in practice (Table 2).



Table 2: Social-cognitive factors relating to the use of research evidence in clinical practice (n = 293). Means and interquartile ranges (IQR) of scores, and frequencies and percentages of groups. Items adapted from Légaré, et al. [17].


The average score on the Implementation Climate Scale was 32.6 (standard deviation [SD], 12.4; range, 3-62). More than half of the respondents agreed to a great extent with the items "One of my workplace's main goals is to use evidence-based practices effectively" (n = 190, 66%), "People in my workplace think that the implementation of evidence-based practices is important" (n = 210, 73%), and "Using evidence-based practices is a top priority in my workplace" (n = 183, 63%) (Figure 2).


Figure 2: Responses to items on the EBP Implementation Scale shown as proportions (%) of the total number of respondents per item. Responses indicate how many times in the past 8 weeks study participants performed the item: "0 times", "1-5 times" or "6 times or more".

The average score on the Implementation Leadership Scale was 21.8 (SD, 11.4; range, 0-48). The items that most respondents agreed with to a great or very great extent included one item from the subdomain concerning leaders' EBP knowledge (my immediate supervisor "knows what he/she is talking about when it comes to EBP"; n = 110, 41%) and all three items from the subdomain concerning EBP-supportive leadership: "supports employee efforts to learn more about EBP" (n = 110, 41%), "supports employee efforts to use EBP" (n = 128, 44%), and "recognizes and appreciates employee efforts" (n = 105, 40%).

The average score on the EBP Implementation Scale was 12.1 (SD, 11.6; range, 0-61). The activities most often reported as performed once or more during the previous 8 weeks of practice, by 60%-70% of the respondents, were "sharing evidence from a research study with patients and the families", "using evidence to change clinical practice" and "informally discussing evidence from a research study".

Univariate logistic regression analyses showed that many of the factors were associated with the outcome measure. Possible two-way interaction terms were not found to be statistically significant and were therefore not included in the model. Respondents' general professional experience, the number of professional peers in the work group, and the implementation leadership score did not show a significant association with the implementation of EBP (Table 3).



Table 3: Odds ratios (OR) for a high score on the EBP Implementation Scale, adjusted for sex and age.


Multiple logistic regression analysis was performed to find out which factors were associated with a higher degree of EBP implementation. The resulting model included three factors associated with a higher score on the EBP Implementation Scale: more years of experience in rheumatology practice, more experience of research activities, and a higher estimation of personal ability to use research evidence in practice (Table 4).



Table 4: Sex- and age-adjusted multiple logistic regression model explaining the study outcome (score on the EBP Implementation Scale). The model demonstrated good fit according to the Hosmer-Lemeshow goodness-of-fit test (chi-square(8) = 4.90, p = 0.77); Nagelkerke R² = 0.41.
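The two fit statistics reported for this model can be computed as in the brief sketch below, which continues the hypothetical statsmodels fit from the Data analysis section; it is illustrative only and not the authors' SPSS output. A Hosmer-Lemeshow test with 10 groups has 8 degrees of freedom, matching the chi-square(8) reported.

# Nagelkerke R-squared and Hosmer-Lemeshow test for a fitted statsmodels Logit result.
import numpy as np
import pandas as pd
from scipy.stats import chi2

def nagelkerke_r2(fit) -> float:
    n = fit.nobs
    ll1, ll0 = fit.llf, fit.llnull                     # model and null log-likelihoods
    cox_snell = 1 - np.exp((2.0 / n) * (ll0 - ll1))
    return cox_snell / (1 - np.exp((2.0 / n) * ll0))

def hosmer_lemeshow(y_true, y_prob, groups: int = 10):
    df = pd.DataFrame({"y": np.asarray(y_true), "p": np.asarray(y_prob)})
    df["decile"] = pd.qcut(df["p"], groups, duplicates="drop")
    stat = 0.0
    for _, g in df.groupby("decile", observed=True):
        obs, exp, n_g = g["y"].sum(), g["p"].sum(), len(g)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n_g))
    dof = df["decile"].nunique() - 2
    return stat, chi2.sf(stat, dof)                    # statistic and p-value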


Discussion

The results show that many different variables were associated with the implementation of EBP, which underscores the complexity of EBP implementation. Three factors were most clearly associated with higher EBP implementation: longer experience in rheumatology practice, experience of research activities, and higher perceived personal ability to use research evidence in practice.

Experience in rheumatology was more important than general professional experience in explaining EBP implementation. Longer exposure to the clinical problems of a specific category of patients may foster deeper motivation to learn more and encourage interest in research. No consistent evidence has been found concerning work experience as a factor in research use [11]. Dalheim, et al. [22] suggest that experienced nurses may be a resource for implementing EBP, while other studies have detected a tendency for research use to decline with time in practice [23].

Experience of research activities was associated with high EBP implementation, despite only one in six respondents having been active in research activities during the past year. Our study outcome was not independently associated with educational level, even though the odds of a high EBP Implementation Scale score were significantly higher for participants with CPE, and especially for those with a research education. Academic level has been shown to influence levels of EBP implementation in other studies [24-26].

That perceived personal ability to use evidence in practice was an important factor for EBP implementation accords well with Social Cognitive Theory, which proposes the relevance of self-efficacy for performing a new behavior [27]. Studies concerning EBP in terms of hand hygiene practice and lumbar spine X-ray referrals have shown that beliefs about capabilities are important for developing a new behavior [28,29]. In general, individual motivational and attitudinal factors have been shown to be of importance to EBP implementation [11,30,31].

The mean score for the EBP Implementation Scale was low, which is in line with the results found in several other studies, predominantly within nursing [32-35]. It should be noted that the outcome in our study is a composite score representing several professional groups together. There are no comparable studies, making it difficult to interpret the outcome of our study.

Two approaches to EBP have been described in the literature: EBP as a critical appraisal process leading to clinical decisions informed by evidence, and EBP as the adoption and use of empirically supported interventions, including those specified in clinical guidelines or other recommendations [6,36]. The EBP Implementation Scale contains items reflecting both approaches. However, neither approach was favored over the other in our study.

A majority of the respondents reported that they had not performed several of the activities traditionally related to the critical appraisal EBP process, such as accessing the Cochrane Database of Systematic Reviews or the National Guideline Clearinghouse. Our results are similar to those of Stokke, et al. [32]. Other activities, such as collecting data on a patient problem, critically appraising evidence from a (clinical) research study, and "formulating a PICO question about my clinical practice", were performed regularly by about one in ten of the respondents, while about half of the respondents did not perform them at all.

Activities related to the second EBP approach, i.e., adopting and using EBPs (interventions, programs, etc.), exemplified by items such as "using an EBP guideline or systematic review to change practice" and "sharing an EBP guideline with a colleague", were not much more common. However, our results indicate that respondents favored social interactions about evidence ("sharing evidence from a research study with patients and the families" and "informally discussing evidence from a research study") over activities associated with individually acquiring evidence from written sources. This result is also similar to that of Stokke, et al. [32]. Earlier studies also indicate that, for many health care practitioners, learning occurs more often through workplace interactions with colleagues and patients than through written sources [37,38].

Physicians were more likely than other professional groups to engage in research, being seven times more likely to implement EBP compared with our reference group. Still, professional affiliation had a proportionately low association with our study outcome, compared with other factors. In a study of different professional groups, Weng, et al. [26] observed significant professional differences, with physicians perceiving fewer barriers to implementing EBP than other groups and implementing EBP most. As the availability and relevance of research evidence for practice may vary for different professions, different professional groups may implement EBP differently.

Our study was not able to show clearly that EBP climate and leadership affected EBP implementation, although respondents with a more EBP-supportive work climate were more likely to belong to the high EBP-implementing group. Respondents reported trust in their leaders' knowledge about EBP and evaluated leadership support for EBP implementation in positive terms. In contrast, the respondents expressed doubts concerning how proactively leaders operationalize EBP implementation. In effect, there seems to be a perceived lack of an explicit strategy (clear standards, a facilitation plan, removal of obstacles) to effectuate EBP in actual practice. Respondents also perceived that they received no financial or other compensation for implementing EBP.

Climate and leadership are perceived to be important to facilitate EBP [11,39-41]. However, the mechanisms by which leadership affects implementation success are as yet unclear, which is reflected in difficulties in operationalizing and measuring leadership influence [42,43]. Further research is needed to identify the specific actions that leaders at different levels could engage in to support EBP implementation.

This study has some limitations that must be considered when interpreting the findings. The wide confidence intervals in the regression analyses were most likely caused by the relatively small sample size. The overall response rate was 32%, which reduces our ability to make generalizations about the entire rheumatology population. Non-respondents in survey research tend to differ from those who participate; the latter are typically more motivated, opinionated, and well organized [44]. As EBP is generally considered important, participants' perceptions of what they were expected to think may also have had a positive impact on scores. It is therefore unlikely that our findings underestimate the extent to which EBP is implemented in rheumatology: the scores may be inflated by both selection bias and social desirability bias.

The study applied a broad theory-based approach, incorporating several factors that may potentially influence EBP implementation, using valid and reliable instruments. Even so, not all potentially important factors could be included within the scope of this study. One important example is the allocation of time and resources for EBP implementation, which has been shown to affect knowledge use in many studies [12,27,41,45].

It should also be noted that some of the instruments used in the study, notably the EBP Implementation Scale, the Implementation Leadership Scale and the Implementation Climate Scale, are in relatively early stages of development [46]. No data are yet available to set appropriate cut-off levels. As subgroups in this study were formed by an even distribution of the data rather than by previously determined levels, the resulting subgroups may not have had meaningful boundaries, which may have weakened our analyses.

In terms of clinical implications, our study suggests a great variation in levels of EBP implementation across professions, work units, and individuals. Although a low general EBP standard was indicated, great interest in working according to EBP ideals was also seen. Potential for change is apparent, but it seems necessary to examine the use of research evidence in rheumatology practice at individual and work unit levels to accommodate local and individual needs and resources [13].

Education is a strategy that may be used in attempts to facilitate EBP implementation, targeting knowledge and skills at the individual level [47]. Our study indicates that educational efforts should be directed toward tailoring EBP education to work experience in the specialty. Additional implementation strategies may be needed to embed the activities related to the enactment of EBP in the organizational structure at different levels [20].

Organizational interventions to facilitate EBP implementation could include promoting employee participation in research activities and capitalizing on opportunities for informal learning in connection with these activities. Integration of research and practice has been shown not only to change individual behaviors but also to result in a culture shift in the clinic [43]. Time for reflection and discussion with peers, and time to absorb and adapt to proposed behavioral change, is important for the initiation of new behavior, as behavioral habits may be hard to break [48]. In a larger context, our results indicate that being in a specialized clinic enhanced engagement in EBP activities, which may be a motivation to keep specialized rheumatology units intact where possible.


References
  1. Klareskog L, Saxne T, Enman Y (2005) Reumatologi. Lund: Studentlitteratur.

  2. Pispati PK (2003) Evidence-based practice in rheumatology. APLAR J Rheumatol 6: 44-49.

  3. Dougados M, Betteridge N, Burmester GR, Euller-Ziegler L, Guillemin F, et al. (2004) EULAR standardised operating procedures for the elaboration, evaluation, dissemination, and implementation of recommendations endorsed by the EULAR standing committees. Ann Rheum Dis 63: 1172-1176.

  4. Lineker S, Husted J (2010) Educational interventions for implementation of arthritis clinical practice guidelines in primary care: effects on health professional behavior. J Rheumatol 37: 1562-1569.

  5. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS (1996) Evidence based medicine: what it is and what it isn't. BMJ 312: 71-72.

  6. Straus SE (2007) Evidence-based health care: challenges and limitations. Evid-Based Commun Assess Interv 1: 48-51.

  7. Aveyard H, Sharp P (2013) A beginner's guide to evidence-based practice in health and social care. Maidenhead, Open University Press, 143.

  8. Nutley SM, Walter I, Davies HTO (2007) Using evidence: how research can inform public services. Bristol, Policy Press.

  9. Melnyk BM, Gallagher-Ford L, English Long L, Fineout-Overholt E (2014) The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews Evid-Based Nurs 11: 5-15.

  10. Gray M (2009) Evidence-based healthcare and public health. Edinburgh: Churchill Livingstone.

  11. Squires JE, Estabrooks CA, Gustavsson P, Wallin L (2011) Individual determinants of research utilization by nurses: a systematic review update. Implement Sci 6: 1.

  12. Squires JE, Moralejo D, Lefort SM (2007) Exploring the role of organizational policies and procedures in promoting research utilization in registered nurses. Implement Sci 2: 17.

  13. Kajermo KN, Boström AM, Thompson DS, Hutchinson AM, Estabrooks CA, et al. (2010) The BARRIERS scale -- the barriers to research utilization scale: A systematic review. Implement Sci 5: 32.

  14. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, et al. (2009) Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 4: 50.

  15. Wensing M, Bosch M, Grol R (2009) Selecting, tailoring, and implementing knowledge translation interventions. In: Straus S, Tetroe J, Graham I. Knowledge translation in health care: moving from evidence to practice. Oxford: Wiley-Blackwell 94-113.

  16. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O (2004) Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 82: 581-629.

  17. Légaré F, Freitas A, Thompson-Leduc P, Borduas F, Luconi F, et al. (2015) The majority of accredited continuing professional development activities do not target clinical behavior change. Acad Med 90: 197-202.

  18. Ehrhart MG, Aarons GA, Farahnak LR (2014) Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci 9: 157.

  19. Aarons GA, Ehrhart MG, Farahnak LR (2014) The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci 9: 45.

  20. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M (2014) Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health 35: 255-274.

  21. Melnyk BM, Fineout-Overholt E, Mays MZ (2008) The evidence-based practice beliefs and implementation scales: psychometric properties of two new instruments. Worldviews Evid-Based Nurs 5: 208-216.

  22. Dalheim A, Harthug S, Nilsen RM, Nortvedt MW (2012) Factors influencing the development of evidence-based practice among nurses: a self-report survey. BMC Health Serv Res 12: 367.

  23. Forsman H, Rudman A, Gustavsson P, Ehrenberg A, Wallin L (2010) Use of research by nurses during their first two years after graduating. J Adv Nurs 66: 878-890.

  24. Melnyk B, Fineout-Overholt E, Giggleman M, Cruz R (2010) Correlates among cognitive beliefs, EBP implementation, organizational culture, cohesion and job satisfaction in evidence-based practice mentors from a community hospital system. Nurs Outlook 58: 301-308.

  25. Wallen G, Mitchell S, Melnyk B, Fineout-Overholt E, Miller-Davis C, et al. (2010) Implementing evidence-based practice: effectiveness of a structured multifaceted mentorship programme. J Adv Nurs 66: 2761-2771.

  26. Weng Y, Kuo K, Yang CY, Lo HL, Chen C, et al. (2013) Implementation of evidence-based practice across medical, nursing, pharmacological and allied healthcare professionals: a questionnaire survey in nationwide hospital settings. Implement Sci 8: 1-10.

  27. Nouwen A, Urquhart Law G, Hussain S, McGovern S, Napier H (2009) Comparison of the role of self-efficacy and illness representations in relation to dietary self-care and diabetes distress in adolescents with type 1 diabetes. Psychol Health 24: 1071-1084.

  28. Dyson J, Lawton, R, Jackson C, Cheater F (2011) Does the use of a theoretical approach tell us more about hand hygiene behaviour? The barriers and levers to hand hygiene. J Infect Prev 12: 17-24.

  29. Grimshaw JM, Eccles MP, Steen N, Johnston M, Pitts NBLG, et al. (2011) Applying psychological theories to evidence-based clinical practice: identifying factors predictive of lumbar spine x-ray for low back pain in UK primary care practice. Implement Sci 6: 55.

  30. Estabrooks CA, Floyd JA, Scott-Findlay S, O'Leary KA, Gushta M (2003) Individual determinants of research utilization: a systematic review. J Adv Nurs 43: 506-520.

  31. Eccles MP, Grimshaw JM, MacLennan G, Bonetti D, Glidewell L, et al. (2012) Explaining clinical behaviors using multiple theoretical models. Implement Sci 7: 99.

  32. Stokke K, Olsen NR, Espehaug B, Nortvedt MW (2014) Evidence based practice beliefs and implementation among nurses: a cross-sectional study. BMC Nurs 13: 8.

  33. Mariano K, Caley L, Escherberger L, Woloszyn A, Volker P, et al. (2009) Building evidence-based practice with staff nurses through mentoring. J Neonatal Nurs 15: 81-87.

  34. Yousefi-Nooraie R, Shakiba B, Mortaz-Hedjri S, Soroush AR (2007) Sources of knowledge in clinical practice in postgraduate medical students and faculty members: a conceptual map. J Eval Clin Pract 13: 564-568.

  35. Yost J, Dobbins M, Ciliska D (2014) Evaluating the impact of an intensive education workshop on evidence-informed decision making knowledge, skills, and behaviours: a mixed methods study. BMC Med Educ 14: 1-9.

  36. Olsson T (2007) Reconstructing evidence-based practice: an investigation of three conceptualisations of EBP. Evid Policy 3: 271-285.

  37. Estabrooks CA, Rutakumwa W, O'Leary KA, Profetto-McGrath J, Milner M, et al. (2005) Sources of practice knowledge among nurses. Qual Health Res 15: 460-476.

  38. Yousefi-Nooraie R, Dobbins M, Marin A (2014) Social and organizational factors affecting implementation of evidence-informed practice in a public health department in Ontario: a network modelling approach. Implement Sci 9: 29.

  39. Gifford W, Davies B, Edwards N, Griffin P, Lybanon V (2007) Managerial leadership for nurses' use of research evidence: an integrative review of the literature. Worldviews Evid Based Nurs 4: 126-145.

  40. Taylor SL, Dy S, Foy R, Hempel S, McDonald KM, et al. (2011) What context features might be important determinants of the effectiveness of patient safety practice interventions? BMJ Qual Saf 20: 611-617.

  41. Stetler CB, Ritchie JA, Rycroft-Malone J, Schultz AA, Charns MP (2009) Institutionalizing evidence-based practice: an organizational case study using a model of strategic change. Implement Sci 4: 78.

  42. Aarons G, Ehrhart M, Farahnak L, Hurlburt M (2015) Leadership and organizational change for implementation (LOCI): a mixed-method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci 10: 11.

  43. Rycroft-Malone J, Burton CR, Harvey G, McCormack B, Graham I, et al. (2011) Implementing health research through academic and clinical partnerships: a realistic evaluation of the Collaborations for Leadership in Applied Health Research and Care (CLAHRC). Implement Sci 6: 74.

  44. Brodie DA, Williams JG, Owens RG (1997) Research methods for the health sciences. Amsterdam: Harwood.

  45. Rycroft-Malone J (2008) Evidence-informed practice: from individual to context. J Nurs Manag 16: 404-408.

  46. Lewis C, Stanick CF, Martinez RG, Weiner BJ, Kim M, et al. (2015) The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. Implement Sci 10: 1-18.

  47. Yost J, Ganann R, Thompson D, Aloweni F, Newman K, et al. (2015) The effectiveness of knowledge translation interventions for promoting evidence-informed decision-making among nurses in tertiary care: a systematic review and meta-analysis. Implement Sci 10: 98.

  48. Presseau J, Johnson M, Heponiemi T, Elovainio M, Francis JJ, et al. (2014) Reflective and automatic processes in health care professional behaviour: a dual process model tested across multiple behaviours. Ann Behav Med 48: 347-358.
