Evaluating a project to improve care of older people in Scotland.
VOL: 100, ISSUE: 38, PAGE NO: 34
Eileen McDonach, MA, MSc, is PhD research student, University of Dundee
Angela Kydd, MSc, PGCE, RGN, RMN, is senior lecturer, School of Health, Nursing, and Midwifery, University of Paisley

The need for evidence-based practice is well established, particularly in the health care of older people (Mezey and Fulmer, 1999). However, there is often little information about how to achieve it in practice. A number of barriers to integrating research and practice have been well documented, including a lack of time, resources and training. Nurses may face additional obstacles in contributing fully to the research and development agenda: two major barriers have been identified as capacity and capability (Department of Health, 2000). It has been argued that fostering links between higher education and health care providers would create a culture that promotes clinical effectiveness (McClarey and Duff, 1999).
In 2001, a gerontology interest group in Scotland launched the Developing Centres of Excellence pilot (Kydd, 2004). Its purpose was to facilitate the implementation of evidence-based research in gerontological nursing practice to optimise patient care. The implementation of research into practice was promoted through the recruitment of nurse clinicians, supported by academic mentors, to work on specific projects aimed at enhancing the care of older people. The choice of projects was not prescribed at the outset. Instead, clinicians were required to base their choice on a SWOT (strengths, weaknesses, opportunities and threats) analysis of their own service. A number of specific objectives were drawn up before the evaluation was undertaken (Box 1).

The evaluation process
One year after the project commenced, it became clear that some form of evaluation was required. As this was a new scheme, believed to be the first of its kind in Scotland, it was felt important to explore the experiences of the nurse clinicians taking part. In recent years, there has been an increasing recognition of the need to go beyond outcome measures of the 'success' or 'failure' of programmes to develop an understanding of how and why programmes work and, just as importantly, why they don't work. Robson (2002) refers to process evaluation as 'the systematic observation and study of what actually occurs in the programme, intervention, or whatever is being evaluated'. Although the pilot had been running for a year before the evaluation began, the fact that it was a novel project led the team to decide that a process rather than an outcome evaluation would be more useful. Their rationale is encompassed by Patton's (1997) outline of the key features of process evaluation, including:

- A focus on the programme's internal dynamics to understand its strengths and weaknesses;
- Exploring successes, failures and changes;
- Exploring participants' experiences of the programme;
- A mechanism to feed back into the programme development procedure.

In addition, process evaluations allow access to unintended as well as intended consequences of an intervention. This is a useful feature at the developmental stage of a programme. As limited resources were available for conducting this evaluation, its remit was limited to exploring the experiences of the clinicians who took part. Given that both the pilot study and its evaluation were initiated and funded by the School of Health, Nursing and Midwifery at the University of Paisley, an external researcher was commissioned on a sessional basis to undertake the evaluation.

Methods
Since the main focus of the evaluation was the experience of the clinicians involved, a qualitative design was adopted. Each of the clinicians volunteered to take part in an initial, semi-structured interview and a follow-up interview about six months later to explore their experiences. Written consent was obtained from all the participants. Relevant ethical approval was also sought and obtained for the study. Two semi-structured interview schedules were devised to cover several broad areas that were known to be important to the process. Preliminary results were presented to the multidisciplinary group. Feedback from this session was then used to inform the content of the second interview schedule. Participants were invited to identify any other issues they perceived as relevant to their involvement in the pilot scheme.

Analysis
A structured approach to data analysis was adopted. A coding framework was devised to aid systematic coding and analysis of the interview data, as described by Miles and Huberman (1994). Themes were identified through comparison of interview data. As is customary in qualitative analysis, attention was paid to the identification of exceptional cases that did not conform to general patterns and themes. Alternative explanations for their 'lack of fit' were explored to provide greater insight into the broader data set (Silverman, 2000; Mason, 1996).

Findings
A diverse range of projects was selected by the nurse clinicians (Box 2). Clinicians generally described their work as an ongoing, fluid process, and initial projects were therefore subject to considerable revision, with new projects subsequently being added. Despite the wide-ranging choice of projects, a number of common features were evident, such as an emphasis on education. The fact that projects were not prescribed was seen by many as an asset of the scheme. However, this diversity also makes evaluation more complex.

Clinicians' perceptions

Although clinicians were diverse in their interpretation of the pilot's rationale, a number of common strands were apparent (Box 3). Clinicians' reports of a lack of clarity in the pilot study, particularly in the early stages, combined with the fact that some clinicians joined at later stages, may have contributed to the varied interpretations. The link between involvement in the pilot and improving practice was often implicit rather than explicit in the accounts of some clinicians. The diversity of clinicians' interpretations may have some bearing on their experience and expectations of involvement. Achieving a shared understanding of the underlying principles of any future initiative may, therefore, be an important first step. Clinicians also highlighted the importance of the award as a recognition of achievement, although there was some uncertainty about the criteria in general and eligibility for individual certificates.

Clinicians' experience

Clinicians generally reported their experience as a positive one, both individually and for their staff teams. The importance of education and of the external support obtained was emphasised. However, a number of reported challenges may require careful consideration in any future initiatives (Box 4). The limitations of recruiting participants at different times were also noted, because this may promote inequity - in other words, everyone receives the same award regardless of their length of involvement.
A number of added benefits of involvement in the pilot study were reported by clinicians, including the opportunity for networking, added focus, motivation and encouragement.

Shared experiences

Many clinicians reported positively on the two-monthly Development Group meetings as opportunities for networking, sharing ideas, obtaining external support and providing much-needed focus and motivation. However, some variation in attendance at meetings was also noted. A variety of suggestions were made in relation to the timing, duration and frequency of meetings, although no clear consensus emerged. The designers of future schemes may wish to consider the geographical spread of potential participants when recruiting for such projects.

Outcome measures

Future evaluations may wish to consider the inclusion of outcome measures in terms of the extent to which the pilot study actually achieved its aim of 'developing centres of excellence'. This may be important in terms of resource allocation, particularly when clinicians report that they might have undertaken the work without academic input from the university. The issue of outcome measures is also relevant to the award, as clinicians in this study raised the issue of ambiguity in how the criteria are determined and applied. Clarification of this process will be fundamental to future initiatives.

Ensuring emphasis on development

Clarification of award criteria is also important in terms of how future projects are perceived by both potential participants and the wider public. This pilot was initially entitled 'Centres of Excellence'. Clinicians reported that the word 'developing' was inserted at a later stage to acknowledge the evolving nature of change. The award was intended to reflect the commitment of the wards/units involved in the pilot to developing excellence in their practice, as opposed to stating that their practices were in themselves excellent.
Those involved need to be aware of this distinction and be clear that attainment of the award does not denote their ward/unit as a 'centre of excellence'. Although this may appear to be an issue of semantics, it is likely to be of prime importance in terms of how the award is perceived by participants and by the wider community when they see a certificate displayed in a residential unit.

Quality and equity

Part of the advisory group's role appears to be to monitor the quality of the nursing practices selected by clinicians. How that quality is defined and assessed is an important consideration. As mentioned earlier, the process of determining whether the award criteria have been met appears to be unclear. This evaluation did not investigate outcomes, therefore it is not possible to comment on the achievements of the clinicians and their teams. It is worth pointing out the variation in length of clinician involvement - some had been involved from the beginning, whereas others came on board much later in the process. For some clinicians, this raised questions about the equity of input and achievement, and thus the attainment of the award. Future schemes may need to consider recruiting participants at the same time, or stipulating a minimum duration of involvement, to safeguard against such criticisms.

Involvement of staff

Clinicians suggested that staff understanding of the project may at times have been somewhat limited, for a variety of reasons. This is likely to affect the nature of their involvement and thus requires further investigation. As a result of the limited resources available for conducting this evaluation, it was not possible to interview the academic mentors who took part in the pilot study. The developers of future initiatives may want to consider academic mentors' understanding of the project rationale, and of their role in it, in order to promote consistency.
A shared understanding cannot be assumed: this point is highlighted by the diversity of interpretations reported by clinicians about the pilot study. A lack of shared understanding about the purpose of the scheme may have implications for the input that mentors provide. If consistency is to be ensured and quality optimised, the inclusion of academic mentors, as well as members of the quality advisory group, in future evaluations may be important.

Future directions
The potential for attitudes and practices to be changed through supportive and mentoring networks is encouraging. The developing nature of the endeavour was summed up by one clinician's interpretation of the study's purpose:

'It's all about fair practice and the centre of excellence in itself. To achieve it is a bonus, but if we never get there I feel if we have this journey and we're working, chipping away at it all the time, that is what it is all about - how we change practice to make places different. Places where patients want to be, relatives want their relatives to be and where nurses want to work.'

Perhaps not surprisingly, clinicians reported that they might have undertaken this work without being involved in the pilot. However, they also identified many added benefits from their involvement with the university, including increased focus, motivation and encouragement, valuable networking with their peers, and a rare opportunity to obtain recognition for the work they do on a day-to-day basis. Another innovative feature of the study was that the link between the university and the care setting was at ward or unit level. This was most clinicians' first experience of such collaboration at their level, and one that they valued.

Learning from the past

The challenge for future initiatives is to optimise the benefits of sharing skills while reducing the problems experienced by clinicians in this study. As an important aspect of this study was learning from experience, clinicians presented their individual experiences of involvement in the pilot study at a conference in June 2003. The results of the process evaluation were also disseminated and discussed at this event. Findings from this evaluation are under review in order to inform future development.
TWO-PART SERIES ON DEVELOPING PRACTICE
Part 1 (NT; Vol 100: No 37): Setting up a project to improve care of older people in Scotland
Part 2 (NT; Vol 100: No 38): Evaluating a project to improve care of older people in Scotland

This article has been double-blind peer-reviewed.