How do we deliver best practice?

Using evidence to improve productivity and efficiency is not as straightforward as it might appear, as a roundtable of experts brought together to discuss the issue found.

Everyone knows that the NHS needs to save money while delivering high quality care. Everyone knows that delivering the right care first time can help increase productivity and reduce inefficiency. Everyone knows that care should be evidence based. What we know far less about is what constitutes the best evidence, what best practice looks like and how we persuade clinicians to adopt it.

Richard Vize, former editor of Nursing Times’ sister publication Health Service Journal, kicked off this roundtable discussion about using the best evidence by making just this point: “The challenge for providers and commissioners is how to demonstrate best practice and how to define it. There are challenges for the NHS as a whole in accessing evidence of best practice and high quality information and then, crucially, changing the behaviour of individuals and institutions, getting them to make that cultural leap.”

Step forward NHS Evidence, the one-stop clinical evidence portal envisaged by former health minister Lord Darzi in High Quality Care for All in June 2008. His idea was for a single place where clinicians and managers could access evidence that would inform their clinical practice and commissioning decisions.

The National Institute for Health and Clinical Excellence was tasked with creating the new web portal, building on the wealth of evidence already contained in the National Libraries for Health and other sources. By June 2009, phase one was up and running, allowing users to create simple searches that return a prioritised list of results. By October, phase two was in place, with the ability to personalise and refine searches. Today, developments are under way to make the portal ever more useful and more relevant, setting in train an information revolution that puts health and social care staff in control of searching for quality assured best practice information.

Portal rationale

NHS Evidence chief operating officer Gillian Leng described the thinking behind the portal to roundtable participants. 

“Darzi described a situation where clinicians had to go to lots of places to search for information to help deliver high quality care,” said Dr Leng. “NHS Evidence is now the single place, giving access to a whole range of resources, from guidelines to primary research and policy documents. It was developed with users and what you see is a clean front end – a bit like Google, because that’s what people told us they wanted – where an easy search brings back selected and prioritised results. There are specialist resources there too that allow researchers to search Medline and specialist pages for specialist clinicians, such as cardiologists.”

That’s all very nice, suggested Mr Vize, “but how do you take costs out of services without taking the quality out? What does that mean for the behaviour of managers and clinicians?”

University of York professor of health economics Alan Maynard took the bait. “It will require them to focus on productivity,” he said. “That may mean the relationship between inputs and outputs or between inputs and outcomes. The major deficiency we have at present is in measuring outcomes.”

Professor Maynard argued for health services and clinicians working hard to prevent unnecessary variation, then coming down hard on outliers so that more clinical cases move towards the average. “This pushes people towards better practice,” he said.

The lack of outcome measures makes it very difficult to know what best practice looks like, he continued.

“We say ‘this is the way to treat diabetes or heart disease’ but we are not actually sure if it improves outcomes. We say patients should have a CT scan within an hour of a stroke so we can decide about thrombolysis, but the size of the evidence [base] is 800-900 patients. We are designing protocols on the basis of evidence that is not always compelling.”

Care Quality Commission director of methods Gary Needle said this reality was wildly at odds with public expectations. “If you were to go and tell people the NHS has instituted a new approach to capturing the best evidence, I think they would say ‘we thought you were doing that anyway. Surely doctors practise in that way, don’t they?’ The answer is: not quite.”

For example, the CQC recently started an in-depth review of stroke services.

“We wanted to ask: what does ‘good’ look like? It is not as easy to answer as one might imagine.” He could see the value of a single portal with all the relevant information: “It is where you would start the journey.”

The lack of outcome measures and the quality of some of the evidence on which protocols are based were not the only problems identified at the discussion. Another was the question of what constitutes the “best evidence”. This is not always clear, especially where clinicians are faced with multiple guidelines.

Dr Leng said: “I personally think that there is a culture of evidence based practice in medicine, it’s just that people look at different evidence.” She recalled working as a senior house officer to three consultants: “There were three different treatment regimens, but they all said they were evidence based. I had to remember them all.”

There are guidelines – and then there are guidelines. Some are developed according to rigorous standards, backed up by full literature searches that identify not only the best evidence, but also where there are gaps in the research. Others do not follow such a rigorous approach. The problem for clinicians is how to tell which is which.

One solution now being tried by NHS Evidence is accreditation, not of individual pieces of information or guidelines, but of the organisations producing them.

Organisations producing guidelines can apply to a scheme that accredits those using the most rigorous standards, based on the validated, internationally recognised AGREE criteria. Those that are successful will see their material ranked higher in search results and flagged with a mark showing it has reached the gold standard.

Dr Leng said: “We have had around 20 guideline producers through the process so far.” The system of accreditation is still being refined.

Consultant pharmacist and NHS Evidence ambassador Mahendra Patel said the accreditation had a “double advantage”. Not only could users see at a glance which guidelines were produced to high standards, it rewarded guideline producers too. “It gives them some credit for producing that strength of guidance,” Dr Mahendra Patel said.

Both Social Care Institute for Excellence deputy chief executive Amanda Edwards and St George’s Healthcare Trust nurse consultant Jim Blair, whose role involves supporting people with learning disabilities undergoing acute care, highlighted the lack of high quality evidence in their specialist areas and the emphasis on the medical model in the evidence that is available. “There is a dearth of knowledge in learning disabilities,” said Mr Blair.

Not only is there very little by way of hard research and randomised controlled trials, but also very little has been done to capture the “soft” qualitative information about experience or what happens to services when they are designed in partnership with users.

Mr Blair added: “This matters. If we can improve care for people with learning disabilities then the outcomes would be better for everyone.”

Ms Edwards added that evidence was not everything – the ability to change practice was important too: “People [working in adult social care] have very different levels of autonomy and control over what they do. We have to pitch a lot of material about evidence and good practice not at individual practitioners but at organisational and management level.”

PCT Network director David Stout agreed: “There is not much evidence about how well evidence is used,” he said. Simply putting information in the public domain did not appear to drive change. “Regardless of how good the evidence is, the NHS does not have a change theory or methodology of how this is supposed to work,” said Mr Stout.

This was an age-old conundrum, said Professor Maynard. It took the British navy 60 years from accepting that lime juice could prevent scurvy to giving the remedy to all its sailors, but “we are getting better at it”.

Evidence into practice

Dr Leng, formerly NICE director of implementation, acknowledged there are barriers to getting evidence into practice. “The main ones we find are resources or perceived lack of resources, managerial barriers, and [resistance from] clinicians. Conversely, where you have clinicians who are supportive of evidence based practice, they are the best drivers of change too,” she said. Managers and institutions needed to find ways to motivate clinicians to use evidence, for example through continuous professional development or inspection regimes, said Dr Leng.

There was one more barrier, said consultant cardiologist Kiran Patel: clinical autonomy.

“If you present clinicians with standards of care, the argument comes back: do not remove our autonomy,” he said.

Claims of clinical autonomy make a powerful argument against using best evidence, he and others agreed. The issue is also complex, bound up not just in the quality of evidence – poor quality evidence, or evidence drawn from esoteric circumstances at odds with clinicians’ daily experience – but also in clinicians’ understanding of evidence and how widely it applies.

Dr Leng did not dismiss the case for clinical autonomy – far from it – but she did outline the difference between using it simply as an excuse not to adopt new and proven clinical practices and using autonomy to identify exceptions. 

She said: “The accepted figure is that good guidelines will apply in 80 per cent of patients. You need to use your autonomy to decide which patients it does not apply to and, increasingly, I see medical defence organisations expecting that doctors will document the exceptions.”

Dr Kiran Patel’s solution was to involve clinicians in commissioning. “We need to use evidence based commissioning. It needs to be clinically driven and it’s not at the moment,” he said.

Mr Stout agreed: “Of course commissioning should be clinically led. You cannot imagine anyone calling for non-clinically led commissioning. It makes no sense.” Clinically led commissioning was about having clear standards and assessing practice against those standards to drive out unwarranted variation, he said. “Our problem is that the standards are not necessarily clear at national level and that leaves local organisations scrabbling around asking ‘what are the right standards?’ If we have any sense, we will ask providers and the public, but realistically you are constrained in how much of that you can do.”

The other problem was the lack of evidence about the quality of commissioning – and whether it makes an impact on practice. “There is no evidence and there is a big debate about how you assess the quality of commissioning,” said Mr Stout. “Until recently there was not even a description of the skills needed. World class commissioning is the first attempt we have had even to explain the skills. It is nowhere near as good as it is going to have to be.”

Mr Needle then turned the debate full circle: “One of the biggest barriers to the spread of good practice is a real fear factor about how much it will cost.” He then posed a question: “Is it possible to legislate for getting evidence into practice? I am not advocating it.”

Professor Maynard suggested good practice need not always mean more costly care – indeed this is the thrust of present policy: cut costs while maintaining quality. “There are lots of low hanging fruit that can improve quality at low cost,” he said. York Hospitals Trust, where he recently stepped down after 12 years as chair, had rationalised doctors’ antibiotic prescribing, reducing cost and increasing quality at the same time. “I keep telling the clinicians we are not interested in what works, but what works at a given cost,” he said.

But one anecdote does not make evidence, as Professor Maynard said: “When you start to try to answer the question of not just what works but what works at a given cost, then the evidence base is even poorer.”

This all underlined the importance of involving doctors in the quality versus cost versus evidence debate, argued Professor Maynard and Dr Kiran Patel. “Who manages the NHS? Who controls resource allocation? It’s the doctors,” said Professor Maynard. “If we are going to improve, we have to focus on working with clinicians to improve their practice.” Dr Kiran Patel added: “Seventy to 80 per cent of spending is clinically influenced decision making. There are some smart things we can do to deliver efficiencies.”

He cited the NHS Institute’s “better care, better value” initiative, which sets out 15 high level indicators of where the NHS can improve efficiency. If every primary care trust and acute hospital provider could match the top 25 per cent of performers on these indicators, it could realise £2.4bn in productivity benefits for the NHS. This was all about raising the bar, and the approach could be relevant elsewhere, he said.

“Let’s not commission tariffs based on average care,” he said. “Let’s base them on best practice and get better value into the system.”

Ms Edwards was asked for her perspective on this, not least because adult social care has a much longer history of commissioning services than does the NHS. But here too the evidence was thin on the ground about what constitutes cost effective services. “There has been no sustained investment in research to understand cost effectiveness,” she said, but gaining this insight now was more crucial than ever as adult social care undergoes rapid change, driven by the personalisation agenda. 

“This is a real challenge,” she said. “If you look at individual budgets there is some evidence they can reduce costs in low volume high cost services, such as learning disability, but when you get to high volume, low cost services then it is much harder to demonstrate.”

Part of the problem was the lack of any tradition of research in this area, but another part was a lack of incentive.

“There is no framework for redirecting cost benefits,” pointed out Ms Edwards. “For example the POP [partnership for older people] pilots show some reduced NHS costs, but there is no way for the local authority to realise these benefits.”

Similarly, when hospitals reduce length of stay and their overall cost by implementing best practice, this can place more strain on local authorities.

Mr Stout took this one step further. “How do you realise benefits?” he asked. “If the evidence says we can reduce length of stay by 40 per cent, what does that mean? Either we see more patients in the same beds, which leads to spending more money, or we take the capacity out, shut things down and sack the staff.” Not much of an incentive for staff to implement the evidence.

Informed patients

Mr Needle then brought patients into the picture: “Where do patients fit into all this? We are working on a model of gathering evidence about what works so clinicians can best treat patients. But clinicians have all the knowledge and patients have none. What does this mean for the relationship? Do patients have access to NHS Evidence?”

Very much so, said Dr Leng. NHS Evidence is, after all, a publicly funded resource and the organisation was philosophically committed to openness. “Then that could be a very powerful lever for change,” said Mr Needle. “So how do we make it understandable to patients?”

Dr Leng admitted it is not translated into lay language. “NHS Choices is the patient-facing site and we feed across the best sources of information. We have, however, had a lot of demand on NHS Evidence for patient information leaflets, and we are looking at making leaflets that have been accredited through the Department of Health’s patient information leaflet standards process available through the portal.”

Dr Kiran Patel agreed that patient access to the best evidence was a powerful tool. “We all get patients coming to us with clippings about stem cell treatment from the New Scientist or the Daily Mail,” he said. “I think we can use NHS Evidence to manage expectations, using it to put information into context and putting responsibility for health information in their hands.”

Using evidence is not as straightforward as it might seem – as this roundtable highlighted. But, at the very least, NHS Evidence could kickstart the process of making clinical evidence available easily and quickly. In time it may do much more. As Dr Leng said: “My vision is that NHS Evidence will become the routine and regular source of information for clinicians and commissioners, embedded into local IT systems and easy to access.”

Readers' comments (1)

  • Best practice delivery is only as feasible and as good as the available research evidence and the organisational context.
    In many areas, little is known about the best and most efficient way to design and deliver individually tailored nursing care.

    A typical example is the provision of effective care for those suffering from pressure ulcers (PrUs). In broad terms, the available research evidence, as summarised in instrumental pieces of evidence (eg NICE guidelines), is far from conclusive about the appropriate clinical course of action.

    This is particularly true in the community or district (or home care) setting. The organisational context is another variable. Best practice delivery may be encouraged or discouraged, implicitly or explicitly, in one or another healthcare delivery organisation.

    These comments come from an economist with a PhD in nursing sciences and publications in nursing journals. With this in mind, I hope this information will be of some use.
