The Central Question: Are they using a sound methodology?

These are feverish times, probably not the most propitious for standing back and having a measured conversation.

My problems (the problem of list size and quality; the problem of patient choice of GP) are not on the agenda. And yet they are important, central to the quality of general practice and patient care.

It seems to me that the root problem here is a very faulty methodology being used to develop and implement this policy. Let me make it clear: I am not saying that all policy planning and implementation is faulty. Some of it is. Consider the following hypothetical rating scale:

Rating Health Care Policies

Excellent: 10
Good: 20
Fair: 30
Poor: 20
Kafkaesque: 20

You will notice that I scored 10 points in the Excellent Policy box, 20 in the Good box, 30 in the Fair, and 20 in the Poor. The remaining 20 of the hundred points I put in the Kafkaesque box: that is, policies which by their very design undermine quality and predictably drive it down. The planners and politicians think they are doing something good, but they are actually causing harm. And they don't know it; they don't want to know it.

Let me outline what seems to me to be the way a Kafkaesque policy comes to life. A minister has an idea; it sounds like a good idea, something that will improve things and please the public. He goes to the DOH and lets it be known that this policy needs to be implemented: he gives it his full backing, he doesn't want to hear any negativity, we've really got to follow through on this, remember I am the Secretary of State for Health, the PM is behind me on this. The DOH officials package it, dress it up with a nice cover, smiling faces, and words like Choice, Excellence, Making Things Even Better, and so on. The order goes down the chain to PCTs: implement Policy K. They might shake their heads a bit at the PCT; the primary care people (GPs, nurses) shake their heads, wonder how they can minimise the harm this is going to cause, and muddle along. No mechanism is put in place to check how the policy is working. There is no way of feeding back to the people at the top, no way of letting them know there's a problem at ground level.


What would a sound methodology look like? I would propose the following, just as a rough outline. A minister has an idea; he takes it to the DOH and presents it: 'I really like this idea, but I need you to evaluate it: will it work the way we hope it will?' They gather the appropriate people to think about this and do an initial assessment of the idea. They ask questions like: what do we need to know in order to think clearly about this proposal; what is its aim, what is the desired outcome; how would it work in practice (doing a walk-through, modelling it); what are the unintended consequences; how might we get around them; and so on. Then they write up their findings in a preliminary report and show it to the minister. It would begin something like this: 'We are thinking about a policy to do X. We are hopeful that this might bring about significant benefits, but we have also identified some constraints. Here is our outline: [listing aims, benefits, costs, unintended consequences and risks, thoughts about how to avoid or minimise the risks, how it would work, how it would be implemented, and so on].' The last paragraph would be something like this: 'We would be grateful for your thoughts. Are there any downsides or costs we have missed? Any gains we have missed? Don't pull any punches; we want to get this right.' They would then send this initial draft proposal out to some sensible, intelligent, practical, straight-talking people who will give their honest opinion (in the case of primary care issues, you would have a list of GPs, nurses and managers who fit this description; not 'yes men', of course). These 'experts' would send in their honest assessments and suggestions, and the DOH people would then go back to the drawing board and redraft the proposal.
They might decide that, while it is an interesting idea, it would be impractical to implement (too costly, unacceptable unintended consequences); or that there are problems that need to be hammered out further; or that it all looks promising, with just a few tweaks to be made. They then present their findings to the minister and decide on next steps. If they decide to go ahead, they make the feasibility study public: the aims, objectives, benefits, possible risks, implementation plans and so on; they would also make clear how they are going to evaluate the results of the policy. The minister would then sign the document, and so would at least one named DOH human being, with an email address for feedback. They would then implement the policy (perhaps as a pilot) and evaluate it using a sound methodology that gave an honest answer, not necessarily 'the answer they would like'. And they would publish the results of the evaluation, together with their conclusions.

This, I would submit, is a sound methodology. It is very different from the unsound methodology.


With reference to the ‘Choose Your GP’ issue, the evidence we have available reveals a thoroughly unsound methodology. In September 2009, the then Secretary of State for Health, Andy Burnham, said in a speech: ‘In this day and age I can see no reason why patients should not be able to choose the GP practice they want.’ He could see no reason, no problem with this policy. In March 2010 the Labour government, with Andy Burnham still in post, published an online ‘consultation’ on the issue: a 54-page document, a patient booklet and a patient leaflet outlining the proposed policy. Everything was slanted towards the benefits of being able to choose your GP anywhere in England. The 54-page document is full of half-truths, distortions and misunderstandings of how the system actually works. It is a soothing bit of propaganda whose main message seems to be: ‘we want you to have choice, your choice, because you’re worth it; aren’t we wonderful, vote for us, your choice’. The document mentions a few ‘challenges’, ‘but nothing insuperable’, yet leaves out numerous risks and costs. It is truly a ‘dodgy dossier’. The questionnaire accompanying the consultation was not designed to identify the problems, just to reinforce the message: ‘your choice is paramount’, ‘tell us what your choice is’.

I strongly recommend that you read the Royal College of General Practitioners’ response to the ‘consultation’.


What about Andrew Lansley’s methodology? I tried to find out, and am happy to publish the result on this blog. In March 2010 I emailed Andrew Lansley. If you read the email exchanges you will see that Andrew Lansley and his team had no evidence to present that they had done a risk assessment on this policy, or that they had thought it through. His chief of staff, in one of the emails, offers to ask Andrew to do a ‘feasibility study’. Well, that pretty much sums it up. A year earlier he was giving speeches extolling the virtues of this idea and castigating Alan Johnson for not driving it forward, and yet he had still not done a ‘feasibility study’.

This is what I mean by poor methodology.


To politicians I would say: don’t pass this part of the bill until Andrew Lansley can present a properly worked-out risk assessment, one which shows in concrete terms how all this is going to work in the real world. The proposal also needs to include the criticisms of those who will have to implement the policy. If, after that, you are persuaded that it will lead to better patient care at an affordable price, then pass the bill.

Don’t be content with just lofty words like ‘Liberate’. Remember a previous occasion when Parliament agreed on a plan to do some Liberating without first making sure of the facts and the complexity. There were a lot of unintended consequences.

One Response to The Central Question: Are they using a sound methodology?

  1. Ian Greener says:

    Thanks for this. I agree with much of what you write, but would probably add that it is difficult to assess policies in complete terms – they usually need to be broken down to see if their individual elements make sense either logically or in terms of available evidence, and then assessed all together, especially in terms of unintended consequences, as you point out. Our policymaking is a mess. It doesn’t have to be like this – Sweden has a long history of commissions (something Cameron doesn’t want to appear to learn from them) to exactly the ends you suggest above. We should be making big decisions like this with the best advice – not just that offered by whoever is the PM or SoS’s advisor this year.
