Duncan Smith, Work Programme Statistics, and Shamelessness.

by richardhutton

According to Iain Duncan Smith in a Parliamentary debate on 10th May 2013:

“The Work programme is a success. In fact, the Office for National Statistics wrote the other day to a number of people correcting how they interpreted the figures. It made it very clear that what the right hon. Member for Birmingham, Hodge Hill and others have said about the statistics was completely wrong. The ONS has said that the reality is that the figure of 2% or 3% that he has been using, which is below the minimum performance level, is incorrect. It went on to say that the realistic and more relevant figure is that 8.6% of those referred to the Work programme are in sustained employment in the first six months”.

Is this true? Is it really?

No, of course not. The statistic did not originate with the ONS; it was taken out of context from a UK Statistics Authority letter – a letter which expressly warned Smith to stop taking statistics out of context:

“Whilst both the PAC report and the related National Audit Office report focus on the fact that 3.6% of people referred to the Work Programme between June 2011 and July 2012 achieved sustained employment (normally of six months) by July 2012, the Statistics Authority does not regard that as the most relevant measure to use – since many of the individuals would not have been in the scheme long enough to achieve six months sustained employment by July 2012.

Our conclusion is that the more relevant figure is that based on the June 2011 cohort on its own – namely that 8.6% of those referred to the Work Programme in June 2011 were in sustained employment of at least six months (or three months if hard to place) at some point during the 12 months following referral. That figure can of course now be updated for each month from June 2011 to give a monthly series. The existence of such different measures was the root of some concern at the hearing of the Public Accounts Committee on 17 December 2012 but there are good arithmetic reasons why one is a lot higher than the other and it is up to the authors of the Department’s statistical releases to explain these points clearly and fully”.

So, the UK Statistics Authority made clear that the 8.6% figure applied to the June 2011 cohort only – and that it should be updated for each subsequent month to give a monthly series. Smith not only attributed the figure to the wrong source, but took it out of context. Nothing particularly new, in fairness.
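To make the letter’s “good arithmetic reasons” concrete, here is a rough sketch using entirely invented numbers (they are not the DWP’s actual cohort data): pooling every referral to date – including people referred too recently to have possibly accrued six months of employment – inevitably produces a much lower percentage than looking at a single cohort that has had a full follow-up period.

```python
# Illustrative, made-up cohorts: (month, referrals, achieved sustained employment).
# Only the earliest cohort has had time to reach the six-month milestone.
cohorts = [
    ("Jun 2011", 10_000, 860),   # mature cohort: 860 / 10,000 = 8.6%
    ("Dec 2011", 10_000, 400),   # partially mature
    ("Jun 2012", 10_000, 0),     # too recent to have six months' employment
    ("Jul 2012", 10_000, 0),
]

# Measure 1: pool every referral to date, however recent.
total_referred = sum(referred for _, referred, _ in cohorts)
total_sustained = sum(sustained for _, _, sustained in cohorts)
pooled_rate = 100 * total_sustained / total_referred
print(f"Pooled rate across all referrals: {pooled_rate:.1f}%")        # ~3.2%

# Measure 2: only the earliest cohort, which has had a full follow-up period.
month, referred, sustained = cohorts[0]
cohort_rate = 100 * sustained / referred
print(f"{month} cohort rate after full follow-up: {cohort_rate:.1f}%")  # 8.6%
```

Neither measure in this sketch is “right” or “wrong” in itself; the point is simply that the two denominators answer different questions, which is why the Authority insisted the differences be explained clearly and fully.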

However, it is the following which demonstrates how shameless Smith is being:

“The figures might be described as, for example, ‘3.5% of people referred to the Work Programme between June 2011 and July 2012 had been in sustainable employment by July 2012’. This cannot be used as a measure of the success of the Work Programme because those referred to it in, say, June 2012, could not have built up three (or six) months’ sustained employment.

Ultimately, it is too soon to make judgements about the performance of the Work Programme – just two one-month cohorts (June and July 2011, at the time of publication) had had sufficient follow up (and data collection) time to allow the first 12-month performance to be assessed. Since the beginning of a new programme is not necessarily representative of the entire Programme’s performance, further time is needed to assess the first year more fully.”

It adds:

“Users should be informed about the quality of the statistical outputs, including estimates of the main sources of bias and other errors” and “Ultimately, it is too soon to assess the performance of the Work Programme – just two one-month cohorts (June and July 2011, at the time of publication) had had sufficient follow up (and data collection) time to allow the first 12-month performance to be assessed”.

Furthermore, the UKSA specifically requested that the DWP include an impartial narrative in its releases, along with context for the statistics and information about their strengths and limitations in relation to their potential use – none of which seems to have had any impact.

It’s also worth noting that, even taking the 8.6% figure at face value, the flip side is a 91.4% rate of failure – which would hardly be a success.

 
