Making savings through smart forecasting
Poor quality forecasting by government departments is an entrenched problem, leading to poor value for money and additional costs falling on taxpayers. More than 70 of our reports over the last three years have raised concerns about forecasting, prompting us to produce our first ‘holistic review’ of the topic. This examines how departments produce and use forecasts at both a project and aggregate level, and the role of the centre of government in incentivising good forecasting. The report focuses on Departmental Expenditure Limits (DEL) spending which, at around £360 billion, makes up about half of government expenditure and is generally considered to be under departments’ control.
We have drawn on our extensive back catalogue of high-profile failures to illustrate our concerns. Examples include the Ministry of Defence’s decision to procure the carrier variant of the Joint Strike Fighter, which had to be reversed, at a cost of £74 million, after it became clear the forecast costs were based on immature information and assumptions.
The problem of poor forecasting is also illustrated by the Department for Communities and Local Government’s New Homes Bonus scheme which was underpinned by a model containing an arithmetical error. This led to the department overestimating by about 32,000 the number of new homes the scheme would produce.
Another prominent example is provided by the collapse of the Intercity West Coast franchise procurement in October 2012 as a result of a technical error in the assessment of bidders’ financial models. This high-profile error resulted in £54 million of unforeseen costs for the Department for Transport.
Sir Nick Macpherson, Permanent Secretary at the Treasury, subsequently commissioned a review of the quality assurance arrangements for business critical models across government. This identified 484 such models and made a series of recommendations for departments to improve quality assurance and governance arrangements and to ensure quality assurance is part of their culture. The Treasury is due to review departments’ progress against these recommendations later this year.
Alongside our review of our back catalogue of reports, we surveyed analysts and finance directors, held focus groups with senior analysts and reviewed the Ministry of Justice’s forecasting of the prison population. The evidence gathered was used to assess government’s capability against its framework for good practice in modelling. What we found was that there are issues with the way forecasts are produced and how they are used by departments in the decision-making process.
Our report highlighted concerns about how government departments produce their forecasts. One area of particular concern was the quality of data – more than half the analysts we spoke to identified a lack of good quality data as a barrier to good forecasting. Our report also questions the way in which the results of forecasts are presented, too often using point estimates rather than providing ranges which would reflect the uncertainty in the outputs. The NAO’s 2013 report on High Speed 2, for example, highlighted that, given the uncertainty at such an early stage, it was unwise for the department to present a point estimate of £16.3 billion rather than the range of £15.4 billion to £17.3 billion.
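The difference between a point estimate and a range can be sketched with a simple simulation. The following is purely illustrative (it is not the HS2 cost model, and the component names and figures are hypothetical): a project cost is built from several uncertain components, and the spread of simulated outcomes shows why reporting only a single central figure understates what the forecast actually knows.

```python
import random

random.seed(1)

def simulate_cost():
    # Hypothetical, illustrative cost components with assumed
    # uncertainty ranges (figures in £bn, not drawn from any report)
    civil_works = random.uniform(8.0, 10.0)
    rolling_stock = random.uniform(3.0, 4.0)
    contingency = random.uniform(2.0, 3.5)
    return civil_works + rolling_stock + contingency

# Run many simulations and summarise the distribution of outcomes
runs = sorted(simulate_cost() for _ in range(10_000))
point = sum(runs) / len(runs)                           # point estimate (mean)
low = runs[len(runs) // 20]                             # ~5th percentile
high = runs[-(len(runs) // 20)]                         # ~95th percentile

print(f"Point estimate: £{point:.1f}bn")
print(f"90% range:      £{low:.1f}bn to £{high:.1f}bn")
```

Reporting the low–high range alongside (or instead of) the single mean gives decision-makers a sense of how much the outcome could plausibly vary.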
How forecasts are used
Forecasts are used by policy, finance and operational staff to inform the running of existing projects and the initiation of new ones. These groups need to work closely with analysts to integrate forecasts in the decision-making process. More than a quarter of analysts, however, told us there was a lack of understanding by senior management of what forecasts mean. In addition, only 39 per cent thought senior managers used forecasts effectively.
While departments almost always balance budgets at a macro level, there is considerable volatility at a programme level, with forecast spending varying considerably throughout the year. In the space of 10 months, for example, the Department for Communities and Local Government moved 40 per cent of its resources between programmes. The majority of finance directors agreed that better forecasting could help reduce volatility and the risks to value for money.
Senior analysts also raised concerns about how forecasts are used at an aggregate level within departments. They told us about a ‘disconnect’ between the analytical and finance functions and described the finance function as a ‘black box’, with a lack of clarity about how forecasts inform allocation. The Ministry of Justice, however, has nominated analysts who work closely with its finance directorate and key change programmes in order to ‘bridge’ the functions.
We looked at how the Department for Business, Innovation and Skills’ finance directorate has worked with analysts to develop a new approach to forecast aggregate spending. The approach used the uncertainty across approximately 600 budget lines to create a picture of where the department’s finances were, and to illustrate the probability of under or overspending.
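The report does not set out the detail of the department’s model, but the general idea of using uncertainty across many budget lines to estimate the probability of under- or overspending can be sketched as follows. Everything here is an assumption for illustration: the number of lines, the budget total, and the normally distributed uncertainty on each line are hypothetical.

```python
import random

random.seed(42)

N_LINES = 600           # approximate number of budget lines cited
TOTAL_BUDGET = 1_000.0  # hypothetical total budget (£m)

# Hypothetical budget lines: each has a central forecast and an
# assumed standard deviation (here 15% of the central forecast)
lines = [(TOTAL_BUDGET / N_LINES, 0.15 * TOTAL_BUDGET / N_LINES)
         for _ in range(N_LINES)]

def simulate_total():
    # Draw one possible outturn for every line and sum them
    return sum(random.gauss(mean, sd) for mean, sd in lines)

trials = [simulate_total() for _ in range(5_000)]
p_overspend = sum(t > TOTAL_BUDGET for t in trials) / len(trials)

print(f"Estimated probability of overspend: {p_overspend:.0%}")
```

Because uncertainty on independent lines partly cancels out in aggregate, this kind of simulation can show a department both where its finances stand overall and how likely a breach of the total budget is, rather than tracking each line in isolation.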
While departments need to create an environment which improves the quality of forecasting, analysts told us that time pressure is a barrier to good quality forecasting, whilst senior analysts suggested policy colleagues were ‘blind to uncertainty’.
Another major concern raised by both analysts and finance directors was optimism bias, and the desire of decision-makers for analysis that supported their viewpoint and intentions. Analysts have expressed concern that they are under pressure to provide supportive rather than realistic forecasts.
Centre of government
The centre of government has an important role to play in improving the quality of forecasting. Although we acknowledge the positive steps taken, such as the publication of the review of financial management capability in December 2013, we suggest that the pressure not to exceed annual budgets crowds out forecasting and encourages short-term decision-making and volatile spending profiles, ultimately providing poor value for money.
The Treasury’s spending teams also need to do more, as the National Audit Office expects ‘the pattern of broadly accurate aggregate forecasts, but poor quality programme-level forecasts, to continue’ without more ‘informed challenge.’
Forecasting in the private sector
We also worked with Deloitte to produce a summary of leading practice in forecasting, based on their experience of working with high-performing private sector clients. The report, published alongside the NAO review, states that forecasting is ‘considered essential within the corporate performance management cycle of leading practice organisations’. In spite of the finding that 85 per cent of private sector managers considered forecasts to be important, only 52 per cent considered their own organisations’ forecasts to be high quality. Deloitte identified several characteristics common to organisations that lead good practice in forecasting, including collaboration across all business areas; balanced investment in systems and inputs; and the ability to adapt to change.
We concluded that forecasting is ‘not taken sufficiently seriously and is often hampered by poor quality data and unrealistic assumptions driven by policy agendas’.
Amyas Morse, head of the National Audit Office, said: “Departments generally treat forecasting of future spending as little more than a technical activity, of limited relevance to financial management. In fact, high quality forecasting is an indispensable element of project planning and implementation.
“We have seen many examples over recent years of government projects where weaknesses in forecasting have led to poor value for money. A first step towards improving the quality of forecasting would be increased transparency and scrutiny of forecasting and more concerted action at the centre of government.”