30 October 2017
This month saw Public Health England launch the Public Health Dashboard online. It shows how well each local authority area performs across seven public health functions, and then ranks the local authorities against each other. The Dashboard succeeds as a conversation starter, raising the profile of public health services locally not only by making data easily accessible but also by highlighting the importance to local communities of the outcomes it measures.
These seven sets of health interventions are selected for the Dashboard, we are told, because there is compelling evidence that they are important to health outcomes in every part of the country; and because delivery of these services will impact on future demand for other council and NHS services. If we get them right, we save taxpayers’ money elsewhere.
For example, it is worth noting the rationale cited for including drug and alcohol treatment as two of the seven functions selected. These services, it is explained, can “improve the lives of individuals, the life chances of their children and family, and community stability”. Drug treatment interventions, it says, have a significant impact on reducing the spread of blood-borne viruses, reducing drug-related deaths and reducing crime. Alcohol treatment has a significant impact in reducing alcohol-related deaths and in reducing crime and health costs. This leaves no room for doubt as to the important contribution drug and alcohol treatment makes to local communities, far beyond the wellbeing of those receiving the treatment.
The value of the Dashboard to local decision makers is less clear. The colour-coded rankings of local authorities are at once delightfully simple and seductively simplistic. But local authority commissioners already have much more sophisticated data from which to give elected members and senior council officers a wider and more nuanced briefing. The danger is that the Dashboard draws the eye away from how good or bad things actually are by focusing on rankings, which are relative and have no objective meaning. When the system being measured is itself in robust good health, rankings helpfully tell you where to look for inspiration to up your game. But when investment in the system has peaked, has fallen and is still falling, local decision makers can easily kid themselves that above-average performance is to be celebrated, when the average they are measuring themselves against is itself falling and the services available to their community are in decline.
So what do we know that contextualises the Dashboard data to give it some real meaning? Most significant is the 25% reduction in investment in treatment since 2012/13. If the explicit rationale for investment in treatment is to reduce ill health, deaths and crime, we shouldn’t really be surprised if cuts at this level put these outcomes at risk. In July we heard about the biggest rise in crime for a decade, with the upward trend accelerating. Since August, media reports have abounded on the highest level of drug-related deaths since records began in 1993. And in September we heard about a 10% increase in the estimated number of crack cocaine users and a reversal in the long-term decline in heroin use. It is too early to call this a trend, or to pin it squarely on the reduction in public sector investment in drug and alcohol treatment, which has so far fallen from over £1 billion annually to around £750 million. The full impacts of the ongoing disinvestment are a long way from emerging, but following the rationale for including drug and alcohol services in the Dashboard, we should expect further rises in harm and in associated crime and health costs.
The Dashboard’s underlying data relate to the years 2013/14 to 2015/16. Although it may eventually delineate some important trends retrospectively, the Dashboard currently offers nothing in real time on which to base decision-making. Rather than providing management and policy insights, the upbeat and colourful publication of historic data might in practice camouflage the dismantling of the drug treatment system of which the Advisory Council on the Misuse of Drugs has warned.
The sector is at or nearing the limit of what it can achieve through efficiencies and the reconfiguration of services and contracts, so far conducted in a reasonably orderly fashion. But the Dashboard will never flash an amber warning, because of its focus on relative rankings. Moreover, it is quite possible that Dashboard indicators such as rates of accessing and completing treatment will hold up for two or three years when viewed in isolation. This risks the content of treatment and its infrastructure deteriorating unnoticed into a pale shadow of the comprehensive service envisaged in the Government’s 2017 Drug Strategy.
Our next insight into what is currently happening on the ground will be the publication of the annual State of the Sector report commissioned by the Department of Health. This is likely to yield a timely and more comprehensive picture than that currently available from the Dashboard.