UK arts engagement: hot and cold spots
Arts Council England has recently announced it will once again invest in the collection of data to measure engagement with the arts at local authority level. This is great news for the arts and audiences. Really.
I’m sure I was in a very small minority when I celebrated ACE’s initial investment in collecting local data seven years ago. This came a few years after the first results of the Taking Part survey were published in 2005, which showed that levels of engagement were far from uniform across the English regions. The Taking Part sample size wasn’t big enough to give a statistically reliable picture down to a local level, so you couldn’t see if there was a difference between South Gloucestershire and South Tyneside, or Southend and Slough.
Those competitive people at Sport England had realised the potential value of local-level data and had commissioned a far, far bigger survey that would give statistically credible results down to English district level. This substantial ongoing investment allowed the organisation to have meaningful conversations with local authorities about the levels of physical activity taking place in their area. They could then work together to come up with a strategy to increase the sportiness of the local population and measure how well they were doing. Many local authorities liked this. They increased their funding for sports development, sometimes at the expense of their arts budgets.
In what now seems like a very different economic and political age, the Westminster government created a pot of cash to reward local areas that set targets for improvement and reached them. By piggybacking on the Sport England survey, the arts council put a national indicator for arts engagement (known as NI11) into the mix of what would be monitored in every area. Somewhat surprisingly, a significant minority of areas chose to include arts engagement as a target for improvement in their local area agreement.
I’m not going to dwell on the less than brilliant results achieved in these areas. Some argued that achieving statistically significant increases in levels of arts engagement given the size of the sample and the short timescale (just two years from when the baseline was established) would be next to impossible. They were proved correct. None of the areas demonstrated a statistically significant increase in arts attendance and participation.
Add that to a load of local authorities failing to hit their arts targets, a financial crash, a change of national government and an abandonment of ‘top-down’ targets for local authorities, and you can see why the arts council stopped collecting the data in 2010.
So why on earth are they planning to do it again now?
For me, the targets were far less important than the knowledge gleaned from having accurate figures for the first time. Knowing what the levels of engagement are in every local authority allows policymakers and funders to target their resources and then be able to measure improvement over a long period of time.
And the results of the initial three years of data collection were certainly eye-opening. The vast range of results in local authorities demonstrated how much encouraging increased engagement in the arts is a local issue, with some local authorities showing twice the levels of engagement compared with others.
Look at the top 10 places in the box-out, opposite, and you’ll see they are monopolised by London and the South East. But before we urge a ‘rebalancing’ of funding, we need to look at the bottom places too, the lowest of which is also a London borough.
Is this a surprise?
Here I must confess to some insider knowledge, as I was working at ACE when the first year of data was released. Some very bright colleagues in the research team used what we knew about local area demographics and propensity to engage with the arts to create a model of 10 bands showing where we’d expect each authority to appear. When the real data was published, it showed that the vast majority of local authorities’ actual figures were very close to what had been modelled. What this proved was that the places of lowest engagement weren’t necessarily ‘cold spots’ for arts, but they were most probably places with low levels of educational attainment and a high proportion of people with lower ‘social status’.
What fascinated me was which local authorities performed in a way that was very different to what was predicted.
The initial modelling (and reporting) was done at ‘top tier’ level (not districts), so we can’t know whether we’d have expected Chiltern and Waverley to be ranked so high or Ashfield and Easington to be at the bottom of the table. Most of the councils that did far better than expected were in London. Part of the reason for this will be down to the super-served cultural ‘world city’ that is central London, but I’d also like to believe that the long-term, significant investment made in local arts development by boroughs such as Greenwich (48%) and Lewisham (50%) will have helped them over-perform.
Of great interest are the regional overachievers. Well done to Southend (46%). It was modelled as being in the same band for propensity to engage as lowly Slough (30%). Liverpool (42%) was modelled to be in the bottom band, but actually did better than more than a third of local authorities. Neighbouring Wirral (46%) was modelled in the same band as overachieving Southend and second-worst-in-the-country Slough. Would it be wrong to assume some correlation between Merseyside’s better-than-expected performance and its placement of culture at the heart of its strategy for regeneration and renewal?
Joining Slough on the list of underachievers is a geographically mixed bag of local authorities. North East Lincolnshire (32%) was modelled to be likely to perform better than Greenwich or Lewisham. Thurrock (34%) was predicted to do as well as Southend but was also very close to being at the bottom of the table. In some of the most affluent areas of the South East, Bracknell Forest (45%) and West Berkshire (46%) were predicted to be close to the top of the ranking but performed no better than average.
ACE’s Creative People and Places programme was devised using the NI11 data. It is a wonderful thing: a long-term commitment to deliver change in areas with the least engagement. But what about the places where engagement is far less than you’d expect, but not in the bottom third? What are the barriers to arts engagement in Bracknell Forest or West Berkshire? And what are the reasons that places such as Southend and Merseyside do better than we would predict? Can whatever they are doing right be replicated?
It’s been long enough now that we may see some statistically significant change in some parts of the country when surveying recommences. I hope some of the places celebrating success will be those that have been delivering Creative People and Places programmes, although it is very early days in these really challenging areas for the arts. I also hope that, this time around, more time will be spent focusing on the under- and overachievers. There may be a lot we could learn.