Wild extrapolation and the value of marine protected areas

Last week, the UK National Ecosystem Assessment published a follow-on report on the value of recommended marine protected areas (rMPAs) to sea anglers and divers in the UK. This report gained a fair bit of coverage, likely because the headline numbers it proclaimed are quite astonishing: “The baseline, one-off non-use value of protecting the sites to divers and anglers alone would be worth £730-1,310 million… this is the minimum amount that designation of 127 sites is worth to divers and anglers”. Furthermore, they claim an annual recreational value for England alone of the rMPAs of £1.87-3.39 billion, just for these two user groups (divers and anglers). These numbers are so astonishing, in fact, that my bullshit klaxon went off loud enough to knock me off my chair. See, I’ve been thinking recently about sea angling as an ecosystem service, and so know that there are estimated to be somewhere around 1-2 million sea anglers in the UK. The number of divers is, I reckoned, likely to be considerably lower (there’s a higher barrier to entry in terms of equipment, qualifications, etc.). So these headline figures imply an annual spend - purely on their hobby - of somewhere in the order of £1,000 for every single self-declared sea angler or diver. Which seems rather on the high side, given that one would expect a very long tail of ‘occasional’ dabblers in each activity (e.g. people who spin for a few mackerel on holiday).
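That per-head figure is easy to check. A minimal back-of-the-envelope sketch (my own sanity check, not a calculation from the report; the participant totals are the rough estimates quoted above, anglers plus divers):

```python
# Headline annual recreational value, England only (from the report)
annual_value_low, annual_value_high = 1.87e9, 3.39e9   # £

# Rough participant pool: 1.1-2 million anglers plus ~200,000 divers
participants_low, participants_high = 1.3e6, 2.2e6

# Implied annual value per participant, taking the extremes
per_head_low = annual_value_low / participants_high    # ≈ £850
per_head_high = annual_value_high / participants_low   # ≈ £2,608

print(f"implied spend: £{per_head_low:,.0f} - £{per_head_high:,.0f} per person per year")
```

Even the most generous pairing of the extremes puts the implied spend in the high hundreds of pounds per person per year, for every single participant, dabblers included.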

So, I buckled down and read the 125-page report, to find that the authors had done some things really nicely. Their valuations are based on online questionnaires featuring a combination of neat choice experiments, willingness to pay (WTP) exercises, and a valuable attempt to characterise the non-monetary value of the sea-angling or diving experience (things like ‘sense of place’, ‘spiritual wellbeing’, etc.). But the headline numbers are highly dubious (worthless, in fact), because they did a few things very very badly indeed. Unfortunately, they did a different bad thing for each of their two major monetary valuation methods, so the numbers emerging from each are equally dodgy, as a modicum of mental arithmetic, common sense, and ground-truthing will show.

First, the annual recreational value models are nicely done, using a choice experiment based on travel distances to hypothetical sites with different features to assess which of those features are most valuable. Mapping these features onto the rMPAs leads to a ranking of these sites in terms of how attractive they are to anglers and divers. One could quibble with details here - perhaps the major quibble would be that there is no ‘control’, i.e. no assessment of the value of sites which are not proposed for protection. But in general, I think this analysis gives a decent estimate of how the survey respondents value the different sites.

They then attempt to get an overall annual value for each site by multiplying its value to individuals by the number of visits it receives in a year. This is where the problem arises: attempting to generalise from these respondents to the entire population of anglers (estimated at 1.1-2 million) or divers (estimated at 200,000). I’m going to concentrate on the anglers because the issue is most extreme here: their models are based on 273 responses, a self-selected group of anglers acknowledged within the report to be especially committed (averaging 3-4 excursions a month) and interested in marine conservation, and representing between 0.014 and 0.025% of the total population, i.e. roughly 1 or 2 responses per 10,000 anglers (they also used a self-selected sample of highly experienced divers, representing around 0.5% of all divers, i.e. 5 per 1000). Extrapolating from this sample to the entire UK angling population produces some interesting results.

For example, using this methodology Chesil Beach & Stennis Ledges rMPA on the Dorset Coast has an estimated 1.4-2.7 million visits by sea anglers annually. That translates to 3800-7400 visits every single day of the year. Compare this to a (highly seasonally variable) average of around 3000 visits per month to Chesil Beach Visitors Centre. Or you could look at the Celtic Deep rMPA, a site located some 70km offshore, where they estimate between 145,000 and 263,000 angling visits per year. That’s 400-720 visits a day, which translates to approximately 40-70 typical sea angling boats, each full to the gunwales every single day of the year. Of course, this is simply because the tiny sample is uncritically extrapolated. In the case of the Celtic Deep, it is straightforward to calculate that there were actually 36 observed visits, which (when divided by 273 and multiplied by 1.1 or 2 million) gives you 145,000-263,000 estimated visits. Using this logic, the minimum number of visits a site could receive is (1/273)*1.1 million, or >4000. Diving numbers are similarly unrealistic, with estimates of 123,000-205,000 visits a year (340-560 per day) by divers to Whitsand & Looe Bay, or 26,000-44,000 a year (70-120 per day) at Offshore Brighton, which is around 40km offshore.
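The Celtic Deep arithmetic can be reconstructed in a few lines (the method is my inference from the numbers quoted above, but it reproduces them exactly):

```python
# Reconstructing the report's apparent extrapolation for the Celtic Deep rMPA
sample_size = 273          # angler respondents in the survey
observed_visits = 36       # Celtic Deep visits reported by that sample
population_low, population_high = 1.1e6, 2.0e6   # estimated UK sea anglers

# Scale the sample's visits up to the whole population
est_low = observed_visits / sample_size * population_low    # ≈ 145,055
est_high = observed_visits / sample_size * population_high  # ≈ 263,736
print(f"estimated visits/year: {est_low:,.0f} - {est_high:,.0f}")

# The floor this logic imposes: one reported visit becomes thousands
floor = 1 / sample_size * population_low                    # ≈ 4,029
print(f"minimum possible estimate for any visited site: {floor:,.0f}")
```

In other words, under this scheme no site that received even a single visit from the 273 respondents can be estimated at fewer than about four thousand visits a year.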

This kind of wild, uncritical extrapolation is staggering, akin to using the opinions of a focus group of LibDem party activists to predict a landslide in the next election. It’s a textbook example of the utility of a bit of simple guesstimation (e.g. a million visits a year means 10,000 visits/d for 100 days, or ballpark 2,700/d over the whole year), allied to some common sense (have you ever experienced those kinds of numbers when you’ve visited the UK coast?).

So, we can discount the big annual recreational value figures. What about the WTP exercise? WTP has its fans and its critics. My view is that it’s a useful way of ranking scenarios according to preference, but I don’t give a lot of credence to the ££ generated, simply because by increasing the number of scenarios you can quickly get people to commit more cash than they intended. But regardless of that, the authors of this report appear to have made a very strange decision in aggregating the WTP estimates arising from their questionnaire. They worded the questions very carefully, presenting each respondent with a single site, outlining its features, and asking how much they would be WTP as a one-off fee for its protection - being sure to think of this amount as a real sum of money, in the context of their household budget. These numbers are then used to give an average WTP for all the rMPAs, which seems reasonable, and a useful way to rank the sites.

But they then simply multiply these site-level averages by the whole UK angling (or diving) population to get a total WTP for the whole set of rMPAs.

Think about what they’ve done there.

They’ve asked people how much they would be willing to pay to protect a single site, and have then assumed that the same person will pay a similar amount for every site in the network. So if you agreed that you’d be prepared to pay a one-off sum of £10 to protect a site, you could find yourself with a bill for over £1000 to protect the whole network. (This is a slight oversimplification, since the WTP values are site-specific, but it is essentially what they’ve done.) You simply cannot aggregate WTP like this. I mean, I’m not an economist, but if economists think you can do this, they are deluded.
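To make the aggregation problem concrete, here is a minimal sketch using a made-up £10 per-site pledge (the report elicits site-specific averages, so this is an illustration of the logic, not the report’s actual figures):

```python
# Hypothetical one-off WTP elicited from one respondent for ONE site
wtp_per_site = 10.0    # £ (illustrative, not from the report)
n_sites = 127          # proposed sites in the network

# What the respondent believed they were committing to
print(f"per-site pledge: £{wtp_per_site:.0f}")

# What summing site-level WTP across the network implicitly bills them
implied_total = wtp_per_site * n_sites
print(f"implied network-wide bill: £{implied_total:.0f}")   # £1270
```

The respondent was explicitly told to treat their answer as a real sum within their household budget; the aggregation quietly multiplies that sum by 127.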

Again, a bit of common sense would have helped here. The authors compare this WTP to an insurance premium, which is a useful analogy. But how many anglers or divers are really, when it comes down to it, prepared (or even able) to shell out a £1000+ insurance premium to prevent damage to the marine environment which may or may not occur in the future?

Anyway, that’s what’s been bugging me these last few days. I could go on (for instance, on a more philosophical level, is replacing strictly regulated commercial fishing with unregulated recreational angling necessarily a good thing for the marine environment? Will diving or - especially - angling actually be allowed in these rMPAs?). And there are some useful things in the report. It confirms that people do value the marine environment, really quite highly, and that different features are valued differently by different groups - a useful starting point for some more focused research, and helpful in placing relative values on different rMPAs. But unfortunately - inevitably - media attention has focused on the ludicrous headline numbers, something the authors have actively encouraged in their framing of the report.

A final positive point to end on: my bullshit klaxon seems to be in fine working order.