Three things we've learned about policy citations

We spend a lot of time at Overton collecting the evidence that governments around the world are using to make decisions, and we’ve been learning a lot about policy engagement as a result.

In this blog, we share three (well, four) things we've learned from the trends that emerge when you study policy citation data. 

The first lesson? That it’s really difficult to generalise. Evidence use - even before you get to citing that evidence - differs between Hungary and Harrisburg, DC and Dublin, legislatures and executive branches.

But - with some caveats - we can still see some pretty clear trends in the data common to many of the national, state and city governments that we track.

First, the caveats…

We use a broad definition of 'policy document'

We characterise policy documents as 'documents written by or primarily for policymakers'. This includes things like white papers, draft bills and transcripts from governments and parliaments, but also policy briefs from think tanks and NGOs, working papers from central banks and even clinical guidelines from government health agencies.

It’s a very broad definition and you could argue that some of these document types aren’t ‘policy’ per se. Think tanks like the Heritage Foundation in the US or the IFS in the UK don’t actually make policy… but they do influence policy, both directly (we can see the citations) and in more nuanced ways.

When we talk about policy citations we mean citations (formal or informal) in these documents to scholarly research - in books and journals, mainly - or to other policy documents in the Overton database.

Policy citations are one narrow measure of evidence use

Having an impact on policy may involve different projects, interlocutors and back-and-forth interactions over years, and this won’t always be clear in citation data. It isn’t unusual for a policy document to cite a blog post or a news story about a piece of research rather than the research itself!

So we know that citations in policy documents are just one narrow - if important - measure in the policy engagement story. Though citations are generally accepted as the currency of scholarly influence (for better or worse), policy influence is undoubtedly more complex. 

That’s the data health warning! Now on to the shareable lessons.

 

1: Governments cite local

It’s perhaps no surprise that the US government leans heavily on research and evidence from US universities: the 25 institutions most cited in government policy there are all based in the US.

But this holds true almost everywhere. In policy from the government of Japan, the three most cited universities are:

1. University of Tokyo
2. Kyoto University
3. Tohoku University

In the UK they're:

1. University of Oxford
2. UCL
3. Imperial College

And in Canadian government policy, the three most cited universities are:

1. University of British Columbia
2. University of Toronto
3. University of Washington (US)

The University of Washington does buck the trend slightly, but it’s only a two-hour drive to Vancouver, so our point perhaps still stands.

Even in Singapore, with a smaller (if still strong) research base, the three most cited universities include the National University of Singapore and Nanyang Technological University.

It makes sense that governments would prefer, and engage more with, local universities. Many of those universities will have an explicit mission to improve the health, economy or quality of life of the community they’re based in, so you’d hope that they’d be engaged with city- and state-level policymakers and that this would pay off.

Equally it’s hard to see local community participation in policymaking as a bad thing.

But there are less positive spins on local engagement when it’s local for the wrong reasons. We’ve previously blogged about Marc Geddes’ excellent work analysing the home institutions of witnesses who gave evidence to UK parliamentary committees; the TL;DR is that they were disproportionately from prestigious universities close to London, where parliament is based.

For an academic there’s potentially little reward, and certainly no expenses covered, in taking a five-hour train journey to the seat of government to share your expertise; and for a civil servant it’s possibly easier to invite somebody local in for a quick coffee than to engage with somebody on the other side of the country.

To its credit, the UK parliament has spent a lot of time and effort looking for evidence from more diverse voices. The Geddes et al. paper uses data from 2013, and when we ran a similar analysis in 2020 the bias was much less pronounced.

2: Less than 6% of academic research gets referenced in policy

Not everybody gets a medal! Don’t worry if you aren’t being cited in policy - most people aren’t, and it doesn’t say anything about the quality of your work.

The percentage of papers cited varies slightly depending on how you do the analysis, but generally it’s under 6%.

The first study to look at this was presented by Fang et al. at the Altmetrics Workshop in 2020. It found that 3.9% of articles in the Web of Science database appeared in policy documents tracked by Overton.

A later study by Pinheiro et al. in 2021 used the Scopus database and found a slightly higher number of matches: 5.8% of all the articles published between 2008 and 2016 and indexed in Scopus were cited in policy at least once.

The Pinheiro study nicely demonstrates how this varies across subject areas: in a random sample, 47% of papers in Development Studies were cited in policy, but only 0.1% of papers in Drama & Theatre were.
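
If you wanted to run this kind of breakdown yourself, the arithmetic is simple. Here’s a minimal sketch in Python - the table and column names are hypothetical (ours, not from either study) and assume one row per paper, with its subject area and a flag for whether it was ever cited in policy:

    import pandas as pd

    # Hypothetical input: one row per paper, with its subject area and
    # whether it was cited at least once in a policy document.
    papers = pd.DataFrame({
        "field": ["Development Studies", "Development Studies",
                  "Drama & Theatre", "Drama & Theatre"],
        "cited_in_policy": [True, False, False, False],
    })

    # Share of papers in each field with at least one policy citation;
    # the mean of a boolean column is the proportion of True values.
    share_cited = papers.groupby("field")["cited_in_policy"].mean().mul(100)
    print(share_cited.round(1).sort_values(ascending=False))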

It also varies by time: generally speaking, policy citations lag behind academic ones, and it can take years for articles to make their way into the policy world. Older articles will have had more time to be discovered and for policy citations to accrue.

There are notable exceptions: during the COVID-19 pandemic we saw preprints that were only days old being cited by the WHO, CDC and other agencies.
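
Those exceptions aside, one rough way to make the lag concrete is to measure the gap between a paper’s publication year and the year of its first policy citation. A minimal sketch, again with a made-up table and hypothetical column names:

    import pandas as pd

    # Hypothetical input: one row per policy-cited paper, with its
    # publication year and the year of its first policy citation.
    cited = pd.DataFrame({
        "pub_year": [2008, 2010, 2012, 2015, 2020],
        "first_policy_citation_year": [2013, 2014, 2018, 2019, 2020],
    })

    # Years from publication to first policy citation.
    cited["lag_years"] = cited["first_policy_citation_year"] - cited["pub_year"]

    print(cited["lag_years"].median())                      # overall median lag
    print(cited.groupby("pub_year")["lag_years"].median())  # lag by publication-year cohort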

 

3: Social science citations reign supreme

The social sciences are generally poorly served by scholarly citations - it’s the hard sciences that get all the attention.

This is flipped on its head in the policy world. Policymakers look for and cite evidence and research about education, policing and social care, not organic chemistry or quantum physics.

You can see this clearly when you look at individual journals. 7% of all the papers published in PLoS ONE (which covers science & medicine) in 2015 were cited at least once in policy… but so were 30% of the papers in Gambling Studies. 

We analysed this in a bit more detail in our 2022 analysis paper with Martin Szomszor at Electric Data Solutions, which found that social sciences research accrued five or six times more policy citations than work in the physical sciences.

This makes sense: the social sciences deal with people, communities and how societies interact, exactly the kind of things that lots of policymakers are engaged with.

Seeing the data for yourself

We’re always interested in helping researchers, universities and other organisations make use of policy citation data - if you’ve got an interesting research project or would like to dig into the underlying data yourself, please do get in touch.

If you'd like to see the full Overton platform, we'd love to show you around – get in touch to set up a demo or request trial access.

What is Overton?

We help universities, think tanks and publishers understand the reach and influence of their research.

The Overton platform is the world’s largest searchable policy database, with over 12 million documents from 31k organisations.

We track everything from white papers to think tank policy briefs to national clinical guidelines, and automatically find the references to scholarly research, academics and other outputs.