Module 2: What is impact?

What is impact and where does the media fit in?

The word ‘impact’ is problematic for many, given that it could be either positive or negative (and it has connotations of (possibly painful) collisions!). The complexity of the concept is summed up in this recent academic definition of impact:

“Perceived and demonstrable benefits to individuals, groups, organisations and society (including human and non-human entities in the present and future) that are causally linked (necessarily or sufficiently) to research.”

(Reed et al., in press)

In plain English, this definition is saying that impact is the good that researchers can do in the world. In a word, impact is benefit. 

 

Linked to this then, engagement is whatever you do to generate those good things. In some grant application forms, researchers are asked to describe their engagement as their “pathway to impact”. Using this metaphor, you may have many different paths to choose from, and you could have a team of people all travelling different paths to help you reach your destination. One of those pathways may be media engagement. As you stop to rest and take stock of your progress along the path, you often realise that you have reached important milestones on your pathway to impact. As a result of your media engagement, for example, you might have raised awareness of an important issue. Your task then is to enable the researchers you work with to capture evidence of that change in awareness before continuing their journey (sometimes with you and sometimes with others) to reach the next milestone, for example attitudinal or behaviour change. 

 

Crucially, this approach to impact focuses on the concept of benefit. It is surprising how much clarity it brings when you simply ask yourself “What was the benefit?”. Keep asking who benefits and how, and you will discover whether there has been any impact. If there is no evidence of impact, then you have not yet reached your destination, and you need to work out what path will take you to those ultimate benefits. Very often, media engagement is one of the paths that can take researchers to that destination. 

 

The reason that media engagement is sometimes seen as a distraction from impact is that press offices rarely stay on the journey with researchers all the way to impact, and when they do, they are only able to provide evidence of media reach. If you care about impact as benefit, then it doesn’t matter how many people you reached if none of them actually benefited. For all you know, you may have just been generating noise or, worse, you may have generated misunderstanding, offence and negative unintended consequences. The reach you can bring through media engagement only has value as impact if you can also show that at least a proportion of those you reached benefited in some way. 


Impact - the component parts


Let’s break down the idea of impact as “the good researchers do in the world” into its component parts:

  • There is an implicit value judgment in this definition; you are seeking benefits and working for the good of others beyond the academy. This means you need to reflect on whether there may also be unintended negative consequences, and do everything you can to avoid these. It is our responsibility to anticipate and assess the potential consequences of research and work with stakeholders to design responsible, sustainable and inclusive research (for more information, see resources on ‘responsible research and innovation’ under ‘further reading’). 

  • There is an implicit venue for those benefits in this definition: they lie beyond the academy. There are, of course, many forms of academic impact you may be equally interested in (for example, bibliometric indicators of impact), but this toolkit is concerned with non-academic impacts.

  • Impact may be direct or indirect. If someone else is able to use your non-applied research (say a new mathematical algorithm or theory) to derive significant benefits (say a piece of software that saves lives), and that benefit would not have been possible without your research, then you can share some of the credit for that impact. 

  • Of course, for this to be ‘research impact’, the benefits must be clearly linked to research from your institution. This is covered under Evidencing impacts from media engagement, where your task is to build evidence-based arguments that show causal links between research findings and benefits to those who engaged with your media. However, it is perfectly normal to go beyond your own institution’s research to draw on other evidence to help the people you are working with, or just get involved in some other way that has nothing to do with research but that helps make a difference. If you are drawing on other people’s research, that’s still research impact (but you won’t be able to claim this as impact from your research). If you are doing something else to help that is not related to research, then that’s still impact, but it isn’t research impact (and you won’t be able to claim that as impact from your research either). It is important to be prepared to ‘go the extra mile’ and help those you are working with in ways that go beyond your own institution’s research if you want to maintain trust and avoid the perception that you are only doing this for your own gain. In many cases, the most effective approach is to find other researchers who can help. In this way, you are able to add value to the publics and stakeholders you are working with, whilst providing opportunities for impact to your colleagues. 

  • Finally, impact is often conceptualised as beneficial change, but you may have just as much of an impact if your research prevents a damaging or harmful change from occurring. Impacts can be immediate or long-term, in your back yard or in outer space, transforming one person’s life or benefiting millions, tangible or elusive. Defined broadly, impact is rich and varied, and has value whether or not you are able to ‘prove’ it to others. However, if you want to robustly claim and talk publicly about the impact of your institution’s research, the impacts will need to be demonstrable. There are two ways in which you will need to demonstrate impact: you will need to provide evidence that you achieved impact (and ideally that this was significant and far-reaching); and you will need to provide evidence that your research contributed toward achieving those impacts. The key word here is ‘contribution’. It is rare that a researcher is able to claim all the credit for an impact linked to their work. There are almost always other lines of evidence (or argument) that have contributed toward the eventual impact. The need to demonstrate impact tangibly may skew researchers towards particular types of impact that are easier to attribute to the research and evidence. There are concerns that the perceived challenges of evidencing impact from media engagement may be dissuading some from engaging with their press office, so it is essential that you learn how to provide evidence of significant benefits, in addition to reach – see Evidencing impacts from media engagement.

 

For these reasons, definitions of research impact from institutions charged with assessing impact tend to include “demonstrability”[1]. It is not enough just to focus on activities and outputs that promote research impact, such as staging a conference, publishing a report or making it onto the front page of newspapers around the world. There must be evidence of research impact, for example, that the media coverage influenced public opinion or policy, or was taken up by a business to build a new product or service. 

How is impact evaluated? What counts?

You will learn more about Evidencing impacts from media engagement later in the toolkit, but it is worth noting here that impact is usually judged against two criteria, significance and reach:

  • First, ask yourself: how significant are the benefits of the research you are covering? How meaningful, valuable or beneficial is this research to those who are likely to engage with your media work? 

  • Second, ask yourself how far-reaching these benefits are. You can get metrics for readers, viewers and the like fairly easily, but how many of these do you think actually benefited in any specific way? You might also ask whether there are other groups who might benefit in similar ways, or new applications of the research that could bring new benefits to new groups, enabling the researchers to extend their reach in important ways. 

The order in which you ask yourself these two questions is crucial. I would argue that if you do something that reaches every country of the world across multiple social groups, but no one really cares, or benefits in any tangible or meaningful way, you don’t actually have an impact. On the other hand, if you save one person’s life as a result of your institution’s research, you clearly have a significant impact. Therefore, first ask yourself what you can do that would be significant on whatever scale you feel is achievable at this point. It may be one company, your local community or your local hospital, but if you think you could actually achieve something significant on that scale, then focus on that.

 

This may mean using your media skills to enable researchers to target local newspapers that have very limited reach, or to develop online videos about their research to train clinicians in a pilot hospital or company, as proof of concept that their research actually has benefits before you ever get to the national or international scales that press offices tend to focus on first. If you are able to help researchers collect evidence of the benefits of their engagement, then the evidence that it worked for one company, one community or one hospital makes it much easier for others to follow in their footsteps and may generate organic demand for the same benefits elsewhere. 

What types of impact are there?

There are many different types of impact, with some types leading to others. Institutional definitions of impact often list types of impact, but there have been few attempts to categorise these to date. For example, the Higher Education Funding Council for England defines impact as “an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”. More simply, the Australian Engagement and Impact Assessment defines impact as “the contribution that research makes to the economy, society, environment or culture, beyond the contribution to academic research”.

 

It is possible to distinguish between ten types of impact. Categorising impacts in this way is useful, because it gives you a checklist for considering the full range of possible impacts you could seek to deliver through media engagement. Even if a researcher has a narrow focus on one type of impact (say the economic impact of a spin-out company), it is often worth looking through the other types of impact that might arise, to consider whether you might also be able to help them generate these benefits. For example, a company’s new product may replace something that was energy intensive to produce, and so reduce greenhouse gas emissions, giving them an environmental impact as well as the original economic impact. 

 

Table 2 shows you the types of impact you can look for. The following subsections define each of these types of impact and give you examples of the sorts of things you might seek to do to achieve each type of impact. 

Table 2: Research impact typology with a fictional list of examples arising from media coverage of a new annual equality and diversity ranking and toolkit for FTSE100 companies based on research showing that companies that make provisions for hidden disabilities and have diverse boards perform better economically

 
How does impact happen?

There is one universal precursor to impact: learning. Published research is typically in the form of data and information (useful data), but for someone to benefit from or use research from your institution, this data and information need to be transformed into knowledge in someone’s head. This happens through learning: someone somewhere needs to learn about your institution’s research. Therefore, if you want research to have an impact, you need to find new ways of making this research both accessible and understandable to the people who can benefit from or use it most. 

 

Press offices have many crucial skills that can enable target audiences to learn more effectively about research. But good communication starts with empathy – you need to truly understand your audience if you want to develop communications that really resonate with them. To do this, you need to enable researchers to patiently nurture relationships with those who are interested in and can use their research, learning with them about the needs and interests of these groups. This takes humility and empathy, because you need to listen and learn if you want to understand who these people are, and what motivates them. 

 

Empathy is at the heart of media engagement that generates impact, but more specifically, there are five factors that can explain whether or not a pathway to impact is likely to work:

  1. Context and purpose: the impact generation process always starts in a given context, for example, the culture, educational status and interests of a particular public, or the emergence of a new challenge (such as a new disease) or opportunity (such as a new technology). Within this context, researchers and various social groups may wish to achieve specific benefits (your purpose or impact goal), for example, learning about the work of a nationally significant artist, or finding a cure for a disease. As contexts or purposes change over time, you need to adapt your pathway to impact, considering how you may deal differently with each of the factors below. For any given context and purpose, each of the steps required to generate impact will vary significantly.

  2. Who initiates and leads on the pathway to impact: researchers, publics and/or stakeholders may initiate and lead the impact generation process. Who initiates and leads the process matters: there is evidence that impacts vary systematically based on the group that has ownership of the pathway to impact. For example, your pathway to impact may be self-organised from the bottom-up, initiated and led by those seeking the benefits. Alternatively, impact may be initiated through more top-down approaches, where plans to achieve benefits are initiated and led by researchers or other external agencies, such as the government. 

  3. Representation: your engagement with stakeholders and publics is likely to vary from full to partial representation of different groups and their interests. Partial representation may be deliberate (for example, as part of a phased approach to engaging increasingly influential or hard-to-reach groups), or due to a lack of time or resources. There is evidence that pathways to impact are significantly affected by who is engaged in the pathway, and inadvertently overlooking important groups can undermine your attempts to achieve impact.

  4. Design: the way you engage with publics and stakeholders may be designed as communicative (one-way flows of knowledge from researchers to stakeholders and/or publics), consultative (one-way from stakeholders to researchers), deliberative (two-way knowledge flows) or co-productive (joint production of knowledge). Your choice of approach should be adapted to who you are engaging with (point 3 above), who initiated and is leading the process (point 2), and your context and impact goals (point 1). 

  5. Power: finally, depending on the design of the process and its facilitation, power dynamics between researchers, publics and stakeholders may be more or less effectively managed, strongly influencing the ultimate achievement of benefits or unintended consequences. 

 

Ultimately, the likelihood of your pathway to impact working depends on each of these five factors. Get these right, and you are highly likely to achieve your impact goals. Get them wrong, and you are far more likely to fail, potentially leading to unintended negative consequences.