Section 6: Topic 2

Overview of evaluation methods

1. Quantitative, experimental and statistical:

  • Description: These approaches establish causal links between research and impact via media engagement in one of two ways: experimentally, by comparing two otherwise identical cases (cases include individuals, sites, environments/contexts), one manipulated and the other held as a control; or statistically, via a relationship between dependent and independent variables, or a statistical difference in an effect before/after or with/without an intervention.

  • Methods include: statistical modelling, longitudinal analysis, econometrics, difference-in-difference (also called the double difference method), propensity score matching, instrumental variables, analysis of distributional effects, experimental economics.

  • Examples: A/B testing of behaviour change messages via social media, tracking and statistically analysing the difference in adoption of new behaviours arising from each message; or asking a statistically robust cross-section of the public about their perception or understanding of an issue before and after planned media coverage, asking whether they saw the coverage during the study period, and using this to determine the extent to which the coverage changed understanding and perceptions.
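The A/B testing example above can be sketched as a two-proportion z-test: did message A produce a different adoption rate than message B? This is a minimal sketch with hypothetical adoption counts, implemented directly from the standard normal CDF rather than a statistics library.

```python
from math import sqrt, erf

def two_proportion_z(adopt_a, n_a, adopt_b, n_b):
    """Two-sided two-proportion z-test comparing adoption rates
    between two groups exposed to different messages."""
    p_a, p_b = adopt_a / n_a, adopt_b / n_b
    p_pool = (adopt_a + adopt_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF (Phi(x) = (1+erf(x/sqrt(2)))/2)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 120/1000 adopted the behaviour after message A,
# 90/1000 after message B
z, p = two_proportion_z(120, 1000, 90, 1000)
```

In this hypothetical case z is roughly 2.2, so the difference between the two messages would be unlikely to arise by chance at the conventional 5% level.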


2. Theory of change and logic driven:

  • Description: Theory and logic driven approaches to impact evaluation trace causal chains from research to impact, based on an anticipated logic or a theory of likely change. The explicit consideration of risks and assumptions in both approaches makes them well suited to evaluating whether the research was a sufficient cause of impact in the context of other contributory or confounding factors. The more closely reality corresponds to what was expected in theory at the outset, the stronger the case that the research contributed to the outcomes.

  • Methods include: Theory of Change, Logical Framework Analysis, Payback framework and other logic models.

  • Example: A theory of change for a charity’s plan to raise awareness of an issue they are campaigning about shows how different types of mass media coverage, combined with amplification and targeting via social media, will raise awareness in specific groups, which may then lead to specific changes in behaviour. Surveys of the target groups show that understanding is growing and attitudes are changing, although there is no evidence of behaviour change yet. Attention is then paid to the link between awareness/attitudes and behaviour to see if more time or a new strategy is needed to reach the ultimate change proposed in the Theory of Change.
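As a minimal illustration of the example above, a Theory of Change can be represented as an ordered causal chain, with evaluation attention moving to the first link that lacks evidence. The steps and evidence flags below are hypothetical.

```python
# Hypothetical causal chain for an awareness-raising campaign:
# each link records whether evaluation evidence supports it yet.
chain = [
    {"step": "media coverage published",        "evidence": True},
    {"step": "target groups see the coverage",  "evidence": True},
    {"step": "awareness and attitudes change",  "evidence": True},
    {"step": "behaviour changes",               "evidence": False},
]

def furthest_evidenced(chain):
    """Return the last step in the chain with supporting evidence;
    evaluation effort then focuses on the first unevidenced link."""
    last = None
    for link in chain:
        if not link["evidence"]:
            break
        last = link["step"]
    return last
```

Here `furthest_evidenced(chain)` returns the awareness/attitudes step, mirroring the example: the theory holds as far as attitude change, and the open question is the final link to behaviour.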


3. Qualitative and arts-based:

  • Description: Qualitative and arts-based evaluation methods tend to establish cause and effect between research and impact by checking between (or “triangulating”) multiple sources of evidence to create a credible, evidence-based argument. Both qualitative and arts-based methods can be participatory, engaging beneficiaries and other stakeholders in shaping the evaluation itself, which has the potential to further enhance impact.

  • Methods include: Testimonials, ethnography, participant observation, qualitative comparative analysis, linkage and exchange model, interviews and focus groups, opinion polls and surveys, other textual analysis e.g. of focus group and interview data, participatory monitoring and evaluation, empowerment evaluation, action research and associated methods, aesthetics, oral history, story-telling, digital cultural mapping, (social) media analysis, poetry and fiction, music and dance, and theatre.

  • Examples: Focus groups with newspaper readers, giving them the story to read at the start of the session, with activities to explore what they have learned, agree or disagree with, and/or are likely to do as a result of what they now know; or getting participants to turn a media story into a piece of amateur theatre, providing them and/or their audience with an opportunity for further debate and learning, which may be recorded to evaluate changes in awareness or intentions expressed during the debate.


4. Systems and pathway analysis:

  • Description: Systems and pathway analysis methods attempt to disentangle the messy complexity of impacts that occur in complex systems (in contrast to logic and theory-driven approaches, which are used in a more linear way and often also in impact planning). They tend to draw on a range of qualitative and quantitative research methods to depict more complex cause-and-effect relationships. They can capture the complex range of other factors mediating impacts, enabling arguments that the research made a significant contribution to the impact even where direct and sole attribution is not possible.

  • Methods include: Contribution analysis, knowledge mapping, Social Network Analysis, Bayesian networks, agent-based models, Dynamic System Models, influence diagrams, Participatory Systems Mapping, Bayesian Updating.

  • Example: Social network analysis with qualitative interviews tracing how knowledge about a research innovation described in the media travelled from person to person through peer-to-peer networks, and either led to impacts in policy or practice or got stuck with certain gatekeepers within the network.
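The social network analysis example above can be sketched in plain Python (the network and names below are hypothetical): a breadth-first traversal traces how far the story travels from its source, and removing each intermediary in turn shows which gatekeepers cut off downstream reach.

```python
from collections import deque

# Hypothetical sharing network: who passed the media story on to whom
network = {
    "journalist":     ["gp_lead", "charity"],
    "gp_lead":        ["clinic_a", "clinic_b"],
    "charity":        [],
    "clinic_a":       ["policy_advisor"],
    "clinic_b":       [],
    "policy_advisor": [],
}

def reached(graph, source, removed=None):
    """Nodes the story reaches from `source` by breadth-first search,
    optionally simulating the removal of one intermediary."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt != removed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {source}

baseline = reached(network, "journalist")
# A gatekeeper is an intermediary whose removal cuts off many others:
# for each reached node, count how many others become unreachable without it.
gatekeepers = {
    n: len(baseline) - 1 - len(reached(network, "journalist", removed=n))
    for n in baseline
}
```

In this toy network the story reaches five people, and `gatekeepers` shows that removing `gp_lead` cuts three others off, flagging that node as the point where knowledge could get stuck.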


Broadly speaking, two factors can help you select from these four types of evaluation design. First, ask yourself to what extent you need summative feedback that will prove your media engagement led to impacts, versus formative feedback that will enable you to engage more effectively in future. Second, ask yourself whether you need to evidence that the impact would not have been possible without the research or media engagement, or whether you only need to evidence that it was highly likely, while acknowledging the range of other contributing factors that may also have played a role. If you need summative feedback with a high degree of confidence in your conclusions, you may want to invest in quantitative and experimental methods. If, on the other hand, you are just as interested in formative feedback and want to disentangle the role of the research or media engagement as one of many factors contributing to the impact, then theory and logic-driven or systems and pathway analysis methods may be more appropriate. Depending on the nature of the impact being evaluated and the context in which it occurs, qualitative and arts-based methods may provide either formative or summative feedback, and more or less robust assessments of the likelihood that impacts arose from a specific piece of research or media engagement.