Fast Track Impact wins Royal Society prize for changing research
Fast Track Impact has been awarded a prize by the Royal Society for its work on research impact. The prize was given at the Royal Society conference “Research Culture: Changing Expectations” on 29th October 2018. The conference marked the culmination of two years of work by the Society exploring how the UK can promote the cultural conditions that will best enable excellent research and researchers, here and elsewhere, to flourish in the future.
Prof. Reed of Fast Track Impact led one of six teams who pitched to a panel of judges, including Rebecca Endean (Director of Strategy for UK Research and Innovation), Dr Steven Hill (Director of Research at Research England) and Dr Magdalena Skipper (Chief Editor at Nature). Building on his research on impact, he pitched (with Rich Young and Tanya Collavo from Univate) a platform to bring researchers into contact with professionals who have important questions, integrating evidence, training and guidance from Fast Track Impact. Over the last three years, the company has trained almost 5000 researchers from over 200 institutions in 55 countries to change how researchers generate and share knowledge so they can change the world.
Prof. Reed said, “Love it or hate it, research impact is now a significant part of UK research culture. We need to think more about how to stimulate healthy research cultures that motivate and inspire people to engage with the outside world for a diversity of reasons, rather than just extrinsically incentivising researchers with funding and promotion. When impact is at the heart of a research culture, there are always fresh challenges and ideas, and teams of people who care deeply about what they do.”
The Royal Society is the oldest scientific institution in the world. It published the first ever scientific journal, counted Isaac Newton among its presidents, and has led changes in research culture globally over the centuries.
Foreign researchers less likely than national researchers to share findings with local stakeholders
A new study appearing in volume 48 of Ambio in January 2019 has shown that national researchers are more likely to share findings with local stakeholders, while foreign researchers are more likely to share via international journals in English.
The study considered how Bolivian researchers shared findings about their ongoing work in the Bolivian Amazon compared with their foreign counterparts. Of the researchers interviewed for the study, 83% believed that their work had had implications for management at community, regional and national levels. However, these beliefs were not reflected in how researchers actually disseminated their work. Rather, the strongest predictor of how and with whom a researcher shared their work was whether they were based at a foreign or national institution. Foreign-based researchers had extremely low levels of local, regional or even national dissemination. However, they were more likely than national researchers to publish their findings in the international literature, in English. Dr Anne Toomey, Assistant Professor of Environmental Studies and Science at Pace University, who led the research, raised concerns about the ethics of this behaviour:
“This disparity raises concerns about whether foreign-led research in tropical nations such as Bolivia is perpetuating colonial-era legacies of scientific extractivism. It’s not enough for researchers to call it a day after they publish their results in journal articles read by a handful of colleagues and few, if any, people outside the ivory tower. Rather than impact being addressed at the end of research, societal impacts can be part of the first stages of a study. For example, people living in the region where data is to be collected might have insight into the research questions being investigated; scientists need to build in time and plan ways to ask them. Ecological fieldwork presents many opportunities for knowledge exchange, new ideas and even friendships between different groups. Researchers can take steps to engage more directly with community life, such as by taking a few hours to teach local school kids about their research.”
Celín Quenevo and other leaders of the Takana indigenous nation raised money in the 1990s to translate a 1950s book written about the Takana people by a German anthropologist into Spanish. Anne Toomey, CC BY-ND
Accusations of political influence on the grading of impact in REF2014
New research has provided evidence of political influence on the grading of impacts in some Units of Assessment in the Research Excellence Framework (REF2014). The research, funded by the Economic and Social Research Council, included interviews with REF panellists, before and after the assessment, and participant observation of REF panel deliberations.
Lead author Dr Gemma Derrick, Senior Lecturer in Higher Education at Lancaster University, explained that, “prior to the evaluation process, participants demonstrated a strong preconceived, political belief that the results of the evaluation process must ‘showcase’ the value of British research to the public and policymakers as part of a rationale designed to ensure continued public-based research funding. Post-evaluation interviews revealed how, during the societal impact assessment, evaluators drew on these strong beliefs which informed a group-based strategy of ‘generous marking’ of submissions.”
Read the full article at: https://academic.oup.com/spp/article/45/5/673/4819248
Evidence that pedagogical research was held back from REF2014
Concerns about research quality, expertise of reviewers and university politics limited the inclusion of pedagogical research in REF2014, according to new research funded by the UK Higher Education Academy.
Prof. Debby Cotton, lead author of the article in Studies in Higher Education, wrote, “the status of pedagogic research in higher education – once described as the ‘Cinderella’ of academia, but now an increasing part of university research activity – has prompted some controversy. Both policy-makers and academics have raised questions about whether such research is appropriate for submission [to REF], and confusion exists over the distinction between pedagogic research and ‘scholarship of teaching and learning’.”
Read the full article at: https://srhe.tandfonline.com/doi/abs/10.1080/03075079.2016.1276549#.XBuuOs_7QWo
Research England Director sets out the argument for substantially increasing the weight of impact in REF2028
Steven Hill, Director of Research Policy for Research England, has published an article in the journal Palgrave Communications setting out an argument for substantially increasing the weighting of impact assessment in future iterations of the UK’s Research Excellence Framework.
Arguing that “the assessment of societal impact needs to become a more central aspect of research evaluation”, the article contrasts the evaluation of institutions against “mode 1” criteria (research quality) versus “mode 2” criteria (as judged by the assessment of case studies of societal impact). Dr Hill asked, “is there a case for making assessment of societal impact ‘mainstream’, even to the extent of judging research (and researchers) on the basis of ‘mode 2’ criteria [impact] alone?”. He argued that the justification for public funding of research rested primarily on the delivery of impact, which research assessments should therefore incentivise. In this light, he questioned whether removing research quality criteria would adversely affect the outcomes of research that are most valued by the public.
Dr Hill wrote, “Given the history and importance of ‘mode 1’ [research quality] criteria within the culture of research, it seems likely that there is considerable scope for increasing the emphasis on assessment against ‘mode 2’ [impact] criteria without jeopardizing the delivery of societal impact.”
Read the full article at: https://www.nature.com/articles/palcomms201673
Researchers from countries that capture the most EU research funding report more impact from their work
New research from the Universities of Manchester, Leiden and Tampere has compared research impact practices in the social sciences and humanities in countries that capture the most and least EU Horizon 2020 research funding. “Societal impact” determines one-third of a project’s success in Societal Challenges funding from Horizon 2020. The study analysed 60 case studies from 16 countries, and showed that researchers from the highest-performing countries had a higher capacity for generating impact and reported more impact than those from the lowest-performing countries.
Engagement with the policy community was common across all countries, but impact case studies from low-performing countries were more likely to report policy barriers to impact, compared with high-performing countries, where policy makers were more likely to approach researchers, recognised the value of the research and sometimes helped facilitate impact.
Engagement was more likely to be one-way in case studies from low-performing countries, with reports, meetings and presentations dominating. In contrast, case studies from high-performing countries used a far more diverse range of pathways, including training, discussion fora and consultancy roles.
None of the cases from low-performing countries involved private partners as formal project partners. In high-performing country cases, private partners were not only involved as knowledge users but also regularly more formally as funders of research.
Although only a minority of case studies mentioned them, all those that described specific competencies enabling impact came from high-performing countries, and the majority of those claiming to have had no support for impact came from lower-performing countries. Types of support found in high-performing country cases included finance (Italy), societal impact awards (Iceland, Switzerland), communication support (The Netherlands, Germany, Switzerland), and ICT support (Switzerland).
High-performing countries were more likely to have impact policies or national funders that required them to demonstrate impact.
Read the full article at: https://academic.oup.com/rev/advance-article-abstract/doi/10.1093/reseval/rvy036/5238820
Should we stop funding art for impact?
Dr Andrew Hewitt from the University of Northampton has attacked UK public funding of art commissioned for impact as “debasing” and “instrumentalising” art to “become complicit with and functional for an agenda of privatisation and marketisation”.
In his article for the Kunstlicht visual art journal, he identifies three forms of rhetoric common in public art commissioning for impact: art commissioned to produce social cohesion through cultural participation; art to drive economic development, for example as part of regeneration projects; and art to nudge people towards becoming better citizens, for example through outreach programmes to disadvantaged groups designed to empower them to become less dependent on state provision.
He argues that this shows how cultural policy is “a steering medium that promotes ideas of economics and ways of living that are increasingly informed by neoliberal values and that further undermine public interest, social justice, and democratic debate”. Debate on the relationship between art and politics will continue for many years to come, but engaging in the debate may help prevent art being harnessed unthinkingly to the cause of impact.
Read the full article at https://tijdschriftkunstlicht.nl/wp-content/uploads/online_andrew-hewitt_agendaofimpact.pdf
New Public Engagement Evaluation Toolkit launched
Queen Mary University of London (QMUL) have launched a Public Engagement Evaluation Toolkit. The toolkit and an accompanying research paper in Research For All were developed for QMUL’s Centre for Public Engagement by Fast Track Impact, the National Co-ordinating Centre for Public Engagement and Dialogue Matters.
As well as giving an overview of how to approach evaluating public engagement projects, the toolkit showcases 21 creative tools which are designed to inspire you to integrate evaluation and monitoring into your activity. This enables you to evaluate your work in a way that fits the setting and tone of your event or initiative, and the needs of your audience/participants. Here are a few examples:
A postcard to your future self: Why not give people postcards from your event to write to themselves saying what they will do differently as a result of their experiences? Make sure they write their address on the card. You can then post the postcards back to participants 1–3 months after the event to remind people of their commitments and ideas. Depending on the consent given, you may be able to follow up with interviews or questionnaires to find out if they have done anything differently as a result of your project.
Video diaries: Participants could record their reflections at intervals throughout a project using their mobile phones, either using a video app on their phone or a video diary app such as VideoPop, SocialCam Video Camera, My Video Diary or LifeCloud.
Event app: Some app developers provide cost-effective off-the-shelf, customisable event apps, which can be used to gather audience information and feedback. These are typically owned by universities that host a range of events. Users are able to choose from upcoming events to get access to the event programme, speakers, maps, directions, social media streams and feedback forms. These apps provide value to attendees whilst providing you with the option to contact people via an app notification during and/or after the event to request feedback.
Reward cards: For public engagement initiatives that include multiple events or activities, you can incentivise engagement with as many activities as possible by using a reward card system, whereby participants collect stickers or stamps at each of the events or activities they take part in. A full set of stickers or stamps makes them eligible for a reward or prize. For example, at QMUL’s Festival of Communities, completing the “sticker challenge” wins you a cuddly bee! If the reward (such as a discount code or voucher) can be sent via email or text message, you could also seek consent to retain contact details for future engagement, providing an opportunity for longitudinal follow-up as part of future evaluation. If each activity has a unique sticker or stamp, you can identify particularly popular events or activities by counting the stickers on submitted reward cards (or the empty sticker sheets), which can help with planning future events.
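If you collect the reward cards digitally, the sticker-counting step above is a simple tally. The sketch below (with made-up activity names, not ones from the QMUL toolkit) shows one minimal way to count which activities appear on the most submitted cards:

```python
from collections import Counter

# Hypothetical data: each submitted reward card lists the activities
# at which the participant collected a sticker or stamp.
submitted_cards = [
    ["science-busking", "lab-tour", "sticker-challenge"],
    ["lab-tour", "sticker-challenge"],
    ["science-busking", "lab-tour"],
]

# Count how many cards carry each activity's sticker, to see which
# activities attracted the most participants.
tally = Counter(sticker for card in submitted_cards for sticker in card)

# Print activities from most to least popular.
for activity, count in tally.most_common():
    print(f"{activity}: {count}")  # e.g. "lab-tour: 3"
```

The same tally, broken down by event day or venue, could feed directly into planning the next programme.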
Download the full toolkit at: https://www.qmul.ac.uk/publicengagement/goodpractice/evaluation-toolkit/
If you’re looking for further support and guidance on evaluating your public engagement activities, NCCPE can help: https://www.publicengagement.ac.uk/
Read the accompanying article: Reed MS, Duncan S, Manners P, Pound D, Armitage L, Frewer L, Thorley C, Frost B (2018). A common standard for the evaluation of public engagement with research. Research For All