“Brexit effect”: UK researchers turn to social media as ambassadors of evidence
Findings from a survey blogged by Heather Crookes, Communications Officer for Sheffield University Management School, show that anti-expert sentiment in the lead-up to the UK’s vote to leave the EU has led some researchers to turn to social media to “use their voices” as “ambassadors for academia”. Interviewees highlighted a renewed sense of responsibility to “get out of our boxes and explode myths”.
The research found that researchers who were using social media (in particular the use of Twitter linked to a blog about their research) reported a range of additional benefits, including the development of international peer networks, knowledge sharing and in some cases improved outputs through crowd-sourced feedback on their work.
Heather Crookes is Communications Officer for Sheffield University Management School, where she manages the School’s website, generates content for blogs and social media and promotes research. Read her work here:
Hong Kong Launches $150M Research Impact Fund
Hong Kong’s University Grants Committee has launched a new competitive Research Impact Fund worth $50M per year for the next three years. Launching the three-year pilot, the Committee’s Chairman, Mr Carlson Tong, said,
“There is an increasing focus on the impact brought about by research projects in our society and economy. We see a need for greater change in the local research culture whereby researchers should be further encouraged to consider actively potential benefits and beneficiaries at the outset and how they will achieve excellence with impact.”
Applications to the fund will be judged on academic merit and potential to generate impact from research. Mr Tong defended the Committee’s commitment to funding the best research, while justifying the move towards funding work that could generate impact:
“Implementing the Fund will not only enable Hong Kong to stay competitive in the globalised higher education sector but will also strengthen our research and educational capacity. While the University Grants Committee will continue to focus on research excellence, increased emphasis will be placed on realising the impact of our research.”
Our greatest impacts were conceptual but we submitted instrumental impacts to REF
Interviews with mathematics heads of department across UK universities, published in Research Policy by Laura Meagher and Ursula Martin, show that mathematicians went with their heads, rather than their hearts, when choosing what to submit to REF2014.
When asked what they saw as the principal impacts arising from research in their discipline, heads of department cited a wide range of impacts, focusing in particular on conceptual impacts. However, Meagher and Martin’s analysis of the 209 case studies submitted to REF2014 by maths departments showed a strong emphasis on instrumental impacts. Writing in Times Higher Education, Meagher and Martin explained that “department heads confirmed to us that these are what they thought the REF panels would want.”
Based on their research, Meagher and Martin urge university managers preparing for the next REF to “value all impact types and to accept that they develop over time, within complex ecosystems of formal and informal relationships that may well be interdisciplinary. And if they want to do well in future REFs, departments must also take steps to facilitate the creation and maintenance of such relationships – including by the provision of incentives and rewards for those who pursue them.”
Laura Meagher is senior partner at Technology Development Group,
a strategic change consultancy in Fife, Scotland. Ursula Martin is Professor of Computer Science at the University of Oxford. The full original article, “Slightly dirty maths: The richly textured mechanisms of impact”, is available at
New call to build first ever best practice library of Pathways to Impact
Fast Track Impact have launched a call for Pathways to Impact and Impact Summaries (or their equivalent) from grant applications. The move has prompted controversy, with some fearing that researchers may be tempted to copy good practice examples. However, launching a new guide, “How to write a winning impact summary and pathway to impact”, Professor Mark Reed wrote, “I believe that by sharing good practice, we can spread innovation, drive up standards in grant writing and improve the likelihood that research delivers impact.”
Illustration by Steve Hutchinson from a workshop about research impact held at the University of Westminster (tweeted by @begumru)
Work submitted to the site will be evaluated by Professor Reed; only good practice examples will be published, with both the strong and weak features of each example detailed. Examples can be submitted from grants that were rejected on the basis of the research, provided the impact elements of the proposal were strong.
Submit your work for evaluation at:
Read the new guide to writing a winning pathway to impact online at:
New methods for synthesising the best available knowledge to inform decisions
Dr Lynn Dicks from the University of Cambridge and colleagues from the EU funded EKLIPSE project have published a collection of 21 methods for identifying and synthesising the best available knowledge to inform decisions.
For each knowledge synthesis method, the report describes likely cost, time required, repeatability, transparency, risk of bias, scale (or level of detail), capacity for participation, data demand, types of knowledge that can be synthesised, types of output that can be produced and specific expertise required.
Researchers from a range of disciplines will find these methods useful for synthesising complex and sometimes contradictory bodies of knowledge arising from research in their discipline, to inform policy and practice.
The report, “Knowledge synthesis for environmental decisions: an evaluation of existing methods, and guidance for their selection, use and development” is available at:
348 years of cumulative experience distilled into 10 evidence-based principles for getting research into policy
Research by Nadine Marshall from CSIRO in Australia, published in PLOS ONE, distils 348 years of cumulative experience shared by 31 environmental experts across three continents into advice for social scientists seeking to increase their influence on policy. These are the 10 principles in the words of the authors:
Nadine Marshall is an Environmental Social Scientist at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO). Read the full paper, “Empirically derived guidance for social scientists to influence environmental policy” at: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0171950
New guide published on how to record and evidence policy impact
The University of Cambridge’s Public Policy Strategic Research Initiative has published a comprehensive new guide to recording and evidencing policy impacts. In it, they identify eight types of policy impact submitted in REF2014, supported by eleven types of evidence.
Evidence of policy implementation was noticeable by its absence. It is likely that the longer lead-in for impact in REF2021 will result in the inclusion of more policy impact case studies that include evidence of policy implementation, for example via policy reviews.
Read “How to Evidence and Record Policy Impact: A ‘how to’ guide for Researchers” at:
Seeking to achieve more significant policy change could get you a lower REF score
An analysis of high versus low scoring REF2014 case studies from the Social Work and Social Policy Unit of Assessment, alongside interviews with academics whose work was submitted to the assessment, suggests that research that was particularly critical of current government policy received lower scores than other types of policy-relevant research.
Dr Katherine Smith and Dr Ellen Stewart from the University of Edinburgh argued that this “supported interviewees’ claims that it is likely to be harder to demonstrate the impact of work aimed at achieving more substantial kinds of social or policy change.” In other words, the more significant the policy impact that is sought, the harder it is likely to be to demonstrate. Smith and Stewart’s “impact ladder” suggests a trade-off between the significance of a policy or social change that is sought and its demonstrability.
Katherine Smith is a Reader in the Global Public Health Unit at the University of Edinburgh, and Ellen Stewart is Chancellor's Fellow at the Centre for Population Health Sciences at the University of Edinburgh.
Researchers share their embarrassing #impactfail stories
Researchers have been sharing their stories of attempts to generate impact that went wrong. Dr Sally Reynolds, Deputy Head of the Institute for Studies of Landscape and Human Evolution at Bournemouth University, despaired one day at a Royal Society forensic footprint exhibit, when an elderly gentleman saw a picture of criminal footwear she was exhibiting and approached her to ask what the best trainers were for arthritic ankles. Asher Minns, Executive Director of the Tyndall Centre for Climate Change Research, gave an inspirational, big picture climate talk to Norfolk County Council, only to be asked one question: will road verges need cutting more often? But our favourite #impactfail came from Alister Scott, Professor in Environmental Geography at Northumbria University, and we’ve illustrated it for you here:
We want to hear your unsung impacts
Over the course of 2018, we will be compiling stories of unsung impacts. These are stories that your press office doesn’t want to hear about and that will never be submitted to any research evaluation. But they are often stories we care deeply about.
Some happened at the wrong time, others happened before the person joined a university, and others clearly happened but you could never prove they wouldn’t have happened without the research. Some of these stories are among the most inspiring, but because we only celebrate “demonstrable” impacts that are significant and far-reaching in exercises like REF2021, we will never hear about the research that saved one person’s life, or transformed an organisation that was subsequently closed down.
These are the unsung impacts. By allowing these stories to remain unheard, we allow our institutions to narrow and instrumentalise the public’s view of the value of research. It may not be “strategic” to tell these stories, but it is essential that we tell them, to celebrate the rich diversity of benefits that can arise from research.
If you have an unsung impact, email it to Madie (pa@fasttrackimpact) and we will include it on the blog and in the next issue of the magazine (we may ask you a few questions to draw out your story a bit more before we publish it).