Capture the evidence of impact

Michael · Elements Updates

Image credit: Shutterstock

A study from the University of Oxford’s Department of Zoology, which discovered that elephants are frightened of bees, led to a marked reduction in elephants raiding farms in several parts of Africa.

Research at the University of Sheffield on the links between memory, trauma and narrative has improved the mental health and well-being of patients in secure hospitals by influencing therapeutic practice, while increasing understanding between – and changing attitudes towards – socially marginalised groups.

For many disciplines, the citation frequency of scholarly articles, alongside measures of the outlet in which they are published, remains a well-used (albeit controversial) means of evaluating the academic ‘impact’ of research. However, the wider effects of research can’t always be captured via traditional metrics. In recent years, a growing number of funding agencies have started to ask institutions and researchers to provide qualitative evidence of the impact generated by externally funded research.

In 2014, for example, the US National Institutes of Health (NIH) introduced a new version of its Biosketch to “redirect the focus of reviewers and the scientific community more generally from widely-questioned metrics” towards a more qualitative outline of a researcher’s work.
The UK government has also placed increasing emphasis on evidence of economic and social returns from its investment in research. For the 2014 Research Excellence Framework (REF), UK higher education institutions submitted 6,975 impact case studies detailing the effects of their research on wider society.

And in the last few days, the Australian Research Council announced that it will soon introduce, for the first time, a national impact and engagement assessment, which will run as a companion exercise to the existing Excellence in Research for Australia (ERA) assessment.

We expect this trend to continue in the coming years as the assessments of governments and funding bodies around the world evolve to consider the wider effects of the research that they fund.

Announcing the Impact Module

Developing a comprehensive evidence base of the wider impact of research undertaken within an institutional setting is a challenge for many reasons:

  • Timing – research impact often occurs long after the research is completed.
  • Attribution – collaborative research involves multiple stakeholders and many inputs, often made in isolation from each other.
  • Classification – it is not always obvious who benefits from a piece of research, so reporting on a diverse range of impacts can quickly become very complicated.

Challenges such as these prompted many of our clients to ask us to help them capture the emerging evidence of research impact resulting from their researchers’ work. Twelve months and multiple usability workshops later, we are very pleased to announce the first release of our Impact module!

Institutions across the UK, Australia and other parts of the world are now looking at how research impact can be planned for, recorded and monitored proactively – some even appointing Impact Officers to support and monitor the process.

[Screenshot: Adding a narrative in the Impact module]

The module provides an easy way not only to capture evidence of impact, whether a document or a web link, but also to write a narrative explaining the relevance and context of that evidence. These narratives can be brief or extensive; the key benefit is that they can be built up over time, rather than written retrospectively. Evidence of impact is often laborious to find, collect and store, and is frequently kept in disparate systems across an institution.

With the Impact module, everything is kept in one place and is easy to access. This mitigates problems in future reporting: data can be extracted from a single, comprehensive system, ensuring consistency.
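As a purely illustrative sketch – not the module’s actual data model – an impact record of this kind can be thought of as a growing collection of evidence items (each a document or web link) plus dated narrative entries that accumulate over time. All class and field names below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Evidence:
    """A single piece of impact evidence: a document or a web link."""
    description: str
    source: str  # file path or URL


@dataclass
class ImpactRecord:
    """Hypothetical impact record: evidence plus a narrative built over time."""
    title: str
    evidence: list = field(default_factory=list)
    narrative: list = field(default_factory=list)  # (date, text) entries

    def add_evidence(self, description: str, source: str) -> None:
        self.evidence.append(Evidence(description, source))

    def add_narrative(self, entry_date: date, text: str) -> None:
        # Entries are appended as they happen, not reconstructed retrospectively.
        self.narrative.append((entry_date, text))

    def full_narrative(self) -> str:
        # Assemble the entries in chronological order into one narrative.
        return "\n".join(text for _, text in sorted(self.narrative))


# Example usage with made-up data:
record = ImpactRecord("Bee fences reduce crop raiding")
record.add_evidence("Field report", "https://example.org/report.pdf")
record.add_narrative(date(2015, 3, 1), "Pilot fences installed on two farms.")
record.add_narrative(date(2015, 9, 1), "Raids reduced; scheme extended to neighbouring villages.")
```

The design choice mirrored here is the one the module emphasises: narrative entries are added incrementally alongside the evidence they contextualise, so the full story can be assembled at reporting time from a single record.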

In our new module the recording of impact is based on qualitative, rather than quantitative, methods – it is designed so that researchers can demonstrate evidence not with metrics, but with narratives.

We think this is the best way that we can help institutions to record the impact of their research, while empowering researchers to demonstrate their own pathways to impact. The module was created through close collaboration with our user community, and we look forward to developing it further next year.

The Impact module was released as part of Elements v4.17. Subscribers to our support site can find out more in our release note.