Following on from last week’s blog, here is a little more about who we interviewed and what we asked them regarding their motivations for improved research information management.
The interview demographics cover a range of individuals and organisations involved in the funding, generation, management and reporting of research information. Thus we have spoken to:
- 10 research funders including HEFCE, Research Councils and charities of different sizes;
- Pro-vice chancellors, researchers, research office staff, IT support and information managers for 10 different HEIs spread across the different mission groups;
- 3 “umbrella” organisations representing different sector perspectives – AMRC, ARMA and UCISA;
- 4 vendors of research information management systems;
- 2 research information/data providers.
As outlined in the last post, the transcriptions from these interviews have been analysed. We defined a set of drivers – motivations underlying the interview responses – according to the following categories:
- political, competitive, marketing, research drivers for business delivery
- day-to-day practical, workflow, management, implementation, efficiency drivers
- technology, functional, hardware, software, standards drivers
- cost, saving, resourcing drivers
- statutory, legal, ethical, contractual drivers
- reporting, submission and similar transactional drivers
- user, engagement, adoption, transition drivers
An initial analysis of the interview texts identified around 200 statements that relate to one or more of the above categories. The emerging headlines from the driver analysis will come as no surprise to those involved in this domain.
The biggest driver for HEIs was better quality research information to improve strategy, planning and investment, and thus drive up the quality and impact of their research portfolio. Many of the underpinning drivers related to improving operational efficiency, reducing the reporting burden and duplication of effort, and enabling more agile forward planning based on quality-assured research information. Funders’ responses matched these sentiments well, and included additional drivers such as being able to articulate the longer-term research impact returns on investment from the public purse. Umbrella organisations – not surprisingly – were keen to see greater harmonisation of reporting requirements and systems. This was echoed by researchers, who also wanted to ensure the public has access to this information on research outputs.