Research Assessment Reform, Non-Traditional Research Outputs, and Digital Repositories: An Analysis of the Declaration on Research Assessment (DORA) Signatories in the United Kingdom


  • Christie Hurrell, University of Calgary, Calgary, Alberta, Canada



Objective – The goal of this study was to better understand the extent to which digital repositories at academic libraries actively promote the collection of non-traditional research outputs. To achieve this goal, the researcher examined the digital repositories of universities in the United Kingdom that are signatories of the Declaration on Research Assessment (DORA), which recommends broadening the range of research outputs included in assessment exercises.

Methods – The researcher developed a list of 77 universities in the UK that are signatories to DORA and maintain institutional repositories. Using this list, the researcher consulted the public websites of these institutions using a structured protocol and collected data to 1) characterize the types of outputs collected by research repositories at DORA-signatory institutions and their ability to provide measures of potential impact, and 2) assess whether university library websites promote repositories as a venue for hosting non-traditional research outputs. Finally, the researcher surveyed repository managers to understand the nature of their involvement with supporting the aims of DORA on their campuses.

Results – The analysis found that almost all (96%) of the 77 repositories reviewed contained a variety of non-traditional research outputs, although the proportion of these outputs was small compared to traditional outputs. Of these 77 repositories, 82% featured usage metrics of some kind; however, most (67%) were not minting persistent identifiers for items. Of the universities in this sample, 53% also maintained a standalone data repository. Of these data repositories, 90% featured persistent identifiers, and all featured metrics of some kind. In a review of university library websites promoting the use of repositories, 47% mentioned non-traditional research outputs. In response to survey questions, repository managers reported that the library and the unit responsible for the repository were involved in implementing DORA, and managers perceived it to be influential on their campus.

Conclusion – Repositories in this sample are relatively well positioned to support the collection and promotion of non-traditional research outputs. However, despite this positioning, and despite repository managers’ belief that realizing the goals of DORA is important, most libraries in this sample do not appear to be actively collecting non-traditional outputs, although they are active in other areas of research assessment reform.








How to Cite

Hurrell, C. (2023). Research Assessment Reform, Non-Traditional Research Outputs, and Digital Repositories: An Analysis of the Declaration on Research Assessment (DORA) Signatories in the United Kingdom. Evidence Based Library and Information Practice, 18(4), 2–20.


