Research Assessment Reform, Non-Traditional Research Outputs, and Digital Repositories: An Analysis of the Declaration on Research Assessment (DORA) Signatories in the United Kingdom

Authors

  • Christie Hurrell University of Calgary, Calgary, Alberta, Canada

DOI:

https://doi.org/10.18438/eblip30407

Abstract

Objective – The goal of this study was to understand the extent to which digital repositories at academic libraries actively promote the collection of non-traditional research outputs. To achieve this goal, the researcher examined the digital repositories of universities in the United Kingdom that are signatories of the Declaration on Research Assessment (DORA), which recommends broadening the range of research outputs included in assessment exercises.

Methods – The researcher developed a list of 77 universities in the UK that are signatories to DORA and maintain institutional repositories. Using this list, the researcher consulted the public websites of these institutions using a structured protocol and collected data to 1) characterize the types of outputs collected by research repositories at DORA-signatory institutions and their ability to provide measures of potential impact, and 2) assess whether university library websites promote repositories as a venue for hosting non-traditional research outputs. Finally, the researcher surveyed repository managers to understand the nature of their involvement with supporting the aims of DORA on their campuses.

Results – The analysis found that almost all (96%) of the 77 repositories reviewed contained a variety of non-traditional research outputs, although the proportion of these outputs was small compared to traditional outputs. Of these 77 repositories, 82% featured usage metrics of some kind. However, most (67%) of these repositories were not minting persistent identifiers for items. Of the universities in this sample, 53% also maintained a standalone data repository. Of these data repositories, 90% featured persistent identifiers, and all of them featured metrics of some kind. In a review of university library websites promoting the use of repositories, 47% mentioned non-traditional research outputs. In response to survey questions, repository managers reported that the library and the unit responsible for the repository were involved in implementing DORA, and managers perceived DORA to be influential on their campuses.

Conclusion – Repositories in this sample are relatively well positioned to support the collection and promotion of non-traditional research outputs. Despite this positioning, and despite repository managers’ belief that realizing the goals of DORA is important, most libraries in this sample do not appear to be actively collecting non-traditional outputs, although they are active in promoting research assessment reform in other areas.

References

Alperin, J. P., Schimanski, L. A., La, M., Niles, M. T., & McKiernan, E. C. (2022). The value of data and other non-traditional scholarly outputs in academic review, promotion, and tenure in Canada and the United States. In A. L. Berez-Kroeker, B. J. McDonnell, E. Koller, & L. B. Collister (Eds.), The open handbook of linguistic data management (pp. 171–182). MIT Press. https://doi.org/10.7551/mitpress/12200.003.0017

Australian Research Council. (2019). Non-traditional research outputs (NTROs). In State of Australian university research 2018–19: ERA national report. https://dataportal.arc.gov.au/era/nationalreport/2018/pages/section1/non-traditional-research-outputs-ntros/

Bailey, C. W., Jr., Coombs, K., Emery, J., Mitchell, A., Morris, C., Simons, S., & Wright, R. (2006). Institutional repositories (SPEC Kit 292). Association of Research Libraries. https://publications.arl.org/Institutional-Repositories-SPEC-Kit-292/1

Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. https://doi.org/10.1016/j.joi.2014.09.005

Burpee, K. J., Glushko, B., Goddard, L., Kehoe, I., & Moore, P. (2015). Outside the four corners: Exploring non-traditional scholarly communication. Scholarly and Research Communication, 6(2), Article 0201224. https://doi.org/10.22230/src.2015v6n2a224

Caplar, N., Tacchella, S., & Birrer, S. (2017). Quantitative evaluation of gender bias in astronomical publications from citation counts. Nature Astronomy, 1(6), Article 0141. https://doi.org/10.1038/s41550-017-0141

Chawla, D. S. (2016a). Men cite themselves more than women do. Nature. https://doi.org/10.1038/nature.2016.20176

Chawla, D. S. (2016b). The unsung heroes of scientific software. Nature, 529(7584), 115–116. https://doi.org/10.1038/529115a

Curry, S., de Rijcke, S., Hatch, A., Pillay, D., van der Weijden, I., & Wilsdon, J. (2020). The changing role of funders in responsible research assessment: Progress, obstacles and the way ahead. Research on Research Institute. https://doi.org/10.6084/m9.figshare.13227914.v1

Curry, S., Gadd, E., & Wilsdon, J. (2022). Harnessing the Metric Tide: Indicators, infrastructures & priorities for UK responsible research assessment. Research on Research Institute. https://doi.org/10.6084/m9.figshare.21701624.v2

Dempsey, L. (2017). Library collections in the life of the user: Two directions. LIBER Quarterly: The Journal of the Association of European Research Libraries, 26(4), 338–359. https://doi.org/10.18352/lq.10170

Fulvio, J. M., Akinnola, I., & Postle, B. R. (2021). Gender (im)balance in citation practices in cognitive neuroscience. Journal of Cognitive Neuroscience, 33(1), 3–7. https://doi.org/10.1162/jocn_a_01643

Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA, 295(1), 90–93. https://doi.org/10.1001/jama.295.1.90

Haak, L. L., Meadows, A., & Brown, J. (2018). Using ORCID, DOI, and other open identifiers in research evaluation. Frontiers in Research Metrics and Analytics, 3, Article 28. https://doi.org/10.3389/frma.2018.00028

Haddaway, N. R., Collins, A. M., Coughlin, D., & Kirk, S. (2015). The role of Google Scholar in evidence reviews and its applicability to grey literature searching. PLoS ONE, 10(9), Article e0138237. https://doi.org/10.1371/journal.pone.0138237

Hurrell, C. (2022). Role of institutional repositories in supporting DORA. Open Science Framework. https://osf.io/5kjna/

Inefuku, H. W., & Roh, C. (2016). Agents of diversity and social justice: Librarians and scholarly communication. In K. L. Smith & K. A. Dickson (Eds.), Open access and the future of scholarly communication: Policy and infrastructure (pp. 107–127). Rowman & Littlefield.

Kingsley, D. (2020). The ‘impact opportunity’ for academic libraries through grey literature. The Serials Librarian, 79(3–4), 281–289. https://doi.org/10.1080/0361526X.2020.1847744

Kroth, P. J., Phillips, H. E., & Hannigan, G. G. (2010). Institutional repository access patterns of nontraditionally published academic content: What types of content are accessed the most? Journal of Electronic Resources in Medical Libraries, 7(3), 189–195. https://doi.org/10.1080/15424065.2010.505515

Lawrence, A., Houghton, J., Thomas, J., & Weldon, P. (2014). Where is the evidence? Realising the value of grey literature for public policy and practice: A discussion paper. Swinburne Institute for Social Research. https://doi.org/10.4225/50/5580b1e02daf9

Library Publishing Coalition Research Committee. (2020). Library publishing research agenda. Educopia Institute. https://doi.org/10.5703/1288284317124

Lynch, C. A. (2003). Institutional repositories: Essential infrastructure for scholarship in the digital age. portal: Libraries and the Academy, 3(2), 327–336. https://doi.org/10.1353/pla.2003.0039

Lynch, C. A., & Lippincott, J. K. (2005). Institutional repository deployment in the United States as of early 2005. D-Lib Magazine, 11(9). https://doi.org/10.1045/september2005-lynch

Macgregor, G., Lancho-Barrantes, B. S., & Pennington, D. R. (2023). Measuring the concept of PID literacy: User perceptions and understanding of PIDs in support of open scholarly infrastructure. Open Information Science, 7(1), Article 20220142. https://doi.org/10.1515/opis-2022-0142

Makula, A. (2019). “Institutional” repositories, redefined: Reflecting institutional commitments to community engagement. Against the Grain, 31(5), Article 17. https://doi.org/10.7771/2380-176X.8431

Marsolek, W. R., Cooper, K., Farrell, S. L., & Kelly, J. A. (2018). The types, frequencies, and findability of disciplinary grey literature within prominent subject databases and academic institutional repositories. Journal of Librarianship and Scholarly Communication, 6(1), Article eP2200. https://doi.org/10.7710/2162-3309.2200

Mason, S., Merga, M. K., González Canché, M. S., & Mat Roni, S. (2021). The internationality of published higher education scholarship: How do the ‘top’ journals compare? Journal of Informetrics, 15(2), Article 101155. https://doi.org/10.1016/j.joi.2021.101155

McDowell, C. S. (2007). Evaluating institutional repository deployment in American academe since early 2005: Repositories by the numbers, Part 2. D-Lib Magazine, 13(9/10). https://doi.org/10.1045/september2007-mcdowell

Moore, E. A., Collins, V. M., & Johnston, L. R. (2020). Institutional repositories for public engagement: Creating a common good model for an engaged campus. Journal of Library Outreach and Engagement, 1(1), 116–129. https://doi.org/10.21900/j.jloe.v1i1.472

Nicholson, J., & Howard, K. (2018). A study of core competencies for supporting roles in engagement and impact assessment in Australia. Journal of the Australian Library and Information Association, 67(2), 131–146. https://doi.org/10.1080/24750158.2018.1473907

Open Research Funders Group. (n.d.). Incentivizing the sharing of research outputs through research assessment: A funder implementation blueprint. https://www.orfg.org/s/ORFG_funder-incentives-blueprint-_final_with_templated_language.docx

Parsons, M. A., Duerr, R. E., & Jones, M. B. (2019). The history and future of data citation in practice. Data Science Journal, 18, Article 52. https://doi.org/10.5334/dsj-2019-052

Piwowar, H. A., Day, R. S., & Fridsma, D. B. (2007). Sharing detailed research data is associated with increased citation rate. PLoS ONE, 2(3), Article e308. https://doi.org/10.1371/journal.pone.0000308

Price, R., & Murtagh, J. (2020). An institutional repository publishing model for Imperial College London grey literature. The Serials Librarian, 79(3–4), 349–358. https://doi.org/10.1080/0361526X.2020.1847737

Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. http://altmetrics.org/manifesto/

Research England, Scottish Funding Council, Higher Education Funding Council for Wales, & Department for the Economy, Northern Ireland. (2023). Research Excellence Framework 2028: Initial decisions and issues for further consultation (REF 2028/23/01). https://repository.jisc.ac.uk/9148/1/research-excellence-framework-2028-initial-decisions-report.pdf

Rieh, S. Y., Markey, K., St. Jean, B., Yakel, E., & Kim, J. (2007). Census of institutional repositories in the U.S.: A comparison across institutions at different stages of IR development. D-Lib Magazine, 13(11/12). https://doi.org/10.1045/november2007-rieh

Salo, D. (2008). Innkeeper at the roach motel. Library Trends, 57(2), 98–123. https://doi.org/10.1353/lib.0.0031

San Francisco Declaration on Research Assessment. (2012). Declaration on Research Assessment. https://sfdora.org/read/

Savan, B., Flicker, S., Kolenda, B., & Mildenberger, M. (2009). How to facilitate (or discourage) community-based research: Recommendations based on a Canadian survey. Local Environment, 14(8), 783–796. https://doi.org/10.1080/13549830903102177

Signers. (n.d.). Declaration on Research Assessment. https://sfdora.org/signers/

Suber, P. (2012). Open access. MIT Press. https://doi.org/10.7551/mitpress/9286.001.0001

Sud, P., & Thelwall, M. (2014). Evaluating altmetrics. Scientometrics, 98(2), 1131–1143. https://doi.org/10.1007/s11192-013-1117-2

Sugimoto, C. R., & Larivière, V. (2017). Measuring research: What everyone needs to know. Oxford University Press. https://doi.org/10.1093/wentk/9780190640118.001.0001

Ten Holter, C. (2020). The repository, the researcher, and the REF: “It’s just compliance, compliance, compliance.” The Journal of Academic Librarianship, 46(1), Article 102079. https://doi.org/10.1016/j.acalib.2019.102079

Van Noorden, R. (2013). Data-sharing: Everything on display. Nature, 500(7461), 243–245. https://doi.org/10.1038/nj7461-243a

van Westrienen, G., & Lynch, C. A. (2005). Academic institutional repositories: Deployment status in 13 nations as of mid 2005. D-Lib Magazine, 11(9). https://doi.org/10.1045/september2005-westrienen

Vandewalle, P. (2012). Code sharing is associated with research impact in image processing. Computing in Science & Engineering, 14(4), 42–47. https://doi.org/10.1109/MCSE.2012.63

Wellcome Trust. (n.d.). Guidance for research organisations on how to implement responsible and fair approaches for research assessment. https://wellcome.org/grant-funding/guidance/open-access-guidance/research-organisations-how-implement-responsible-and-fair-approaches-research

Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., & Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/RG.2.1.4929.1363

Published

2023-12-15

How to Cite

Hurrell, C. (2023). Research Assessment Reform, Non-Traditional Research Outputs, and Digital Repositories: An Analysis of the Declaration on Research Assessment (DORA) Signatories in the United Kingdom. Evidence Based Library and Information Practice, 18(4), 2–20. https://doi.org/10.18438/eblip30407

Issue

Section

Research Articles