| Literature DB >> 35548330 |
Nigussie Gemechu, Meghan Werbick, Michelle Yang, Adnan A Hyder.
Abstract
Research is a critical component of the public health enterprise, and a key component of universities and schools of public health and medicine. To satisfy varying levels of stakeholders in the field of public health research, accurately measuring the return on investment (ROI) is important; unfortunately, there is no approach or set of defined metrics that are universally accepted for such assessment. We propose a research metrics framework to address this gap in higher education. After a selected review of existing frameworks, we identified seven elements of the generic research lifecycle (five internal to an institution and two external). A systems approach was then used to broadly define four parts of each element: inputs, processes, outputs, and outcomes (or impacts). Inputs include variables necessary to execute research activities such as human capital and finances. Processes are the pathways of measurement to track research performance through all phases of a study. Outputs entail immediate products from research; and outcomes/impacts demonstrate the contribution research makes within and beyond an institution. This framework enables the tracking and measurement of research investments to outcomes. We acknowledge some of the challenges in applying this framework including the lack of standardization in research metrics, disagreement on defining impact among stakeholders, and limitations in resources for implementing the framework and collecting relevant data. However, we suggest that this proposed framework is a systematic way to raise awareness about the role of research and standardize the measurement of ROI across health science schools and universities.Entities:
Keywords: health research; public health; research impact; research measurement; research metrics; research outcomes
Year: 2022 PMID: 35548330 PMCID: PMC9082743 DOI: 10.3389/frma.2022.817821
Source DB: PubMed Journal: Front Res Metr Anal ISSN: 2504-0537
Descriptions of selected research assessment frameworks.
| Framework | Description | Scope of measurement | Methods | Strengths | Limitations |
|---|---|---|---|---|---|
| Canadian Academy of Health Sciences (CAHS)—Formerly Payback Framework | A framework developed using a logic model for health research translation, and drawing on a library of indicators. It aims to provide consistency and comparability between institutions in a research system with multiple regional funders. Provides a framework for consistent data gathering and presentation across a series of case studies | Five categories: advancing knowledge; capacity building; informing policies and product development; health and health sector benefits; broader economic benefits; categories cover a range of perspectives important to both researchers and various types of users | Review of documents/archives, surveys, analysis of publications, interviews, bibliometrics, scoring | Tailored to Canadian context; very comprehensive; flexible—applies to a range of types of funding and different types of research; developed through engagement; formative; looks at process, outputs and impacts; aligned with main funders in Canada | Resource intensive to implement; complicated; developed by committee; ambiguity in definitions between outputs and outcomes; imposes a burden; no ranking; approach plays down difficulties of attribution to specific studies |
| Excellence in Research for Australia (ERA) | A framework used in Australia to measure the performance and quality of research, currently for accountability and advocacy purposes | Assesses quality, volume, application of research (impact), and measures of esteem for all Australian universities at disciplinary level; does not capture societal or environmental impacts comprehensively | Bibliometrics, peer review | Compliance from the research community; burden on participants is moderate; Data accessible (engagement indicator driven); Produces metrics used for ranking; Recognizes multidisciplinary work | Indicator driven to capture engagement only; Use of peer review limits objectivity; Limited to Australian use; Less availability of indicators; Requires some central expertise (e.g., bibliometric expertise on panel) |
| Faster Cures Biomedical Ecosystem Metrics Project | Promotes a high-performing, patient-centered biomedical system and develops metrics to address the efficiency and effectiveness of its processes | Process efficiency and effectiveness, productivity, and transparency | Early stage of development | Developed by diverse stakeholders | Under development; Focuses on biomedical innovations; Patient-centered |
| National Institute for Health Research (NIHR) Dashboard | A framework that consists of a dashboard to monitor the performance of research funded by the National Institute for Health Research in the UK, drawing on a logic model and a balanced scorecard approach. It accumulates data from a series of dashboards at lower levels of aggregation and is intended to be used for strategic decision making and analysis | Data collected quarterly at project level on inputs, processes, outputs and outcomes for financial, internal process and user satisfaction | System-level dashboard; Peer review; Data mining | Aligned with institutional goals; Can be used for monitoring impact; Comparable within organization; Indicator set is balanced; Strong theoretical basis; Wide applicability across the organization; Focused and selective set of indicators | At early phase of implementation; Limited to few indicators; High central burden; Reliant on information management systems; Not a comprehensive assessment |
| Productive Interactions (Europe) | A framework developed across several countries in Europe and for multiple disciplines. It is a flexible approach which aims to help institutions learn and improve their performance against their own goals. Measures productive interactions with stakeholders that lead to change | Intended to work in a wide range of contexts, best applied at research group or department level where goals are consistent | Interviews, document review, data mining | Tailored to assess performance improvements; Formative; Sensitive to organizational goals; Comprehensive; Flexible; Some tools and “how to” guides; Avoids time lag interactions to impact thus reducing bias against early career researchers; Multi-disciplinary; Broad scope suitable for a wide range of contexts | Does not produce comparison between institutions; High burden on participants; Challenging to implement; Requires assessors to identify productive interactions; Assumes interactions are a good indicator of impact |
| Research Excellence Framework (REF) | A framework developed to assess the performance of universities in the UK and to determine funding allocation, taking into account wider nonacademic impacts of research | Assessment at subject level on three elements: quality of research outputs, impact of research (not academic) and vitality of environment | Bibliometrics, peer review, survey, and case studies | Suitable for similar cross institutional assessment of performance; comprehensive (includes societal impact); multi-method and multidisciplinary; successfully piloted and implemented; produces a single performance indicator which can be used for ranking; acceptable to UK academic community | High burden and expensive; can discriminate against some researchers and institutions; summative; scalability not demonstrated; not transparent; almost solely reliant on peer review—limits objectivity |
| Snowball Metrics | A set of metrics for effective and long-term institutional research information measurement. Research metrics developed by research-intensive universities to set standards | Benchmarking | Peer/Expert review, Balanced Scorecard | Triangulate information from different data sources; Helps to understand institutional strengths and weaknesses; Free of charge | Covers only universities |
| Star Metrics | Its goal is to document the outcomes of science investments to the public by developing an open, automated data infrastructure and tools that will enable the documentation and analysis of a subset of the inputs, outputs, and outcomes resulting from the federal investments in science, largely to assess the performance of research and researchers for accountability purposes | Job creation; range of research funded researcher interactions and wider impacts | Data mining—collects jobs data; university administrative databases | Data mining approach is relatively novel; Minimizes burden and maximizes accountability; low participant burden once set up; not a ranking approach; does not produce a single indicator of comparative performance | Not fully developed; Not comprehensive; Summative (at present); not a ranking approach |
Sources: (1) Measuring research: A guide to research evaluation frameworks and tools.
Figure 1. Proposed research metrics conceptual framework.
Example research metrics for a school of health science (e.g., medicine, public health, and biomedical science).
| Metric | Strategic objective | Data source | Data elements |
|---|---|---|---|
| Research proposal application success or “hit” rate | Increasing the ratio of awards to proposal submissions | Internal Database | Number of proposals submitted, Number of proposals awarded, Fiscal Year (FY) of Award Date, and Department |
| Research Proposal Development Time (RPDT) from research proposal initiation (expression of “intent to submit”) to proposal submission to the sponsor | Increasing the administrative support to researchers to enable faculty to carry out larger and more complex research efforts, including international research | Internal Database | PI, FY (Submission Date), Project Start Date, Department, Funding Agency, and Title Category (research area) |
| Number and dollar amount of research grants (awards) received | Increasing the ratio of awards to proposal submissions; Encouraging average individual faculty and staff research productivity as a whole, as measured by: extramural direct research support, F&A support, and average proportion of faculty time devoted to research | Internal Database | PI, FY, Department, Sponsor, Title Category, Award Date, Total Dollar Amount of Funding, Type of Application (New, Continuation, Renewal, Resubmission), Sponsor Type (Federal, State, etc.), and Foreign National Involved (Yes/No) |
| Number and dollar amount of research projects engaging community partners | Building partnerships as measured by research in collaboration with other disciplines and/or academic institutions; community engagement; and broadening support by government agencies, foundations, industry and other funders; Increasing the capacity to carry out research in partnership with communities both within the DC area and globally | Analysis of Research Administrative Documents, PI Survey | PI, Project Start and End Date |
| Number of outlets for funding opportunity announcements, including availability of automatic notification, and their respective views (infrastructure) | Fostering the integration of methodologic expertise to support university research | Internal Database | Outlet Name (type), Outlet Description, Count of Views, and Automatic Notification Available (Y/N) |
| Number and type of honors/prizes received | Recognition of faculty, staff and students by pre-eminent science and professional societies and other bodies | Internal Database, PI Survey | Faculty/PI Name, Honor/Prize Type, Honor/Prize Date, and Honor/Prize By |
| Number of citations | Measuring the scientific impact of research by numbers of peer reviewed publications; impact factors for peer reviewed journals; contribution to significant scientific, health administration and policy innovations nationally and globally; and citations in scientific journals | Funders/Donors, Bibliometric Analysis, PI Survey, Review of Key Policy Documents | Publication Type (article, book, etc.), Author, Title (research area), Institution, Collaborators (if any), Publication Date, Citation Date, Cited In (Policy Documents, Clinical Guidelines, etc.), etc. |
| Amount of direct employment and local spending | Increasing the return on investment from research support strategies: faculty start-ups, and protected research time; pilot funds; staff training; cost-sharing; and investments | Labor Market Analysis, STAR Metrics Data, Expenditure accounts (local spending data) | Employee ID, Position Title (Role), Local/Outside, $Amount (income, etc.), Item Name, and $Amount of spending |
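Several of the metrics in the table above are directly computable from an internal grants database. As an illustration only, and assuming a hypothetical export format (the record layout and field names such as `intent_date` are not prescribed by the framework), the hit rate and RPDT metrics might be derived as:

```python
from datetime import date

# Hypothetical proposal records exported from an internal database.
# Field names ("intent_date", "submitted", "awarded") are illustrative only.
proposals = [
    {"pi": "A", "intent_date": date(2021, 1, 5),
     "submitted": date(2021, 3, 1), "awarded": True},
    {"pi": "B", "intent_date": date(2021, 2, 10),
     "submitted": date(2021, 4, 20), "awarded": False},
    {"pi": "C", "intent_date": date(2021, 5, 1),
     "submitted": date(2021, 6, 15), "awarded": True},
]

# Research proposal application success ("hit") rate:
# the ratio of awards to proposal submissions.
hit_rate = sum(p["awarded"] for p in proposals) / len(proposals)

# Research Proposal Development Time (RPDT): days from the expression of
# "intent to submit" to submission to the sponsor, averaged across proposals.
rpdt_days = [(p["submitted"] - p["intent_date"]).days for p in proposals]
avg_rpdt = sum(rpdt_days) / len(rpdt_days)

print(f"Hit rate: {hit_rate:.0%}")          # 2 of 3 proposals awarded
print(f"Average RPDT: {avg_rpdt:.0f} days")
```

In practice these values would be grouped by the data elements listed in the table (fiscal year, department, sponsor) rather than computed over the whole portfolio at once.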
Research metrics portfolio (sample excerpt).
| Metric | Systems category | Research lifecycle element | Description |
|---|---|---|---|
| Research proposal application success or “hit” rate | Output | Proposal Development and Submission | This metric captures the proportion of grant applications submitted to sponsors that resulted in awards |
| Research Proposal Development Time (RPDT) from research proposal initiation (expression of “intent to submit”) to proposal submission to the sponsor | Process | Proposal Development and Submission | This metric measures the time from the Principal Investigator's (PI's) expression of intent to submit a research proposal to the proposal submission date (defining the research start date) |
| Number and dollar amount of research grants (awards) received | Output | Award Setup and Management | The number and total dollar amount of all research funding awards made to the school |
| Number and dollar amount of research projects engaging community partners | Impact | Collaboration and Networking | This metric captures absolute number and/or proportion of research projects that engage community organizations |
| Number of outlets for funding opportunity announcements, including availability of automatic notification, and their respective views (infrastructure) | Input | Capacity Building/Strengthening | The number of outlets announcing research funding opportunities, whether automatic notification is available, and the number of views each outlet receives |
| Number and type of honors/prizes received | Impact | Prestige/Recognition | The number and type of prizes and professional recognitions received |
| Number of citations | Impact | Knowledge Generation, Innovation and Informing Policy and Decision-Making | Number of citations of publications (on articles, policy documents, public health guidelines, books, conference proceedings, etc.) |
| Amount of direct employment and local spending | Impact | Broader Health, Economic, Social, and Environmental Impacts | By creating employment for both researchers and others, research activities can help reduce unemployment. In turn, newly created and filled jobs stimulate the local economy through the spending of those who fill the jobs. Local spending includes money spent on local services such as technical support, catering, and products |
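The portfolio rows above pair each metric with one of the four systems categories and a lifecycle element. A minimal data model for such a portfolio, sketched here with illustrative names not prescribed by the framework, could look like:

```python
from dataclasses import dataclass
from enum import Enum

class SystemsCategory(Enum):
    """The four parts of each lifecycle element in the proposed framework."""
    INPUT = "Input"
    PROCESS = "Process"
    OUTPUT = "Output"
    IMPACT = "Outcome/Impact"

@dataclass
class PortfolioMetric:
    """One row of the research metrics portfolio (fields are illustrative)."""
    name: str
    category: SystemsCategory
    lifecycle_element: str
    description: str

portfolio = [
    PortfolioMetric(
        name='Research proposal application success or "hit" rate',
        category=SystemsCategory.OUTPUT,
        lifecycle_element="Proposal Development and Submission",
        description="Proportion of submitted applications that resulted in awards",
    ),
    PortfolioMetric(
        name="Number of citations",
        category=SystemsCategory.IMPACT,
        lifecycle_element="Knowledge Generation, Innovation and Informing "
                          "Policy and Decision-Making",
        description="Citations of publications in journals, policy documents, etc.",
    ),
]

# Group metric names by systems category for institutional reporting.
by_category: dict[SystemsCategory, list[str]] = {}
for m in portfolio:
    by_category.setdefault(m.category, []).append(m.name)
```

Keeping the category as an enumeration makes it straightforward to roll a full portfolio up into the input/process/output/impact view the framework describes.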
Examples of research measurement approaches.
| Approach | Description |
|---|---|
| Balanced Scorecard | Mostly used for quantitative performance measurement. Provides the capability to maintain big-picture long-term organizational success by integrating performances across domains and research lifecycles. It also helps to align research metrics with strategic objectives |
| Bibliometric Analysis | A range of techniques that use quantitative analysis to assess the quantity, dissemination, and content of publications and patents, measuring patterns of publications and citations. Bibliometric analysis is one of the important tools and processes used to measure research outputs such as publications and citations. It uses one or a combination of publication and citation tracking databases such as Scopus, Web of Science, PubMed, and Google Scholar to generate measures. Understanding the various types of bibliometric measures and their limitations helps to identify the appropriate ones. Bibliometrics are most useful when employed in conjunction with other measures to assess categorical or non-comparative research outputs and impact |
| Case Studies | Can be used in a variety of ways; flexible enough to capture a wide variety of impacts, including the unexpected, and can provide the full context around a piece of research, researcher, or impact |
| Data Mining | Allows access to and understanding of existing data sets; uses algorithms to find correlations and patterns and present them in a meaningful format, reducing complexity without losing information |
| Institutional databases and systems | Standalone or integrated Internal database systems or applications for tracking, collecting, analyzing, and reporting research data |
| Interviews | Used to obtain supplemental information on areas of interest, generally to access personal perspectives on a topic, or more detailed contextual information. The participants may include PIs, staff, students, alumni, etc. |
| Labor Market/Economic Analysis | Provides labor and economic data to measure socio-economic returns of research |
| Peer Review | Review by peers, typically other academics in the same or a similar field, of outputs of research; rationale that subject experts are uniquely qualified to assess the quality of the work of others |
| Review of documents | Review of existing internal/external administrative or technical documents, guidelines, reports, or archives |
| Surveys | Provide a broad overview of the current status of a particular program or body of research; widely used in research evaluation to provide comparable data across a range of researchers and/or grants which are easy to analyze. The participants may include PI, staff, alumni, etc. |