Literature DB >> 36128266

The Effective Trends and Driving Forces in The Future of Research Performance Evaluation: A Qualitative Study.

Nadia Sani'ee1, Leila Nemati-Anaraki2,1, Shahram Sedghi2,1, Abdolreza Noroozi Chakoli3, Salime Goharinezhad4.   

Abstract

Background: Performance is a multidimensional concept evaluated by different criteria. Defining and evaluating research performance are perennially controversial and may be affected by changing conditions. Therefore, this study aimed to identify the effective trends and driving forces in the future of research performance evaluation.
Methods: In this qualitative study, trend analysis was performed through a scoping review and interviews to identify the driving forces affecting the future of research performance evaluation. The scoping review was conducted according to the PRISMA-ScR guideline and a search of international databases. Interviews were conducted face-to-face, by telephone, and via social media. MAXQDA version 10 and thematic analysis were used to analyze the interviews and documents.
Results: In the scoping review step, a total of 6125 records were retrieved from international databases and search engines. After removing 869 duplicates, the titles and abstracts of 5256 records were screened. Finally, 42 records (41 English articles and 1 dissertation) were eligible for the study. In the interview step, 248 codes were assigned to nine main categories, 64 subcategories, and 47 dimensions. The trends included social (27 codes), technological (38 codes), economic (30 codes), environmental (5 codes), and political (44 codes) dimensions. The information acquired in the two steps was then synthesized, and the effective social, technological, economic, environmental, and political trends and driving forces were identified.
Conclusion: The results showed that various social, technological, economic, environmental, and political factors and indicators must be incorporated and normalized in national and international research performance evaluation systems.
© 2022 Iran University of Medical Sciences.

Entities:  

Keywords:  Interview; Qualitative Research; Research; Systematic Review; Trends

Year:  2022        PMID: 36128266      PMCID: PMC9448456          DOI: 10.47176/mjiri.36.55

Source DB:  PubMed          Journal:  Med J Islam Repub Iran        ISSN: 1016-1430


Performance is a multidimensional concept evaluated by different criteria. Research performance evaluation is always controversial. It is necessary to improve the old methods of research performance evaluation and adopt new metrics that account for various social, technological, economic, environmental, and political factors.

Introduction

Performance is a multidimensional concept and is evaluated by different criteria (1). Performance evaluation is always controversial, and defining and assessing research performance is no exception (2,3). The construct of research performance subdivides into two components: research activity and its outcome. The outcome of research activity becomes visible and is passed on to others. Research performance has accordingly been defined as the anticipated outcomes of researchers' work in the form of concrete products (e.g., publications), academic standing, personal understanding, and benefits to the community (4). Research performance evaluation plays a substantial role in scientific development, providing benchmarks for recruitment, promotion, funding, and rewards. Various bibliometric indicators have been successively proposed to make research evaluation scientific and reasonable (5). Many researchers have suggested that measures of research performance may include bibliometric measures, awards, academy memberships, research funding, activity measures, royalty income, mid-term impact measures, long-term measures, and other metrics of competitiveness. Research performance evaluation uses bibliometric indicators, both quantitative and qualitative, to measure the performance of a journal, researcher, or research group (6). Quantity may be captured by the number of publications and citations, while quality is reflected in a journal's impact factor (IF), immediacy index, h-index, and similar measures (7). Critics argue that bibliometric indicators do not reflect scientific quality and provide only useful supplementary tools for evaluating academic research (2,8-10); these indicators have both strengths and weaknesses and are not comprehensive, and many scholars strongly advocate non-bibliometric measures (6,11,12). Nevertheless, bibliometric indicators remain widely applied because they are easy to use and access (13), and they are defended by numerous scholars (2,7,14-16).
In addition to bibliometric indicators, other factors can be influential in research performance evaluation: progress in science and technology for sustainable social development, together with the allocation of human resources, infrastructure, and budget (17); a sufficient share of Gross Domestic Product (GDP) devoted to research (18); and international research collaboration networks (19). Moreover, the world is evolving: information and communication technology, economic resources, and environmental elements are constantly changing, and new challenges and trends are emerging. Yet the effective trends and drivers in the future of research performance evaluation have not been studied in detail. At the same time, analyzing the scientific performance of institutions, universities, and researchers has become an inevitable and essential priority (20). The results of bibliometric and scientometric analyses can be used for policy-making on research funding and promotion, and they affect the rankings of universities and institutions (7). In recent years, global social, technological, economic, environmental, and political changes have influenced countries in different respects. These changes can also influence the process of research and of research performance evaluation. Research performance evaluation is not confined to a specific community; all countries face it. Countries must identify these changes to manage research effectively and prevent the loss of resources. In other words, the process of research performance evaluation may change in the future under the influence of these trends and driving forces.
In this regard, developing countries such as Iran have in recent years faced many challenges arising from political and economic sanctions, including funding research, conducting scientific diplomacy, maintaining the presence of their researchers in leading universities and global scientific events, and publishing articles in prestigious international journals. Identifying these global trends can therefore help not only third-world countries such as Iran but also developed countries to devise desirable solutions, such as sustaining international scientific relations beyond economic and political sanctions. These forces have long had an indirect effect on research performance evaluation, and if political and economic sanctions continue, research managers must select suitable methods for evaluating research performance. Therefore, the current study seeks to answer this question: what are the effective trends and driving forces, in each of the social, technological, economic, environmental, and political dimensions, in the future of research performance evaluation?

Methods

In this qualitative study, trend analysis was performed through a scoping review and interviews to identify the driving forces affecting the future of research performance evaluation. The trends comprised social, technological, economic, environmental, and political factors, organized according to the STEEP framework.

Data collection through Scoping review

The scoping review was conducted according to the “Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews” (PRISMA-ScR) guideline (21). All related documents were searched in January 2020 in international databases, including Web of Science, Scopus, PubMed, Embase, ProQuest, Library Information Technology Association (LITA), Library, Information Science & Technology Abstracts (LISTA), Springer, and Institute of Electrical and Electronics Engineers (IEEE), along with the Google Scholar and Google search engines. Gray literature was identified through ProQuest, Google Scholar, and Google. Inclusion criteria for the scoping review were gray literature, review articles, original articles, reports, and working papers that investigated social, technological, economic, environmental, and political trends and driving forces in research performance evaluation; publication in English; and availability of the full text. Exclusion criteria were scientometric and bibliometric studies that did not emphasize social, technological, economic, environmental, and political trends and driving forces in research performance evaluation, as well as letters to the editor, editorials, commentaries, conference papers, and notes. The data collection tool for the scoping review was a data extraction form. The bibliographic details recorded for each document included the title, first author, publication year, place of study, research method, and main findings. The search strategies for the scoping review are presented in Appendix 1.
The search strategy for the Web of Science database was as follows: (TS=(“research performance”) OR TS=(“research performance assessment*”) OR TS=(“research performance evaluati*”) OR TS=(“research performance measurement*”) OR TS=(“research performance ranking*”) OR TS=(“research evaluati*”) OR TS=(“research assessment*”) OR TS=(“research measure*”) OR TS=(“research evaluation system*”) OR TS=(“research indicator*”) OR TS=(“research metric*”)) AND (TS=(scientometric*) OR TS=(bibliometric*) OR TS=(informetric*)) AND (TS=(“social trend*”) OR TS=(“economical trend*”) OR TS=(“political trend*”) OR TS=(“technological trend*”) OR TS=(“environment* trend*”) OR TS=(trend*) OR TS=(“driving force*”) OR TS=(determinant*) OR TS=(factor*)). The search strategy was confirmed by two members of the research team (N.S, SH.S). In addition, the references of related documents and of journals such as Scientometrics, Journal of Informetrics, Research Evaluation, and Higher Education were screened. The search results were then downloaded into EndNote X8. After deleting duplicate items, two researchers (N.S, A.N) screened the titles and abstracts of the documents against the inclusion and exclusion criteria, resolving conflicts through negotiation; otherwise, a third researcher (L.N) decided whether to include an article in the study. Quality assessment of the studies was not performed, consistent with the scoping review design. The full texts of the included articles were read, and the main findings related to the research questions were extracted (N.S).
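The three concept blocks of the strategy above (research performance terms, metric terms, and trend terms) are ORed internally and ANDed together. As an illustrative sketch only — the helper names `ts_block` and `build_query` are hypothetical and not part of the study — the same boolean structure could be assembled programmatically, which makes it easier to adapt the strategy to other databases:

```python
# Illustrative sketch (not from the article): assembling a Web of Science
# advanced-search string from the three concept blocks of the strategy.
# Term lists are copied from the published strategy; helper names are hypothetical.

performance_terms = [
    '"research performance"', '"research performance assessment*"',
    '"research performance evaluati*"', '"research performance measurement*"',
    '"research performance ranking*"', '"research evaluati*"',
    '"research assessment*"', '"research measure*"',
    '"research evaluation system*"', '"research indicator*"',
    '"research metric*"',
]
metric_terms = ["scientometric*", "bibliometric*", "informetric*"]
trend_terms = [
    '"social trend*"', '"economical trend*"', '"political trend*"',
    '"technological trend*"', '"environment* trend*"', "trend*",
    '"driving force*"', "determinant*", "factor*",
]

def ts_block(terms):
    """OR together TS= (topic) clauses for one concept block."""
    return "(" + " OR ".join(f"TS=({t})" for t in terms) + ")"

def build_query(*blocks):
    """AND the concept blocks into one advanced-search string."""
    return " AND ".join(ts_block(b) for b in blocks)

query = build_query(performance_terms, metric_terms, trend_terms)
print(query)
```

Other databases use different field tags (e.g., title/abstract fields in PubMed), so only the term lists would need to change, not the combining logic.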

Data collection through an interview

In the interview step, based on purposeful and heterogeneous sampling, 11 of 20 invited experts entered the study. Inclusion criteria were at least two years of work experience in library and information science, medical library and information science, scientometrics, or research performance evaluation; availability and responsiveness; and published scientific output in scientometrics or research performance evaluation, together with educational experience in this field. A mobile phone voice recorder (a voice recording application installed on the phone for telephone-based interviews) and an interview guideline were used. The interview guide for the semi-structured interviews was designed based on the literature review and the research objectives, and the research team revised it to remove its shortcomings. The guideline consisted of 19 questions in four sections: personal and work-experience information, existing challenges of research performance evaluation, trend analysis, and intellectual models. Six experts were interviewed face-to-face, two by telephone, and two via WhatsApp (conducted by N.S from January to March 2020). Interviews lasted from 13 to 51 minutes. After recording and listening to each interview, one of the researchers (S.G) transcribed it verbatim in Microsoft Word 2016. Interviewing continued until data saturation was reached.

Data analysis

MAXQDA version 10 and thematic analysis were used to analyze the interviews and documents. The social, technological, economic, environmental, and political trends related to research performance evaluation identified through the scoping review and the interviews were re-categorized based on semantic similarity and thematic overlap.

Ethical considerations

We received informed consent from the participants in the interview stage. Participants who did not wish to continue the interview at any stage were excluded from the study. The interviews were coded with the letter "M" and a number to maintain the confidentiality of the data. This study was conducted in compliance with the Iran University of Medical Sciences' Code of Ethics (IR.IUMS.REC.1398.229).

Results

Descriptive results of the scoping review

Figure 1 shows the document selection process for the scoping review, and the descriptive specifications of each document are reported in Table 1. A total of 6125 records were found by searching the international databases and search engines. After removing 869 duplicates, the titles and abstracts of 5256 records were screened; 5149 records were removed because of publication type or lack of relevance to research performance evaluation. Finally, 42 records (41 English articles and 1 dissertation) were eligible for the study. These records refer to one or more of the social, technological, economic, environmental, and political driving forces and trends affecting the future of research performance evaluation.
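The reported counts can be cross-checked arithmetically. A minimal sketch, using only the figures stated in the text (the variable names are ours, not the article's):

```python
# Cross-checking the PRISMA-ScR flow counts reported in the text.
identified = 6125            # records from databases and search engines
duplicates = 869             # removed before screening
screened = identified - duplicates          # titles/abstracts screened
excluded_at_screening = 5149                # wrong publication type or off-topic
full_text_assessed = screened - excluded_at_screening
included = 42                # 41 English articles + 1 dissertation

print(screened)              # should match the 5256 reported in the text
print(full_text_assessed)    # records remaining for full-text assessment
```

The subtraction implies 107 records went on to full-text assessment, of which 42 were ultimately included; the number excluded at the full-text stage is not stated explicitly in the text.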
Fig. 1
Table 1

Descriptive specification of selected studies for the scoping review

Title | First author | Country | Year | Journal
The effects of changes in the funding structure of the Flemish universities on their research capacity, productivity, and impact during the 1980's and early 1990's | H. F. Moed | Netherlands | 1998 | Scientometrics
Should the research performance of scientists be distinguished by gender? | G. Abramo | Italy | 2015 | Journal of Informetrics
National-scale research performance assessment at the individual level | G. Abramo | Italy | 2011 | Scientometrics
Accounting for gender research performance differences in ranking universities | G. Abramo | Italy | 2015 | Current Science
University-industry collaboration in Italy: A bibliometric examination | G. Abramo | Italy | 2009 | Technovation
The relationship between scientists' research performance and the degree of internationalization of their research | G. Abramo | Italy | 2011 | Scientometrics
Gender gaps in international research collaboration: A bibliometric approach | D. W. Aksnes | Norway | 2019 | Scientometrics
An investigation of the impact of research collaboration on academic performance in Italy | L. Aldieri | Italy | 2019 | Quality & Quantity
Institutional repositories as complementary tools to evaluate the quantity and quality of research outputs | A. Bonilla-Calero | Spain | 2014 | Library Review
The efficacy of different modes of funding research: Perspectives from Australian data on the biological sciences | P. Bourke | Australia | 1999 | Research Policy
The role of gender in the employment, career perception and research performance of recent PhD graduates from Dutch universities | C. J. F. Waaijer | Netherlands | 2016 | PLoS One
The economics of post-doc publishing | W. W. L. Cheung | Canada | 2008 | Ethics in Science and Environmental Politics
Investigating the interplay between fundamentals of national research systems: Performance, investments and international collaborations | G. Cimini | Italy | 2016 | Journal of Informetrics
Scientific systems in Latin America: Performance, networks, and collaborations with industry | H. Confraria | Netherlands | 2019 | The Journal of Technology Transfer
How to assess quality of research in Iran, from input to impact? Introduction of peer-based research evaluation model in Iran | A. Ebadifar | Iran | 2017 | Archives of Iranian Medicine
How to interpret the position of private sector institutions in bibliometric rankings of research institutions | F. de Moya-Anegon | Spain | 2014 | Scientometrics
Factors influencing university research performance | F. Edgar | New Zealand | 2013 | Studies in Higher Education
Research fund evaluation based on academic publication output analysis: The case of Chinese research fund evaluation | G. Ji-ping | China | 2019 | Scientometrics
University research evaluation and funding: An international comparison | A. Geuna | Netherlands | 2003 | Minerva
Industry funding and university professors' research performance | M. Gulbrandsen | Norway | 2005 | Research Policy
Measuring changes in publication patterns in a context of performance-based research funding systems: The case of educational research in the University of Gothenburg (2005–2014) | L. Sīle | Sweden | 2019 | Scientometrics
The effect of market-based policies on academic research performance: Evidence from Australia 1992-2004 | M. Soo | United States | 2008 | Chapel Hill (dissertation)
Gender differences in publication output: Towards an unbiased metric of research performance | M. R. E. Symonds | Australia | 2006 | PLoS ONE
The effect of gender on research staff success in life sciences in the Spanish National Research Council | E. Mauleon | Spain | 2008 | Research Evaluation
How does research productivity relate to gender? Analyzing gender differences for multiple publication dimensions | S. J. Mayer | Germany | 2018 | Scientometrics
Assessment and support of emerging research groups | H. F. Moed | Italy | 2018 | FEMS Microbiology Letters
The effects of changes in the funding structure of the Flemish universities on their research capacity, productivity and impact during the 1980's and early 1990's | H. F. Moed | Netherlands | 1998 | Scientometrics
Effects of seniority, gender and geography on the bibliometric output and collaboration networks of European Research Council (ERC) grant recipients | D. G. Pina | Spain | 2019 | PLOS ONE
The determinants of research performance: A study of Australian university economists | G. Harris | Australia | 1994 | Higher Education
Assessing public-private research collaboration: Is it possible to compare university performance? | G. Abramo | Italy | 2010 | Scientometrics
Brain circulation, diaspora and scientific progress: A study of the international migration of Chinese scientists, 1998–2006 | Tian Fangmeng | China | 2016 | Asian and Pacific Migration Journal
The effects of collaboration on research performance of universities: An analysis by federal district and scientific fields in Russia | L. Aldieri | Italy | 2019 | Journal of the Knowledge Economy
International collaboration, mobility and team diversity in the life sciences: Impact on research performance | F. Barjak | Switzerland | 2008 | Social Geography
Institutionalizing the triple helix: Research funding and norms in the academic system | M. Benner | Sweden | 2000 | Research Policy
Does the aging of tenured academic staff affect the research performance of universities? | S. Kyvik | Norway | 2008 | Scientometrics
Sex differences in research funding, productivity and impact: An analysis of Quebec university professors | V. Lariviere | Canada | 2011 | Scientometrics
The impact of research collaboration on scientific productivity | S. Lee | USA | 2005 | Social Studies of Science
Gender inequality and research performance: Moving beyond individual-meritocratic explanations of academic advancement | M. W. Nielsen | Denmark | 2015 | Studies in Higher Education
Measuring funded research performance for multidisciplinary research in the Danube Basin | M. Sidoroff | Romania | 2016 | Journal of Environmental Protection and Ecology
Is the commercialization of scientific research affecting the production of public knowledge? Global trends in the output of corporate research articles | R. J. W. Tijssen | Netherlands | 2004 | Research Policy
Gender differences in research performance and its impact on careers: A longitudinal case study | P. van den Besselaar | Netherlands | 2016 | Scientometrics
Factors influencing research performance of university academic staff | F. Wood | Australia | 1990 | Higher Education
Fig. 1. PRISMA diagram of the search and selection process

Descriptive results of the interview

In the interview analysis, the trends and driving forces were determined through initial coding and the merging of similar codes, and unrelated codes were dropped. Finally, 248 codes were assigned to nine main categories, 64 subcategories, and 47 dimensions. The trends included social (27 codes), technological (38 codes), economic (30 codes), environmental (5 codes), and political (44 codes) factors (Tables 2 and 3).
Table 2

Codes of the effective trends and driving forces in the future of research performance evaluation

Main category | Sub-category | Dimension | Number of codes
Social trends and driving forces | Research social impact | Lack of social impact culture in organizations | 27
 | | Increasing emphasis on the research social impact |
 | | Increased society demand-based research |
 | The social development of a community | - |
 | Increasing researchers' awareness of the research evaluation importance | - |
 | Lack of research culture in society | - |
 | The gender gap in society | Global, social, organizational, and personal factors |
 | Human resources of universities | Personal factors |
 | | Organizational factors |
Technological trends and driving forces | Information and communication technology | Insufficient technology in a university | 38
 | | Increased use of big data |
 | | Increased documentation |
 | | Lack of a comprehensive research performance evaluation system |
 | | Creating national scientific social networks |
 | | Increased presence on international social networks |
 | | Understanding the value of social media metrics for research evaluation |
 | | Increased use of data mining |
 | | Increased use of artificial intelligence |
 | | Development of information technology |
 | Scientometric indicators | Using the problem-oriented metrics |
 | | Normalization of scientometric indicators |
 | | Lack of proper use of evaluation indicators |
 | | Increased use of altmetrics in research evaluation |
 | | Creating new scientometric indicators |
 | | Increasing the use of technology-oriented indicators |
 | Open science | Information filtering |
Economic trends and driving forces | No dependence on a natural resource-based economy | The economic development of a society | 30
 | | Increased collaboration between university and industry |
 | Research grant | Reduction of non-governmental investment for research |
 | | Reduction of international research grants |
 | An economic evaluation of research assessment | An economic analysis of research performance evaluation |
 | | An economic evaluation of research impact |
 | Research budget | Waste of research funding |
 | | Lack of research budget |
 | | Investment in all fields of science |
 | | Funding based on research performance |
 | | Funding based on research priority |
Ecological trends and driving forces | Increased emphasis on green information | - | 5
 | Using the green environmental components in research institutes | - |
Political trends and driving forces | Scientific diplomacy | Scientific complexity and competition | 44
 | | Research networking and variety |
 | Domestic policy of a country | - |
 | War and political sanctions of a country | - |
 | Research performance evaluation system | Balanced inclusion of different dimensions in research evaluation |
 | | Localization of research performance evaluations |
 | | Parallel work in research performance evaluation |
 | | Increased emphasis of evaluations on the efficiency and effectiveness of research |
 | | Importance of the macro research policy in a country |
 | | Research prioritization |
 | | Research equality |
Table 3

Effective trends and driving forces in the future of research performance evaluation

Main category | Sub-category | Dimension | Evidence from the interviews and the scoping review
Social trends and driving forces | The social development of a community | - | "The progressive society is advanced and has met the basic needs of its people. Humans and their values are important. So, people are looking for research." M11
 | Increasing researchers' awareness of the research evaluation importance | - | "Very few persons were familiar with these indicators, but now I see that the level of awareness of research performance and research indicators has grown very well and very significantly." M11
 | Lack of research culture in society | - | "The problem is that whatever we produce, whatever our measure is, whatever our research is, if society doesn't want it, neither proper research nor proper evaluation is produced." M1
 | The gender gap in society | Global, social, organizational, and personal factors | Gender differences in research productivity decrease over time; controlling for personal and organizational factors reduces the impact of gender on research performance (22).
 | Human resources of universities | Personal factors | Older staff publish fewer articles; the increase in doctoral and postdoctoral students compensates for the aging of staff (23).
 | | Organizational factors | Factors such as changes in the staff employment process, educational tasks, the relationship between education and research, and research management programs affect research performance (24).
 | Research social impact | Increasing emphasis on the research social impact | "Our research should be an applied one, and its consequences are seen in the community. Perhaps another effective trend is research application in education, problem-solving..." M7
 | | Increased society demand-based research | "It should be noted that we must see the needs of society because our trends have changed. Today, for example, there is Covid-19; it is not related only to the experts, the community and social networks are all talking about it." M8
 | | Lack of social impact culture in organizations | "My purpose may not be to present my research everywhere so that it has a social impact, but the necessary context or culture has not yet been created to translate knowledge, because my institution's policy is not knowledge translation." M2
Technological trends and driving forces | Information and communication technology | Development of information technology | Creating a decision support system based on the research performed within an organization helps in allocating the research budget and in strategic planning, and provides correct rankings at the level of individuals, research groups, and educational groups (25).
 | | Insufficient technology in a university | "Technology depends on our economy. Sometimes, we have good ideas, but we don't have a suitable technological infrastructure." M2
 | | Increased documentation | "In my opinion, documentation and the control of documents will be done more, and we will see them day by day..." M6
 | | Lack of a comprehensive research performance evaluation system | "We now don't have a system from which we can take data, for example, for comparing the universities in a specific field." M6
 | | Increased use of big data | "In the future, in my opinion, these tools that are related to data analysis, mega-trends, big data, etc., will be developed. Now, our goal is that research evaluation should be done based on data mining and big data." M4
 | | Creating national scientific social networks | "Let's move on to the application of science and use new software in new electronic services. Well, it helps to measure one dimension so that we don't just measure the global impact alone; measure the local impact as well." M7
 | | Increased presence on international social networks | "You should be able to find him on several social networks. Because it is not possible, for example, that a person is a reviewer of international articles but is not a member of Publons..." M6
 | | Understanding the value of social media metrics for research evaluation | "In my opinion, the altmetric indicators, which are now extracted somewhat in Scopus; but beyond that, they will be extracted..." M6
 | | Increased use of data mining | "The evaluation systems seem to be becoming more professional in data analysis, you know, analytical data, in fact more advanced results, which may be extracted by data mining or machine learning." M5
 | | Increased use of artificial intelligence | "In the future, I think it will go toward artificial intelligence. For example, statistical analysis can be done using a computer and artificial intelligence. Tools related to science mapping and information illustration are getting better." M4
 | Scientometric indicators | Using the problem-oriented metrics | "It is essential to note that social trends are so important. The indicators that exist in this area should be extracted and used anyway." M8
 | | Normalization of scientometric indicators | "The indicators need to be normalized. If that happens, I think it's a good thing." M9
 | | Lack of proper use of evaluation indicators | Too much emphasis on quantitative indicators, such as the number of scientific productions and citations, can affect the publishing strategy of younger researchers (26,27).
 | | Creating new scientometric indicators | The digitalization of scientific communication has led to the emergence of new research performance indicators such as altmetrics, webometrics, scientific mapping, and author network analysis (28).
 | | Increasing the use of technology-oriented indicators | "In addition to the articles that are currently receiving a lot of attention, we should also evaluate and review other types of research studies and the growth and development of countries. For example, in the field of patents, I can point out that this issue has been discussed for a long time, but it has not yet reached its deserved position." M11
 | | Increased use of altmetrics in research evaluation | "Another social factor that we would like to consider is social networks, which have recently been discussed in altmetrics. That is, how much is personal visibility rising in society? How much does it affect one's social impact? How should this impact be evaluated and measured? All altmetric indicators are not the same; they are used differently in societies." M2
 | Open science | Information filtering | Institutional repositories increase citations because of free access to a university's publications (29).
Economic trends and driving forces | No dependence on a natural resource-based economy | The economic development of a society | The challenge of extracting natural resources and changing global demand is leading to the emergence of a knowledge-based economy. The production of national knowledge leads to the development of innovation and knowledge-based companies and to the economic progress of the country (30).
 | | Increased collaboration between university and industry | University researchers who collaborate with industry have better research performance (31).
 | Research grant | Reduction of non-governmental investment for research | "Now, there are many non-governmental organizations and institutions abroad that are sponsors of research, but this is not the case in our country. Mostly, governmental organizations support research projects in Iran." M10 Determining a suitable domestic research policy based on external budgeting patterns can increase research impact and productivity (32).
 | | Reduction of international research grants | "In the current situation, foreign organizations do not even give us a research budget." M1 Younger grant recipients in countries with lower research performance have less diversity in research outputs and collaboration networks (33).
 | An economic evaluation of research assessment | An economic analysis of research performance evaluation | "What are the costs, economic estimates, and results of these research evaluations? Is it in our interest at all? Then the economic trend will be defined..." M9
 | | An economic evaluation of research impact | "For example, there is a problem with research evaluations economically: analyzing research in the long term as a longitudinal process is very expensive..." M4
 | Research budget | Waste of research funding | "Now, one of the important issues in the world is the waste of money on research. It seems that many types of research have attracted large budgets in the world but for whatever reason could not reach the desired result." M5
 | | Lack of research budget | Research with financial support receives more citations, which vary by field and type of sponsor (33).
 | | Investment in all fields of science | "Leading countries have research diversity. They don't research in only one field of technical or medical sciences. They determine their competency and have research diversity for creating their network in all areas." M2
 | | Funding based on research performance | "It is possible that in the future, organizations, corporations, and research funding providers will move to assigning research resources based on research performance." M5 A combined research evaluation system can be effective in allocating funds, one part based on performance (motivating) and the other on institutional size to reduce costs (34). Governments are allocating research budgets based on performance indicators (28).
 | | Funding based on research priority | "The budget should be allocated for research that is a priority, not just for increasing the number of articles." M7
Environmental trends and driving forces | Increased emphasis on green information | - | "We are the information specialists; green information or environmental information suggests that future research should be environmentally compatible and have less polluting effects." M4
 | Using the green environmental components in research institutes | - | "I have heard that professors in some countries have a break in the summer to rest, think, and get creative in the forest. These environmental factors help a person's mind to relax." M2
Political trends and driving forces | Scientific diplomacy | Scientific complexity and competition | "More scientific complexity helps to advance the country: producing science that few countries or institutions can do. We call it scientific complexity." M8 Scientific diplomacy increases the international collaboration of domestic researchers with compatriot researchers in other countries (35).
Research networking and Variety“The knowledge edge of each field and research evaluation will move towards interdisciplinary and applied research in the future” M 5
Domestic policy of a country-“Political issues affect our research. At least, altmetrics shows that when The USA government wants to interpellate Trump, a lot of research is about this. So, this is very effective” M 1
War and political sanctions-“Political events certainly have a special effect. When a country is at war, from a political point of view, it takes precedence over defensive issues no other fields such as philosophy, social, and humanities sciences” M 1
Political trends and forcing driversResearch Performance Evaluation SystemBalanced inclusion of different dimensions in research evaluation “We should look at all of these factors that you count individually from the social factors to the environmental factors in the form of a system that affects each other” M 2“Research will be evaluated at the international than at the national or local levels. Assessments are now usually local or institutional ones” M 1“We will focus more on the final research products such as commercialized products, patents, or a change in a country’s health system and using their related indicators” M 5
Localization of research performance evaluations“I think a comprehensive and localized evaluation of academic, institution, and faculty performance it's a good option” M 9
Parallel work in research performance evaluation“For years, some persons have been saying that we are working, but it is not clear who is responsible for it. Everyone said that I was not responsible for it. It is unknown at this time who is responsible for it” M 6
Increased emphasis of evaluations on the efficiency and effectiveness of research“The research evaluation based on efficiency, effectiveness, or scientific productivity has not been considered now. In my opinion, more emphasis will be placed on these issues in the future.” M 6
Importance of the macro research policy in a countryThe existence of a national strategic research plan and the scientific national and international collaboration can be effective in the research performance of research centers (36).
Research prioritization“We research without knowing the aim of it and its evaluation and spend money on it. In my opinion, this is the main priority” M 10
Research equality“Policies always affect the research process. For example, our goal is to make a policy to encourage the best researchers that have international collaboration, professors, and innovators, etc. All of them make new indicators.” M 6

Analytical results

The effective social, technological, economic, environmental, and political trends and driving forces in the future of research performance evaluation were obtained from the scoping review and interviews, synthesized, and presented in the form of categories and sub-categories (Table 3).

Discussion

The current study aimed to determine the effective trends and driving forces in the future of research performance evaluation through interviews and a scoping review. Forty-two documents were reviewed, and 11 persons were interviewed. Then, the social, technological, economic, environmental, and political trends and driving forces were extracted and reported. The findings showed that the effective social trends and driving forces are the social development of a community, increasing researchers' awareness of the importance of research evaluation, the gender gap in society, the social impact of research, and the human resources of universities. Regarding these results, the authors' search showed a lack of sufficient attention to the social dimensions of research performance evaluation. Consistent with these results, Rababah et al. said that it is necessary to enhance researchers' awareness of ethical principles in conducting human research and to implement reviewing committees' standards (37). This is one of the research evaluation aspects that must be considered, along with other principles, by researchers. Besides, research performance evaluation should be normalized based on individual, organizational, cultural, and social factors. For example, the lack of gender normalization in research performance evaluations makes men appear superior, because women face many issues such as the roles of spouse and mother and the social and organizational factors in their societies. These factors can cause the Matilda effect in research publication, which negatively affects women's scientific and organizational positions, research collaboration, ability to obtain research grants, and so on. For this reason, numerous researchers emphasize considering gender normalization in research performance evaluation in the future (22,38-46).
On the other hand, the application of research in a society and its cultural and social impact have become an essential trend. This requires strengthening the university's relationship with society and creating a culture of research impact there, so that specialists can conduct research that meets the needs of society and convey it to the people through knowledge translation methods. In this regard, Pulido (47) and Eysenbach (48) considered assessing the social impact of research through data in social media such as Twitter, which is consistent with the current study's results. Banner et al. said that meaningful engagement of patients, in addition to the inclusion of patient-reported outcomes and priorities through Integrated Knowledge Translation (IKT), has been hailed as another mechanism to improve the relevance, impact, and efficiency of research (49). So, it is necessary to include suitable research impact metrics covering social, technological, economic, environmental, and political aspects in research performance evaluation systems. These indicators must measure the long-term impact of research. Another driving force in the future of research performance evaluation is employing capable staff in the field of research. For evaluating research performance, their individual variables (age, marriage, gender, personal research style, etc.) and organizational variables (educational and research infrastructure, university reputation, job position, organization size, etc.) should be considered. In this regard, researchers pointed out that the inclusion of these factors leads to the promotion of research and of methods for research performance evaluation (23,24,50,51). Cadez et al. said that research productivity is not related to teaching quality, whereas research quality is positively related to teaching quality (52). At present, these variables are not considered much in evaluating research performance by universities around the world, including in Iran.
This requires more attention from the research managers of universities in terms of the specific characteristics of their country. The current study showed that the main technological trends and driving forces are information and communication technology, scientometric indicators, and open science. Sile et al. revealed that information and communication technology is constantly evolving (28), and it is necessary to use new technologies such as big data, data mining, artificial intelligence, and machine learning in research performance evaluation. These new tools enable more accurate evaluations in less time. This requires the development of advanced technology infrastructure in universities and the documentation of all scientific, technological, and research products. In a similar study, Zhou et al. found that big scholarly data, as a large-scale collection of academic information, technical data, and collaboration relationships, can provide researchers with research collaboration navigation for their future work; so, big scholarly data analysis of social networks like ResearchGate can be a useful method for research performance evaluation (53). Feng also revealed that research practice is not merely determined by the capital possessed; moreover, international collaboration primarily accounts for the research performance of scholars, which can be measured through big data analysis (54). The literature review shows that, so far, no study has been conducted on the use of information and communication technologies such as artificial intelligence, data mining, and decision support systems to evaluate research performance, although using these methods could provide more accurate and evidence-based research assessments. This needs further investigation in future studies.
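As a concrete illustration of the scientometric indicators discussed above, the widely used h-index can be computed directly from a researcher's citation counts. The following is a minimal sketch, not part of the study's method:

```python
def h_index(citations):
    """h-index: the largest h such that h of the papers have at least
    h citations each (Hirsch's indicator)."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports a larger h
        else:
            break
    return h

# A researcher with papers cited 10, 8, 5, 4, and 3 times has h = 4.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Indicators like this are purely quantitative, which is why the trends identified here call for combining them with normalized social, economic, and impact-oriented metrics.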
Another technological trend is the increasing presence of researchers in international and national scientific social networks, which improves scientific collaboration and their scientometric indicators. To accurately assess the social impact of research at the national level, it is necessary to create integrated national social networks and establish a link between scientific social networks and citation databases. Some databases, such as Scopus, have made social media and website data available through PlumX, which is not yet complete and needs further work. Web of Science also provides Publons as a reviewing platform that links authors to reviewers and improves the reviewing process. This citation database provides a comprehensive author profile that includes a researcher's scientific publications in Web of Science and links to Scopus and ORCID. Ortega revealed that Publons is not very efficient, due to its incomplete coverage of scientific fields, publishers, and indicators, and needs to be upgraded; also, correlations between bibliometric and altmetric counts and the Publons metrics are very weak and not significant (55). Another study found that peer evaluation in Publons is not a measure of a work's quality and impact (56). These social networks have strengths and weaknesses that require further investigation in the future. Therefore, research managers should consider suitable new social media metrics in their research performance evaluation systems. One of the main technological trends in recent years is providing researchers with unrestricted access to social networks and scientific information. Some publishers do not agree with this for commercial reasons; however, today most journals and publishers desire to increase the visibility of their scientific output (57). Every university or research institution should provide a repository with open access to its scientific and research products globally.
This prevents duplicated research and improves the scientometric indicators of that organization. This finding was not considered by previous research and needs to be investigated. Today, one of the economic trends in the world is reducing dependence on natural energy resources like oil and gas and moving towards developing a knowledge-based economy, improving university-industry relationships, and specializing in certain scientific fields. Several researchers, similar to the current study, pointed out that various factors affect the relationship between university and industry. These include geographical, cultural, and social distance; the compliance of university and industry policies; the innovative capabilities of universities; market-based policies; and the industrial structure of a country. The triple helix of government, industry, and universities has increased the research income of universities (24,30,31,58-61). Thus, new metrics must be introduced for these research products in research performance evaluation, which need to be investigated by researchers. Reduced university research funding in some developing countries, such as Iran, is another important economic trend, one that has been intensified by economic and political sanctions. In alignment with the current study, Confraria revealed that a country's scientific specialization depends on its historical and cultural factors, the strength of its scientific institutions, the size of its scientific system, and the government's motivation and budget (30). The scientific impact of a country will therefore improve with increasing R&D budgets (62). Previous studies showed that the type of sponsor (governmental, organizational, international) can also affect citation indicators (33,63-65). It is better to use an integrated funding system based on institution size, research performance, R&D products, and research priority (28,34).
Therefore, a country's governors and research policy-makers should provide sufficient financial infrastructure to support its research growth. This research promotion improves the scientific status of that country's universities and research institutes nationally and internationally. Research managers and scientometricians must also consider the economic situation of a country in selecting their research performance evaluation methods and metrics. Another important economic trend is the reduction of foreign research grants due to economic and political sanctions in countries such as Iran. In this case, domestic private organizations should support researchers in that country. Similar to the current study, Berghe and Ghaseminik pointed out that countries under political and economic sanctions face difficulties in attracting international grants, and as a result, their international collaboration and the diversity of their research outputs are diminishing (32,33). It is also necessary to determine the cost-effectiveness of research performance evaluation before doing it. How much do these evaluations cost? How effective are their results in research performance evaluation? This requires the close cooperation of scientometricians and economists as a team. Besides, one of the things that have been neglected in research performance evaluations is the inclusion of environmental indicators. Library and information science now suggests green information and paperless research compatible with the environment, reducing its polluting effects. An organization that has a green and relaxing environment while saving energy resources can have a positive effect on research and researchers. Harris pointed out that universities must value their researchers and provide a relaxing environment where persons can think and research (66).
Among university ranking systems, only UI GreenMetric (https://greenmetric.ui.ac.id/) considers environmental factors in ranking the universities of the world, but it is not a complete ranking system in terms of its measurement of research performance. This factor needs to be investigated deeply in future research, and scientometricians must try to introduce new metrics for it. The present study shows that the domestic policies of a country, wars, and political sanctions affect scientific diplomacy and scientific relationships with other countries. Networking and international research collaboration strengthen scientific competition, interdisciplinary and applied research, and countries' specialization in certain scientific fields, which is called scientific complexity. In this case, researchers will not have problems publishing their articles in international journals. Domestic policies, including interaction with other countries, allow a country's researchers to travel to developed countries to obtain scientific experience and bring new knowledge back to their country. As a result, countries can turn brain drain into brain gain (62,67-71). In this regard, countries such as Iran, which have faced economic and political sanctions in recent years that inevitably affect various aspects such as research, should try to find appropriate methods and metrics for evaluating research performance. These factors should be studied in detail by scientometric researchers. Another main political driving force is the parallel work of different governmental organizations in determining the rules for research performance evaluation, especially in Iran. Also, around the world, different universities and research institutes have created various ranking systems based on common scientometric indicators and their own goals, which often overlap and differ slightly from each other (72).
It is necessary for each country to determine its strategic research evaluation policy and develop a national research performance evaluation system that measures research at the individual, organizational, national, and international levels. The main technological and political driving force is the existence of an integrated research performance evaluation system that will contribute to more accurate research evaluations of universities and individuals. In this regard, Djalalinia et al. suggested developing a national health research network evaluation; this observational system can detect the latest research priorities that need to be further addressed by all of the networks (59) and includes suitable normalized metrics regarding these dimensions, but different forms of publications are not introduced. Such a system must include quantitative, qualitative, combined, and research impact indicators. It should be a country-specific measurement that includes suitable metrics for social, technological, economic, environmental, and political factors and also considers the efficiency, effectiveness, and equality of research (25,36,73). Similar to the current study, Waltman (60) and Bornmann (61) emphasize that field-normalized scientometric indicators should be used in research performance evaluations. Several researchers showed that too much emphasis on quantitative indicators, such as the number of scientific productions and citations, can affect the publishing strategies of younger researchers (27,28). However, it is necessary to introduce new metrics that are normalized by researcher gender and age, field of study, and other aspects, which need to be studied in the future. Finally, we encountered some limitations in the current study.
One limitation was the lack of access to the full text of several documents, which we requested from the authors via social networks and e-mail. In the interview step, due to the coronavirus (COVID-19) pandemic and the impossibility of face-to-face interviews, WhatsApp and telephone were used.
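The field normalization emphasized in the discussion above (e.g. by Waltman and Bornmann) is commonly operationalized as a mean normalized citation score: each paper's citation count is divided by the average citations of comparable papers (same field, same year), and the ratios are averaged. The following is a minimal sketch with hypothetical baseline values, not a metric used in this study:

```python
def mncs(papers, baselines):
    """Mean Normalized Citation Score: each paper's citations divided by the
    average citations of comparable papers (same field and year), averaged
    over all papers. Scores above 1.0 mean above-average impact."""
    ratios = [cites / baselines[field] for cites, field in papers]
    return sum(ratios) / len(ratios)

# Hypothetical unit: one highly cited medical paper, one average maths paper.
papers = [(10, "medicine"), (2, "mathematics")]
baselines = {"medicine": 5.0, "mathematics": 2.0}  # illustrative field averages
print(mncs(papers, baselines))  # → 1.5
```

The same division-by-baseline pattern could extend to the gender and age normalization proposed above, given suitable reference groups.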

Conclusion

This study aimed to determine the effective trends and driving forces in research performance evaluation through a scoping review and interviews. The results showed that various social, technological, economic, environmental, and political factors and indicators must be included and normalized in national and international research performance evaluation systems. The social trends and factors were the social impact of research, the social development of society, increasing researchers' awareness of the importance of research evaluation, the lack of a research culture in society, the gender gap in society, and the human resources of universities. The technological trends and driving forces were the development of information and communication technology, scientometric indicators, and open science. The economic trends and driving forces included no dependence on a natural resource-based economy, research grants, an economic evaluation of research performance, and the research budget. The environmental trends and driving forces were an increased emphasis on green information and the use of green environmental components in research institutes. Eventually, the political trends and driving forces included scientific diplomacy, the domestic policy of a country, war and political sanctions, and the research performance evaluation system. We suggest more research on creating and normalizing new indicators of the social, technological, economic, environmental, and political dimensions in national and international research performance evaluation systems.

Acknowledgment

This study is the result of the fourth phase of a doctoral thesis entitled "Futures study of the research performance evaluation using the scenario approach" supported by Iran University of Medical Sciences, Tehran, Iran, with Code of Ethics IR.IUMS.REC.1398.229. The authors would like to thank all the colleagues and experts who participated in the interviews.

Conflict of Interests

The authors declare that they have no competing interests.
Appendix 1

Search strategies in the included databases

PubMed — 48 results
(“research performance”[tiab] OR “research performance assessment*”[tiab] OR “research performance evaluati*”[tiab] OR “research performance measurement*”[tiab] OR “research performance ranking*”[tiab] OR “research evaluati*”[tiab] OR “research assessment*”[tiab] OR “research measure*”[tiab] OR “research evaluation system*”[tiab] OR “research indicator*”[tiab] OR “research metric*”[tiab]) AND (scientometric*[tiab] OR bibliometric*[tiab] OR informetric*[tiab]) AND (“social trend*”[tiab] OR “economical trend*”[tiab] OR “political trend*”[tiab] OR “technological trend*”[tiab] OR “environment* trend*”[tiab] OR trend*[tiab] OR “driving force*”[tiab] OR determinant*[tiab] OR factor*[tiab])

Embase — 49 results
(“research performance”:ti,ab OR “research performance assessment*”:ti,ab OR “research performance evaluati*”:ti,ab OR “research performance measurement*”:ti,ab OR “research performance ranking*”:ti,ab OR “research evaluati*”:ti,ab OR “research assessment*”:ti,ab OR “research measure*”:ti,ab OR “research evaluation system*”:ti,ab OR “research indicator*”:ti,ab OR “research metric*”:ti,ab) AND (scientometric*:ti,ab OR bibliometric*:ti,ab OR informetric*:ti,ab) AND (“social trend*”:ti,ab OR “economical trend*”:ti,ab OR “political trend*”:ti,ab OR “technological trend*”:ti,ab OR “environment* trend*”:ti,ab OR trend*:ti,ab OR “driving force*”:ti,ab OR determinant*:ti,ab OR factor*:ti,ab)

LITA — 0 results
(“research performance” OR “research performance assessment*” OR “research performance evaluati*” OR “research performance measurement*” OR “research performance ranking*” OR “research evaluati*” OR “research assessment*” OR “research measure*” OR “research evaluation system*” OR “research indicator*” OR “research metric*”) AND (scientometric* OR bibliometric* OR informetric*) AND (“social trend*” OR “economical trend*” OR “political trend*” OR “technological trend*” OR “environment* trend*” OR trend* OR “driving force*” OR determinant* OR factor*)

LISTA — 293 results
(“research performance” OR “research performance assessment*” OR “research performance evaluati*” OR “research performance measurement*” OR “research performance ranking*” OR “research evaluati*” OR “research assessment*” OR “research measure*” OR “research evaluation system*” OR “research indicator*” OR “research metric*”) AND (scientometric* OR bibliometric* OR informetric*) AND (“social trend*” OR “economical trend*” OR “political trend*” OR “technological trend*” OR “environment* trend*” OR trend* OR “driving force*” OR determinant* OR factor*)

Springer — 972 results
(“research performance” OR “research performance assessment*” OR “research performance evaluati*” OR “research performance measurement*” OR “research performance ranking*” OR “research evaluati*” OR “research assessment*” OR “research measure*” OR “research evaluation system*” OR “research indicator*” OR “research metric*”) AND (scientometric* OR bibliometric* OR informetric*) AND (“social trend*” OR “economical trend*” OR “political trend*” OR “technological trend*” OR “environment* trend*” OR trend* OR “driving force*” OR determinant* OR factor*)

ProQuest — 96 results
ti("research performance" OR "research performance assessment*" OR "research performance evaluati*" OR "research performance measurement*" OR "research performance ranking*" OR "research evaluati*" OR "research assessment*" OR "research measure*" OR "research evaluation system*" OR "research indicator*" OR "research metric*") AND ti(scientometric* OR bibliometric* OR informetric*) AND ti("social trend*" OR "economical trend*" OR "political trend*" OR "technological trend*" OR "environment* trend*" OR trend* OR "driving force*" OR determinant* OR factor*)

IEEE — 3846 results
(“research performance” OR “research performance assessment” OR “research performance evaluati*” OR “research performance measurement” OR “research performance ranking” OR “research evaluati*” OR “research assessment” OR “research measure” OR “research evaluation system” OR “research indicator” OR “research metric”)

Web of Science — 402 results
(TS=(“research performance”) OR TS=(“research performance assessment*”) OR TS=(“research performance evaluati*”) OR TS=(“research performance measurement*”) OR TS=(“research performance ranking*”) OR TS=(“research evaluati*”) OR TS=(“research assessment*”) OR TS=(“research measure*”) OR TS=(“research evaluation system*”) OR TS=(“research indicator*”) OR TS=(“research metric*”)) AND (TS=(scientometric*) OR TS=(bibliometric*) OR TS=(informetric*)) AND (TS=(“social trend*”) OR TS=(“economical trend*”) OR TS=(“political trend*”) OR TS=(“technological trend*”) OR TS=(“environment* trend*”) OR TS=(trend*) OR TS=(“driving force*”) OR TS=(determinant*) OR TS=(factor*))

Scopus — 404 results
(TITLE-ABS-KEY("research performance") OR TITLE-ABS-KEY("research performance assessment*") OR TITLE-ABS-KEY("research performance evaluati*") OR TITLE-ABS-KEY("research performance measurement*") OR TITLE-ABS-KEY("research performance ranking*") OR TITLE-ABS-KEY("research evaluati*") OR TITLE-ABS-KEY("research assessment*") OR TITLE-ABS-KEY("research measure*") OR TITLE-ABS-KEY("research evaluation system*") OR TITLE-ABS-KEY("research indicator*") OR TITLE-ABS-KEY("research metric*")) AND (TITLE-ABS-KEY(scientometric*) OR TITLE-ABS-KEY(bibliometric*) OR TITLE-ABS-KEY(informetric*)) AND (TITLE-ABS-KEY("social trend*") OR TITLE-ABS-KEY("economical trend*") OR TITLE-ABS-KEY("political trend*") OR TITLE-ABS-KEY("technological trend*") OR TITLE-ABS-KEY("environment* trend*") OR TITLE-ABS-KEY(trend*) OR TITLE-ABS-KEY("driving force*") OR TITLE-ABS-KEY(determinant*) OR TITLE-ABS-KEY(factor*))

Google Scholar, Google — keywords
research performance, social, technological, economic, environmental, political, scientometrics, bibliometrics, research performance evaluation, research performance measurement, research performance assessment
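The strategies above all combine the same three AND-ed concept blocks (research performance terms, scientometric method terms, trend/driving-force terms), adapted to each database's syntax. As a minimal sketch (using an abbreviated subset of the terms for brevity), the PubMed form of the query can be assembled programmatically:

```python
# Abbreviated subsets of the three concept blocks from the appendix;
# the full strategy uses the complete term lists shown above.
PERFORMANCE_TERMS = [
    '"research performance"', '"research evaluati*"', '"research assessment*"',
    '"research indicator*"', '"research metric*"',
]
METHOD_TERMS = ["scientometric*", "bibliometric*", "informetric*"]
TREND_TERMS = ['"social trend*"', '"driving force*"', "determinant*", "factor*"]

def block(terms, tag="[tiab]"):
    """OR together a concept's terms, each restricted to title/abstract."""
    return "(" + " OR ".join(term + tag for term in terms) + ")"

# The final strategy ANDs the three concept blocks, as in each database row.
query = " AND ".join(block(t) for t in (PERFORMANCE_TERMS, METHOD_TERMS, TREND_TERMS))
print(query)
```

Generating the query this way keeps the concept blocks identical across databases, with only the field tag (here `[tiab]`, elsewhere `:ti,ab` or `TITLE-ABS-KEY(...)`) varying.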
