Adrian G Barnett, David Moher.
Abstract
Background: Universities closely watch international league tables because these tables influence governments, donors and students. Achieving a high ranking in a table, or an annual rise in ranking, allows universities to promote their achievements using an externally validated measure. However, league tables predominantly reward measures of research output, such as publications and citations, and may therefore be promoting poor research practices by encouraging the "publish or perish" mentality.
Keywords: league tables; meta-research; research quality; research reporting
Year: 2019 PMID: 31316755 PMCID: PMC6611132 DOI: 10.12688/f1000research.18453.2
Source DB: PubMed Journal: F1000Res ISSN: 2046-1402
Number of complete and missing affiliation data by country for the top ten countries.
“Missing” is included as a nominal country, that is, the affiliation and country data were both missing. Countries are ordered by number missing.
| Country | Complete | Missing | % missing |
|---|---|---|---|
| Missing | 72 | 55 | 43.3 |
| United States | 8,064 | 39 | 0.5 |
| Italy | 2,644 | 22 | 0.8 |
| United Kingdom | 5,223 | 16 | 0.3 |
| Australia | 4,187 | 14 | 0.3 |
| Brazil | 1,609 | 12 | 0.7 |
| Canada | 3,817 | 12 | 0.3 |
| Germany | 1,606 | 12 | 0.7 |
| Spain | 1,306 | 10 | 0.8 |
| China | 4,098 | 8 | 0.2 |
| Other | 14,991 | 59 | 0.4 |
| Total | 47,617 | 259 | 0.5 |
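The “% missing” column is simply missing / (complete + missing), e.g. 55 / (72 + 55) ≈ 43.3% for the “Missing” row. The study’s own analysis code is in R; a minimal Python sketch that recomputes the column from the counts above:

```python
def pct_missing(complete: int, missing: int) -> float:
    """Missing papers as a percentage of all papers, to one decimal place."""
    return round(100 * missing / (complete + missing), 1)

# Spot-check three rows from the table.
rows = {"Missing": (72, 55), "United States": (8064, 39), "Total": (47617, 259)}
for country, (complete, missing) in rows.items():
    print(country, pct_missing(complete, missing))
```

This reproduces 43.3% for the “Missing” row and 0.5% for the United States and the overall total.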
Total good research practice scores for the top ten regions and countries in 2016 and 2017.
These results exclude “Missing” as a nominal country or region.
| Rank | Region | 2016 | 2017 |
|---|---|---|---|
| 1 | Western Europe | 2,459 | 2,986 |
| 2 | Northern America | 1,521 | 1,807 |
| 3 | Asia (excluding Near East) | 1,279 | 1,658 |
| 4 | Oceania | 593 | 727 |
| 5 | Latin America and Caribbean | 325 | 424 |
| 6 | Near East | 86 | 109 |
| 7 | Sub-Saharan Africa | 61 | 89 |
| 8 | Eastern Europe | 46 | 71 |
| 9 | Northern Africa | 35 | 38 |
| 10 | Baltics | 5 | 7 |
| Rank | Country | 2016 | 2017 |
|---|---|---|---|
| 1 | United States | 1,074 | 1,269 |
| 2 | China | 871 | 1,064 |
| 3 | United Kingdom | 719 | 827 |
| 4 | Australia | 553 | 668 |
| 5 | Canada | 440 | 526 |
| 6 | Italy | 319 | 358 |
| 7 | Netherlands | 296 | 349 |
| 8 | Brazil | 266 | 345 |
| 9 | Germany | 220 | 277 |
| 10 | Denmark (2016) / Spain (2017) | 136 | 190 |
Top ten ranking universities in 2016 and 2017 for our good research practice table.
Universities are ordered by their score in each year. The cluster column is the median cluster from the Bayesian model, with ‘5’ the highest cluster. The rank is the median rank and 95% bootstrap confidence interval in brackets. The standard rank is based on counting each university’s annual papers.
| University | Score | Cluster | Good practice rank | Standard rank |
|---|---|---|---|---|
| 2016 | | | | |
| University of Toronto | 82.8 | 5 | 1 (1 to 2) | 2 |
| University of Sydney | 75.8 | 5 | 2 (1 to 2) | 5 |
| Missing | 47.3 | 4 | 4 (3 to 12) | –ᵃ |
| King’s College London | 46.5 | 4 | 4 (3 to 10) | 16 |
| Zhejiang University | 42.0 | 4 | 7 (3 to 19) | 176 |
| University College London | 40.7 | 4 | 8 (3 to 17) | 7 |
| Mayo Clinic | 39.7 | 4 | 9 (3 to 20) | 38 |
| West China Hospital of Sichuan University | 39.1 | 4 | 9 (3 to 22) | 239 |
| Erasmus University Rotterdam | 38.1 | 4 | 10 (4 to 21) | 92 |
| University of Melbourne | 37.6 | 4 | 11 (4 to 20) | 13 |
| 2017 | | | | |
| University of Toronto | 97.4 | 5 | 1 (1 to 1) | 1 |
| University of Sydney | 67.2 | 5 | 2 (2 to 4) | 5 |
| West China Hospital of Sichuan University | 56.7 | 5 | 4 (2 to 10) | 206 |
| Missing | 56.6 | 5 | 4 (2 to 10) | –ᵃ |
| University College London | 53.8 | 4 | 5 (2 to 10) | 8ᵇ |
| King’s College London | 50.3 | 4 | 7 (3 to 13) | 12 |
| Harvard University | 50.1 | 4 | 7 (3 to 12) | 8ᵇ |
| University of Ottawa | 47.4 | 4 | 9 (4 to 14) | 95 |
| Monash University | 47.2 | 4 | 9 (4 to 15) | 25 |
| University of Oxford | 46.8 | 4 | 9 (4 to 16) | 64 |
ᵃ There was no standard rank for missing affiliations. ᵇ Tied.
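The “Good practice rank” column gives a median rank with a 95% bootstrap interval. The authors’ analysis was done in R; the sketch below re-creates the idea in Python. The university names, per-paper scores, and resampling scheme here are all hypothetical illustrations, not the paper’s exact procedure: resample each university’s papers with replacement, re-total the scores, re-rank, and summarise each university’s rank distribution.

```python
import random

# Hypothetical per-paper good-practice scores for three universities
# (illustrative only; not the study's data).
papers = {
    "Uni A": [1.0, 0.5, 0.9, 0.7, 0.8],
    "Uni B": [0.6, 0.9, 0.4, 0.8],
    "Uni C": [0.3, 0.5, 0.2, 0.6, 0.4, 0.5],
}

def bootstrap_ranks(papers, n_boot=2000, seed=1):
    """Resample each university's papers with replacement,
    re-total the scores, and re-rank the universities."""
    rng = random.Random(seed)
    ranks = {u: [] for u in papers}
    for _ in range(n_boot):
        totals = {u: sum(rng.choices(s, k=len(s))) for u, s in papers.items()}
        order = sorted(totals, key=totals.get, reverse=True)  # rank 1 = best
        for rank, u in enumerate(order, start=1):
            ranks[u].append(rank)
    return ranks

def summarise(rank_samples):
    """Median rank and a 95% percentile interval."""
    s = sorted(rank_samples)
    n = len(s)
    return s[n // 2], s[int(0.025 * n)], s[int(0.975 * n) - 1]

for u, r in bootstrap_ranks(papers).items():
    med, lo, hi = summarise(r)
    print(f"{u}: median rank {med} ({lo} to {hi})")
```

Universities with few papers or closely spaced totals get wide intervals, which is why ranks such as “9 (3 to 22)” appear in the table above.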
Cross-tabulation of estimated clusters for universities in 2016 (rows) and 2017 (columns).
The diagonal numbers in bold correspond to no change from 2016 to 2017. ‘5’ is the highest cluster with the best score.
| 2016 \ 2017 | 1 | 2 | 3 | 4 | 5 | Total |
|---|---|---|---|---|---|---|
| 1 | **120** | 80 | 2 | 0 | 0 | 202 |
| 2 | 48 | **129** | 30 | 0 | 0 | 207 |
| 3 | 0 | 10 | **42** | 9 | 0 | 61 |
| 4 | 0 | 0 | 2 | **16** | 2 | 20 |
| 5 | 0 | 0 | 0 | 0 | **2** | 2 |
| Total | 168 | 219 | 76 | 25 | 4 | 492 |
Figure 1. Bland–Altman plots of the agreement in university league table ranks between 2016 and 2017 for our good research practice league table and the Times Higher Education league table for research.
We only examine universities in the top 200 in both years, which is 161 in our table and 184 in the THE table. The dashed horizontal lines are the Bland–Altman limits of agreement.
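Bland–Altman limits of agreement are the mean of the paired differences plus or minus 1.96 times their standard deviation. A small Python sketch with hypothetical ranks (not the study's data):

```python
from statistics import mean, stdev

def bland_altman_limits(ranks_2016, ranks_2017):
    """Mean rank difference and 95% limits of agreement
    (mean +/- 1.96 * SD of the paired differences)."""
    diffs = [b - a for a, b in zip(ranks_2016, ranks_2017)]
    d_mean, d_sd = mean(diffs), stdev(diffs)
    return d_mean, d_mean - 1.96 * d_sd, d_mean + 1.96 * d_sd

# Hypothetical ranks for five universities in both years.
r2016 = [1, 2, 3, 4, 5]
r2017 = [1, 3, 2, 5, 4]
print(bland_altman_limits(r2016, r2017))  # mean diff, lower limit, upper limit
```

Roughly 95% of year-to-year rank differences are expected to fall inside the dashed limit lines; narrower limits indicate more stable rankings between 2016 and 2017.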
Self-assessment of our Good Research Practice league table against the ten principles for the responsible use of university rankings [42].
| # | Principle | Self-assessment |
|---|---|---|
| 1 | A generic concept of university performance should not be used | We did not use a composite measure and detail what our score measures |
| 2 | A clear distinction should be made between size-dependent and size-independent indicators | Our score is size-dependent and we acknowledge that universities with larger health and … |
| 3 | Universities should be defined in a consistent way | Some universities had varying affiliation wordings and we tried to appropriately combine … |
| 4 | University rankings should be sufficiently transparent | We have openly shared our R code that produced the tables and described our methods in … |
| 5 | Comparisons between universities should be made keeping in mind the … | This is a matter of how readers interpret differences between universities. To aid comparisons … |
| 6 | Uncertainty in university rankings should be acknowledged | We used a bootstrap procedure to estimate the uncertainty in ranks. |
| 7 | An exclusive focus on the ranks of universities in a university ranking should be avoided | We used clustering to try to more sensibly group universities by performance compared with … |
| 8 | Dimensions of university performance not covered by university rankings should not be overlooked | We acknowledge that our table has a specific focus on health and medical research. Within … |
| 9 | Performance criteria relevant at the university level should not automatically be … | Our scores may be the amalgam of multiple schools in the same university, e.g., schools of … |
| 10 | University rankings should be handled cautiously, but they should not be dismissed | We aimed to provide a different ranking system to current league tables, and one that might … |