Universities are currently facing new societal challenges, both in teaching and learning and in research. One of the biggest developments driving these challenges is the adoption of the United Nations Sustainable Development Goals (SDGs). As expectations of educational quality evolve, integrating sustainable development in a holistic and inclusive manner is becoming increasingly important.
However, universities are struggling both to provide education that addresses these new issues through transformative teaching methods and to evaluate the effectiveness of those efforts and their impact on students. This article explores two methods for measuring performance in the European and Asian settings: quality assurance systems (QAS) and rankings.
QAS in the European and the Asian contexts: do they address the ‘new’ policy issues?
The first method for assessing teaching quality is quality assessment within the framework of (inter)national QAS. A European study, which assisted higher education institutions (HEIs) in establishing effective internal quality assurance (IQA) systems, created a uniform set of quality indicators for monitoring and evaluating performance. The study highlighted that universities are seeking indicators to help them become more sustainable and inclusive in all aspects of their operations, including teaching, research, and general operations.
Despite the efforts to support HEIs in implementing effective IQA systems, existing frameworks have not provided the support universities need. These frameworks include indicators such as the percentage of international students and the number of sustainability actions per student. However, there is little agreement on which indicators to use and a shortage of data on them. Moreover, the information gathered is typically used not for self-reflection and institutional improvement, but to fulfil the accountability demands of external accreditation agencies.
A similar situation exists in the Asian region. Unlike Europe, it features a diverse range of national QAS, yet none of them specifically address the emerging societal challenges. Even so, quality assurance agencies in countries such as Taiwan, Japan, and Malaysia are exploring the possibility of incorporating standards related to HE diversity, the third mission, and the SDGs in future institutional accreditation cycles. It appears, however, that QAS alone may not be the most effective way to assess universities' progress in addressing these new policy issues.
Can global university rankings help HEIs’ decision-makers regarding these ‘new’ challenges?
Our research explores the use of rankings as a possible tool for comparing university performance in addressing new societal challenges related to the SDGs. Traditionally, global university rankings were (and still are) very much research-oriented, producing league tables in which the big research universities always end up at the top. In recent years, some of these rankings have broadened their scope with more region-specific and specialised lists.
The QS World University Rankings presents special rankings based on subsamples of the HEIs participating in QS (by subject, graduate employability, MBA programmes, US HEIs). Recently, it has added a new 'layer' that assesses institutions' research in two SDG categories: inequality and the environment.
The Times Higher Education Impact Rankings focuses on the SDGs and on university activities that contribute to addressing societal challenges. This shift marks a transition from quantitative, normative assessments to more qualitative evaluations. The problem is that its methodology is flawed, and performance on inclusion and sustainability is assessed in a competitive setting.
To appear on the list, universities must provide a significant amount of information, which can be a major obstacle for smaller institutions with limited organisational capacity. A university's presence on the list is reduced to a single position, used primarily for marketing purposes. Lacking context on institutional mission or region, this ranking offers universities no opportunity to learn from and improve with one another.
An alternative ranking, U-Multirank, offers a more user-driven comparison that goes beyond the traditional one-dimensional league table. It has recently introduced indicators for some of the 'new' challenges. While this gives users access to comparable data, U-Multirank's coverage is still limited.
What is the coverage of ‘new’ policy issues in QAS and rankings?
Figure 1 presents a tentative assessment of the coverage of 'new' policy issues, namely study success, social inclusion, and sustainability, by the two types of information tools analysed: QAS (subdivided into external quality assurance, EQA, and internal quality assurance, IQA) and rankings (THE, ARWU, QS, and U-Multirank).

Note: The size of the circle indicates relative coverage.
Source: Frans Kaiser, Ana I. Melo, and Angela Hou (authors)
How to move forward?
To effectively measure universities' performance on societal challenges such as sustainable development and inclusion, it is crucial to continue the search for appropriate indicators. One potential solution is to involve a diverse group of stakeholders and experts in developing these indicators.
Involving a diverse range of stakeholders can help higher education institutions and quality assurance agencies identify and use relevant information and data to assess their progress on these 'new' societal challenges. Additionally, fostering a culture of exchange and collaboration, in which ideas, experiences, and knowledge are shared among institutions, can help embed these challenges into quality assessment tools.
Journal reference
Kaiser, F., Melo, A. I., & Hou, A. Y. (2022). Are quality assurance and rankings useful tools to measure 'new' policy issues in higher education? The practices in Europe and Asia. European Journal of Higher Education, 12(sup1), 391–415. https://doi.org/10.1080/21568235.2022.2094816