Saturday - 18 May 2024 - 5:05 PM

Does a university's ranking truly reflect the standard of education it provides?

Prof. Ashok Kumar

Rankings are products of a capitalist system that encourages unhealthy competition and a rat race among institutions. Yet they are now, unfortunately, treated as important markers of quality. The distinction between athletics and academics is slowly vanishing. Everyone wants to be at the top, to be a winner. That is how society sets the parameters, and institutions unconsciously adopt those standards under the illusion that a ranking reflects quality.

Several university ranking agencies now assess and rank universities, and their rankings are gaining popularity among students, parents, universities, private organizations, and even governments.

Major international rankings include the QS World University Rankings, the Times Higher Education World University Rankings and the Academic Ranking of World Universities. Other global rankings include the Aggregate Ranking of Top Universities, the Center for World University Rankings, the Leiden Ranking, Eduniversal, the G-factor, the Global University Ranking, the HEEACT Ranking of Scientific Papers, the Human Resources & Labor Review, the High Impact Universities Research Performance Index, the Nature Index, the Newsweek ranking, the Professional Ranking of World Universities, Reuters' World's Top 100 Innovative Universities, the Round University Ranking, the SCImago Institutions Rankings, U-Multirank, the University Ranking by Academic Performance and the U.S. News & World Report Best Global Universities Rankings.

Citations are considered the most important index of quality, but we never bother to ask why. Who decided that citations should be the index of quality? How does a citation ensure that a particular paper is of high quality?

Yet we are delighted when we gather a greater number of citations, and the same is true of many other parameters. Rankings serve the market-driven economy, and that is why they have become so important.

The criteria used by these agencies differ so widely that an institution in the top 100 of one ranking may only make the top 300 of another. Should these rankings be taken seriously? Should they be used as a measure of a student's quality for postgraduate study or a job application?

The problem is that ranking agencies are making a business out of it.

The use of rankings to assess universities also needs a rethink. These league tables, such as the Academic Ranking of World Universities (ARWU), the Times Higher Education World University Rankings (THE WUR) and others, determine eligibility for scholarships and other income, and sway where scholars decide to work and study.

The three longest established and most influential global rankings are those produced by Quacquarelli Symonds (QS), Times Higher Education (THE) and Shanghai Ranking Consultancy (the Academic Ranking of World Universities; ARWU). All of these, along with other global rankings, primarily measure the research performance of universities rather than their teaching. They have been criticized for being “largely based on what can be measured rather than what is necessarily relevant and important to the university”, and the validity of the data available globally has been questioned.

These criteria do not actually measure teaching, and none even come close to assessing quality of impact. Many rankings are also considered to contain biases.

India’s University Grants Commission requires foreign partners of Indian universities to be ranked in the top 500 of the THE or ARWU world rankings.

The QS rankings have been criticized for their commercial purpose and because they can be manipulated by researchers from different universities agreeing to support one another in the surveys.

Critics argue that rankings can divert universities’ attention away from teaching and social responsibility towards the type of scientific research valued by the indicators used in ranking exercises. There are also concerns that, by applying a limited set of criteria to world universities, and given the strong desire to feature in the top 200, rankings actually encourage the homogenization of higher education institutions, making them less responsive and less relevant to their immediate contexts. Rankings are also said to entrench the advantage enjoyed by the 200 best-ranked institutions, which has important implications for equity.

INDIA: The National Institutional Ranking Framework (NIRF) is the methodology adopted by the Ministry of Education, Government of India, to rank institutions of higher education in India. The framework was approved by the MHRD and launched by the Minister of Human Resource Development on 29 September 2015. There are separate rankings for different types of institutions depending on their areas of operation, such as universities and colleges, engineering institutions, management institutions, pharmacy institutions and architecture institutions. The framework uses several parameters for ranking, such as resources, research and stakeholder perception. These parameters are grouped into five clusters, and each cluster is assigned a weightage that depends on the type of institution. About 3,500 institutions voluntarily participated in the first round of rankings.

(The author is Vice Chancellor, Nirwan University, Jaipur)
