College and university rankings are rankings of institutions of higher education, ordered by various combinations of factors. Rankings have most often been conducted by magazines, newspapers, websites, governments, or academics. In addition to ranking entire institutions, organizations rank specific programs, departments, and schools. Rankings variously consider measures of funding and endowment, research excellence and influence, specialization expertise, admissions, student options, award numbers, internationalization, graduate employment, industrial linkage, historical reputation and other criteria; most evaluate institutions chiefly on their research output. Some rankings evaluate institutions within a single country, while others assess institutions worldwide. The subject has produced much debate about rankings' usefulness and accuracy. The expanding diversity of rating methodologies, and the criticisms directed at each, indicate a lack of consensus in the field, and it appears possible to game the ranking systems through excessive self-citation. Taken together, the variety of academic rankings offers a broad overview of different institutions' composite capabilities in academia. Whilst the United Nations advocates higher education's role as a common good that builds social mobility and equips everyone who participates with skills, college rankings are presented as a transparent tool for fair public evaluation.
Global rankings
See Regional and national rankings for university rankings within a particular region. Several organizations produce worldwide university rankings, including the following.
The three longest established and most influential global rankings are those produced by ShanghaiRanking Consultancy (the Academic Ranking of World Universities; ARWU), Times Higher Education (THE) and Quacquarelli Symonds (QS). All of these, along with other global rankings, primarily measure the research performance of universities rather than their teaching. They have been criticised for being "largely based on what can be measured rather than what is necessarily relevant and important to the university", and the validity of the data available globally has been questioned.
While some rankings attempt to measure teaching using metrics such as staff to student ratio, the Higher Education Policy Institute has pointed out that the metrics used are more closely related to research than teaching quality, e.g. "Staff to student ratios are an almost direct measure of research activity", and "The proportion of PhD students is also to a large extent an indication of research activity". Inside Higher Ed similarly states "these criteria do not actually measure teaching, and none even come close to assessing quality of impact". Many rankings are also considered to contain biases towards the natural sciences and, due to the bibliometric sources used, towards publication in English-language journals. Some rankings, including ARWU, also fail to make any correction for the sizes of institutions, so a large institution is ranked considerably higher than a small institution with the same quality of research. Other compilers, such as Scimago and U.S. News and World Report, use a mix of size-dependent and size-independent metrics.
Some compilers, notably QS, THE and U.S. News, use reputational surveys. The validity of these has been criticised: "Most experts are highly critical of the reliability of simply asking a rather unrandom group of educators and others involved with the academic enterprise for their opinions"; "methodologically [international surveys of reputation] are flawed, effectively they only measure research performance and they skew the results in favour of a small number of institutions."
However, despite the criticism, much attention is paid to global rankings, particularly ARWU, QS and THE. Some countries, including Denmark and the Netherlands, use university rankings as part of points-based immigration programmes, while others, such as Russia, automatically recognise degrees from higher-ranked universities. India's University Grants Commission requires foreign partners of Indian universities to be ranked in the top 500 of the THE or ARWU ranking, while Brazil's Science Without Borders programme selected international partner institutions using the THE and QS rankings.
Academic Ranking of World Universities
The Academic Ranking of World Universities (ARWU), compiled originally by Shanghai Jiao Tong University and now maintained by the ShanghaiRanking Consultancy, has provided annual global rankings of universities since 2003, making it the earliest of its kind. ARWU rankings have been cited by The Economist magazine, and the ranking has been lauded for being "consistent and transparent". The education ministers of France, Norway and Denmark have traveled to China to discuss ways of improving their countries' rankings. ARWU does not rely on surveys or school submissions. Among other criteria, ARWU counts the number of articles published in Nature or Science and the number of Nobel Prize winners and Fields Medalists (mathematics). Harvard has topped the ranking for years. One of the primary criticisms of ARWU's methodology is that it is biased towards the natural sciences and English-language science journals over other subjects. Moreover, the ARWU is known for "relying solely on research indicators", and "the ranking is heavily weighted toward institutions whose faculty or alumni have won Nobel Prizes": it does not measure "the quality of teaching or the quality of humanities."
Center for World University Rankings
This Saudi Arabia-based consulting organization has published yearly rankings of world universities since 2012. Rankings are based on quality of education, alumni employment, quality of faculty, number of publications, number of publications in high-quality journals, citations, scientific impact and number of patents.
Eduniversal
This university ranking is owned by the French consulting company and rating agency SMBG. It ranks master's degrees and MBA programmes across its nine geographical regions (covering the five continents).
G-factor
G-factor ranks university and college web presence by counting the number of links only from other university websites, using Google search engine data. The G-factor is an indicator of the popularity or importance of each university's website from the combined perspectives of other institutions. It claims to be an objective peer review of a university through its website; in social network theory terminology, the G-factor measures the centrality of each university's website in the network of university websites.
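As a rough illustration of this idea, the short sketch below (in Python, with invented link data; the real G-factor is harvested from Google search results) counts how many distinct university sites link to each university, which is simply the in-degree of each node in the inter-university link network:

    from collections import defaultdict

    # Hypothetical inter-university link data: each pair means "site A links to site B".
    links = [
        ("uni-a.edu", "uni-b.edu"),
        ("uni-c.edu", "uni-b.edu"),
        ("uni-a.edu", "uni-c.edu"),
        ("uni-b.edu", "uni-c.edu"),
        ("uni-d.edu", "uni-b.edu"),
    ]

    # G-factor-style score: number of distinct university sites linking to each site.
    inbound = defaultdict(set)
    for source, target in links:
        if source != target:          # ignore self-links
            inbound[target].add(source)

    g_scores = {site: len(sources) for site, sources in inbound.items()}
    for site, score in sorted(g_scores.items(), key=lambda kv: -kv[1]):
        print(site, score)            # uni-b.edu 3, uni-c.edu 2, ...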
Global University Ranking
Global University Ranking measures over 400 universities using a rating produced by RatER, an autonomous, non-commercial Russian rating agency supported by Russia's academic society. The methodology pools universities from ARWU, HEEACT, Times-QS and Webometrics, and a pool of experts formed by project officials and managers determines the rating scales for indicators in seven areas. It considers academic performance, research performance, faculty expertise, resource availability, socially significant activities of graduates, international activities, and international opinion. Each expert independently evaluates these performance indicators for candidate universities, and the rating is the average of the expert evaluations. This ranking raised questions when it placed Moscow State University in fifth place, ahead of Harvard and Cambridge.
HEEACT Ranking of Scientific Papers
The Performance Ranking of Scientific Papers for World Universities was produced until 2012 by the Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT). The indicators were designed to measure both long-term and short-term research performance of research universities.
This project employed bibliometrics to analyze and rank the performance of the 500 top universities and the top 300 universities in six fields. HEEACT further provided subject rankings in science and technology fields, ranking the top 300 universities across ten science and technology fields. The ranking included eight indicators: articles published over the prior 11 years, citations of those articles, "current" articles, current citations, average citations, h-index, number of "highly cited papers" and high-impact journal articles. Together these represented three criteria of scientific paper performance: research productivity, research impact, and research excellence.
The 2007 ranking methodology was alleged to have favored universities with medical schools, and in response, HEEACT added assessment criteria. The six field-based rankings are based on the subject categorization of WOS, including Agriculture & Environment Sciences (AGE), Clinical Medicine (MED), Engineering, Computing & Technology (ENG), Life Sciences (LIFE), Natural Sciences (SCI) and Social Sciences (SOC). The ten subjects include Physics, Chemistry, Mathematics, Geosciences, Electrical Engineering, Computer Science, Mechanical Engineering, Chemical Engineering (including Energy & Fuels), Materials Sciences, and Civil Engineering (including Environmental Engineering). The ranking was renamed as National Taiwan University Ranking in 2012.
Human Resources & Labor Review
The Human Resources & Labor Review (HRLR) publishes a human competitiveness index and analysis annually through Asia First Media, previously ChaseCareer Network (ChaseCareer.Net). The system is based on the Human Resources & Labour Review Indexes (HRI and LRI), which measure the performance of the top 300 universities' graduates.
In 2004, a number of educational institutions voiced concerns at several events regarding the accuracy and effectiveness of ranking bodies and lists. The HRLR ranking was pioneered in late 2005 within a working group in response to those concerns. The team was founded in January 2007 in London and started compiling and processing data, resulting in the first lists in 2007-2008. The ranking concept was later adopted for the alumni score in ARWU and many other rankings.
The HRLR ranking's methods sparked intense interest from many institutions and inspired several other ranking lists and scoring systems based on professional, alumni, executive, competitiveness and human-capital-oriented aspects. Nevertheless, HRLR presents itself as a leader in university ranking, with approaches that do not rely merely on those aspects.
High Impact Universities: Research Performance Index
The High Impact Universities Research Performance Index (RPI) is a 2010 Australian initiative that studies university research performance. The pilot project involved a trial of over 1,000 universities or institutions and 5,000 constituent faculties (in various disciplines) worldwide. The top 500 results for universities and faculties were reported at the project website. The project promotes simplicity, transparency and fairness. The assessment analyzes research performance as measured by publications and citations. Publication and citation data is drawn from Scopus. The project uses standard bibliometric indicators, namely the 10-year g-index and h-index. RPI equally weighs contributions from the five faculties. The five faculty scores are normalized to place them onto a common scale. The normalized scores are then averaged to arrive at a final RPI.
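A minimal sketch of that aggregation step is shown below (in Python, with invented scores and an assumed benchmark-based normalisation; the project's exact normalisation procedure is not reproduced here):

    # Raw bibliometric scores (e.g. combined g- and h-index values) per faculty
    # for one hypothetical university.
    faculty_scores = {
        "medicine": 420.0,
        "engineering": 250.0,
        "science": 310.0,
        "arts_social_sciences": 90.0,
        "business_law_economics": 60.0,
    }

    # Assumed benchmark (e.g. the top score observed in each faculty worldwide),
    # used to place all faculties on a common 0-1 scale.
    benchmark = {
        "medicine": 900.0,
        "engineering": 600.0,
        "science": 750.0,
        "arts_social_sciences": 300.0,
        "business_law_economics": 250.0,
    }

    normalised = {f: faculty_scores[f] / benchmark[f] for f in faculty_scores}
    rpi = sum(normalised.values()) / len(normalised)   # equal weight for all five faculties
    print(round(rpi, 3))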
Leiden Ranking
The Centre for Science and Technology Studies at Leiden University maintains a European and worldwide ranking of the top 500 universities according to the number and impact of Web of Science-indexed publications per year. The rankings compare research institutions by taking into account differences in language, discipline and institutional size. Multiple ranking lists are released according to various bibliometric normalization and impact indicators, including the number of publications, citations per publication, and field-averaged impact per publication.
Nature Index
The Nature Index tracks the affiliations of high-quality scientific articles published in 68 science journals independently chosen by the scientific community as the journals scientists would most like to publish their best research in. Updated monthly, the Nature Index presents research outputs of approximately 9,000 parent institutions worldwide, with a page of output statistics for each institution along with information on the institutions it collaborates with in the publication of Index articles. Each of the approximately 60,000 articles in the Index has a dedicated article page with social and mainstream media coverage tracked by Altmetric. League tables of institutional output can be generated on the fly on a global, regional or country basis and by broad subject area, using either article count or fractional article count. Compared with other metrics of science (e.g., the impact factor or h-index), the Nature Index is distinctive as a journal-based ranking focused on original research in the natural and life sciences. Tracing the flow of scientific knowledge into economic and social benefit is a growing priority for governments and research funding agencies. Nature Index Innovation examines the connection between high-quality research and the commercialization of new products and services. In particular, it highlights the use of references to academic literature in patents to show concrete links between discovery and its economic potential.
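The distinction between article count and fractional article count can be illustrated with a small sketch (Python, with made-up author affiliations): article count credits an institution once per article in which it appears, while fractional count splits each article's credit of 1 equally across its authors and sums each institution's shares.

    from collections import Counter

    # Made-up affiliation lists for three articles (one entry per author).
    articles = [
        ["Uni X", "Uni X", "Uni Y"],           # three authors, two at Uni X
        ["Uni Y"],                             # single author
        ["Uni X", "Uni Z", "Uni Z", "Uni Z"],  # four authors
    ]

    ac = Counter()   # article count
    fc = Counter()   # fractional count

    for authors in articles:
        for inst in set(authors):
            ac[inst] += 1                      # 1 credit per article per institution
        share = 1 / len(authors)
        for inst in authors:
            fc[inst] += share                  # equal share of 1 per author

    print(dict(ac))                                 # {'Uni X': 2, 'Uni Y': 2, 'Uni Z': 1}
    print({k: round(v, 2) for k, v in fc.items()})  # Uni X 0.92, Uni Y 1.33, Uni Z 0.75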
Newsweek
In August 2006, the American magazine Newsweek published a ranking of the Top 100 Global Universities, using selected criteria from ARWU and the Times Higher Education-QS rankings, with the additional criterion of the number of volumes in the library. It formed part of a special issue including an article from Tony Blair, then prime minister of the UK, and considered openness and diversity as well as distinction in research, but was not repeated in that form. Since Newsweek's merger with The Daily Beast, the ranking has been continued using data from the Times Higher Education World Rankings, the Webometrics world college rankings from the public research body Consejo Superior de Investigaciones Científicas in Spain, and the ShanghaiRanking Consultancy.
Professional Ranking of World Universities
In contrast to academic rankings, the Professional Ranking of World Universities established in 2007 by the École nationale supérieure des mines de Paris measures the efficiency of each university at producing leading business professionals. Its main compilation criterion is the number of Chief Executive Officers (or equivalent) among the Fortune Global 500. This ranking has been criticized for placing five French universities into the top 20.
QS World University Rankings
The QS World University Rankings are a ranking of the world's top universities produced by Quacquarelli Symonds and published annually since 2004. Along with the Academic Ranking of World Universities and the THE World University Rankings, the QS World University Rankings are widely recognized and cited as one of the three main world university rankings. According to Alexa data, they are the world's most-viewed global university rankings. In 2016 they ranked 916 universities, with the Massachusetts Institute of Technology, Stanford University, and Harvard University on top. This represented the first time since the inaugural rankings of 2004 that all three top positions were held by US institutions.
The QS rankings should not be confused with the Times Higher Education World University Rankings. From 2004 to 2009 the QS rankings were published in collaboration with Times Higher Education and were known as the Times Higher Education-QS World University Rankings. In 2010 QS assumed sole publication of rankings produced with this methodology when Times Higher Education split from QS in order to create a new rankings methodology in partnership with Thomson Reuters. The QS rankings are published in the United States by U.S. News & World Report as the "World's Best Universities." However, in 2014, U.S. News & World Report launched its own international university ranking titled "Best Global Universities", with the inaugural edition published in October 2014.
The QS rankings use peer review data collected (in 2016) from 74,651 scholars and academics and 37,781 recruiters. These two indicators are worth 40 per cent and 10 per cent of a university's possible score respectively. The QS rankings also incorporate citation per faculty member data from Scopus, faculty/student ratios, and international staff and student numbers. The citations and faculty/student measures are worth 20 per cent of an institution's total possible score and the international staff and student data five per cent each. QS has published online material about its methodology.
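The weighting scheme implies a simple composite: each indicator is normalised to a common scale and multiplied by its weight. The sketch below (Python, with invented indicator scores on a 0-100 scale; QS's own normalisation steps are omitted) shows how such a weighted total could be assembled:

    weights = {
        "academic_reputation": 0.40,
        "employer_reputation": 0.10,
        "citations_per_faculty": 0.20,
        "faculty_student_ratio": 0.20,
        "international_faculty": 0.05,
        "international_students": 0.05,
    }

    indicator_scores = {   # hypothetical normalised scores for one institution
        "academic_reputation": 92.0,
        "employer_reputation": 88.0,
        "citations_per_faculty": 75.0,
        "faculty_student_ratio": 81.0,
        "international_faculty": 64.0,
        "international_students": 70.0,
    }

    overall = sum(weights[k] * indicator_scores[k] for k in weights)
    print(round(overall, 1))   # weighted total out of 100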
QS published the 2016 QS World University Rankings online on 5 September 2016. The rankings also appear in book form, and via media partners including The Guardian, US News & World Report and The Chosun Ilbo.
QS has added to its main World University Rankings, starting in 2009 with the Asian University Rankings. The QS Latin American University Rankings and the QS World University Rankings by Subject were published for the first time in 2011, as were a worldwide faculty ranking, the Top 50 Under 50 and Next 50 Under 50 rankings and a graduate employment ranking. QS now also publishes regional rankings for the Arab Region, Emerging Europe and Central Asia, and the five BRICS nations.
The subject rankings are intended to address the most frequent criticism of all world university ranking systems: that they contain too little material about specific subjects, something potential applicants are keen to see. These rankings have been drawn up on the basis of citations, academic peer review and recruiter review, with the weightings for each dependent upon the culture and practice of the subject concerned. They are published in five clusters: engineering; biomedicine; the natural sciences; the social sciences; and the arts and humanities, and covered 42 subjects in 2016.
QS Asian University Rankings
In 2009, Quacquarelli Symonds (QS) launched the QS Asian University Rankings in partnership with The Chosun Ilbo newspaper in Korea. They rank the top 350 Asian universities and have now appeared eight times. Each edition is an independent list, different from that of the QS World University Rankings. For three consecutive years up to the 2016/17 edition, the rankings were topped by the National University of Singapore.
These rankings use some of the same criteria as the World University Rankings, but they also use other measures, such as the numbers of incoming and outgoing exchange students. Because the criteria and their weightings differ, the QS World University Rankings and the QS Asian University Rankings released in the same academic year differ from one another. QS also publishes subject rankings of universities in particular countries, such as the Statistics & Operational Research ranking in China, which are of particular reference value for international students.
QS Latin American University Rankings
The QS Latin American University Rankings were launched in 2011. They use academic opinion (30 per cent), employer opinion (20 per cent), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures. These criteria were developed in consultation with experts in Latin America, and the web visibility data comes from Webometrics. The 2016/17 edition ranked the top 300 universities in the region and showed that the University of São Paulo in Brazil is the region's top institution.
Reuters World's Top 100 Innovative Universities
The ranking is empirical, employing a methodology based on 10 different metrics. The criteria focus on academic papers, which indicate basic research performed at a university, and patent filings, which point to an institution's interest in protecting and commercializing its discoveries. Compiled by the Intellectual Property & Science business of Thomson Reuters, the list uses proprietary data and analysis tools. The process began by identifying the 500 academic and government organizations that published the greatest number of articles in scholarly journals, as indexed in the Thomson Reuters Web of Science Core Collection database. The list was cross-referenced against the number of patents filed by each organization during the same time period in the Derwent World Patents Index and the Derwent Innovations Index. Patent equivalents, citing patents and citing articles were included. The timeframe allows the articles and patent activity to receive citations, thereby contributing to that portion of the methodology. The list was reduced to just those institutions that filed 70 or more patents, the bulk of which were universities. Each candidate university was then evaluated using various indicators, including how often a university's patent applications were granted; how many patents were filed with global patent offices and local authorities; and how often the university's patents were cited by others. Universities were also evaluated in terms of how often their research papers were cited by patents and the percentage of articles that featured a co-author from industry. The ranking has an Asia-Pacific edition featuring the top 75 institutions across the region, as well as a list of the top 25 most innovative governmental institutions in the world.
Round University Ranking
Round University Ranking, abbreviated RUR, is a world university ranking that assesses the effectiveness of 750 leading universities in the world based on 20 indicators distributed among four key dimension areas: teaching, research, international diversity and financial sustainability. The ranking has international coverage and is intended to be a tool of choice for the key stakeholders of higher education: applicants, students, representatives of the academic community and university management. The publisher is the independent RUR Rankings Agency, located in Moscow, Russia. RUR aims to provide a transparent, comprehensive analytical system for benchmarking and evaluating universities across borders for the widest possible audience: students, analysts and decision-makers in higher education development, at both the institutional and the national level.
SCImago Institutions Rankings
The SCImago Institutions Rankings (SIR) has published its international ranking of worldwide research institutions, the SIR World Report, since 2009. The SIR World Report is the work of the SCImago Research Group, a Spain-based research organization consisting of members from the Spanish National Research Council (CSIC), the University of Granada, Charles III University of Madrid, the University of Alcalá, the University of Extremadura and other education institutions in Spain.
The ranking measures areas such as: research output, international collaboration, normalized impact and publication rate.
Times Higher Education World University Rankings
From 2004 to 2009 Times Higher Education (THE), a British publication, published the annual Times Higher Education-QS World University Rankings in association with Quacquarelli Symonds (QS). THE published a table of the top 200 universities and QS ranked approximately 500 online, in book form, and via media partners. On 30 October 2009, THE broke with QS and joined Thomson Reuters to provide a new set of world university rankings, called the Times Higher Education World University Rankings. The 2015/16 edition of the Times Higher Education World University Rankings ranked the world's 800 best universities, while the 2016/17 instalment ranks the world's top 980.
On 3 June 2010, Times Higher Education revealed the methodology which it proposed to use when compiling the new world university rankings. The new methodology included 13 separate performance indicators, an increase from the six measures employed between 2004 and 2009. After further consultation the criteria were grouped under five broad overall indicators to produce the final ranking. THE published its first rankings using its new methodology on 16 September 2010, a month earlier than previous years. THE has also launched the THE 100 Under 50 ranking and the Alma Mater Index.
The Globe and Mail in 2010 described the Times Higher Education World University Rankings as "arguably the most influential." Research published by professors at the University of Michigan in 2011 demonstrated that the early THES rankings were disproportionately influential in establishing the status order of world research universities.
Times Higher Education World Reputation Rankings
This ranking was published for the first time in March 2011. The 2016 rankings are based on a survey of 10,323 academics from 133 countries. They rank Harvard University as possessing the world's most powerful university brand, followed by Massachusetts Institute of Technology and Stanford University. The survey was conducted in eight languages by Ipsos Media CT for Times Higher Education's ranking-data partner Thomson Reuters, and asked experienced academics to highlight what they believed to be the strongest universities for teaching and research in their own fields. The top six universities in the ranking for 2014--Harvard, MIT, Stanford, Cambridge, Oxford, UC Berkeley--were found to be "head and shoulders above the rest", and were touted as a group of globally recognised "super brands."
U-Multirank
U-Multirank, a European Commission supported feasibility study, was undertaken to contribute to the European Commission objective of enhancing transparency about the different missions and the performance of higher education institutions and research institutes. At a press conference in Brussels on 13 May 2011, the U-Multirank was officially launched by Androulla Vassiliou, Commissioner for Higher Education and Culture saying: U-Multirank "will be useful to each participating higher education institution, as a planning and self-mapping exercise. By providing students with clearer information to guide their study choices, this is a fresh tool for more quality, relevance and transparency in European higher education." U-Multirank breaks new ground by producing multi-dimensional listings rating universities on a much wider range of factors than existing international rankings. The idea is to avoid simplistic league tables which can result in misleading comparisons between institutions of very different types or mask significant differences in quality between courses at the same university.
U-Multirank assesses the overall performance of universities but also ranks them in selected academic fields: in 2014 the fields are business studies, electrical engineering, mechanical engineering and physics; in 2015, psychology, computer science and medicine will be added. The universities are tested against up to 30 separate indicators and rated in five performance groups, from 'A' (very good) through 'E' (weak). The results show that while over 95% of institutions achieve an 'A' score on at least one measure, only 12% have more than 10 top scores. Of the 850 universities in the ranking, 62% are from Europe, 17% from North America, 14% from Asia and 7% from Oceania, Latin America and Africa. U-Multirank received EUR 2 million in EU funding from the former Lifelong Learning Programme (now Erasmus) for the years 2013-2015, with the possibility of a further two years of funding in 2015-2017. The goal is for an independent organisation to manage the ranking on a sustainable business model thereafter.
UniRanks "The Ranking of Rankings"
The UniRanks World University Ranking aggregates the results of five global rankings, combining them to form a single rank. It uses the following rankings and weights: THE World University Ranking 22.5%, QS World University Ranking 22.5%, US News Best Global University 22.5%, ARWU 22.5% and Reuters World Top 100 Innovative Universities 10%. The first edition of UniRanks was launched in 2017.
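The exact combination formula is not described here; a minimal sketch, assuming the aggregate is simply a weighted average of an institution's position in each source ranking, might look like this (Python, with hypothetical positions):

    weights = {"THE": 0.225, "QS": 0.225, "USNews": 0.225, "ARWU": 0.225, "Reuters": 0.10}

    positions = {   # hypothetical positions of three universities in the five rankings
        "Univ Alpha": {"THE": 3, "QS": 5, "USNews": 2, "ARWU": 4, "Reuters": 10},
        "Univ Beta":  {"THE": 8, "QS": 4, "USNews": 9, "ARWU": 7, "Reuters": 3},
        "Univ Gamma": {"THE": 1, "QS": 2, "USNews": 5, "ARWU": 1, "Reuters": 20},
    }

    combined = {
        name: sum(weights[src] * rank for src, rank in ranks.items())
        for name, ranks in positions.items()
    }
    for name, score in sorted(combined.items(), key=lambda kv: kv[1]):
        print(name, round(score, 2))   # lower combined score = better aggregate position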
University Ranking by Academic Performance
The University Ranking by Academic Performance, abbreviated as URAP, was developed in the Informatics Institute of Middle East Technical University. Since 2010, it has published annual national and global rankings for the top 2,000 institutions. The scientometric measurements of URAP are based on data obtained from the Institute for Scientific Information via Web of Science and InCites. For global rankings, URAP employs indicators of research performance including the number of articles, citations, total documents, total article impact, total citation impact, and international collaboration. In addition to global rankings, URAP publishes regional rankings for universities in Turkey using additional indicators, such as the numbers of students and faculty members, obtained from the Center of Measuring, Selection and Placement (ÖSYM).
U.S. News & World Report's Best Global Universities Rankings
The U.S. News & World Report's inaugural Best Global Universities rankings were launched on 28 October 2014. They are based on data and metrics provided by Thomson Reuters, and are thus methodologically different from the criteria traditionally used by U.S. News to rank American institutions. Universities are judged on factors such as global research reputation, publications and number of highly cited papers. U.S. News also publishes region-specific and subject-specific global rankings based on this methodology.
The annual U.S. News Best Global Universities rankings were produced to provide insight into how universities compare globally. As an increasing number of students are planning to enroll in universities outside of their own country, the Best Global Universities rankings - which focus specifically on schools' academic research and reputation overall and not on their separate undergraduate or graduate programs - can help those students accurately compare institutions around the world.
The Best Global Universities rankings also provide insight into how U.S. universities - which U.S. News has been ranking separately for more than 30 years - stand globally. All universities can now benchmark themselves against schools in their own country and region, become more visible on the world stage and find top schools in other countries to consider collaborating with.
The overall Best Global Universities rankings encompass the top 750 institutions spread out across 57 countries, up from the top 500 universities in 49 countries ranked the previous year. The first step in producing these rankings, which are powered by the Thomson Reuters InCites research analytics solution, involved creating a pool of 1,000 universities that was used to rank the top 750 schools. In comparison with the US News National University Ranking, the Best Global Universities ranking focuses on research power and faculty resources for students, while the national ranking focuses only on undergraduate studies. For graduate studies and international students, the Best Global Universities ranking is therefore often a better reference than the National University Ranking.
Inside Higher Ed noted that the U.S. News is entering into the international college and university rankings area that is already "dominated by three major global university rankings": the Times Higher Education World University Rankings, the Academic Ranking of World Universities, and the QS World University Rankings. U.S. News's chief data strategist Robert Morse stated "We're well-known in the field for doing academic rankings so we thought it was a natural extension of the other rankings that we're doing."
Morse described U.S. News as "the first American publisher to enter the global rankings space", given that Times Higher Education and QS are both British, while the Academic Ranking of World Universities is Chinese.
Webometrics
The Webometrics Ranking of World Universities is produced by Cybermetrics Lab (CCHS), a unit of the Spanish National Research Council (CSIC), the main public research body in Spain. It offers information about more than 12,000 universities according to their web presence (an assessment of the scholarly contents, visibility and impact of universities on the web). The ranking is updated every January and July.
The Webometrics Ranking or Ranking Web is built from a database of over 20,000 higher education institutions. The top 12,000 universities are shown in the main ranking and more are covered in regional lists.
The ranking started in 2004 and is based on a composite indicator that includes both the volume of the Web contents and the visibility and impact of web publications according to the number of external links they received. A wide range of scientific activities appears exclusively on academic websites and is typically overlooked by bibliometric indicators.
Webometric indicators measure institutional commitment to Web publication. Webometric results show a high correlation with other rankings. However, North American universities are relatively common in the top 200, while small and medium-size biomedical institutions and German, French, Italian and Japanese universities were less common in the top ranks. Possible reasons include publishing via independent research councils (CNRS, Max Planck, CNR) or the large amount of non-English web contents, which are less likely to be linked.
Wuhan University
The Research Center for Chinese Science Evaluation at Wuhan University ranking is based on Essential Science Indicators (ESI), which provides data on journal article publication counts and citation frequencies in over 11,000 journals around the world in 22 research fields.
Regional and national rankings
Regional and national rankings are carried out in Africa, Asia, Europe, North America, South America and Oceania.
Asia
QS's Asian University Rankings use some of the same data as the QS World University Rankings alongside other material, such as the number of exchange students attending or traveling from each university. The rankings list the top 350 universities in Asia.
China
University rankings in China are ordered by different standards and made by various organizations, including:
- Wu Shulian, published in the name of the Chinese Academy of Management Science
- Netbig, the higher education internet information company
- CUAA, by Airuishen (a company) in the name of Chinese Universities Alumni Association, etc.
India
The National Institutional Ranking Framework was initiated by the Ministry of Human Resource Development of the Government of India to rank all institutions of higher education in India. Magazines such as Youth Incorporated, India Today, Outlook, Mint, The Week, Dataquest, Careers360 and Electronics For You conduct annual rankings for the major disciplines.
Japan
Most of the ranking systems in Japan rank universities by the difficulty of their entrance exams, expressed as a deviation score called "hensachi". One example of such a ranking is Going broke universities - Disappearing universities by Kiyoshi Shimano. Organizations that use other methods of ranking universities in Japan include Nikkei Business Publications, which releases its brand rankings of Japanese universities every November, and Toyo Keizai, which publishes the "Truly Strong Universities" ranking once a year. The leading Japanese prep school Kawaijuku also released Japan's Top 30 University Rankings in Natural Sciences and Technology for MEXT's GLOBAL 30 Project in 2001.
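Hensachi is a deviation score; assuming the standard convention of a mean of 50 and a standard deviation of 10, it can be computed as in the sketch below (Python, with invented exam scores):

    import statistics

    def hensachi(score, all_scores):
        """Deviation score: 50 plus 10 standard deviations above or below the mean."""
        mean = statistics.mean(all_scores)
        sd = statistics.pstdev(all_scores)
        return 50 + 10 * (score - mean) / sd

    cohort = [42, 55, 61, 48, 70, 66, 53, 59]   # mock entrance-exam scores
    print(round(hensachi(70, cohort), 1))       # a score well above the mean gives hensachi > 60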
Pakistan
Pakistan's Higher Education Commission annually ranks domestic universities.
Philippines
Academic rankings in the Philippines are conducted by the Professional Regulation Commission and the Commission on Higher Education, based on the average passing rates in board examinations.
South Korea
Korean Council for University Education, established in 2009, evaluates universities in South Korea.
Europe
European Union
The European Commission compiled a list of the 22 universities in the EU with the highest scientific impact. This ranking was compiled as part of the Third European Report on Science & Technology Indicators, prepared by the Directorate General for Science and Research of the European Commission in 2003 (updated 2004). It only explicitly considers the European Union's top institutions, but comparisons with the rest of the world are provided in the full report. The report says, "University College London comes out on top in both publications (the number of scientific publications produced by the university) and citations (the number of times those scientific publications are cited by other researchers)"; however, the table lists the top-scoring university as "Univ London", implying that the authors counted the scientific output of the entire University of London rather than its constituent colleges.
In this ranking, the EU's top two universities are Cambridge and Oxford, as in the Jiao Tong and Times rankings. This ranking stresses the scientific quality of the institution, as opposed to its size or perceived prestige. Thus smaller, technical universities, such as Eindhoven (Netherlands) and the Technical University of Munich (Germany), are ranked third and fourth, behind Cambridge, and followed by the University of Edinburgh. The report does not provide a direct comparison between EU universities and those in the rest of the world, although it does compute a scientific impact score, which is measured against the world average.
In December 2008, the European Commission published a call for tenders, inviting bidders to design and test a new multi-dimensional university ranking system with global outreach. The first results of the envisaged pilot project were promised for the first half of 2011.
Another approach to classifying the European research area is offered by the 'European Research Ranking'. This ranking is based on publicly available data from the European Commission's project and funding database CORDIS to estimate the funding and networking performance of European research institutions.
Austria
Some Austrian universities, including all Austrian Universities of Applied Sciences, take part in the CHE University Ranking.
Bulgaria
The Bulgarian University Ranking System, maintained by the Bulgarian Ministry of Education, compares academic programs in accredited domestic higher education institutions. The system ranks programs based on more than 50 indicators, such as teaching and learning conditions, scientific research, career development opportunities, prestige, and material resources.
Denmark
In Denmark, the think-tank CEPOS conducts an annual survey and ranking of higher education at study program level and institution level, based on entry salary, career development, drop-out rates, and program completion rates.
France
Eduniversal provides rankings of undergraduate and graduate degrees of French universities in some areas.
Le Nouvel Observateur occasionally offers rankings of "Grandes écoles" and their preparatory schools, the "Prépas", and of universities' undergraduate degrees in some areas.
Germany
Since 2007, the CHE "ExcellenceRanking" has been published by the Center for Higher Education Development in Germany. The ranking includes the sciences of biology, chemistry, mathematics and physics as well as psychology, political science and economics. The ranking is designed to support the search for masters or doctoral programmes. The CHE also wants to highlight the research strengths of European universities and provide them with ideas for improvement. The ranking is published by the German weekly newspaper Die Zeit in English and German. The CHE Center for Higher Education Development gathers the data for this ranking. An English version is provided by the DAAD.
The CHE also publishes a "ResearchRanking" showing the research strengths of German universities. The CHE ResearchRanking is based on the research-related data of the UniversityRanking.
Ireland
The Sunday Times ranks Irish universities based on a mix of criteria, including secondary school examination scores, graduation rates, staff-student ratio, research efficiency, accommodation, nontraditional students, athletics and sports facilities.
Italy
Every year, the newspaper La Repubblica, in collaboration with CENSIS, compiles a ranking of Italian universities.
Macedonia
The Academic Ranking of World Universities (ARWU) compiled a ranking of Macedonian Higher Education Institutions (HEIs) commissioned by the country's Ministry of Education and Science in February 2011 and released it on 16 February 2012. Nineteen qualified HEIs were included in the ranking. The ranking used 19 indicators of academic performance and competitiveness, covering major mission aspects of HEIs such as teaching, research and social service. It is the first university ranking in Macedonia.
Netherlands
Most Dutch universities take part in the CHE UniversityRanking.
Poland
A popular ranking of Polish higher education institutions is annually published by education magazine Perspektywy.
Romania
The Ad Astra association of Romanian scientists ranked Romanian universities in 2006 and 2007.
Russian Federation
Several bodies rank Russian universities, including RIA Novosti / Forbes, independent rating agency RatER, Interfax (in cooperation with Ekho Moskvy) and the Russian journal Finance.
RIA Novosti / Forbes rankings are conducted under the supervision of the Public Chamber of Russia in cooperation with the State University - Higher School of Economics. This ranking is considered a comparatively objective system. It covers 476 higher education institutions and is based on the average score of the Unified State Examination required to enter a university. The ranking has separate subrankings for different subjects and clusters of universities.
RIA Novosti rankings do not align with other local and international rankings, such as the Academic Ranking of World Universities and the QS World University Rankings, which take into account reputation inherited from the Soviet Union.
RatER publishes annual rankings based on representation of university graduates in governmental, education and business elite.
Interfax annually ranks "classical" (or multi-faculty) universities and higher education institutions specialising in law. Interfax' methodology quantifies several qualitative factors such as research, teaching standards, public opinion and social and international activity.
Finance produces an integrated ranking of higher education institutions specialising in economics and finance. The Journal uses the average score of the Unified State Examination, the number of CFO graduates and the consolidated turnover of companies where graduate CFOs are employed.
Sweden
In Sweden, the Confederation of Swedish Enterprise (Svenskt Näringsliv) conducts an annual survey and ranking of higher education at study program level, based on entry salary, career development, internationalization, and degree of academic-business collaboration.
Switzerland
The swissUp Ranking ranked Swiss university and polytechnic students until 2004. The swissUp Ranking is no longer conducted. Some universities from the German-speaking part of Switzerland, such as ISFOA Lugano take part in the CHE UniversityRanking.
Ukraine
Ukraine's Ministry of Education and Science performs official yearly university evaluations. Zerkalo Nedeli newspaper published the top 200-ranked Ukrainian universities in 2007. Kyiv Student Council ranks universities on criteria of student satisfaction.
United Kingdom
There are three major rankings of universities in the United Kingdom published by commercial companies: The Times and Sunday Times Good University Guide, The Complete University Guide and The Guardian University Guide. Since 2008, Times Higher Education has compiled a 'Table of Tables' which combines the results of the three national league tables. For 2017, the top five universities were the University of Cambridge, the University of Oxford and the University of St Andrews, with Imperial College London and Durham University in joint fourth place.
The Research Excellence Framework was the successor to the Research Assessment Exercise in 2014. It is used by the UK government to evaluate the research quality of British universities and determine the distribution of future research funding. In 2014, the top five universities for research power as compiled by Research Fortnight were University of Oxford, University College London, University of Cambridge, University of Edinburgh and University of Manchester.
The Research Assessment Exercises (RAE) were the UK government's evaluation of research quality in British Universities. Each subject, called a unit of assessment, was ranked by a peer review panel. The rankings were used in the allocation of government funding. The last assessment was made in 2008. The RAE provided quality ratings for research across all disciplines. Panels used a standard scale for each submission. Ratings ranged from 1 to 5, according to the quantity of work that was judged to reach national or international levels of excellence. Participating institutions receive grants from one of the four higher education funding bodies in England, Scotland, Wales and Northern Ireland. The top three universities in the 2008 RAE exercise were London School of Economics, Cambridge University and Oxford University.
The Quality Assurance Agency for Higher Education (QAA) assesses undergraduate teaching. QAA is an independent body established by the UK's higher education institutions in 1997. QAA was under contract to the Higher Education Funding Council for England to assess quality for English universities. This replaced Teaching Quality Assessments (TQAs) which aimed to assess the administrative, policy and procedural framework within which teaching took place and did not directly assess teaching quality. This inspection-based system was replaced by a system of information provision, including a national student survey. QAA publishes scores which have been used by the league table industry. The first Teaching Excellence Framework is to be published in 2017; this is a rating system (giving gold, silver or bronze ratings to higher education providers) rather than a ranking as such.
North America
Canada
Maclean's, a Canadian news magazine, publishes an annual ranking of Canadian Universities, called the Maclean's University Rankings. Ranking criteria include student body characteristics, classes, faculty, finances, library, and reputation. The rankings are split into three categories: schools that focus on undergraduate studies with few to no graduate programs, schools that have both extensive undergraduate studies and an extensive selection of graduate programs and schools that have a professional medical program and a selection of graduate programs.
The University of Calgary produced a formal study examining the ranking methodology, illuminating the factors that determined its rank and criticizing certain aspects of the methodology. The University of Alberta, the University of Toronto and University of Manitoba have expressed displeasure over the ranking system.
A notable difference between rankings in the United States and Maclean's rankings is that Maclean's excludes privately funded universities; the majority of Canada's institutions, including the best-known, are publicly funded.
Beginning in September 2006, over 20 Canadian universities, including several of the most prestigious and largest universities such as the University of Toronto, University of British Columbia, University of Alberta, Concordia University, McMaster University and Dalhousie University, jointly refused to participate. University of Alberta president Indira Samarasekera wrote that Maclean's initially filed a "Freedom of Information" request but that it was "too late" for the universities to respond. Samarasekera further stated, "Most of [the universities] had already posted the data online, and we directed Maclean's staff to our Web sites. In instances where the magazine staff couldn't find data on our Web site, they chose to use the previous year's data."
Mexico
Estudio Comparativo de Universidades Mexicanas (ECUM)
Mexican institutions have been compared in the Estudio Comparativo de Universidades Mexicanas (ECUM) produced within the Universidad Nacional Autónoma de México (UNAM). ECUM provides data on institutional participation in articles on ISI Web of Knowledge-indexed journals; faculty participation in each of Mexico's three-level National Researchers System (SNI); graduate degrees within National Council of Science and Technology's (CONACYT) register of quality graduate programs; and number of academic research bodies (cuerpos academicos) according to the Secretariat of Public Education (SEP) program PROMEP.
ECUM provides online access to data for 2007 and 2008 through ExECUM. Institutional data can be visualized through three options:
- A selection of the 58 most prominent universities (43 public and 13 private). This selection accounts for more than 60 percent of undergraduate and graduate enrollment. It includes the public federal universities (UNAM, Instituto Politécnico Nacional, Universidad Autónoma Metropolitana, Universidad Pedagógica Nacional, Universidad del Ejército y la Fuerza Aérea, Colegio de México, Universidad Autónoma de Chapingo, Universidad Autónoma Agraria Antonio Narro); 35 public state universities (UPES); and a group of private institutions that feature within ECUM's selected classification data.
- Result tables for the top 20 institutions in each of the data labels in this study. These include some of the selected universities in addition to the rest of Mexico's higher education institutions, as well as institutes, centers and other research producing organizations.
- A personalized selection from more than 600 institutions, classified by institutional type, institutional grouping, activity sector, or alphabetically.
ExECUM allows users to establish comparison types and levels which they consider relevant. Data is presented in raw form with virtually no derived indicators. Users can relate variables and build indicators according to their own analytical perspectives.
Based on this comparative study project, ECUM's creator, the Dirección General de Evaluación Institucional, published reports providing an analysis of the data for 2007 and 2008.
United States
Council for Aid to Education
The Council for Aid to Education publishes a list of the top universities in terms of annual fundraising. Fundraising ability reflects, among other things, alumni and outside donors' views of the quality of a university, as well as the ability of that university to expend funds on top faculty and facilities. The most recent rankings put Stanford at the top, ahead of Harvard and Columbia.
The Daily Beast's Guide to the Best Colleges
The Daily Beast's college rankings take into account nine factors, with future earnings, affordability, and graduation rate weighted most heavily. The other criteria include academics, diversity, athletics, nightlife, activities, and campus quality. The rankings took into account approximately 2,000 colleges and initially reported the top 200 scoring schools; they now report the top 250, with Stanford University at the top, followed by Harvard University, Yale University, MIT, and Columbia University.
The Economist's "Best Colleges. The Value of University"
The Economist's list of America's best colleges focuses on comparable economic advantage, defined so that 'the economic value of a university is equal to the gap between how much its students subsequently earn, and how much they might have made had they studied elsewhere'. The ranking draws its criteria and 'expected earnings' from the U.S. Department of Education's College Scorecard, running the scorecard's median-earnings data through a multiple regression analysis, a common method of measuring the relationships between variables, and scoring each school by the gap between its graduates' actual and expected earnings.
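A hedged sketch of this value-added calculation is given below (Python with NumPy; the data and predictor columns are invented for illustration and are not the Scorecard's actual variables): graduates' median earnings are regressed on observable school characteristics, and each school is scored by actual minus predicted earnings.

    import numpy as np

    # Invented predictors: average SAT score, share of STEM degrees, share of Pell-grant students.
    X = np.array([
        [1400, 0.45, 0.15],
        [1250, 0.30, 0.35],
        [1100, 0.20, 0.55],
        [1320, 0.50, 0.25],
        [1180, 0.25, 0.45],
    ], dtype=float)
    median_earnings = np.array([78000, 56000, 43000, 69000, 50000], dtype=float)

    # Ordinary least squares with an intercept term (multiple regression).
    design = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(design, median_earnings, rcond=None)

    expected = design @ coef
    value_added = median_earnings - expected   # positive = graduates earn more than predicted
    print(np.round(value_added, 0))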
Forbes College rankings
In 2008, Forbes.com began publishing an annual list, prepared by the Center for College Affordability and Productivity of "America's Best Colleges".
- Student satisfaction (evaluations from RateMyProfessors.com, retention rates and targeted student satisfaction surveys on Facebook) constitutes 25% of the score.
- Post-graduate success (self-reported salaries of alumni from PayScale, alumni appearing on the CCAP's America's Leaders List) constitutes 32.5% of the score.
- Student debt loads constitute 25% of the score.
- Graduation rate (the proportion of students who complete four-year degrees in four years) constitutes 7.5% of the score.
- Academic success (the proportion of students receiving nationally competitive awards) constitutes 10% of the score.
Public reputation is not considered, which causes some colleges to score lower than in other lists. A three-year moving average is used to smooth out the scoring.
The 2016 ranking put Stanford at the top, followed by Williams, Princeton, Harvard, and MIT.
The "Objective" College rankings
In 2015, a new website began publishing what it terms The Objective College Ranking. The ranking is based on objectively measurable data about US colleges from the National Center for Education Statistics, and the weighting factors for the different college metrics are given on the site for transparency. Refreshing the webpage changes the ranking, showing how sensitive any college ranking process is to the weighting given to different factors. While the site is clearly satirical in nature, it makes a pointed argument about the ultimate subjectivity of all college ranking methods.
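That sensitivity is easy to reproduce: with the same underlying metrics, a fresh draw of random weights can reorder the list, as in the small demonstration below (Python, with invented metrics):

    import random

    metrics = {   # hypothetical normalised metrics per college (0-1 scale)
        "College A": [0.9, 0.4, 0.7],
        "College B": [0.6, 0.9, 0.5],
        "College C": [0.7, 0.6, 0.9],
    }

    def random_ranking():
        raw = [random.random() for _ in range(3)]
        weights = [w / sum(raw) for w in raw]      # random weights summing to 1
        scores = {c: sum(w * m for w, m in zip(weights, vals))
                  for c, vals in metrics.items()}
        return [c for c, _ in sorted(scores.items(), key=lambda kv: -kv[1])]

    for _ in range(3):
        print(random_ranking())   # the order can change from run to run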
Money's Best Colleges
Money magazine's college rankings take into account 21 factors which it categorizes as measures of educational quality, affordability, and alumni earnings. The rankings considered 1500 four-year colleges and reported the top ranking 736. In 2015, according to Money, the top five colleges are Stanford, Babson, MIT, Princeton, and Caltech.
The Princeton Review Dream Colleges
The Princeton Review annually asks students and parents what their dream college is, if cost and ability to get in were not factors. In 2016, for the fourth consecutive year, Stanford was the top "dream school" for both students and parents. Second and third places, in 2016, were taken by Harvard and New York University among students, and Harvard and Princeton among parents.
Revealed preference rankings
Avery et al. pioneered the use of choice modelling to rank colleges. Their methodology used a statistical analysis of the decisions of 3,240 students who applied to college in 1999. MyChances.net, now called Parchment, adopted a similar approach starting in 2009, stating that its method is based on this work. The analysis considers students admitted to more than one college: the college they attend becomes the winner of each pairwise matchup and the other colleges become the losers. An Elo rating system is used to assign points based on each win or loss, and the colleges are ranked by their Elo points. A useful consequence of using Elo points is that they can be used to estimate the frequency with which students, when admitted to two schools, will choose one over the other. The most recent preference ranking placed Stanford at the top, followed by MIT, Harvard, and Princeton.
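A minimal sketch of the Elo mechanics described above is given below (Python; the K-factor and starting rating are conventional choices, not parameters from the study): each matriculation decision is treated as a head-to-head win for the chosen college, and the rating gap implies a predicted choice frequency.

    def expected_win(r_winner, r_loser):
        # Probability that the higher-rated side is chosen, under the Elo model.
        return 1 / (1 + 10 ** ((r_loser - r_winner) / 400))

    def update(ratings, winner, loser, k=32):
        exp = expected_win(ratings[winner], ratings[loser])
        ratings[winner] += k * (1 - exp)
        ratings[loser] -= k * (1 - exp)

    ratings = {"College A": 1500.0, "College B": 1500.0, "College C": 1500.0}

    # Hypothetical decisions: (college the student chose, college turned down).
    decisions = [("College A", "College B"), ("College A", "College C"),
                 ("College B", "College C"), ("College A", "College B")]

    for winner, loser in decisions:
        update(ratings, winner, loser)

    print({c: round(r) for c, r in sorted(ratings.items(), key=lambda kv: -kv[1])})
    # The final rating gap implies how often A would be chosen over C:
    print(round(expected_win(ratings["College A"], ratings["College C"]), 2))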
Social Mobility Index (SMI) rankings
The SMI rankings are a collaborative publication from CollegeNet and PayScale. The rankings aim to provide a measure of the extent to which colleges provide upward economic mobility to those that attend. The rankings were created in response to the finding in Science magazine which showed that among developed nations, the United States now provides the least economic opportunity and mobility for its citizens. The rankings were also created to combat the rising costs of tuition, much of which is attributed to the efforts of some colleges to increase their own fame and wealth in ways that increase their rank in media periodicals that put an emphasis on such measures. According to the SMI, the top five colleges are Montana Tech, Rowan University, Florida A&M, Cal Poly Pomona, and Cal State Northridge.
U.S. News & World Report college and university rankings
The magazine U.S. News & World Report has compiled college and university rankings since 1983, publishing them every year since except 1984. The ranking order has been shown to have a strong effect: a one-rank improvement leads to a 0.9% increase in the number of applicants.
The US News rankings are based upon data that it collects from each educational institution, either from an annual survey or from the school's website. The annual survey has attracted significant controversy, including a letter from the Annapolis Group requesting that school presidents not participate; according to reports, "a majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future." There have also been reports of universities misreporting survey data simply to gain an advantage in the rankings.
Also considered in the rankings formula are opinion surveys of university faculty and administrators outside the school.
U.S. News & World Report puts the colleges in four separate categories based on whether they offer master's degrees, doctoral degrees, or only bachelor's degrees, and the extent to which these respective degree types are offered. In their Regional Colleges category their top colleges are: US Coast Guard Academy (North), Asbury University (South), Taylor University (Midwest), and Carroll College (West). In their Regional Universities category their top colleges are: Villanova University (North), Elon (South), Creighton (Midwest), and Trinity University (West). In their Liberal Arts Colleges category their top colleges are: Williams, Amherst, Swarthmore, and Wellesley, with Bowdoin and Pomona tied for fifth. In their National Universities category their top colleges are: Princeton, Harvard, the University of Chicago and Yale (tied for third), and Columbia and Stanford (tied for fifth).
United States National Research Council Rankings
The National Research Council ranks the doctoral research programmes of US universities, most recently in 1995. Data collection for an updated ranking began in 2006.
Faculty Scholarly Productivity rankings
The Faculty Scholarly Productivity Index by Academic Analytics ranks 354 institutions based on faculty publications, citations, research grants and awards.
The Top American Research Universities
The Center for Measuring University Performance has ranked American research universities in the Top American Research Universities since 2000. The methodology is based on data such as research publications, citations, recognitions and funding, as well as undergraduate quality such as SAT scores. The information used can be found in publicly accessible materials, reducing possibilities for manipulation. The methodology is generally consistent from year to year and changes are explained in the publication along with references from other studies.
Washington Monthly College rankings
The Washington Monthly's "College Rankings", last published in 2011, began as a research report in 2005. Related rankings appeared in the September 2006 issue. It offers American university and college rankings based upon how well each enhances social mobility, fosters scientific and humanistic research and promotes an ethic of service. Washington Monthly puts the colleges in four separate categories based on whether they offer master's degrees, doctoral degrees, or only bachelor's degrees, and the extent to which these respective degree types are offered. In their Baccalaureate College category their top five are: Elizabeth City State University, Tuskegee University, Bethel College-North Newton, Wheeling Jesuit University, and Messiah College. In their Liberal Arts Colleges category their top five are: Bryn Mawr, Carleton College, Berea College, Swarthmore College, and Harvey Mudd. In their Master's Universities category their top five are: Creighton, Truman State, Valparaiso, Trinity University, and SUNY Geneseo. In their National Universities category their top five are: UC San Diego, UC Riverside, UC Berkeley, Texas A&M, and UCLA.
TrendTopper MediaBuzz College Guide
TrendTopper MediaBuzz College Guide is an American college guide based on what it calls "Internet brand equity", derived from data collected from the Internet and global media sources. It ranks the top 300 United States colleges and universities, and includes specialty and for-profit schools in fields such as art, business, design, music, and online education. The TrendTopper MediaBuzz College Rankings are produced twice a year by the Global Language Monitor of Austin, Texas.
Time Magazine described internet brand equity as "a measure of who's talking about you online, based on Internet data, social media, blogs and the top 75,000 print and electronic media outlets."
GLM ranks the schools "according to their online presence -- or internet brand equity ... By focusing on online presence, the Monitor hopes to avoid the biases that characterize other rankings, which commonly rely on the opinions of university officials and college counselors rather than that of the greater public." GLM believes the rankings provide an up-to-date perspective on which schools have the most popular brand, gauging the relative value of the various institutions and how that value changes over time.
American Council of Trustees and Alumni
In 2009, the American Council of Trustees and Alumni (ACTA) began grading each college or university on the strength of its general education requirements. ACTA's annual report, What Will They Learn?, assigns each institution a letter grade (A through F) based on how many of seven subjects (composition, mathematics, foreign language, science, economics, literature, and American government or history) it requires. The 2011-2012 edition graded 1,007 institutions and awarded nineteen schools an "A" for requiring more than five of the subjects. The 2012-2013 evaluation awarded twenty-one "A" grades among 1,070 colleges and universities. ACTA's rating system has been endorsed by Mel Elfin, founding editor of U.S. News & World Report's rankings. New York Times higher education blogger Stanley Fish agreed that a university ought to have a strong core curriculum, but disagreed with the inclusion of some ACTA subjects in that core.
Niche College Rankings
Niche College Rankings is an American college ranking site that combines statistical analysis of college data with student reviews. Niche also publishes A-F grades for K-12 schools and for neighborhoods and districts. Its rankings are updated annually and have expanded over time to include college rankings by major, as well as rankings and graded report cards for community colleges and trade schools. Niche's grading system applies a Bayesian method. As of 2017, Niche provided several rankings in each of the categories "Best Colleges," "Best by Major," "Best by State," "Admissions," "Campus Life," "Student," and "Academics." Niche collects more than 100 million college reviews and survey responses, as well as comprehensive data from sources such as the U.S. Department of Education, including the College Scorecard data introduced in 2015 by the Obama administration.
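Niche does not publish its exact formula, but a common Bayesian technique for review-based grades is a shrinkage average that pulls schools with few reviews toward the site-wide mean; the sketch below, with entirely hypothetical numbers, shows the general idea.

```python
def bayesian_average(ratings: list[float], prior_mean: float, prior_weight: float) -> float:
    """Shrink a school's mean review score toward the overall prior mean.

    Schools with few reviews stay close to the prior; schools with many
    reviews are dominated by their own data. Hypothetical illustration only.
    """
    n = len(ratings)
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + n)

site_mean = 3.6     # hypothetical average rating across all schools
confidence = 25     # hypothetical prior weight (pseudo-review count)

few_reviews = [5.0, 5.0, 4.5]    # 3 glowing reviews
many_reviews = [4.2] * 200       # 200 solid reviews

print(round(bayesian_average(few_reviews, site_mean, confidence), 2))   # pulled toward 3.6
print(round(bayesian_average(many_reviews, site_mean, confidence), 2))  # close to 4.2
```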
Other
Other organizations that rank US institutions include the Fiske Guide to Colleges and College Prowler. The Fiske Guide to Colleges provides ratings for individual criteria, letting students weigh the factors that matter to them. Many specialized rankings are available in guidebooks, covering individual student interests, fields of study, geographical location, financial aid, and affordability.
Among the rankings dealing with individual fields of study is the Philosophical Gourmet Report, or "Leiter Report", a ranking of philosophy departments. The PGR was described by David L. Kirp in a 2003 New York Times op-ed as "the bible for prospective [philosophy] graduate students." George Yancy, in Reframing the Practice of Philosophy: Bodies of Color, Bodies of Knowledge (SUNY Press, 2012), opined that the Philosophical Gourmet Report ranking "is, of course, very controversial. However, as is often pointed out, there is no real alternative." Carlin Romano, in America the Philosophical (Knopf Doubleday Publishing Group, 2013), referred to the PGR rankings as "often-criticized" and "biased towards mainstream analytic departments". The report has attracted criticism from different viewpoints; notably, practitioners of continental philosophy, who perceive it as unfair to their field, have compiled alternative rankings.
The Gourman Report, last published in 1996, ranked the quality of undergraduate majors and graduate programs.
Gallup polls ask American adults, "All in all, what would you say is the best college or university in the United States?"
The Princeton Review annually publishes a book of best colleges; in 2011, this was titled The Best 373 Colleges. Phi Beta Kappa has also sought to establish chapters at the best schools, lately numbering 280.
In terms of collegiate sports programs, the annual NACDA Directors' Cup provides a measure of all-around collegiate athletic team achievement. Stanford has won the Division I Directors' Cup for nineteen years in a row, and is poised to clinch its twentieth when the 2014 season ends.
Oceania
Australia
The Good Universities Guide and Excellence in Research for Australia annually rank domestic universities.
South America
QS University Rankings: Latin America
QS Quacquarelli Symonds, in addition to their QS World University Rankings, publish an annual ranking of the top 300 universities in Latin America. The eighth instalment, released for the 2016/17 academic year, places the Universidade de São Paulo as the region's best university.
Argentina
In Argentina, the National Commission for University Evaluation and Accreditation ranks higher education programs through evaluation and accreditation.
Brazil
The most recent ranking, the Ranking Universitário Folha (RUF), was created by the newspaper Folha de S.Paulo and is published on its website (in Portuguese). The ranking is based on a combination of four indicators: education quality, research quality, market assessment, and innovation.
Chile
In Chile, the Comisión Nacional de Acreditación (National Accreditation Commission) manages evaluation and accreditation, and also ranks universities according to their accreditation levels. Other commercial rankings are produced by magazines, including Qué Pasa and América Economía. Qué Pasa's ranking evaluates perception and quality based on surveys of approximately 1,000 employers across the country. América Economía's ranking considers quality of students, quality of teachers, ratings of professors by students, research productivity, internationalization, integration with the community, student life quality, and inclusion of students from lower social strata.
Criticism
Critics argue that rankings can divert universities' attention away from teaching and social responsibility towards the type of scientific research valued by ranking indicators. There are also concerns that, by applying a limited set of criteria to world universities, and given the strong desire to feature in the top 200, rankings encourage the homogenization of higher education institutions, making them less responsive and less relevant to their immediate contexts. Rankings are also said to entrench the advantage enjoyed by the 200 best-ranked institutions, which has important implications for equity.
See also
- MBA Programme rankings
- Eduniversal
External links
- EUA Report on University Rankings 2013
- Interactive maps comparing the ARWU, Times Higher Education and QS World University Rankings