International Student Revenue and International Rankings Success: A Case Study of Australian University Research Strategies

In: Youth and Globalization
Author: Salvatore Babones, Associate Professor, Department of Sociology, University of Sydney, Sydney, Australia (salvatore.babones@sydney.edu.au)

Open Access

Abstract

The publication of the first major international university rankings in 2003 opened a new era in the internationalization of higher education, with universities around the world reorienting their research efforts toward securing success in international (as opposed to domestic) ranking systems. This has generated a potential principal-agent problem, as nationally-funded universities increasingly seek international validation, raising the possibility that a focus on rankings success might lead universities to implement policies that undermine their educational missions. This paper examines Australian universities’ pursuit of international rankings over the first two decades of the twenty-first century as a case study in the internationalization of higher education, with a focus on the behavior of Australia’s Group of Eight (Go8) research-intensive universities. Two major findings emerge. First, many Australian universities seem to have excessively expanded their international student enrolments as a way to generate the funds needed to compete in the ‘rankings game’. Second, many Australian universities seem to have prioritized research in fields that yield higher returns in rankings outcomes. These findings provide circumstantial evidence that Go8 universities may have compromised their educational missions in the pursuit of rankings success.

Introduction: The Internationalization of Australian Universities

Globalization is a master trend of our time. It both connects institutional fields that were previously isolated at the country level and generates transnational institutional hierarchies that tend to override previously existing national hierarchies (Babones and Aberg 2019). One institutional field that has become thoroughly globalized and simultaneously steeply stratified is higher education (Erkkila 2016). As it has done to other industries, globalization has shifted the primary locus of competition in academic fields from the national level to the global level, with the result that academic success has come to be defined increasingly in global terms. When combined with the high level of geographical mobility allowed to highly qualified workers, this has resulted in an increasing concentration of the most successful scholars in a relatively small number of ‘global’ universities.

Globalization is reflected in universities in many ways: in the internationalization of their academic faculties, in the internationalization of their student bodies, and in the globalization of the fields in which the internationalized careers of academics and students are pursued. The publication of global university rankings, beginning in 2003, has reinforced the trend toward the concentration of scholarship (Munch 2014: 28–31). Indeed, there seems to be a positive feedback loop connecting the internationalization of universities’ academic faculties and the global concentration of scholarly success, with researcher mobility closely related to rankings success (Borjesson and Lillo Cea 2020; Downing et al 2021: 105–106). The managerial imperative to place high on the international rankings drives universities to seek the most productive researchers wherever they can find them, and this faculty internationalization itself contributes directly to some of the international ranking scores.

The missing element in research on the relationship between faculty internationalization and research globalization is finance: universities may want to recruit international faculty, but the competition for top researchers is intense, and desire without money is bound to be disappointed. Australian universities have arrived at a unique solution to this problem: the recruitment of large numbers of international students. All university systems admit international students, and many derive financial benefits from them, but no country has embraced the financial exploitation of international students to anything like the degree that Australia has. As of 2019, more than one-third of Australia’s higher education students by equivalent full-time student load (eftsl) were citizens of other countries (dese 2021b). This figure is globally unparalleled among major international student host countries (Babones 2019).

Australia is, depending on the specific data source used, roughly the 54th largest country in the world by population, the 14th largest by the size of its economy, and the 10th by income per capita. It is, however, firmly 3rd in the world in the international rankings league tables. Depending on the ranking used, it is home to six, seven, or even eight of the world’s ‘Top 100’ universities. On all four major international ranking systems, Australia ranks third in the world in its total number of Top 100 universities, trailing only the United States and the United Kingdom. Australia also ranks 1st in the world on some secondary rankings, with 14 universities in the Top 100 on the 2021 Times Higher Education (the) Young University Rankings for universities founded in the last 50 years and 4 of the Top 10 universities in the world on the 2021 the Impact Rankings, which are loosely based on the UN’s Sustainable Development Goals.

This performance is truly extraordinary for a country of Australia’s size. It is also a relatively recent phenomenon. When the first major international university ranking, the Academic Ranking of World Universities (arwu), was published in 2003, Australia had only two universities in the global Top 100: the Australian National University (#49) and the University of Melbourne (#92). Australia as a whole ranked a distant ninth in the world, tied with France and trailing the United States (with 58 universities in the global Top 100), the United Kingdom (9), Germany (5), Japan (5), Canada (4), the Netherlands (3), Sweden (3), and Switzerland (3). Australia’s results on the inaugural Times Higher Education – Quacquarelli Symonds (the-qs) rankings in 2004 were much stronger (3rd place with 11 in the Top 100), but those rankings were strongly tilted toward the United Kingdom and excluded many non-English-speaking countries (Holmes 2010). When these shortcomings were addressed in subsequent years, Australia’s representation dropped precipitously.

In any case, the the-qs rankings included a strong reputational component, while the arwu rankings are based almost entirely on research. The arwu thus better represents the research success of Australian universities. Australia’s climb from tied-ninth to third place in the world’s premier research-based university rankings in less than two decades is remarkable, and calls out for explanation. Prior research has noted that Australian universities’ international rankings “have risen in lockstep with their revenues” – and that those revenue increases were driven largely by the recruitment of dramatically increased numbers of Chinese international students (Babones 2019: 3). This conclusion has been confirmed by other experts on Australian university finance (Birrell and Betts 2018: 1–11; Norton 2020; Howard 2021: 1).

Australian universities’ rankings success, apparently accomplished using funds obtained from international student tuition revenues, comes at a cost. For example, the most important international ranking being pursued by Australian universities (the arwu) reflects the Chinese government’s vision for university success (Liu 2015). Unlike its main global competitors, the arwu gives no credit whatsoever for research in the humanities. Nor does it give credit for book publications. It gives great weight to Nobel Prizes, but specifically excludes the prizes for literature and peace. These biases reflect those of the Chinese government, which set up the arwu primarily as a tool for benchmarking the success of China’s own universities (Liu 2015). To the extent that Australian universities seek to achieve high arwu rankings, they are incentivized to internalize the Chinese government’s priorities for what a university should be and do (Fitzgerald 2020).

This paper traces the circuitous path that connects Chinese government priorities to Australian university practices through the mechanisms of internationalization and rankings, with a particular focus on research in the humanities. It first reviews the four major international university ranking systems, analyzing the biases they incorporate into their measurement practices. It then shows how Australian universities have leveraged the recruitment of international students into the funding needed to fuel their rankings ambitions. The main empirical section shows how the battle for Highly Cited Researchers (hcr s) has likely skewed the research priorities of Australian universities. The paper concludes with some reflections on the long-term consequences of the rationalization of university priorities around the pursuit of global rankings success and some suggestions for reform.

The Four Major University Ranking Systems

Universities are large, complex organisations, and it can be difficult to evaluate them in holistic terms. Certainly it is impossible to reduce everything a university does to a single number. And in light of the institutional differences that distinguish university systems in different countries, it is also difficult to compare university success cross-nationally. Nonetheless, international university rankings are widely followed, and have become the basis for national higher education policy-formation. Some countries have even gone so far as to restructure their university systems, consolidating historically distinct universities into larger ‘national champions’, in efforts to raise their standings on international rankings (Sulkowski et al 2020). However imperfect global university ranking systems may be, they have come to occupy an extraordinarily important place in national higher education policy.

Considering how consequential they have become, it is striking just how new global university rankings are. At the turn of the millennium, there were no comprehensive international university rankings. The four major international university ranking systems in use today are all less than two decades old. Table 1 summarizes their origins and trajectories over time.

Table 1: Origins and trajectories of the four major international university ranking systems

The first organisation to produce a comprehensive global university ranking system was the Center for World-Class Universities of the Institute of Higher Education (later the Graduate School of Education) at Shanghai Jiao Tong University. Its Academic Ranking of World Universities (arwu), colloquially known as the “Shanghai” rankings, was commercialised in 2009 into the ShanghaiRanking Consultancy, which describes itself as a “fully independent organization dedicating to research on higher education intelligence and consultation” (ShanghaiRanking 2020c). The arwu rankings draw their citation data from the Science Citation Index and Social Science Citation Index, which were previously published by the science metrics unit of Thomson Reuters, a unit that in 2016 was spun off under the name Clarivate. The exact continuing relationship between ShanghaiRanking and the sjtu Center for World-Class Universities is unclear. They co-sponsor many events, although ShanghaiRanking is at pains to emphasise that it is “not legally subordinated to any universities or government agencies” (ShanghaiRanking 2020a).

ShanghaiRanking explains on its website that “the initial purpose of arwu was to find the global standing of top Chinese universities” (ShanghaiRanking 2020c). In a short 2015 article, the long-serving dean of the sjtu Graduate School of Education and director of the Center for World-Class Universities, Liu Niancai, explained that the original purpose of the arwu was “to benchmark top Chinese universities with world-class universities” in the rest of the world (Liu 2015: 2). This focus on Chinese self-benchmarking is confirmed by ShanghaiRanking’s commercial materials (accessed August 1, 2020). ShanghaiRanking Consultancy seems primarily to offer advisory services to Chinese universities on how to improve their management and rankings performance (ShanghaiRanking 2020b). The company’s China-facing website and public communications are much better developed than its international-facing ones, and although ShanghaiRanking does offer the arwu results in English, the Chinese arwu website is more modern and responsive. The ShanghaiRanking commercial consulting website is even more sophisticated, and is available only in Chinese. ShanghaiRanking does not seem to advertise its consulting services in English.

The Times Higher Education–Quacquarelli Symonds World University Rankings (the-qs) were first published in 2004 by what was then the Times Higher Education Supplement in cooperation with the educational consultancy Quacquarelli Symonds, which provided the underlying data (Baty 2014). The two organisations parted ways after publishing their last joint rankings in 2009, and since 2010 have published separate rankings (Holmes 2010). Nonetheless, the current the ranking system is the lineal descendant of the original the-qs ranking system, and it continues to use citation data produced by Thomson Reuters / Clarivate. By contrast, the qs rankings use citation data from the Elsevier Scopus database. Both the and qs operate educational consultancy businesses; the’s core audience is academics, while qs’s core audience is students. These differing emphases are reflected in their post-2010 rankings approaches, with the giving greater weight to research and qs giving greater weight to teaching.

The Best Global Universities from US News & World Report (US News) came late to the international rankings game, releasing its first rankings in 2014. Ironically, US News is the publication that founded the entire practice of university rankings in 1983 with a special issue of the then-weekly magazine. The magazine ceased publication in 2010, but US News survives as a lifestyle website that provides highly respected domestic (US) rankings of universities, hospitals, law firms, and metropolitan areas. Although US News publishes the dominant American domestic university rankings (the U.S. News & World Report Best Colleges Ranking), its international Best Global Universities offering is less well-known.

Table 2 breaks down each of the four major international ranking systems by component, aggregating the specific components under four broad rubrics: teaching, research, internationalisation, and size adjustments. The four ranking systems differ dramatically in their composition, yet produce similar results at the very top. World-famous universities like Harvard, Stanford, Oxford, Cambridge, and mit figure in the global Top 10 on all four rankings, although their specific positions differ. Below the Top 10, however, individual universities are separated by relatively small differences in scores in each of the ranking systems, producing wide variability in specific results. To take just one example of a famous university that hovers on the edge of the global Top 10, the placements for the Johns Hopkins University range from #10 on the U.S. News rankings to #25 on the qs rankings. The variability for non-US universities can be even greater. For example, the University of Tokyo pips Johns Hopkins on the qs (#23), but lags far behind on the U.S. News ranking (#73). Nonetheless, all four systems yield broadly similar hierarchies of universities, with globally-famous research-intensive universities ranked highly and more teaching-focused universities ranked lower.

Table 2: Components of the four major international university ranking systems

Australia’s leading Group of Eight (Go8) research universities perform remarkably well on all four systems. Their most recent rankings are reported in Table 3. The six strongest Go8 universities place in the global Top 100 in all four systems, and all eight make it into the Top 100 in the U.S. News rankings. The median Go8 positions on the the and arwu rankings are charted in Figure 1. All Go8 universities except the Australian National University (anu) have risen dramatically in the science-heavy arwu rankings since those rankings were first introduced in 2003. They have fared less well in the more comprehensive the rankings, though this is mainly due to the methodological shift in 2010 that saw all except the University of Adelaide and the University of Melbourne experience substantial declines. Allowing for changes in methods, most Go8 universities have held just about stable on the British the rankings, while slowly climbing up the Chinese arwu table.

Table 3: Most recent international rankings of Australia’s Go8 universities
Figure 1: Median rankings of Australia’s Go8 universities

In contrast to the arwu and the/qs rankings, the US News rankings are idiosyncratic, relatively new, and not widely followed outside the United States. Australian universities report their results on these rankings, but they probably do not formulate their research strategies specifically to target them. The the and qs rankings have longer histories, and on account of their UK origins are more closely followed in Australia. They are, however, quite broad-based and thus difficult to ‘game’ by making strategic investments in specific areas. Their heavy reliance on reputational surveys, while potentially raising questions about their reliability, makes them particularly resistant to strategic tampering. They seem reasonably well-designed to do what they are intended to do: to inform academics (the) and students (qs) about the relative quality of universities as seen from the perspectives of these particular constituents.

The arwu rankings similarly ‘do what they are intended to do’. The problem, from the perspective of Australian higher education policy, is that they are designed to guide Chinese universities toward meeting educational goals set by the Chinese government. They were never intended to be used as management tools for driving the strategic planning processes of Western universities. Yet they are widely used in just such a capacity (Osterloh and Frey 2014). The unreflexive use of the arwu as a management tool seriously threatens the integrity of Australian and other non-Chinese universities because the arwu rankings are designed to reflect the priorities of a deeply illiberal and potentially hostile foreign government. The managerial use of the arwu, in effect, changes the priorities of universities in ways that benefit those things that are measured in the rankings (primarily scientific research) while disadvantaging things that are not (including other research fields, service to the community, and anything related to teaching).

How International Students Financed Australia’s Rankings Success

When the first arwu rankings were published in 2003, only two Australian universities figured in the Top 100: the anu (#49) and Melbourne (#92). By 2019, every Go8 university except Adelaide was in the Top 100, with Adelaide placed in the next band of universities, ranked 101–150. No non-Go8 Australian university was ranked in the top 200 places. The race to secure top arwu rankings has thus been exclusively a Go8 phenomenon, and the Go8 has been extraordinarily successful in producing scientific research of the kind that forms the core of the arwu rankings. Yet the Go8 in its official communications routinely decries the inadequacy of Australian government funding for research (Go8 2019). In 2017, Peter Hoj, then speaking in his capacity as chair of the Go8, told Australia’s National Press Club that:

Until now, universities have been sufficiently adept at adjusting their business models in order to survive the withdrawal of public funding. However we are now staring in the face of the real danger that the government is tilting the funding balance to the extreme.

hoj 2017
Hoj went on to explain, in a somewhat circular fashion, that:

our high rankings … depend on our research performance, which … increasingly is funded by teaching-related fee income, much of which comes from international students.

hoj 2017

Figure 2 charts equivalent full-time student load (eftsl) for international students at Go8 universities for the period 2001–2019 (the most recent year for which data are available) using data from dese (2021b). Five of the eight universities stand out for dramatically rising international student numbers: Monash, Melbourne, Sydney, Queensland, and unsw. The figures for Monash include students at its offshore campus in Malaysia; all others are primarily or exclusively onshore students.

Figure 2: International full-time student load (eftsl) for Australia’s Go8 universities, 2001–2019

Rising international student enrolments seem to have been correlated with arwu rankings success. The arwu ranks of Go8 universities over the period 2003–2019 are charted in Figure 3. Australian universities do not break out flows of international students by source country, but China is by far the largest source country for Australian universities, and much circumstantial and anecdotal evidence confirms that Chinese students are concentrated at the Go8 research-intensive universities (Babones 2019). After around 2012–2013, the five universities that experienced high and expanding international student enrolments began a rapid ascent up the arwu rankings. Meanwhile, the three Go8 universities that experienced more modest growth in international student enrolments (anu, Adelaide, uwa) saw their arwu rankings fall, or never entered the global Top 100 at all.

Figure 3: arwu rankings for Go8 universities in the global Top 100 (2003–2021)

The contrast between Melbourne’s long-term rise and the anu’s relative decline (before bouncing back up in 2018) is particularly instructive. Between 2003 and 2017, Melbourne increased its performance on one arwu component in particular: that for ‘highly cited researchers’ (which constitutes 20% of the ranking). Over that period, Melbourne’s score on this component increased from 14.5 to 45.0 on a 100-point scale. The anu went in the opposite direction, declining from 44.7 to 15.4 out of 100. By trading places on this one arwu component, the two universities effectively traded places in the overall arwu rankings. In fact, the anu’s rankings revival between 2017 and 2018 was almost entirely due to a sudden jump in its ‘highly cited researchers’ score, which shot back up to 23.5, while its performance on the other components remained generally stable. Queensland’s 2016 jump from #77 to #55 was similarly driven by a jump in its ‘highly cited researchers’ score (from 22.0 to 34.0), as was Sydney’s rise back into the Top 100 in the same year (its component score rose from 9.6 to 25.1).
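
To see how much leverage this single component provides, consider a back-of-the-envelope sketch, assuming that the overall arwu score behaves approximately as a weighted sum of its component scores, with the 20% weight noted above (the published arwu methodology rescales component scores relative to the top-performing institution, so this is an approximation rather than a reconstruction of the official scores). Writing $\Delta S$ for the change in a university’s overall weighted score:

$$
\Delta S_{\text{Melbourne}} \approx 0.20 \times (45.0 - 14.5) = +6.1,
\qquad
\Delta S_{\text{anu}} \approx 0.20 \times (15.4 - 44.7) = -5.9 .
$$

Because universities below the global Top 10 are separated by relatively small differences in overall score (as noted in the discussion of Table 2 above), weighted-score swings of roughly six points on a single component are more than enough for two universities to trade dozens of ranking places.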

There are other levers for rapidly improving a university’s arwu rank, but the ‘highly cited researcher’ metric is by far the most susceptible to managerial control. For example, after anu professor (now vice chancellor) Brian Schmidt shared one-quarter of the Nobel Prize for physics in 2011, his university jumped six places in the arwu rankings. But the arwu only gives credit for the institutional affiliations of Nobel Prize winners at the time of award, making it impossible to use the recruitment of Nobel Prize winners as a tool for increasing universities’ rankings. It is also possible to improve a university’s arwu rank by increasing faculty members’ publications in Science, Nature, and other prestigious journals, but this lever is not very susceptible to managerial control.

Since it is the one major arwu lever through which universities can translate financial resources directly into rankings success, the arwu’s ‘highly cited researchers’ component seems to have become the most popular target in Go8 universities’ efforts to improve their arwu performance. The component is based on the previous year’s list of Highly Cited Researchers (hcr s) in the sciences and social sciences compiled by Clarivate Analytics. University employment of academics on this finite list of 6,389 researchers accounts for 20% of the arwu. Membership in the Clarivate hcr list is based on the citation counts of journal articles indexed in the Science Citation Index (sci) and Social Science Citation Index (ssci), which also separately contribute to the arwu rankings. Thus the recruitment of a Clarivate hcr simultaneously increases a university’s arwu scores on several subindexes. Recruiting a Clarivate hcr pulls the citation levers on the other three ranking systems as well, since hcr s by definition produce large numbers of citations and are almost certain to have strong reputations, to generate grant income, and to publish many articles. But the most direct effect of hcr recruitment is on the arwu.

Around 2014, Australian universities began a dramatic increase in their numbers of Clarivate-recognised hcr s. Between 2014 and 2020, the proportion of the world’s hcr s based in Australia more than doubled, from 2.02% to 4.77%. For comparison, Harvard University’s proportion has been stable at around 3% for the last two decades. Table 4 reports the numbers of hcr s in Australia and in the world as a whole over the period 2004–2020, taking into account primary affiliations only. Data from 2001 and from 2014 onward have been downloaded directly from Clarivate. With the exception of the 2001 list, Clarivate no longer makes pre-2014 data available, and the missing years’ lists do not seem to have been archived anywhere online. Nonetheless, it was possible to source the 2004 and 2007 results from published research papers, allowing these two years to be included in the table (Basu 2006; Bauwens et al 2011).

Table 4: Numbers of hcr s in Australia and in the world, 2004–2020 (primary affiliations only)

Despite gaps in the available data and changes in methodology, the absolute and relative increase in Australia’s hcr count is clear. The motive forces behind that rise can only be guessed at. Did Australian universities dramatically improve their management of research, promoting research excellence among their existing academics? Or did they simply use their increased revenues to buy in talent from overseas? Absent an unlikely self-confession from the universities involved, it would take an extensive biographical research effort to determine the extent to which Australian institutions successfully ‘managed up’ their existing researchers as opposed to simply ‘buying in’ researchers using revenues generated by international students. Two quotes from the highest and lowest ranked Go8 universities may, however, shed some light on the answer. In its 2019 strategic plan, the University of Adelaide promised a “significant injection of new, world-class academic talent aligned with priorities” (Adelaide 2019: 11). Meanwhile the University of Melbourne, in its 2019 annual report, bragged that “with targeted recruitment of research leaders and fostering of in-house talent, Australia has tripled its Hi-Ci researchers since 2014,” with Melbourne ranked “#1 in Australia” (Melbourne 2020: 37). If universities lower down the arwu rankings are looking for a model to follow for increasing their arwu rankings, Melbourne’s historical record of hcr recruitment clearly offers one.

How hcr Recruitment Skews University Priorities

It is well-known in the academic community that universities prioritize the recruitment of recognized hcr s, but it is very difficult to substantiate this reality. Universities are understandably reluctant to publicize the details of their strategic recruitment efforts, and in any case tacit knowledge like this generally does not become the object of academic research. Nonetheless, some evidence can be gleaned from documented scandals regarding the recruitment of hcr s. In 2011, Science magazine broke the news that two Saudi universities were explicitly using the Clarivate hcr list in an attempt to influence their international rankings (Bhattacharjee 2011). It emerged that King Abdulaziz University (kau) and King Saud University (ksu) were offering hcr s annual honoraria of $72,000 merely for adding their universities as ‘secondary affiliations’ on their hcr research profiles. The hcr s were also invited to come to campus and mentor local researchers, but that was optional. The only contractual stipulation attached to the $72,000 was to add the affiliation.

The arwu eventually adjusted its methodology to close this ‘secondary affiliation’ loophole, but kau continues to target hcr s, and it is now 21st in the world in hcr s by primary affiliation. This has driven its arwu ranking into the #101–#150 range. Google searches of some of the 39 hcr s who claim to work primarily at kau suggest that the university has targeted mainly hcr s who work at relatively low-ranked universities. Since their true primary employers are not active in the global competition for a high arwu ranking, they presumably do not closely monitor their scholars’ affiliations as listed with Clarivate. This has the potential to free their professors to transfer their Clarivate affiliations without fear of negative consequences. Although this cannot be substantiated, it is certainly provocative that the international hcr s associated with kau are almost entirely drawn from non-ranked universities.

Unlike the two Saudi universities, or indeed Harvard, Australia’s Go8 universities are all public, publicly-funded institutions that are responsible to democratically-elected governments and have statutory missions that are focused on education, not rankings success. If they organise their research priorities around the recruitment of hcr s in order to pursue higher rankings, they may (perhaps inadvertently or unreflexively) undermine their broad public service missions. The paths through which this may occur are clear, though not necessarily obvious to the government bureaucrats and elected officials who are responsible for regulating universities. After all, a recruitment strategy that focuses on attracting hcr s – the ‘best and brightest’, so to speak – appears at first glance to be meritocratic and value-neutral. Yet Clarivate only indexes hcr s in 21 broad fields of study, nearly all of them scientific. This has the potential to bias recruiting in ways that favor the sciences (and specifically the sciences covered by Clarivate) at the expense of other disciplines. The 21 fields indexed by Clarivate for its hcr list are:

  1. Agricultural Sciences
  2. Biology & Biochemistry
  3. Chemistry
  4. Clinical Medicine
  5. Computer Science
  6. Economics & Business
  7. Engineering
  8. Environment & Ecology
  9. Geosciences
  10. Immunology
  11. Materials Science
  12. Mathematics
  13. Microbiology
  14. Molecular Biology & Genetics
  15. Neuroscience & Behavior
  16. Pharmacology & Toxicology
  17. Physics
  18. Plant & Animal Science
  19. Psychiatry & Psychology
  20. Social Sciences
  21. Space Science

The Clarivate hcr list covers only two social scientific fields (Economics & Business and Social Sciences), and no humanities fields whatsoever. Professional disciplines like law, social work, nursing, and education are also excluded. Globally, only 5.1% of Clarivate-listed hcr s are in the two indexed social scientific fields, but in Australia the concentration of social science hcr s is even lower. A field-by-field analysis of Australia’s hcr s for the year 2019 reveals that the entire Go8 put together employs only four hcr s in economics and the social sciences, and not one of them fits the traditional model of a social scientist. They are in fact one mathematical physicist (listed as an hcr in Economics & Business) and three public health scholars (listed as hcr s in the Social Sciences). Table 5 summarises the numbers of hcr s at Australia’s Go8 universities in 2019, including those who were listed in the ‘cross-field’ category (a category that was added in 2018 to capture researchers who met the citation thresholds for hcr status but did so with publications that were spread across more than one field of study).

Table 5: hcr s at Australia’s Go8 universities by field, 2019

As the pattern of hcr disciplines demonstrates, any recruitment strategy that focuses on Clarivate hcr s is, in effect, a strategy for recruiting scientists. The more humane disciplines and disciplines devoted to public service simply receive no credit in the hcr system, and thus would receive no attention from strategic recruiting focused on hcr s. Of course, the Chinese government did not set up the Clarivate hcr system, but the system nonetheless closely reflects the disciplinary biases of the Chinese government, which is probably why the arwu chose to give such a high weighting to hcr s in its rankings. The (perhaps unintended) result is that when Australian universities set strategic goals that involve the explicit recruitment of hcr s, they are implicitly setting goals that align with those of the Chinese government.

This goal alignment goes beyond the direct recruitment of hcr s to affect the broader strategic research initiatives of Australian universities. Most of the Go8’s high-prestige peer institutions in other countries develop their research agendas organically in response to government funding initiatives, the needs of local business communities, and the interests of private donors. They may be legally autonomous institutions, but they are deeply embedded in the societies that host and fund them. They thus exhibit what sociologists call ‘embedded autonomy’: they are broadly answerable to society for the goals that they set, but sufficiently independent to think outside the box in the ways they choose to pursue those goals.

Internationally unparalleled numbers of international students, however, have given Go8 universities the resources to finance their research ambitions out of unrestricted tuition revenues. It has been estimated that more than two-thirds of Australia’s higher education research spending is now attributable to sources other than Australian government financial support (Norton 2020). Nearly all of this non-government research support is derived from international student tuition revenue, which (unlike domestic student tuition revenue) comes with no regulatory restrictions on how it must be spent. The expansion in international student tuition revenues has thus given Australia’s Go8 universities the freedom to set their own research priorities, free from government oversight, despite the fact that these institutions are publicly-funded universities charged with operating in the public interest. It has, in effect, transformed them from institutions enjoying embedded autonomy into fully autonomous institutions, at least when it comes to setting their research spending priorities.

This creates a principal-agent problem for the Australian government. In their educational missions, Australian universities clearly act as agents for the Australian government (and ultimately the Australian public). The educational activities of universities are subject to explicit government regulation. But in conducting research, Australian universities are increasingly acting as principals in their own right. The international students whose tuition payments fund Australian universities’ strategic research initiatives cannot be realistically expected to regulate university research, and in any case they presumably pay tuition for the purpose of receiving an education, not for funding research. By redirecting international student tuition revenue toward strategic research initiatives, Australian universities are able to avoid direct accountability and pursue their own institutional priorities.

Reflecting their ambitions to rise up the international rankings, and especially the arwu rankings, Australian universities have thus invested their discretionary research funds almost entirely in the sciences. Table 6 summarizes all of the strategic research initiatives highlighted on the websites and in the annual reports of Australia’s Go8 universities (as of August 1, 2020). A total of 53 initiatives were identified. Universities do not publish financial statistics for their centrally-funded research initiatives, so it cannot be ascertained to what degree funding for these initiatives was obtained from international student tuition revenue. That said, Australian universities have few (and relatively small) alternative sources of discretionary funds. Each initiative has been categorised as focusing primarily on the sciences, social sciences, or humanities based on where the bulk of its publications would fall in the classifications used by Clarivate and other data providers.

Table 6: Strategic research initiatives of Australia’s Go8 universities by disciplinary focus

It is clear from the character of the initiatives listed in Table 6 that, when it comes to nominating strategic research initiatives, Australia’s Go8 universities focus overwhelmingly on science. This comes despite the fact that slightly more than half of the Go8’s collective enrolments of 441,000 students are concentrated in the non-science disciplines of education, management & commerce, society & culture, and the creative arts, according to data from the Department of Education, Skills and Employment (dese 2021). For international students, whose excess tuition fees largely fund universities’ discretionary research budgets, the proportion who study non-science disciplines is also over 50%. Yet as Table 6 clearly indicates, the increase in Go8 research funding made possible by the rise in international student enrolments has not been shared out equally among the disciplines that generate tuition revenue.

Conclusions and Recommendations for Reform

Since the turn of the millennium, Australia’s Go8 universities seem to have been using excess revenues derived from international (and in particular, Chinese) student tuition to fund massive expansions in their scientific research outputs. This has been reflected both in their recruitment of hcr s and in their funding of science-focused strategic research initiatives. By 2010, international student numbers at Go8 universities were already among the largest in the world. In the decade since, they have risen so much that Go8 universities now occupy five of the top six places for numbers of Chinese students outside China (Babones 2020). Perhaps not coincidentally, those five universities (in order from highest to lowest: Sydney, Monash, Melbourne, unsw, and Queensland) are precisely the five Go8 universities that have risen dramatically in the arwu rankings over the last two decades. Adelaide and uwa, home to fewer international students, have experienced more modest arwu increases. The other less-internationalized Go8 university, anu, has seen its ranking fall.

Countering the dominant narrative that government funding for research is always insufficient and declining, Norton (2020) argues that:

Profits on international students have been used to help finance a massive increase in university research expenditure this century. Growth on this scale was something universities chose to do, not a change forced on them by government policy.

Over the last two decades, Australian government funding may or may not have been sufficient to meet the research needs of Go8 universities, but it certainly did not satisfy their research ambitions. Government funding was apparently sufficient to place all Go8 universities among the top 250 in the world on the first arwu ranking in 2003, but not sufficient to place them in the global Top 100. Yet no Australian government mandate required Go8 universities to seek Top 100 rankings on the arwu, which is after all a ranking system sponsored by the Chinese government. The former dean of medicine, dentistry and health sciences at the University of Melbourne, Shitij Kapur (now president of King’s College London), wrote in an op-ed under the telling title “Universities Need a Research Funding Model to Match Their Ambition” that:

Australia did not choose to create world-leading universities. It was a fortuitous confluence of the ambition of the individual universities, the excellence of their scholars, a surfeit of international students and our attractiveness as a destination that created this powerhouse.

kapur 2020
He went on to argue that:

Having world-class research universities is not a requirement, it is a choice. It is a choice based on our ambition for our future generations and for our role in the knowledge economy. It is a choice we cannot leave for the universities alone to make. It’s a choice we need to make as a nation.

Kapur’s obvious preference is for Australia to choose the Top 100 path and remain, as he puts it, “the world’s third biggest ‘university power’”. But this position raises a subtle ontological conundrum. Australia can only know that it ranks 3rd in the world by reference to international ranking systems like the arwu, and these are far from comprehensive indicators of research success. It is a maxim of managerialism that “what gets measured, gets done,” and by embracing the arwu rankings system, Australia’s Go8 universities have (intentionally or unintentionally) embraced the world-view of the educational ideology that stands behind it: that of the Communist Party of China. Ironically, at a time when Australian universities are in the midst of a national debate on how to prevent undue Chinese government influence over their institutions, the universities themselves are embracing Chinese educational priorities – without any pressure from China to do so.

Australia’s vice chancellors are ultimately responsible for guiding the strategic directions of the universities they manage. One of the biggest challenges in management theory is goal alignment: how to ensure that the goals of managers are aligned with the goals of proprietors. In the private sector, a common solution is to link executive remuneration to a company’s share price. Another is to link executive remuneration to financial targets, like profitability, revenue growth, or cost cutting. In most countries (including Australia), listed companies are required not only to disclose the overall level of executive remuneration, but also to disclose the terms of any performance-linked incentives that might influence executive decision-making. The idea is to give investors insight into the drivers of executive behaviour, and to assure them that executives are properly incentivised to act in the best interests of shareholders.

Executive compensation terms at Australia’s public universities, by contrast, are secretive and opaque. The total pay packages of vice chancellors and other senior executives are published, along with the total amounts of any performance-linked bonuses, but the triggers for those bonuses are generally kept confidential. As a result, the Australian public knows more about the drivers of executive behaviour at its listed companies than at its public universities. It took a whistleblower leak for even a federal senator to learn the terms of a vice chancellor’s compensation incentives at the University of Queensland – and even he could only disclose those terms under parliamentary privilege (Hunter 2020).

As a first step toward reform, Australian universities should be prohibited from tying senior executive compensation to international rankings success. Variable compensation strategies (i.e., bonuses) should be used sparingly in not-for-profit organisations. They should only be used to motivate managers to meet difficult goals that they can achieve through their own hard (and often unpleasant) work. For example, it is appropriate to give managers bonuses for meeting cost cutting or efficiency targets, since these are very difficult to accomplish, can only be met through difficult negotiations, and are reputationally risky for the vice chancellors themselves. As the Handbook of Human Resources Management emphasises: “any good variable pay system should be self-funding in that it generates more money than it costs” (Haussmann 2015: 1).

In order to ensure that vice chancellors do not misallocate university resources in the pursuit of rankings success, variable compensation for university executives should never be tied to international rankings. More broadly, the variable compensation provisions of the contracts of executives at public, publicly-funded universities should be made public. In addition, the details of unadvertised, non-competitive ‘strategic hires’ of specifically-targeted academics should also be made public. It is difficult to see why full transparency in hiring should apply to junior academic positions, but not to the most senior ones. If transparency is needed anywhere, it is for those who are being hired into the most privileged academic jobs, not the least privileged.

In November 2017, the Go8’s chief executive Vicki Thomson gave a speech at the 7th International Conference on World-Class Universities, organised by the Shanghai Jiao Tong University’s Center for World-Class Universities on behalf of the ShanghaiRanking Consultancy, publishers of the arwu. Speaking more than two years before the beginning of the coronavirus crisis and the contemporaneous (if not concomitant) outburst of China skepticism in Australia and around the world, she contrasted China’s “powerful political settings” for university success with Australia’s “fragile” ones:

It is fair to say that from Australia, we watch with awe, and more than a little envy, at the determined prioritisation of university education and research in China. […] In Australia, with my Board of Presidents, I am managing a leading group of research-intensive universities through what are fragile political settings. We do not allow those settings to affect the quality of what we deliver, in teaching or in research, but, it would be disingenuous to pretend that it has been, or is, simple or easy. As a group of universities we are as pragmatic as we are determined. We know we must survive and thrive despite the fragile settings.

We owe that to our students, and to Australia’s economic future, because there cannot be a knowledge economy without a thriving university sector at its core. As guests joining us today from other nations can attest, we in Australia, are, sadly, not alone. The common question is how do we withstand the fragility we are confronted with? How, in Australia’s case, does the Group of Eight set and pursue strategies to achieve excellence in shifting sands where we have had six Prime Ministers in 10 years and 9 Education Ministers in a decade, each with a different teaching and research policy agenda?

Thomson went on to fret over the irony that “the more available to the community a university education in Australia has become, the less the community has trusted us” (Thomson 2017).

With due allowance for the natural urge to show respect for one’s hosts and with due credit to the Go8 for making its public statements publicly available, it is nonetheless striking to see an Australian higher education executive lauding “the determined prioritisation of university education and research in China,” lamenting the “shifting sands” of democratic governance, and still not understanding how Australians could lose faith in their country’s universities.

If the Go8 and its member universities were serious about regaining the public’s trust, they could start by trusting the public. Instead of envying Chinese universities the lavish support they receive from a repressive communist government, they could look to North American universities for inspiration about how to engage the broader community in their intellectual life – and through community engagement to reap the benefits of private philanthropy. Or they could embrace European models of the university as a public service organisation, with leaders who are unconcerned with international rankings and accept public sector style pay packages that lack performance incentives or generous bonuses. In the post-coronavirus world that is sure to prompt many changes in university funding and operations, the Go8 and its member universities could proactively seek to develop more democratic approaches to university governance. The principal-agent problem that currently afflicts Australia’s leading universities is too serious for Australian governments to ignore forever. The universities would be well-advised to improve their behavior before their principal (the government) steps in to impose improvements on them through increased regulation and oversight.

References

  • Adelaide. 2019. Future Making. Adelaide: University of Adelaide.

  • Babones, Salvatore. 2019. “The China Student Boom and the Risks It Poses to Australian Universities.” Centre for Independent Studies, Analysis Paper 5.

  • Babones, Salvatore. 2020. “Is There a Future for Chinese Students in Australia?” Pp. 37–42 in UK Universities and China, edited by Michael Natzler. Oxford: Higher Education Policy Institute.

  • Babones, Salvatore, and John H.S. Aberg. 2019. “Globalization and the Rise of Integrated World Society: Deterritorialization, Structural Power, and the Endogenization of International Society.” International Theory 11: 293–317.

  • Basu, Aparna. 2006. “Using isi’s ‘Highly Cited Researchers’ to Obtain a Country Level Indicator of Citation Excellence.” Scientometrics 68: 361–375.

  • Baty, Phil. 2014. “The Times Higher Education World University Rankings, 2004–2012.” Ethics in Science and Environmental Politics 13: 125–130.

  • Bauwens, Luc, Giordano Mion, and Jacques-Francois Thisse. 2011. “The Resistible Decline of European Science.” Recherches Economiques de Louvain 77(4): 5–31.

  • Bhattacharjee, Yudhijit. 2011. “Saudi Universities Offer Cash in Exchange for Academic Prestige.” Science 334(6061): 1344–1345.

  • Birrell, Bob, and Katharine Betts. 2018. “Australia’s Higher Education Overseas Student Industry: In a Precarious State.” Australian Population Research Institute.

  • Borjesson, Mikael, and Pablo Lillo Cea. 2020. “World Class Universities, Rankings and the Global Space of International Students.” Pp. 141–169 in World Class Universities: A Contested Concept, edited by Sharon Rider, Michael A. Peters, Mats Hyvonen, and Tina Besley. Singapore: Springer Nature.

  • dese. 2021. “Equivalent FT Load by Citizenship Category by State – Institution.” Australian Department of Education, Skills and Employment uCube database.

  • Downing, Kevin, Petrus Johannes Loock, and Sarah Gravett. 2021. The Impact of Higher Education Ranking Systems on Universities. London: Routledge.

  • Erkkila, Tero. 2016. “Global University Rankings and Transnational Politics of Higher Education.” Pp. 178–195 in The Transnational Politics of Higher Education: Contesting the Global/Transforming the Local, edited by Meng-Hsuan Chou, Isaac Kamola, and Tamson Pietsch. New York: Routledge.

  • Fitzgerald, John. 2020. “Chinese Scholars Are Calling for Freedom and Autonomy – How Should Western Universities Respond?” Journal of Political Risk 8(1).

  • Go8. 2019. “Go8 Media Release: Australia’s Research Excellence Confirmed – Against the Funding Odds.” Group of Eight, March 27: https://go8.edu.au/go8-media-release-australias-research-excellence-confirmed-against-the-funding-odds (accessed August 1, 2020).

  • Haussmann, Thomas. 2015. “Compensation and Benefits: Essentials of Bonus Plans.” Pp. 1–26 in Handbook of Human Resources Management, edited by Matthias Zeuch. London: Springer Nature.

  • Hoj, Peter. 2017. “National Press Club Address.” University of Queensland, June 28.

  • Holmes, Richard. 2010. “The the-qs World University Rankings, 2004–2009.” Asian Journal of University Education 6: 91–113.

  • Howard, John. 2021. Rethinking Australian Higher Education: Towards a Diversified System for the 21st Century. Sydney: Howard Partners.

  • Hunter, Fergus. 2020. “Liberal Senator Hits Out at University China Reliance, Reveals Whistleblower Documents.” Sydney Morning Herald, May 13.

  • Kapur, Shitij. 2020. “Universities Need a Research Funding Model to Match Their Ambition.” The Australian, July 2.

  • Liu, Niancai. 2015. “The Story of Academic Ranking of World Universities.” International Higher Education 54: 2–3.

  • Melbourne. 2020. 2019 Annual Report. Melbourne: University of Melbourne.

  • Munch, Richard. 2014. Academic Capitalism: Universities in the Global Struggle for Excellence. New York: Routledge.

  • Norton, Andrew. 2020. “Why Did Universities Become Reliant on International Students? Part 5: The Rise of Research Rankings.” Andrew Norton, June 15.

  • Osterloh, Margit, and Bruno S. Frey. 2014. “Academic Rankings between the ‘Republic of Science’ and ‘New Public Management’.” Pp. 104–150 in The Economics of Economists: Institutional Setting, Individual Incentives, and Future Prospects, edited by Alessandro Lanteri and Jack Vromen. Cambridge: Cambridge University Press.

  • ShanghaiRanking. 2020a. “About Academic Ranking of World Universities.” http://www.shanghairanking.com/aboutarwu.html (retrieved August 1, 2020).

  • ShanghaiRanking. 2020b. “About Ruanke.” https://www.shanghairanking.com.cn/gyrk/index.html (retrieved August 1, 2020).

  • ShanghaiRanking. 2020c. “About Us.” http://www.shanghairanking.com/aboutus.html (retrieved August 1, 2020).

  • Sulkowski, Lukasz, Andrzej Wozniak, and Robert Selig. 2020. “The Impact of Consolidation on University Positions in Rankings in EU Member States.” Proceedings of the 25th International Scientific Conference of PGV Network 2019: 126–136.

  • Thomson, Vicki. 2017. “Managing a Group of the World’s Leading Research-Intensive Universities Through Fragile Political Settings.” Speech to the 7th International Conference on World-Class Universities (wcu-7), Shanghai, China, November 7.
