Chapter 6 Saying ‘No’ to Rankings and Metrics

Scholarly Communication and Knowledge Democracy

In: Socially Responsible Higher Education
Authors: Florence Piron, Tom Olyhoek, Ivonne Lujano Vilchis, Ina Smith, and Zakari Liré

Open Access

Abstract

This chapter is dedicated to the theoretical deconstruction of the “quality” argument, which holds that quality can only be guaranteed by the current system of metrics and rankings. The authors propose a radical change from this system towards assessment of articles and journals, from the Global South and from the North, that includes social responsibility and social relevance. The chapter reflects on how the knowledge-based economy that dominates the world of scientific publication can be transformed by instilling in it some values of “knowledge democracy”, especially in the practices of researchers.

1 Introduction

Although knowledge democracy is a broad concept that can take on different meanings in different contexts, in this chapter we primarily consider it as a moral and political ideal (Hall & Tandon, 2017). This ideal is based on the effective sharing of knowledge among all, across differences in race, class, gender, geographical location, language and cultural heritage, in the service of a sustainable and welcoming world for all. We see knowledge as a “commons” (Ostrom, 1990), i.e., a collective resource that a community must take care of by establishing rules and practices to ensure that it lasts and is preserved (Bollier & Westron, 2014). Knowledge democracy is therefore a set of means and approaches devised by the scientific community to preserve and share not only scientific-type knowledge, but also the reservoir of knowledge produced over the course of human history and deemed worthy of transmission. Indeed, the current environmental crisis is increasingly bringing us back to indigenous knowledge, in order to rethink our relationship to the land and the environment (Hall et al., 2000).

It is remarkable that all human societies have devised ways to transmit to future generations knowledge deemed fundamental, whether in institutions such as schools and universities, or through practices such as apprenticeship, journeymanship or imitation of the parent by the child. The knowledge transmitted is valuable because it is adapted to the contexts from which it originates and circulates to support collective or individual action – it is considered capable of maintaining and developing the society in question, of helping it to resist disasters, whether political, natural or health-related, or to think collectively about ways of strengthening its values. In the pluralistic world that has become ours (Escobar, 2011), knowledge circulates from one community or social world to another, so that relevance becomes less tied to the local character of knowledge and raises questions such as translation. Nevertheless, medical knowledge on malaria or the Ebola virus has a different resonance and level of pertinence in sub-Saharan Africa than in Canada. For researchers, the need to share, and thus to make their new knowledge accessible, can be considered a form of social responsibility towards their fellow citizens. In this sense, open access to scientific publications could be considered a powerful tool in the pursuit of a “knowledge democracy”, since its purpose is to abolish barriers between scientists and their readers, whether the latter are scientists or not.

Within the theory of the commons, the case of knowledge is original: it is a “commons” that is both immaterial, since it is formed of ideas and cognitions, and material, since these ideas must be embodied in reproducible statements that can be shared and transmitted. While oral tradition long played a major role in transmission and continues to do so on several continents, it is now mainly the written and published form1 that ensures the preservation and sustainability of knowledge, especially the dominant knowledge – techno-scientific knowledge – in a contemporary, globalised world, or the “one-world world”, in the brilliant phrase of Law (2015). This knowledge takes the form of texts, including scientific articles published in journals, that are transmitted, taught and read in universities. Alas, these items have acquired a market value in recent decades. The work on cognitive capitalism (Moulier-Boutang, 2011) has clearly highlighted a shift between the conception of an “ideal” (idéel) knowledge – which its author never loses the enjoyment of, even when sharing it by communicating it – and knowledge in the form of an “editorialised” text, published by a journal after a selection process, which may, like private property, belong to a publisher who sells or rents it.

In this context of the commodification of knowledge, knowledge democracy gives way to another system of values that is called “knowledge-based economy” (Peters, 2007), encouraged by the OECD since the 1990s. This system’s ideal is not the universal sharing of all knowledge for the common good, but the production of scientific publications able to generate wealth through their content (patents, marketable innovations, etc.), or their very existence on for-profit platforms where access to the articles is limited to readers that pay a fee.2

While knowledge democracy is intimately linked to sustainable development (Hall et al., 2016), since it aims to preserve a sustainable world in which human and living communities in general use knowledge to flourish, the knowledge-based economy is, instead, linked to the ideology of growth and neoliberal capitalism (Monbiot, 2016). In the knowledge democracy paradigm, the social responsibility of a university is aimed at collective well-being, whereas in the knowledge economy it is primarily aimed at economic prosperity, especially on the terms of industrialists (Piron, 2011). These ideals can also be contrasted in terms of academic publications. While knowledge democracy emphasises sharing and collaboration, the knowledge-based economy lives only through selection and competition, symbolised by metrics and rankings applied to articles, journals, researchers and universities, which are presented as quantitative guarantees of quality (Brembs et al., 2013; Ioannidis, 2006; Tourish & Willmott, 2015; Young et al., 2008).

In this chapter and our other contribution in this book (Chapter 22), we reflect on ways to transform the knowledge-based economy that dominates the world of scientific publication, by instilling in it some values of “knowledge democracy”, especially in the practices of researchers. Specifically, we want to deconstruct the “quality” argument, which states that it can only be guaranteed by the current system of metrics and rankings, and instead propose a vision of various contextualised quality assessment systems for articles and journals from the Global South and from the North that include social responsibility and social relevance. This chapter is dedicated to the theoretical argument. In the next one, we will discuss examples from Africa, Latin America and Europe to answer the question of how to encourage academics to conduct research that meets society’s needs and enhances people’s rights, while preserving academic freedom. In particular, we will ask ourselves how we can use the tools and devices devised by knowledge democracy (science shops, participatory research, community-based research) to emancipate Open Access from the enclosures that for-profit publishers are still trying to impose on academia.

2 An Inequitable Global Research System Based on Selection and Competition

There is a distinction between science and research that is important for our argument. Research concerns the process of creating or producing knowledge, the knowledge “being made”, whereas science refers to the outputs of this process (publications or data) that permit it to be transmitted and evaluated – the knowledge that is “made”. Within the global research system, it is mainly scientific articles that serve to fix these outputs in public, published forms, institutionalised by the quasi-sacralised process of peer review. The distinction is very useful in studying the world of knowledge democracy and the world of the knowledge-based economy.

Indeed, for the knowledge democracy paradigm, the research process itself must be democratised and opened up, especially to those who are usually excluded from it – non-scientists, non-academics, indigenous peoples, and knowledge holders in the Global South, who thus become “actor-researchers”. Knowledge democracy rhymes seamlessly with participatory processes, with the fight against cognitive inequalities and injustices, with an aspiration to decolonise knowledge and resistance against epistemicides (see, for example, de Sousa Santos, 2014). Among the devices that aim to open research to a plurality of actors are science shops, participatory action research, citizen science, living laboratories, etc.

For the knowledge-based economy paradigm, openness mainly means open access to scientific publications and open data; it is of value in as much as it accelerates innovation by making processes of collaboration faster.

These two forms of openness are marked by an inequity that is often invisible to authors and researchers. Academics who reach scientific journals through their university library’s subscriptions do not see that they enjoy “access”, nor that the system set up by the knowledge-based economy excludes others from it. This lack of awareness can contaminate even action research practitioners aspiring to knowledge democracy. For example, during a recent Living Knowledge conference3 gathering action research practitioners and science shop leaders from all over the world, we noticed that some books presented there, including handbooks for practitioners, were published by for-profit publishers (Sage, Springer, etc.) at a very high price. These books were therefore not accessible to civil society organisations, non-university workers, activists, or even unfunded academics from the North or the Global South. This is an unfortunate paradox for people hoping to co-construct knowledge or use authentic participatory methods.

To fight this inequity, many engaged scholars endeavour to raise awareness about the benefits of open access books and papers, including the use of Creative Commons licences. This includes informing colleagues about the big for-profit publishers’4 attempts to normalise the huge sums of money they charge librarians, readers of PDFs, and sometimes authors themselves, for research that is often funded with public money.

This inequity is reinforced in parts of the Global South, notably in Francophone African universities, where the internet is still a luxury and downloading files can be difficult. Researchers in this part of the world not only have difficulty accessing recent printed journals and books, and anything kept behind paywalls; even when resources are online and open access, it is difficult for them to benefit fully from these resources (Piron, 2018). Recognition of this inequity by researchers in the North is the first step towards its disappearance, and thus towards greater concern for equality in the scientific world.

In order to face head-on the inequities produced by the knowledge-based economy in the scientific community, we must also challenge the value of metrics and rankings. Fortunately, while still used as a main marketing argument by the five major publishers, these indicators are increasingly criticised from within. To better understand the impact factor, we attempt below to deconstruct what is often cited as the pillar of measuring scientific quality.

3 The Impact Factor and Its Criticisms

The story is well-known: designed in the 1960s by Eugene Garfield to help academic librarians choose which journals to subscribe to, the (Journal) Impact Factor (JIF or IF) is an index computed as the ratio of the number of citations a journal receives in a given year (the “proof” that the journal is read) to the number of articles published by that journal in the previous two years. These figures are published annually in the Journal Citation Reports, owned by Clarivate Analytics and based on Web of Science journals. This index, multiplied into several indices according to disciplines and copied since the 2000s by other databases (Schöpfel & Prost, 2009), notably Elsevier’s Scopus,5 has since been used to rank scientific journals and, consequently, to analyse the publication records of academics and judge their merits for promotion. The transformation of a quantitative index into a “quality” marker of researchers is only one of the aberrations noted by observers, including Curry (2012), who declared himself “sick of impact factors”.
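The ratio itself is trivial to compute; what is contested is what the resulting number is taken to mean. A minimal sketch of the two-year calculation, using made-up figures for a hypothetical journal (the function name and the numbers are ours, for illustration only):

```python
def journal_impact_factor(citations, citable_items):
    """Two-year Journal Impact Factor for a year Y:
    citations received in Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations / citable_items

# Hypothetical journal: in 2020 it received 250 citations to the
# 100 articles it published across 2018 and 2019.
jif_2020 = journal_impact_factor(citations=250, citable_items=100)
print(jif_2020)  # → 2.5
```

As the rest of this section argues, this single average is then read as a mark of quality for every article in the journal, and for its authors, even though the distribution of citations across articles is highly skewed.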

The other aberrations are numerous. When the Impact Factor is calculated, the types of documents counted are sometimes unclear (articles, commentaries, editorials, research notes) and homonyms are rarely distinguished. An article frequently cited for its low quality can increase the impact factor of a journal. Only a few articles in a journal are heavily cited, yet the average allows the others to benefit from their “aura”. The duration of the “impact” taken into account is too short for certain disciplines, some journal titles are not recognised by the algorithm, and the average says nothing about the impact of individual articles in a journal (Brembs et al., 2013; Pendlebury, 2009; Pendlebury & Adams, 2012). In fact, Larivière et al. (2016) have demonstrated that “the citation performance of individual papers cannot be inferred from the JIF” (p. 1). In spite of these shortcomings, these data are often used, in some disciplines, to calculate “publication bonuses” that some universities pay to their researchers (Gingras, 2015), even though the index says nothing about the quality of an article. One of the biggest problems, in our view, remains that this index and its clones are calculated by the owners of the journals thus assessed – it is clearly a marketing tool, dragging readers to their own products!

Despite its lack of reliability and relevance, this index – and others like it – has become an obsession for researchers in the North, especially in the fields of health and natural sciences, regardless of internal concerns over its negative impacts (Seglen, 1997; Wellcome Trust, 2020).6 Its continued use is explained by the generalised belief that it is a proof of “quality” and that tenure, promotion and access to research grants depend on publications in high-impact journals. Several surveys on open access demonstrate that researchers choose their journals according to impact factors, without much concern for accessibility to the general public (Piron & Lasou, 2014).

Since the Impact Factor and its clones are not calculated by a neutral and impartial body, but in the interest of the for-profit owners of the journals that benefit from them, they must instead be considered excellent marketing tools that encourage researchers to publish in those journals. The Impact Factor thus eventually permits for-profit owners to raise the price of their subscriptions accordingly. This “branding effect” of endorsements is a well-known strategy of commercial companies such as Nike or Adidas.

The exclusionary effects of this system are numerous, including the exclusion of papers written in languages other than English, since very few non-English journals are integrated into the Web of Science or Scopus.7 In the words of the director of Clarivate Analytics, owner of the Web of Science, this makes sense: “English is the language of science”, he says without any hesitation. This diglossia of the scientific world has pushed a growing number of non-English-speaking researchers to choose to publish in imperfect English, even if it cannot capture many of the nuances of their research, especially in the social sciences. Indeed, some researchers express a kind of contempt for articles that are not in English – publishing in another language is seen as “ghettoising” research, because those papers would reach a smaller audience of international scientists. This voluntary linguistic alienation is especially common among researchers in French-speaking sub-Saharan Africa, who already write in a colonial language (Piron, 2018) and are usually less trained in English than scientists from other countries. The pressure to publish in English journals accentuates a tendency to imitate the North, towards “extraversion” (Hountondji, 1990): overvaluing everything that comes from the North (theories, authors, models) and devaluing local knowledge and epistemologies from the Global South (Mvé-Ondo, 2005; Sarr, 2016). We are convinced that one solution to this diglossia of the scientific world is to adopt plurilingualism, and not only to accept but to promote the use of translation in publication, so as to make research results available in all the relevant languages.8

As a result of these exclusionary effects, the use of impact factors tends to deprive journals from the Global South of any symbolic capital, largely because they have almost no chance of being recognised by the American English-language databases that attribute impact factors. The system thereby contributes to maintaining the “colonisation of the mind” (Oelofsen, 2015), very far from the empowerment so necessary to research in the Global South.

It is largely accepted that the global publishing system, based on the knowledge-based economy and its values, maintains a quasi-monopoly, especially in technology, the natural sciences and the health sciences. This system is seldom contested by researchers, out of fear that any effort to get out of it could harm their careers by diminishing the value of their curricula vitae. This fear is accentuated by a global strategy of intense peer competition that maintains fear of loss of status, position or salary, a strategy that is the hallmark of neoliberalism (Monbiot, 2016). The result is a generation of docile science workers who are responsive to publishing companies’ and their shareholders’ desires. However, this docility has an undesirable side effect: it limits awareness of inequalities and dissuades researchers caught up in the rat race from thinking about their social responsibilities and the impact their work could have on the well-being of society, as defined by the paradigm of knowledge democracy.

It is therefore necessary to recognise that the metrics and rankings system hinders the development of universities’ social responsibility and researchers’ awareness of it. This system reproduces researchers who can easily neglect issues unique to their society and who are unable to question structural inequalities in their knowledge production. In such a situation, open access to research acts only as a band-aid on an infectious disease – aesthetically appealing, but not addressing the deep, problematic orientation of researchers and their institutions.

Fortunately, the Impact Factor and its clones have sparked movements of contestation, including the San Francisco Declaration on Research Assessment (DORA).9 Several alternatives have been proposed, notably that of Larivière et al. (2016) “to refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment” (p. 1). However, while these alternatives may resolve some of the problems identified, they largely maintain the system created by the knowledge-based economy. Think, for example, of the altmetrics showing the number of downloads and hits, or the efforts of the Public Library of Science (PLOS) journal group to introduce its own index.10 Obviously, the portrait of a journal that emerges from such an index seems closer to reality, as it is based on the digital impact of open access articles. But it does not change the exclusionary effects specific to the system.

In response to these inadequacies, the Directory of Open Access Journals (DOAJ) proposes an alternative quality assessment framework, based not on citations or rankings, but on the good practices of journals. It prioritises transparency, especially in the peer review process, as an essential tool for quality assurance. It should be noted that this normative framework,11 based on 54 indicators, only covers fully open access journals, which are increasing their presence internationally due to the growing number of institutional and governmental open access policies.

If the system is not challenged, structural inequities in scholarly publishing and knowledge production between the Global South and North will continue to increase. Northern agendas on science and publishing for the knowledge-based economy will continue to control scholarly publishing practices in the Global North, with the Global South closely following suit, even as open access policies produce new opportunities.

In an attempt to address this challenge, we want to propose a very different solution, namely a polycentric plurilingual system (McGinnis, 1999) of publication. This approach would allow each part of the world to develop its own publishing system based on the priorities and needs of its populations, but to remain connected to others through the interoperable tools of a free and open Web database.

4 Towards a Polycentric System?

The concept of such a polycentric system is not difficult to understand. The scholarly community of the United States of America can, if it wishes, keep the impact factor system for its journals, since it reflects its reality. But this should not prevent the French-speaking, Spanish-speaking or Chinese-speaking scholarly communities from creating quality assessment systems for their journals that are more appropriate to their needs, languages, contexts and research concerns, systems that could cover any discipline, not only the social and human sciences. In Africa, Latin America or Asia, the different scholarly communities should likewise be able to create quality assessment systems that respond to their needs, concerns and languages. Ever-improving electronic translation tools will allow those who wish to explore the wealth of scientific worlds and epistemologies from the North and the South to do so. They will not only permit the coexistence of these systems on the Web, but will produce a true relational ecology of knowledges (de Sousa Santos, 2014). This should be accompanied by contextualised systems of research evaluation and affordable sharing of knowledge through open source online journals and platforms not owned or controlled by for-profit publishers.

For optimal performance, a system of open access publishing within national systems of innovation that support open science should be established. To achieve this, scholars from the Global South could start by abandoning Northern assessment criteria and instead develop their own contextualised criteria for science and publishing. This would create an environment conducive to open access and open science in general, while incentivising research that supports the Global South’s agenda in the paradigm of knowledge democracy.

5 Conclusion

In this chapter, we have seen that the metrics and rankings system not only hinders the development of universities’ and researchers’ awareness of their social responsibility, but also produces exclusionary effects in the Global South and in non-Anglophone European scholarly communities. Our second chapter in this book (Chapter 22) explores current initiatives from four different regions. In it, we ask, through different examples, whether it is possible to transform the knowledge economy that dominates the world of scientific publication towards a knowledge democracy within a polycentric system that takes local values and priorities into account.

Acknowledgements

Florence and Tom would like to thank the members of their Triangle 2018 team (https://trianglesci.org/): Aurélie Fichot, Zoé Aubierge Ouangré and Kamel Belhamel with whom some of these ideas were first developed, as well as Paolo Mangiafico, the organiser. Many thanks to Paul Gregory Murphy for the revision of our English.

Notes

1

Video can be considered a form of “animated” publication. Even Wikipedia, the latest reservoir of universal knowledge, only accepts written sources for its articles.

2

The growth and, in some countries, the obligation of open access has led these for-profit platforms to devise another source of funding: article processing charges (APCs) billed to researchers, mostly paid from public funds in Northern countries.

3

Living Knowledge is the name of the International Science Shop Network (https://www.livingknowledge.org).

4

These include Elsevier, Springer Nature, John Wiley & Sons, Taylor & Francis and Sage Publications.

5

For example, the SCImago Journal Rank (SJR) indicator, based on Elsevier’s Scopus data; CiteScore, from Elsevier (since 2016); and the Source Normalized Impact per Paper (SNIP).

6

According to this survey, “only 14% of researchers agree that current metrics have had a positive impact on research culture, and 43% believe that their workplace puts more value on metrics than on research quality”.

7

If a French researcher is asked for an ideal list of journals, the list will contain titles in French and others in English. The same question put to an Italian will yield titles in Italian and titles in English. In each case, the list of English-language titles is likely to be the same, so that English-language titles stand out in the end, to the detriment of non-English-language journals.

8

This is why we will translate this chapter into French and Spanish, at the very least. To those who object that this would cost too much, we answer that APCs are also very expensive, without fighting any injustice.

11

DOAJ will, in the future, also monitor other data like ORCID, Open Citations, funder information, research institute information, etc.

References

  • Bollier, D., & Westron, B. H. (2014). The commons as a growing global movement. http://www.csrwire.com/blog/posts/1203-the-commons-as-a-growing-global-movement
  • Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7, 291. https://doi.org/10.3389/fnhum.2013.00291
  • Curry, S. (2012). Sick of impact factors. http://occamstypewriter.org
  • Escobar, A. (2011). Sustainability: Design for the pluriverse. Development, 54(2), 137–140. https://doi.org/10.1057/dev.2011.28
  • Gingras, Y. (2015). Drifts and pernicious effects of the quantitative evaluation of research: The misuse of bibliometrics. Nursing Research, 121, 72–78. https://doi.org/10.3917/rsi.121.0072
  • Hall, B. L., Dei, G. J. S., & Rosenberg, D. G. (2000). Indigenous knowledges in global contexts: Multiple readings of our world. University of Toronto Press.
  • Hall, B. L., Jackson, E. T., & Tandon, R. (2016). Knowledge, democracy and action: Community-university research partnerships in global perspectives. Oxford University Press.
  • Hall, B. L., & Tandon, R. (2017). Decolonization of knowledge, epistemicide, participatory research and higher education. Research for All, 1(1), 6–19. https://doi.org/10.18546/RFA.01.1.02
  • Hountondji, P. (1990). Scientific dependence in Africa today. Research in African Literatures, 21(3), 5–15.
  • Ioannidis, J. P. A. (2006). Concentration of the most-cited papers in the scientific literature: Analysis of journal ecosystems. PLoS ONE, 1(1), e5. https://doi.org/10.1371/journal.pone.0000005
  • Larivière, V., Kiermer, V., MacCallum, C. J., McNutt, M., Patterson, M., Pulverer, B., Swaminathan, S., & Taylor, S. (2016). A simple proposal for the publication of journal citation distributions. bioRxiv, 062109. https://doi.org/10.1101/062109
  • Law, J. (2015). What’s wrong with a one-world world? Distinktion: Scandinavian Journal of Social Theory, 16(1), 126–139.
  • McGinnis, M. D. (1999). Polycentricity and local public economies: Readings from the workshop in political theory and policy analysis. University of Michigan Press.
  • Monbiot, G. (2016). Neoliberalism – The ideology at the root of all our problems. The Guardian. https://www.theguardian.com/books/2016/apr/15/neoliberalism-ideology-problem-george-monbiot
  • Moulier-Boutang, Y. (2011). Cognitive capitalism. Polity.
  • Mvé-Ondo, B. (2005). Afrique: La fracture scientifique (Futuribles). https://www.futuribles.com/en/base/bibliographie/notice/afrique-la-fracture-scientifique-africa-the-scient/
  • Oelofsen, R. (2015). Decolonisation of the African mind and intellectual landscape. Phronimon, 16(2), 130–146.
  • Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. Cambridge University Press.
  • Pendlebury, D. A. (2009). The use and misuse of journal metrics and other citation indicators. Archivum Immunologiae et Therapiae Experimentalis, 57(1), 1–11. https://doi.org/10.1007/s00005-009-0008-y
  • Pendlebury, D. A., & Adams, J. (2012). Comments on a critique of the Thomson Reuters journal impact factor. Scientometrics, 92(2), 395–401. https://doi.org/10.1007/s11192-012-0689-6
  • Peters, M. A. (2007). Knowledge economy, development and the future of higher education. Sense Publishers.
  • Piron, F. (2011). La citoyenneté scientifique contre l’économie marchande du savoir. Un enjeu d’éthique publique. Éthique publique. Revue internationale d’éthique sociétale et gouvernementale, 12(1), 79–104. https://doi.org/10.4000/ethiquepublique.240
  • Piron, F. (2018). Postcolonial open access. In U. Herb & J. Schöpfel (Eds.), Open divide: Critical studies on open access. Litwin Books. https://corpus.ulaval.ca/jspui/handle/20.500.11794/16178
  • Piron, F., & Lasou, P. (2014). Pratiques de publication, dépôt institutionnel et perception du libre accès. Enquête auprès des chercheuses et chercheurs de l’Université Laval (Québec). http://www.bibl.ulaval.ca/fichiers_site/services/libre_acces/pratiques-de-publication-libre-acces.pdf
  • Santos, B. de S. (2014). Epistemologies of the South: Justice against epistemicide. Paradigm Publishers.
  • Sarr, F. (2016). Afrotopia. Philippe Rey.
  • Schöpfel, J., & Prost, H. (2009). Le JCR facteur d’impact (IF) et le SCImago Journal Rank Indicator (SJR) des revues françaises: une étude comparative. Psychologie Française, 54(4), 287–305.
  • Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal, 314(7079), 498–502.
  • Tourish, D., & Willmott, H. (2015). In defiance of folly: Journal rankings, mindless measures and the ABS guide. Critical Perspectives on Accounting, 26, 37–46. https://doi.org/10.1016/j.cpa.2014.02.004
  • Wellcome Trust. (2020). What researchers think about the culture they work in. Shift Learning. https://wellcome.ac.uk/sites/default/files/a-post-brexit-agreement-for-science-bruegel-wellcome-january-2020.pdf
  • Young, N. S., Ioannidis, J. P. A., & Al-Ubaydli, O. (2008). Why current publication practices may distort science. PLoS Medicine, 5(10), e201. https://doi.org/10.1371/journal.pmed.0050201

Socially Responsible Higher Education

International Perspectives on Knowledge Democracy
