Chapter 18 Algorithmic Truth Theory (ATT)

In: Complexity and Simplicity
Author: Roland Zarzycki

Abstract

This article sets out to analyse what a fact is, and how the simplicity behind this notion has recently been challenged by the emerging phenomenon of post-truth. With reference to the new social-media virtual reality, in which the reliability of information is backed by the quantity, not the quality, of its sources, it compares the standard approaches to truth (the correspondence theory, coherentism, the redundancy theory, and the pragmatic theory). Information bubbles, propaganda bots, and crowd opinions have become so natural in the modern world of the internet that the concept of a fact seems to require reconsideration. It must be confronted with the blurring of the foundations of our current knowledge and science, insofar as these are affected by commodification and the uncontrolled growth of information. An attempt is made to pinpoint the main mechanisms underlying these processes, and to provide a simple model explaining the differences between past, current and expected future perceptions of facts. On this basis, it is claimed that the complexity and interdependence of information are gradually increasing, and that coherentism is therefore emerging as the narrative that most accurately models perceptions of truth by various e-communities. Thus, it can be argued that the perception of facts will soon require unprecedented and extraordinary skills of a sort that we need to start developing.

1 Towards the Multi-Dimensional Complexity of Information or How Does Post-Truth Challenge the Simplicity of Facts?

In the 20th century the issue of the social nature of reality resounded with full force, expressed perhaps most simply in a bold statement by Wittgenstein (2021): “the limits of my language mean the limits of my world.” It was then strengthened and multiplied by a varied array of research conducted in the fields of philosophy of language (Austin, 1975), literary studies (Rorty, 1989), anthropology (Whorf, 2012), sociology (Berger & Luckmann, 1971) and others. The baseline intuition underlying this concept is that the world appears as it does, and that certain objects, such as a hundred-yuan bill, are significant in our lives, because of the symbolic universes that we construct through the semantic power of our languages. While many would agree that this is indeed the case with the banknote, it seems much more controversial to say that the existence of the sun depends on our language.

In a somewhat surprising way (for the contemporary fast-thinking world), the old philosophical questions of existence, truth and language are not only interdependent, but also intertwined with everyday life. This was interestingly illustrated by Pinker (2007) with the example of the 9/11 terrorist attack(s) on the World Trade Center. A huge controversy arose from a seemingly simple question: should this event be considered one single attack or two separate ones? Even though it might seem like philosophical hair-splitting, Pinker convincingly argues that the exact opposite is true: it is a purely pragmatic, legal, and painfully material issue, with a clear interpretation expressible in American dollars. Depending on whether the attack on the World Trade Center was a single event or two separate events, the insurance payout and subsequent financial compensation owed to the owner was either single or doubled. The deep importance of language for the architecture of reality is manifested in this basic picture.

Another fundamentally relevant observation was brought to our attention by Fricker (2007) together with her concept of epistemic injustice. From an epistemological perspective, we can observe, following Fricker, that until the concept of “sexual harassment” was formed, victims of sexual harassment found it difficult to conceptualize their problem, let alone express it in a convincing way.

Nevertheless, though still questioned, discussed and controversial, the thesis of the linguistic-social construction of reality1 signifies an important shift from the XVIII-century intuitions of Kant to an understanding of the language–reality nexus by contemporary sociologists. However, the XXI century brings yet another transformation. A significant and growing share of our life activity is actually moving to cyberspace, which is controlled, in a broad sense, by algorithms. These, in turn, although created by humans, contribute to the emergence of phenomena that seem to elude human understanding and supervision.

In consequence, parallel to the change in the very understanding of what reality is, and after recognizing the cognitive limitations of mankind, it is the concept of truth that requires thorough reconsideration. For even if contemporary science provides us with ever more accurate models, descriptions, and predictions of reality, as long as reality is linguistically-socially constructed, we remain dependent on human hermeneutics in any attempt to make these descriptions and predictions meaningful (Glynos & Howarth, 2007). In other words, language and reality are consubstantial, and truthfulness is just another aspect of the “explaining the world” language games.

Faced with this paradigm change, as well as the ontological challenges related to the overwhelming role of algorithms, the meaning of truth calls for reinterpretation. The growing dependence of our cognitive schemes on the algorithm-driven digital sphere, especially due to the recent rapid development of Artificial Intelligence (AI), prompts the next phase of reflection on the nature of reality itself. The fact that some negligence has already taken place in this regard can easily be spotted by observing the spreading phenomenon of post-truth, which significantly reduces the quality of the human condition (Fuller, 2018; Fridlund, 2020). The transition to algorithmically constructed realities is the main point of interest of this paper. Our main goal is to conceptualise how the mechanisms describing these new ontological-epistemological tensions impact our traditional approaches to truth. In the following sections of the paper, we describe in what sense the complexity of reality increases. We then explain how this increasing complexity inevitably triggers changes in the architecture of power and leads to an ever deeper dependence of the individual on structural factors. In the last section we formulate a first outline of the Algorithmic Truth Theory (ATT), which provides a narrative that effectively and pragmatically describes the changes we are subject to.

2 Complexity as a Discursive Trick

Typically, physicists convince us of the existence of an objective world that they apparently observe and describe by means of mathematical models and physical experiments. Since the mediation of this process of cognition through language is beyond question, phenomena appear simple if their descriptions can be simplified by the application of mathematical instruments, and complex otherwise. Moreover, there is a specific academic discipline, the theory of complexity, which explains the meta-phenomenon of complexity. Within this paradigm, complexity becomes a resultant of the temporality and non-linearity characterising a given process, as argued by Coveney and Highfield (1995). At the same time, however, the models created by physicists themselves generate tremendous complexity that is literally projected onto the world. This peculiar Ptolemaic syndrome provides vivid examples of how our perceptions of complexity are subordinated to the structures of discourse, and thus also to authority and power.

Within the materialistic approach, complexity might also be expressed in terms of the unpredictability of the cumulative effect of the actions of individual agents, which leads to the spontaneous emergence of complex, organised structures, even though each of the atomic actions is governed by simple behavioural rules (Kauffman, 2013). Yet another, and much older, conceptualization comes from Kolmogorov (so-called Kolmogorov complexity), where the complexity of an object is defined by the length of the shortest computer program (in a given language) that generates the object as output (cf.: Chaitin, 1987).
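This definition admits a compact formal statement (a minimal sketch in conventional notation; the symbols U, p and x are the standard ones, not taken from Chaitin’s text): for a fixed universal machine U, the Kolmogorov complexity of an object x is

K_U(x) = \min \{\, |p| : U(p) = x \,\},

that is, the length of the shortest program p that makes U output x. By the invariance theorem, switching to another universal machine changes K_U only by an additive constant, which is why the choice of programming “language” matters less than one might expect.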

Somewhat surprisingly, these considerations lead to the ideas of algorithmic entropy and freedom. It is significant that an increase in the complexity of the model meant to explain a given phenomenon typically and structurally evokes the introduction of a new, external narrative element semantically equivalent to freedom. One might recognise this phenomenon as a link between the practices of modern scientists and the language spells used by the Maori people (Lévi-Strauss, 1950). The theoretical position developed by Penrose in The Emperor’s New Mind, within an old XX-century debate on the non-algorithmic complexity of human consciousness and the possibilities of quantum computers, can serve as an interesting illustration of this meta-methodological mechanism (Penrose, 1989).

On the other hand, let us consider the following example: a patient goes to the doctor and learns that his/er incomprehensible and disturbingly complex set of ailments has a proper name, say “conjunctivitis”. Instantly the phenomenon becomes simple. Not only in the sense that s/he is given a prescription and a few simple manoeuvres reduce the ailment, but also in the sense that the whole sensation starts to appear simple. It can be observed here how naming has kept its magical power over reality (cf.: Bodenhorn & Vom Bruck, 2006).

The question of complexity can be explained according to alternative cognitive traditions functioning in the scientific discourse (cf.: Mainzer, 1994). Referring to the semiotic model of Peirce (Atkin, 2023), we could apply the category of complexity (and simplicity, respectively) to the sign (the scientific model adopted), to the very object of our cognition, or to the ramification of the sign. Even such a simple procedure would introduce more transparency into the nature of complexity, which is of great importance for further understanding of the structure of truth. What appears to be complex or simple within a given sign system and its ramification may not be so after we change the system or the ramification method. This understanding of complexity allows one to recognize that the qualification of a given model as complex is a matter of discourse. Accordingly, this is yet another dimension in which discourse impacts the perception of truth. In the following sections, we will see how this perception is affected by complexity.

3 Complexity as Unpredictability

The development of culture and technology has always led to ever more sophisticated techniques of oppressing the body and soul. Rituals and fantasies armed with new tools affect the psychology of individuals and the sociology of interpersonal relations. The more complex these technological tools are, the more difficult it is to realise the consequences of their application. Further down the line, however, this quite often translates into destructive practices, from the mutilation of bodies, through plagues of depression, to the destruction of self and others. An emancipated wo/man, liberated from the bonds of his/er biology, employs technology to dominate nature, while rarely being interested in critically assessing his/her own power (Adorno & Horkheimer, 1944; McPhee, 1989; Agamben, 1998). It can also easily be observed how this effect has only been amplified by the development of the culture of narcissism underlying every individual empowerment (Lasch, 2018).

The complexity of the technological tools that were put into mass use in the XX century already exceeded the human ability to predict even the mid-term effects of their application. The scale of imprudence in satisfying momentary human desires, fueled by rampant consumerism, turned complexity into a key source of the affliction of the XXI-century wo/man. This unpredictability is fundamental to all areas of daily existence, from the sphere of intimate relationships, through the professional sphere, to the very understanding of reality and the meaning of life.

One illustration of this phenomenon is the complexity of sexual entanglement with digital pornography. The spread of video pornography at the end of the XX century had a highly negative impact on interpersonal sexual relationships (Fagan, 2009). As a substantial body of research shows, digital solutions lead to a gradual degradation of sexual life at an alarming rate (cf.: Lemma, 2021). Current research on sexual interaction in digital contexts (SIDC) shows three basic patterns of using technology in sexual life: (1) mediation in arranging physical sexual interaction; (2) realising sexual interaction in digital space; (3) deriving sexual satisfaction from a machine (Brand et al., 2021). The scope and intensity of these practices show how ever more complex and sophisticated technology, placed at our disposal and addressed to our desires, creates less and less predictable environments.

Love and belonging can serve as another example of a basic human need (Maslow, 1943) affected by complexity. The digital world clearly addresses it, but in quite a specific way, based on mechanisms oriented towards competitiveness, collecting and ostentation embedded in social media platforms. Hungry for relationships, internet users instinctively search for other human beings and for emotional reinforcements such as attention, touch and care (Agger, 2015). This turns love and belonging into a highly lucrative business, which translates into purely financial profits for the intermediaries and investors (cf.: Seebach, 2017).

Unfortunately, the complexity of the relationships in which we function in the digital world, driven by the need to increase our recognisability and measured by likes, reach and numbers of friends, makes it difficult to assess the real quality of these relationships. What is even worse, this complexity prevents an individual from giving enough attention to any of these digital contacts to build a satisfying relationship. The structure of human bodies and minds allows them to pay attention to dozens of individuals, but when they have to pay attention to thousands, it turns out to be insufficient. Hence, typically, users of social networks receive only a digital semblance of closeness, which is not followed by human touch, leaving their bodies in a state of deprivation. And although modern technology is working fast to meet this demand too, it will not happen that quickly (Farroni et al., 2021). Due to the complexity of these relations, the overall impact and long-term effect of such a substitution seem difficult to predict. In this context, it is also not surprising that so many people in developed economies choose to escape into nature, seeking refuge from this complexity in the backwoods or in mountain cottages (cf.: Karsten, 2020).

Regardless of the current shortcomings of technology, the market’s attempt to replace family members, friends and peers with digital influencers is only a matter of time. In this context, it is worth asking already today whether the meaning of true love is going to be defined by algorithms. Not in the sense of a textbook definition, but in the sense of the actual practices of teenagers who live out their youthful loves with digital substitutes.2

4 Overwhelmed by the Complexity

Another instance of complexity creeping into daily life is the number of legal contracts people are forced to sign and regulations they need to accept. Clearly, one does not sign these obligations under duress, but even if it is not direct coercion, it is certainly structural coercion. Without these mass approvals, any successful functioning in modern societies would be massively hampered. Statistically speaking, the documents are accepted thoughtlessly. A Norwegian consumer agency campaign found that the average Norwegian uses 33 apps on their smartphone (BBC, 2016). During a stunt staged to mark the publication of the research report, users read aloud the terms and conditions of all these apps, which together run longer than the New Testament.

Apps’ terms and conditions are just the tip of the iceberg. Another widely known absurdity is the need to accept cookies every single time an internet user enters a new website. This irrational regulation, enforced by the European Union allegedly for the security of users, is actually a purely cynical act of privatising responsibility (Zarzycki, 2021). Interestingly, at the current stage blind-signing has become a deeply internalised common practice and an open secret at the same time. Lawyers and governments turn a blind eye to this fact, but actually being a responsible citizen and consumer would require spending the greater part of our lives reading the contracts and bylaws that govern our actions.3

On a macro scale, contemporary civilization seems unable to cope with the consequences of the complexity of the economic and legal systems it has created (cf.: Rodrigues, 2020). With the current size of the global market, royalty regulations drive stratification and create negative socioeconomic impacts. Similar global leverages apply to the incomes of famous athletes, music stars, sports event producers, media stations and music labels, which massively contribute to the growing scale of global inequalities (cf.: Piketty, 2014). The debate about unfair pay gaps, however, is always diluted by the out-of-control complexity of the global system (Lowrey, 2019; Leicht, 2020). Even if this situation is not universally accepted (Kelly-Smith, 2002), mere dissatisfaction does not necessarily mean that the mechanisms underlying the injustice and inadequacy of the current system have been properly recognized or addressed. Moreover, even within the framework of the scientific discourse, there is a consistent attempt to separate economic and ethical threads (Wheeler, 2019), presumably for the sake of simplicity.

On the other hand, as argued by Goldfarb (2006), it might seem that politics depends solely on the decisions of the big players. This picture is simple but flawed, or rather, flawed because oversimplified. The constellation of millions of tiny conversations at kitchen tables all over the world, where families and friends talk about politics, is too complex to enable envisioning its overall effect and significance. This does not mean, however, that these millions of chats are not important. As argued in The Politics of Small Things, eventually they do change the world, as was the case with the great political transformations of the nineties in Central and Eastern Europe (Goldfarb, 2006). If Goldfarb is right, then the issue of grasping the complexity of these phenomena becomes fundamentally important for the condition of XXI-century democracies in yet another sense.

5 Machines and Power

In the digital world of information overload, those who have the capacity and appropriate tools to exploit this wealth will undoubtedly increase their advantage over the rest. More precisely, those who are able to create sense architectures that transform amorphous information into meaningful narratives will gain control over discourse and truth (Zarzycki, 2013), which is basically just an XXI-century update of Foucault’s insights on the power–knowledge nexus (Foucault, 1980; cf.: Aberra, 2012). Geopolitically speaking, access to data is of no value unless there is sufficient hardware and software at the disposal of a given state or entity to transform the data into information, then into knowledge, and finally into wisdom (UNCTAD, 2021).

As the report on the digital economy by UNCTAD (2021) shows, while we already observe serious economic stratification in the world, the digital stratification following (and clearly reinforcing) the former is even more striking. Countries without access to appropriate infrastructure and competence are powerless when facing exabytes of available data that mean nothing to them. Thus, digital stratification is visible not only at the level of internal divisions within a given society, but to an even greater extent on the scale of global differences among states. In terms of the capacity to exploit and benefit from the data-driven economy, the advantage of China and the United States is huge and growing (UNCTAD, 2021). As we read in the report,

together, they account for half the world’s hyperscale data centres, the highest rates of 5G adoption in the world, 94 percent of all funding of AI start-ups in the past five years, 70 percent of the world’s top AI researchers, and almost 90 per cent of the market capitalization of the world’s largest digital platforms. (UNCTAD, 2021)

Moreover, the largest digital platforms originating from these two countries are massively investing in building their technological advantage, especially in the area of AI. As UNCTAD (2021) notes, “they have become global digital corporations with planetary reach, huge financial, market and technology power, and control over large swathes of data about their users”. As is also emphasised, it is precisely due to this asymmetry that we need global governance of data and cross-border data flow regulations, which would guarantee all states an equitable share in the profits of global digital development (UNCTAD, 2021). At the same time, however, it seems that the attempt to regulate this area, as well as the recent desperate actions aimed at limiting AI in connection with data privacy concerns (McCallum, 2023), is contrary to the particular interests of the main actors, and therefore also to the current state of political will.

It is precisely because the current complexity of digital processes far exceeds the capacity of the available technological tools to simplify it that states, societies and individuals are dependent on the publicly available algorithms offered by the digital giants, mainly from China and the US. Coming back to the practice of naming, this phenomenon has also received its name, which, not without reason, indicates its ideological origin. The concept of dataism applies to all those contexts where the “information flow” becomes the “supreme value” (Brooks, 2013). Again, one could argue that it is just another update to Foucault’s general theory of knowledge-power (Foucault, 2008).

Success in the modern world requires a specific conversion of financial capital into highly specialised technological capital, and more precisely, into specific digital means of production, which make one capable of conducting structural analyses of large data sets, thus unlocking the above-mentioned potential of digital wealth. In a natural way, market processes catalysed the accumulation of these new forms of capital, and, in turn, the emergence of the power elites and the netocracy, as diagnosed two decades ago by Bard and Söderqvist (2002). In this context one can also recognize the inner transformations of capitalism itself, where power starts being exercised by intelligent algorithms, or rather by the owners of these algorithms (Kreft, 2018). The sense of helplessness in the face of the global complexity of the digital sphere is shared by many, from ordinary surfers, through influencers and specialists, to companies and governments. Moreover, it is structurally necessary that any possible doubts in this regard remain ignored, as the theory of agnotological capitalism claims (Betancourt, 2016). Taking into account how limited our understanding is of the way these algorithms operate, especially those that are black boxes, their actual power remains unnoticed (Beer, 2009).

It is worth noting that algorithms are already being used on a large scale within truth-determination protocols. For example, the human semi-slave labour aimed at controlling fake news is now gradually being automated by means of intelligent algorithms (Kreft, 2018). The political problem, however, is that the fashionable contemporary fact-checking algorithms are not only written by specific people, representing specific views of their employers, but are also owned by these employers, who have their own business and political interests. The filtering of information, under the guise of ensuring security, leads to an uncontrolled spread of censorship. This, in turn, can have even more disastrous consequences for the quality of communication than letting people navigate the waves of information on their own. One of the reasons supporting this view is that the rationality of the digital giants does not always go hand in hand with the needs of the democratic world order; hence it is not about the technological efficiency of the algorithms but rather about political control and liberty (cf.: Bilić, 2018).

Eventually, an intriguing illustration of this type of discourse-control abuse is provided by the policies adopted by various digital actors that decided to censor discussions of the recent COVID-19 pandemic. Any inconsistencies with the mainline narrative of the undisputed rightness of mass vaccinations promoted by selected experts were brutally removed from many social media platforms. Clearly, it is beyond the scope of this paper to analyse who was right in this discussion, but what is significant is that any attempts to discuss the topic were ruthlessly suppressed, which deserves special attention. Moreover, this unprecedented and anti-democratic digital silencing campaign was implemented with the consent of all major actors, including civic societies. One could reasonably argue that COVID-19 proved that discourse and truth control in the free world can happen at any time, without even triggering any serious resistance.

6 Conversations with ChatGPT

One of the cutting-edge research techniques, which has recently gained a lot of recognition and popularity, is AI-supported research, facilitated by the availability of freely accessible tools such as ChatGPT. What is significant in this context is that even though ChatGPT is a relatively transparent algorithm, the very complexity of its operations makes it not fully understandable to a human being, which in turn can be perceived as one of the generic moments of losing critical control over information processed by AI. Some insights into the role of intelligent algorithms in the functioning of reality and truth emerge from the following conversation, which was conducted by the author (ChatGPT, 2023).

Q1: What is the role of algorithms and intelligent algorithms in the contemporary world for determining what is true as well as people’s beliefs about what is real and what is not?

ChatGPT: Algorithms and intelligent algorithms play a significant role in shaping people’s beliefs about what is real and what is not in the contemporary world. With the vast amount of information available on the internet, it can be challenging to discern what is true and what is not, and algorithms are increasingly being used to assist with this task.

Intelligent algorithms are designed to learn and adapt to new information, which makes them particularly useful for determining what is true. They can analyze4 large datasets and identify patterns, which can help to distinguish between real information and fake news or misinformation. These algorithms can also help to identify biases in data and sources, which can help to ensure that people’s beliefs are based on accurate information.

However, algorithms can also perpetuate biases and reinforce existing beliefs if they are not designed and trained correctly. If algorithms are trained on biased data or are programmed with biased rules, they can perpetuate those biases and further entrench false beliefs. It is therefore essential to ensure that algorithms are designed and trained with fairness and accuracy in mind.

Overall, the role of algorithms and intelligent algorithms in shaping people’s beliefs about what is real and what is not is complex and multifaceted. While they can be useful tools for determining truth and combating misinformation, they must be used with caution and care to ensure that they do not perpetuate biases or reinforce false beliefs.

Q2: Would you agree that in the same sense that Berger and Luckman claim that reality is socially constructed, today’s reality is algorithmically constructed?

ChatGPT: Yes, I would agree that today’s reality is algorithmically constructed to a certain extent. In the same way that Berger and Luckman argued that reality is socially constructed, algorithms and intelligent algorithms are increasingly playing a role in shaping our perceptions of reality.

Algorithms are used in a wide range of applications, from social media feeds and search engines to financial markets and medical diagnoses. They are designed to analyze data and make decisions based on that data, which can have a significant impact on how we perceive the world around us.

For example, the algorithms used by social media platforms to determine what content to show users can create filter bubbles, where people are only exposed to information that reinforces their existing beliefs. This can create a distorted view of reality, where people are only exposed to information that confirms their biases and are not exposed to alternative perspectives.

Similarly, algorithms used in financial markets can create feedback loops that amplify market trends, leading to a distorted view of the economy and market conditions.

Overall, algorithms are playing an increasingly important role in constructing our perception of reality. While they can be useful tools for analyzing data and making decisions, it is important to recognize their limitations and potential biases to ensure that they do not distort our understanding of the world around us.

Q3: how can a human control whether the mechanisms accompanying the aggregation, segregation and processing of information by intelligent algorithms do not lead to distortion of the image of reality, taking into account the extent to which many people uncritically use automatically generated intelligent suggestions?

ChatGPT: Controlling the mechanisms accompanying the aggregation, segregation, and processing of information by intelligent algorithms to prevent the distortion of the image of reality is a complex task. However, there are several steps that can be taken to minimize the risk of distortion:

  1. Ensure transparency: One way to control the impact of intelligent algorithms is to make their decision-making processes more transparent. This could involve requiring companies to disclose the algorithms they use and how they make decisions, allowing people to understand how the information they receive is being curated.

  2. Encourage critical thinking: It is important to educate people about the limitations of intelligent algorithms and encourage critical thinking about the information they receive. This could involve providing resources for fact-checking and teaching media literacy skills.

  3. Develop ethical guidelines: The development of ethical guidelines for the use of intelligent algorithms could help to minimize the risk of distortion. These guidelines could be developed by industry organizations, academic institutions, or government agencies and could outline best practices for the use of intelligent algorithms.

  4. Promote diversity: The biases of intelligent algorithms can be reduced by promoting diversity in the development and training of algorithms. This could involve increasing the diversity of the teams that develop and test algorithms and ensuring that algorithms are trained on diverse datasets.

  5. Encourage algorithmic accountability: Ensuring that there is accountability for the decisions made by intelligent algorithms can help to prevent distortion. This could involve creating mechanisms for people to challenge decisions made by algorithms and for companies to explain how decisions were made.

Overall, controlling the mechanisms accompanying the aggregation, segregation, and processing of information by intelligent algorithms requires a multi-faceted approach that involves transparency, education, ethics, diversity, and accountability.

Considering how mechanically ChatGPT simulates being intelligent (Wolfram, 2023) on the one hand, and understanding how saturated it is with stereotypical, human-like thinking on the other (Airoldi, 2022), we should not be surprised that the answers to the above questions are politically correct, polite and apparently reasonable. ChatGPT does not provide us with anything new regarding the discussion on algorithmically constructed reality and truth, because its thinking seems to be anchored in what is known and safe. What is particularly interesting, however, is the algorithm’s belief that there is some truth that it can discover through “pattern identification”. Thus one of the main conclusions from this discussion is that ChatGPT did not read Berger and Luckmann carefully enough. Secondly, ChatGPT believes that truth is about recognizing patterns.

7 The ATT

The methods that people apply to judge what is true are often characterised by reference to some of the popular theories of truth, including the correspondence theory of truth, the coherence theory of truth and the pragmatic theory of truth. At the same time, as already stated, the development of technology and the increase in the complexity of communication games make it difficult to apply even the most basic truth criteria, apparently making these well-known theories less and less reliable.

Though sceptics warned us long ago that our senses can deceive us, the difficulty of distinguishing truth from appearance has never been as great as in the era of deep fakes. The ability to imitate a recording, the behaviour or the voice of a human being, as well as the application of much more prosaic disinformation methods, prevents individuals from efficiently verifying whether a given statement is true by determining its connection to reality, thus making the correspondence theory of truth obsolete. Modern wo/man encounters similar problems when trying to apply the coherence theory of truth. A multitude of opinions generated by bots and trolls of various sorts and origins, information bubbles and echo chambers, big data that cannot be easily analysed without advanced skills and tools, and dedicated portals or research institutes financed by large interest groups are just examples of phenomena that have a negative impact on the truth-verification process. Even the pragmatic theory of truth, which hypothetically exempts an individual from the necessity of external verification, making it subordinate to the internal interest of the individual, becomes extremely difficult to apply in the VUCA (volatile, uncertain, complex, ambiguous) world. Liquid modernity, which calls into question the very meaning of existence, in conjunction with the growing technical unpredictability of actions due to the high complexity of social, economic and communication structures, makes the pragmatic approach an increasingly ambiguous criterion.

Clearly, in practice people do not follow the strict guidelines of any particular theory. Human behaviour can be described in accordance with the adaptationist theory of truth, which Godler (2020) defines as creative manoeuvring between various more and less accurate beliefs, information and sources, depending on the available resources, in particular time. Although it is journalistic practice that Godler characterises in this way, there are many indications that it can be adapted to accurately describe the activities of an average person.

It seems that the complexity of the surrounding world does not encourage people, who rather prefer to remain in a state of active ignorance, to explore the sources of information (Hagmann et al., 2020). The fact that it is becoming more and more difficult for people to determine the veracity of incoming information does not mobilise them to increase their efforts in this area, but rather triggers resignation. This coincides with the general findings of Betancourt (2016), which indicate that this type of ignorance is a structural precondition for maintaining the current global market system. The feeling of not being able to determine the truth, as well as the feeling of the consequences of using unverified or incorrect information, fosters attitudes of dishonesty and deception (Keyes, 2004). Importantly, at an individual level, this also contributes to the spread of an attitude of indifference to truth, which is the essence of the phenomenon of post-truth (Gudonis, 2017).

On the other hand, liberal societies have ingrained needs for freedom and civic agency based on some sort of determinable truth. For these reasons, among others, the activities of Cambridge Analytica, which offered services in the area of increasing the effectiveness of political persuasion, aroused so much controversy, indignation and accusation of manipulation. It seems that general opinion accepts a situation in which citizens err and their sources of opinion are selective, as long as this erring is not pre-organised by someone. The difficulty here is that the massive outburst of fact-checking initiatives we are currently witnessing in response to the transformation of truth structures seems to ignore the fact that people’s views are based not on facts but on opinions. Beliefs and worldviews result from the narratives that are convincing for a given person, and the interpretation of the truthfulness of facts is subordinated to these preferred narratives. Ultimate truth as an unshakable point of reference for our actions should be interpreted rather in terms of mankind’s self-fantasies.

The coexistence of people and machines changes the perception schemes of both parties, but also the structure of expectations of human agents, especially in circumstances of information overload, which people cannot easily cope with for both technical and emotional reasons (Andrejevic, 2013; Carr, 2014). On the one hand, algorithmic logic impacts society in terms of social integration (Ninno & Totaro, 2014); on the other hand, Airoldi (2022) clearly shows why this phenomenon should be perceived as a progressively reinforcing feedback loop. Clearly, the significance of the economic-structural context of these processes would be difficult to overestimate. As authors such as Eubanks (2018) and Zuboff (2019) show, what we observe is a kind of modern version of class conflict elevated to its digital incarnation. This is obviously reflected in the capitalist-consumer orientation of contemporary intelligent machines, which are programmed and trained according to a presupposition structure appropriate to the neoliberal worldview framework (cf.: Mager, 2012).

The growing technological capabilities of these machines in the area of control, supervision and verification of truthfulness run parallel to the decline of exactly the same competences on the part of human beings. Thus it seems natural that human beings choose to rely on the effects of the technology they themselves have developed over the centuries. The current trend towards variously understood transhumanism emerges as a similarly natural phenomenon. On the wave of a peculiar mixture of enthusiasm, indolence and something that Broussard (2018) identified as technochauvinism, control over sense-making processes is transferred to machines. And although a lot has already been said about the rising power of algorithms, it is the extension of this power to this particular sense-making dimension that marks the most fundamental concession, because, according to Berger and Luckmann, it is ontological. In this sense, in the first decades of the XXI century, we observe a shift from the social creation of reality to its algorithmic creation.

Finally, the concept of the algorithmic theory of truth can be formulated: to be true (and real) is to be a result or indication produced by one of the algorithms that have previously received a popular mandate to provide people with reality. Clearly, the power of algorithms in this area will be greatest within the digital sphere, but the interdependency of the biological and digital realities will easily transfer the senses created within the latter onto the structure of the former.
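A minimal formal sketch of this definition may help fix ideas (the notation below is introduced purely for illustration and is not part of any established formalism): let \mathcal{A} denote the set of algorithms holding a popular mandate, and let \mathrm{out}(a) denote the set of results and indications produced by algorithm a; then

\mathrm{True}(p) \iff \exists\, a \in \mathcal{A} : p \in \mathrm{out}(a).

On this reading, truth is no longer a relation between a statement and the world (correspondence) or among statements (coherence), but a relation between a statement and a socially mandated family of generating procedures.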

8 Conclusions – Make It Simple

In a constant process, human beings expand their vision, deepen their understanding, create senses, and thus contribute to the increasing complexity of the real; simultaneously, they strive to simplify it in order to afford themselves a sense of epistemic relief and ontological security. Language is being impoverished and philosophy is pushed aside for the sake of simplicity and short-term efficiency, which tends to become the only currency that counts in a market-driven world. The point at which machines and algorithms are powerful enough to take over the burden of caring for truth and reality has been reached. It can therefore be observed how people gradually shed these inconvenient responsibilities at the price of giving up control over the creation of the senses of their own world. This newly created algorithmic reality begins to live its own life and generate its own meanings, the complexity of which is far beyond human cognitive capabilities. The dependence on algorithmically determined truth becomes more and more explicit, and technological assistance seems unavoidable. In the next step, humans will presumably learn to interpret the world in a way that will prevent them from suffering the cognitive dissonances resulting from potential mismatches between their own observations and the indications of the algorithms or other digital authorities.

Perhaps the most significant aspect of these changes is that algorithms are not independent entities originating from a different world order, but rather economically controlled tools. In practice, this means that behind the technological enthusiasm there is a ruthless structure of power. The ideals of a free democratic world do not allow a free man to be controlled. However, in a world full of algorithms controlled by institutions, companies or the more affluent members of society, coexisting with human beings indistinguishably and on equal communication terms, discourse and truth are going to be controlled on an unprecedented scale.

This dialectical tension between simplicity and complexity demands to be overcome. Is it possible for the things essential to human existence to remain simple? Could individuals be granted the right to simplicity, in accordance with the postulates of epistemic justice? Interestingly, even intelligent algorithms do not cope equally well with all dimensions of reality. Critical thinking, which is a competence of breaking patterns, seems to be particularly resistant to algorithmization. Thus it might be crucially important that human criticality remain a key element of the criteria of truthfulness to which we subordinate our reality.

This text is conceptual and paradigmatic, which constitutes its natural limitation. However, if the phenomena and mechanisms identified in it are considered significant, the investigation is certainly worth continuing and following up with more specific research, including empirical studies of the complexity-truth nexus.

Notes

1. Not necessarily in the specific sense of the Whorf-Sapir thesis.

2. As was brilliantly illustrated, in a slightly different context, by the movie Her (Jonze, 2013).

3. Much hope is now pinned on harnessing artificial intelligence to simplify and automate legal regulations.

4. Here and elsewhere there is an inconsistency between British and American English spelling, which results from the fact that ChatGPT used American spelling during the conversation.

References

  • Aberra, T. (2012). Michel Foucault power knowledge nexus: Critical analysis and its relevance globalization and current issues of Africa. LAP LAMBERT Academic Publishing.

  • Adorno, T. W., & Horkheimer, M. (1944). Dialectic of enlightenment. New York, NY: Social Studies Association.

  • Agamben, G. (1998). Homo sacer: Sovereign power and bare life. Stanford, CA: Stanford University Press.

  • Agger, B. (2015). Oversharing: Presentations of self in the Internet age. Routledge.

  • Airoldi, M. (2022). Machine habitus: Toward a sociology of algorithms. Cambridge: Polity Press.

  • Andrejevic, M. (2013). Infoglut: How too much information is changing the way we think and know. Routledge.

  • Atkin, A. (2023). Peirce’s theory of signs. In E. N. Zalta & U. Nodelman (Eds.), The Stanford encyclopedia of philosophy (Spring 2023 ed.).

  • Austin, J. L. (1975). How to do things with words (2nd ed., M. Sbisà & J. O. Urmson, Eds.). Oxford: Oxford University Press.

  • Bard, A., & Söderqvist, J. (2002). Netocracy: The new power elite and life after capitalism. London: Pearson Education.

  • BBC. (2016). Norway consumer body stages live app terms reading. https://www.bbc.co.uk/news/world-europe-36378215

  • Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985-1002.

  • Berger, P., & Luckmann, T. (1971). The social construction of reality. Penguin University Books.

  • Betancourt, M. (2016). The critique of digital capitalism: An analysis of the political economy of digital culture and technology. Brooklyn, NY: Punctum Books.

  • Bilić, P. (2018). A critique of the political economy of algorithms: A brief history of Google’s technological rationality. tripleC: Communication, Capitalism & Critique, 16(1), 315-331.

  • Bodenhorn, B., & Vom Bruck, G. (2006). “Entangled in histories”: An introduction to the anthropology of names and naming. In G. Vom Bruck & B. Bodenhorn (Eds.), An anthropology of names and naming (pp. 1-30). Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511499630.002

  • Brand, M., Döring, N., Krämer, N., Kruger, T. H. C., Mikhailova, V., & Vowe, G. (2021). Sexual interaction in digital contexts and its implications for sexual health: A conceptual analysis. Frontiers in Psychology, 12, 769732. https://doi.org/10.3389/fpsyg.2021.769732

  • Brooks, D. (2013). The philosophy of data. The New York Times. bit.ly/45yGjFy

  • Broussard, M. (2018). Artificial unintelligence: How computers misunderstand the world. Cambridge, MA: MIT Press.

  • Carr, N. (2014). The glass cage: How our computers are changing us. New York, NY: W. W. Norton & Company.

  • Chaitin, G. J. (1987). Algorithmic information theory. Cambridge: Cambridge University Press.

  • ChatGPT. (2023). Interviewed by R. Zarzycki. OpenAI platform.

  • Coveney, P., & Highfield, R. (1995). Frontiers of complexity: The search for order in a chaotic world. Faber and Faber.

  • Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York, NY: St. Martin’s Press.

  • Fagan, P. F. (2009). The effects of pornography on individuals, marriage, family and community. Research Synthesis, RS09K01.

  • Farroni, T., Longa, L. D., & Valori, I. (2021). Interpersonal affective touch in a virtual world: Feeling the social presence of others to overcome loneliness. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.795283

  • Foucault, M. (1980). Power/knowledge: Selected interviews and other writings 1972-1977 (C. Gordon, Ed.; C. Gordon et al., Trans.). New York, NY: Pantheon Books.

  • Foucault, M. (2008). The history of sexuality. Penguin.

  • Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780198237907.001.0001

  • Fridlund, P. (2020). Post-truth politics, performatives and the force. Jus Cogens, 2, 215-235. https://doi.org/10.1007/s42439-020-00029-8

  • Fuller, S. (2018). Post-truth: Knowledge as a power game. Anthem Press. https://doi.org/10.2307/j.ctvgd30v

  • Glynos, J., & Howarth, D. (2007). Logics of critical explanation in social and political theory. Routledge.

  • Godler, Y. (2020). Post-post-truth: An adaptationist theory of journalistic verism. Communication Theory, 30(2), 169-187. https://doi.org/10.1093/ct/qtz031

  • Goldfarb, J. (2006). The politics of small things. Chicago, IL/London: The University of Chicago Press.

  • Gudonis, M. (2017). How useful is the concept of post-truth in analysing genocide denial? Analysis of online comments on the Jedwabne massacre. Zoon Politikon, 8, 141-182. https://doi.org/10.4467/2543408XZOP.17.006.9265

  • Hagmann, D., Ho, E., & Loewenstein, G. (2020). Measuring information preferences. Management Science, 67(1). https://doi.org/10.1287/mnsc.2019.3543

  • Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.

  • Karsten, L. (2020). Counterurbanisation: Why settled families move out of the city again. Journal of Housing and the Built Environment, 35(4). https://doi.org/10.1007/s10901-020-09739-3

  • Kauffman, S. A. (2013). Evolution beyond Newton, Darwin, and entailing law: The origin of complexity in the evolving biosphere. In C. H. Lineweaver, P. C. W. Davies, & M. Ruse (Eds.), Complexity and the arrow of time (pp. 162-190). Cambridge: Cambridge University Press.

  • Kelly-Smith, R. (2002). Saving lives should pay more than kicking balls. The Guardian. bit.ly/43sewES

  • Keyes, R. (2004). The post-truth era: Dishonesty and deception in contemporary life. New York, NY: St. Martin’s Press.

  • Kreft, J. (2018). Władza algorytmów [The power of algorithms]. Cracow: Wydawnictwo Uniwersytetu Jagiellońskiego.

  • Lasch, C. (2018). The culture of narcissism. London/New York, NY: W. W. Norton & Company.

  • Leicht, K. T. (2020, December). Occupations and inequalities in the 21st century: What’s in your wallet? Research in Social Stratification and Mobility, 70, 100550. https://doi.org/10.1016/j.rssm.2020.100550

  • Lemma, A. (2021). Introduction – Becoming sexual in digital times: The risks and harms of online pornography. The Psychoanalytic Study of the Child, 74(1), 118-130. https://doi.org/10.1080/00797308.2020.1859283

  • Lévi-Strauss, C. (1950). Introduction to the work of Marcel Mauss (F. Baker, Trans.). London: Routledge & Kegan Paul.

  • Lowrey, A. (2019). $350,000 a year, and just getting by. The Atlantic. bit.ly/43NTWPF

  • Mager, A. (2012). Algorithmic ideology: How capitalist society shapes search engines. Information, Communication & Society, 15(5), 769-787.

  • Mainzer, K. (1994). Thinking in complexity. Berlin: Springer-Verlag.

  • Maslow, A. H. (1943). A theory of human motivation. Psychological Review, 50(4), 370-396. https://doi.org/10.1037/h0054346

  • McCallum, S. (2023). ChatGPT banned in Italy over privacy concerns. BBC. bit.ly/3qfHTfm

  • McPhee, J. (1989). The control of nature. New York, NY: Farrar, Straus and Giroux.

  • Ninno, D., & Totaro, P. (2014). The concept of algorithm as an interpretative key of modern rationality. Theory, Culture & Society, 31(4), 29-49.

  • Penrose, R. (1989). The emperor’s new mind: Concerning computers, minds and the laws of physics. Oxford University Press.

  • Piketty, T. (2014). Capital in the twenty-first century. Cambridge, MA: The Belknap Press of Harvard University Press.

  • Pinker, S. (2007). The stuff of thought: Language as a window into human nature. Penguin Books.

  • Rodrigues, R. (2020). Legal and human rights issues of AI: Gaps, challenges and vulnerabilities. Journal of Responsible Technology, 4.

  • Rorty, R. (1989). Contingency, irony, and solidarity. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511804397

  • Seebach, S. (2017). Love and society: Special social forms and the master emotion. London: Routledge. https://doi.org/10.4324/9781315753560

  • United Nations Conference on Trade and Development (UNCTAD). (2021). Digital economy report 2021. New York, NY: United Nations Publications.

  • Wheeler, K. (2019). Moral economies of consumption. Journal of Consumer Culture, 19(2), 271-288.

  • Whorf, B. L. (2012). Language, thought, and reality: Selected writings of Benjamin Lee Whorf (J. B. Carroll, S. C. Levinson, & P. Lee, Eds.). The MIT Press.

  • Wittgenstein, L. (2021). Tractatus logico-philosophicus (Centenary ed.). Anthem Press. https://doi.org/10.2307/j.ctv22d4t7n

  • Wolfram, S. (2023). What is ChatGPT doing … and why does it work? bit.ly/3qewDzM

  • Zarzycki, R. (2013). Information inflation and spectacularization of meanings as sources of hyperreality. In K. Dąbrowska, Ł. Huculak, J. Jernajczyk, B. Skowron, M. Zakrzewska, & R. Zarzycki (Eds.), Excess and lack (J. Burzyńska, Trans., pp. 161-171). Wrocław: Wrocław Academic Hub.

  • Zarzycki, R. (2021, March 12). Ciasteczkowa farsa [Cookie farce]. Polska The Times (Paper edition).

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: PublicAffairs.
szinc finger nucleasesClose
stranscription activator-like effector nucleasesClose
clustered regularly interspaced short palindromic repeatsClose
Cas9caspase 9Close
  • Collapse
  • Expand
  • Aberra, T. (2012). Michel Foucault power knowledge nexus: Critical analysis and its relevance globalization and current issues of Africa. LAPLAMBERT Academic Publishing.

    • Search Google Scholar
    • Export Citation
  • Adorno, T. W., & Horkheimer, M. (1944). Dialectic of enlightenment. New York, NY: Social Studies Association, Inc.

  • Agamben, G. (1998). Homo sacer. Sovereign power and bare life. Stanford, CA: Stanford University Press.

  • Agger, B. (2015). Oversharing: Presentations of self in the Internet age. Routledge.

  • Airoldi, M. (2022). Machine Habitus: Toward a sociology of algorithms. Cambridge: Polity Press.

  • Andrejevic, M. (2013). Infoglut: How too much information is changing the way we think and know. Routledge.

  • Atkin, A. (2023). Peirce’s theory of signs. In E. N. Zalta & U. Nodelman (Eds.), The Stanford encyclopedia of philosophy (Spring 2023 ed.).

    • Search Google Scholar
    • Export Citation
  • Austin, J. L. (1975). How to do things with words (2nd ed., M. Sbisà & J. O. Urmson, Eds.). Oxford: Oxford University Press.

  • Bard, A., & Söderqvist, J. (2002). Netocracy: The new power elite and life after capitalism. London: Pearson Education.

  • BBC. (2016). Norway consumer body stages live app terms reading. https://www.bbc.co.uk/news/world-europe-36378215

  • Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 9851002.

    • Search Google Scholar
    • Export Citation
  • Berger, P., & Luckmann, T. (1971). The social construction of reality. Penguin University Books.

  • Betancourt, M. (2016). The critique of digital capitalism: An analysis of the political economy of digital culture and technology. Brooklyn, NY: Punctum Books.

    • Search Google Scholar
    • Export Citation
  • Bilić, P., (2018). A critique of the political economy of algorithms: A brief history of Google’s technological rationality. tripleC: Communication, Capitalism & Critique, 16(1), 315331.

    • Search Google Scholar
    • Export Citation
  • Bodenhorn, B., & Vom Bruck, G. (2006). “Entangled in histories”: An introduction to the anthropology of names and naming. In G. Vom Bruck & B. Bodenhorn (Eds.), An anthropology of names and naming (pp. 130). Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511499630.002

  • Brand, M., Döring, N., Krämer, N., Kruger, T. H. C., Mikhailova, V., & Vowe, G. (2021). Sexual interaction in digital contexts and its implications for sexual health: A conceptual analysis. Frontiers in Psychology, 12, 769732. https://doi.org/10.3389/fpsyg.2021.769732

  • Brooks, D. (2013). The philosophy of data. The New York Times. bit.ly/45yGjFy

  • Broussard, M. (2018). Artificial unintelligence: How computers misunderstand the world. Cambridge, MA: MIT Press.

  • Carr, N. (2014). The glass cage: How our computers are changing us. New York, NY: W. W. Norton & Company.

  • Chaitin, G. J. (1987). Algorithmic information theory. Cambridge: Cambridge University Press.

  • ChatGPT. (2023). Interviewed by R. Zarzycki. OpenAI platform.

  • Coveney, P., & Highfield, R. (1995). Frontiers of complexity: The search for order in a chaotic world. Faber and Faber.

  • Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York, NY: St. Martin’s Press.

  • Fagan, P. F. (2009). The effects of pornography on individuals, marriage, family and community. Research Synthesis, RS09K01.

  • Farroni, T., Longa, L. D., & Valori, I. (2021). Interpersonal affective touch in a virtual world: Feeling the social presence of others to overcome loneliness. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.795283

  • Foucault, M. (1980). Power/knowledge: Selected interviews and other writings 1972–1977 (C. Gordon, Ed.; C. Gordon et al., Trans.). New York, NY: Pantheon Books.

  • Foucault, M. (2008). The history of sexuality. Penguin.

  • Fricker, M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780198237907.001.0001

  • Fridlund, P. (2020). Post-truth politics, performatives and the force. Jus Cogens, 2, 215–235. https://doi.org/10.1007/s42439-020-00029-8

  • Fuller, S. (2018). Post-truth: Knowledge as a power game. Anthem Press. https://doi.org/10.2307/j.ctvgd30v

  • Glynos, J., & Howarth, D. (2007). Logics of critical explanation in social and political theory. Routledge.

  • Godler, Y. (2020). Post-post-truth: An adaptationist theory of journalistic verism. Communication Theory, 30(2), 169–187. https://doi.org/10.1093/ct/qtz031

  • Goldfarb, J. (2006). The politics of small things. Chicago, IL/London: The University of Chicago Press.

  • Gudonis, M. (2017). How useful is the concept of post-truth in analysing genocide denial? Analysis of online comments on the Jedwabne massacre. Zoon Politikon, 8, 141–182. https://doi.org/10.4467/2543408XZOP.17.006.9265

  • Hagmann, D., Ho, E., & Loewenstein, G. (2020). Measuring information preferences. Management Science, 67(1). https://doi.org/10.1287/mnsc.2019.3543

  • Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.

  • Karsten, L. (2020). Counterurbanisation: Why settled families move out of the city again. Journal of Housing and the Built Environment, 35(4). https://doi.org/10.1007/s10901-020-09739-3

  • Kauffman, S. A. (2013). Evolution beyond Newton, Darwin, and entailing law: The origin of complexity in the evolving biosphere. In C. H. Lineweaver, P. C. W. Davies, & M. Ruse (Eds.), Complexity and the arrow of time (pp. 162–190). Cambridge: Cambridge University Press.

  • Kelly-Smith, R. (2002). Saving lives should pay more than kicking balls. The Guardian. bit.ly/43sewES

  • Keyes, R. (2004). The post-truth era: Dishonesty and deception in contemporary life. New York, NY: St. Martin’s Press.

  • Kreft, J. (2018). Władza algorytmów [The power of algorithms]. Cracow: Wydawnictwo Uniwersytetu Jagiellońskiego.

  • Lasch, C. (2018). The culture of narcissism. London/New York, NY: W. W. Norton & Company.

  • Leicht, K. T. (2020, December). Occupations and inequalities in the 21st century: What’s in your wallet? Research in Social Stratification and Mobility, 70, 100550. https://doi.org/10.1016/j.rssm.2020.100550

  • Lemma, A. (2021). Introduction – Becoming sexual in digital times: The risks and harms of online pornography. The Psychoanalytic Study of the Child, 74(1), 118–130. https://doi.org/10.1080/00797308.2020.1859283

  • Lévi-Strauss, C. (1950). Introduction to the work of Marcel Mauss (F. Baker, Trans.). London: Routledge & Kegan Paul.

  • Lowrey, A. (2019). $350,000 a year, and just getting by. The Atlantic. bit.ly/43NTWPF

  • Mager, A. (2012). Algorithmic ideology: How capitalist society shapes search engines. Information, Communication & Society, 15(5), 769–787.

  • Mainzer, K. (1994). Thinking in complexity. Berlin: Springer-Verlag.

  • Maslow, A. H. (1943). A theory of human motivation. Psychological Review, 50(4), 370–396. https://doi.org/10.1037/h0054346

  • McCallum, S. (2023). ChatGPT banned in Italy over privacy concerns. BBC. bit.ly/3qfHTfm

  • McPhee, J. (1989). The control of nature. New York, NY: Farrar, Straus and Giroux.

  • Ninno, D., & Totaro, P. (2014). The concept of algorithm as an interpretative key of modern rationality. Theory, Culture & Society, 31(4), 29–49.

  • Penrose, R. (1989). The emperor’s new mind: Concerning computers, minds and the laws of physics. Oxford University Press.

  • Piketty, T. (2014). Capital in the twenty-first century. Cambridge, MA: The Belknap Press of Harvard University Press.

  • Pinker, S. (2007). The stuff of thought: Language as a window into human nature. Penguin Books.

  • Rodrigues, R. (2020). Legal and human rights issues of AI: Gaps, challenges and vulnerabilities. Journal of Responsible Technology, 4.

  • Rorty, R. (1989). Contingency, irony, and solidarity. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511804397

  • Seebach, S. (2017). Love and society: Special social forms and the master emotion. London: Routledge. https://doi.org/10.4324/9781315753560

  • United Nations Conference on Trade and Development (UNCTAD). (2021). Digital economy report 2021. New York, NY: United Nations Publications.

  • Wheeler, K. (2019). Moral economies of consumption. Journal of Consumer Culture, 19(2), 271–288.

  • Whorf, B. L. (2012). Language, thought, and reality: Selected writings of Benjamin Lee Whorf (J. B. Carroll, S. C. Levinson, & P. Lee, Eds.). The MIT Press.

  • Wittgenstein, L. (2021). Tractatus Logico-Philosophicus (Centenary ed.). Anthem Press. https://doi.org/10.2307/j.ctv22d4t7n

  • Wolfram, S. (2023). What is ChatGPT doing … and why does it work? bit.ly/3qewDzM

  • Zarzycki, R. (2013). Information inflation and spectacularization of meanings as sources of hyperreality. In K. Dąbrowska, Ł. Huculak, J. Jernajczyk, B. Skowron, M. Zakrzewska, & R. Zarzycki (Eds.), Excess and lack (J. Burzyńska, Trans., pp. 161–171). Wrocław: Wrocław Academic Hub.

  • Zarzycki, R. (2021, March 12). Ciasteczkowa farsa [Cookie farce]. Polska The Times (Paper edition).

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: PublicAffairs.

