by Ico Maly
We are all living algorithmic lives. Our lives are not just media-rich, they increasingly take place in and through an algorithmically programmed media landscape. Algorithms, as a result of digitalisation and the de-computerisation of the internet, are ubiquitous today. We use them to navigate, to buy stuff, to work from home, to search for information, to read our newspaper, and to chat with friends and even people we have never met before. We live our social lives in post-digital societies: societies in which the digital revolution has been realised. As a result, algorithms have penetrated and changed almost every domain in those societies.
Algorithms have become a normal and to a large extent invisible part of our world. Hence they are rarely questioned. Only when big issues erupt—think about Facebook’s role in Trump’s election, the role of conspiracy theories in the raid on the Capitol, or content moderation failures—do debates on the role of digital platforms and their algorithms become prominent. Otherwise, they just seem to be “there”, just as the old media are part of our lives. As Barthes eloquently argued, normality is always a field of power. Normality and normativity, he argued, are not neutral or non-ideological. On the contrary, they are hegemonic. Other ideologies and normativities are measured against this ideological point zero. We could expand his logic and argue that digital platforms, their algorithms, and the ideologies that are embedded in them are part of the invisible and self-evident systemic core organising daily life. Precisely because we fail to recognise those algorithms and platforms as ideologically grounded, it is necessary to examine and study the impact of this algorithmic revolution in general, and its impact on politics—and the production and distribution of ideology—in particular.
Ideology and the algorithmic logic of post-digital societies
Digitalisation and algorithmic culture have rapidly reshuffled the media system and the information flows and interactions within that system. Politicians, activists, journalists, intellectuals and common citizens politically engage in a very different context than in the 1990s, let alone the 1950s. We now live with an algorithmically-powered attention-based hybrid media system. The different types of media—newspapers, television, radio, and social media platforms—do not merely coexist, but form a media system that is constantly changing. That perpetual change is, according to Chadwick, the result of the reciprocal actions and interactions between those different media and their media logics. In that new media system, the distinction between “old” and “new” media or digital and non-digital media has almost become non-existent. Tweets become news and the newspaper tweets. Moreover, the newspaper is also more and more algorithmically produced.
All media in this hybrid media system are increasingly grounded in an algorithmic logic. Our interactions with algorithms determine which information becomes visible to whom and on which scale. Algorithms, datafication, and the affordances of digital media—which allowed for the democratisation of transmission and the banalisation of recording—have disrupted the status quo. We cannot understand the rise of Trump and Trumpism, or the rise of Bernie Sanders, Alexandria Ocasio-Cortez, and the Squad without taking this new media environment into account. While we should carefully avoid the trap of technological determinism, we cannot disregard the importance of including this new socio-technological context in our analysis of ideology and political discourse in contemporary societies. Post-digital societies create new possibilities and constraints for the production and circulation of discourse. New producers and new relationships have been established between the different actors in this media system, and they have had fundamental effects on the construction and circulation of (meta-)political messages and meanings.
In the last two decades, the digital infrastructure has become an inherent part of the social fabric of society. It is one of the deep, generic drivers of concrete human behaviour in hypermediated societies. Without attention to this social structure, one risks the fallacy of internalism, as J.B. Thompson called it. With this concept, Thompson pointed to the widespread idea that the meaning of a text is only to be found in the text itself (and thus not in the attribution of meaning through the uptake and reproduction of texts). He stressed that ‘the analysis of ideology in modern societies must give a central role to the nature and impact of mass communication’, and argued that cultural experience is profoundly shaped by the diffusion of symbolic forms distributed through mass media. As a result, the study of ideology should—if one wants to avoid the fallacy of internalism—be focused on all three aspects of mass communication: ‘the production/transmission, construction, and reception/appropriation of media messages’. If we follow Thompson’s argument, we should at least direct some attention to the algorithmic nature of the distribution of discourse and ideology in contemporary societies.
Algorithmic culture and the attention-based media system
It is important to note here that algorithms are much more than mere technological instruments. They are socio-technical assemblages. Algorithms only work if they are fed with data. In other words, algorithms should be understood from a relational perspective. Not only do programmers (their values and their companies’ goals) matter, but also the interfaces, the data structures, and what people do with algorithms deserve our attention. The idea—so prevalent in public debate—that algorithms just do things and that users have no impact is false. The recommended videos on YouTube are the result of the interactions between the recommendation algorithms of YouTube, viewers, and how producers prepare their content for uptake. Algorithmic culture matters. People will try to optimise their content, link to each other, maintain a network of fans whom they ask to share content, or even use bots to push certain content.
Algorithms and people have agency. It is in the interaction between humans and algorithms that the contemporary production and circulation of ideology should be understood. If we accept that algorithms have agency, it is important to understand the socio-technical but also the economic context in which they are created. The objectives of the platforms are clear. Beneath all the fine talk of big tech boasting about ‘connecting the world’ and ‘doing no evil’ lies the quest for profit. Social platforms make profit by commodifying our digitally networked social relationships: our emotions, photos, posts, shares and likes are repackaged into ‘tradable commodities’. Or more concretely, data is used to predict the likelihood that certain audiences will be receptive or give attention to certain messages from companies or politicians. The more data those companies have about their users, the more accurately they think that sellers can target them and the more profit big tech can extract from that behavioural data. The result is unbridled surveillance and datafication. Even if users don’t post, like, or leave comments, they still produce data that can be processed and traded.
In order to gather more data on their users, digital media platforms nurture a specific culture in which audience labour takes a central place. We have all become prosumers: we not only consume information, we also produce it. This has crucial consequences: information—including good quality information—is now abundant. It is no longer a scarce commodity. A wealth of information creates a poverty of attention. We have ended up in the opposite of an information economy: an attention economy. In order to convert that attention into profit, attention is codified and categorised. The like, the comment, the view, the click, and the share function as proxies of attention. The digital infrastructures of the attention economy are not only organised to keep users hooked, they facilitate audience labour and thus data production.
This commercial algorithmically-programmed attention economy creates a very specific environment in which we develop our social relationships. “Popularity” has become a crucial factor. The more followers you have, the more likes your posts generate, the more you contribute to the goals of the platform, the more valuable you are for the platforms, and the more visible you and your discourse become. As a result, people increasingly present themselves as public personas in search of an audience. In order to capture the attention of platform users, we see that branding strategies have been democratised. People create their brand in relation to the so-called vanity metrics: they monitor the likes, followers and uptake and use them to gain insight into what works, when, and why. Or in other words, they try to acquire and apply algorithmic knowledge to produce attention-grabbing content. The influencer or micro-celebrity is a structural ingredient of this new media environment: they help platforms realise their goals. These new human practices are best seen as the result of their interaction with the algorithms and values of the platform.
The management of visibility and ideology research
The algorithmic logic of this attention-based media environment forces us to recognise the importance of algorithmic knowledge in the dissemination of rising ideologies. Not only the management of visibility but also, as Bucher stresses, the avoidance of invisibility is a constant worry for all actors in this media system, and it is thus of crucial importance for all ideological projects. In line with Thompson, Blommaert argued that ideologies need to ‘be understood as processes that require material reality and institutional structures and practices of power and authority’. Ideologies are thus not just a cognitive phenomenon, they have a material reality. Hence, they cannot be understood without looking at how people spread those ideas, whom they address, which media they use, and how those media format the discourses. Studying ideologies in the contemporary era means not only looking at the input, but also at the uptake. Uptake here refers to
(1) the fact that within the digital ecology users are not only consumers but also (re)producers of discourse, so-called prosumers; and
(2) that algorithms and the interfaces of digital media play an important role in the dissemination and reproduction of ideas.
Uptake realises visibility. Human and non-human actors (from bots to the algorithms organising the communication on a platform) are a crucial part of any ideological and political battle. Note here that seemingly simple ‘reproduction’ actions like retweeting, reposting, liking, and sharing are not just ‘copies’ of the same discourse, but ‘re-entextualisations’: a share (and sometimes even a like, depending on the algorithms of the platform) is the start of a new communication process in which the initial message is now part of a new communicative act performed by a new producer who communicates to new addressees in a new type of interaction. It is also a meaningful act seen from the perspective of the algorithm: a share and a comment add to the ‘popularity’ of the post and thus can also contribute to its visibility far beyond the audiences of the people who have shared it. Digital media are thus not just intermediaries, they affect the input and the uptake.
Messaging in the digital age is thus not a linear process between sender and receiver; it involves a multitude of human and non-human actors that are all potential senders and receivers and that even co-construct the message. This ‘uptake’ is as crucial as the input, which again highlights why ideological and discourse-analytical research should focus not only on the content, but also on the different actors and the systems of communication.
Myths, ideology, and the far-right
We can illustrate the importance of the new communicative environment when we zoom in on the emergence of the far-right in the last decades. Although it is certainly not the only factor, the algorithmic hybrid media system is unmistakably an important ingredient in this rise. It has reshaped and re-organised the far-right. The far-right has always used digital media to propagate its ideologies, but in the last decades we see a fundamental change in the form, content, and strategies being used. The far right’s adoption of meme-culture, LARPing (the ironic and metapolitical use of Live Action Role Playing in order to do or say things that are too outrageous for “normies”), digital harassment, trolling, conspiracy theories, and the adoption of influencer culture for metapolitical goals are all relatively new practices that have contributed not only to the spread of their ideologies, but also to the recreation, re-emergence, and mobilisation power of the so-called true right on a global scale.
In post-digital societies, the far-right rarely manifests itself as a hierarchical organisation with one stable ideology or a mass party. More commonly it takes the form of a polycentric and layered network of niched ideological groups. Maly and Varis coined the term micro-populations to describe such social groups. They argued that micro-populations are the material expression of temporary and emerging micro-hegemonies. The Capitol riots in the US are a clear example of how all those digital practices have shaped a wide range of such micro-populations that were moulded into a militant offline mass on 6 January 2021. An analysis of the linguistic signs on display during the storming of the Capitol shows us how Trump-supporters are a loose, unstable, and temporary coalition of micro-populations. Next to the red and camouflage MAGA caps and Trump hats, one could spot Confederate flags, QAnon t-shirts, Kek and “three-percenter” flags, Neo-Nazi hoodies, ‘stop the steal’ boards, and of course the Proud Boys themselves.
All these signs and emblems refer to different groups who occupy different (4chan, thedonald.win) or sometimes overlapping online spaces (GAB, Parler, MeWe). Trump—with his massive reach in the hybrid media system—was arguably the most important communicator, but he was clearly not the only one. Key influencers like Nick Fuentes, Dan Bongino, and Gavin McInnes all collaborated in the production and distribution of discourse. In many cases, we see a complex, layered and ‘democratic’ network of influencers that co-constructs a (micro-)ideology. If we zoom in on QAnon, we see that even that niche is a decentralised and polycentric, pyramid-like conspiracy theory that is constantly being produced and reproduced in different niches by different producers. Mom-influencers, yoga communities, 4channers, and MAGA-activists all prosume the theory and make it ready for uptake in their niches using different angles and discourse strategies.
Trying to understand 6 January means understanding how many of those micro-populations merged to become a mass. One key element is understanding that since Election Day, influencers and prosumers in all those different niches started adopting some version of the conspiracy theory that claims that the election was stolen. This particular type of coalition is grounded in a network of social media sites and of course in Trump’s own digital campaign. These groups were born within mainstream platforms like Twitter, Reddit, YouTube, and Facebook before some had to move to more fringe platforms like Gab, Parler, and thedonald.win after being deplatformed. In the months before the riot, all those niched groups used digital media to construct their own normalities, their partisan views of the world. In that world, the election was stolen by the left, the liberals, or the deep state. The enemy was accused of manipulating the voting machines, stealing or throwing away ballots, or organising fraud with mail-in and absentee ballots. The seeds of this myth were planted by Trump even before his election in 2016—and of course they resonated with discourses on the deep state that were already popular in many of those niches—and were carefully cultivated through many of his performances during and after the elections. Trump’s electoral loss was read as the deep state taking over control again. It created a sense of urgency and opposition to the democratic institutions of the US.
We can best understand those conspiracy theories as contemporary and vulgar variants of the Sorelian myth. The French philosopher Georges Sorel was a prominent and influential anti-elitist and anti-democratic thinker within revolutionary syndicalism, a current that had a prominent impact on fascism. Myth was central to Sorel’s thinking about revolution and the overthrow of the bourgeois order. He saw myths as “groups of ideas” or knowledge-constructs that can direct reality, people, and movements. Those ideas didn’t need to be rational or true. What was important, according to Sorel, was that they had affective power. For him, myths had a social function. He saw them as means to mobilise people.
If we look at the role of conspiracy theories from the perspective of this Sorelian concept of myth, we see how they function as a site of ideology:
All those political conspiracy theories create a world in which the liberal elites are destroying traditional societies and enabling multiculturalism, feminism, and the destruction of Western culture. That is why debunking the myths doesn’t work. It didn’t matter that Pizzagate was debunked; the general idea—that the liberals are morally rotten—was still seen as true. Important to note again is that those myths are not only cognitive-ideational phenomena, they are grounded in a material reality which is as important as the affective qualities of those myths in the mobilisation of people.
Ideology and algorithmic politics
If we understand ideologies as ideas that penetrate the whole fabric of communities and result in normalised, naturalised patterns of thought and behaviour, then we should realise how important the role of algorithms is in the construction of that normality. The reach of these groups cannot be solely explained by the discourse they produce; all of those influencers and groups deploy ‘algorithmic knowledge’ to spread their discourse and to construct a community around their profiles. Even more, their discourse on ‘censorship’ by the mass media and mainstream digital media platforms helps them to spread that algorithmic knowledge among their audiences. Far-right influencers constantly stress the importance of getting the news out by sharing and liking. This produces fertile ground to grow a supportive culture. The other side of the coin is that the interaction with the personalisation algorithms contributes to the construction of the niched groups circling around specific influencers and pages, whereas the recommendation algorithms help to build a network of different micro-populations.
If we want to analyse ‘political ideologies’, we must focus not only on the content or the large ‘isms’, but also on the form, the communication economy, and the uptake. We need to understand how politicians and activists adapt to this new communicative economy and how they use it for their political struggle. What is clear by now is that this new communicative economy creates a polycentric world of communication. Such a world is far more complex than a world dominated by the so-called “mass media”. It thus creates an enormous challenge for scholars of ideology, because we will need to update our toolkit. The good news is that it may help us to develop more fine-grained analyses that take into account the full context, including the socio-technical context.
 Taina Bucher, If… Then: Algorithmic power and politics (Oxford: Oxford University Press, 2018).
 Florian Cramer, ‘What Is “Post-digital”?’, in David M. Berry and Michael Dieter (eds.), Postdigital Aesthetics (Basingstoke: Palgrave Macmillan, 2014).
 Roland Barthes, Mythologies (Hill and Wang, 1957).
 Jan Blommaert, Discourse: A Critical Introduction (Cambridge: Cambridge University Press, 2005), 160.
 Andrew Chadwick, The Hybrid Media System: Politics and Power (Oxford: Oxford University Press, 2017); John B. Thompson, ‘Mediated Interaction in the Digital Age’, Theory, Culture & Society 37(1) (2020), 3–28; Tommaso Venturini, ‘From fake to junk news: The data politics of online virality’, in Didier Bigo, Engin Isin, and Evelyn Ruppert (eds.), Data Politics. Worlds, Subjects, Rights (Abingdon: Routledge, 2019).
 Bucher, If… Then.
 Tarleton Gillespie, ‘The relevance of algorithms’, in Tarleton Gillespie, Pablo Boczkowski, and Kristen Foot (eds.), Media Technologies (Cambridge, MA: MIT Press, 2014).
 Ico Maly, ‘The global New Right and the Flemish identitarian movement Schild & Vrienden: a case study’, Tilburg Papers in Culture Studies no. 220 (2018); Ico Maly, ‘New Right Metapolitics and the Algorithmic Activism of Schild & Vrienden’, Social Media + Society (2019); Ico Maly, ‘Metapolitical New Right Influencers: The Case of Brittany Pettibone’, Social Science (2020), 9(7); Ico Maly, ‘Algorithmic populism and the datafication and gamification of the people by Flemish Interest in Belgium’, Trabalhos em Linguística Aplicada 59(1) (2020).
 Jan Blommaert, ‘Political discourse in post-digital societies’, Trabalhos em Linguística Aplicada 59(1) (2020).
 John B. Thompson, Ideology and modern culture: Critical social theory in the era of mass communication (Cambridge: Polity, 1990).
 Ibid., 264.
 Ibid., 24.
 Shoshana Zuboff, The age of surveillance capitalism: The fight for a human future at the new frontier of power (New York, NY: Profile Books, 2019).
 Venturini, ‘From fake to junk news’, 130.
 Vincent Miller, Understanding digital culture (London: SAGE, 2011).
 José Van Dijck, The Culture of Connectivity: A critical history of social media (Oxford: Oxford University Press, 2013).
 Nir Eyal, Hooked: How to build habit-forming products (London: Penguin, 2014).
 Van Dijck, Culture of Connectivity.
 Alice Marwick, ‘You May Know Me from YouTube: (Micro)-Celebrity in Social Media’, in P. David Marshall and Sean Redmond (eds.), A Companion to Celebrity (Hoboken, NJ: John Wiley & Sons Inc., 2015).
 Richard Rogers, ‘Digital Traces in Context| Otherwise Engaged: Social Media from Vanity Metrics to Critical Analytics’, International Journal of Communication 12 (2018).
 Bucher, If… Then; Maly, ‘The global New Right’; Maly, ‘New Right Metapolitics’.
 Bucher, If… Then.
 Blommaert, Discourse, 163.
 Blommaert, ‘Political discourse’; Maly, ‘The global New Right’; Maly, ‘New Right Metapolitics’; Maly, ‘Metapolitical New Right Influencers’; Maly, ‘Algorithmic populism’.
 Blommaert, ‘Political discourse’; Piia Varis and Jan Blommaert, ‘Conviviality and collectives on social media: Virality, memes, and new social structures’, Multilingual Margins 2(1), 31–45.
 Thomas Poell and José Van Dijck, ‘Social Media and Journalistic Independence’, in James Bennett and Niki Strange (eds.), Media independence: working with freedom or working for free? (Abingdon: Routledge, 2014), 182–201.
 Maly, 2018.
 Maly, 2018; Maly, ‘New Right Metapolitics’; Maly, ‘Metapolitical New Right Influencers’; Maly, ‘Algorithmic populism’.
 Ariel Winter, ‘Online hate: From the far right to the ‘Alt-Right’, and from the margins to the mainstream’, in Karen Lumsden and Emily Harmer (eds.), Online Othering: Exploring Violence and Discrimination on the Web (Basingstoke: Palgrave, 2019).
 Lisa Bogaerts and Maik Fielitz, ‘Do you want meme war? Understanding the visual memes of the German Far Right’, in Maik Fielitz and Nick Thurston (eds.), Post-Digital Cultures of the Far Right: Online Actions and Offline Consequences in Europe and the US (Bielefeld: Transcript Verlag, 2019); Daniele Conversi, ‘Irresponsible radicalisation: Diasporas, globalisation, and Long-distance nationalism in the Digital age’, Journal of Ethnic and Migration Studies 38 (2012), 1357–79; Edwin Hodge and Helga Hallgrimsdottir, ‘Networks of Hate: The Alt-right, “Troll Culture”, and the Cultural Geography of Social Movement Spaces Online’, Journal of Borderlands Studies (2019), 1–8; Rebecca Lewis, Alternative Influence: Broadcasting the Reactionary Right on YouTube, Data & Society (2018); Ico Maly, ‘Populism as a mediatized communicative relation: The birth of algorithmic populism’, Tilburg Papers in Culture Studies no. 213 (2018); Maly, ‘New Right Metapolitics’; Maly, ‘Metapolitical New Right Influencers’; Angela Nagle, Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the Alt-Right (Washington, DC: Zero Books, 2017); Marc Tuters, ‘LARPing and Liberal tears: Irony, Belief, and Idiocy in the deep vernacular web’, in Maik Fielitz and Nick Thurston (eds.), Post-Digital Cultures of the Far Right: Online Actions and Offline Consequences in Europe and the US (Bielefeld: Transcript Verlag, 2019).
 Blommaert, ‘Political discourse’; Maly, ‘Populism as a mediatized communicative relation’; Maly, ‘The global New Right’.
 Ico Maly and Piia Varis, ‘The 21st-century hipster: On micro-populations in times of superdiversity’, European Journal of Cultural Studies 19(6) (2015), 637–53.
 Blommaert, ‘Political discourse’.
 Ico Maly, ‘The Army for Trump and Trump’s war against Sleepy Joe’, Diggit Magazine (2020), https://www.diggitmagazine.com/articles/trump-war-sleepy-joe.
 Georges Sorel, Reflections on violence (New York, NY: Dover Publications, 2004).
 Blommaert, Discourse, 159.
 Maly, ‘The global New Right’; Maly, ‘New Right Metapolitics’; Maly, ‘Metapolitical New Right Influencers’; Maly, ‘Algorithmic populism’.