06 Nov

Last week, I cycled past a bus reading ‘5G - don’t get left behind’ on its back. This very bus drives through Cape Town’s more affluent suburbs, but also transports many workers who come in from low-income areas. The message bothered me. It was there for the purpose of selling a product and thus not necessarily meant to convey a meaningful message. Still, it did echo a range of assumptions that I find to be prominent in discussions on digital media and technological developments more broadly.

Who consumes what digital technology and for what reason is often imagined in rather particular ways in public discourse, reproducing unidirectional and hierarchical ways of thinking. It is mostly young-ish, affluent and socially mobile people whom we see as avid users of information and communication technologies (ICTs). Looking at advertisements for all kinds of tech products, one can see where the reputation of digital technologies as Millennial hedonism-fodder comes from. This certainly applies to the widely used dating app Tinder, the use of which I studied for my PhD. 

Although the audience for new tech products seems selective, my research on Tinder shows that there is no such thing as a particular kind of ‘tinderer’ doing one certain kind of ‘tindering’. Instead, the app is used in different ways to establish a variety of intimacies – even if they sometimes turn out to be quite disappointing. Nevertheless, it must be mentioned that access to technologies, while broadening, remains a marker of privilege. And where access increases quickly, there are often hidden commercial interests at play.

The abundance of ‘options’ on the GPS location-based app (Tinder tracks the GPS coordinates of one’s phone and shows potential matches in the area), presented through its swiping (yes or no) touchscreen interface, is meant to equal greater freedom of choice. This is in line with broader assumptions linked to tech developments that humanity will improve with increased access to information. In fact, information flows are almost religiously celebrated in public discourse as having supreme value in and of themselves (also referred to as dataism), as being inherently progressive and as levelling social playing fields.

While access to the internet and mobile technologies has dramatically increased in many places (specifically in African contexts), the ‘terms and conditions’ of online access remain opaque, allowing data capitalism to run increasingly wild. In this way, big-data optimism assuming that ‘more is better’ does not only lead to the overriding of privacies, it also gradually tinkers with people’s sense of autonomy. The Tinder users I interviewed, for instance, embraced the app for its promise of agency in making intimate choices. Yet, there was also a disillusionment linked to the notion that the app creates experiences that are less than authentic or real, partly due to the formatting of a seemingly endless string of similar-looking ‘options’ to choose from. These already algorithmically pre-selected ‘options’ then also had to be reconciled with the much messier offline sphere (even though online and offline can no longer be neatly distinguished).

Whether it is pre-selected Tinder ‘options’, information on one’s Facebook feed, an ad on the back of a bus or ads on a webpage (I am currently told I should buy a new laptop based on a previous search) - what we are told we need or should identify with in order not to get left behind does not imply much of a free choice. Instead, ideas of independence, progress and self-improvement (mantras of our time) are very much interlinked with steadfast consumption with murky repercussions.

Presenting technisation as a lofty ideal or a superior mode of being to achieve, rather than something created from a particular vested vantage point, effectively veils the authoritative regimes of the technological revolution we are currently witnessing. This includes the cultures and values embedded in tech products. Very few women and people of colour are hired in tech industries, leading to the development of problematic algorithms. Even more problematically, designs and codes are presented as neutral and gender- and colour-blind, much like the employment politics in big tech. What is more, tech products tend to be portrayed as independent actors.

Power relations precipitating unequal access to resources that tie in with social, economic and educational developments are consequently neatly brushed under the discursive carpet. Framing access as a matter of capability and choice (reach it, grab it – or else get left behind) rather than something that forms part of a historical development supports the prioritisation of the needs of some, while the experiences of others (those who cannot reach) are rendered even less visible and relevant for imagined futures. And disparities extend much further when looking at the global distribution of labour and resources in the information and communication technology ecosystem - keyword: cobalt mining for tech products, most of which is retrieved from the DRC.

In Cape Town, where the geographic, economic and social divisions of Apartheid are notoriously persistent, the ‘don’t get left behind’ paradigm seems particularly cynical. It foreshadows an even more unequal future and places the responsibility for it onto individuals. For if one does not manage to pursue the prescribed path of the digitisation of everyday life at all costs (with barely any regulations in place ensuring ethical conduct and accountability), one will have to face the consequences. This form of exclusion severs itself from problematic histories of divisions and portrays the ones to come as both avoidable (one can make the “right” choices and catch up with tech) and as an inescapable future of insiders and outsiders – much like the narratives of numerous sci-fi plots.

It was throughout my studying of Tinder that I grew increasingly intrigued by what lies behind the shiny, promising exteriors of technologies and artificial intelligence (AI), which is why I want to continue studying them and their impact on our well-being, social identities, politics, economies and demographic developments. Something I am very curious about is the role algorithms play in how we come to understand ourselves and the world around us, and how we relate to others. After all, the selection of which digital technologies we integrate into our daily lives, and how, significantly shapes how we relate.

The more I read, the more I find myself getting irritated with overly positive representations of AI in particular. Especially when people like Amazon CEO Jeff Bezos shamelessly flaunt their extraordinary wealth by taking a quick trip to space in a phallus-shaped rocket and make some extra cash by selling spare seats to similarly wealthy people. There are encoded assumptions that cannot simply be transgressed. If Tinder only shows me people its coders think I should be swiping for, I can do very little to extend myself beyond this algorithmic restriction. It is in moments when such questions of agency arise, or when products like the new Tesla robot named “Optimus” are developed and Facebook seeks to usher in the dystopian era of the Metaverse, that it is useful to think back to Bezos’s phallus-shaped rocket - as a memento of how the products we use on a daily basis are anything but neutral or necessarily designed for our needs.

AI and big tech are not superior approaches to experiencing the world, nor are they “semi-sentient”, as Elon Musk promises his new human-replacement robot to be. If left unchecked, the trajectory of dataism, inextricably linked to ideas of agency and self-determination, may very well be to the detriment of humanism. Thankfully, this is not a sci-fi movie or a zero-sum game. We are in a position in which we can still decide just how to handle these seemingly inevitable developments that are sprung on us from Silicon Valley and co. We can contextualise these developments and look at them as the political and consequence-heavy projects that they are. “Don’t get left behind” messages in any context should serve as a wake-up call. But instead of letting them induce panic and self-questioning as the advertisers appear to intend, we should treat them as a reminder to consider people at the margins and design appropriate interventions, instead of placing blame in the most inappropriate ways.
