
Lectio Praecursoria: Encounters with Algorithmic Systems, through the Game Metaphor

The game metaphor could be more widely used to understand our relationships with algorithmic systems beyond the usual notion of ‘gaming the system’, suggests Jesse Haapoja in his doctoral thesis "Encounters with algorithmic systems, through the game metaphor", which was examined at the University of Helsinki on May 5, 2022. This post is the lectio praecursoria presented at the public examination.

“Algorithm” is a word that you will hear multiple times during this event. When someone explains what an algorithm is, they often use the metaphor of a recipe. And the description of an algorithm certainly does resemble a recipe: a list of steps to arrive at the desired outcome from certain inputs. When it comes to algorithms, we usually make computers take these steps on our behalf.
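
To make the recipe metaphor concrete, here is a minimal sketch of my own (not an example from the thesis) of what such a list of steps can look like when written for a computer; the task of finding the largest number in a list is chosen purely for illustration.

```python
def largest_number(numbers):
    """Follow a fixed list of steps to turn an input (a list of numbers)
    into a desired outcome (the largest number in that list)."""
    # Step 1: start by assuming the first number is the largest.
    largest = numbers[0]
    # Step 2: go through the remaining numbers one by one.
    for number in numbers[1:]:
        # Step 3: whenever a number is bigger than the current candidate,
        # make it the new candidate.
        if number > largest:
            largest = number
    # Step 4: once every number has been checked, return the result.
    return largest


print(largest_number([3, 7, 2, 9, 4]))  # prints 9
```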

Why is it then that you are now listening to a social scientist discussing algorithms? Should this topic not be in the domain of computer science? The reason is that this abstract thing that can be made to sound fairly simple has crept its way into our daily lives in ways that imply that maybe we are not dealing with such simple things, after all. We live in a society of algorithms, as Burrell and Fourcade have aptly stated. The massive amounts of data collected from our everyday activities fuel these systems. There is also an almost religious fervor with which we as a society try to automate tasks that a fairly short time ago were thought to be exclusively in the domain of human decisions. These developments have led us to live lives where computer algorithms can be claimed to be almost everywhere. Perhaps we could be discussing software, rather than algorithms, as we tend not to engage with algorithms on their own but as parts of a bigger system. However, the term algorithm has, at least for the moment, stabilized its position both in the academic literature and in the everyday discussions that individuals have with each other. I believe everyone here has heard someone mention “social media algorithms” or “Facebook algorithms”, although perhaps in a less academic context than the one we find ourselves in today.

And algorithms, as stated, are everywhere: that is why they come up in discussion so frequently. When you hear someone discussing artificial intelligence, they are talking about algorithms – often rather complex ones. Algorithms are used to suggest routes in navigation applications – in fact, I believe that at least a few of you have typed the address of this building into such an application today. These technologies also try to guess our tastes in different services that offer us content such as movies or music, with consequences for how we spend our free time. Algorithms both mediate and moderate our relationships – people even start families with partners they have met through a dating algorithm. In essence, we could go so far as to argue that algorithms are breeding humans, and we seem to be quite okay with it.

In this situation, which no doubt would have looked quite strange to someone who lived just some decades ago, our daily dealings increasingly include interactions with algorithmic systems. I prefer the term algorithmic system, rather than algorithm, as what we are dealing with is much more than lines of code. While code is essential for these systems, data and humans, both as creators of technology and as users of it, are also undeniably significant parts of what we encounter. We are not interacting with “raw algorithms” but rather with the different services and organizations they are embedded in. In other words, for me, algorithms are interesting as part of human activities, where they become part of our social lives.

Now, while algorithms may seem cold and distant, it is important to remember that these systems do not come from a void. Rather, they are made by humans to serve human interests and are often tinkered with by their creators. They are thus constantly in flux, being redefined by their connections to humans. You only need to look at a friend’s social media feed to see that while the technology you engage with may be identical, the choices you have made and the relationships you have make your experience of the same service very different. What an algorithm is, or means, for us at any given moment can then be understood as situation-specific, something that could be otherwise in a different context.

While it is clear that we are now living with algorithms, how exactly should this life be understood? Our relationships with these systems certainly do require us to study them. Technologies that have spread so widely have, without a doubt, implications. In my thesis, I offer one way to understand the life we live with these systems. I set out to answer this question: if we treat encounters with algorithmic systems as games, what can we learn about those situations? To answer this, I draw from Erving Goffman’s work on games and frame analysis to study encounters individuals have with different kinds of algorithmic systems. It is not my intention to claim that this perspective is the most important one for approaching this topic. Rather, I will, during this defence, show you that it is one fruitful approach to making sense of our lives in a society of algorithms, and that we can indeed learn something about encounters with algorithmic systems with the help of the game metaphor.

When the game metaphor is applied as a perspective through which everyday life is looked at, certain aspects of the everyday are highlighted. It perhaps gives more credit to our capability to be calculative in our daily doings than is necessary. However, this calculative side is a part of us humans, too. The game metaphor can also be used as more than a tool for focusing on this calculation. As my thesis shows, the metaphor is a flexible tool that allows us to look not only at an actor but also at the actor’s relationship to their surroundings and their capabilities to alter their immediate situations. This perspective sees humans as agents: not as all-powerful or separate from their surroundings, but as able to, or at least trying to, act in a sensible way in relation to their immediate situation, which may include other actors, too. Additionally, if we consider daily life as a game, we can also start asking whom or what we are playing for. While intuitively we may think that someone who is playing does so for egoistic reasons, it is quite clear that in everyday life we also act on behalf of others and ask others to act on behalf of us. While we may play to serve certain interests, they are not always our own. Thus, what it means for the players of these games to win or lose depends on whose interests they are playing for. This perhaps complicates one of the core concepts of human-computer interaction: that of a user. If you are using a service on behalf of another, who is the user in that situation? And when one takes on different principals to serve, the use of algorithmic services may go against what the designers of these services have had in mind.

I also draw from the concept of frames, another key concept in Goffman’s scholarship, and use it to supplement the game metaphor. Frames act as an answer to the question “what is it that is going on here?”. We are all now participating in the frame of a doctoral defence, and we have very different roles in that frame. This shared frame ensures that we know what is happening and how we should act. The concept offers a way to go deeper into the idea that we try to alter and define our immediate situations in ways that suit our goals – or so that the understanding others have of the situation serves our aims. This point is especially important with regard to algorithms, as they do not see the world as we do, but through the data trails left by our actions. By manipulating the data available to algorithms, individuals in practice alter what these systems can “see” and thus what kind of frame is visible to them. Attempts to turn algorithms to serve us in ways unintended by their creators – acts which are often called “gaming the system” – hinge on manipulating the data that the algorithms base their decisions on. Gaming these systems can then be seen as action in which one frames a situation so that an algorithm’s interpretation of it furthers one’s goals. Again, we can see that here the users, those engaging with these systems, may challenge the aims that the creators of the systems have had. Users may even bend these systems so that they are used against other individuals: one example is how some police officers in the United States have played copyrighted music aloud while members of the public were recording them, in the hope that the content moderation algorithms of social media services would block the dissemination of these videos.

In my use of the game metaphor, I treat encounters with algorithmic systems as relational processes in which different actors, using different kinds of moves, construct the games they play. The actions they take in relation to each other and to the situation as a whole, including the technologies present, define what the game is about. Individuals thus, through their moves, uphold and change the frames of the situations they find themselves in. My thesis shows, for example, how algorithms can be used to give new meanings to human activities by reframing them, and how this reframing can be considered a move in a game by those deploying the algorithm into the world. Frames also act as the borders of games, delineating what belongs to the game at hand and what is outside of it. Players may bring additional elements into the frame of the game from beyond its borders – or beyond its brackets, as Goffman named the borders of frames – and thus alter the game at hand.

My thesis draws together research published in three articles, all of which use qualitative methods. Data for the articles has been collected via different means, including interviews, online discussions, and written accounts. Two of the articles included in the thesis focus on recommender systems or, more broadly, personalisation. This usually refers to services that are built to predict our preferences and recommend material that we could be interested in, based on our prior behavior. Another system that has been under scrutiny is one that was used for automated hate speech detection. These technologies are often controversial, as they are deployed in situations where there are disagreements about what is acceptable behavior online. The three studies, when read in tandem, illustrate how algorithmic systems can be used to create and alter definitions of the situations individuals find themselves in, how we may relate to algorithms as allies or enemies depending on the situation, and how we may act towards these systems on behalf of someone else – and likewise use someone else to engage with them in our stead.
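
For readers unfamiliar with how such personalisation works in its simplest form, the sketch below is my own illustration, not one of the systems examined in the articles; the catalogue, field names, and ranking rule are invented for the example. It shows one naive way a service could turn prior behaviour into recommendations.

```python
from collections import Counter

# Invented example catalogue: each item has a title and a genre.
CATALOGUE = [
    {"title": "Night Drive", "genre": "thriller"},
    {"title": "Slow River", "genre": "documentary"},
    {"title": "Second Chances", "genre": "drama"},
    {"title": "Cold Trail", "genre": "thriller"},
]


def recommend(watch_history, catalogue, n=2):
    """Guess a user's taste from prior behaviour and suggest unseen items."""
    # Step 1: infer a preference profile from what the user has already watched.
    genre_counts = Counter(item["genre"] for item in watch_history)
    # Step 2: rank unseen items by how well they match that profile.
    seen_titles = {item["title"] for item in watch_history}
    unseen = [item for item in catalogue if item["title"] not in seen_titles]
    ranked = sorted(unseen, key=lambda item: genre_counts[item["genre"]], reverse=True)
    # Step 3: return the top suggestions.
    return [item["title"] for item in ranked[:n]]


history = [{"title": "Cold Trail", "genre": "thriller"}]
print(recommend(history, CATALOGUE))  # prints ['Night Drive', 'Slow River']
```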

As a recap, the main findings of my dissertation can be summarised as follows: first, metaphorical games can be created and altered via various moves, including those made with or by algorithmic systems; second, particular frames of games are separated from the surrounding reality by borders, or in other words brackets, that support frame-specific meanings; third, individuals may play for others or enrol others to play for them, which affects what it means to win or lose in the metaphorical game, as that depends on who or what we are playing for; and finally, controlling what is visible to a specific player is especially important in games that include algorithms, since algorithms cannot understand most changes in the frame of activity.

My findings illustrate how encounters with algorithmic systems can be understood as relational processes in which different actors, through the moves they take, both working together and against each other, define things such as the meanings, the rules, and the roles that different aspects of the situation have. My thesis shows that humans are active in relation to algorithmic systems, rather than passive cogs in some sort of machine. These systems are human creations, and furthermore, they are redefined by humans in different situations in ways that may go against the intentions originally behind their deployment – or they may be used exactly as intended, if the user sees that as useful for themselves.
