Algorithms are the most powerful computational tools in use today. But it’s essential to know how they work, and who they work for, to make the most of them.
A little under 30 years ago, in August 1991, there was a single website on the internet: the World Wide Web Project. Today, the size of the online sphere is incomprehensibly vast: as of this writing, the Indexed Web contains at least 5.85 billion pages.
It is no longer possible to browse the internet without powerful mediation — to experience the net directly is an exercise in wasting time, drowning in irrelevance. The giants of the internet have built themselves on creating order in the chaos, establishing platforms that promote and simultaneously control content.
Algorithms have been their most potent tools, the lifeboats that carry us over stormy seas, helping us make sense of our time online and find delight in it.
There is something sublime and enticing about the idea of an algorithm. An algorithm is essentially a finite, well-defined set of instructions. It is automated and unambiguous, as far from the meddlesome nature of culture and creativity as you can get.
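To make that definition concrete, here is a classic textbook example, binary search: a finite, unambiguous recipe that turns the same input into the same output, every time, with no room for interpretation.

```python
def binary_search(items, target):
    """A finite, well-defined set of instructions: repeatedly halve a
    sorted list's search range until the target is found (return its
    index) or the range is empty (return -1)."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([2, 5, 8, 12, 23, 38], 23))  # prints 4: same input, same answer
```

Nothing here is ambiguous or cultural: each step is fully determined by the one before it. The "algorithms" that curate feeds are vastly larger, but at bottom they are the same kind of object.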
But through the complexity of the calculations they perform and the vast quantity of data we input into them, they’ve morphed into something like intelligent machines. They are powerful entities that toil at extracting tailor-made intellectual pleasure for us. They’re our friends, our confidants, our librarians and our authority figures, all packed into a benevolent presence whirring in the background to our lives.
The Shape of Choices
A good platform, through its algorithms, creates an entire ecosystem of content that the user perceives as a fully formed reality. This is not an accident: the more seamless and effective the design, the better the customer retention. It makes for a smooth user experience, but not necessarily for the user's benefit.
Through specificity and optimisation, what algorithms and filters are creating are content bubbles. We may know there is a world outside of the information we are presented. Still, as long as the bubble keeps us feeling informed and entertained, we have no interest in looking outside of it. At the extreme end, we might begin to forget there is anything else.
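The bubble-forming dynamic is a feedback loop, and it can be seen in even a toy simulation. The sketch below is a deliberately naive engagement-maximising recommender with invented data; real systems are far more complex, but the snowball effect is the same: early clicks decide what you see, and what you see decides your later clicks.

```python
import random

def recommend(click_counts):
    """Naively show whichever topic got the most past clicks."""
    return max(click_counts, key=click_counts.get)

random.seed(1)
clicks = {"politics": 1, "science": 1, "sport": 1}  # start with no preference
feed = []
for _ in range(10):
    topic = recommend(clicks)
    feed.append(topic)
    if random.random() < 0.8:   # users mostly click what they're shown
        clicks[topic] += 1

print(feed)  # a single topic quickly dominates the simulated feed
```

After the very first recommendation, one topic pulls ahead and the others never appear again: the bubble has closed without anyone deciding it should.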
Then there is the question of how well people understand the technology. A 2015 study suggested that more than 60% of Facebook users were unaware that their content was being curated. Though awareness of data mining and algorithms has risen since, these technologies are widely talked about, but not necessarily understood.
The first problem is that of interest: the casual consumer only cares when the algorithm fails. You find it funny when you’re targeted with a miscalculated ad; you’re annoyed when your feed is full of stale and pointless content. The more data companies mine and analyse to hone the machines, the fewer aha moments users are going to have.
The people outside the tech industry actually interested in the nitty-gritty of the algorithms are those who stand to profit from them: writers, advertisers, small business owners and so on. Practically all resources speculating on the workings of algorithms centre on how to maximise profits using them. Whilst it's useful to have those resources and that discussion, it is a decidedly cold and non-humanistic way to examine the impact of our content bubbles.
An Ocean of Relevance
With traditional media, we at least had the benefit of scanning headlines and knowing what it was we were choosing not to consume. Now there is but a vague idea of the world outside our feeds. That is, if we even care to think about what lies outside our belief system. If we see stories perfectly tailored to our world view, they are less likely to come across as bait and more as proof that "I was right all along". In other words, fuel for your innate confirmation bias.
One could say, and Facebook has said in the past, that the biases and limitations of the content they create are similar to the human tendency to discard challenging beliefs. But that is not entirely true: if we choose to judge a piece of information as irrelevant or lacking, we’ve still engaged with and thought over the contradictory information.
If we have never seen the contradiction, then we never had a chance to engage with and elaborate on our own beliefs. Thus we are robbed of the mechanisms of adding nuances and depth to our thoughts.
The Ghost in the Machine
One must remember algorithms are human creations. Yes, they are sophisticated instructions for the handling of large data sets given to powerful computers. Still, they are instructions written by flesh and blood human beings.
Calling a set of carefully considered directions and specifications an 'algorithm' is a way to deify it and absolve the designers of responsibility for the consequences of their creations. It is pretending that the machine chose the music simply because it is the one doing all the dancing.
One proof of the ‘human’ nature of the algorithm is how perturbed the companies behind them are by users trying to ‘game the system.’ They promote an image of the algorithm as an inscrutable black box, then punish those that figure out too well what’s in the box.
It is as if, after setting up the marionettes, platforms want to pretend the little strings do not exist. And there are repercussions when users decide to pull on them too generously. One example was YouTube creators directly calling on viewers to like and comment on videos to boost their visibility. The platform responded by tweaking the algorithm to disregard likes in its relevancy metrics.
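YouTube's actual ranking formula is proprietary and not public; the weights and signal names below are pure invention for illustration. The point is how small such a "tweak" can be: if relevance is a weighted sum of engagement signals, disregarding likes is just zeroing one weight, and an entire like-begging campaign stops working overnight.

```python
# Hypothetical relevance score; signal names and weights are invented.
WEIGHTS = {"watch_time": 0.7, "comments": 0.3, "likes": 0.0}  # likes now ignored

def relevance(video):
    """Weighted sum of a video's engagement signals."""
    return sum(WEIGHTS[signal] * video.get(signal, 0) for signal in WEIGHTS)

spammy = {"watch_time": 10, "comments": 2, "likes": 500}  # inflated by like-begging
solid  = {"watch_time": 40, "comments": 5, "likes": 20}

print(relevance(spammy) < relevance(solid))  # True: the 500 likes no longer help
```

A one-character change to a weight, invisible to the outside world, quietly rewrites the rules that thousands of creators had reverse-engineered.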
While banning outright manipulation, the platforms also publish guides to help creators succeed. But these instructions can be vague or at odds with what the algorithm favours. It's easy to say that posting regular content is excellent; it's quite another thing to disclose the optimal posting rate, or the time of day at which you'll get the most traction.
Algorithmic optimisation also makes unnecessary demands on creators. It influences the volume of uploads, how eye-catching the visuals are, the use of hopeful and inspiring language. All of these must be present, even when they aren't the best editorial choices. The preferences can go so far as to warp the end product, such as YouTube creators purposefully drawing out their videos to increase platform rank at the cost of quality and pacing.
The Anchors of the Ocean
As long as the algorithm serves us diligently, we hardly even notice it. If you get an ad for a genuinely exciting and engaging product, the chances of you contemplating the invasive nature of the recommendation are slim to none. Far more likely, you'll simply appreciate the convenience.
The problems begin when you are targeted with an advertisement for things you have only a passing interest in. A finely tuned advertising machine can be cruel in how persuasive it is. Just as people grow emotionally attached to complicated devices like cars, algorithms can lull us into false familiarity. We can end up thinking they're on our side when they recommend things we don't actually need.
Algorithms play into a notion of the internet as a tool for rationality. The ads they produce speak of order, efficiency, objectivity and the like. They dispose of classical types of corruption, such as nepotism and invasive sales tactics. But just because they're free of the familiar forms of bias doesn't mean they're unbiased.
This is where the power of narrative comes into play: the algorithms define the terms on which you think of your choices. Your points of reference when making a decision will be what you've seen first. And what you see first will be what the algorithm has picked out for you.
This is down to a phenomenon known in psychology as anchoring: when presented with a breadth of choices, the human mind latches onto a focal point as an anchor for its decisions. Usually, that focal point is simply whatever came first.
And so algorithms create the framework for any decision that happens after interacting with them, channelling your attention from the word go. Even if you want to venture outside the digital mainstream, your mind will drag you back to what it sees as the familiar.
Pigeonholed to Ennui
Companies like to boast in their promotional material that they are geared towards quality content. That is patently untrue: their express goal is profit. Thus the calculations behind their products prioritise time spent on the platform and the profitability of content. Quality is never discussed beyond audience appeal since, in theory, better content means bigger audiences.
The world is carefully curated for us, yet we can find ourselves bored, looking at endless reflections of our past selves. The theory of quality is far from the practice. YouTube, for instance, has tended to allow controversial content on the platform rather than lose audiences, acting only when faced with massive backlash. Confronted with a dilemma such as weighing extremism against free speech, it has chosen policies that minimise audience and revenue loss rather than enforce strict quality control.
Misunderstanding the way algorithms work is a basis for demonising them and jumping to conclusions about what they can and can't do. In the absence of advanced degrees in AI and machine learning, most users resort to folk theories of how they work. Platforms and their designers share the blame: they either don't talk about the design of their code, talk about it vaguely, or retreat into complicated technical explanations.
It’s essential to try and understand how algorithms work, at least conceptually, so you can make better use of them and avoid pitfalls. You do not have to be a computer scientist to know that your view of the internet is being slightly distorted.
Furthermore, platforms can work for you even if you are not a creator or marketer. Though they’re designed with content producers and profits in mind, you, the consumer, are still the one that stands to win or to lose the most.
Of course, in a world of privacy breaches and hidden surveillance, it is utopian to talk of absolute control over anything you do online. What you can do is minimise the amount of power an algorithm has over your virtual reality. You can do this by actively seeking out the unexpected, and not providing personal information you are not comfortable being defined by.
Make a substantial effort to branch out into new interests. Search for content curated by humans from different walks of life, at different times in history. Do not limit yourself to a single platform. Survey the data you’ve provided the platform and how comfortable you are with its use, storage, analysis etc.
Feed the algorithm not with your biases and preconceptions, but with a desire for breadth and quality. Live life outside the bubble.