
Escaping Echo Chambers
Patsy Mboumba
Instagram is one of the world’s leading social media platforms, with over one billion users visiting it each month worldwide. With millions of eyes on the platform at any given time, Instagram works hard to show each of us the content we want to see, and it uses an algorithm to do it.
Algorithms play an integral role in our everyday social lives. Hard lines of code shape what our Instagram feeds look like today, showing us more of what coders think we like and less of what we don’t. According to Instagram’s own guidelines, algorithms decide which posts to recommend to you as you scroll and which stories display first on your feed.
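In rough terms, a feed ranker of this kind can be imagined as a scoring function: every candidate post gets a number, and the feed is sorted so the highest scores appear first. The Python sketch below is purely illustrative; the signals, weights, and the ‘affinity’ measure are assumptions for the example, not Instagram’s actual, proprietary formula.

```python
# A minimal sketch of engagement-based feed ranking. Signal names and
# weights are invented for illustration, not Instagram's real formula.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    hours_old: float   # time since posting
    affinity: float    # hypothetical 0-1 measure of the viewer's interest in this author

def score(post: Post) -> float:
    """Combine the assumed signals into a single ranking score."""
    engagement = post.likes + 3 * post.comments   # comments weighted higher (assumption)
    freshness = 1 / (1 + post.hours_old)          # newer posts score higher
    return engagement * freshness * (0.5 + post.affinity)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Sort so the highest-scoring posts appear first in the feed."""
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("close_friend", likes=40, comments=5, hours_old=2.0, affinity=0.9),
    Post("big_brand", likes=900, comments=12, hours_old=30.0, affinity=0.1),
])
print([p.author for p in feed])  # the friend's fresh post outranks the brand's stale one
```

Small choices like these weights are exactly where a designer’s assumptions enter the system: whoever picks the numbers decides whose posts rise.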
These features are designed to give each user a seamless, personalised media experience, and they rely on sophisticated recommendation technology. Yet the algorithms themselves are written by fallible humans, and they tend to be biased in the kinds of content they present.
Body visibility on social media plays a key role in shaping how certain bodies are perceived, and the Instagram algorithm is a significant driver in determining which bodies are visible, a role that many are beginning to decry as unjust.

Researchers like Joy Buolamwini at MIT investigate the biases in software from companies like Amazon, which was called out for inherent biases in its facial recognition technology. The software misidentified or failed to register black faces while easily recognising white ones.
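Audits of this kind are, at bottom, a simple disaggregated measurement: run the software on faces from different groups and compare the error rates. The sketch below, with invented numbers, shows the shape of such an audit; it is not Buolamwini’s actual code or data.

```python
# A minimal sketch of a demographic error-rate audit, the kind of test
# Buolamwini's research popularised. All results here are invented.
from collections import defaultdict

# (group, recognised_correctly) pairs from a hypothetical test run
results = [
    ("darker-skinned", False), ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, correct in results:
    totals[group] += 1
    errors[group] += not correct  # True counts as 1

for group, n in totals.items():
    print(f"{group}: error rate {errors[group] / n:.0%}")
# darker-skinned: error rate 75%
# lighter-skinned: error rate 25%
# A gap this large between groups is the signature of a biased system.
```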
We use social media to understand what’s happening in the world around us and to connect with others. Media scholars and members of the public alike have raised questions about how and why the algorithm favours certain voices and visuals while suppressing others. Scholars like David Croteau and William Hoynes argue that the media we consume shapes the way we see the world and what we view as normal. Algorithms, in turn, can shift how we see ourselves, others, and the world around us.
Dr Marion Walton, a professor with the Centre for Film and Media Studies at the University of Cape Town, says that algorithms are often linked to the identity of the coder. “Algorithms often embody the biases of whoever has made the decision to set them up,” she says. “There’s a business model behind it,” she adds, referring to the way platforms are moderated. She believes that social media companies are more focussed on efficiency and profit than on deploying the resources needed to create algorithms that are both functional and fair.
Algorithms often operate with a form of artificial intelligence that uses historical user data to make decisions. These algorithms should be sensitive to a diverse range of languages and social behaviours, but the coders behind them are predominantly white and male, according to an article published by Harvard Business Review. When they write code to interpret the behaviour and desires of others, that code is inherently based on their own perspectives and assumptions about the world.
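The mechanism is easy to see in miniature. In the hypothetical sketch below, a ‘model’ that simply learns promotion rates from an invented history faithfully reproduces whatever favouritism that history contains; nothing in the code is malicious, yet the bias persists.

```python
# A minimal sketch of how "learning from historical data" carries bias
# forward. The history is invented: posts from group_a were promoted 90%
# of the time, posts from group_b only 20% of the time.
from collections import defaultdict

history = ([("group_a", True)] * 90 + [("group_a", False)] * 10
           + [("group_b", True)] * 20 + [("group_b", False)] * 80)

promoted, totals = defaultdict(int), defaultdict(int)
for group, was_promoted in history:
    totals[group] += 1
    promoted[group] += was_promoted

def predicted_promotion_rate(group: str) -> float:
    """The 'model' simply reproduces whatever the historical data shows."""
    return promoted[group] / totals[group]

print(predicted_promotion_rate("group_a"))  # 0.9: past favouritism, carried forward
print(predicted_promotion_rate("group_b"))  # 0.2: past neglect, carried forward
```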
Because of this, Walton believes that algorithms can be deeply harmful regardless of the intentions behind their design.
“Bias is inevitable, so we need to decide what values we build into these technologies and what data is being used to train the machine,” she explains. Walton is sceptical of the algorithm’s one-size-fits-all approach. “Are we allowed to opt out of these systems when we believe they do not work for us?” she asks.
One mechanism of bias attributed to the Instagram algorithm is known as ‘shadow banning’. When this occurs, the algorithm quietly stops surfacing certain accounts’ content in the feeds of their followers and other Instagram users. Because the algorithm prioritises content it deems desirable, ‘undesirable’ content is subtly censored. Shadow banning is a concern for anyone who doesn’t conform to the type of content the algorithm favours. And when creators do choose to conform, their content becomes, by necessity, more homogeneous and streamlined.
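Critics describe the effect, in essence, as a silent visibility cutoff. The hypothetical sketch below shows the logic they allege: posts scored below a threshold are never distributed, and the creator receives no notice. The threshold and ‘desirability’ scores are invented for illustration; Instagram has not published any such mechanism.

```python
# A hypothetical sketch of the logic critics call 'shadow banning':
# posts scored below a cutoff are silently never distributed. The
# threshold and 'desirability' scores are invented for illustration.

SHADOW_THRESHOLD = 0.4  # assumed cutoff

def distribute(posts: list[dict]) -> list[dict]:
    """Return only the posts that will actually reach followers' feeds."""
    visible = [p for p in posts if p["desirability"] >= SHADOW_THRESHOLD]
    # Everything else is dropped silently: the creator still sees the post
    # on their own profile, but followers are never shown it.
    return visible

queue = [
    {"author": "conforming_creator", "desirability": 0.8},
    {"author": "nonconforming_creator", "desirability": 0.2},
]
print([p["author"] for p in distribute(queue)])  # only the conforming creator appears
```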
