Internet Talk

Most social media websites operate on a simple principle: recommend the content that you spend the most time engaging with. For instance, on YouTube, if you watch a lot of Minecraft content, it will attempt to recommend you more Minecraft videos.

This dynamic is turned up to the max on short-form video platforms like TikTok, where the user has no immediate agency over what they're being recommended: TikTok decides what the next video is, whereas YouTube at least presents a plethora of choices.

This makes it harder to work around the system: if you watch a particular style of video, you will be recommended more of it. But it doesn't strip you of agency entirely. If you notice you're being recommended a style of video you want less of, you can aggressively scroll past every instance of it until the platform recognises you no longer want to see it.

For instance: Instagram Reels will occasionally death-spiral into nothing but recipes for me. To escape the death spiral, I have to deliberately skip past every recipe video, even when I'm interested in the content.

I do this because of the Wall Street Journal's reporting on the algorithm, which showed that it responds primarily to time spent on content. In that sense, I'm not doing this because the algorithm drove me to behave this way: I'm doing it because I understand the algorithm's underpinnings.
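To make the mechanism concrete, here's a toy sketch of the idea as I understand it from that reporting: interest in a category tracks how much of each video you actually watch, so near-instant skips drag that category's score down. Everything here (the class, the category names, the weighting) is invented for illustration; it is not Instagram's or TikTok's actual code.

```python
# Toy sketch of a watch-time-driven recommender. The names and numbers are
# made up; the point is only to show why fast-scrolling past a category
# eventually pushes it down the ranking.
from collections import defaultdict


class ToyFeedModel:
    def __init__(self, alpha=0.3):
        self.alpha = alpha                    # how quickly recent behaviour dominates
        self.scores = defaultdict(float)      # per-category interest score

    def record_view(self, category, watch_seconds, video_seconds):
        """Update interest using completion rate, the stand-in here for 'time spent'."""
        completion = min(watch_seconds / video_seconds, 1.0)
        # Exponential moving average: full watches pull the score toward 1,
        # instant skips pull it toward 0.
        self.scores[category] = (1 - self.alpha) * self.scores[category] + self.alpha * completion

    def ranked_categories(self):
        return sorted(self.scores, key=self.scores.get, reverse=True)


feed = ToyFeedModel()
feed.record_view("recipes", watch_seconds=30, video_seconds=30)   # watched fully
feed.record_view("recipes", watch_seconds=28, video_seconds=30)
feed.record_view("cycling", watch_seconds=45, video_seconds=60)

# "Escaping the death spiral": skip every recipe video almost instantly.
for _ in range(10):
    feed.record_view("recipes", watch_seconds=1, video_seconds=30)

print(feed.ranked_categories())  # cycling now outranks recipes
```

In this toy version the only input is watch time, which is the whole point: there is no explicit "show me less of this" signal, so the one lever left to the user is how fast they scroll.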

Some people find their algorithm entirely divorced from what they actually want to consume. But other Instagram Reels users frequently talk about building their algorithm "brick by brick" - meaning they've deliberately curated their own behaviour to shape the algorithm that serves videos to them.

How much of that is a genuine understanding of the algorithm's mechanisms, versus an intuitive feel built up from watching how the machine reacts? More to the point, is this a desirable outcome, for both the user and the social media companies?

Social media companies don't want you to think about the algorithm: they want you to use it. They want you to stare at your screen and keep scrolling through whatever makes you more likely to stay on the platform, not to manipulate the algorithm into serving you videos you'd actually like.

Watching videos you like sounds like the same thing as watching videos that keep you on the platform, but it isn't. Social media companies don't care if you're hurt, scared or angry; they just want to hold your attention at all times, even to your detriment. The alt-right pipeline is the classic example: the content doesn't serve to make you happy, it serves to make you angry, and to keep you captivated.

When users manipulate the algorithm, they reclaim some level of agency. Many people have argued for giving users direct control over these platforms, and platforms continually refuse, likely because they fear users would end up using them less, even if that use were more conducive to users' needs.

One aspect of this that is particularly gnarly is porn addiction. Most platforms host a fair number of sex workers advertising their services, usually through quasi-SFW but provocative content. If you're an addict, the platform gives you no "off" switch for this type of video. You have to scroll past it, which is problematic because you're trying to work around your instincts - the very thing the platform is preying on.

If users don't understand the basic mechanics of the algorithm, they may not be able to do anything about it. It will therefore be interesting to study whether they're figuring this out on their own.

This set of circumstances is unlikely to change: platforms don't appear to be relinquishing control, and there is no reliable way to detect algorithm manipulation by users. Your only real control? Scrolling as fast as possible.