The Algorithm and You

The Wall Street Journal ran an interesting experiment, as reported by 9to5Mac, in which it attempted to reverse-engineer the TikTok algorithm and came up with some disturbing results.

TikTok pays particular attention to the videos you linger on and rewatch, then starts feeding you more similar content. The effect snowballs. (According to TikTok, it also looks at what you share, like, and follow.)

In the Wall Street Journal's experiment, the reporters created an account that lingered on sad content. After 224 videos (36 minutes of total watch time), 93% of the videos fed to that account were about sadness or depression. For someone who is already struggling, that will only pull them further down.
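
To make the snowball concrete, here is a toy simulation of that feedback loop. This is not TikTok’s actual ranking system; the topics, watch times, and weighting rule are all made up. The only mechanic it borrows from the reporting is the one that matters: topics you linger on get served more, and topics you skip fade away.

```python
# Toy model of a watch-time-weighted feed (not TikTok's real algorithm).
# A simulated user lingers on "sadness" videos and skips everything else;
# each topic's weight grows by the seconds the user actually watched it.
import random

TOPICS = ["sadness", "sports", "cooking", "travel", "music"]
weights = {t: 1.0 for t in TOPICS}  # start with no preference at all

def watch_time(topic):
    """Simulated viewer: lingers on sad videos, scrolls past the rest."""
    return 30.0 if topic == "sadness" else 0.0

served = []
for _ in range(224):  # same number of videos the WSJ test account watched
    topic = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
    served.append(topic)
    weights[topic] += watch_time(topic)  # engagement feeds straight back into the feed

share = served.count("sadness") / len(served)
print(f"{share:.0%} of the 224 videos served were about sadness")
```

Run it a few times: once the first sad video gets served, the feed tips almost entirely toward sadness within a handful of iterations, roughly the dynamic the Journal observed at scale.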

Companies like TikTok and YouTube design these algorithms to get you to watch more. The money comes from the ads: the more you watch, the more ads they can show you. It’s a simple business model, and there is no room in the algorithm for teaching you something useful, and certainly none for helping you.

Why would you turn that agency over to a glorified ad firm?

One of my pet projects heading into next year is pushing all of these content-consumption algorithms out of my life. No longer will I let companies like TikTok and YouTube decide what gets poured into my brain. There are tools out there to help you curate your own list of content, content meant to help you rather than to send you down a rabbit hole of emotions and lost time. I’ll share more on this as I figure out the workflows, but if you’re currently letting the algorithms pick your media, stop.
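
As one possible example (purely illustrative, not necessarily the workflow I’ll end up with): plain old RSS lets you pull posts only from sources you picked yourself, with no recommendation engine in the loop. The sketch below uses the third-party feedparser package, and the feed URLs are placeholders.

```python
# Minimal algorithm-free reader: fetch the latest posts from a hand-picked
# list of RSS feeds. Nothing here ranks, recommends, or optimizes for watch time.
# Requires the third-party feedparser package (pip install feedparser).
import feedparser

FEEDS = [
    "https://example.com/blog/feed.xml",  # placeholder: a blog you trust
    "https://example.org/news/rss",       # placeholder: a news site you chose
]

def latest_posts(feed_urls, per_feed=5):
    """Return (source, title, link) for the newest entries of each feed."""
    posts = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for entry in parsed.entries[:per_feed]:
            posts.append((source, entry.get("title", "(untitled)"), entry.get("link", "")))
    return posts

if __name__ == "__main__":
    for source, title, link in latest_posts(FEEDS):
        print(f"{source}: {title}\n  {link}")
```

The point isn’t this particular script; it’s that the list of sources is yours, written down where you can see and edit it, instead of inferred from whatever you couldn’t stop watching.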