TikTok users often express admiration or dismay at the apparent accuracy of the app’s recommendation algorithm. The Wall Street Journal posted a video today that dives into how TikTok customizes your feed.
WSJ’s researchers conducted an experiment in which they created bot accounts with pre-assigned interests. The bots “watched” videos on TikTok, pausing on or replaying any clip with images or hashtags related to those interests. The WSJ team reviewed the results with Guillaume Chaslot, an algorithm expert who previously worked at YouTube.
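WSJ hasn’t published its bot code, but the behavior it describes is simple to sketch. Here is a minimal, hypothetical version in Python; the `Video` fields, the dwell-time values, and the interest-matching rule are all assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    """Hypothetical stand-in for a clip surfaced on the For You feed."""
    video_id: str
    duration_s: float
    hashtags: set[str] = field(default_factory=set)
    image_labels: set[str] = field(default_factory=set)  # e.g. image-classifier output

def matches_interests(video: Video, interests: set[str]) -> bool:
    """True if the clip's hashtags or detected imagery overlap the bot's interests."""
    return bool((video.hashtags | video.image_labels) & interests)

def watch(video: Video, interests: set[str]) -> float:
    """Return the simulated dwell time, the bot's only signal to TikTok.

    Per the WSJ's description, the bots never like, comment, or follow;
    they linger on (or replay) matching clips and scroll past the rest.
    """
    if matches_interests(video, interests):
        return video.duration_s * 2  # watch to the end, then replay once
    return 1.0                        # skip after about a second

# Example: a bot assigned an interest in astrology
bot_interests = {"astrology", "horoscope"}
clip = Video("v1", duration_s=15.0, hashtags={"astrology", "fyp"})
print(watch(clip, bot_interests))  # 30.0: paused on and replayed
```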
The results are consistent with TikTok’s own explanations of how its recommendations work. TikTok has previously said the For You feed is tailored based on which types of videos you interact with, how you interact with them, video details, and account settings such as language and location.
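TikTok has not published its ranking model, but the factors it names suggest a weighted-score shape. The sketch below is purely illustrative: the signal names mirror the factors TikTok has described publicly, while the weights and field names are made up:

```python
# Illustrative only: TikTok has not disclosed its actual ranking formula.
WEIGHTS = {
    "watched_to_end": 3.0,    # how you interact: full watches count heavily
    "replayed": 2.0,
    "liked": 1.5,
    "shared": 1.5,
    "hashtag_affinity": 1.0,  # video details: hashtags, sounds, captions
    "locale_match": 0.5,      # account settings: language and location
}

def score(candidate_signals: dict[str, float]) -> float:
    """Weighted sum of per-video signals; higher = more likely to be shown."""
    return sum(WEIGHTS[k] * v for k, v in candidate_signals.items() if k in WEIGHTS)

# A clip the user replayed, whose hashtags match their watch history
print(score({"watched_to_end": 1, "replayed": 1, "hashtag_affinity": 0.8}))  # 5.8
```

Under a scheme like this, dwell time dominates everything else, which is why the bots above could steer their feeds without a single like or follow.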
If you linger on a weird video that merely caught your attention, the algorithm can’t necessarily differentiate that from content you like and want to see more of. That’s why some people end up with For You recommendations that don’t seem to reflect their actual interests.
Although humans have more varied tastes than bots, the experiment shows how quickly a user can be steered toward potentially harmful content. Per the WSJ, TikTok figured out some of the bots’ interests in as little as 40 minutes. One of the bots fell into a rabbit hole of depression videos, while another ended up watching videos about election conspiracies. As Will Oremus points out on Twitter, though, algorithmic rabbit holes can also lead people to positive content.
The video includes a lot of detail and visualizations, so it’s a great way to wrap your head around how TikTok’s algorithm works. Watch the video above or on the WSJ’s site, but be warned: it contains clips of TikToks referencing depression, suicide, and eating disorders.