YouTube seeks an even more addictive environment for its users
Even the company's own workers have affirmed that the Google subsidiary lets offensive, radical and disinformation content through in order to boost its traffic figures.
After no shortage of warnings about the effects its algorithm has caused, Google has set out to modify it, not in a way that avoids the social, medical, cultural and political problems it tends to raise, but in a way that makes it even more addictive. A new report from YouTube's parent company proposes an algorithm update aimed at recommending even more personalized content to users in order to increase visits.
As the MIT Technology Review website explains, echoing the leaked Google document, the proposal for the new YouTube algorithm centers on correcting what the company calls "implicit bias." In plain terms, the company wants better tools to tell whether users watch videos because they are genuinely interested in them or simply because the videos were recommended, in order to serve better recommendations that translate into more time on the site (and greater exposure to advertising).
This will be the new YouTube algorithm
There are two major channels through which YouTube recommends content. The first is the main page, which serves recommendations based on users' tastes over recent weeks: everything they have watched and stopped watching, plus machine learning that links the videos or channels they usually watch with what other viewers of those same videos and channels also tend to watch on the platform.
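The co-watching pattern described above resembles item-to-item collaborative filtering. Below is a minimal, purely illustrative sketch of the idea, where videos are linked by how often different viewers watch them together; the video names, histories and helper function are invented for the example and do not come from the leaked document.

```python
# Illustrative sketch: link videos by how often viewers watch them together.
# All names and histories are hypothetical, not YouTube data.
from collections import Counter
from itertools import combinations

# Each inner list is one (hypothetical) user's recent watch history.
watch_histories = [
    ["cooking_101", "knife_skills", "pasta_basics"],
    ["cooking_101", "pasta_basics", "sourdough"],
    ["knife_skills", "cooking_101"],
]

# Count how often each pair of videos co-occurs across users.
co_views = Counter()
for history in watch_histories:
    for a, b in combinations(set(history), 2):
        co_views[frozenset((a, b))] += 1

def related_videos(video, top_n=3):
    """Rank other videos by how often they were watched alongside `video`."""
    scores = {
        next(iter(pair - {video})): count
        for pair, count in co_views.items()
        if video in pair
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

With the toy histories above, `related_videos("cooking_101")` surfaces `knife_skills` and `pasta_basics` (each co-watched twice) ahead of `sourdough` (co-watched once), which is the "viewers of this also watch that" linkage the article describes.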
But more important than that channel is the related-videos bar that appears in the right-hand column of YouTube whenever content is playing. If the company can get viewers who are already watching one video hooked on another immediately afterwards, it increases its chances of charging for the advertising shown in between and of posting better traffic figures.
The YouTube algorithm works day and night compiling lists of hundreds of videos similar in topic or style to whatever the user is watching at any given moment. It then ranks that list according to each user's preferences, which the algorithm learns by observing every move, however small, users make on the portal. Machine learning acts as a kind of blender for these two factors and presents a ranked list of the recommendations it believes will generate the most interest.
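The two-stage process just described (a large pool of similar candidates, then per-user ranking) can be sketched as follows. The candidate similarities, preference scores and the multiplicative blend are all assumptions made for illustration; the actual model is not disclosed in the leaked document.

```python
# Minimal two-stage sketch: candidate generation, then per-user ranking.
# All videos and scores below are invented for illustration.

# Stage 1: candidate generation — similarity to the video playing now.
candidates = {
    "guitar_lesson_2": 0.9,
    "guitar_lesson_3": 0.8,
    "music_theory":    0.6,
    "drum_basics":     0.3,
}

# Stage 2: per-user preference scores learned from past behaviour.
user_preference = {
    "guitar_lesson_2": 0.4,
    "guitar_lesson_3": 0.7,
    "music_theory":    0.9,
    "drum_basics":     0.1,
}

def rank_recommendations(candidates, preferences):
    """Blend similarity and preference into a single ranked list."""
    scored = {
        video: candidates[video] * preferences.get(video, 0.0)
        for video in candidates
    }
    return sorted(scored, key=scored.get, reverse=True)
```

Note how the blend reorders the pool: `guitar_lesson_2` is the most similar candidate, but a user who engages more with theory content sees `guitar_lesson_3` and `music_theory` ranked above it.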
Here is where what Google calls "implicit bias" comes in: recommendations themselves can affect user behavior, which means the company cannot know for sure whether someone chose a video because it appeared first in the list of recommendations or because they were genuinely interested in it. The company therefore proposes an adjustment to the algorithm to reduce this bias.
The Google document explains that the new model could work like this: every time a user clicks on a video, the video's position in the recommendation bar is also taken into account. A click at the top of the bar carries less weight than one in the middle or at the bottom of the list, because reaching those positions takes more effort from the user, a sign that the content genuinely interests them.
Google researchers found greater user engagement on YouTube when they tested this new system, so they approved the algorithm adjustment. Any increase in user interaction translates into higher revenue for YouTube, so everything indicates this will be the way forward. Despite criticism, YouTube has stated that it determined the change would not amplify filter bubbles and, in fact, expects them to shrink and recommendations to become more varied.

Some experts have criticized YouTube for the change, pointing out that it could foster communities even more isolated than the current ones and extend the reach of the platform's most extreme and radical content. They also regret that YouTube, instead of looking for ways to improve its footprint on society, watches only over its economic interests, which count user addiction among their greatest allies.