Two of YouTube's biggest Pokémon Go personalities, Brandon Martyn, a.k.a. "Mystic7," and Trainer Tips' Nick Oyzon, had their channels removed from the platform after the reporting algorithm wrongly flagged several of their videos for sexual content. The reason? Several of their video titles contained the abbreviation "CP," which stands for Combat Power, a measure of how strong a particular Pokémon is in battle. Innocent enough, unless you consider that "CP" is also used as an abbreviation for child pornography.
According to a video posted by Martyn on a second channel, YouTube's algorithm flagged and removed multiple videos for inappropriate content shortly after they were posted. Martyn says he first received two emails notifying him that he had violated YouTube's terms of service, but that he had not received a strike against his channel. These were quickly followed by an email saying that his channel had received a strike and would be automatically terminated.
[EMBED_YT]https://youtu.be/a9op6nT9Yh0[/EMBED_YT] Specifically, the email referred to a video titled "Highest CP Yet in Pokémon Go! Wild Dragonite! How Much CP Will It Be?" However, the video contained only the usual wholesome Pokémon Go-related content. It was obvious to Martyn that no one had actually watched the videos to verify the algorithm's flags before removing his or Oyzon's channels.
Oyzon reached out to the YouTube team via Twitter. Not only was his channel removed, he also lost access to his Gmail account as a result of the strike, meaning he couldn't even see the messages from YouTube explaining why his channel had been taken down. After both channels were back up and he had regained access to his Google account, Oyzon posted his own video on his personal channel explaining what had happened.
[EMBED_YT]https://youtu.be/AMcQ8NDqMws[/EMBED_YT]
In the video, Oyzon says he received emails saying a video containing "CP" in the title had been flagged for inappropriate content by the algorithm. He also calls out YouTube staff for not reviewing the algorithm's flag before removing his channel, as the video obviously did not contain any sexual content. The email even told him that the flag would not result in a strike or account termination, both of which happened anyway.
Unfortunately, this issue is not unique. YouTube has seen a growing amount of content that exploits minors, as well as inappropriate comments directed toward minors on non-sexual videos. To combat this, YouTube has stepped up its efforts to find and remove such content using both automated and human flagging. Its goals are admirable, but the automated flagging systems have become somewhat notorious for ignoring context and wrongly removing videos and channels, harming creators who have not violated the terms of service.
As both Martyn and Oyzon point out in their videos, these incidents can be very damaging for people who rely on YouTube as their main source of income. Even when the issue is resolved and channels are reinstated, creators can lose thousands of subscribers who assume they're gone for good. Removing exploitative content is undeniably a good thing, but YouTube needs to find ways to do so without harming its user base.