YouTube is changing its community guidelines to give creators more leeway when they violate its content rules, the company announced on Tuesday. Creators will now have the opportunity to take an educational training course that explains why their video violated the guidelines. The move comes as social media companies appear increasingly reluctant to moderate content ahead of the 2024 presidential election.
Under the current guidelines, creators who violate the rules receive a warning that lasts for 90 days. Under the new protocol, those who receive a violation notice and take the class will have the warning removed from their channel. The caveat is that offending creators must refrain from posting any videos that would violate the same policy for 90 days.
In 2019, YouTube announced that it would issue a one-time warning to creators who “unintentionally” violated its policies, and claimed in Tuesday’s news release that following the change, 80% of those who do receive a warning “never violate our policies again.” This new measure is meant to cut back on the number of channels YouTube terminates from its platform.
YouTube claims creators told the company that they wanted a way to understand why their videos were in violation of the guidelines, prompting the shift in protocol. “We believe educational efforts are successful at reducing the number of creators who unintentionally violate our policies,” the company said in the release.
The release continued: “We also know receiving a strike can be disruptive to a creator’s posting schedule, and for the creators building businesses through our YouTube Partner Program, receiving an unintentional strike is not only frustrating, but can financially impact their bottom line.”
YouTube says it has not eliminated the three-strike rule, meaning it will still shut down the channels of creators who receive three strikes after a warning that sticks; but by letting violators take the educational course, the company intends to shut down fewer channels.
As YouTube shifts to a kinder, gentler moderation system, the company announced in a June news release that it would no longer remove content making misleading claims or spreading false information about past U.S. presidential elections, such as videos claiming the 2020 election was stolen from former President Donald Trump.
YouTube said in that release that it was updating its guidelines on election misinformation: “The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society — especially in the midst of election season.”
The company said that under the old policy it removed tens of thousands of videos over two years, but found that doing so “could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
YouTube’s new policy change could build on its decision to let misinformation remain on the platform, as channels that would previously have been shut down can continue to operate. Meanwhile, the course creators can take to remove warnings from their channels appears to involve questions aimed primarily at YouTube’s sexual content policies.