What is the difference between hate speech and harassment?

September 8, 2021


Our hate speech policy protects specific groups and members of those groups, and we remove content that violates it. We consider content hate speech when it incites hatred or violence against groups based on protected attributes such as age, gender, race, caste, religion, sexual orientation, or veteran status.

This policy also covers common forms of online hate such as dehumanizing members of these groups; characterizing them as inherently inferior or ill; promoting hateful ideologies like Nazism; promoting conspiracy theories about these groups; or denying that well-documented violent events took place, like a school shooting.

Our harassment policy protects identifiable individuals, and we remove content that violates it. We consider content harassment when it targets an individual with prolonged or malicious insults based on intrinsic attributes, including their protected group status or physical traits.

This policy also includes harmful behavior such as deliberately insulting or shaming minors, threats, bullying, doxxing, or encouraging abusive fan behavior.

How does YouTube manage harmful conspiracy theories?

As a part of our hate and harassment policies, we prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence. One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate.

As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up. Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility.

How does YouTube enforce its hate speech and harassment policies?

Hate speech and harassment are complex policy areas to enforce at scale, as decisions require nuanced understanding of local languages and contexts. To help us consistently enforce our policies, we have review teams with linguistic and subject matter expertise.


We also deploy machine learning to proactively detect potentially hateful content and send it for human review. Each quarter we remove tens of thousands of videos and channels that violate these policies. For channels that repeatedly brush up against our policies, we take more severe action, including removing them from the YouTube Partner Program (which prevents the channel from monetizing), issuing strikes (content removal), or terminating the channel altogether.
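
To make the shape of this workflow concrete, here is a minimal, purely illustrative Python sketch of a "flag for human review" pipeline. The classifier, threshold, and queue names are invented for illustration and do not describe YouTube's actual systems.

```python
# Illustrative sketch only: a simplified "flag for human review" pipeline.
# The model, threshold, and queue are hypothetical stand-ins.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Video:
    video_id: str
    transcript: str

@dataclass
class ReviewQueue:
    items: List[Video] = field(default_factory=list)

    def enqueue(self, video: Video) -> None:
        self.items.append(video)

def triage(videos: List[Video],
           score_fn: Callable[[str], float],
           queue: ReviewQueue,
           threshold: float = 0.8) -> None:
    """Send potentially violative videos to human reviewers; nothing is auto-removed here."""
    for video in videos:
        if score_fn(video.transcript) >= threshold:
            queue.enqueue(video)

# Toy stand-in for a trained classifier (a real system would use an ML model).
def toy_score(text: str) -> float:
    return 0.9 if "slur" in text.lower() else 0.1

queue = ReviewQueue()
triage([Video("abc123", "contains a slur"), Video("def456", "cooking tutorial")],
       toy_score, queue)
print([v.video_id for v in queue.items])  # ['abc123']
```

The point of the sketch is the division of labor: automated detection only routes content into a human review queue, where reviewers with linguistic and subject matter expertise make the final call.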

Do these policies disproportionately affect political voices YouTube disagrees with?

When developing and refreshing our policies, we make sure we hear from a range of different voices, including Creators, subject-area experts, free speech proponents, and policy organizations from all sides of the political spectrum.

Once a policy has been developed, we invest significant time making sure newly developed policies are consistently enforced by our global team of reviewers, based on objective guidelines, regardless of who is posting the content. We have created a platform for authentic voices which empowers our diverse community of Creators to engage in a vigorous exchange of ideas.

Are there any exceptions to enforcing the hate speech policy?

YouTube is a platform for free expression. While we do not allow hate speech, we make exceptions for videos that have a clear educational, documentary, scientific, or artistic purpose. For example, a documentary about a hate group may contain hate speech, but we may allow it if the documentary intent is evident in the content, the content does not promote hate speech, and viewers are given sufficient context to understand what is being documented and why. This, however, is not a free pass to promote hate speech, and you can flag content to our teams for review if you believe it violates our hate speech policies.

How does YouTube address repeated harassment?

We remove videos that violate our harassment policy. We also recognize that harassment sometimes occurs through a pattern of repeated behavior across multiple videos or comments, even when individual videos may not cross our policy line. Channels that repeatedly brush up against our harassment policy will be suspended from the YouTube Partner Program (YPP), eliminating their ability to make money on YouTube, to ensure we reward only trusted Creators. These channels may also receive strikes (that could lead to content removal) or have their accounts suspended.
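
As a rough illustration of how pattern-based enforcement differs from per-video enforcement, here is a hypothetical Python sketch. The scores, thresholds, and action names are invented for illustration and are not YouTube's real criteria.

```python
# Illustrative sketch only: aggregating per-video harassment signals into a
# channel-level pattern, so repeated borderline behavior can be escalated
# even when no single video crosses the policy line.
from typing import List

PER_VIDEO_REMOVAL = 0.9   # hypothetical: a single video above this crosses the line
PATTERN_AVERAGE = 0.6     # hypothetical: sustained borderline behavior across videos
PATTERN_MIN_VIDEOS = 5

def channel_action(scores: List[float]) -> str:
    if any(s >= PER_VIDEO_REMOVAL for s in scores):
        return "remove_video_and_strike"
    if len(scores) >= PATTERN_MIN_VIDEOS and sum(scores) / len(scores) >= PATTERN_AVERAGE:
        return "escalate_channel_review"   # e.g. review for YPP suspension
    return "no_action"

print(channel_action([0.7, 0.65, 0.8, 0.7, 0.75]))  # escalate_channel_review
```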

What tools are available for Creators to protect themselves and shape the tone of conversations on their channel?

While the goal of our policies and systems is to minimize the burden placed on Creators to protect themselves from hate and harassment, we have also built tools to help them manage their experience, summarized below.


We provide Creators with moderation tools for comments so they can shape the tone of the conversation on their channels. We hold potentially inappropriate comments for review, so Creators can best decide what is appropriate for their audience. We also have other tools that empower Creators to block certain words in comments, block certain individuals from commenting, or assign moderation privileges to other people so they can more efficiently monitor comments on their channel.
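
The sketch below illustrates, in purely hypothetical terms, how creator-configured rules like blocked words, blocked users, and hold-for-review might be applied to an incoming comment. The field names and logic are invented for illustration and are not YouTube's API.

```python
# Illustrative sketch only: routing a comment through creator-configured
# moderation settings. Names and rules are hypothetical.
from dataclasses import dataclass, field
from typing import Set

@dataclass
class ModerationSettings:
    blocked_words: Set[str] = field(default_factory=set)
    blocked_users: Set[str] = field(default_factory=set)
    moderators: Set[str] = field(default_factory=set)  # people with review privileges

def route_comment(author: str, text: str, settings: ModerationSettings) -> str:
    if author in settings.blocked_users:
        return "hidden"
    if any(word in text.lower() for word in settings.blocked_words):
        return "held_for_review"  # the creator (or a moderator) decides
    return "published"

settings = ModerationSettings(blocked_words={"spamword"}, blocked_users={"troll42"})
print(route_comment("viewer1", "Great video!", settings))      # published
print(route_comment("viewer2", "buy spamword now", settings))  # held_for_review
print(route_comment("troll42", "hello", settings))             # hidden
```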

To encourage respectful conversations on YouTube, we also have a feature that will warn users if their comment might seem offensive to others, giving them the option to reflect and edit before posting.
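
As a purely illustrative example of this kind of pre-posting check, the following hypothetical sketch shows a drafted comment triggering a "reconsider before posting" prompt. The heuristic here stands in for a real classifier and is not how YouTube's feature is implemented.

```python
# Illustrative sketch only: warn-before-posting flow for potentially
# offensive comments. The offensiveness check is a toy placeholder.
def looks_offensive(text: str) -> bool:
    # Placeholder heuristic; a production system would use a trained model.
    return any(w in text.lower() for w in ("idiot", "stupid"))

def maybe_warn_and_post(comment: str, confirm: bool) -> str:
    """Return the action taken for a drafted comment."""
    if looks_offensive(comment) and not confirm:
        return "prompt_user_to_edit"  # the user can rephrase or choose to post anyway
    return "posted"

print(maybe_warn_and_post("You're an idiot", confirm=False))  # prompt_user_to_edit
print(maybe_warn_and_post("You're an idiot", confirm=True))   # posted
print(maybe_warn_and_post("Nice edit!", confirm=False))       # posted
```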

Finally, we have a list of resources to help Creators feel safe on YouTube. We know there is a lot more work to be done and we are committed to moving this work forward.

What is YouTube doing to support the Black community?

At YouTube, we believe Black lives matter and we all need to do more to dismantle systemic racism. We join in protest against the murders of George Floyd, Breonna Taylor, Ahmaud Arbery, and so many others before them.

The painful events of 2020 have reminded us of the importance of human connection and the need to continue to strengthen human rights around the world.

Our platform has been a place where people come together since YouTube first launched 15 years ago. And in the midst of uncertainty, Creators continue to share stories that might not otherwise be heard while also building online communities.

We have always been proud to be a platform that celebrates a broad and diverse set of voices, and over the years we have taken many steps to help protect diverse communities, including Black creators and artists, from hate and harassment across the platform. In 2019, we developed more stringent hate speech and harassment policies. Our updated hate speech policy specifically bans videos alleging that a group is superior based on qualities like race, gender, religion, or sexual orientation in order to justify discrimination, segregation, or exclusion.

As a result of these changes and our ongoing enforcement, in the last quarter of 2020 alone we removed over 175,000 videos and 182 million comments for hate and harassment.

But we recognize we need to do more, in particular with the Black community, and that is why we have committed to the following actions.

  • Building on our work over the past several years, we’re examining how our policies and products are working for everyone, and specifically for the Black community, and we will work to close any gaps. More broadly, we will work to ensure Black users, artists, and creators can share their stories and be protected from hateful, white supremacist, and bullying content.
  • During Black History Month, we celebrated Black stories, voices, and culture that have contributed to creativity and innovation on YouTube and throughout the world.
  • Through the #YouTubeBlack Voices Fund, we’re bringing important stories about Black experiences around the world to YouTube, emphasizing the intellectual power, dignity, and joy of Black voices, and educating viewers on racial justice.
  • In January 2021, we announced the #YouTubeBlack Voices Class of 2021 to amplify their voices and perspectives: 132 creators and artists from the United States, Kenya, the United Kingdom, Brazil, Australia, South Africa, and Nigeria.
  • In June 2020, we announced a multi-year $100 million fund dedicated to amplifying and developing the voices of Black creators and artists and their stories.
  • Through the month of June 2020, our Spotlight channel highlighted racial justice issues, including the latest perspectives from the Black community on YouTube alongside historical content, educational videos, and protest coverage. This content showcased incredibly important stories about the centuries-long fight for equity.
  • On June 13, 2020, Common and Keke Palmer hosted a conversation called “Bear Witness, Take Action” to create a virtual space that raised awareness of the urgent need for racial justice and supported the Equal Justice Initiative. It featured conversations with creators including Jouelzy and Ambers Closet; prominent activists, including Black Lives Matter co-founders Alicia Garza and Patrisse Cullors; Bryan Stevenson, founder of the Equal Justice Initiative; bestselling author Roxane Gay; journalist Soledad O’Brien; and powerful musical performances from John Legend, Trey Songz, Brittany Howard, and many more important voices.


There is much work to do to advance racial equity in the long-term, and these efforts will continue in the months and years ahead.
