YouTube Continues to Fight Extremist Videos
Google has announced new policies to fight offensive and inappropriate videos on YouTube and other sites. Most recently, the company is responding to videos that promote terrorism.
Videos that violate community guidelines will be removed immediately. In a blog post, the company also identified four new strategies:
- Increasing the use of technology to identify terrorism-related videos
- Expanding the network of people and organizations to flag videos
- Applying restrictions to "inflammatory religious or supremacist content"
- Expanding its "role in counter-radicalization efforts"
The third point is interesting. Because judging whether a video should be removed is difficult, Google will instead limit the reach of potentially damaging content by adding a warning and disabling comments, endorsements, and monetization (the videos cannot carry advertising). General Counsel Kent Walker writes, "That means these videos will have less engagement and be harder to find. We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints."
Discussion:
- Analyze Walker's blog post. Who is the audience, and what are his communication objectives? How would you describe the writing style? What organizational strategy does he use?
- How well do you think Google is balancing freedom of expression against potential harm and complaints from advertisers?
- What are the potential dangers of Google's new policy? What are the benefits?