YouTube today announced four new steps it will be taking to combat terrorist videos on its site. The Google-owned company also took the opportunity to discuss the tech industry’s role in not just identifying and removing content that violates policies, but in proactively eliminating any incentive to upload such content in the first place.
“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all,” Google’s general counsel, Kent Walker, explained. “Google and YouTube are committed to being part of the solution. We are working with government, law enforcement, and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.”
Because extremists and terrorists attack our security as well as our values, they stand opposed to exactly what makes our societies open and free. Tech companies, however, can help build lasting solutions to this complex challenge.
YouTube is playing its part by taking the following four steps:
- Increase the use of technology to help identify extremist and terrorism-related videos. YouTube has used video analysis models to find and assess more than 50 percent of the terrorism-related content it has removed over the past six months, but it will now devote more engineering resources to applying its most advanced machine learning research to train new “content classifiers.”
- Increase the number of independent experts in YouTube’s Trusted Flagger program by adding 50 expert NGOs on top of the existing 63 organizations. While many user flags can be inaccurate, YouTube says Trusted Flagger reports are accurate over 90 percent of the time.
- Take a tougher stance on videos that do not clearly violate YouTube’s policies. For example, videos that contain inflammatory religious or supremacist content will soon appear behind an interstitial warning and will not be monetized, recommended, or eligible for comments or user endorsements.
- Work with Jigsaw to implement the “Redirect Method” more broadly across Europe. This method uses targeted online advertising to reach potential ISIS recruits and redirect them toward anti-terrorist videos. Potential recruits have so far clicked through on the ads at an unusually high rate and watched over half a million minutes of video content that debunks terrorist recruiting messages.
Google notes it has also recently made a commitment to work with other tech companies, including Facebook, Microsoft, and Twitter. The goal is to establish an international forum to share and develop technology, support smaller companies, and accelerate the joint effort to tackle terrorism online.