A new report by the New Zealand Royal Commission claims that the terrorist who killed 51 people at two mosques in Christchurch was radicalized by YouTube. Officials have described the 2019 tragedy as an act of white nationalism, and the gunman, 28-year-old Brenton Tarrant, as an Islamophobic radical. But the new report, released Tuesday, says that the majority of the media that encouraged Tarrant to carry out these acts of horror came from the thoroughly mainstream YouTube.
Following the report, New Zealand Prime Minister Jacinda Ardern said that she plans to speak with YouTube leadership about the site’s role in the massacre. “What particularly stood out was the statement that the terrorist made that he was ‘not a frequent commentator on extreme right-wing sites and [that] YouTube was a significant source of information and inspiration,’” Ardern said. “This is a point I plan to make directly to the leadership of YouTube.”
In a section of the official report titled “The Terrorist,” the authors write that there is “no doubt” that the gunman “subscribed to right-wing channels on YouTube.” And while the report concedes that he viewed extreme right-wing content on more underground platforms like 4chan and 8chan, it affirms that he “spent much time accessing broadly similar material on YouTube.”
“His exposure to such content may have contributed to his actions on 15 March 2019,” the report ultimately asserts. “Indeed, it is plausible to conclude that it did.”
Extremism on Social Media
The findings of the New Zealand report are not the first allegations that YouTube has a fringe content problem. A 2018 article in Fortune Magazine accused the video-sharing platform, owned by Google parent company Alphabet, of using its algorithm to push viewers toward more radical and polarized content with each video.
Following the new report, however, YouTube released a brief statement that seemed to evade blame, while promising to work with Ardern and other world leaders to combat extremist content.
“While we have not yet reviewed the findings of the report,” read the statement, “YouTube remains committed to removing violent extremism and hate speech from our platform. We look forward to reviewing the report in detail and continue our work together with the Prime Minister, as well as governments, industry partners, and communities around the world to combat the spread of violent extremism online.”
The report also showed that the shooter frequented an extreme white supremacist Facebook group called “The Lads Society,” where he griped about Muslim immigration and threatened to harm local immigrants. Meanwhile, an alleged video of the terrorist attack surfaced on YouTube, Twitter, and Instagram in the days following the massacre. Each of those sites attempted to remove the video, but in the days that followed, new versions continued to spring up across social media.
The challenge of removing that video across multiple platforms shows that YouTube is not the only mainstream social media outlet that has struggled to bring radical content to heel. It argues for a moderation process that prevents such content from emerging in the first place.