In the scheme of things, it’s a minor point on an utterly devastating day for our cultural history.
But, as the ashes at Notre Dame start to settle, there should absolutely be a post-mortem into how YouTube can get things spectacularly wrong, yet again:
Several news outlets quickly started livestreaming the fire on YouTube. However, underneath several of them was a small gray panel titled “September 11 attacks,” which contained a snippet from an Encyclopedia Britannica article about 9/11. The feature is part of a larger rollout of tools and disclaimers to prevent users from consuming misinformation on the platform.
In response, YouTube said:
“These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.”
What always stands out to me when something big like this happens is that the people who spot YouTube acting improperly never seem to be YouTube itself.

The whole world was watching those streams, and yet nobody at YouTube deemed it necessary to check how its own site was performing.
At Facebook last week, the company told us how it is stepping up its detection and removal tools for harmful content, but in response to a question about human intervention, it confirmed that it doesn't have a team of humans proactively going out to find instances of abuse.
Instead, as we’ve seen today, it’s apparently journalists doing that job.
Linking 9/11 to the Notre Dame fire, when there is as yet no suggestion that it was a terrorist attack, is more than misinformation – it's borderline incitement. Here's how Christopher Wylie, the Cambridge Analytica whistleblower, put it:
So if you watch a live stream of Notre Dame burning on YouTube, a pop up tells you about the 9/11 terrorist attacks. We are creating an Internet of algorithmic dog whistles. https://t.co/xmaGSK2kgc
— Christopher Wylie 🏳️🌈 (@chrisinsilico) April 15, 2019