On Wednesday, Feb. 14, David Hogg survived the unthinkable: the devastating shooting at his school, Marjory Stoneman Douglas High in Parkland, Florida. He and his peers became outspoken gun control advocates overnight. Within a week, however, searching for the 17-year-old online told a different story: a top video in YouTube’s “Trending” tab falsely alleged that Hogg was a paid crisis actor. The video, created by a conspiracy theorist, racked up 200,000 views before it was pulled down.
YouTube’s Trending tab is curated by algorithms, not people. But an algorithm is only as good as its programming lets it be, and programmers are, after all, only human. Every algorithm reflects the specific, human intent of the people who built it. In the case of YouTube’s Trending tab, that intent is simply to help you find content by showing you what’s popular. When algorithms like these fail, they reveal something else besides intent: our own blind spots and biases.
On this week’s episode of IRL, we sit down with Luke Dormehl, author of Thinking Machines and The Formula, to explore the impact of algorithms, both on and offline. Staci Burns and James Bridle, author of “Something Is Wrong On The Internet,” investigate the human cost of gaming YouTube recommendations. Anthropologist Nick Seaver talks about the danger of automating the status quo. Researcher Safiya Noble looks at how to prevent racial bias from seeping into code. And Allegheny County’s Department of Children and Family Services shows us how transparent algorithms can help save lives.
Now, more than ever, it’s up to us to push for accountability. Because when bad code spreads disinformation and bias, it’s never something that “the algorithm did.” It’s something people did. That means we can still take charge of our algorithms. After all, they’re here to serve us, not the other way around. Listen to this week’s episode to learn more.
Original article written by Straith Schreder