As Twitter and Facebook grapple with hate speech, struggling over whether to ban white supremacists, Nazis, and racists, Apple has taken a clear stand: hate has no place on its platforms.
Written by Lee Schneider
Tim Cook was the recipient of the ADL’s first Courage Against Hate Award, and he used the opportunity to speak out about a growing threat: the proliferation of hate speech on digital platforms. Posted digitally, hate speech moves faster, infects more people, and leaps across geographic boundaries. The threat is real, and Cook is addressing it.
"At Apple, we believe that technology needs to have a clear point of view on this challenge," Cook said when accepting the award. "There is no time to get tied up in knots. That’s why we only have one message for those who seek to push hate, division and violence: You have no place on our platforms."
Compare those clear words with the equivocating statements and actions of Facebook, Twitter, and Reddit, which can’t seem to decide what to do about hate speech. Should they allow it in the name of free speech? Should they temporarily ban offenders in the hope that they will clean up their act? Temporary bans against Milo Yiannopoulos and Alex Jones haven’t reformed either of them. They are the worst kind of repeat offenders, continuing to pour out conspiracy theories and hate.
Apple was the first platform to ban Alex Jones, an action that Cook addressed: “As we showed this year, we won’t give a platform to violent conspiracy theorists on the App Store. Why … Because it’s the right thing to do. My friends, if we can’t be clear on moral questions like these, then we’ve got big problems. At Apple, we are not afraid to say that our values drive our curation decisions.”
Can there be too much free speech? My answer to that question has changed as the Internet has developed. Once upon a time, when the web was young, we could trust that most people would post with compassion. To be sure, there have always been hatemongers, racists, anti-Semitic trolls, and KKK disciples on digital platforms, but they were localized; their reach was limited. That reach expanded when amplified by the algorithms of Facebook, Twitter, and YouTube, and by upvoting on Reddit. Hate gains velocity online faster than it can be fact-checked or policed.
Facebook has tried to claim it is merely a distribution platform, not responsible for content. That position has become unsustainable. Users believe what they see online and take popular posts at the top of their feeds as truth. Powerful distribution platforms are responsible for the health of the populations they serve, and that responsibility includes curating their content. It means human editors, not just algorithms, overseeing what we all see.
Tim Cook has it right: values should drive curation decisions. And the ADL had it right to give him its first Courage Against Hate Award.