
ADL CEO Jonathan Greenblatt dives into tech’s reckoning with online hate


On January 6, America watched in horror as groups that recruited and organized on major social media sites violently attacked the seat of American democracy. Within a matter of hours, tech companies took actions they'd said were out of the question for years.

But real change requires thoughtful policy and a clear-eyed look at the choices that allowed dangerous extremism to thrive in the first place. We spoke with Anti-Defamation League CEO Jonathan Greenblatt on proposed policy solutions and tech’s coming era of accountability at TechCrunch Sessions: Justice 2021.


On how the ADL ramped up its efforts in Silicon Valley:

Given the rise of online hate, harassment and dangerous misinformation, tech companies are increasingly on the radar of civil rights organizations. It's now common to see organizations like the ADL participate in pressure campaigns aimed at changing platforms' policies and back legislation proposing regulations for the industry.

“So at the ADL, we’re the oldest anti-hate organization in the world. But we deeply believe that today, the frontline in fighting hate is really on Facebook. I mean, there’s just no question that social media has become a breeding ground for the kind of bigotry that is offensive and ugly in all respects. Now, we’ve known this for years. But when I came on board, about five and a half years ago, I really wanted to focus on this and try to get causal, and right to the heart of the problem, so we could finally turn it around. So in 2017, we actually opened an office in Silicon Valley, our Center for Technology and Society. We were the first civil rights group with an actual presence in the Valley. And for me, that was sort of second nature, because I had worked in the Valley for years before taking this job, you know, both raising money on Sand Hill Road and managing teams of engineers building products.” (Timestamp: 0:53)


How algorithms make social media uniquely dangerous:

Algorithms are what set social networks apart from more traditional media sources. Rather than having to seek out extreme ideas, the average internet user has them served directly by algorithms that decide what they see. This is a particular problem with the way Facebook and YouTube keep users engaged for as long as possible.

Algorithmic amplification has a lot to do with the dilemma that we found ourselves in, and extremists are, if nothing else, innovative; they exploit loopholes. And indeed, they have used the kind of libertarian, laissez-faire attitude of the companies to their own advantage for a number of years. And so from Facebook groups to YouTube channels to accounts on Twitter, let alone all the other platforms, they’ve used them with tremendous depth. So what’s interesting is, and many people that I know have seen this, I’ve seen this myself, you may have too, I’m sure your audience has. It wasn’t too long ago that you might watch a YouTube video and, one click or two clicks over, suddenly find yourself down the rabbit hole of some crazy QAnon or anti-vaxxer or, you know, Boogaloo content. Same thing on Facebook.

When you search a piece of content, suddenly you’re served up Facebook groups that may be from accelerationists, or white supremacists, or other racists and antisemites. But the reality that we’ve got to confront is that algorithms aren’t a right, if you will; algorithmic amplification isn’t a privilege which should be accorded to everyone. It’s a responsibility that the companies have to make sure that their products give users what they want, but that they’re also not abused, and that the users themselves are not abused, not exposed to the kind of things to which they might be very vulnerable or susceptible. So we deeply believe that algorithmic amplification is very problematic. That’s why we’ve been supporting legislation on Capitol Hill that will finally address this… If you could basically turn off the algorithms for some of these worst elements, you could have curbed these issues a long time ago. (Timestamp: 13:35)


How social media companies failed before the Capitol attack:

In the immediate aftermath of the attack on the Capitol, social media companies suddenly made a number of changes that belied their longstanding reluctance to address the hate and extremism brewing on their platforms.

To those of us who’ve been tracking violent extremists for years, this was not a surprise at all; this was the most predictable terror attack in American history. Literally, these groups told us in advance what they were going to do. And the attack itself was sort of the culmination of years, and in the months prior, intense campaigning by the President himself, to undermine the integrity of the election, to question the democratic process, to call on individuals to interrupt the certification of the election based on this big lie, this totally contrived idea that somehow the election was rigged. I mean, truly, it was bananas.

… The tech companies, who for years had told us there was a political exemption and they wouldn’t necessarily take action when presidents or other politicians said things that were outrageous, committed slander or incited violence on the platform, suddenly, because of the public pressure from groups like Stop Hate for Profit and the ADL, from internal pressure from their own employees, and, I believe, their boards, suddenly they took action instantaneously, overnight. All their other concerns sort of fell by the wayside. I think it was really important that Facebook and Twitter and YouTube took down President Trump. That was critical; we called for them to do that. And I’m really pleased that they did. We called for them previously to take down armed militia groups, to take down QAnon content. And I’m really glad that they did, and it had a huge impact.

You know, we’ve seen QAnon content on Twitter drop 97%, you know, just days after the attack, because the company actually took action. So I think it really laid bare the myth that somehow, some way, the companies couldn’t do anything about this. Clearly they could. And they did. And I think their services, and society as a whole, are better for it. (Timestamp: 7:53)


On Silicon Valley exceptionalism:

The tech industry doesn’t think of itself like other traditional sectors of business, instead often casting its grand pursuits as being for the greater good, not just for profit. Those attitudes can contribute to some extraordinary innovations, but they also permeate its products and culture in ways that create serious problems.

I think Silicon Valley is almost, like, rooted in this American tradition of, like, Manifest Destiny, right? Conquering the frontier. It’s ironic, but altogether appropriate, that it’s happening in California, right, in the land of the Gold Rush, right, again, where people went to make their fortunes. And now they’re doing it today in Silicon Valley, in tech, and even that’s continued to evolve, right? It was the internet 15 years ago, social media five years ago. Today, it’s Clubhouse, and I don’t know what comes next. But I do think that the whole industry does need to undergo a serious self-examination.

And I think you’ve seen people who’ve come out of the industry, I think about Chris Sacca, the former Googler, I think about Alexis Ohanian, the Redditor, and a few others, start to grapple with these issues. You know, my friend Tristan Harris at the Center for Humane Technology has also done this; his film, The Social Dilemma, really plays this out. Whereas Silicon Valley often has a very short memory, the reality is that there will be a long road ahead of us. And if we don’t wrestle with these demons, and if we don’t sort, again, through the wreckage of what they’ve wrought, I think the future is very unclear. (Timestamp: 20:15)


On policy solutions to rein in big tech:

There’s a huge swath of policy proposals on the table that could put some real restrictions on how tech companies operate. From proposed changes to Section 230 of the Communications Decency Act to federal and state antitrust suits, tech companies are on notice in 2021.

So look, the ADL, I mean, we’ve been literally fighting for a more just country, we’ve been fighting for civil rights, we’ve been fighting hate for over 100 years. And we are fierce, ferocious defenders of the First Amendment. But freedom of speech isn’t the freedom to slander people, right? Freedom of expression isn’t the freedom to incite violence against individuals or groups of people based on their immutable characteristics. And so I think what we’ve seen is the First Amendment being warped and weaponized online in ways that are, you know, completely beyond the pale of what the founding fathers ever would have, you know, could have imagined.

Section 230 does need to be addressed. And I think that the Warner-Hirono bill that you pointed out is a step in the right direction; it is definitely not sufficient… It might actually not be the federal government but the states that actually push the companies to do more. We’ve seen California do some innovative stuff on privacy that’s pushed the companies, and you may see, I think, more state action. (Timestamp: 16:28)

You can read the entire transcript here and review the full lineup from Justice 2021 [here].



 
