Content moderation team cuts at X, formerly known as Twitter: 5 Things podcast


SPECIAL | X, formerly known as Twitter: Are the guardrails gone for good? 

X, formerly known as Twitter, has lost most of the guardrails it once had. Massive employee cuts, in particular to content moderation teams, more divisive content, the removal of state-affiliated media labels, and a blind allegiance to free speech by Elon Musk have made the platform much more susceptible to misinformation and disinformation. COVID, Russia's invasion of Ukraine and the 2024 election are all vulnerable topics. Domestic Security Correspondent Josh Meyer has been covering this story for USA TODAY and joins us to share his insights.


Hit play on the player above to hear the podcast and follow along with the transcript below. This transcript was automatically generated, and then edited for clarity in its current form. There may be some differences between the audio and the text.

Dana Taylor:

Hello, and welcome to Five Things. I'm Dana Taylor. Today is Wednesday, October 4th, 2023, and this is a special episode of Five Things. X, formerly known as Twitter, has long since lost most of the guardrails it once had. Massive employee cuts, in particular to content moderation teams, more divisive content, the removal of state-affiliated media labels, and a blind allegiance to free speech by Elon Musk have made the platform much more vulnerable to misinformation and disinformation. COVID, Russia's invasion of Ukraine and the 2024 election are all vulnerable topics. What's the solution? Domestic security correspondent Josh Meyer has been covering this topic for USA TODAY, and he now joins us to share his insights. Josh, thanks for hopping on the podcast.

Josh Meyer:

My pleasure, Dana. Thanks.

Dana Taylor:

Let's start with some recent news. Musk is now saying he's going to start charging all users a small subscription fee to use the site. This, he claims, will restore advertisers' trust by rooting out bots. This strategy could also backfire and accelerate users' migration to competitors. But if Musk does go this route, what kind of impact would it have on misinformation, disinformation, and manipulation by foreign adversaries?

Josh Meyer:

Well, I think that's a good question. I think nobody really knows the specific answers to it. We'll have to see how it plays out, but I do think it's going to prompt a mass migration from Twitter, or X, to other platforms. I think a lot of people are very wary of being charged for something that they see as a public square. I do think that it will open up some space for the dedicated state-run disinformation networks, whether it's Russia or China or Iran, to really move into that space. It's the same thing with the verification check marks: if you don't need to provide really any information and you can just pay your way into verification, that opens the door for a lot of people with nefarious intentions to try to do that. So we'll have to see, but I think there are a lot of analysts who are very concerned about this.

Dana Taylor:

Pivoting to the 2024 US presidential election, there are quite a few nefarious forces out there, including both state and non-state actors, who are chipping away at Americans' confidence in election integrity and would like nothing more than to see US democracy fail. Elon Musk also recently announced he was cutting X's global election integrity team in half. Is it looking worse than 2020? And if so, how?

Josh Meyer:

For the story that I wrote, I talked to a lot of experts, and I do think there was a tremendous amount of concern that this could be the worst one ever. Hopefully that won't be the case, but we have a lot of state-run actors now. We've got China, Iran, and, of course, Russia looking to meddle in the election. You've got a lot of right-wing extremist groups doing it. Some of the information security specialists that I talked to said you even have kids in their parents' basement who could manipulate things.

Dana Taylor:

Over the summer, X got a new CEO, Linda Yaccarino, who came from NBC News. Her recent appearance at Vox's Code Conference caused a media storm. Why was that?

Josh Meyer:

If you look at the video of this, it was a real train wreck for her. There are some people who say she was sandbagged by the fact that they had brought one of X's biggest critics, former trust and safety chief Yoel Roth, on before her. But she's the CEO of a big company. She should have been ready for this, and she just seemed to be really unprepared. Roth said that the company under Musk was a lot worse than he ever imagined when he left. She was unprepared for a lot of the questions. She even tried to find the app on her phone and wasn't able to, to the point where the audience was laughing at her. They just thought that she was so ill-prepared, didn't know what she was doing, and that Musk himself was basically running the show and keeping her out of the loop on some very critical things, including potentially charging people, which will not go over well.

Dana Taylor:

As we mentioned at the top, after Elon Musk bought Twitter, now known as X, he quickly slashed more than 80% of its staff, including forcing out top executives. Some of these layoffs included entire teams that were designed to counter election disinformation and screen for hate content. New AI tools have since been introduced to help do that work, but the sheer volume of misinformation is staggering. Is X capable of stopping foreign interference in 2024?

Josh Meyer:

It's a massive undertaking, and I think Elon Musk has said himself publicly that it's pretty much impossible to do on a platform like Twitter. I know that he has made some statements that he's trying to restore this capacity, but they did gut, as you noted, Dana, 80% of the staff, including whole teams dedicated to this. So you can't really build that kind of capacity back very quickly or very easily, especially at a time when, given the polarization of the country and all the heated rhetoric going back and forth, there already is a lot of disinformation, misinformation and false claims out there. So it's going to be a very tough task for him, and I think the preponderance of social media analysts and experts that I've talked to said that he's not up to the task, and that even if he does succeed in doing some of it, it won't be enough, and it won't be fast enough for the election.

Dana Taylor:

Now let's turn to Russia's manipulation of the X platform with regard to its invasion of Ukraine. The EU passed a very strict law last year, the Digital Services Act, to help curb hate speech, propaganda and other harmful content on social media. But Russian propaganda about Ukraine is now reaching more people than before the war started. That's according to a study released in September by the European Commission, the governing body of the European Union. How has Russian disinformation impacted public perceptions both here and abroad, and what steps should X be taking to comply with the new law?

Josh Meyer:

Well, I think Musk is saying that they're going to comply with the law. He hasn't said specifically what he's going to do, or taken steps to do it, as far as I can tell. But Russian disinformation on the platform is a very serious issue, and it has been for a long time. They have dedicated troll farms of people who are paid much more than other workers in Russia. So it's a very lucrative job to sit there, create these accounts and try to spread misinformation online. I think that has continued, and I think it's something to watch out for. I do think that there are a lot of Americans who are getting some misinformation about the war in Ukraine this way, and the European Commission has been watching this a lot more closely than any similar kind of commission in the United States. So I would tend to follow their lead on this.

I think that there are a lot of people, including in the United States, who say that the United States needs to have some sort of mechanism like this to help watch out for disinformation as aggressively as the Europeans are. So far we haven't really seen anything of that magnitude here, but it's a serious problem.

Dana Taylor:

COVID and election misinformation are rife on the platform. The latest viral conspiracy theory suggests that the Biden administration is going to re-implement some COVID-era policies that will allow it to manipulate the 2024 election like it did in 2020. How dangerous are these theories to Americans' trust in democracy and government?

Josh Meyer:

I think they're extremely dangerous. They're not just dangerous to Americans' trust in democracy and government, they're dangerous outright. We have people talking about a civil war brewing in the United States, and a lot of very heated rhetoric, especially anti-Biden rhetoric. I think that this plays right into that, especially if we have a resurgence of COVID like we're starting to see a little bit in some parts of the country. And if you have people buying into this, you're going to have another January 6th-type atmosphere in the country, where there are large numbers of people willing to march or even take up arms to fight what they wrongly think is some sort of government plot against them.

And this is a very serious one that plays on people's fears and insecurities: that if Biden does try to promote mail-in ballots and things like that, it's somehow connected to a plot, when actually what they're trying to do is ensure that more people vote. This has been a common theme for the last two election cycles, and I think that this time it could be much worse than the last time, which is saying a lot.

Dana Taylor:

Elon Musk has suggested that state media entities do not need to be identified as such on X because, "All news is to some degree propaganda." Is this type of justification a dog whistle to propagandists?

Josh Meyer:

Labeling NPR and PBS and things like that as propaganda is, quite frankly, fairly outrageous. And I think it shows Musk's bias against the media, including the many media outlets that are trying to get it right and trying to call out his disinformation on the platform. And as some of the experts, including former Twitter officials, told me, one of the biggest disinformation problems on the platform is Musk himself. He is retweeting a lot of this false information. He's allowing back on the platform some of the worst right-wing extremists and demagogues and other people who are pushing disinformation. I think that the problems we're seeing now are only going to get worse as the election cycle heats up, and nobody's really sure what's going to happen or what they can do about it. A lot of it relies on Musk himself. He has a tremendous amount of power and authority and influence for just one person, and that worries a lot of people.

Dana Taylor:

You spoke to several former Twitter employees about their concerns. What were their top comments?

Josh Meyer:

These are people, some of whom have said before that they didn't think Twitter needed to be regulated, or that it could handle this itself. But I think there's concern, especially with Musk being as unpredictable, volatile and opinionated as he is, that there need to be some guardrails in place. Look at the whole controversy over antisemitism on Twitter now, where even the Prime Minister of Israel has asked him to weigh in and cut down on it. Musk has bent so far over backwards to promote free speech that he's not doing what needs to be done on the platform, the kind of thing that would be done at any newspaper or other media outlet to rein in this kind of disinformation and hate speech. You can't just put something in a newspaper or on a TV network that's blatantly racist or potentially violent, but Twitter seems to be kind of the Wild West these days, where it's happening all the time. So nobody's really sure what to do except to get Musk to do more to stop it.

Dana Taylor:

Josh, thank you so much for joining us.

Josh Meyer:

My pleasure, always.

Dana Taylor:

Thanks to our senior producer Shannon Rae Green for her production assistance. Our executive producer is Laura Beatty. Let us know what you think of this episode by sending a note to [email protected]. Thanks for listening. I'm Dana Taylor. Taylor Wilson will be back tomorrow morning with another episode of Five Things.
