
What social media platforms are doing to stop misinformation about Russia’s invasion of Ukraine


As Russia’s war in Ukraine continues, the information war is picking up online.

Fake news, photoshopped posts, manipulated media, and all sorts of propaganda and misinformation are being disseminated both by bad actors and by those being duped by them. So, what are the big tech companies doing to help stop bad information from spreading?

Mashable reached out to several major social media platforms in order to get a comprehensive look at what exactly is being done to stop misinformation as Russian forces continue to advance in Ukraine.

Meta

Facebook and Instagram are certainly no strangers to disinformation campaigns stemming from Russia. So has Mark Zuckerberg learned from the attempts to sway elections? What is Meta doing this time around?

On Facebook and Instagram, some major steps have been taken to clamp down on falsehoods being spread. Meta has blocked Russian state-run media, such as Russia Today and Sputnik, in the EU and in Ukraine. The company has also cut its revenue share with these outlets so they can’t monetize their content in areas where they haven’t yet been banned. In addition, Meta will continue to label state-run media as it previously did, turning down Russia’s request to stop fact-checking and labeling its content.

Twitter

The social microblogging platform also amped up its policies on labeling Russian state-run media. Previously, Twitter labeled accounts belonging to outlets such as Russia Today as “Russian state-run media.” However, as of Monday, the platform began adding warning labels to all tweets linking to Russian state-run media as well.

“This Tweet links to a Russia state-run affiliated media website,” the label reads. Twitter says it will reduce the reach of these tweets, too.

Along with those changes, a number of anchors, columnists, and others employed by Russian state-run media outlets have begun reporting that their own personal accounts promoting their work have been affixed with the Twitter warning as well.

On Sunday, Twitter announced that it had suspended more than a dozen accounts for violating its platform manipulation and spam policy. Violating this policy usually entails the use of fake accounts in order to spread content and “artificially inflate” engagement. 

“Our investigation is ongoing; however, our initial findings indicate that the accounts and links originated in Russia and were attempting to disrupt the public conversation around the ongoing conflict in Ukraine,” said Twitter in a public statement.

According to NBC News, these accounts were sharing links from a new propaganda outlet called Ukraine Today.

Before Russian troops even entered the country, Twitter had already suspended advertising in Ukraine and Russia so that ads wouldn’t crowd out crucial information in users’ feeds. Twitter also paused tweet recommendations from accounts users did not already follow. The company says this action was taken to “reduce the spread of abusive content.”

“Twitter’s top priority is keeping people safe, and we have longstanding efforts to improve the safety of our service,” said a Twitter spokesperson. “We remain vigilant and will continue to closely monitor the situation on the ground.”

The company also seems to have made it clear that at least some of its existing policies won’t be paused due to the conflict. When the Ukrainian National Guard tweeted an Islamophobic video of a Neo-Nazi battalion embedded in the country, Twitter hid the clip behind a warning label as per its hate speech policies.

YouTube

Russian state-run media is a powerhouse on YouTube. Russia Today, specifically, has found success on the platform over the years. RT’s main channel has more than 4.5 million subscribers. RT boasts that it has received more than 10 billion views across all of its YouTube channels.

With numbers like that, YouTube monetization could result in a pretty lucrative revenue stream. That is, until this weekend, when YouTube demonetized RT and all Russian state-run media.

“In light of extraordinary circumstances in Ukraine, we’re taking a number of actions,” read a public statement from YouTube. “We’re pausing a number of channels’ ability to monetize on YouTube, including several Russian channels affiliated with recent sanctions.” 

The statement goes on to say that YouTube will also be “significantly limiting” the platform’s recommendations to content on these channels. In addition to revoking their monetization, YouTube is “restricting access” to RT and other channels for users in Ukraine.

YouTube also shared that it had removed a number of low-subscriber channels that were part of a “Russian influence operation.”

Snapchat

Due to the nature of how Snapchat works — mainly private feeds and ephemeral content — the social messaging app has rather successfully avoided becoming a hub for misinformation and other problematic content. Even so, Snap states that it will remove any misinformation it comes across on its platform regarding Ukraine.

“The app has actually been designed to make it hard for misinformation to spread,” said a Snap spokesperson in a statement. “We limit the size of group chats and snaps disappear. Unlike traditional social platforms, we don’t feature an open, unvetted newsfeed and the content on the public parts of the app — Discover and Spotlight — only host pre-moderated content. If we find misinformation, we remove it immediately.”

In a post published on Tuesday, Snapchat shared further details regarding actions the company has taken. According to Snapchat, it has “stopped all advertising running in Russia, Belarus, and Ukraine” and has stopped advertising sales to companies in Russia and Belarus. The company also pointed out how it has never allowed Russian state-run media to distribute content on its platform.

TikTok

TikTok has long outgrown online challenges and viral dance crazes. Current events, however, may be the greatest measure of just how much the shortform video app has expanded beyond the teen content it was originally known for.

The war in Ukraine has seen TikTok used as a platform for the latest news as well as for updates from people on the ground about what’s happening. Unfortunately, though, the young platform has also become a major outlet for misinformation and propaganda.

Videos purportedly from Ukraine have spread on the platform, often turning out to portray conflicts from years earlier and in completely different parts of the world. Scams have also descended on TikTok livestreams, with scammers raising money via fake live videos that make it appear as if they are Ukrainians sharing their wartime experiences.

TikTok has also just announced a new feature that has some critics scratching their heads at the timing: the shortform video platform will now support video uploads of up to 10 minutes long. As Media Matters points out, the platform was already struggling to handle misinformation before Russia invaded Ukraine, back when it was dealing with three-minute videos.

The company, for its part, has said it has taken action against users acting in bad faith and will remove content breaking TikTok’s rules regarding the spread of misinformation.

“We continue to closely monitor the situation, with increased resources to respond to emerging trends and remove violative content, including harmful misinformation and promotion of violence,” said a TikTok spokesperson in a statement provided to Mashable. “We also partner with independent fact-checking organizations to further aid our efforts to help TikTok remain a safe and authentic place.”

On Tuesday, TikTok began restricting access to RT, Sputnik, and other Russian state-run media accounts in the EU.

TikTok has also partnered with organizations like MediaWise and the National Association of Media Literacy Education in order to help educate its users on digital media literacy.

LinkedIn

While most probably think of LinkedIn as a business networking tool, the social network has had its fair share of fake news and misinformation spread throughout the platform.

The Microsoft-owned platform says its “safety teams are closely monitoring conversations on the platform” and its global editorial team is making sure news and updates are coming from trusted sources. LinkedIn will take action on any content that does not abide by its Professional Community Policies, which prohibit misinformation, false content, and manipulated media.

Reddit

When it comes to taking action against propaganda being aired on Russian state-run media, no other social media platform has gone further than Reddit.

On March 3, the social sharing website announced a universal ban on all links to RT, Sputnik, and any other Russian state-run media outlets. The restriction affects every subreddit community and is not based on a user’s geographic location.

In addition, Reddit is rejecting advertisements that target Russia or come from any Russian entity.

The company also “quarantined” r/Russia and r/RussiaPolitics in order to curb the spread of misinformation that was running rampant in those subreddits. Both communities no longer appear in search results, recommendations, or feeds. Users must also agree to a content warning prompt before entering either subreddit.

Twitch

Twitch streamers are chatting about the Russian war in Ukraine, with some pulling in hundreds of thousands of concurrent viewers. Undoubtedly, issues are going to arise. While Twitch did not specifically mention Ukraine or Russia in its new misinformation policy, which rolled out March 3, the timing is conspicuous.

According to Twitch, it will begin taking action against “Harmful Misinformation Actors.”

“Our goal is to prohibit individuals whose online presence is dedicated to spreading harmful, false information from using Twitch,” reads Twitch’s statement about the new policy. “We will not enforce against one-off statements containing misinformation.”

Basically, any Twitch user who dedicates their streams to regularly sharing dangerous misinformation, such as violence-promoting conspiracy theories, will run afoul of Twitch’s rules.

Mashable will continue to update this post as policies change.

UPDATE: Mar. 2, 2022, 12:00 p.m. EST Added additional information regarding TikTok blocking Russian state-run media in the EU and Snapchat’s new post on Ukraine.

UPDATE: Mar. 3, 2022, 4:30 p.m. EST Added sections on Reddit and Twitch.
