Social media users lose trust in platforms when they encounter censorship: study

Senate GOP whip on big tech censorship

South Dakota Sen. John Thune joined ‘Maria Bartiromo’s Wall Street’ to discuss social media platforms censoring and suppressing speech.

A recent study found that the majority of social media users trust an app less if they encounter censorship, and nearly a third of people who have encountered blocked or labeled content have reduced the amount of time they spend on the platform. 

"Using true/false as a test for blocking or labeling content inevitably leads to perception of bias and breaks the trust of users. The survey shows that this leads to reduced time spent and engagement by the 49% of users who have experienced content being blocked or labeled," Mike Matthys, founder of The First and Fourteenth Institute, told Fox News Digital. 

The First and Fourteenth Institute, a non-partisan organization that advocates for free speech, free press, and due process, conducted the national research between Sept. 27 and Oct. 4 by phone and online among 1,100 people who have at least one social media account. 

The research found that 49% of people with social media accounts have seen blocked or labeled content, with 53% saying such censorship causes them to trust the app less. For Facebook users specifically, 58% of those surveyed said they trust the platform less after encountering blocked or labeled content. 


The Facebook logo displayed on the screen of an iPhone in Paris, France. (Photo illustration by Chesnot/Getty Images)

Nearly a third of those polled said that such censorship makes them spend less time on a platform or makes them less likely to share content on the site. Among Facebook users, 27% said they spend less time on the site after encountering blocked or labeled content, compared with 30% of users nationwide. A majority of users across platforms, including Instagram, Facebook, YouTube, TikTok and others, report that encountering censorship makes no difference in how much time they spend on the sites. 

The Instagram app icon on the screen of a mobile device in New York. (AP Photo/Jenny Kane, File)

"If content moderation policies are causing a material portion of users to reduce engagement online, this may explain a portion of why some of the social media companies are seeing their growth slow or even reverse," Matthys said. 

The study also found that social media users would accept content moderation protocols in which tech giants apply a "harmful/not harmful" test rather than relying entirely on a "true/false" test. The majority of those polled, six out of 10 users, said they would accept content moderation rules that included the "harmful/not harmful" test.

Social media platforms have repeatedly been slammed for censoring content in recent years. Twitter and Facebook, for instance, banned former President Donald Trump from their platforms in 2021, Twitter blocked the New York Post's 2020 story on Hunter Biden's notorious laptop, and Facebook was criticized for cracking down on alleged COVID-19 misinformation during the pandemic. 


The study found that Republicans and conservatives are not the only ones who trust a platform less because of censorship. Americans who consider themselves independents and moderates also reported trusting platforms less after encountering blocked or labeled content. Fifty-one percent of independents said they trust a platform less after seeing such labels, and 48% of moderates reported the same. 

The TikTok app logo on the App Store, with the TikTok logo displayed in the background, in an illustration photo taken in Krakow, Poland. (Photo illustration by Jakub Porzycki/NurPhoto via Getty Images)

Among left-leaning social media users, 37% of Democrats say they trust a platform less after encountering blocked or labeled content, while 43% of liberals reported the same. 

"Approximately 12% of all Facebook users reported that they have reduced time and engagement online because of their experience with content moderation that reduced their trust in the platform," Matthys noted. 

The First and Fourteenth Institute crafted a proposal to create a third-party content moderation entity, which "is designed to help the social media platforms rebuild trust with their users," according to Matthys. 


The proposal would "establish online video-calls, an online video courtroom, with a live arbitrator to resolve online content disputes quickly and transparently," according to the study, as opposed to online written messages between a social media user and a platform. 

Researchers detailed the proposal to those surveyed and found 66% of respondents supported such a plan. The study found 71% of Democrats and 65% of Republicans either somewhat or strongly supported the proposal. 


"While other studies show that users believe the monopoly social media platforms have excessive unchecked power to moderate content as they like, this study shows that additionally such content moderation, which is personally experienced by 49% of all social media users, is actually harming the business of these social media companies by reducing users’ trust in the platform and reducing the amount of time spent and information shared on the social media platforms by these users," the study concluded.
