Here’s how TikTok is cracking down on misinformation and paid political content ahead of the US midterm elections

TikTok plans to crack down on misinformation in preparation for the US midterm elections in November.

The popular social media platform said on Wednesday that it has partnered with fact-checking organizations to help flag inaccurate information, and that it will ban paid political content posted by influencers.

The push comes after misinformation flooded social media during the 2020 presidential election and helped fuel the riot at the US Capitol. Efforts by social media companies to police content posted on their services led to conservatives claiming censorship and bias against them.

To enforce its policies, TikTok will use both people and technology to review content and accounts for misinformation. The service said it has partnered with independent intelligence firms and works with civil society organizations to respond to “emerging threats,” though it did not provide further details. TikTok added that it has partnered with fact-checking organizations that will help determine the accuracy of posted content.

“And while they do not moderate content on our platform,” head of US safety Eric Han said about fact checkers in a statement, “their assessments provide valuable input which helps us take the appropriate action in line with our policies.”

In what it calls an “abundance of caution,” content that is in the process of being fact-checked will not be added to the app’s “For You” page, the app’s main content feed that is personalized for individual users. Additionally, if information is considered inaccurate, it will be labeled as unverified content and users will be asked if they’re sure before sharing the post.

TikTok also said it will do more to warn users of their responsibility to follow its guidelines on paid political content, acknowledging that such content was a challenge during the 2020 elections.

“Over the next few weeks we’ll publish a series of educational content on our Creator Portal and TikTok, and host briefings with creators and agencies so the rules of the road are abundantly clear when it comes to paid content around elections,” Han said in the statement.

“If we discover political content was paid for and not properly disclosed, it is promptly removed from the platform,” he added.

TikTok banned political ads in 2019, and that ban remains in place.

Finally, TikTok said it would introduce an “Elections Center,” which will connect users who engage with election content to official and reputable information. Those sources include the National Association of Secretaries of State, Ballotpedia, and the Campus Vote Project. Election results reported by the Associated Press will also be available on the app. Additionally, to make the Elections Center more visible, TikTok will label content that it identifies as being related to the midterm elections.

“We are committed to promoting digital literacy skills and education,” Han said. “And our in-app center will feature videos that encourage our community to think critically about content they see online, as well as information about voting in the election.”

Twitter and Facebook-parent Meta have also said they would ramp up their efforts to police content in the run-up to the midterm elections. Meta said it would restrict ads about social issues, elections, or politics in the US during the week leading up to Election Day. And Twitter said it would enforce its policy of labeling misleading information and launch an “Explore” tab with national news and resources dedicated to US elections to provide more accurate information.
