
Welcome to TikTok’s endless cycle of censorship and error


Making those videos was not necessarily a surprise to Tyler. He makes videos like this because they work. For years, getting views has been one of the most effective strategies for pushing a large platform to fix something. TikTok, Twitter, and Facebook have all made it easier for users to report abuse and rule violations by other users. But when the companies themselves appear to be breaking their own policies, people often find that the best route forward is to try to post about it on the platform itself, in the hope of going viral and attracting the kind of attention that leads to some sort of fix. Tyler’s two videos about the Marketplace bios, for example, have more than a million views.

“Your content is being flagged because you’re someone from a marginalized group who is talking about your experiences with racism. Hate speech and talking about hate speech can look very similar to an algorithm.”

Casey Fiesler, University of Colorado, Boulder

“I probably have something flagged at least once a week,” says Casey Fiesler, an assistant professor at the University of Colorado, Boulder, who studies technology ethics and online communities. She is active on TikTok, with more than 50,000 followers, and while not everything she sees strikes her as a legitimate concern, she says the app’s regular parade of issues is real. TikTok has had several such flaws in recent months, all of which have disproportionately affected marginalized groups on the platform.

MIT Technology Review asked TikTok about each of these recent examples, and the responses were similar: after investigating, TikTok found that the problem was an error, stressed that the blocked content in question did not violate its policies, and pointed to the support the company gives such groups.

The question is whether this cycle (some technical or policy error, followed by a viral response and an apology) can ever change.

Resolve issues before they arise

“There are two kinds of harm from this sort of algorithmic content moderation that people are observing,” says Fiesler. “One is false negatives. People ask, ‘Why is there so much hate speech on this platform, and why isn’t it being taken down?’ ”

The other is false positives. “Your content gets flagged because you’re someone from a marginalized group who is talking about your experiences with racism,” she says. “Hate speech and talking about hate speech can look very similar to an algorithm.”

Both of these categories, she noted, hurt the same people: those who are disproportionately targeted with abuse end up being algorithmically censored for speaking up about it.
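
To make the two failure modes concrete, here is a minimal, hypothetical sketch of a naive keyword filter. This is illustrative only: TikTok’s actual moderation models are not public, and every term and post below is invented for the example.

```python
# Hypothetical illustration of a naive keyword filter -- not TikTok's
# actual moderation system, which is not public.

BLOCKED_TERMS = {"slur1", "slur2"}  # placeholder stand-ins for real slurs


def is_flagged(post: str) -> bool:
    """Flag a post if it contains any blocked term, ignoring context."""
    words = post.lower().split()
    return any(term in words for term in BLOCKED_TERMS)


# False positive: a creator describing abuse they received quotes the slur,
# so talking ABOUT hate gets flagged as if it were hate speech.
victim_post = "someone called me slur1 today and it needs to stop"
print(is_flagged(victim_post))  # True: the marginalized creator is censored

# False negative: actual abuse that dodges the exact keyword
# (misspellings, coded language) passes through untouched.
hateful_post = "you are a s1ur1 and people like you do not belong here"
print(is_flagged(hateful_post))  # False: the abuse stays up
```

A context-blind match like this is the simplest version of Fiesler’s point: to such a filter, the victim’s post and the abuser’s post differ only in which strings they happen to contain.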

TikTok’s mysterious recommendation algorithms are part of its success, but its unclear and constantly shifting boundaries are already having a chilling effect on some users. Fiesler notes that many TikTok creators self-censor words on the platform to avoid triggering a review. And while she isn’t sure exactly how much that tactic accomplishes, Fiesler has started doing it herself, just in case. Account bans, algorithmic mysteries, and strange moderation decisions are a constant part of the conversation on the app.
