Meta announced in a blog post update yesterday that it is bringing Instagram's obscure fact-checking settings to Threads, letting users control how much fact-checked content appears in their feeds. According to Meta, fact-checking is aimed at combating misinformation, and the setting effectively lets users decide how much content on controversial topics they want to see on the platform.
There are three levels of control: "Don't reduce," "Reduce," and "Reduce more." None of the options hides content entirely; instead, they affect the ranking of posts "found to contain false or partly false information, altered content, or missing context."
To access the setting from Threads, tap the two lines in the top-right corner of the profile tab, then tap Account > Other account settings (which takes you to Instagram) > Content preferences > Reduced by fact-checking.
At first glance, the concept sounds appealing. It's essentially a "drama" filter, and who wouldn't want that in some corner of their life? In a statement to NBC News, Meta said the options are intended to give users "more control over the algorithms that rank posts in their feed" and "more power" to decide what they see on the app, adding that the change was made in response to user requests.
NBC News pointed to posts with thousands of likes claiming the change is aimed at censoring content related to the Israel-Hamas war. Whether or not that's true, a tool that invites users to opt in to reduced visibility clearly leaves plenty of room for such suspicions.
Meta uses third-party fact-checkers to evaluate the accuracy of content on Instagram and Facebook, and those decisions now apply indirectly to content on Threads. While fact-checkers cannot directly rate Threads content, the company said it will carry over ratings from Instagram and Facebook to "near-identical content on Threads."
Meta says Instagram has had the fact-checking ranking option for years, though it appears never to have been properly announced. According to the Economic Times, Meta added the feature to Facebook in May, with a Meta spokesperson saying its goal was to "make user controls on Facebook more consistent with the controls that already exist on Instagram."
As online communication has rapidly expanded beyond the once-small pockets of web forums, moderation has not scaled to match. Large social networks have found no silver bullet, and in some cases their efforts have only stoked anger, suspicion about motives, and questions about government involvement.
But Meta needs to rein in its platforms, and not just because of laws mandating moderation in the European Union or continued regulatory efforts in the United States. Advertisers are a big part of the equation, and the company has a perfect example of what abandoning moderation does in X (formerly Twitter). X has reportedly seen revenue decline as increasingly unmoderated rhetoric has contributed to a continuing exodus of advertisers.