With the 2024 U.S. Presidential Election looming, and various other pending polls around the world, Meta is expanding its fact-checking program to cover Threads content as well, as it continues to see more usage in its Twitter-clone app.
As per Meta:
“Early next year, our third-party fact-checking partners will be able to review and rate false content on Threads. Currently, when a fact-checker rates a piece of content as false on Facebook or Instagram, we extend that fact-check rating to near-identical content on Threads, but fact-checkers cannot rate Threads content on its own.”
As noted, given Threads’ rising usage, this is a necessary step. The app already has over 100 million users, with many more seemingly joining week by week, as more new features are rolled out and more new communities take shape within the Threads ecosystem.
On that front, Meta’s been making a big push with sports communities, which has seen the app gain momentum among NBA fans specifically, with the recent In-Season Tournament marking a key milestone for NBA engagement via Threads.
But the more usage rises, the greater the risk of misinformation and harm, which is why Meta needs to expand its fact-checking process to cover unique Threads content, as well as duplicate posts across its other apps.
In addition to this, Threads users will also soon get more control over how much sensitive content they’re exposed to in the app:
“We recently gave Instagram and Facebook users more controls, allowing them to decide how much sensitive or, if they’re in the U.S., how much fact-checked content they see on each app. Consistent with that approach, we’re also bringing these controls to Threads to give people in the U.S. the ability to choose whether they want to increase, lower or maintain the default level of demotions on fact-checked content in their Feed. If they choose to see less sensitive content on Instagram, that setting will also be applied on Threads.”
Fact-checking has become a more contentious topic this year, with X owner Elon Musk labeling much of the fact-checking conducted by social media platforms as “government censorship”, and framing it as part of a broader conspiracy to “control the narrative” and limit discussion of certain topics.
That claim doesn’t hold up, and the various reports Musk has commissioned into supposed government interference at Twitter 1.0 haven’t actually demonstrated the broad-scale censorship he’s suggested.
But at the same time, there is a need for a level of fact-checking to stop harmful misinformation from spreading. Because when you’re in charge of a platform that can amplify such content to millions, even billions of people, there is a responsibility to measure and mitigate that harm, where possible.
That’s a more concerning aspect of some of Musk’s changes at the app, including the reinstatement of various harmful misinformation peddlers, who can now broadcast their false claims on the platform once again.
Back in 2016, in the wake of that year’s U.S. Presidential Election, there finally seemed to be a level of acknowledgment of the impacts of social media, and how social media movements can influence voting outcomes, and can thus be manipulated by ill-intentioned groups.
There were Russian manipulation campaigns, for one, but other groups had also been able to coordinate and proliferate via social apps, including QAnon, the Proud Boys, “Boogaloo” groups, and more.
We then also saw the rise of counter-science movements, like flat-Earthers and anti-vaxxers, the latter even leading to a resurgence in long-dormant diseases in Western nations.
Following the election, a concerted effort was made to tackle these groups across the board, and combat the spread of misinformation via social apps. But now, eight years removed, and heading into another U.S. election period, Elon Musk is handing a mic to many of them once again, which is set to cause chaos in the lead-up to the coming polls.
The ultimate outcome is that misinformation will once again play a significant part in the next election cycle, as those driven by personal agendas and confirmation bias use their renewed platforms to mobilize their followers, and solidify support through expanded reach.
This is a dangerous situation, and I wouldn’t be surprised if more action is taken to stop it. Apple, for example, is reportedly considering removing X from its App Store after X’s reinstatement of Alex Jones, who’s been banned by every other platform.
That seems a logical step, because we already know the harm that these groups and individuals can cause through spurious, selective reporting and deliberate manipulation.
With this in mind, it’s good to see Meta taking more steps to combat misinformation, which will only become a bigger issue the closer we get to each election around the world.
Because there are no “alternative facts”, and you can’t simply “do your own research” on more complex scientific matters. That’s what we rely on our experts for, and while it’s more entertaining, and engaging, to view everything as a broad conspiracy, for the most part, that’s very, very unlikely to be the case.