DISCLOSURE: Sourced from Russian government-funded media
RT: Internal Facebook documents leaked to the media confirmed accusations that its algorithms pushed users to radical views. Proponents of increased content censorship have received new arguments for their case.
This was, for me, one of the first signs that something was systematically messed up in a potentially terrible way. I’d joined a whole lot of antivax, then chemtrails, and then flat earth groups (as it prompted me)…and then it started serving up this. I got Q pretty early too. https://t.co/oyYN3OTEga
— Renee DiResta (@noUpside) October 22, 2021
Critics, including some of its own employees, believe the Silicon Valley giant doesn’t do enough to protect society from various ills. Meanwhile, Facebook is working on new ways to curb the growth of certain groups through manipulation rather than blanket bans.
‘Carol Smith’, a self-identified politically conservative mother from Wilmington, North Carolina with an interest in Christianity, joined Facebook in the summer of 2019. She subscribed to the accounts of Fox News, Sinclair Broadcasting, and Donald Trump.
Within two days, Facebook suggested she check out the latest memes from QAnon – a conspiracy theory claiming that a shadowy group of American ‘patriots’, along with Donald Trump, was about to crack down on a nationwide ring of powerful pedophiles, including prominent politicians. Within three weeks, her feed had become “a constant flow of misleading, polarizing and low-quality content,” according to an internal Facebook report leaked by an ex-employee along with a trove of other documents.
By turning a blind eye to QAnon, “Facebook literally helped facilitate a cult,” Renee DiResta, a researcher of ‘malign content’ at the Stanford Internet Observatory, told NBC.

‘Carol’ was a fake persona created by Facebook researchers to see how quickly the social media network’s algorithms lured users into the ‘rabbit hole’ of political extremism. The document was leaked by a former Facebook employee, Frances Haugen, and shared with media outlets, including NBC News and the New York Times.
The social media network downplayed the accusations, saying that ‘Carol’ was “a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform.” The company started removing groups associated with QAnon in August 2020.
Haugen’s leaks have been trumpeted by critics of Facebook, who accuse it of being too slow and too restrained in quashing harmful content on its platform. The tech giant has rejected the accusations in broad terms, saying its policies and internal studies have been mischaracterized by biased reporters. The claim that a lack of censorship on Facebook ultimately led to the January 6 riot on Capitol Hill is one of the primary charges against the company.
The experiment with ‘Carol’ didn’t really reveal anything new to Facebook, since independent researchers have conducted their own studies along the same lines. And this kind of algorithm-driven political polarization is not limited to fans of Donald Trump. Fake left-leaning users have also been fed ‘low-quality memes and political misinformation’, according to the Times.
It does, however, fit perfectly with the goals of those who want Facebook and other US-based social media sites to ramp up restrictions on what kind of speech is allowed on their platforms, and to tighten enforcement of those rules. Some of this push has come from Facebook’s own employees, like Haugen, especially after the January 6 riot.
“I wish I felt otherwise, but it’s simply not enough to say that we’re adapting, because we should have adapted already long ago,” one employee reportedly wrote after supporters of Trump broke into the Capitol building, as cited by the Times.
“There were dozens of Stop the Steal groups active up until yesterday, and I doubt they minced words about their intentions,” the message added, referring to the claim that Joe Biden didn’t win the presidential election fairly. Former President Trump has repeatedly made that claim, which ultimately resulted in his being ousted from various social media sites that accused him of inciting the January 6 riot.
Facebook is accused of a pattern of declining to preemptively suppress harmful content and paying attention to undesirable movements only after they have grown, though its management says it devotes substantial resources to identifying and dealing with these types of problems.
Content policing puts any platform in a tricky position, in which it has to balance preventing harm with freedom of speech. The tech giants have come under increased pressure since 2016, when, according to critics, they allowed Russia to meddle in the presidential election and help Trump get elected. Moscow denies that it has interfered or ‘meddled’ in the domestic affairs of the US.
Censorship proponents warn that a lack of action leads to cross-pollination and synergies between various fringe groups, NBC said. QAnon supporters have links with vaccine skeptics, and together they can draw in others, like ‘incel’ misogynists and ‘disinformation agents’, as the outlet describes it, to grow their numbers and strength.
Meanwhile, Facebook continues to look for creative ways to stifle unwanted speech on its platform, while staying on the right side of the First Amendment. One of the latest projects, which was described by NBC, is called ‘Drebbel’, after a 17th-century Dutch inventor.
The Drebbel group is analyzing the now-defunct QAnon and anti-vaxxer groups to learn how they managed to attract millions of Facebook users. The researchers hope to identify the traits of a ‘gateway group’ in these networks and to systematically set up roadblocks that keep users away from targeted movements.
“Group joins can be an important signal and pathway for people going towards harmful and disruptive communities,” the research team reportedly stated in a post on the corporate message board. “Disrupting this path can prevent further harm.”
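The leaked post describes the signal but not a method. Below is a minimal sketch of how such a ‘gateway’ score could be derived from group-join logs, assuming joins are recorded as (user, group, timestamp) tuples and that some groups are already designated harmful. The log format, the group names, and the gateway_scores helper are illustrative assumptions, not Facebook’s actual system.

```python
from collections import defaultdict

# Toy join log: (user, group, timestamp). In reality this would come from
# platform event data; the format here is an assumption for illustration.
JOIN_LOG = [
    ("u1", "gardening", 1), ("u1", "wellness_tips", 2), ("u1", "qanon_hub", 3),
    ("u2", "wellness_tips", 1), ("u2", "qanon_hub", 2),
    ("u3", "gardening", 1),
    ("u4", "wellness_tips", 1),
]

HARMFUL = {"qanon_hub"}  # groups already designated harmful (assumed given)

def gateway_scores(join_log, harmful, min_members=2):
    """Score each ordinary group by the share of its members who later
    joined a harmful group. High scores flag 'gateway' candidates."""
    joins_by_user = defaultdict(list)
    for user, group, ts in join_log:
        joins_by_user[user].append((ts, group))

    members = defaultdict(set)    # group -> all members seen in the log
    converted = defaultdict(set)  # group -> members who later went harmful
    for user, joins in joins_by_user.items():
        joins.sort()  # order each user's joins chronologically
        for i, (_, group) in enumerate(joins):
            if group in harmful:
                continue
            members[group].add(user)
            # Did this user join a harmful group *after* this one?
            if any(g in harmful for _, g in joins[i + 1:]):
                converted[group].add(user)

    return {
        g: len(converted[g]) / len(members[g])
        for g in members
        if len(members[g]) >= min_members  # ignore tiny groups (noisy)
    }

print(gateway_scores(JOIN_LOG, HARMFUL))
# {'gardening': 0.5, 'wellness_tips': 0.666...}
```

A high score would flag a group as a candidate ‘gateway’, and the ‘roadblocks’ the researchers mention could then take the form of friction applied to such groups – for example, demoting them in recommendations or adding a confirmation prompt before joining.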
Haugen’s leaks and her testimony before Congress provided a boost to the case for increased government control over social media. It remains to be seen whether half-measures like throttling user attention will satiate Facebook’s critics, or whether more drastic measures will be demanded.
My theory is that the data is not only designed to foment, but can target and lure individuals and then send them where its operators want them, when they want. It is far too tempting a herding tool, and political agendas would be drooling over it – particularly the ones we saw directly benefit.

Cambridge Analytica was not the equivalent of an ad campaign. It was more of a herd-driving operation using the latest tech and peculiar data streaming: select targeting, and high retention achieved by injecting increasingly dramatic content and a “sense of belonging”.

Saying “Facebook knew it was radicalizing” is a massive understatement of the complexity and intention, given the mass psychosis we all saw unfold as a result. Examining what was accomplished should be enough to break up that company, and Google too. We need to talk about what seemingly banal data (like birthdates) can do for real. Ignoring it is not wise.