Breaking the Filter Bubble Myth: It’s Users, Not Google


A collaborative study of Google Search results suggests that engagement with partisan and unreliable news is shaped more by users’ political beliefs than by the platform’s algorithms: while Google’s algorithms can surface polarizing content, whether people engage with it depends largely on their personal political outlook.

A study co-authored by Rutgers faculty and published in the journal Nature reveals that user preference and political beliefs, not algorithmic suggestion, are the biggest drivers of engagement with partisan and unreliable news surfaced by Google Search.

The study addressed a long-standing concern that digital algorithms may amplify user biases by offering information that aligns with their preconceived notions and attitudes. Yet the researchers found that the ideological variation in search results displayed to Democrats and Republicans is minimal. The divergence becomes apparent only when individuals choose which search results to click or which websites to visit on their own.

Results suggest the same is true of the proportion of low-quality content shown to users: it doesn’t differ considerably among partisans, though some groups – particularly older participants who identify as ‘strong Republicans’ – are more likely to engage with it.

Katherine Ognyanova, an associate professor of communication at the Rutgers School of Communication and Information and coauthor of the study, said Google’s algorithms do sometimes generate results that are polarizing and potentially dangerous.

“But what our findings suggest is that Google is surfacing this content evenly among users with different political views,” Ognyanova said. “To the extent that people are engaging with those websites, that’s based largely on personal political outlook.”

Despite the crucial role algorithms play in the news people consume, few studies have focused on web search – and even fewer have compared exposure (defined as the links users see in search results), follows (the links from search results people choose to visit), and engagement (all the websites that a user visits while browsing the web).
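In rough Python terms, the three measures can be pictured like this. This is a toy sketch with invented URLs and set logic, assumed here purely for illustration; it is not the study’s actual pipeline:

```python
# Toy browsing-log sets (invented URLs) illustrating the study's three measures.
SERP_LINKS = {"news-a.example/story", "news-b.example/story"}  # shown on a results page
CLICKED = {"news-a.example/story"}                             # clicked by the user
ALL_VISITS = {"news-a.example/story", "blog-c.example/post"}   # every site visited

exposure = SERP_LINKS            # exposure: links the user saw in search results
follows = SERP_LINKS & CLICKED   # follows: result links the user chose to visit
engagement = ALL_VISITS          # engagement: all websites visited while browsing

print(f"exposed: {len(exposure)}, followed: {len(follows)}, engaged: {len(engagement)}")
```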

Part of the challenge has been measuring user activity. Tracking website visits requires access to people’s computers, so researchers have generally relied on more theoretical approaches to speculate about how algorithms affect polarization or push people into “filter bubbles” and “echo chambers” of political extremes.

To address these knowledge gaps, researchers at Rutgers, Stanford, and Northeastern universities conducted a two-wave study, pairing survey results with empirical data collected from a custom-built browser extension to measure exposure to and engagement with online content during the 2018 and 2020 U.S. elections.

Researchers recruited 1,021 participants to voluntarily install the browser extension for Chrome and Firefox. The software recorded the URLs of Google Search results, as well as Google and browser histories, giving researchers precise information on the content users were engaging with, and for how long.
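To make that concrete, here is a hypothetical sketch of the kind of per-visit record such an extension might produce. The field names are assumptions drawn from the description above, not the project’s actual schema:

```python
# Hypothetical per-visit record; fields assumed from the article's description.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class VisitRecord:
    url: str              # page the participant visited
    from_serp: bool       # True if reached via a Google Search results link
    started: datetime     # when the visit began
    dwell_seconds: float  # how long the participant stayed on the page

record = VisitRecord(
    url="https://news-a.example/story",
    from_serp=True,
    started=datetime(2020, 10, 15, 9, 30),
    dwell_seconds=42.0,
)
print(record)
```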

Participants also completed a survey and self-reported their political identification on a seven-point scale that ranged from “strong Democrat” to “strong Republican.”

Results from both study waves showed that a participant’s political identification did little to influence the amount of partisan and unreliable news they were exposed to on Google Search. By contrast, there was a clear relationship between political identification and engagement with polarizing content.
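The pattern can be illustrated with invented numbers. Assuming the seven-point scale is coded −3 to +3 and each news domain carries a hypothetical partisan-slant score (both codings are assumptions for this sketch, not the study’s), similar exposure paired with divergent engagement looks like this:

```python
# Toy illustration: exposure slant is similar across partisans, while
# engagement slant tracks party identification. All numbers are invented.
from statistics import mean

PARTY_ID = {  # seven-point self-identification scale, coded -3..+3 (assumed)
    -3: "strong Democrat", -2: "Democrat", -1: "lean Democrat",
    0: "independent",
    1: "lean Republican", 2: "Republican", 3: "strong Republican",
}

DOMAIN_SLANT = {  # hypothetical partisan-slant scores, -1 (left) to +1 (right)
    "left-news.example": -0.8,
    "center-news.example": 0.0,
    "right-news.example": 0.8,
}

# Toy participants: (party ID, domains exposed to, domains engaged with).
participants = [
    (-3, ["left-news.example", "center-news.example", "right-news.example"],
         ["left-news.example"]),
    (3, ["left-news.example", "center-news.example", "right-news.example"],
        ["right-news.example"]),
]

for pid, exposed, engaged in participants:
    exp = mean(DOMAIN_SLANT[d] for d in exposed)
    eng = mean(DOMAIN_SLANT[d] for d in engaged)
    print(f"{PARTY_ID[pid]}: exposure slant {exp:+.2f}, engagement slant {eng:+.2f}")
```

Here both participants see the same mix of results (exposure slant 0.00 for each), but what they choose to engage with diverges sharply by party identification, mirroring the study’s headline result.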

Platforms such as Google, Facebook, and Twitter are technological black boxes: Researchers know what information goes in and can measure what comes out, but the algorithms that curate results are proprietary and rarely receive public scrutiny. Because of this, many blame the technology of these platforms for creating echo chambers and filter bubbles by systematically exposing users to content that conforms to and reinforces personal beliefs.

Ognyanova said the findings paint a more nuanced picture of search behavior.

“This doesn’t let platforms like Google off the hook,” she said. “They’re still showing people information that’s partisan and unreliable. But our study underscores that it is content consumers who are in the driver’s seat.”

Reference: “Users choose to engage with more partisan news than they are exposed to on Google Search” by Ronald E. Robertson, Jon Green, Damian J. Ruck, Katherine Ognyanova, Christo Wilson and David Lazer, 24 May 2023, Nature.
DOI: 10.1038/s41586-023-06078-5

Source: SciTechDaily – https://scitechdaily.com/breaking-the-filter-bubble-myth-its-users-not-google/
