
Why ‘Popping’ the Social Media Filter Bubble Misses the Point

Social media filter bubbles are not the problem; they are symptomatic of the problem.

Let's be absolutely clear: social media filter bubbles are not responsible for the election of Donald Trump.

There are quite a few problems with this thinking. First, it draws a direct causal line between the outcome of the election and social media usage by supposing that every voter uses social media; but not every ballot cast was filled out by someone with a Facebook account, a Twitter profile, or even internet access. Second, it suggests that social media is the only mechanism by which the forces that characterized this election—misinformation, extremism, radicalization, and paranoia—proliferate.


Recent efforts have attempted to creatively illustrate the echo chambers that form when online social platforms like Facebook curate feeds that reinforce our political ideologies. But social media filter bubbles should be an object of critique, not an object of persecution. Your Facebook feed is not the problem. It's a symptom.

What lies behind the bubble when it bursts?

In the parlance of Wikipedia, a filter bubble results from "a personalized search in which a website algorithm selectively guesses what information a user would like to see based on information about the user. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles."

This is the definition that undergirds the numerous thinkpieces on "popping the filter bubble," which are predicated on the fetish of digital disruption as a tactic that is inherently liberatory. But there is no "way out," not in the traditional sense of bursting, popping, or breaking digital bubbles to create spaces free from tracking, manipulation, or enclosure.

We must ask: what lies behind the bubble when it bursts? Is this pathway even emancipatory, and, if so, what is liberation leading us towards? What does bursting the filter bubble actually achieve, and what happens to our ideas, practices, and behaviors without frameworks like bubbles to lend them cohesion, order, and form? In order to answer these questions and propose practical solutions to the filter bubble enigma, we must make a commitment to understanding what filter bubbles are and are not, and what they conceal or expose in their imbrications of material, algorithmic, ideological, and communicative infrastructures. This essay does not answer those questions, but it offers a cursory point of departure for thinking about lines of interrogation that lead towards possible answers.


Here is what is missing from the operative definition of filter bubble in Wikipedia's account: filter bubbles are not static enclosures, contained environments, or stable platforms that are unidirectionally fed information by the user. A website's "selective guessing" is conditioned to be more accurate, nuanced, and therefore clickable the more a user interacts with the trackable items on a platform like Facebook. Users are implicated in a constantly evolving feedback loop of machine learning relations with social media infrastructure, and their very participation via the indiscriminate "click click like" grammars of action trains platforms like Facebook to continually refine the scope of information that it algorithmically calculates users will find "engageable." The more frequently and robustly a user interacts with a platform, the less guesswork required by the algorithm: both the processes of machine learning and the types of content it produces become more tailored to user performance. In other words: you are responsible for the narrowing of your epistemic horizon line via social media. Or, as researcher on Futures and Digital Media Giancarlo M Sandoval states: "When an individual with a given set of epistemic values enters this algorithmic infrastructure it also forms an algorithmic structure of belief."
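To make that loop concrete, here is a minimal, hypothetical sketch in Python: per-topic engagement scores stand in for the platform's "selective guessing," each click nudges a score upward, and the sampled feed drifts toward the topics the user already clicks on. The topic names, click probabilities, and learning rate are invented for illustration; this is not Facebook's actual ranking system.

```python
# A toy model of the click -> retrain -> narrower feed loop described above.
# All names and numbers are illustrative assumptions, not a real platform's code.
import math
import random
from collections import Counter

TOPICS = ["left_politics", "right_politics", "sports", "science", "celebrity"]

def sample_feed(scores, k=3):
    """Sample k items, weighted by the platform's current engagement scores."""
    weights = [math.exp(scores[t]) for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

def simulate(user_preference, rounds=200, learning_rate=0.2):
    scores = dict.fromkeys(TOPICS, 0.0)  # the platform's guesses: all equal at first
    shown = Counter()
    for _ in range(rounds):
        for topic in sample_feed(scores):
            shown[topic] += 1
            # The user clicks roughly in proportion to their prior preference,
            # and every click sharpens the platform's guess about them.
            if random.random() < user_preference.get(topic, 0.05):
                scores[topic] += learning_rate
    return shown

if __name__ == "__main__":
    random.seed(0)
    # A user who leans toward one political topic ends up being shown little else.
    print(simulate({"left_politics": 0.8, "science": 0.3}).most_common())
```

The point of the sketch is the symmetry the paragraph describes: the algorithm only amplifies the clicks the user supplies, so the narrowing of the feed is co-produced rather than simply imposed.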

Filter bubbles are technically enabled, but they are not strictly technological.


Social media platforms are both epistemes and examples of algorithmic culture at work: the social media "filter bubble" is operationalized and designed by continually active feedback loops and the reinforcement of their mechanisms based on a symbiotic relationship between the user and a platform. In Feed Forward, Mark B Hansen illustrates how this cognitive and algorithmic labor also exploits the activity of "feedforward" loops, which create a "precessual" rather than processual relationship between users and platforms in which the mechanisms that strengthen filter bubbles are not readily apprehended or perceived by a human subject. Feedforward systems challenge the idea that platforms are simply receptacles for information, responding to user inputs. In feedforward interactions, a user's engagement with social media is inversely related to the user's ability to maintain distance from it as a consumer—the feedforward mechanic is a consumption of the human by social media, an autocannibalizing of the user as a communicative agent. Media is increasingly predictive, future-facing, and proactive: it penetrates systems of epistemic belief by anticipating "belief", if belief in this sense can mean the probability calculation that a user will engage with a piece of content on social media.
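As a rough illustration of "anticipating belief" as a probability calculation, consider a sketch in which each candidate post is scored by a predicted probability of engagement before the user has done anything at all, and the feed is ordered by that prediction. The feature names and weights below are invented for the example; real platforms use far richer signals.

```python
# A hypothetical feedforward scorer: rank posts by predicted engagement
# probability, computed before the user acts. Feature names and weights
# are invented for illustration.
import math

def predicted_engagement(user_features, post_features, weights):
    """Logistic score: an estimated P(user engages with post)."""
    z = sum(weights[f] * user_features.get(f, 0.0) * post_features.get(f, 0.0)
            for f in weights)
    return 1.0 / (1.0 + math.exp(-z))

def rank_feed(user_features, posts, weights):
    """Order the feed by anticipated engagement, highest first."""
    return sorted(posts,
                  key=lambda post: predicted_engagement(user_features, post, weights),
                  reverse=True)

if __name__ == "__main__":
    weights = {"left_politics": 2.0, "right_politics": 2.0, "sports": 0.5}
    user = {"left_politics": 1.0, "sports": 0.4}
    posts = [{"right_politics": 1.0}, {"left_politics": 1.0}, {"sports": 1.0}]
    for post in rank_feed(user, posts, weights):
        print(post, round(predicted_engagement(user, post, weights), 2))
```

In this toy version, the "belief" the system acts on is nothing more than the number that falls out of the scoring function, calculated in advance of any click.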

The urgency with which people demand "bursting" filter bubbles needs to be called into question for its implicit premises that digital platforms are inherently restrictive; that digital disruption is inherently emancipatory; and, that social media filter bubbles are unique in preventing users from engaging with ideas that challenge or contest their ideological systems. Filter bubbles are not unique to this algorithmic moment, but they have become even more capable and imbued with agency in the era of machine learning. Filter bubbles are technically enabled, but they are not strictly technological. Social media is relatively new, but filter bubbles are not. Curator of Digital Culture Filippo Lorenzin says that "filter bubble" is a term that, in colloquial usage, seems to refer only to a very specific kind of mechanism: "it's about social media, dot-com algorithms and internet users."


Contrary to its use in casual discourse, the mechanism of filter bubbles is not strictly algorithmic: psychologist Leon Festinger observed its effects decades ago in his theorization of cognitive dissonance. Cognitive dissonance is, arguably, the outcome that bursting filter bubbles aims to produce. The desire to "burst" bubbles is founded on the idea that bursting is a process of informational release fostering informational exchange and the free flow of data, data that likely contradicts beliefs that a user holds and sees reinforced online. But exposure to new ideas and a commitment to listening are not the same.

Furthermore, "freeing" data by bursting filter bubbles fails to consider that there is no blank canvas upon which information is released. The blank canvas is an ideology. Even in its raw form, data is produced by an agent. Data imposes frameworks for interpretation, makes choices about narrative apertures, presupposes the conditions of representation, assumes temporal and spatial boundaries, incorporates unconscious bias, and often relies on editorial decisions that reduce information to categories. The bubble is a metaphor for a politics of choice: of included/excluded, inside/outside; yes/no; 0/1; click/pass. The Trump v Hillary of operational decision-making. Bursting the social media filter bubble to free the constraints on information is the vision of establishing a techno-utopia where simple exposure to alternative facts or persuasive arguments naturally yields a reorientation of belief. This is the real filter bubble. This the fulcrum of division that polarized the 2016 election. Below are more examples of this bubble at work:


Overeducated, out-of-touch — Undereducated, ignorant
Establishment — Anti-establishment
Elite, Financially Stable — White Working Poor
Ideologue, Feminazi — Racist, Fascist
Privileged — Economically anxious
Politically correct — Politically marginalized
Hypocrites — Voting "against self-interest"

Broadly speaking, these are flat, hegemonic categories that neatly fit a systemically reinforced narrative, one that understands the world of Trump vs Hillary supporters in terms of abstracted identity politics subjected to the logic of a two-party system. It's no wonder that the sheer number of thinkpieces on Trump supporters spawned this apt satire, which expertly foregrounds how thinking in terms of abstract identity politics yields necessarily abstract identities. Throughout the course of this election, there were precious few real attempts to understand the motivations of Trump voters qua individuals. The diversity, complexity, and difference of opinion among Trump supporters were eradicated if their motives did not fit within the scope of "economic anxiety," "white supremacy," or "ignorance" as political categories. The attempts at political portraiture turned to superficial caricature. Reluctant Trumpists, trolls, silent voters, lifelong GOP supporters, strategic opportunists seizing on the tides of neonationalist ideology, overt racists—they were, and continue to be, hierarchically compressed into a one-dimensional latitude: The Trump Voter. It's as nearsighted a discursive container as "Hillary supporter" is for its eradication of difference.


Feedforward and feedback processes reinforce the perceived epistemic validity of narrativizing this way. Polling data and voter prediction metrics are enframed by ideologies, and ideologies warp the horizon of real expectation. Invoking Giancarlo Sandoval again: "This is where the algorithmic structure of belief is further aided by a human-machinic ontology and provides us with examples of groups and subgroups that can be mappable based on their beliefs—'if you are a liberal you read The Guardian, if you are a conservative you read The Telegraph.'" Democrats and Republicans are both guilty of this, but apportioning blame is beside the point. The filter bubbles that polarize the extremes of a two-party system by making the "Other" unimaginable foreclose debate: this is the mechanism that is at fault for our current political climate.

Exposure to new ideas and a commitment to listening are not the same.

This is also why, in the wake of Donald Trump's election, it is ironic that so many people took to social media to voice shock and incomprehension: "Who are these Trump voters?!" "I don't know anyone that voted for Trump!" "This is unthinkable!" "This is unspeakable." The left's reaction was to scream dissent into the digital abyss, an attempt to puncture the echo chamber by sounding out the depths of its plateaus: who did they think they were reaching on their feeds, anyway? "How did this happen?! I don't know anyone that voted for Trump!" should be a lesson for those who voted against his election, one that informs the strategies we take up for repairing our fractured political platforms. Let these reactions—my own counted among them—expose the limits of the left's socio-political imaginaries. Permit them to identify the boundary markers of the "liberal elite's" largest and most pervasive "filter bubble." Unthinkability is not a problem of cognition; unspeakability is not a problem of language.

Bursting the online filter bubble will not dissolve the filter bubbles that we carry inside of us and inscribe upon the world. Popping, bursting, and disrupting do nothing if we all maintain our commitments to our epistemic positions on and offline. We need frameworks for thinking through our own being-in-the-world, and "unfettered" information cannot give us that. We need a politics of revisability; a politics that permits individuals to incorporate and embrace new thoughts and perspectives without bursting the semipermeable membrane that contains what's thinkable in the first place. We need a politics that encourages thought-reorientation rather than ossification of belief. The bursting bubble is an image of failed communication because it upholds the impossibility of incorporating difference: it's self-annihilation; it's the idea that different ways of thinking cannot exist except "outside"; it's the idea that belief systems must be nothing if not uniform, coherent, and elegant. We need to permit ourselves the freedom not to claim all the answers as our own and the freedom to construct worldviews that contain the complexities of self-contradiction. As political animals, we need to admit to not knowing right now; to not being able to know; to knowing and not knowing, and so on.

I'll end this essay with a gesture towards generative, rather than disruptive, practices. Marc Jongen's examination of Peter Sloterdijk's conception of Bubbles as psychotypological, breathable milieus elaborates on such a politics of revisability. Sloterdijk, according to Jongen, argues for a politics of foam:

"Unlike in the metaphysical, the one and whole sphere of Being, in a foamy universe of this kind there is no longer any centre from which the ―whole — which is in fact no longer a whole — might be overseen and explained. Nor is there any longer a circumference that would give boundaries and clear contours to the foam in its entirety. What there is, is different perspectives and views that shift from one bubble in the foam to the next, and the possibility for the observer of changing places between the bubbles."

The success that might stem from creating foamy ecosystems is hinted at by a 2013 project by Mounia Lalmas and Daniel Quercia at Yahoo and Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona. The researchers nudged "users to read content from people who may have opposite views, or high view gaps…while still being relevant according to their preferences." The success of this algorithmic intervention demonstrates how gradual, curated exposure to different ideas might make people more receptive to considering alternative viewpoints: the results suggest that "people can be more open than expected to ideas that oppose their own." An experiment like this one shows how, perhaps, the tools of algorithmic curation and the feedforward and feedback loops that strengthen filter bubbles can be appropriated and reprogrammed. In other words, perhaps they can be reformatted to work with user behaviour, but against content, to transform social media platforms into ecosystems that stimulate the intersection of different worldviews rather than side-lining or concealing them. [Of course, other problems persist: Who develops these tools, who encodes these ideologies, who measures the metrics of engagement, who owns these social media platforms?]
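A hedged sketch of the kind of re-ranking that experiment gestures at: keep recommendations topically relevant to the user's existing interests, but add a bonus for items written from a differing viewpoint (the "view gap"). The scoring formula, the stance labels, and the diversity_weight parameter are assumptions for illustration, not the researchers' actual method.

```python
# A toy "relevant but opposing" re-ranker, inspired by (not reproducing) the
# experiment described above. Stance labels and weights are assumed.

def rerank(posts, user_topics, user_stance, diversity_weight=0.5):
    """Score = topical relevance + a bonus for posts from an opposing stance."""
    def score(post):
        relevance = sum(user_topics.get(t, 0.0) for t in post["topics"])
        view_gap = 1.0 if post["stance"] != user_stance else 0.0
        return relevance + diversity_weight * view_gap
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    user_topics = {"immigration": 0.9, "economy": 0.6}
    posts = [
        {"title": "Op-ed A", "topics": ["immigration"], "stance": "left"},
        {"title": "Op-ed B", "topics": ["immigration"], "stance": "right"},
        {"title": "Column C", "topics": ["celebrity"], "stance": "right"},
    ]
    # For a left-leaning reader, the relevant-but-opposing op-ed rises to the
    # top, while irrelevant content stays low regardless of its stance.
    for post in rerank(posts, user_topics, user_stance="left"):
        print(post["title"])
```

The design choice worth noticing is that the diversity bonus rides on top of relevance rather than replacing it, which is what keeps the opposing content readable rather than merely oppositional.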

What it comes down to is this: we need to think of ways to create communities of radical exchange and forgo the idea that emancipation is located in a practice of bursting filter bubbles. One way to approach this is to appropriate the tools and platforms that constrain us in our social media bubbles to design new platforms [or pressure existing ones] to turn the logic of the filter bubble against itself. Of course, in order for new informational patterns to yield receptivity to new ideas, we must be committed to a project of political reorientation, epistemic realignment, and civil communication in the first place. The impossibility of completing this project is also the challenge to continue. Maybe one place to start is by building platforms—in politics, media, society—around the image of bubbles-becoming-foam.