Tuesday, September 28, 2021

Why sites like Twitter bring out the worst in us

Illustration by David Plunkert

The root of political tribalism isn’t in social media echo chambers — it’s deep inside ourselves

It is 4:30 p.m. in early September 2018, and Dave Kelly has just finished his workday at an advertising firm. He pops a CD into the stereo of his aging car and prepares to do battle with a formidable enemy: the New Jersey Turnpike at the beginning of a holiday weekend.

When Dave finally reaches the exit for his hometown more than one hour later, he stops to perform a weekly ritual. Each Friday night, Dave checks out half a dozen books from his local library and settles in to read for at least an hour. This week he has chosen a mix of well-thumbed paperback novels, a book about the latest advances in cancer research and a thick tome on human nature by an evolutionary anthropologist.

Though he might not fit the stereotype of a Donald Trump supporter, Dave voted for the former real estate magnate in 2016. Raised in a family of moderate Democrats, Dave veered toward the right in the 1980s because he was so impressed by the leadership of Ronald Reagan.

But Dave is not a card-carrying member of the Republican Party. He cast two ballots for Bill Clinton in the 1990s and takes liberal positions on most civil rights issues. “I’m perfectly happy with gay marriage,” Dave says. “I don’t understand why you would want to make an issue out of that.”

But on economic matters, Dave is more libertarian. When he learned that New York City officials were considering a new law that would require businesses with more than five employees to provide two weeks of paid vacation, Dave warned, “There’s gonna be a lot of companies that fire people to get away from that. There’s gonna be companies that just can’t do it and are gonna go out of business.”

Living outside liberal Philadelphia — and working in a profession dominated by Democrats — Dave normally hides his conservative views. “I have friends I won’t discuss this stuff with,” he says, “because I’m not going to change my mind and they’re not going to change theirs — so what’s the point?”

The few times he tried to start such conversations, he explains, things quickly became heated — and the only thing Dave hates more than New Jersey traffic is heated arguments about politics.

Because he feels like an unwelcome minority in his day-to-day life, Dave describes social media as a kind of refuge. He originally joined Facebook and Twitter to escape politics and follow updates about his favorite television shows. But he kept finding himself getting “sucked into political discussions.”

Over the past few years, Dave — who does not use his real name on social media — has spent many late nights arguing with Democrats on Twitter. Remembering one of these conflicts, Dave said, “Don’t judge me ... I had a couple of beers.”

A local radio station, he explained, had reported that a group of white supremacists was planning to march on the campus of a nearby university. “Turns out they’re not,” he says. “The whole thing is a hoax.” After reading more about the story, Dave learned that one of the groups that raised the alarm was the progressive Southern Poverty Law Center. “They pretty much claim anyone who’s to the right of Karl Marx is a hate group,” he says. When he dismissed the incident on Twitter, another user quickly fired back, calling him a racist. “I called her an idiot,” he says. She didn’t know what she was talking about, he decided, because she was only getting one side of the story.

But so is Dave. Though he prides himself on being informed, Dave gets his news from a conservative talk radio station, the right-leaning website Daily Caller and Twitter. Of the several hundred accounts that he follows on Twitter, only New York Times columnist Bret Stephens could be described as centrist.

Dave has consumed a steady diet of conservative views on social media for years. Each day, his feed gets filled with content from Fox News, posts by Trump and other prominent Republicans, and dozens of memes bemoaning liberal hypocrisy. Dave has even retweeted a few messages from Russian trolls masquerading as American conservatives along the way.

And that drunken Twitter argument about the white supremacist march at a local university? It turns out that Dave used more colorful language than “idiot” to describe his liberal opponent that night.


You might think you already know what’s going on here: Dave is stuck in an echo chamber.

Social media sites allow people to choose what types of information about politics they want to expose themselves to — or learn what Justin Bieber ate for dinner last night. The problem is that most people seek out information that reinforces their preexisting views. We connect with newspapers, pundits or bloggers who share our worldview.

If you’re a conservative like Dave, you might follow Tucker Carlson, the Fox News host, since you appreciate what he has to say about government spending or illegal immigration. And if you’re a progressive liberal, you might follow CNN’s Don Lemon because you appreciate his frequent posts about the issues you care about — racial inequality, perhaps, or climate change.

The problem, the story goes, is that our ability to choose what we want to see traps us inside echo chambers that create a kind of myopia. The more we are exposed to information from our side, the more we think our system of beliefs is just, rational and truthful. As we get pulled deeper into networks that include only like-minded people, we begin to lose perspective.

We fail to recognize that there are two sides to every story or we begin listening to different stories altogether. Echo chambers have their most pernicious effect, common wisdom suggests, when people like Dave are unaware of them: when people think that they are doing research about an issue, but they are actually just listening to what they want to hear.

When we encounter people from the other side, their views can therefore seem irrational, self-serving or — perhaps most troubling — untrue. If we could only step outside our echo chambers, many people argue, political polarization would plummet.

The concept of the echo chamber existed long before social media did. Political scientist V. O. Key introduced the concept in the 1960s to describe how repeated exposure to a single media source shapes how people vote. The concept gained major traction, however, with the rise of 24/7 cable news stations in more recent decades. Social scientists quickly realized that such stations were allowing Democrats and Republicans to perceive starkly different versions of reality.

A popular example of the echo chamber effect comes from the lead-up to the 2003 U.S. invasion of Iraq. During this period, Fox News repeatedly claimed that Saddam Hussein, the Iraqi dictator, was collaborating with al-Qaeda, the terrorist organization responsible for the Sept. 11 attacks. Those claims were later shown to be false. But an influential study found that Fox News viewers were twice as likely to believe that such links existed as those who got their news from other sources.

If you are a Democrat, don’t pat yourself on the back too quickly. A recent study showed more Democrats are trapped inside echo chambers than Republicans.

Concerns about echo chambers gained added urgency with the rise of the internet and social media. In his influential 2001 book, “Republic.com,” legal scholar Cass Sunstein warned that partisan websites and blogs would allow people to avoid opposing views even more efficiently than cable news.

The internet activist Eli Pariser pushed this argument even further in his 2011 book “The Filter Bubble.” He argued that the algorithms employed by large technology companies made the echo chamber effect even worse. Facebook, Google and other giant corporations, he contended, exacerbate our built-in tendency to seek information aligned with our worldview by recommending ever more of the same content to us. The most dangerous part of these algorithms, Pariser argued, is that social media users are not aware of them. Filter bubbles can preclude the very possibility of bipartisan interaction, Pariser warned, allowing our deeply biased views to go unchallenged.

Illustration by David Plunkert

Meanwhile, social scientists began to uncover substantial evidence of social media echo chambers as well. A 2015 study by data scientists at Facebook estimated only one-quarter of the content that Republicans post on Facebook is ever seen by Democrats, and vice versa. A study of Twitter reached similar conclusions. More than three-quarters of the people who retweet — or share — a message, the study concluded, belong to the same party as the message’s author.
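To make this kind of measurement concrete, here is a minimal sketch of how one could compute the share of retweets that stay within a single party. The data, party labels and numbers below are invented for illustration; they are not the studies' actual code or results.

```python
from collections import Counter

# Hypothetical retweet events as (retweeter's party, original author's party).
# In published studies, party labels are inferred from follow networks or
# voter-file matches; here they are simply made up.
retweets = [
    ("R", "R"), ("R", "R"), ("D", "D"), ("D", "R"),
    ("D", "D"), ("R", "D"), ("D", "D"), ("R", "R"),
]

same_party = sum(1 for retweeter, author in retweets if retweeter == author)
print(f"Same-party share of retweets: {same_party / len(retweets):.0%}")

# Party mix of retweeters, useful as a baseline for comparison.
print(Counter(retweeter for retweeter, _ in retweets))
```

A same-party share far above what the overall mix of users would predict is the kind of signal researchers read as evidence of an echo chamber.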

These findings were particularly concerning since social media was rapidly becoming one of the most popular ways for Americans to get their news. Between 2016 and 2018, the number of people who got their news from social media surpassed the number who learned about current events from print newspapers. By 2018, social media had become the most popular news source for people ages 18 to 29.

It should come as no surprise, then, that a growing chorus of technology leaders, pundits and policymakers now warn of a grim future in which any discussion of politics on social media will quickly devolve into tribalism. We hear calls for social media platforms to break our echo chambers — or at least revise the algorithms that reinforce their walls. And if social media companies won’t relent, then social media users should begin stepping outside of their echo chambers themselves. Only then, many people believe, can we begin the difficult conversations needed to beat back polarization on our platforms.

It’s a compelling story — especially when the people who tell it are those who helped build social media platforms and now regret their actions. But I believe the common wisdom about social media, echo chambers and political polarization may not only be wrong, but also counterproductive.


Common wisdom often becomes unassailable because it is very difficult to verify. For decades, social scientists have wondered whether echo chambers shape our political beliefs, but studying this process is very challenging. We can analyze people like Dave Kelly — the Trump voter described above — but are his experiences typical? Echo chambers result from the coordinated behavior of millions of people across sprawling social networks that evolve in complex patterns over time.

Even if we had the time and resources to identify thousands of Dave Kellys — and see that people like him develop increasingly partisan views over time — how could we be sure that people’s echo chambers shape their political beliefs, and not the other way around?

If our political beliefs guide how we try to understand the world, would we really give them up so easily? Would Dave Kelly begin to moderate his views if we suddenly began exposing him to social media posts from progressive groups like the Southern Poverty Law Center?

Regardless of what you think about echo chambers, Facebook, Twitter and other social media platforms have produced exciting new opportunities to study them. The social sciences were once considered data poor compared to other fields of study. But some platforms now allow us to collect information about millions of people in seconds. Even more importantly, we can now conduct an epidemiology of ideas, tracing how beliefs about the world spread across large social networks over time.

The age of computational social science — the study of human behavior using large digital data sets — also provides new opportunities for experimentation. By embedding randomized controlled trials within social media platforms, computational social scientists have been able to increase voter turnout, organ donation and a host of other positive human behaviors. These types of experiments also hold enormous power to provide insights into social media echo chambers.
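As a rough illustration of what embedding a randomized controlled trial in a platform involves, the sketch below randomly assigns simulated users to see (or not see) a get-out-the-vote prompt and compares turnout in the two groups. The users, base rate and effect size are all invented; only the logic of random assignment and a difference in means reflects how such experiments work.

```python
import random

random.seed(42)
users = range(10_000)  # stand-ins for real accounts in a platform experiment

# Randomly assign each user to see the voting prompt (treatment) or not (control).
assignment = {u: random.choice(["treatment", "control"]) for u in users}

def voted(user: int) -> bool:
    # Simulated turnout: a made-up 40% base rate plus a small bump from the prompt.
    bump = 0.02 if assignment[user] == "treatment" else 0.0
    return random.random() < 0.40 + bump

outcomes = {"treatment": [], "control": []}
for u in users:
    outcomes[assignment[u]].append(voted(u))

for group in ("treatment", "control"):
    rate = sum(outcomes[group]) / len(outcomes[group])
    print(f"{group}: {rate:.1%} turnout")
```

Because assignment is random, any reliable gap between the two groups' turnout rates can be attributed to the prompt itself rather than to who chose to see it.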

But there is also a dark side to computational social science. In 2013, the psychologist Michal Kosinski launched a study to determine whether patterns in social media data — such as information about the things we like or the accounts we follow — could be used to predict our ethnicity, sexual orientation or even our intelligence. Kosinski and his team produced an app that allowed Facebook users to perform a personality test on themselves via the data generated within their accounts. But the now-infamous political consulting firm Cambridge Analytica allegedly created a similar app to collect data for a nonacademic purpose: creating microtargeting campaigns to sway political elections. Though many social scientists question whether such ads were effective, the story highlights a dangerous precedent: The tools of computational social science can be repurposed to violate privacy and potentially manipulate the behavior of people who did not consent to be studied.
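To give a sense of the kind of prediction task described here, the sketch below fits an off-the-shelf classifier that guesses a binary trait from which pages a user has liked. The likes, the trait and the model choice are all stand-ins, not the features or methods of the original study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: each row is a user, each column a page they did (1) or did not (0) like.
likes = rng.integers(0, 2, size=(200, 50))

# An invented binary trait that happens to correlate with liking the first few pages.
trait = (likes[:, :5].sum(axis=1) + rng.normal(0, 1, 200) > 2.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(likes, trait)

# Predicted probability of the trait for a new, hypothetical user's likes.
new_user = rng.integers(0, 2, size=(1, 50))
print(f"Predicted probability: {model.predict_proba(new_user)[0, 1]:.2f}")
```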

Computational social science has another problem too: The digital footprints we leave behind on social media platforms provide a very incomplete record of human behavior.

As a thought experiment, let’s put Dave Kelly’s data into the type of app created by Cambridge Analytica. We could easily conclude that Dave is a Republican by analyzing the news organizations and pundits he likes or follows. A political campaign might even be able to identify which television shows Dave watches and buy commercials to reach people like him.

But we would also misunderstand some of the most important things about Dave. Though his Twitter feed makes him seem like an angry “Make America Great Again” warrior, the app would not reveal that Dave is actually worried about climate change and disappointed by his party’s treatment of gay people. You’d never know that Dave thinks Trump is a bully or worries about racial discrimination in policing. You would not learn that Dave was skeptical about whether white supremacists were really marching at a nearby university during the incident I described earlier because he believes media organizations are stoking ethnic tensions for financial gain. Most important, you would not learn that this issue is particularly important to Dave because he is part Puerto Rican and suffered terrible discrimination as a child.

I mention these details not only to show how many things are left out of the digital record of our lives, but also because I believe the rapidly growing gap between social media and real life is one of the most powerful sources of political polarization in our era.

How did I come to this conclusion? I am a computational social scientist who has spent my entire career studying how social media shapes political polarization. Several years ago, I became so concerned about political tribalism that I founded the Polarization Lab — a team of social scientists, statisticians and computer scientists at Duke University, where I am a professor. Our team diagnoses the problems with our platforms using scientific research and builds new technology to reverse the course. Together, my colleagues and I have collected hundreds of millions of data points that describe the behavior of thousands of social media users over multiple years. We’ve run new kinds of experiments with automated social media accounts, conducted some of the first studies of how foreign misinformation campaigns influence people and ventured deep inside social media companies to help them fight polarization. We’ve even created our own social media platform for academic research — allowing us to turn on and off different features of platforms to identify better ways of connecting people.

This work has led me to question the conventional wisdom about social media echo chambers, but it has also inspired me to ask much deeper questions. Why does everyone seem so extreme on social media? Why do people like Dave Kelly spend hours arguing with strangers, even when they don’t think it will change anyone’s mind? Is using social media a temporary addiction that we can shake — like smoking — or is it fundamentally reshaping who we are and what we think of each other? No amount of data science wizardry can answer these questions.

Instead, I wanted to see social media through the eyes of the people who use it each day. This is why our lab spent hundreds of hours interviewing people like Dave Kelly and carefully reconstructing their daily lives on- and offline.

These stories not only help paint a more complete picture of how political polarization unfolds on social media; they also inspired me and my colleagues to run new types of large-scale experiments in turn.

Studying social media from the perspective of the people who use it is also important because they are conspicuously absent from public debates about social media and political tribalism. Instead, our current conversation is dominated by a handful of tech entrepreneurs and software engineers who helped build our platforms. These Silicon Valley apostates now claim the technology they created wields unprecedented influence over human psychology — technology that not only traps us within echo chambers, but also influences what we buy, think or even feel. Facebook, Twitter and other platforms were either asleep at the wheel when malicious foreign actors launched campaigns to influence social media users — these apostates claim — or willfully ignored them because they increased user engagement (and therefore their bottom line).

This narrative is very seductive for anyone searching for a scapegoat for our current situation, but is it really true? Though social media companies are by no means blameless for our current situation, the evidence that people are simple dupes of political microtargeting, foreign influence campaigns or content recommendation algorithms is surprisingly thin.

Our focus upon Silicon Valley obscures a much more unsettling truth: The root source of political tribalism on social media lies deep inside ourselves. We think of platforms like Facebook and Twitter as places where we can seek information or entertain ourselves for a few minutes. But in an era of growing social isolation, social media platforms have become one of the most important tools we use to understand ourselves — and one another.

We are addicted to social media not because it provides us with flashy eye candy or endless distractions, but because it helps us do something we humans are hardwired to do: Present different versions of ourselves, observe what other people think of them and revise our identities accordingly.

But instead of a giant mirror that we can use to see our entire society, social media is more like a prism that refracts our identities — leaving us with a distorted understanding of one another, and ourselves. The social media prism fuels status-seeking extremists, mutes moderates who think there is little to be gained by discussing politics on social media and leaves most of us with profound misgivings about those on the other side, and even the scope of polarization itself.

If social media platforms are so deleterious to democracy, why not delete our accounts? After all, I might enjoy using carrier pigeons to communicate my latest musings on Justin Bieber. But deleting our accounts is just not realistic. Social media has become so woven into the fabric of our lives — and particularly those of young people — that it is here to stay.

The good news is this: If we social media users are the main source of political polarization, this means we also have the power to push back against it.

Excerpted from “Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing,” by Chris Bail, Princeton University Press.

This story appears in the October issue of Deseret Magazine. Learn more about how to subscribe.




