(Image by ProStock Studio on Shutterstock)
Social media is an echo chamber, and not a particularly big one at that.
In A Nutshell
- The people dominating online political debates aren’t like most of us, according to researchers. A study shows the most active commenters are uniquely comfortable with conflict and drawn to polarization. Meanwhile, most people find these same environments toxic and off-putting, so they stay quiet—distorting what appears to be “public opinion” online.
- Researchers discovered the same toxic environment pushes people in opposite directions. What silences moderate voices actually motivates the loudest users to post even more. This creates a cycle where online spaces feel increasingly hostile while a small minority makes it look like their views are everywhere.
- Scientists tested whether money or civility rules could fix the problem—they didn’t. Offering people up to $40 to participate barely moved the needle. Reminding everyone to “stay civil” changed nothing. Even when discussions were genuinely constructive, people still saw them as too hostile to join.
- The study shows what we see online is basically a funhouse mirror. The loudest voices make extreme views look way more common than they actually are. Researchers say platforms might need stronger moves—like removing toxic content, banning repeat offenders, or limiting how much any one person can flood the conversation.
Ever feel like the people dominating political debates online don’t sound like anyone you know in real life? You’re not imagining things. A study reveals that online political discussions are systematically dominated by a small, unrepresentative minority, and the very conditions that silence most people actually motivate the loudest voices to post even more.
Researchers created six private Reddit communities and recruited 520 Americans to discuss 20 contentious political issues over four weeks. What emerged was a troubling portrait of how social media distorts public opinion. While 331 of the 520 participants posted at least once, participation was extremely unequal: a handful of users generated most of the 5,819 total comments. Among the most engaged participants, some wrote over 100 comments while others contributed just one or two.
Why Most People Stay Silent in Online Political Discussions
This wouldn’t matter much if the silent majority and vocal minority held similar views and temperaments. But they don’t. The study, published in Science Advances, found that people who stayed silent consistently perceived discussions as more toxic, disrespectful, and unconstructive than those who spoke up. Meanwhile, the most active commenters (those posting dozens or even hundreds of times) tended to post more in environments they perceived as polarized or toxic.
The conditions that drive moderate voices into the shadows act as jet fuel for the most active users. As a result, what appears in our feeds becomes increasingly divorced from what the broader public actually thinks.
The researchers didn’t just observe this dynamic. They tried to fix it. In one set of communities, participants were offered $2 for every day they wrote at least one serious comment, potentially earning $40 over the month. In another set, moderators emphasized civility norms and activated automated content filters. The financial incentive reduced participation inequality somewhat, though the effect was surprisingly modest given the payout. The civility treatment didn’t measurably change participation or toxicity, likely because baseline toxicity in the communities was already very low.

Who Actually Comments on Political Posts Online
Men commented significantly more than women or nonbinary participants. People with higher political interest posted more often. But the strongest predictors were perceptual. Lurkers (those who read but never commented) rated their communities as toxic, disrespectful, and lacking in knowledge. They found discussions unenjoyable and unconstructive.
Among the minority who did comment, those same negative perceptions had the opposite effect. The more someone perceived the group as polarized, toxic, and ignorant, the more they posted. Users who felt distant from the group’s political attitudes also commented more frequently, even within these mostly like-minded groups. These are people comfortable with conflict, drawn to debate, unbothered by hostility.
The study used AI to measure objective toxicity in each community and found both factors mattered: Communities with genuinely more toxic content saw less participation overall, but individual sensitivity to toxicity determined who stayed and who left. People with thicker skin stuck around and dominated the conversation.
Social feedback amplified these dynamics. When participants received upvotes on their comments, they posted significantly more afterward. A single point of positive reinforcement marked a clear threshold. Scores above Reddit’s default baseline predicted increased commenting, while scores below it correlated with withdrawal. Early success bred more engagement, creating popular users who shaped entire discussions.
The Truth About Online Political Debates
The composition of these communities leaned heavily liberal, with participants largely agreeing on political orientation. Yet even in these relatively homogeneous spaces, the participation gap persisted. Among 520 people who cared enough to sign up for a political discussion study, a small subset produced most of the comments.
Those who did speak up wrote remarkably long, thoughtful comments (much longer than their typical Reddit posts elsewhere). Average toxicity remained very low throughout the study despite controversial topics like gun control, abortion policy, and immigration. By objective measures, these were constructive conversations. Yet lurkers still perceived them as hostile and uninviting.
What feels toxic or polarized is subjective and varies wildly between people. The same environment that seems respectful and welcoming to one person feels hostile and unconstructive to another. And people who find it hostile are exactly those most likely to stay silent, while those comfortable with conflict keep posting.
Over the month-long study, participants showed some attitude shifts. On average, people showed a small leftward shift, which aligned with the sample’s overall lean. Affective polarization (emotional dislike of the opposing party) decreased. But trust in politics, media, and others declined, as did self-reported political interest.
Some explicitly said they felt “intimidated by the sophistication” of other commenters or “could not add anything” to discussions. Even in relatively friendly environments, people felt outmatched by more confident voices.
How Platforms Could Make Online Discussions More Representative
The people we see online (those shaping perceived public opinion, driving viral moments, influencing algorithms) are systematically different from the silent majority in troubling ways. They’re more comfortable with toxicity, more drawn to polarization, and more persistent in hostile environments. Their voices drown out moderate perspectives not because they’re more numerous, but because they’re uniquely motivated by the very conditions that silence everyone else.
Light-touch interventions like reminding people to be civil appear useless. Even substantial financial incentives only partially close the gap. The researchers argue that stronger, structural approaches may be more promising. They point to options like actively removing toxic content and banning repeat offenders to change the environment itself, rather than trying to change who participates. They also propose capping how much any one person can post in a given timeframe, indirectly encouraging diverse voices.
Platforms could offer social or symbolic rewards for constructive participation by highlighting quality contributions, awarding badges, or enabling “thank you” buttons. But the deeper challenge remains: The people most willing to engage heavily in online political discussions are exactly those whose voices least represent the broader public. Until that changes, social media will continue to present a funhouse mirror version of public opinion, where the loudest voices create the illusion of consensus while most people watch in silence.
Paper Notes
Study Limitations
The research was conducted during a specific four-week period in June-July 2024, and temporal confounds from external political events cannot be ruled out. The sample skewed liberal and highly educated, limiting generalizability to other political groups and demographics. The study examined relatively homogeneous discussion groups, and results may differ in more politically diverse environments. The research was conducted on Reddit with U.S. participants and may not generalize to other platforms or cultural contexts. The civility treatment combined norm messaging with automated moderation, making it impossible to attribute the null result to either component alone. Statistical power for detecting treatment effects was limited by the group-level design with only six communities.
Funding and Disclosures
This project was funded by the European Commission (Horizon 2020 grant 101094752 SoMe4Dem). The researchers acknowledge support from the Max Planck Institute for Human Development. The authors declared no competing interests.
Publication Information
Lisa Oswald (Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany), William Small Schulz (Social Media Lab and Cyber Policy Center, Stanford University, Stanford, USA), and Philipp Lorenz-Spreen (Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany; Center Synergy of Systems and Center for Scalable Data Analytics and Artificial Intelligence, Dresden University of Technology, Dresden, Germany). Published in Science Advances, Volume 11, December 10, 2025. DOI: 10.1126/sciadv.ady8022.







