Social Media Fails Many Users. Experts Have an Idea to Fix It


Social media’s shortfalls are becoming more evident than ever. Most platforms have been designed to maximize user engagement as a means of generating advertising revenue—a model that exploits our worst impulses, rewarding sensational and provocative content while creating division and polarization, and leaving many feeling anxious and isolated in the process.

But things don’t have to be this way. A new paper released today by leading public thinkers, titled “Prosocial Media,” provides an innovative vision for how these ills can be addressed by redesigning social media to strengthen what one of its authors, renowned digital activist and Taiwan’s former minister of digital affairs Audrey Tang, calls “the connective tissue or civic muscle of society.” She and her collaborators—including the economist and Microsoft researcher Glen Weyl and Divya Siddarth, executive director of the Collective Intelligence Project—outline a bold plan that could foster coherence within and across communities, creating collective meaning and strengthening democratic health. The authors, who also include researchers from King’s College London, the University of Groningen, and Vanderbilt University, say it is a future worth steering toward, and they are in conversation with platforms including Bluesky to implement their recommendations.


Reclaiming context

A fundamental issue with today’s platforms—what the authors call “antisocial media”—is that while they have access to and profit from detailed information about their users, their behavior, and the communities in which they exist, users themselves have much less information. As a result, people cannot tell whether the content they see is widely endorsed or just popular within their narrow community. This often creates a sense of “false consensus,” where users think their beliefs are much more mainstream than they in fact are, and leaves people vulnerable to attacks by potentially malicious actors who wish to exacerbate divisions for their own ends. Cambridge Analytica, a political consulting firm, became an infamous example of the potential misuses of such data when the company used improperly obtained Facebook data to psychologically profile voters for electoral campaigns. 

The solution, the authors argue, is to explicitly label content to show what community it originated from, and how strongly it is believed within and across different communities. “We need to expose that information back to the communities,” says Tang.

Read more: Inside Audrey Tang’s Plan to Align Technology with Democracy 

For example, a post about U.S. politics could be widely believed within one subcommunity, but divisive among other subcommunities. Labels attached to the post, which would differ for each user depending on their personal community affiliations, would indicate whether the post was consensus or controversial, and allow users to go deeper by following links that show what other communities are saying. Exactly how this looks in terms of user interface would be up to the platforms. While the authors stop short of a full technical specification, they provide enough detail for a platform engineer to draw on and adapt for their specific platform.
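To make the labeling idea concrete, here is a minimal sketch of how a platform might compute per-viewer labels. Everything here is a hypothetical illustration—the function names, thresholds, and the simple up/down voting model are assumptions for demonstration, not the paper’s specification:

```python
# Hypothetical sketch: label a post for each viewer based on how strongly
# the viewer's own community endorses it versus other communities.
# Thresholds and the boolean-vote model are illustrative assumptions.

def endorsement_rate(votes):
    """Fraction of votes in favor; votes is a list of booleans."""
    return sum(votes) / len(votes) if votes else 0.0

def label_for_viewer(post_votes_by_community, viewer_community,
                     consensus=0.7, contested=0.4):
    """Return a (home_label, cross_label) pair for one viewer.

    post_votes_by_community maps community name -> list of bool votes.
    """
    home = endorsement_rate(post_votes_by_community.get(viewer_community, []))
    others = [endorsement_rate(v) for c, v in post_votes_by_community.items()
              if c != viewer_community]
    # spread measures how much endorsement varies outside the home community
    spread = (max(others) - min(others)) if others else 0.0

    home_label = ("consensus" if home >= consensus
                  else "contested" if home >= contested
                  else "minority view")
    cross_label = ("divisive across communities" if spread >= 0.4
                   else "broadly similar elsewhere")
    return home_label, cross_label
```

A post endorsed by 80% of community A but only 10% of community B would be labeled “consensus” for an A-affiliated viewer, with a cross-community label flagging the divide—matching the article’s point that the same post carries different labels for different users.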

Weyl explains the goal is to create transparency about what social structures people are participating in, and about how “the algorithm is pushing them in a direction, so they have agency to move in a different direction, if they choose.” He and his co-authors draw on enduring standards of press freedom and responsibility to distinguish between “bridging” content, which highlights areas of agreement across communities, and “balancing” content, which surfaces differing perspectives, including those that represent divisions within a community, or underrepresented viewpoints.

A new business model

The proposed redesign also requires a new business model. “Somebody’s going to be paying the bills and shaping the discourse—the question is who, or what?” says Weyl. In the authors’ model, discourse would be shaped at the level of the community. Users can pay to boost bridging and balancing content, increasing its ranking (and thus how many people see it) within their communities. What they can’t do, Weyl explains, is pay to uplift solely divisive content. The algorithm enforces balance: a payment to boost content that is popular with one group will simultaneously surface counterbalancing content from other perspectives. “It’s a lot like a newspaper or magazine subscription in the world of old,” says Weyl. “You don’t ever have to see anything that you don’t want to see. But if you want to be part of broader communities, then you’ll get exposed to broader content.”
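The boost-with-enforced-balance mechanic described above can be sketched in code. This is a toy model under stated assumptions—the data model, the stance axis, and the rule of mirroring the full payment onto a counter-stance post are all illustrative choices, not the authors’ actual algorithm:

```python
# Hypothetical sketch of "pay to boost, algorithm enforces balance":
# a payment raises a bridging post's rank directly, while a payment on
# one-sided content also uplifts the strongest counterbalancing post.
# The data model and scoring rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Post:
    id: str
    stance: float          # -1.0 .. 1.0, position on some dividing axis
    bridging: bool         # highlights cross-community agreement
    score: float = 0.0     # ranking score within the paying community

def apply_boost(posts, post_id, payment):
    """Spend `payment` to boost one post; mirror the spend onto the
    strongest counter-stance post when the boosted post is one-sided."""
    by_id = {p.id: p for p in posts}
    target = by_id[post_id]
    target.score += payment
    if not target.bridging and abs(target.stance) > 0.3:
        # find the post furthest on the opposite side to counterbalance
        opposite = [p for p in posts
                    if p.id != post_id and p.stance * target.stance < 0]
        if opposite:
            counter = max(opposite, key=lambda p: abs(p.stance))
            counter.score += payment  # equal visibility uplift
    return sorted(posts, key=lambda p: p.score, reverse=True)
```

In this sketch, paying to promote a one-sided post automatically surfaces an opposing post with equal weight, while bridging content is boosted without a counterweight—so money cannot uplift solely divisive content, as Weyl describes.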

This could lead to communities many would disapprove of—such as white supremacists—arriving at a better understanding of what their members believe and where they might disagree, creating common ground, says Weyl. He argues that this is “reasonable and even desirable,” because producing clarity on a community’s beliefs, internal controversies, and limits “gives the rest of society an understanding of where they are.”

In some cases, a community may be explicitly defined, as with how LinkedIn links people through organization affiliation. In others, communities may be carved up algorithmically, leaving users to name and define them. “Community coherence is actually a common good, and many people are willing to pay for that,” says Tang, arguing that individuals value content that creates shared moments of togetherness of the kind induced by sports games, live concerts, or Super Bowl ads. At a time when people have complex, multifaceted identities that may be in tension, this coherence could be particularly valuable, says Tang. “My spiritual side, my professional side—if they’re tearing me apart, I’m willing to pay to sponsor content that brings them together.”

Advertising still has a place in this model: advertisers could pay to target communities, rather than individuals, again emulating the collective viewing experiences provided by live TV, and allowing brands to define themselves to communities in a way personalized advertising does not permit. 

Instantiating a grand vision

There are both financial and social incentives for platforms to adopt features of this flavor, and some examples already exist. The platform X (formerly Twitter) has a “community notes” feature, for example, that allows certain users to leave notes on content they think could be misleading, the accuracy of which other users can vote on. Only notes that receive upvotes from a politically diverse set of users are prominently displayed. But Weyl argues platform companies are motivated by more than just their bottom line. “What really influences these companies is not the dollars and cents, it’s what they think the future is going to be like, and what they have to do to get a piece of it,” he says. The more social platforms are tweaked in this direction, the more other platforms may also want in.
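The diversity gate behind community notes can be illustrated with a simplified sketch. X’s production system actually infers rater viewpoints from rating history via matrix factorization; this toy version, with hypothetical names and a self-declared leaning per rater, just checks that helpful votes come from both sides of a dividing axis:

```python
# Hypothetical, simplified community-notes-style gate: a note is shown
# only when enough raters found it helpful AND those raters span
# opposing leanings. All names and thresholds are illustrative.

def note_is_shown(ratings, min_helpful=3):
    """ratings: list of (leaning, helpful) pairs, leaning in {-1, 0, +1}."""
    helpful = [leaning for leaning, is_helpful in ratings if is_helpful]
    if len(helpful) < min_helpful:
        return False
    # require endorsement from both sides of the axis
    return any(l < 0 for l in helpful) and any(l > 0 for l in helpful)
```

A note upvoted only by one side stays hidden no matter how many votes it gets—a small-scale version of the bridging principle the paper generalizes.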

These potential solutions come at a transitional moment for social media companies. With Meta recently ending its fact-checking program and overhauling its content moderation policies—including reportedly moving to adopt community notes-like features—TikTok’s precarious ownership position, and Elon Musk’s control over the X platform, the foundations on which social media was built appear to be shifting. The authors argue that platforms should experiment with building community into their design: productivity platforms such as LinkedIn could seek to boost bridging and balancing content to increase productivity; platforms like X, where there is more political discourse, could experiment with different ways of displaying community affiliation; and cultural platforms like TikTok could trial features that let users curate their community membership. The Project Liberty Institute, where Tang is a senior fellow, is investing in X competitor Bluesky’s ecosystem to strengthen freedom of speech protections.

While it’s unclear what elements of the authors’ vision may be taken up by the platforms, their goal is ambitious: to redesign platforms to foster community cohesion, allowing them to finally deliver on their promise of creating genuine connection, rather than further division.


