Can Democracy Survive Online?

The dream of a democratic web has turned into a nightmare of moderation crises, content mines, and billionaire overlords. Rebuilding digital spaces for meaningful participation in a post-X future will require nothing less than reclaiming the digital commons.

In the beginning, there was a blizzard. It was January 1978, and strong subtropical gusts moving north collided over the Great Lakes with an Arctic jet stream heading south, producing one of the most wicked snowstorms in US history. The “Great Blizzard of 1978” dumped close to a yard of snow on parts of Michigan, causing nearly a hundred fatalities and half a billion dollars in inflation-adjusted damages.

Stuck at home during the whiteout, two Chicago-based computer aficionados decided to build a system that would allow them to communicate with each other. At the time, the internet protocol suite was still years away from widespread use, but advances in microprocessing had begun to bring computers out of the lab and into the homes of tinkerers. Early computer modems allowed one to “dial up” and transmit information over ordinary phone lines, and Ward Christensen and Randy Suess, both members of the Chicago Area Computer Hobbyists’ Exchange (CACHE), combined their expertise in writing software and tweaking hardware to create a modified modem that others could dial into and leave messages on for fellow members to read later.

The result, inspired by the physical corkboard hanging on the wall at CACHE meetings, is generally acknowledged as the first online Bulletin Board System (BBS), an early social network that prefigured today’s more familiar online forums and microblogs. As the media scholar Nathan Schneider puts it, BBSs were notable for “offering computer hobbyists outside academia and military-funded research centers their first experience of digitally mediated community.”

But with this community also came the sysop, or “system operator.” Online bulletin boards were often physically located in someone’s house — literally jacked into their phone line — and thus gave that person huge influence in shaping the community’s norms. The sysop ran the hardware and had root access to the BBS software, allowing them to delete posts at their whim. If members of the community didn’t like the administrator’s moderation style or decisions, they had little recourse outside of simply quitting the service in the hope of finding greener pastures elsewhere. Schneider calls this governance model “implicit feudalism,” a structure where power concentrates in the hands of a few — a pattern that persists in online communities to this day.

After BBS came USENET, offering decentralized “newsgroups” while preserving the power of system administrators. An early USENET FAQ could not have put it more bluntly: “Who can force the moderators to change their policies? Nobody.”

Years later, Facebook mimicked the corkboard model when it created the “Wall,” but centralized governance remained the norm. Despite a fleeting experiment with user voting on policy changes, Facebook never meaningfully ceded decision-making power — what should users be allowed to post? — to the vast international body of individuals from which it derives its value. The same is true of Twitter, now X, whose current sysop, Elon Musk, governs erratically, often as if the servers were literally sitting in his basement.

Today complaints about the divisive, harmful, and manipulative state of online-mediated politics have become cliché. Following the reelection of Donald Trump to the US presidency, growing numbers of academics, activists, and organizers are seeking alternative spaces. Many are moving to Bluesky, as well as Mastodon, Signal, and other social networks with different ownership structures and technical designs.

How can the Left organize effectively in this fragmented new era of digital communication? Are these new platforms genuinely different, or are we doomed to toil in the same content mines, chasing elusive dreams of more democratic, participatory online spaces?

Rebooting the Commons

Nathan Schneider — writer, activist, and communication scholar — has long grappled with the democratic shortcomings of digital spaces. In his latest book, Governable Spaces, he joins a tradition of leftist media criticism, following works like Astra Taylor’s The People’s Platform and recent contributions by Ben Tarnoff and James Muldoon. Like these authors, Schneider refocuses debates about technology on the need for collectively managed systems. His book is a thought-provoking call for better digital tools explicitly designed as “democratic mediums.”

Schneider argues that the increasingly important digital platforms that billions of people around the world use to work, play, and communicate are, with few exceptions, run as autocratic and unaccountable corporate fiefdoms, leaving ordinary users little ability to shape how they operate. For Schneider, the issue is not just about governance structures but about how these platforms undermine everyday democratic practices both online and off.

Schneider contends that our digital sphere has experienced serious democratic erosion. Seamless “free” services, funded by venture capital, advertising, and other business models, have largely replaced user-managed tools like community newsletters and local bulletin boards. Once a vibrant part of online culture, this kind of grassroots participation is now largely confined to highly technical hobbyists.

This corporate capture of the digital public sphere mirrors the broader decline in democratic engagement. The defeat of unions has dealt a huge blow to meaningful workplace democracy. Political party membership has massively declined in Europe and beyond. Today’s “capitalist realism” fosters a retreat into private life, limiting many people’s engagement with democracy to voting every few years.

The same trend plays out online. Platforms increasingly outsource system administration and community moderation to low-wage call center workers around the world. In the late 2000s, many US-based internet startups, with an eye on growth and profit, made the bet that users wouldn’t want — or would be too busy — to oversee the growing workload of content regulation. Instead, firms delegated this labor to ad hoc teams of lawyers and policy wonks, eventually outsourcing content moderation to call centers in the global periphery to handle the overwhelming volume of flagged posts.

Governable Spaces seeks to reverse this trend by rebuilding democratic practices online and collectively creating new forms of radical digital commons. For Schneider, it isn’t just about the purpose and goals of online spaces but also about how they are structured to facilitate participation. His analysis involves a historical view, critically examining how spaces like BBS and USENET were governed, as well as reflecting on some of the innovations embraced by other connective technologies not traditionally understood as social media. What would it mean to explicitly push beyond a “feudalist” mindset and seek, from the ground up, to engage citizens and communities in collaborative forms of norm building, rulemaking, and justice seeking?

The book’s vision of digital democracy is rooted in grassroots efforts and local experimentation. Schneider and his collaborators develop practical tools for people seeking, for example, to move their neighborhood WhatsApp group onto an open-source server that they can manage together. Their “Metagovernance” project offers guides for creating custom policies, rules, and even voting systems to empower users. Through education and organizing, Schneider hopes to help communities transcend the “sysop problem” through informed decisions about the trade-offs inherent in different forms of online organization.

Leviathan’s Limits

In The Networked Leviathan, legal scholar and political scientist Paul Gowder also tackles the challenge of governance in social media and proposes ways to deepen democracy online in an age of polycrisis. But his angle is slightly different from Schneider’s, as is his diagnosis of some of the problems facing us today in the age of Bluesky, X, Telegram, and TikTok.

Consider a simple example: a local bulletin board system for guitar enthusiasts and amateur players. As a frequent user, I might tolerate arbitrary rules — such as bans on images of electric basses and the rapid removal of all general political discussion — because the admins’ work spares me the effort of creating a rival community. I might also feel locked in by the “network effect” and feel reluctant to abandon the forum where I’ve made friends and built connections.

Now imagine this bulletin board grows exponentially, becoming a sprawling, international platform encompassing people posting in multiple languages and on multiple topics. First came the bass players, then the synth freaks, and now it has somehow grown into a space that goes way beyond just instruments and music. It’s no longer accurate to describe the BBS as a single space — it provides a platform for multiple occasionally overlapping yet nonetheless somewhat distinct online communities.

At a certain point, when the platform grows big enough, the power of the sysop wanes. Yes, they still make the rules, but their ability to actually implement them effectively is challenged by the scale and complexity of the online space they oversee. Moderators become unable to monitor all the activity on their once humble digital corkboard — which is no longer just one corkboard but millions. They lack the expertise required to understand the nuances of certain debates (knowing a lot about guitars doesn’t make one an expert on the intricacies of hate speech) and struggle with multilingual moderation. The result is governance failure, opening the door to harmful behavior, such as coordinated harassment campaigns, doxxing, and death threats.

Bad governance can be dangerous. At the center of Gowder’s book lurks one of the most infamous cases of negligent online content moderation to date: Facebook’s role in facilitating the public incitement to violence against Rohingya communities in Myanmar in 2017. Technology policy debates are complex, and there are many issues — such as user privacy and the extent of online tracking — that pit the public’s interest against corporate business models and profit-seeking motives. But as Gowder frames it, there are some other areas where these interests should be more or less aligned, and this is one. Yet Facebook’s governance architecture failed to stop the spread of incitement to violence.

The problem, Gowder suggests, lies in the fact that multinational platform companies are sprawling, complicated institutions, which, like other historically powerful and yet complex governing institutions, can become plagued by a problem of information. Drawing on James C. Scott and similarly minded thinkers, Gowder frames this as a problem of monitoring activity, understanding what is happening in the polity, and then acting on it: these are “difficulties with integrating knowledge from the periphery and offering legitimate rules to diverse constituencies.”

In a marked break from the well-worn portrayal of today’s tech giants as all-knowing, Orwellian forces of control and manipulation, Gowder’s “platform leviathan” faces inherent organizational and technological limits on its power. After all, today’s planetary-scale digital platforms are exponentially larger and more complex than the somewhat niche specialist microcommunities of early social networks. Amazon manages an ecosystem of a few million third-party sellers. YouTube famously has claimed that more than five hundred hours of video are being uploaded to the service every minute.

As they grow, these profit-motivated and cost-minimizing corporate actors face increasing pressure to ensure that products are safe and don’t violate local consumer protection laws, and that user speech isn’t inciting dangerous forms of mobilization. Companies respond by hiring experts, building bureaucratic systems for international policy development, and developing automated systems that try to evaluate content — or outsourcing these tasks to a burgeoning “safetytech” sector.

However, some challenges defy easy solutions. Myanmar was “the only country in the world with a significant online presence” that hadn’t widely adopted Unicode, a system for converting written scripts into a standardized and machine-readable form for display on digital devices. Most residents of Myanmar, using the popular Zawgyi typeface to represent the very complex Burmese script, were thus producing content that was effectively unintelligible to the systems Facebook relied on to monitor what users were up to. Other emerging areas of policy concern — such as child sexual abuse material online — are similarly hard to parse and expensive to counter responsibly, involving dedicated teams of specialized investigators with a wide remit to proactively search for illegal content.

Serfing the Web

If we were to rebuild today’s platforms from the ground up, what would we do to get around these dual problems of governance legitimacy and governance capacity? One strategy might be to decentralize power to users, a goal pursued by Eugen Rochko, the German developer at the heart of the Mastodon project, a service built on the open ActivityPub protocol.

Gowder’s proposed solution prioritizes capacity over legitimacy. If we’re thinking about the platform information problem, could one make the extractive kingdoms of the contemporary internet economy more representative and effective? For instance, could a platform like Bluesky — with a more conscientious team at its helm than Musk’s at X — deepen platform democracy by creating citizen assemblies? Gowder envisions a system where ordinary users can participate in platforms’ governance: providing policy feedback, deliberating on local impacts, and maybe even directing future spending on safety and product development. The elevator pitch is simple: What if Meta or OpenAI or Google or Bluesky put a large and internationally diverse set of individuals on their payroll as paid policy consultants?

Schneider’s vision, by contrast, is communitarian, grassroots, DIY — a throwback to the ethos of IndyMedia and the Battle of Seattle. He imagines a world where most people use Mastodon or other federated platforms. In this vision, I might post on a small corkboard I operate and pay to host with my friends; we can follow others posting on other corkboards thanks to the power of open protocols. Servers could be hosted at home, at school, or on small, friendly cloud providers. Web tracking might eventually be replaced by donations, community subscriptions, and other alternative business models. Moderation would be collaborative, potentially making use of blockchain-enabled voting mechanisms. Schneider, notably more optimistic about the political potential of cryptocurrency than many left tech critics, has explored ideas like “real time decision making,” “algorithmic dispute resolution,” and “permissionless participation” that could be enabled through tokenization and alternative platform architectures. Governments, in this vision, would subsidize these infrastructures rather than using them for social and political control, cultivating innovation and democratic deliberation instead.

Gowder’s approach feels much like the current status quo — a world which preserves, for most of us, the everyday reality of how we use social networks and other platforms. There are still sysops — teams of policy employees in Menlo Park and Dublin and Seattle — but now they’re better advised. Some users might even periodically be asked to participate in a form of “jury duty” for the major firms. If all goes well, these individuals give good counsel, and their input is meaningfully incorporated into how tech companies make decisions. Our new corporate overlords better understand the intricacies of how their service is actually being used around the world. Perhaps they even learn from the whole experiment that they should value all of their users, especially those who come from low-income countries where the dollar-per-user value is low for Big Tech.

Toward Digital Democracy

When considered together, Governable Spaces and The Networked Leviathan reveal some of the many challenges that would face us if we sought to make one or both of these worlds a reality. Gowder urges us to think about governance efficacy and quality: Will community-operated platforms actually be able to develop the robust democratic constituencies needed to provide the kinds of public benefits that they promise?

There’s long been a participation problem online. In the early 2010s, a string of studies suggested that Wikipedia — held up by many as the pinnacle of a nonprofit, collaboratively managed project of “peer production” — relied on just 5 percent of its editors to produce more than 80 percent of the content. How do online spaces seeking to collaboratively moderate do so democratically at scale, if many users don’t wish to offer up their labor and instead “lurk” and free ride? Will they be able to bear the high costs of the computational tools and expertise required to do so?

Another challenge is the concentration problem, inherent to complex technical systems. Most users are not technically savvy and have come to lean on the more or less effortless usability of managed services with full-time developers. As the cryptographer Moxie Marlinspike has put it, “people don’t want to run their own servers, and never will.” This is a notable problem plaguing efforts to onboard users onto federated platforms like Mastodon or self-hosted community chat services. Blockchain-based Web3 is no exception: its extreme complexity — and the high stakes of making mistakes when there’s no undo button on tokenized services — has led to a new wave of intermediaries that become choke points between decentralized services and users.

Schneider’s work, on the other hand, urges us to be ambitious in our vision for collective ownership and control of digital infrastructure. While Gowder’s model may make companies like Bluesky more effective overlords — if they choose to listen to their new advisers — it also preserves platforms as firms under the existing model of multinational capitalism. If we’re all working away in the citizen assemblies of the digital economy, shouldn’t we go one step further and start demanding a meaningful share of its spoils?

It can be difficult to imagine the real viability of deepening the practices of democracy online in a world that feels more divided than ever. It’s even harder when one considers the multiple ways in which the current Big Tech model — the need for an incessant upward march of tech valuations, the constant creation of new hype cycles — has become systemically important for the ongoing viability of the global economy. Nonetheless, such intellectual work remains vital, more crucial today than ever. We should not shy away from the potentially utopian project of meaningfully governable computational “stacks,” even if, as James Muldoon has written, the odds are against us, and “it has become easier for us to imagine humans living forever in colonies on Mars than exercising meaningful democratic control over digital platforms.”

Through organizing, advocacy, and, yes, new and imaginative sociotechnical institutional forms, we might one day be able to finally escape the sharp gravitational pull of the internet’s implicit hierarchies.