from the giving-control-to-users dept.
Boiling frog syndrome suggests that if a frog jumps into a pot of boiling water, it immediately jumps out – but if it jumps into a slowly heating pot, it senses no danger and gets cooked. Mark Zuckerberg’s Facebook has been steadily malfunctioning for a decade – some are horrified, but many see no serious problem. Now Elon Musk has jumped into a Twitter that he could quickly bring to a boil. Many expect him – or hordes of non-extremist Twitter users – to jump out.
The frog syndrome may not be true of actual frogs, and Musk may not boil Twitter immediately, but the deeper issue that could boil us all is the “Law of the Platforms”: social media, especially Twitter, have become powerful platforms that bring our new virtual “public square” to a boil. Disinformation and harmful, polarizing hate speech threaten democracy here and around the world.
The apparent problem is censorship versus free speech (whatever that means) – but the deeper problem is who sets the rules for what can be said, and to what audience. We now face a regime of platform law, in which these private platforms have nearly unlimited power to set and enforce censorship rules about who can say what, with little transparency or scrutiny, even as they rapidly become essential services. Should we entrust this to a few billionaire owners, or to Wall Street? To pseudo-independent oversight boards? To the gears of government, grinding slowly and erratically? To “self-sovereign” users or user communities that can self-organize, but can also run amok like mobs? Or to a new hybrid of some or all of these that can offer both freedom and order?
Musk is now bringing this issue to a boil for all to see. Either democracies will see the urgency and act, or they will die. Even if the boiling is slow and takes decades, leaving the power to control speech in this new public square in the hands of private companies or governments will leave “a loaded gun on the table,” ready to be picked up by any would-be authoritarian.
It will take time and much sorting out, but some such hybrid of controls is the only workable solution that can preserve democracy. Many ideas are converging in this direction – the optimistic scenario is that Musk could foster this.
Twitter has already begun to take a step in this direction with Bluesky, an independent project funded by Jack Dorsey and consistent with Mike Masnick’s proposal of “Protocols, Not Platforms.” Variants include Cory Doctorow’s adversarial interoperability and Ethan Zuckerman’s digital public infrastructure. A “middleware” architecture proposed by Francis Fukuyama’s Stanford group would let users choose from an open market of delegated filtering services that act as their agents across whatever platforms they use. Any of these would shift power from the platforms to each user, giving us control over what each of us sees – a variation on ideas also proposed by Stephen Wolfram, Ben Thompson, and myself, among others.
Interestingly, it has been largely forgotten that the highly controversial 1996 law that enabled the current legal regime, Section 230, also declared it “the policy of the United States…to encourage the development of technologies which maximize user control over what information is received by individuals.” To be sure, there are significant challenges in this approach. The most fundamental is that doing the filtering well (ranking and recommendation) requires access to the platforms’ sensitive personal data. But promising solutions are emerging.
A way to achieve this is described in a series of Tech Policy Press articles by Chris Riley and me. The central idea is to put primary control of what each of us sees in our own hands, choosing from an open market of composable filtering services that suit our individual desires. In addition, light regulation would guarantee minimal constraints on illegal content, while leaving the criteria for dealing with “lawful but awful” content to the services users choose.
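To make the architecture concrete, here is a minimal, purely illustrative sketch of that composition idea: a user delegates to a set of filtering services of their choosing, and the platform applies the user’s composite filter rather than its own single ranking. All names and criteria here are hypothetical stand-ins, not any actual middleware API.

```python
# Hypothetical sketch of user-delegated, composable filtering services.
# A "filtering service" is just a function the user chooses to delegate to:
# it re-scores (and may drop) posts by its own criteria.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str
    score: float = field(default=0.0)  # relevance score, adjusted by services

FilterService = Callable[[List[Post]], List[Post]]

def legality_filter(posts: List[Post]) -> List[Post]:
    """Baseline regulatory constraint: drop clearly illegal content (stub check)."""
    return [p for p in posts if "illegal" not in p.text.lower()]

def civility_filter(posts: List[Post]) -> List[Post]:
    """One user-chosen service: downrank posts it flags as hostile."""
    for p in posts:
        if "hate" in p.text.lower():
            p.score -= 1.0
    return posts

def topic_booster(topic: str) -> FilterService:
    """Another service, parameterized by the user's interests."""
    def boost(posts: List[Post]) -> List[Post]:
        for p in posts:
            if topic in p.text.lower():
                p.score += 1.0
        return posts
    return boost

def compose(services: List[FilterService]) -> FilterService:
    """The user's feed is the composition of their chosen services."""
    def pipeline(posts: List[Post]) -> List[Post]:
        for service in services:
            posts = service(posts)
        return sorted(posts, key=lambda p: p.score, reverse=True)
    return pipeline

# The user delegates to services they trust; the platform just runs the result.
my_feed_filter = compose([legality_filter, civility_filter, topic_booster("democracy")])
feed = my_feed_filter([
    Post("a", "A thoughtful thread on democracy"),
    Post("b", "hate-filled rant"),
    Post("c", "blatantly illegal spam"),
])
print([p.author for p in feed])  # → ['a', 'b']
```

The point of the sketch is the division of labor: the legality baseline is fixed, while everything about “lawful but awful” content lives in the services the user composes – and swapping one service for another changes the feed without the platform’s involvement.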
But that alone is not enough. What traditionally kept us from “horrible” content was neither a censorship authority nor direct user control, but a rich ecosystem of mediating services that filtered in the old way: publishers, communities, and other institutions served as an open network of curators serving more or less specific audiences – which we were free to choose or bypass. Now this open mediation infrastructure is being disintermediated by the social media platforms. We have had freedom of impression – but are now losing it to platform control.
True freedom of expression requires restoring this kind of infrastructure for indirect user control. There are already legislative efforts in the US and Europe to mandate interoperability – some including user “delegation” – to open up the platforms and break up the platform law monopolies. Creating a layer of delegated user agents can open the way for an open infrastructure of mediation services to support filtering, as well as other aspects of social media distribution. That can allow traditional mediating institutions to reintermediate into this online ecosystem and regain their important role – for those who value what they offer. It can also allow new kinds of mediation services to emerge and find an important place in our media ecosystem. Some fear that such user control will make filter-bubble echo chambers worse, but how many of us really want to close our eyes and remain ignorant and stupid? The individual’s power to choose from a diversity of information sources has always been a hallmark of successful societies.
In this way, social media can restore the original promise of the Internet as the generative basis of a vibrant and open next level of society.
Observers have dismissed Musk as a “mischievous trickster god” and naïve about free speech. Maybe we’re all cooked. But maybe (depending on how much pot he’s smoking?) he could support Twitter’s latent potential to change the game for the better – or inspire us to take the pot off the burner.
Richard Reisman (@rreisman) is an independent media technology innovator and frequent contributor to Tech Policy Press, and blogs about human-centered digital services and technology policy at SmartlyIntertwingled.com.
Filed under: content moderation, control, elon musk, protocols