This article was originally published on May 3, 2017.
There are two big problems with America's news and information landscape: concentration of media, and new ways for the powerful to game it.
First, we increasingly turn to only a few aggregators, such as Facebook and Twitter, to find out what's going on in the world, which makes their decisions about what to show us impossibly fraught. Those aggregators draw, opaquely but consistently, from largely undifferentiated sources to figure out what to show us. They are, they often remind regulators, only aggregators rather than content originators or editors.
Second, the opacity with which these platforms offer us news and set our information agendas means that we don't have cues about whether what we see is representative of sentiment at large, or for that matter of anything, including expert consensus. Savvy outsiders can game the system to ensure disproportionate attention to the propaganda they want to inject into public discourse. Those users might employ bots, capable of numbers that swamp actual people, and of persistence that ensures their voices are heard above all others while still appearing to be humbly part of the real crowd.
What to do about it? We must realize that the market for vital information is not merely a market.
The ideals of the journalistic profession, no doubt flawed in practice but nonetheless worthy, helped mitigate an earlier era of concentrated media ownership. News divisions were by strong tradition independent of the commercial side of broadcasting and publishing, while cross-subsidized by other programming. And in the United States, they were largely independent of government, too, with exceptions that stood out flagrantly.
Facebook and Twitter for social media, and Google and Microsoft for search, must recognize a special responsibility for the parts of their services that host or inform public discourse. They should be upfront about how they promote some stories and de-emphasize others, instead of treating their ranking systems as trade secrets. We should hold them to their desire to be platforms rather than editors by insisting that they let users create their own feeds, so that they aren't saddled with the impossible task of making a single perfect feed for everyone.
There should be a method for non-personally-identifying partial disclosure: my Twitter-mates could be assured, say, that I am, in fact, a person, and from what country I hail, even if I don't choose to advertise my name. Bots can be allowed, but they should be known for the mere silhouettes that they are.
And Facebook and Twitter should version-up the crude levers of user interaction that have created a parched, flattening, even infantilizing discourse. For example, why not have, in addition to "like," a "Voltaire": a button to indicate respect for a point while disagreeing with it? Or one to indicate a desire to know if a shared item is in fact true, an invitation to librarians and others to offer more context as it becomes available, flagged later for the curious user?
Finally, there is the bankrupt system of click-based advertising. By "bankrupt" I don't mean that it's bad for America or the world, though it is. Rather, by its own terms it is replete with fraud. The same bots that populate Twitter armies also inspire clicks that are meaningless: money out of the pockets of advertisers, with no human impact to show for it. There are thoughtful proposals to reseed a media landscape of genuine and diverse voices, and we would do well to experiment widely with them as the clickbait architecture collapses of its own accord.
While there is no baseline pure or neutral architecture for discourse, there are better and worse ones, and the one we have now is being exploited by those with the means and patience to game it. It's time to reorient what we have with a focus on users: honestly satisfying their curiosity and helping them find and engage with others in ways where disagreement entails not doxxing and threats, but rather reinforcement of the human aspiration to understand our world and our fellow strugglers within it.
Jonathan Zittrain is a professor at Harvard Law School and the Kennedy School of Government. He is also a professor of computer science at the Harvard School of Engineering and Applied Sciences, and the co-founder of the Berkman Klein Center for Internet & Society.
This article is part of The Democracy Project, a collaboration with The Atlantic.
Photo credit: Jon Chase/Harvard Staff Photographer