Online Governance: Six Case Studies
(Originally published in http://patreon.com/howardrheingold — feel free to support me there)
How should online communities be governed? I’m not concerned here with Facebook and Reddit, which have special problems because of their size and volume of interaction. Despite all the attention that the gigantic social media platforms attract, the Web is full of smaller communities — communities of interest and practice, mutual aid communities, support communities, learning communities, activist communities, science communities, businesses and their customers. They meet through message boards, chat rooms, Slack, mailing lists, wikis. Eventually (or ideally before they launch), the issue of how the community should be governed arises — usually because of some transgression that stirs community ire. Today, a wealth of online community governance toolkits has emerged. I won’t attempt to be comprehensive here, but will mention my experiences with governance in the WELL, The River, Electric Minds, Brainstorms, Slashdot, and the Omidyar Network. At the end of this essay, I include links to resources and toolkits. Please let me know of any resources that ought to be added.
Governance on The Well
I joined the Whole Earth ‘Lectronic Link in 1985, wrote about it in the Whole Earth Review (“Virtual Communities”) in 1987, and again in my 1993 book, The Virtual Community. One of the attractions of the WELL in 1985, and ultimately (in my opinion) one of its limitations, was a loosely enacted governance that blended consensus and anarchy. Part of the reason the WELL came to feel like a community was an accident: the temporary absence of the founder. Stewart Brand and Larry Brilliant founded the WELL and very deliberately designed it as an experiment in which the users would find their own way. For the first couple of years, Brand was in Boston, not Sausalito (before the Internet, users were mostly local to the San Francisco region because of dial-in costs), busy writing a book about the Media Lab. The people he left in charge had zero technical experience, but they were decade-long veterans of The Farm, a pioneering experiment in physical intentional community. The WELL managers were happy to let the inmates run the asylum.
Technically, the early WELL was governed as a benevolent dictatorship. The two Farm alumni who took over from Matthew McClure, the Farm alum whom Brand initially appointed, were Cliff Figallo (aka “Fig”) and John Coate (“Tex”); they had the power to make technical, business, and governance decisions. But they (whom WELLites ended up referring to as “Figtex” or “The Vile Figtex”) were also members of the community. In the early days, we were hyperconscious that we were not just having conversations. We knew we were early experimenters in a medium that was bound to grow explosively in the future. We liked to talk about process so much that the process of arguing about process became known as “meta.” Which meant that if Figtex made a decision the community didn’t like, they would be expected to devote hours of their attention to the resulting meta. It also meant that when they told us the server hardware was getting too slow for our needs, enough members of the community paid our monthly dues a year in advance that they could install faster hardware.
Cliff was in charge of keeping the business and technology running and John was in charge of “marketing.” In practice, Tex was one of the first online community managers (Usenet had had moderated newsgroups since 1980, and arguably some of those moderators were what we would now consider community managers). He wrote about what he had learned from his first decade in “Cyberspace Innkeeping: Building Online Community.” Not long after I joined the WELL, I moved near the WELL offices, and not long after I moved in, John messaged me and asked if he could show up in person. So he showed up at my house and we walked and talked in the nearby countryside. At first I wondered why he was doing this, but I soon came to see how he cultivated and paid attention to personal relationships with all the people who were hosting conferences. I understood this as a governance practice when I came across the Japanese managerial concept of nemawashi (literally, preparing a tree’s roots for transplanting), in which astute managers personally listen to and persuade every person who will be in a decision-making meeting before the actual meeting takes place.
The conversational containers in the WELL, known as “conferences,” covered sports and parenting, media and politics, and a few dozen other topics. Each conference was organized and managed by a “host” who, like a face-to-face party host, invited people, introduced newcomers, stimulated conversations, broke up fights, and decided when to start new conversations and how to settle disputes. Although governance was decentralized, keeping a very light hand was a norm (initially encouraged and modeled by Figtex) among conference hosts, who had our own private conference, “Backstage,” to hammer out meta. In general, participants and hosts dealt with decision-making by arguing about it, sometimes for thousands of posts. To get a particularly noxious troll thrown out of the WELL required months of meta. (One of the WELLites, Jef Poskanzer, invented the “bozofilter,” which enabled individual WELLites to make others invisible to them. This had the effect of fostering general tolerance of trolls who weren’t obnoxious enough to trigger months of meta.) As Oscar Wilde reportedly said of socialism (it “takes too many evenings”), governance by semi-consensus attracts people who like to argue meta at length.
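The bozofilter idea is simple enough to sketch in code. The names and structure below are purely illustrative, not the WELL's actual implementation; the essential property is that filtering is per-reader, so a filtered user is invisible only to those who filtered them:

```python
# Illustrative sketch of a per-reader "bozofilter" in the spirit of the WELL
# feature: each reader keeps a personal list of users whose posts are hidden
# from that reader alone. Everyone else still sees the "bozo's" posts.

class BozoFilter:
    def __init__(self):
        self.bozos = set()  # usernames this particular reader chose not to see

    def add(self, username):
        self.bozos.add(username)

    def visible_posts(self, posts):
        # posts is a list of (author, text) pairs; the filter only affects
        # what this reader sees, not what anyone can post
        return [(author, text) for author, text in posts
                if author not in self.bozos]

f = BozoFilter()
f.add("troll42")
posts = [("troll42", "flame!"), ("fig", "welcome, newcomers")]
print(f.visible_posts(posts))  # only fig's post remains for this reader
```

The design choice worth noting: because the sanction is local rather than global, no community-wide decision (and no months of meta) is needed to apply it.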
In 1994, the Point Foundation, the parent organization of the Whole Earth Catalog, was faced with a financial crisis. The WELL was sold to Bruce Katz, an entrepreneur with a background in the shoe business, and whose initial conversations with the WELL community didn’t go so well. Shocked that something we had treated as a commons could be bought and sold to owners who could make decisions without submitting to months of meta, a group of WELL hosts decided to create an online community owned and operated by the people who created its value — the community members. We created The River, a California Cooperative Corporation in which every community member held one share of stock and one vote in choosing the Board of Directors.
The River: Online Community as Cooperative Corporation
In 1994, I wrote a syndicated column, “Tomorrow.” One of my columns was about the River:
We had the technical, legal, administrative, social, financial, and marketing expertise to create a self-sustaining computer conferencing business. We argued for weeks, then raised thirty thousand dollars from three hundred people in six days, incorporated as a California cooperative corporation, bought a computer, found a place to put it, connected it to the Internet, installed computer conferencing software, and started having conversations. Democracy turned out to be more interesting than we had bargained for.
When our common cause was fear for the future of the WELL, a group of admittedly idiosyncratic individualists who normally wouldn’t agree on the time of day put aside our differences and accomplished an enormous amount of decision-making and effective action in a short amount of time. When it took the elected Implementation Committee a few more months to get set up to open, however, the temporary moratorium on intramural bickering dissolved rapidly. We fell to arguing in the most vicious terms. And then we discovered that the charter of the organization we had created collectively had given us a tool for transforming yet another meaningless back-biting argument about who should have done what when into decisive action: we had an election. We argued about the bylaws, sent out e-mail to River enthusiasts who had strayed from the community-building and the backbiting, gathered a quorum, held a campaign, held debates, broke out into fights over dozens of issues, including fights over whether or not the issues we were fighting over were trivial. But we held the election, and the newly elected Board of Directors is authorized to complete the process of turning our alpha-test into a real business.
Anybody who wants to join The River can pay twenty dollars a month, and anybody who wants to become a voting member can pay an additional hundred dollars a year. Members elect the board of directors; the directors hire and fire the staff. We adopted bylaws that give the members the power to call an emergency meeting online, if enough members agree the situation warrants the measure. And we started creating the seeds of community by starting conversations about everything from technology to the arts, sports to politics to UFOs. And among the conversations have been heated ones about how to run the place.
The River never really took off, but the governance framework remains useful: set up a cooperative corporation, elect a board, and leave the decision-making and rule-making to the board and to those they appoint (subject or not to enmeshment in community-wide meta, depending on the norms of each community).
Electric Minds: Total Anarchy & Its Aftermath
In 1996, Electric Minds combined user-generated content and social media — ten years too early. Time Magazine named Electric Minds one of the ten best web sites of 1996, but our business model depended on a human selling advertising, and I hired the wrong guy. By the summer of 1997, we were out of business. The original editorial content is preserved, although the community conversations were lost. We even attempted to create a virtual community directory. We had thousands of users at first, and when IBM chose us as the site of conversation around the chess match between Garry Kasparov and IBM’s Deep Blue computer, we gained tens of thousands more.
I’ve never been particularly libertarian, but largely because I wanted to avoid the endless heated meta of consensus/anarchy, I decided to make the Electric Minds online community an experiment in emergent governance (which taught me a lesson I will elaborate upon — I believe it is necessary to create a minimum decision-making “boot sector” before leaving decision-making up to the community). There were no explicit rules. It was messy. People spoofed usernames. I had to put up with h0ward_rheingold and coward_rheingold and dozens of other variations of my username spewing parodies and invective; there were actual Nazis. We didn’t stay in business long enough for norms and self-governance experiments to fully mature. When it became clear our business was to be acquired by Durand Communications, the core of the existing community was forced to come up with some kind of self-governance mechanism. The intense meta that followed became known as “the Thrash.” One of the most knowledgeable and experienced online facilitators, Nancy White, later wrote about the Thrash — and its lessons for governance:
Eventually a system of governance did emerge in Eminds: “Since then what has evolved is a basic set of norms that is loosely enforced by a paid host at one level and by the other volunteer hosts at another” (White). White has outlined some “online community governance design patterns and criteria”:
Make it as simple as it can be.
Make sure the needs and purpose of the community (and community owners) are articulated.
Consider that structures may need to be fractal in nature, giving the most control at the smallest group units.
Consider that sometimes benevolent dictatorships are good solutions.
Consider that listening is probably the most important skill for any player, site owner, staff or member.
Consider that it is easy to leave an online community so why make it easier?
Avoid time-unlimited circular conversations (know when to fold-em!).
Define and use decision-making processes.
Put up or shut up. Cook or get out of the kitchen. Fish, no bait cutting here.
When a group process is used, consider the power of words and seek some alignment on definitions the minute people fall into advocacy modes as opposed to dialog.
Keep it in perspective. Life is short and precious.
Eat more chocolate!” (White).
An ethical question raised by Nancy White’s experience with governance in online communities: what are the benefits of governance imposed by benevolent dictators versus systems of governance that community members create for themselves?
After my experiences with The WELL, The River, and Electric Minds, I quickly wrote up some tips for “the art of hosting good conversations online.” Although I wrote it in about a minute, more than 20 years ago, I’m told it’s still useful. Although it is related to governance, this art is more about fostering and maintaining beneficial conversations — within whatever governance scheme a community chooses.
Brainstorms: “Howard’s Bar & Grill” — Benevolent Dictatorship
After Electric Minds went away, I found that I missed having an online braintrust and community. Instead of going back to the WELL, I started Brainstorms. My friend and online pioneer, the late Lisa Kimball, introduced me to Caucus, a web-based forum. So I set up a number of conferences — a general commons, and conferences on mind, science, arts & culture, phun, etc. — and started a few conversation threads in each one. By this time I had learned that it is best to start out with a few threads — about this conference, ask questions, introductions, and what should we talk about — and slowly grow the collection of threads. Then I emailed the self-registration URL to a few dozen people I knew around the world. When I got up the next day, there was a blooming, buzzing community in formation. That was 1998. I thought of it as “Howard’s Bar & Grill.” Participants agree to not attack each other, and the proprietor reserves the right to throw out miscreants. The way I put it at the time was: “an online community where it is possible to throw out assholes.”
For the most part, Brainstorms — which still exists! — was what I meant it to be: adults having serious and fun conversations without flaming or trolling. At Brainstorms’ height, in the years before Facebook, Tumblr, and Twitter offered so many alternatives to a general-purpose message board, we had about 500 active members. We had face-to-face get-togethers of dozens in San Francisco, Amsterdam, Melbourne, and Memphis. By now, the approximately 50–70 of us who are left are a couple of decades older. We maintained a mutual aid fund that gave or loaned hundreds and sometimes thousands of dollars to BSers in need. We probably dispersed tens of thousands of dollars over the years. Governance long ago ceased to be an issue, after an initial period of sporadic turbulence. That doesn’t mean that everyone gets along all the time. It does mean that we settled into norms regarding how to go about disagreeing.
I thought a benevolent dictatorship would take a lot less of my time than consensus. I was wrong about that. I made the decisions, after polite backchannel attempts to settle disputes or educate rude newcomers, but that didn’t kill meta. There were always different sides. Besides public arguments, I was bombarded with backchannel email lobbying. And a couple of the people whom I tossed out took it personally in scary ways — for a while I had security on speed-dial at Berkeley and Stanford because a local troll threatened to disrupt my classes. Eventually, after four or five years, I tired of being Dad. I took a break from Brainstorms for a few months and appointed a committee of five old-time BSers to take care of conflicts. They adjudicated a couple of disputes, and for the last 15 years or so, entire years have gone by without the need for their decisions. Every couple of years, one of the committee retires and names a replacement.
Slashdot: Peer Moderation
Slashdot (“News for Nerds, Stuff that Matters”) was founded in 1997 by Rob Malda (“CmdrTaco”) and classmate Jeff Bates (“Hemos”) as a “social news” site (eight years before the founding of Reddit): paid editors post links and summaries to news stories and articles, then community members discuss the linked material in threaded conversations, message-board style. The site quickly became popular with hordes of open-source programmers (several hundred thousand visitors a day), and many sites were overwhelmed by traffic — “Slashdotted” — when Slashdot linked to them. The governance structure was designed to put the community in charge of moderation: instead of censoring offensive material, the goal was to give community members themselves the means to decide whether or not to read it.
The Slashdot governance system was described by Cliff Lampe and Paul Resnick in “Slash(dot) and Burn: Distributed Moderation in a Large Online Conversation Space”: Anyone can post anonymously, but anonymous posts are labeled “by Anonymous Coward.” Registered users can post comments under their names, and regular commenters gain “karma points.” Every comment is given an initial score between −1 and +2: a default of +1 for registered users, 0 for anonymous users, +2 for users with high karma, or −1 for users with low karma. Continuous moderation happens when registered users with a minimum karma score are randomly selected to be temporary moderators; each moderator is shown a number of posts, which the moderator can upvote or downvote. Registered users with sufficient karma are also randomly chosen to be metamoderators: shown a number of moderated comments, metamoderators can add to or subtract from the actions of previous moderators. Registered users can filter their reading to show only posts with high scores, or, if they are gluttons for trolling, flaming, racism, and misogyny, they can elect to see posts with low scores.
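The scoring mechanics just described can be sketched roughly in code. This is a simplification for illustration only — the karma thresholds and function names are my assumptions, not Slashdot's actual implementation:

```python
# Illustrative sketch of Slashdot-style comment scoring: default scores by
# poster type, moderators nudging scores within a clamped range, and readers
# filtering by a score threshold. Thresholds here are assumptions.

LOW, HIGH = -1, 2  # comment scores stay within this range

def initial_score(registered, karma):
    if not registered:
        return 0            # "Anonymous Coward" default
    if karma > 50:          # "high karma" cutoff: an assumed value
        return 2
    if karma < 0:           # "low karma" cutoff: an assumed value
        return -1
    return 1                # default for ordinary registered users

def moderate(score, delta):
    # a moderator moves a comment up or down one notch, clamped to the range
    return max(LOW, min(HIGH, score + delta))

def filtered(comments, threshold):
    # readers pick a threshold: high hides trolls, low shows everything
    return [text for text, score in comments if score >= threshold]

s = initial_score(registered=True, karma=10)   # -> 1
s = moderate(s, +1)                            # -> 2
s = moderate(s, +1)                            # stays 2 (clamped)
```

Metamoderation would then apply the same `moderate` step to the moderators' own actions, one level up.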
I was a regular reader of Slashdot more than 15 years ago — long enough to do my turn as a moderator. I didn’t participate sufficiently to claim expertise about the general effectiveness of this system of governance, but Slashdot is still alive, which is itself an existence proof that something works — for its particular community. Especially in a community with a libertarian-leaning, anti-authoritarian ethos like Slashdot’s hundreds of thousands of open-source programmers, giving individuals the power to decide which comments they don’t want to see has proven to be an effective method of dealing with offensive posts (for those who choose to put up with them).
Omidyar Network Online: Why Effective Online Governance Requires a Decision-Making Boot-Sector
Omidyar Network is the name of the philanthropic organization created by eBay founder Pierre Omidyar. It was also the name of an online community from the early 2000s in which Omidyar experimented with a governance scheme that grew out of his experience with eBay. Online auctions among strangers pose what is known as “the prisoner’s dilemma” — a social dilemma in which lack of trust drives each party to a worse outcome than cooperation would produce. Omidyar instituted seller ratings that enabled buyers to rate sellers on a scale of one to five. Sellers with higher ratings make more sales, and sellers with a sufficiently high rating become “power sellers.”
The online network that Pierre and his wife set up was a resource for non-profits. For no charge, members could set up forums, mailing lists, and wikis for their group. In addition to the groups, there was a commons in which members could talk about community issues. Because Pierre and Pam Omidyar had stated that they intended to contribute their approximately $15 billion of eBay stock to worthy causes, the online network grew rapidly — about 4000 members in the first few months. Because the seller reputation system had worked so well at eBay, Omidyar set up a more complex reputation system: members could gain points by logging in, by commenting, and by creating conversation threads; members could also gift points to others and take points away from others. I can’t remember what extra privileges positive points granted, but I remember that if a member’s points dropped below a certain negative level (I think it was −50), their posts would no longer be visible in public; people would have to go to the sanctioned member’s home page to see what they had to say.
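A rough sketch of those mechanics, as I recall them, might look like the following. The threshold and all names are illustrative, not the network's actual code:

```python
# Illustrative sketch of the Omidyar Network reputation mechanics: members
# accrue points for activity, can give points to or take points from others,
# and a member whose balance falls below a threshold has their posts hidden
# from public view. The -50 threshold is a recollection, not a verified value.

HIDE_THRESHOLD = -50

class Member:
    def __init__(self, name):
        self.name = name
        self.points = 0

    @property
    def posts_public(self):
        # posts stay publicly visible only while points remain above threshold
        return self.points > HIDE_THRESHOLD

def give_points(giver, receiver, amount):
    # gifting was peer-to-peer; a negative amount is a sanction
    receiver.points += amount

alice, bob = Member("alice"), Member("bob")
give_points(alice, bob, -60)   # enough sanctions push bob below threshold
print(bob.posts_public)        # prints False
```

Note the vulnerability this sketch shares with the real system: nothing stops one person behind many accounts from concentrating sanctions on an opponent, which is exactly the sock-puppet failure described below.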
I was interested in this governance experiment, so I joined the network shortly before the Omidyars announced their first experiment in networked philanthropy. They pledged to contribute an initial $25,000 to a group or groups chosen by the community. Thus began a massive meta thrash. All 4000+ members of the community were made aware of the community conversation about the grant. Probably around 1000 people checked out the conversation thread, around 200 participated in the conversation, and around 25 people contributed the majority of the posts, which ended up numbering in the thousands. Attempts at consensus continued to fail on minority objections. My contributions to the conversation thread consisted largely of unheeded warnings that without a clear decision-making procedure, this conversation was doomed to be an infinite meta rathole.
The group did agree that it would be necessary to vote, in the absence of consensus. But how would the franchise be defined? Could someone who joined yesterday have a vote? Would a minimum number of karma points be required? How would the vote be carried out? So another round of infinite meta revolved around how to decide how to decide. The network as it was originally set up did not specify a decision mechanism, nor did the Omidyars intervene by declaring mechanisms by fiat. Eventually, giving up on voting, the exhausted networkers divided the grant among three groups. The process appeared to drain rather than boost the community’s ability to explore the networked philanthropy the Omidyars had hoped to initiate.
And then the whole issue of community decision-making blew up when it turned out that one of the most vocal debaters in the grant-making thrash had created dozens of sock-puppet accounts and was awarding points to friends and subtracting points from those who disagreed with her. Not long after that, the Omidyars shut down the online community. That’s when I became convinced that virtual communities must begin by defining minimum decision-making procedures. I recalled that when personal computers were still young, they had to be booted from a floppy disk; the first information read from that disk was a “boot sector” that instructed the machine how to load the operating system. My experience with governance thrashes in the WELL, The River, Electric Minds, Brainstorms, and the Omidyar Network led me to believe that it is nearly impossible for a group to formulate a decision-making procedure from scratch. They need a boot sector with a minimum description of who can vote and how the vote should be carried out. With that in place, the community can change the rules if it wants to, but at least it won’t have to argue about how to decide how to decide.
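To make the boot-sector idea concrete, here is a minimal sketch of what such a specification might contain. The particular fields and thresholds are only illustrative defaults; the point is that they are declared before launch, so the community never has to argue about how to decide how to decide:

```python
# A minimal "governance boot sector": an explicit, pre-declared answer to
# two questions — who has the franchise, and how a vote is decided.
# All defaults below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class BootSector:
    min_days_membership: int = 30     # who may vote
    quorum_fraction: float = 0.1      # fraction of eligible members required
    pass_fraction: float = 0.5        # simple majority by default

    def decide(self, eligible, votes_for, votes_against):
        turnout = votes_for + votes_against
        if turnout < self.quorum_fraction * eligible:
            return "no quorum"
        return "pass" if votes_for > self.pass_fraction * turnout else "fail"

rules = BootSector()
print(rules.decide(eligible=400, votes_for=30, votes_against=15))  # -> pass
```

The rules themselves can later be amended by vote; the boot sector only guarantees that there is always some defined procedure to amend them with.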
Online Community Governance Resources
Recently, research into community governance has yielded toolkits that community founders can use to design systems for their participants. Here are a few. If you know of others, please let me know and I will add them.
PolicyKit, from the paper’s abstract: The software behind online community platforms encodes a governance model that represents a strikingly narrow set of governance possibilities focused on moderators and administrators. When online communities desire other forms of government, such as ones that take many members’ opinions into account or that distribute power in non-trivial ways, communities must resort to laborious manual effort. In this paper, we present PolicyKit, a software infrastructure that empowers online community members to concisely author a wide range of governance procedures and automatically carry out those procedures on their home platforms. We draw upon political science theory from Elinor Ostrom to develop a data model for describing governance as a series of actions and policies, where, unlike a typical permissions model, policies encode a short procedure for determining which actions execute. Using PolicyKit, an online community can author policies (by implementing six short Python functions) to describe governance that suits its needs and values, and evolve that governance collectively over time through special constitutional actions and policies.
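To give a flavor of the actions-and-policies idea, here is a hypothetical illustration — not PolicyKit's actual API — in which a policy is a short procedure deciding whether a proposed community action may execute:

```python
# Hypothetical illustration of the actions-and-policies model (NOT the real
# PolicyKit API): a policy inspects a proposed action and any votes cast so
# far, and returns "passed", "failed", or "proposed" (still pending).

def check(action, votes):
    # example policy: renaming a channel requires 3 net approvals
    if action["kind"] != "rename_channel":
        return "passed"          # other actions execute immediately
    net = sum(votes.values())    # +1 approve / -1 reject per voter
    if net >= 3:
        return "passed"
    if len(votes) >= 10:
        return "failed"          # votes are in, threshold not met
    return "proposed"            # keep waiting for more votes

action = {"kind": "rename_channel", "target": "#general"}
print(check(action, {"ann": 1, "bo": 1, "cy": 1}))  # -> passed
```

The contrast with a permissions model: instead of a static yes/no per role, the policy is a small program, so a community can encode voting, waiting periods, or delegation, and change the program itself through a constitutional policy.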
CommunityRule is a governance toolkit for great communities.
Howard Rheingold’s unorganized list of online facilitation resources
Nancy White: Useful Community Practices
Peter Kollock and Marc Smith: Managing the Virtual Commons: Cooperation and Conflict in Computer Communities
Governance Thrash Redux?