June 6, 2014

Social software: A Group Is Its Own Worst Enemy

Learning from experience is the worst possible way to learn something. Learning from experience is one up from remembering. That's not great. The best way to learn something is when someone else figures it out and tells you: "Don't go in that swamp. There are alligators in there."
  • Is a group of people an aggregation of individuals or a cohesive entity? Humans are hopelessly committed to both.
  • Uncontrolled groups tend to defeat their own goals through sex talk, the identification and vilification of external enemies, religious veneration, etc.
  • Constitutions are a necessary component of large, long-lived, heterogeneous groups.
  • You cannot completely separate technical and social issues.
  • Members are different than users.
  • The core group has rights that trump individual rights in some situations.
  • Social software should have handles the user can invest in ("identity").
  • Social software should have some way in which good works get recognized.
  • Social software should have some cost to either join or participate, if not at the lowest level, then at higher levels.
  • Social software should have a way to spare the group from scale.

How is a group its own worst enemy?

Is a group of people an aggregation of individuals or a cohesive entity? Humans are hopelessly committed to both.

Humans are fundamentally individual, and also fundamentally social. Every one of us has a kind of rational decision-making mind where we can assess what's going on and make decisions and act on them. And we are all also able to enter viscerally into emotional bonds with other groups of people that transcend the intellectual aspects of the individual.

Illustration: you are at a party, and you get bored. The party fails to meet some threshold of interest. And then a really remarkable thing happens: You don't leave. You make a decision "I don't like this." If you were in a bookstore and you said "I'm done," you'd walk out. You're sitting at a party, you decide "I don't like this; I don't want to be here." And then you don't leave. And then, another really remarkable thing happens. Twenty minutes later, one person stands up and gets their coat, and what happens? Suddenly everyone is getting their coats on, all at the same time. Which means that everyone had decided that the party was not for them, and no one had done anything about it, until finally this triggering event let the air out of the group, and everyone kind of felt okay about leaving. This effect is so steady it's sometimes called the paradox of groups.

When enough individuals, for whatever reason, sort of agree that something worthwhile is happening, the decision they make at that moment is: this is good and must be protected. And at that moment, even if it's subconscious, you start getting group effects.

There are some very specific patterns that groups enter into to defeat the ostensible purpose of their meeting together:
  1. Sex talk. And what that means is, the group conceives of its purpose as the hosting of flirtatious or salacious talk or emotions passing between pairs of members. You go on IRC and you scan the channel list, and you say "Oh, I know what that group is about, because I see the channel label." And you go into the group, you will also almost invariably find that it's about sex talk as well. Not necessarily overt. But that is always in scope in human conversations. That is one basic pattern that groups can always devolve into, away from the sophisticated purpose and towards one of these basic purposes.
  2. The identification and vilification of external enemies. Anyone who was around the Open Source movement could see this all the time. If you cared about Linux on the desktop, there was a big list of jobs to do. But you could always instead get a conversation going about Microsoft and Bill Gates. And people would start bleeding from their ears, they would get so mad. Nothing causes a group to galvanize like an external enemy. So even if someone isn't really your enemy, identifying them as an enemy can cause a pleasant sense of group cohesion. And groups often gravitate towards members who are the most paranoid and make them leaders, because those are the people who are best at identifying external enemies.
  3. Religious veneration. The nomination and worship of a religious icon or a set of religious tenets. The religious pattern is, essentially, we have nominated something that's beyond critique. You can see this pattern on the Internet any day you like. Go onto a Tolkien newsgroup or discussion forum, and try saying "You know, The Two Towers is a little dull. I mean loooong. We didn't need that much description about the forest, because it's pretty much the same forest all the way." Try having that discussion. On the door of the group it will say: "This is for discussing the works of Tolkien." Go in and try and have that discussion.
So these are human patterns that have shown up on the Internet, not because of the software, but because it's being used by humans.

Group structure is necessary. Constitutions are necessary. Norms, rituals, laws, the whole list of ways that we say, out of the universe of possible behaviors, we're going to draw a relatively small circle around the acceptable ones.

People who work on social software are closer in spirit to economists and political scientists than they are to people making compilers. Both look like programming, but when you're dealing with groups of people as one of your run-time phenomena, it is an incredibly different practice. In the political realm, we would call this kind of crisis a constitutional crisis. It's what happens when the tension between the individual and the group, and the rights and responsibilities of individuals and groups, gets so serious that something has to be done.

And the worst crisis is the first crisis, because it's not just "We need to have some rules." It's also "We need to have some rules for making some rules." And this is what we see over and over again in large and long-lived social software systems. Constitutions are a necessary component of large, long-lived, heterogeneous groups.

The normal experience of social software is failure. If you go into Yahoo groups and you map out the subscriptions, it is, unsurprisingly, a power law. There's a small number of highly populated groups, a moderate number of moderately populated groups, and this long, flat tail of failure. And the failure is inevitably more than 50% of the total mailing lists in any category. So it's not like a cake recipe. There's nothing you can do to make it come out right every time.
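To make that power-law shape concrete, here is a minimal sketch (not from the talk; the distribution and its parameters are assumed for illustration) that draws subscription counts for ten thousand hypothetical mailing lists from a Zipf-like distribution and summarizes where the mass ends up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: subscription counts for 10,000 mailing lists,
# drawn from a heavy-tailed (Zipf-like) distribution -- a few huge groups
# and a long, flat tail of tiny ones.
sizes = rng.zipf(a=2.0, size=10_000)

print("largest group:", sizes.max())
print("median group:", int(np.median(sizes)))                        # typically 1
print("lists with <= 2 subscribers:", f"{(sizes <= 2).mean():.0%}")  # well over half
```

The numbers are invented, but they illustrate the claim above: the typical group is tiny, and most of the tail is failure.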

If you are going to create a piece of social software designed to support large groups, you have to accept three things, and design for four things.

Three Things to Accept

  1. You cannot completely separate technical and social issues. You cannot fork the conversation between social and technical issues, because the conversation can't be forked. You also can't specify all social issues in technology. The group is going to assert its rights somehow, and you're going to get this mix of social and technological effects.
    So the group is real. It will exhibit emergent effects. It can't be ignored, and it can't be programmed, which means you have an ongoing issue. And the best pattern, or at least the pattern that's worked the most often, is to put into the hands of the group itself the responsibility for defining what value is, and defending that value, rather than trying to ascribe those things in the software upfront.
  2. Members are different than users. A pattern will arise in which there is some group of users that cares more than average about the integrity and success of the group as a whole. And that becomes your core group, Art Kleiner's phrase for "the group within the group that matters most." In all successful online communities a core group arises that cares about and gardens effectively. Gardens the environment, to keep it growing, to keep it healthy. Now, if the software doesn't allow the core group to express itself, it will invent new ways of doing so.
  3. The core group has rights that trump individual rights in some situations. This absolutely pulls against the one person/one vote notion. Imagine today if Internet users had to be polled before any pro-war group could be created. The people who want to have those discussions are the people who matter. And absolute citizenship, with the idea that if you can log in, you are a citizen, is a harmful pattern, because it is the tyranny of the majority.
    So the core group needs ways to defend itself - both in getting started and because of the described effects - so that it can stay on its sophisticated goals and away from its basic instincts.

Four Things to Design For

  1. Handles the user can invest in ("identity").
    The world's best reputation management system is right here, in the brain. And actually, it's right here, in the back, in the emotional part of the brain. Almost all the work being done on reputation systems today is either trivial or useless or both, because reputations aren't linearizable, and they're not portable. There are people who cheat on their spouse but not at cards, and vice versa, and both and neither. Reputation is not necessarily portable from one situation to another, and it's not easily expressed.
    If you want a good reputation system, just let me remember who you are. If you give users a way of remembering one another, reputation will happen, and that requires nothing more than simple and somewhat persistent handles.
  2. You have to design a way for there to be members in good standing - some way in which good works get recognized. The minimal way is that posts appear with identity. You can do more sophisticated things, like formal karma or "member since."
  3. You need barriers to participation. This is one of the things that killed Usenet. You have to have some cost to either join or participate, if not at the lowest level, then at higher levels. There needs to be some kind of segmentation of capabilities.
    The segmentation can be partial - anyone can read Slashdot, anonymous cowards can post, non-anonymous cowards can post with a higher rating. But to moderate, you really have to have been around for a while. It has to be hard to do at least some things on the system for some users, or the core group will not have the tools that they need to defend themselves. (A rough sketch of this kind of capability tiering appears after this list.)
    Now, this pulls against the cardinal virtue of ease of use. But the user of social software is the group, and ease of use should be for the group. If the ease of use is only calculated from the user's point of view, it will be difficult to defend the group from the "group is its own worst enemy" style attacks from within.
    I think we've all been to meetings where everyone had a really good time, we're all talking to one another and telling jokes and laughing, and it was a great meeting, except we got nothing done. Everyone was amusing themselves so much that the group's goal was defeated by the individual interventions.
  4. You have to find a way to spare the group from scale. Scale alone kills conversations, because conversations require dense two-way communication. The number of two-way connections you have to support goes up with the square of the number of users, so the density of conversation falls off very fast as the system scales even a little bit. You have to find some way to let users hang onto the less-is-more pattern, in order to keep them associated with one another (the sketch after this list makes the connection-count arithmetic concrete).
    You may have a thousand contacts, maybe 150 people you can call friends, 30 people you can call close friends, two or three people you'd donate a kidney to. The value is inverse to the size of the group. And you have to find some way to protect the group within the context of those effects.
    Sometimes you can do soft forking. LiveJournal does the best soft forking of any software I've ever seen, where the concepts of "you" and "your group" are pretty much intertwingled. The average size of a LiveJournal group is about a dozen people, and the median size is around five. But each user is a little bit connected to other such clusters through their friends, and so while the clusters are real, they're not completely bounded - there's a soft overlap, which means that though most users participate in small groups, most of the half-million LiveJournal users are connected to one another through some short chain.
    IRC channels and mailing lists are self-moderating with scale: as the signal-to-noise ratio gets worse, people drop off until it gets better, so people join, and it gets worse again. You get this sort of self-correcting, oscillating pattern.
    You have to find some way to protect your own users from scale. This doesn't mean the scale of the whole system can't grow. But you can't try to make the system large by taking individual conversations and blowing them up like a balloon; human interaction, many to many interaction, doesn't blow up like a balloon. It either dissipates, or turns into broadcast, or collapses. So plan for dealing with scale in advance, because it's going to happen anyway.
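As a rough sketch of the capability segmentation described in point 3 above (the tier names and the 90-day threshold are invented for the example, loosely modeled on the Slashdot pattern): reading costs nothing, posting under a persistent handle earns a better starting rating, and moderation - the tool the core group defends the system with - requires tenure.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class User:
    handle: Optional[str] = None   # None models an anonymous coward
    joined: Optional[date] = None

def can_read(user: User) -> bool:
    return True                    # lowest level: no cost at all

def starting_score(user: User) -> int:
    # anyone can post, but identified members start with a higher rating
    return 1 if user.handle else 0

def can_moderate(user: User, today: date) -> bool:
    # reserved for members who have been around for a while
    return (user.handle is not None
            and user.joined is not None
            and today - user.joined >= timedelta(days=90))
```

The exact thresholds don't matter; what matters is that some actions stay cheap while the ones the core group relies on to defend itself stay expensive.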
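The scaling claim in point 4 is simple arithmetic: the number of possible two-way connections among n people is n(n-1)/2, which grows with the square of n, while the attention any one person can spend stays roughly fixed. A tiny sketch (the 30-tie cap per person is an assumption, in the spirit of the friend-count figures above) of how conversational density collapses:

```python
def possible_links(n: int) -> int:
    """Two-way connections among n people: n * (n - 1) / 2."""
    return n * (n - 1) // 2

TIES_PER_PERSON = 30   # assumed cap on active ties one person can sustain

for n in (10, 100, 1_000, 10_000):
    links = possible_links(n)
    sustainable = n * TIES_PER_PERSON // 2
    density = min(1.0, sustainable / links)
    print(f"{n:>6} people: {links:>11,} possible links, ~{density:.1%} can be live")
```

At ten people everyone can talk to everyone; at ten thousand, only a fraction of a percent of the possible connections can be live, which is why the interaction dissipates, turns into broadcast, or collapses.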
