Moderating Online Chat Rooms

By ai-depot | October 24, 2002

This paper shows how self-organized systems emerge from chaos, given a nurturing environment, and how this principle can be applied to the self-moderation of message boards. Three types of moderation are discussed: automatic filtering, hierarchical censoring, and self-moderation through bottom-up feedback.

Written by Michael T. Pagano.

Introduction

Proposal

This paper will discuss various methods of moderating the content of online chat rooms. The three types of moderation that will be discussed are automatic filtering, hierarchical censoring, and self-moderation through bottom-up feedback. The focus of the paper will be self-moderation. It is hoped that this paper will give a better understanding of the Webmaster’s options with regard to the control of his message-board content. This topic was chosen after reading the book “Emergence” by Steven Johnson, which discusses how self-organized systems emerge from chaos, given a nurturing environment. Another reason for investigating this topic is my ongoing personal involvement with several online discussion forums, as well as an interest in “complex systems” and chaos theory.

Introduction to Message Boards

Online message boards are places where people meet to discuss various ideas. They allow members to ask questions of one another and to post new ideas. Most online message boards are run by Webmasters attempting to bring traffic to their site. The traffic then helps to generate business. “If you have a website, you’ll know that bringing people back to your site is no easy task. One of the best ways of attracting people back is to have changing content, like a message board” (Voy Forums, 2001).
There are online message boards for almost every topic you can think of. “Whatever your interest is: playing didgeridoo music, sheep shearing, world oil prices, piercings, etc. [there are] others who share your interests in this vast virtual world we call ‘the net.’” (Voy Forums, 2001). These days, anyone who is comfortable with the Internet and has something to talk about is out there chatting. Most of these chat rooms are public access, which means that they accept new membership from anyone who attempts to sign up. There are some exceptions to this rule, such as chat rooms requiring invites or chat rooms that are completely closed to new members. These are rare, however, and it is safe to assume most chat rooms are public access.

Moderation Overview

Anyone can say just about anything in most chat rooms. From this fact the need for moderation arises, and it does so for three primary reasons. The first is the legal liability the site’s host could face if illegal content were displayed. An example would be black-market gun traffickers using an online chat room to communicate. The Webmaster certainly would not want to support such an illegal operation, and hence would look to remove such a discussion from his forum. The second is the preference of the site’s owner that the conversations remain “on topic”. In the case of a commercial website, keeping members on topic keeps their conversations focused on the site’s product or related issues. In the case of a non-commercial website, it is simply a matter of the Webmaster’s personal preference. Either way, however, keeping message boards “on topic” is a common task for moderators. The third reason to moderate content is to keep the conversations from becoming hostile. Knowing that a happy chatter will probably be a repeat chatter (and a repeat customer in the case of a commercial site), it is important to keep discourse civil.

If you visit most message boards today, you will note that much of the discussion is off topic, and arguments often develop (Keeping Chat Rooms, 2000). Conversations veer off onto topics that do not pertain to the forum’s intentions because users inevitably share interests beyond the forum’s scope. These conversations need to be redirected back onto the topic for the reasons discussed previously.

Beyond being off topic, many of the messages contain flaming and trolling. Flaming is online lingo for what in the real world would be yelling at one another. Again, assuming a happy customer is a repeat customer, a Webmaster is wise to keep flaming to a minimum. Trolling refers to members who drop posts just to be a nuisance. A troll would post something like “The Holocaust Never Happened” at a website dedicated to Holocaust survivors. Trolling naturally leads to flaming, which, again, Webmasters dislike.

To combat trolling, flaming, and the tendency to go off topic, Webmasters have traditionally used two methods of control. They either employ moderators or they use filtering software. Employing Moderators, or Mods, to censor content is the current rage in online discussion forums. Moderators skim through the pages and pages of discussion to look for foul content. Occasionally they are directed to specific threads by complaining members. Filtering software, on the other hand, filters out specific words or phrases that the Webmaster has deemed offensive.
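To make the filtering approach concrete, here is a minimal sketch in Python of the kind of word-and-phrase filter described above. The blocklist contents, function name, and masking behavior are illustrative assumptions, not taken from any particular filtering product.

```python
import re

# Webmaster-defined list of blocked words or phrases (example values only).
BLOCKED_PHRASES = ["badword", "another bad phrase"]

def filter_post(text, blocked=BLOCKED_PHRASES):
    """Return the post with any blocked words or phrases masked by asterisks."""
    for phrase in blocked:
        # Match the phrase case-insensitively, treating it as a literal string.
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        text = pattern.sub("*" * len(phrase), text)
    return text

# Example: a post containing a blocked word is masked before it is displayed.
print(filter_post("This contains a badword in it."))
# -> "This contains a ******* in it."
```

As the sketch suggests, such a filter only matches literal strings: a troll post like “The Holocaust Never Happened” contains no blocked words and passes straight through, which is exactly the shortcoming discussed next.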

The shortcomings of filtering software are evident. Though it can pick up foul language, it fails to censor content whose offensiveness lies in its meaning rather than its words. This shortcoming turns Webmasters to the use of Mods. The problem with Mods is less clear, but equally troubling. First, Mods fail to keep up with every single post by every single user. They have to skim through threads, and occasionally they miss things. Second, Mods are human, meaning that they cannot always be there. “Sometimes even moderators have to take time off, and when they do, trouble can start.” (Keeping Chat Rooms Lively) Third, unless the Mods hold the core ideology of the company supreme, they may be tempted to bring their own opinions into the moderating activities. Since the Mods are the watchdogs of content, their brazen responses may go unnoticed. Fourth, unless you are a non-profit site with many willing volunteers, there is a cost associated with employing Mods.
