Decision Trees and Evolutionary Programming

By ai-depot | December 4, 2002

Commonly used in pattern recognition and AI decision systems, the basic structure of a decision tree (DT) applies IF…THEN-style rules to classification problems, and such rules are easy to use and interpret. This article presents a unique method of developing such trees using Evolutionary Programming.

Written by Kirk Delisle.

Introduction

The objective of developing prediction and classification rules for various problem domains has been pursued by the statistical and machine learning communities for many years. In general, we start with an initial set of data consisting of various observations, or cases, to which a known property or class has been assigned. Further, for each of these observations, a number of measurable features exist which describe various details of the observations. The ultimate problem involves developing a system by which the properties can be predicted from the features. An extension of this objective is to use the classifier system to predict the class we would expect for an observation which is not in our original data set.

This can be conceptualized by considering a data matrix in which the rows represent the observations and the columns represent the features associated with those observations. In Table 1, taken from [Quinlan 1989], the observations (rows) represent various days of the week in which we may choose to go outside or stay in for some "unspecified activity." The columns each represent various features associated with the observations, or in this case the various conditions of the day in question.

Table 1: Go Out or Stay In?

Day #  Outlook   Temperature  Humidity  Windy?  Out or In?
1      Sunny     Hot          High      False   In
2      Sunny     Hot          High      True    In
3      Overcast  Hot          High      False   Out
4      Rain      Mild         High      False   Out
5      Rain      Cool         High      False   Out
6      Rain      Cool         Normal    True    In
7      Overcast  Cool         Normal    True    Out
8      Sunny     Mild         High      False   In
9      Sunny     Cool         Normal    False   Out
10     Rain      Mild         Normal    False   Out
11     Sunny     Mild         Normal    True    Out
12     Overcast  Mild         High      True    Out
13     Overcast  Hot          Normal    False   Out
14     Rain      Mild         High      True    In
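Viewed as a data matrix, Table 1 is just a list of observations, each carrying four feature values and a class label. A minimal sketch in Python follows; the field names (`outlook`, `temp`, and so on) are illustrative choices, not anything prescribed by the article:

```python
# Table 1 as a data matrix: each dict is one observation (a day);
# every key except "class" is a feature, "class" is the assigned label.
training_set = [
    {"outlook": "Sunny",    "temp": "Hot",  "humidity": "High",   "windy": False, "class": "In"},
    {"outlook": "Sunny",    "temp": "Hot",  "humidity": "High",   "windy": True,  "class": "In"},
    {"outlook": "Overcast", "temp": "Hot",  "humidity": "High",   "windy": False, "class": "Out"},
    {"outlook": "Rain",     "temp": "Mild", "humidity": "High",   "windy": False, "class": "Out"},
    {"outlook": "Rain",     "temp": "Cool", "humidity": "High",   "windy": False, "class": "Out"},
    {"outlook": "Rain",     "temp": "Cool", "humidity": "Normal", "windy": True,  "class": "In"},
    {"outlook": "Overcast", "temp": "Cool", "humidity": "Normal", "windy": True,  "class": "Out"},
    {"outlook": "Sunny",    "temp": "Mild", "humidity": "High",   "windy": False, "class": "In"},
    {"outlook": "Sunny",    "temp": "Cool", "humidity": "Normal", "windy": False, "class": "Out"},
    {"outlook": "Rain",     "temp": "Mild", "humidity": "Normal", "windy": False, "class": "Out"},
    {"outlook": "Sunny",    "temp": "Mild", "humidity": "Normal", "windy": True,  "class": "Out"},
    {"outlook": "Overcast", "temp": "Mild", "humidity": "High",   "windy": True,  "class": "Out"},
    {"outlook": "Overcast", "temp": "Hot",  "humidity": "Normal", "windy": False, "class": "Out"},
    {"outlook": "Rain",     "temp": "Mild", "humidity": "High",   "windy": True,  "class": "In"},
]

print(len(training_set))                                        # 14 observations
print(sum(1 for row in training_set if row["class"] == "Out"))  # 9 "Out" days
```

Any rule system we build will take the feature values of a row as input and try to reproduce its class label as output.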

While this simple example represents a choice on whether to go outside or stay inside, the same representation can be applied to virtually any field. For example, consider classifying glass collected at a crime scene as automobile headlight or window pane glass based upon the chemical content of the samples; proteins into various classes based upon their amino acid content; chemical compounds as toxic or non-toxic based upon their chemical structures; or tissue samples as cancerous or non-cancerous based upon various measurements of the tissue itself, such as its expressed genes. Each of these, and too many others to mention, is an actual project that has been the subject of classification rule development.

The types of rule systems, or classifiers, are as diverse as the types of projects being investigated. One of the simpler ones to understand and manipulate is the decision tree. Decision trees represent a series of IF…THEN type rules which are linked together and can be used to predict properties for our observations based upon the values of various features. Figure 1 represents a perfect decision tree for the data of Table 1. We start at the topmost point in the tree and ask the question, "What is the outlook for the day?" This point, and every point at which a question based upon the data is asked, is called a node (blue circles in Figure 1), with the first decision node referred to as the root of the tree. The answer to the question determines the path we take through the tree. For example, if we respond, "Overcast," we move down the middle path to a position which provides a class value for the observation in question. This final destination, which has no other paths leading away from it, is called a leaf, and each leaf has a classification attached to it (yellow squares in Figure 1).

Figure 1: A decision tree for the data of Table 1.

To illustrate the utility of a decision tree, consider Day #1 from Table 1. In the tree from Figure 1, we start at the root and ask the question, "What is the Outlook?" From Table 1 we see that the outlook is Sunny, which leads us down the left path to another decision node asking the question, "What is the Humidity?" We see that the humidity is High, so the left path is taken and we arrive at a leaf which predicts that we should stay inside. Looking at the category column of Table 1 (Out or In?) we see that, indeed, this is classified as an inside day.
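The walk just described can be sketched in a few lines of Python. The nested-dictionary encoding below is one plausible rendering of Figure 1, not the article's own code, and it assumes (consistent with Table 1) that the Sunny branch splits on Humidity and the Rain branch splits on Windy:

```python
# One plausible encoding of the tree in Figure 1: an internal node is a
# (feature, {answer: subtree}) pair; a leaf is a bare class label.
tree = ("outlook", {
    "Sunny":    ("humidity", {"High": "In", "Normal": "Out"}),
    "Overcast": "Out",
    "Rain":     ("windy", {True: "In", False: "Out"}),
})

def classify(node, observation):
    """Walk from the root to a leaf, following the observation's feature values."""
    while isinstance(node, tuple):       # tuples are decision nodes; strings are leaves
        feature, branches = node
        node = branches[observation[feature]]
    return node

# Day #1 from Table 1: Sunny, Hot, High humidity, not windy.
day1 = {"outlook": "Sunny", "temp": "Hot", "humidity": "High", "windy": False}
print(classify(day1 if False else tree, day1) if False else classify(tree, day1))  # In
```

Tracing `classify` by hand reproduces the walk in the text: Outlook is Sunny, so we descend into the Humidity node; Humidity is High, so we land on the leaf "In".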

As mentioned earlier, one of the more common uses of decision trees such as this one is to predict the class of an observation which was not in our original data set. In this situation, we refer to the original data as the Training Set, and consider it to be a highly accurate and reasonable basis to use for future predictions. A decision tree is generated from this Training Set and used to predict the class for new observations. The actual class of these observations may not be known, and therefore the decision tree is playing the role of a predictive classifier. For example, suppose we developed a decision tree based upon a data matrix which was related to classifying cells as cancerous or non-cancerous. Once the decision tree is generated, a physician can use it to assist in the diagnosis of new patients.
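As a sketch of this predictive use, the tree of Figure 1 (encoded here as nested dictionaries, an illustrative choice) can be asked about a day that does not appear in Table 1 at all; the observation below is purely hypothetical:

```python
# The tree of Figure 1: an internal node is a (feature, {answer: subtree})
# pair; a leaf is a bare class label.
tree = ("outlook", {
    "Sunny":    ("humidity", {"High": "In", "Normal": "Out"}),
    "Overcast": "Out",
    "Rain":     ("windy", {True: "In", False: "Out"}),
})

def predict(node, observation):
    """Follow the observation's feature values from the root down to a leaf."""
    while isinstance(node, tuple):
        feature, branches = node
        node = branches[observation[feature]]
    return node

# A hypothetical new day that is NOT in Table 1 (our Training Set):
# rainy, hot, high humidity, and windy.
new_day = {"outlook": "Rain", "temp": "Hot", "humidity": "High", "windy": True}
print(predict(tree, new_day))   # In -- the tree acts as a predictive classifier
```

The true class of this new day is unknown; the tree, built from the Training Set alone, supplies a prediction for it, which is exactly the predictive-classifier role described above.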
