Rule reduction over numerical attributes in decision trees using multilayer perceptron

Dae Eun Kim, Jaeho Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear form. It has been shown that a neural network is better than a direct application of induction trees at modeling the nonlinear characteristics of sample data. We extract a compact set of rules that capture relations among input variables over continuous-valued attributes. These relations, expressed as a set of linear classifiers, can be obtained from neural network modeling based on back-propagation. This paper shows that variable thresholds play an important role in constructing linear classifier rules when a decision tree is built over the linear classifiers extracted from a multilayer perceptron. We have tested this scheme on several data sets and compared it with the results of decision trees applied directly to the data.
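The following is a minimal sketch (not the authors' code) of the idea outlined in the abstract: train a multilayer perceptron by back-propagation, treat each hidden unit's incoming weights as a linear classifier (a hyperplane over the continuous-valued attributes), and then grow a decision tree over the resulting binary features instead of the raw attributes. The library choice (scikit-learn), the 0.5 threshold on the logistic activation, and the Iris data set are assumptions for illustration only.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# 1. Fit an MLP by back-propagation; each hidden unit defines a linear classifier.
mlp = MLPClassifier(hidden_layer_sizes=(4,), activation="logistic",
                    max_iter=2000, random_state=0).fit(X, y)
W, b = mlp.coefs_[0], mlp.intercepts_[0]        # hyperplanes w·x + b

# 2. Re-express every sample by which side of each hyperplane it falls on
#    (assumed threshold of 0.5 on the logistic hidden activation).
hidden = 1.0 / (1.0 + np.exp(-(X @ W + b)))
Z = (hidden >= 0.5).astype(int)                 # linear-classifier outputs

# 3. Induce a decision tree over the linear-classifier features; such a tree
#    typically needs fewer rules than one built on the raw numerical attributes.
tree = DecisionTreeClassifier(random_state=0).fit(Z, y)
print("rules (leaves):", tree.get_n_leaves(),
      "training accuracy:", tree.score(Z, y))
```

How the linear classifiers and their thresholds are chosen is the substance of the paper; the sketch above only illustrates the overall pipeline of a decision tree over MLP-derived linear classifiers.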

Original language: English
Title of host publication: Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings
Editors: David Cheung, Graham J. Williams, Qing Li
Publisher: Springer Verlag
Pages: 538-549
Number of pages: 12
ISBN (Print): 3540419101, 9783540419105
DOIs
State: Published - 2001
Event: 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001 - Kowloon, Hong Kong
Duration: 16 Apr 2001 – 18 Apr 2001

Publication series

Name: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 2035
ISSN (Print): 0302-9743

Conference

Conference: 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001
Country/Territory: Hong Kong
City: Kowloon
Period: 16/04/01 – 18/04/01
