Improving Techniques for Naïve Bayes Text Classifiers

Abstract
This chapter introduces two practical techniques for improving Naïve Bayes text classifiers. Naïve Bayes is widely regarded as a practical text classification algorithm because of its simple classification model, reasonable accuracy, and easily updated model. Many researchers therefore have a strong incentive to improve Naïve Bayes by combining it with meta-learning approaches such as EM (Expectation Maximization) and boosting. The EM approach combines Naïve Bayes with the EM algorithm, while the boosting approach uses Naïve Bayes as the base classifier in the AdaBoost algorithm. Both approaches employ an uncertainty measure tailored to Naïve Bayes learning. Within the Naïve Bayes learning framework, these approaches offer practical solutions to the shortage of training documents in text classification systems.
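To make the EM approach concrete, the following is a minimal sketch of EM-style semi-supervised training with a Multinomial Naïve Bayes classifier, using scikit-learn. The corpus, the soft-labeling scheme (each unlabeled document counted once per class, weighted by its posterior probability), and the fixed iteration count are illustrative assumptions, not the chapter's exact algorithm.

```python
# Hedged sketch (not the chapter's exact algorithm): EM-style semi-supervised
# training of Multinomial Naive Bayes, folding unlabeled documents in with
# probabilistic (soft) labels on each iteration.
from scipy.sparse import vstack
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus (illustrative, not from the chapter); labels: 1 = spam, 0 = ham
labeled_docs = [
    "cheap pills buy now",      # spam
    "meeting agenda attached",  # ham
    "win money now",            # spam
    "project schedule update",  # ham
]
labels = [1, 0, 1, 0]
unlabeled_docs = ["buy cheap now", "agenda for the project meeting"]

# Build one shared vocabulary over all documents
vectorizer = CountVectorizer()
X_text = vectorizer.fit_transform(labeled_docs + unlabeled_docs)
X_lab = X_text[: len(labeled_docs)]
X_unl = X_text[len(labeled_docs):]

# Initial M-step: train on the labeled documents only
clf = MultinomialNB()
clf.fit(X_lab, labels)

for _ in range(5):
    # E-step: posterior class probabilities for the unlabeled documents
    probs = clf.predict_proba(X_unl)  # columns follow clf.classes_ = [0, 1]
    # M-step: retrain on labeled docs plus each unlabeled doc counted once
    # per class, weighted by its posterior probability (soft labels)
    n_u = X_unl.shape[0]
    X_train = vstack([X_lab, X_unl, X_unl])
    y_train = list(labels) + [0] * n_u + [1] * n_u
    weights = [1.0] * len(labels) + list(probs[:, 0]) + list(probs[:, 1])
    clf = MultinomialNB()
    clf.fit(X_train, y_train, sample_weight=weights)

print(clf.predict(X_unl))  # the spam-like doc should come out 1, the meeting doc 0
```

With more unlabeled than labeled text, this loop lets word statistics from the unlabeled pool (e.g. "for", "the" above, unseen in the labeled set) acquire class-conditional probability mass, which is the practical benefit the chapter attributes to the EM approach.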
| Original language | English |
|---|---|
| Title of host publication | Handbook of Research on Text and Web Mining Technologies |
| Subtitle of host publication | Volume I-II |
| Publisher | IGI Global |
| Pages | 111-127 |
| Number of pages | 17 |
| Volume | I |
| ISBN (Electronic) | 9781599049915 |
| ISBN (Print) | 9781599049908 |
| State | Published - 1 Jan 2008 |