A balanced approach to health information evaluation: A vocabulary-based naïve Bayes classifier and readability formulas

Gondy Leroy, Trudi Miller, Graciela Rosemblat, Allen Browne

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

Since millions seek health information online, it is vital for this information to be comprehensible. Most studies use readability formulas, which ignore vocabulary, and conclude that online health information is too difficult. We developed a vocabulary-based naïve Bayes classifier to distinguish between three difficulty levels in text. It proved 98% accurate in a 250-document evaluation. We compared our classifier with readability formulas for 90 new documents with different origins and asked representative human evaluators, an expert and a consumer, to judge each document. Average readability grade levels for educational and commercial pages were 10th grade or higher, too difficult according to the current literature. In contrast, the classifier showed that 70-90% of these pages were written at an intermediate, appropriate level, indicating that vocabulary usage is frequently appropriate in text that readability formula evaluations consider too difficult. The expert considered the pages more difficult for a consumer than the consumer did.
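The abstract gives no implementation details, but the two kinds of measures it contrasts can be sketched. Below is a minimal, illustrative multinomial naïve Bayes classifier over word counts with three hypothetical difficulty labels, alongside the standard Flesch-Kincaid grade-level formula as one example of a readability formula; the labels, toy training snippets, and choice of formula are assumptions for illustration, not taken from the paper.

```python
# Illustrative sketch only: a word-count naive Bayes over three assumed
# difficulty levels, not the authors' classifier, vocabulary, or training data.
import math
from collections import Counter

LEVELS = ("easy", "intermediate", "difficult")  # hypothetical labels

def train(docs):
    """docs: list of (text, level). Returns class priors and per-class word counts."""
    word_counts = {lvl: Counter() for lvl in LEVELS}
    doc_counts = Counter()
    vocab = set()
    for text, lvl in docs:
        tokens = text.lower().split()
        word_counts[lvl].update(tokens)
        vocab.update(tokens)
        doc_counts[lvl] += 1
    priors = {lvl: doc_counts[lvl] / len(docs) for lvl in LEVELS}
    return priors, word_counts, vocab

def classify(text, priors, word_counts, vocab):
    """Return the level with the highest log-posterior, using add-one smoothing."""
    tokens = text.lower().split()
    best_lvl, best_score = None, float("-inf")
    for lvl in LEVELS:
        if priors[lvl] == 0:
            continue
        total = sum(word_counts[lvl].values())
        score = math.log(priors[lvl])
        for tok in tokens:
            # Laplace smoothing over the combined vocabulary
            score += math.log((word_counts[lvl][tok] + 1) / (total + len(vocab)))
        if score > best_score:
            best_lvl, best_score = lvl, score
    return best_lvl

def fk_grade(words, sentences, syllables):
    """Flesch-Kincaid grade level, a standard readability formula of the kind
    the abstract contrasts with the classifier (counts supplied by the caller)."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Toy usage with made-up snippets; a real evaluation would use labelled health texts.
training = [
    ("take your medicine with food", "easy"),
    ("the doctor will check your blood pressure", "easy"),
    ("hypertension may require antihypertensive therapy", "intermediate"),
    ("renal function should be monitored during treatment", "intermediate"),
    ("idiopathic thrombocytopenic purpura pathophysiology", "difficult"),
    ("pharmacokinetics of angiotensin-converting enzyme inhibitors", "difficult"),
]
priors, counts, vocab = train(training)
print(classify("monitor blood pressure during therapy", priors, counts, vocab))
print(round(fk_grade(words=120, sentences=8, syllables=190), 1))
```

The point of the contrast is that the formula scores only surface features (sentence length, syllable counts), while the classifier's decision is driven entirely by which words appear, so the two can disagree on the same page.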

Original language: English (US)
Pages (from-to): 1409-1419
Number of pages: 11
Journal: Journal of the American Society for Information Science and Technology
Volume: 59
Issue number: 9
DOIs
State: Published - Jul 2008
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Networks and Communications
  • Artificial Intelligence
