Facial biotype classification for orthodontic treatment planning using an alternative learning algorithm for tree augmented Naive Bayes

Gonzalo A. Ruz, Pamela Araya-Díaz, Pablo A. Henríquez

Research output: Contribution to journal › Article › peer-review


Abstract

Background: When designing an orthodontic treatment, especially for children and teenagers, it is crucial to account for the changes that occur throughout facial growth, because the rate and direction of growth can greatly affect the choice of treatment mechanics. This paper presents a Bayesian network approach for classifying patients' facial biotypes into Dolichofacial (long and narrow face), Brachyfacial (short and wide face), and an intermediate type called Mesofacial. For this purpose, we develop a novel learning technique for the tree augmented Naive Bayes (TAN) classifier. Results: On the dataset analyzed, the proposed method on average outperformed all the other models in accuracy, precision, recall, F1-score, and kappa. Moreover, it showed the lowest dispersion across runs, making the model more stable and robust. Conclusions: The proposed method achieved high accuracy compared to other competitive classifiers. Many of the interactions in a resulting Bayesian network had an orthodontic interpretation. For orthodontists, the Bayesian network classifier can be a helpful decision-making tool.
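To illustrate the kind of model the abstract describes, here is a minimal TAN classifier sketch in Python. Note the assumptions: the data is synthetic (binary stand-in features and three class labels loosely mirroring the Dolichofacial/Mesofacial/Brachyfacial setting, not real cephalometric measurements), and the tree is learned with the classical conditional-mutual-information spanning-tree construction, not the paper's alternative (evolution-strategy-based) learning algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: 6 binary features, 3 classes standing in
# for Dolichofacial / Mesofacial / Brachyfacial (illustration only).
n, d, k = 600, 6, 3
y = rng.integers(0, k, n)
X = (rng.random((n, d)) < (0.25 + 0.25 * (y[:, None] % 2))).astype(int)

def cmi(i, j):
    """Empirical conditional mutual information I(Xi; Xj | C)."""
    total = 0.0
    for c in range(k):
        m = y == c
        pc = m.mean()
        for a in (0, 1):
            for b in (0, 1):
                pab = ((X[m, i] == a) & (X[m, j] == b)).mean() + 1e-9
                pa = (X[m, i] == a).mean() + 1e-9
                pb = (X[m, j] == b).mean() + 1e-9
                total += pc * pab * np.log(pab / (pa * pb))
    return total

# Chow-Liu-style step: maximum-weight spanning tree over the features
# (Prim's algorithm); each non-root feature gets one feature parent.
in_tree, parent = {0}, {0: None}
while len(in_tree) < d:
    i, j = max(((a, b) for a in in_tree for b in range(d) if b not in in_tree),
               key=lambda e: cmi(*e))
    parent[j] = i
    in_tree.add(j)

def log_post(x):
    """log P(c) + sum_i log P(x_i | c, x_parent(i)), Laplace-smoothed."""
    scores = np.zeros(k)
    for c in range(k):
        m = y == c
        scores[c] = np.log(m.mean())
        for i in range(d):
            if parent[i] is None:
                num = (X[m, i] == x[i]).sum() + 1
                den = m.sum() + 2
            else:
                mp = m & (X[:, parent[i]] == x[parent[i]])
                num = (X[mp, i] == x[i]).sum() + 1
                den = mp.sum() + 2
            scores[c] += np.log(num / den)
    return scores

pred = np.array([np.argmax(log_post(x)) for x in X])
print("training accuracy:", (pred == y).mean())
```

The TAN structure relaxes Naive Bayes's independence assumption by allowing each feature one extra parent besides the class node, which is why interactions between measurements can appear in, and be read off, the learned network.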

Original language: English
Article number: 316
Journal: BMC Medical Informatics and Decision Making
Volume: 22
Issue number: 1
DOIs
State: Published - Dec 2022
Externally published: Yes

Keywords

  • Bayesian networks
  • Evolution strategy
  • Facial biotypes
  • Orthodontic treatment planning
  • Tree augmented Naive Bayes
