Active learning of causal probability trees

Tue Herlau*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

The past two decades have seen a growing interest in combining causal information, commonly represented using causal graphs, with machine learning models. Probability trees provide a simple yet powerful alternative representation of causal information. They enable the computation of both interventions and counterfactuals, and they are strictly more general than causal graphs, since they allow context-dependent causal dependencies. Here we present a Bayesian method for learning probability trees from a combination of interventional and observational data. The method quantifies the expected information gain from an intervention and selects the intervention with the largest gain. We demonstrate the efficiency of the method on simulated and real data. An effective method for learning probability trees on a limited interventional budget will greatly expand their applicability.
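As a rough illustration of the selection rule described in the abstract (intervene where the expected information gain is largest), the sketch below uses a deliberately simplified surrogate: Bernoulli outcomes with conjugate Beta posteriors instead of the paper's full model over probability trees. All names (expected_information_gain, true_probs, posteriors) are hypothetical and not taken from the paper; the gain is the standard predictive-entropy-minus-expected-conditional-entropy (BALD-style) quantity.

```python
import numpy as np
from scipy.special import digamma


def entropy_bernoulli(p):
    """Entropy (in nats) of a Bernoulli distribution with success probability p."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))


def expected_information_gain(a, b):
    """Expected information gain for a Beta(a, b) posterior over a Bernoulli outcome:
    predictive entropy minus expected conditional entropy (a BALD-style score)."""
    predictive_entropy = entropy_bernoulli(a / (a + b))
    # E[H(Bernoulli(theta))] under theta ~ Beta(a, b), via digamma identities.
    expected_cond_entropy = digamma(a + b + 1) - (
        a * digamma(a + 1) + b * digamma(b + 1)
    ) / (a + b)
    return predictive_entropy - expected_cond_entropy


rng = np.random.default_rng(0)
true_probs = [0.1, 0.5, 0.85]                   # hidden outcome probabilities (toy ground truth)
posteriors = [[1.0, 1.0] for _ in true_probs]   # Beta(1, 1) prior per candidate intervention

for step in range(20):
    gains = [expected_information_gain(a, b) for a, b in posteriors]
    i = int(np.argmax(gains))                   # intervene where we expect to learn the most
    outcome = rng.random() < true_probs[i]      # simulated interventional observation
    posteriors[i][0] += outcome                 # conjugate Beta update
    posteriors[i][1] += 1 - outcome
```

In this toy loop the budget is spent preferentially on the interventions whose outcome distributions are still uncertain; the paper applies the same greedy expected-gain principle, but with the posterior defined over probability trees rather than independent Bernoulli parameters.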
Original language: English
Title of host publication: Proceedings of the 2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA)
Publisher: IEEE
Publication date: 2023
Pages: 1196-1202
ISBN (Print): 978-1-6654-6284-6
ISBN (Electronic): 978-1-6654-6283-9
Publication status: Published - 2023
Event: 2022 IEEE 21st International Conference on Machine Learning and Applications, Nassau, Bahamas
Duration: 12 Dec 2022 - 14 Dec 2022
Conference number: 21

Conference

Conference: 2022 IEEE 21st International Conference on Machine Learning and Applications
Number: 21
Country/Territory: Bahamas
City: Nassau
Period: 12/12/2022 - 14/12/2022

Keywords

  • Computational modeling
  • Machine learning
  • Bayes methods
