
Monday 6 June 2016

Using expert judgment to build better decision support models


The 'big data' juggernaut seems to be rumbling along with many oblivious to the limitations of what pure machine learning techniques can really achieve in most important applications. We have written here before about the dangers of 'learning' from data alone (no matter how 'big' the data is).

Contrary to the narrative being sold by many in the big data community, if you want accurate predictions and improved decision-making then, invariably, you need to incorporate human knowledge and judgment. Much of the research in the BAYES-KNOWLEDGE project is concerned with building better decision-support models - normally Bayesian networks (BNs) - by incorporating knowledge and data.

There are two major steps to building a BN model for a decision analysis problem:
  1. Identify the key variables and which ones directly influence each other.
  2. Define the probability tables for each variable conditioned on its parents.
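The two steps can be sketched with a toy two-node model in plain Python. The structure (step 1) is a single arc from "Smoking" to "Lung cancer"; the probability tables (step 2) use hypothetical numbers chosen purely for illustration, not figures from the paper:

```python
# Step 1: structure -- "Smoking" is the (only) parent of "Lung cancer".
# Step 2: probability tables -- the numbers below are illustrative, not real.

p_smoking = {True: 0.3, False: 0.7}          # P(Smoking)
p_cancer_given_smoking = {                   # P(Lung cancer | Smoking)
    True:  {True: 0.10, False: 0.90},
    False: {True: 0.01, False: 0.99},
}

def marginal_cancer():
    """P(Lung cancer = True), summing over the parent's states."""
    return sum(p_smoking[s] * p_cancer_given_smoking[s][True]
               for s in (True, False))

print(round(marginal_cancer(), 4))  # 0.3*0.10 + 0.7*0.01 = 0.037
```

Even in this tiny example, step 2 requires four conditional probabilities; the number of entries grows multiplicatively with each extra parent, which is why eliciting full tables from experts quickly becomes impractical.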
We have been reporting on this blog about various recent papers from the project that have addressed these steps, most in the context of case studies*, while some of the project work on combining judgment and data to learn the probability tables has been incorporated into the BAYES-KNOWLEDGE tool on the Agenarisk platform.

Now new research (supported jointly by BAYES-KNOWLEDGE and the China Scholarship Council) has been published in the top-ranked journal "Decision Support Systems" that describes an important advance in defining the probability tables of a BN. The paper shows that, in practice, many of the variables in a BN model are related by certain types of 'monotonic constraints'. As a very simple example, consider a model in which the variable "Lung cancer" has the parent "Smoking". Although we do not know the exact relationship between these variables, it is known that as the probability of smoking increases, so does the probability of lung cancer. This is an example of a positive monotonic constraint. It turns out that, even with fairly minimal data, it is possible to exploit an expert's knowledge about the existence of monotonic constraints to learn complete probability tables that lead to accurate and useful models. This is important because most approaches to incorporating expert judgment to define the probability tables require the expert to consider many combinations of variable states.
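The paper's learning algorithm is considerably more sophisticated, but the basic idea of repairing data-driven estimates so they respect a positive monotonic constraint can be sketched with the classic pooled-adjacent-violators procedure. The function name, the inputs (conditional probability estimates ordered by parent state, weighted by how much data supports each), and the numbers are all ours, for illustration only:

```python
def enforce_positive_monotonicity(estimates, weights):
    """Make `estimates` non-decreasing across ordered parent states
    (pooled-adjacent-violators): any pair that violates the constraint
    is replaced by its weighted average, so sparse-data estimates are
    pulled into line with the expert's monotonic constraint."""
    # Each block holds [value, total weight, number of states it covers].
    blocks = [[e, w, 1] for e, w in zip(estimates, weights)]
    i = 0
    while i + 1 < len(blocks):
        if blocks[i][0] > blocks[i + 1][0]:       # constraint violated
            v1, w1, n1 = blocks[i]
            v2, w2, n2 = blocks[i + 1]
            blocks[i:i + 2] = [[(v1 * w1 + v2 * w2) / (w1 + w2),
                                w1 + w2, n1 + n2]]
            i = max(i - 1, 0)                     # re-check the previous pair
        else:
            i += 1
    out = []
    for v, w, n in blocks:
        out.extend([v] * n)
    return out

# Suppose sparse data gave P(cancer | non-smoker) = 0.08 from 100 cases
# but P(cancer | smoker) = 0.05 from only 20 cases -- violating the
# expert's constraint. Pooling yields the weighted average 0.075 for both.
print(enforce_positive_monotonicity([0.08, 0.05], [100, 20]))
```

Estimates that already satisfy the constraint pass through unchanged, so the expert's knowledge only intervenes where the (possibly sparse) data contradicts it.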

The full citation for this new paper is:
Zhou, Y., Fenton, N. E., Zhu, C. (2016), "An Empirical Study of Bayesian Network Parameter Learning with Monotonic Causality Constraints", Decision Support Systems. http://dx.doi.org/10.1016/j.dss.2016.05.001 pre-publication pdf version here

*See:
