common information), then we have an objective basis for reducing the number of tests
performed.
Two additional concepts were used in Lee and Maslove [6]. The first of these was
the conditional entropy of X given Y. This measures the average uncertainty about a
random variable x that remains when y is known (x and y being the respective random
variables for the ensembles X and Y). It is defined as:
Eq 6. $H(X|Y) = \sum_{xy \in \mathcal{A}_X \mathcal{A}_Y} P(x,y) \log_2 \frac{1}{P(x|y)}$
Referring back to the definition of an ensemble, $\mathcal{A}_X$ is the alphabet of the ensemble X; that is, the set of legal values of the random variable x. Similarly, $\mathcal{A}_Y$ is the alphabet of the ensemble Y.
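As an illustration (not code from the chapter or from Lee and Maslove), Eq 6 can be computed directly from a joint distribution represented as a dictionary mapping (x, y) pairs to probabilities; a minimal sketch:

```python
import math

def conditional_entropy(joint):
    """H(X|Y) = sum over (x, y) of P(x, y) * log2(1 / P(x|y)), in bits.

    `joint` maps (x, y) pairs to joint probabilities P(x, y).
    """
    # Marginal P(y), obtained by summing the joint probabilities over x.
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            p_x_given_y = p / p_y[y]       # P(x|y) = P(x, y) / P(y)
            h += p * math.log2(1.0 / p_x_given_y)
    return h

# When X is fully determined by Y, no uncertainty remains: H(X|Y) = 0.
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # 0.0
```

The function names and the dictionary representation are choices made for this sketch; any representation of the joint distribution would serve.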
A related concept is the mutual information between X and Y. This measures the
amount of information that x conveys about y, and is defined as:
Eq 7. $I(X;Y) \equiv H(X) - H(X|Y)$
Note that we have followed the definitions as given in MacKay [7].
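Continuing the same illustrative sketch (again, an assumption-laden example rather than the authors' implementation), Eq 7 follows by computing H(X) from the marginal of the joint distribution and subtracting H(X|Y):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as {value: probability}."""
    return sum(v * math.log2(1.0 / v) for v in p.values() if v > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) - H(X|Y), computed from the joint distribution (Eq 7)."""
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    h_x = entropy(p_x)
    # H(X|Y) = sum P(x, y) * log2(P(y) / P(x, y)), since P(x|y) = P(x, y) / P(y).
    h_x_given_y = sum(p * math.log2(p_y[y] / p)
                      for (x, y), p in joint.items() if p > 0)
    return h_x - h_x_given_y

# Perfectly correlated binary variables: knowing y removes all uncertainty
# about x, so I(X;Y) = H(X) = 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```

For two independent variables the same function returns 0 bits, reflecting that y then conveys no information about x.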
Lee and Maslove extracted laboratory test results from MIMIC II, a fully
anonymised public database. They analysed a total of 29,149 ICU admissions,
investigating the following laboratory tests: haematocrit; platelet count; white blood cell count (WBC); glucose; HCO3; potassium; sodium; chloride; BUN (Blood Urea Nitrogen); creatinine; and lactate. Overall, their findings strongly supported the view
that a significant amount of the bloodwork performed in ICUs is unnecessary. This had
previously been discussed in [14], but Lee and Maslove were able to quantify the level
of redundant information content. As a specific example, they found a high level of
redundancy in information between the tests for BUN and creatinine, suggesting that if
one is known, the other can be inferred with reasonable confidence. Furthermore, their
analysis indicated that given the choice, it would be better to prefer BUN over creatinine.
Of course, clinical judgement will always be needed but this information theoretic
approach does provide an objective foundation to an informed choice.
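The kind of redundancy analysis described above can be sketched on synthetic data. The example below is purely illustrative: the "BUN" and "creatinine" values are fabricated and binned into three categories, and the 90% tracking rate is an invented parameter, not a figure from Lee and Maslove or from MIMIC II. It shows only how mutual information estimated from paired samples quantifies how much one test tells us about the other:

```python
import math
import random

def mutual_information_bits(xs, ys):
    """Estimate I(X;Y) in bits from paired samples via empirical frequencies."""
    n = len(xs)
    joint, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        joint[(x, y)] = joint.get((x, y), 0) + 1
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
    mi = 0.0
    for (x, y), c in joint.items():
        # Empirical P(x,y) * log2( P(x,y) / (P(x) P(y)) ).
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

random.seed(0)
levels = ["low", "normal", "high"]
# Hypothetical binned test results: "creatinine" tracks "BUN" 90% of the time.
bun = [random.choice(levels) for _ in range(5000)]
creat = [b if random.random() < 0.9 else random.choice(levels) for b in bun]
# A high value relative to the entropy of BUN alone indicates redundancy.
print(round(mutual_information_bits(bun, creat), 3))
```

With real data, the same estimator applied to binned laboratory results would give an objective measure of how much of one test's information content is already carried by the other.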
6. Discussion
We have shown in this chapter that information theory can have value in informing
medical decision making. We have drawn on a number of studies in order to illustrate
this. However, there is one area where we beg to differ with most of those studies.
Many of them introduce additional terminology to try to give an intuitive semantics
to some of the concepts of information theory: notions of "surprise", "closeness to
certainty", perhaps a tendency to equate entropy with uncertainty. One can
understand this. Within classical thermodynamics, entropy is perhaps one of the hardest
concepts to gain a feeling for. However, we have been careful to refer only to measures
of information and entropy. We have briefly alluded to an equivalence between Shannon
entropy and entropy from statistical mechanics through associating the messages that can
be potentially received from a patient with the internal microstates of that patient.
P. Krause / Information Theory and Medical Decision Making, p. 33
Applied Interdisciplinary Theory in Health Informatics: Knowledge Base for Practitioners
- Authors: Philip Scott, Nicolette de Keizer, Andrew Georgiou
- Publisher: IOS Press BV, Amsterdam, 2019
- Language: English
- License: CC BY-NC 4.0
- ISBN: 978-1-61499-991-1
- Pages: 242
- Category: Informatics