Applied Interdisciplinary Theory in Health Informatics - Knowledge Base for Practitioners
Page 29


2. We are ignoring uncertainties in the veracity of a message that might be introduced by the communication pathway of Figure 2. Point 2 is critically important when taking this into a real clinical setting; however, for simplicity of exposition we will continue to ignore this issue until the concluding section.

3. Relative entropy and diagnostic tests

Let us phrase the diagnostic strategy a little more formally. A patient's specific internal state has an ensemble of messages M = (m, A_M, P_M) associated with it. A message will normally be triggered by a specific "interrogation" being performed on the patient. An interrogation may be, for example: a question asked of the patient; a test performed on the patient; an inspection performed by a nurse. Prior to an interrogation, the alphabet A_M of messages will have a probability distribution Q_M over it. Receipt of a message m_k (a positive test result, for example) will result in a posterior probability distribution P_M over the alphabet of messages. To measure the change in entropy, we use the relative entropy, or Kullback-Leibler divergence, between the two probability distributions [7]:

Eq 3. $D_{KL}(P_M \,\|\, Q_M) = \sum_{k=1}^{K} p_k \log_2 \frac{p_k}{q_k}$

It is worth noting two properties of relative entropy. Firstly, it satisfies what is known as Gibbs' inequality, with equality if and only if P_M = Q_M:

Eq 4. $D_{KL}(P_M \,\|\, Q_M) \geq 0$

Secondly, in general it is not symmetric under interchange of the two probability distributions; that is, $D_{KL}(P_M \,\|\, Q_M) \neq D_{KL}(Q_M \,\|\, P_M)$. Consequently, relative entropy (Kullback-Leibler divergence) does not formally qualify as a distance metric, hence the use of the term "divergence". Expressed in terms of Bayesian inference, $D_{KL}(P \,\|\, Q)$ measures the information gained when a physician's beliefs are revised from a prior Q to a posterior P following some investigation.

We will use a hypothetical example, adapted from [2], to illustrate the approach so far. We hypothesise a population of patients with arthritis, framed with a prior probability distribution over four possible syndromes. Two diagnostic tests, t1 and t2, are available to us. Table 1 provides the pre-test probabilities and the respective post-test probabilities following a positive outcome from each of the two tests. Which of the two tests provides the greater information gain?
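As a minimal sketch of how this comparison can be carried out, the Python snippet below implements Eq 3 and evaluates the information gain of two tests against a common prior. The probability values are hypothetical placeholders for illustration only, not the figures from the book's Table 1 (which is not reproduced on this page), and the names kl_divergence, prior, post_t1 and post_t2 are our own.

import math

def kl_divergence(p, q, base=2):
    """Relative entropy D_KL(P || Q) of Eq 3, in bits by default.

    p is the posterior distribution, q the prior, over the same
    alphabet of messages. Terms with p_k == 0 contribute nothing.
    """
    return sum(pk * math.log(pk / qk, base) for pk, qk in zip(p, q) if pk > 0)

# Hypothetical prior over four arthritis syndromes (NOT Table 1's values).
prior = [0.40, 0.30, 0.20, 0.10]

# Hypothetical posteriors after a positive result from tests t1 and t2.
post_t1 = [0.70, 0.15, 0.10, 0.05]
post_t2 = [0.10, 0.10, 0.10, 0.70]

print(f"Gain from t1: {kl_divergence(post_t1, prior):.3f} bits")
print(f"Gain from t2: {kl_divergence(post_t2, prior):.3f} bits")

# Asymmetry: swapping the arguments generally gives a different value,
# which is why relative entropy is not a distance metric.
print(f"D(t1||prior) = {kl_divergence(post_t1, prior):.3f}, "
      f"D(prior||t1) = {kl_divergence(prior, post_t1):.3f}")

Whichever test yields the larger divergence is the one whose positive result moves the physician's beliefs furthest from the prior, and hence provides the greater information gain.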
Title
Applied Interdisciplinary Theory in Health Informatics
Subtitle
Knowledge Base for Practitioners
Authors
Philip Scott
Nicolette de Keizer
Andrew Georgiou
Publisher
IOS Press BV
Place
Amsterdam
Date
2019
Language
English
License
CC BY-NC 4.0
ISBN
978-1-61499-991-1
Dimensions
16.0 x 24.0 cm
Pages
242
Category
Informatics