Information theory has found application in a wide range of disciplines, including statistical inference, natural language processing, cryptography and molecular biology. However, its use in medical science remains limited. In this chapter, we illustrate a number of approaches to applying concepts from information theory to enhance medical decision making. We start with an introduction to information theory itself and the foundational concepts of information content and entropy. We then illustrate how relative entropy can be used to identify the most informative test at a particular stage in a diagnosis. For a test with a binary outcome, Shannon entropy can be used to identify the range of test-result values over which the test provides useful information about the patient’s state. This is not the only available method, but it provides an easily interpretable visualization. The chapter then introduces the more advanced concepts of conditional entropy and mutual information, and shows how these can be used to prioritise clinical tests and identify redundancies among them. Finally, we discuss the experience gained so far and conclude that there is value in providing an informed foundation for the broad application of information theory to medical decision making.
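The two quantities at the heart of this chapter, Shannon entropy and relative entropy, can be illustrated with a short sketch. The code below is a minimal illustration, not taken from the chapter: the two-state disease distributions, the uniform prior, and the post-test probabilities are all hypothetical values chosen for clarity.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical distributions over two disease states (present, absent).
prior = [0.5, 0.5]           # maximal uncertainty before testing: 1 bit
post_positive = [0.9, 0.1]   # hypothetical belief after a positive test result

print(entropy(prior))                       # 1.0 bit of uncertainty
print(entropy(post_positive))               # ~0.469 bits: the test reduced uncertainty
print(kl_divergence(post_positive, prior))  # ~0.531 bits gained from the test result
```

In this two-state example with a uniform prior, the relative entropy of the post-test distribution from the prior equals the reduction in Shannon entropy, which is one way a single test's informativeness can be quantified.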