Information theory has found application in a wide range of disciplines, including statistical inference, natural language processing, cryptography and molecular biology. However, its use in medical science is less widespread. In this chapter, we illustrate several approaches to applying concepts from information theory to enhance medical decision making. We start with an introduction to information theory itself and the foundational concepts of information content and entropy. We then illustrate how relative entropy can be used to identify the most informative test at a particular stage in a diagnosis. For a test with a binary outcome, Shannon entropy can be used to identify the range of test result values over which that test provides useful information about the patient's state. This is not the only method available, but it provides an easily interpretable visualisation. The chapter then introduces the more advanced concepts of conditional entropy and mutual information and shows how these can be used to prioritise clinical tests and identify redundancies among them. Finally, we discuss the experience gained so far and conclude that there is value in providing an informed foundation for the broad application of information theory to medical decision making.
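To make the quantities named above concrete, the following Python sketch computes the Shannon entropy of a binary test outcome and the mutual information between a test result and a diagnosis. It is a minimal illustration of the general concepts only; the joint distribution, test prevalence and variable names are hypothetical and are not taken from the chapter.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def binary_entropy(p_positive):
    """Entropy of a binary test outcome with probability p_positive of a positive result."""
    return shannon_entropy([p_positive, 1.0 - p_positive])

def mutual_information(joint):
    """
    Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint
    distribution given as a dict mapping (x, y) pairs to probabilities.
    """
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return (shannon_entropy(px.values())
            + shannon_entropy(py.values())
            - shannon_entropy(joint.values()))

# Hypothetical joint distribution of a binary test result and disease status.
joint = {
    ("positive", "disease"): 0.08,
    ("positive", "healthy"): 0.02,
    ("negative", "disease"): 0.02,
    ("negative", "healthy"): 0.88,
}

print(f"Entropy of the test outcome:        {binary_entropy(0.10):.3f} bits")
print(f"Mutual information with diagnosis:  {mutual_information(joint):.3f} bits")
```

Under this reading, a candidate test with higher mutual information with the diagnosis reduces diagnostic uncertainty more, which is one way the prioritisation of clinical tests described in the chapter could be operationalised.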