Explanations play an essential role in decision support and recommender systems, as they directly affect users' acceptance of those systems and of the choices they make. Although approaches have been proposed to explain automated decisions based on multi-attribute decision models, there is little evidence that they produce the explanations users need. In response, in this paper we propose an explanation generation technique that follows user-derived explanation patterns. It takes as input a multi-attribute decision model, which is used together with user-centric principles to make a decision for which an explanation is then generated. The technique includes algorithms that select the relevant attributes and produce an explanation justifying the automated choice. An evaluation with a user study demonstrates the effectiveness of our approach.
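To make the setting concrete, the following is a minimal sketch of what a multi-attribute decision model with attribute-based explanation generation can look like. It is an illustration only, not the technique proposed in the paper: the weighted additive utility model, the pairwise comparison against the runner-up, and the ranking of attributes by weighted advantage are all assumptions introduced here for clarity.

```python
# Illustrative sketch (NOT the paper's algorithm): a weighted additive
# multi-attribute decision model that picks the best option and explains
# the choice via the attributes contributing most to its advantage
# over the runner-up option.

def decide_and_explain(options, weights):
    """options: {name: {attribute: score in [0, 1]}};
    weights: {attribute: weight}. Returns (chosen option, explanation)."""
    # Weighted additive utility of each option.
    utility = {name: sum(weights[a] * s for a, s in attrs.items())
               for name, attrs in options.items()}
    best = max(utility, key=utility.get)
    runner_up = max((n for n in utility if n != best), key=utility.get)
    # Select relevant attributes: those where the chosen option beats
    # the runner-up, ranked by weighted advantage.
    advantage = {a: weights[a] * (options[best][a] - options[runner_up][a])
                 for a in weights}
    relevant = [a for a, d in sorted(advantage.items(),
                                     key=lambda kv: -kv[1]) if d > 0]
    explanation = (f"{best} was chosen because it is better than "
                   f"{runner_up} on {', '.join(relevant)}.")
    return best, explanation

# Hypothetical example data for illustration.
options = {
    "Hotel A": {"price": 0.9, "location": 0.4, "rating": 0.8},
    "Hotel B": {"price": 0.5, "location": 0.9, "rating": 0.6},
}
weights = {"price": 0.5, "location": 0.2, "rating": 0.3}
best, why = decide_and_explain(options, weights)
```

Here "Hotel A" wins on weighted utility (0.77 vs. 0.61), and the explanation cites price and rating, the attributes on which it outperforms the alternative, while omitting location, where it is weaker. A user-centric technique like the one proposed in the paper would additionally shape which attributes are mentioned and how, following explanation patterns derived from users.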