Enayat Rajabi, Kobra Etminani
Cape Breton University, Sydney, NS, Canada.
Center for Applied Intelligent Systems Research (CAISR), Halmstad University, Sweden.
Stud Health Technol Inform. 2021 May 27;281:502-503. doi: 10.3233/SHTI210215.
The decisions derived from AI-based clinical decision support systems should be explainable and transparent so that healthcare professionals can understand the rationale behind the predictions. Knowledge graphs are a well-suited choice for integration into eXplainable AI (XAI) to improve such explanations. In this paper, we introduce a knowledge graph-based explainable framework for AI-based clinical decision support systems to increase their level of explainability.
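To illustrate the general idea (this is a hypothetical sketch, not the authors' framework): a clinical knowledge graph stored as subject-relation-object triples can be queried for facts about the entities involved in a prediction, and those facts can be surfaced alongside the model output as a human-readable rationale. All entity and relation names below are invented for illustration.

```python
# Toy example of knowledge-graph-backed explanations for a clinical
# prediction. All triples are illustrative, not real clinical knowledge.
from collections import defaultdict


def build_graph(triples):
    """Index (subject, relation, object) triples by subject."""
    graph = defaultdict(list)
    for subj, rel, obj in triples:
        graph[subj].append((rel, obj))
    return graph


def explain(graph, entity):
    """Return plain-text facts about `entity` to show next to a prediction."""
    return [f"{entity} {rel} {obj}" for rel, obj in graph.get(entity, [])]


# Hypothetical triples a clinical KG might contain.
TRIPLES = [
    ("metformin", "treats", "type 2 diabetes"),
    ("type 2 diabetes", "has_symptom", "polyuria"),
    ("type 2 diabetes", "risk_factor", "obesity"),
]

graph = build_graph(TRIPLES)
# If a model flags a patient for type 2 diabetes, attach supporting facts:
for fact in explain(graph, "type 2 diabetes"):
    print(fact)
```

A real framework would draw on curated biomedical ontologies and link model features to graph entities, but the retrieval-and-display pattern is the same.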