Description

Abstract: Explanations have been the subject of study in a variety of fields (e.g. philosophy, psychology, and the social sciences), and have lately experienced a new wave of popularity in Artificial Intelligence thanks to the success of machine learning (see DARPA's eXplainable AI programme). Yet recent events have shown that the effectiveness of intelligent systems is still limited by their inability to explain their decisions to human users, which undermines their understandability and trustworthiness. In this talk, I will give an overview of some of the efforts made during my research aimed at developing systems able to automatically generate explanations. In particular, I will show how such systems can build on existing research on explanations, combined with AI techniques and the large-scale knowledge sources available today.

Other presentations by Ilaria Tiddi

Date: 10 September 2018