The 1st World Conference on Explainable Artificial Intelligence (XAI 2023) was held in Lisbon, Portugal, from July 26 to 28, 2023. XAI 2023 is an annual event that brings together researchers, academics, and professionals to promote the sharing and discussion of knowledge, new perspectives, experiences, and innovations in the field of eXplainable Artificial Intelligence (XAI).
The event is both multidisciplinary and interdisciplinary, gathering academics and scholars from different disciplines, including Computer Science, Psychology, Philosophy, and the Social Sciences, alongside industry practitioners interested in the practical, social, and ethical aspects of explaining the models emerging from the discipline of Artificial Intelligence (AI).
For three days, PUZZLE representatives actively engaged in the conference, holding discussions with researchers, lecturers, industry stakeholders, and attendees, and presenting the project's achievements and developments.
During the event, 94 works were presented. These were thoroughly reviewed and selected from 220 qualified submissions and organized into the following sections:
- Part I: Interdisciplinary perspectives, approaches, and strategies for xAI; model-agnostic explanations, methods, and techniques for xAI; causality and Explainable AI; Explainable AI in finance, cybersecurity, healthcare, and biomedicine.
- Part II: Surveys, benchmarks, visual representations, and applications for xAI; xAI for decision-making and human-AI collaboration, and for machine learning on graphs with ontologies and Graph Neural Networks; actionable eXplainable AI; semantics and explainability; and explanations for advice-giving systems.
- Part III: xAI for time series and Natural Language Processing; human-centered explanations and xAI for trustworthy and responsible AI; explainable and interpretable AI with argumentation; and representational learning and concept extraction for xAI.
The conference in numbers: 1 keynote, 96 speakers, 26 late-breaking works, scholars from more than 32 countries, a program committee of more than 200 scholars worldwide, more than 120 represented research institutions, and hundreds of attendees.
AI has experienced a considerable shift in emphasis towards building and developing interpretable and explainable intelligent systems. This shift is driven by the complexity of data-driven models and by the legal requirements set by many national and international legislatures. As a result, the topic has been echoed in both the academic literature and the news, attracting experts from around the world as well as a lay readership. As indicated in the Regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence and amending certain Union legislative acts (the AI Act), xAI can help overcome some of the difficulties of AI.