Explainability
noun · candidate · updated May 13, 2026
No definition recorded.
Framework senses
- §1 — Within the context of AI, the extent to which AI decision-making processes and outcomes can be reasonably understood.
- §1 — The ability to provide a human-interpretable explanation for a machine learning prediction and to produce insights about the causes of decisions, ideally aligning with human reasoning.
- §1 — A characteristic of an AI system in which accompanying evidence or reasons for system output are provided in a manner that is meaningful or understandable to individual users (as well as to developers and auditors) and that reflects the system's process for generating the output (e.g., what alternatives were considered but not proposed, and why).
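The second sense above describes explanations that attribute a prediction to its causes. A minimal sketch of that idea, assuming a hypothetical linear credit-scoring model whose weights and inputs are invented for illustration: a linear model's per-feature contributions form an additive attribution a human can inspect.

```python
# A minimal sketch of one form of explainability: per-feature
# contributions of a linear model. The model, weights, and applicant
# data below are illustrative assumptions, not a real scoring system.

def explain(weights, bias, features):
    """Return the prediction and each feature's additive contribution."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    prediction = bias + sum(contributions.values())
    return prediction, contributions

# Hypothetical model: positive weights raise the score, negative lower it.
weights = {"income": 0.4, "debt": -0.7, "years_employed": 0.2}
bias = 0.1
applicant = {"income": 1.2, "debt": 0.5, "years_employed": 3.0}

score, reasons = explain(weights, bias, applicant)
print(f"score = {score:.2f}")
# List contributions from most to least influential (by magnitude),
# which is the "explanation" a user or auditor would read.
for name, c in sorted(reasons.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {c:+.2f}")
```

For nonlinear models the same additive form is recovered approximately by attribution methods (e.g., Shapley-value-based techniques), but the interpretable structure is no longer read directly off the weights.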