LAHORE - Explainable AI (XAI) emphasizes not just the output an algorithm produces, but also how the algorithm works and how its conclusion is reached. XAI approaches shine a light on the algorithm's inner workings to show the factors that influenced its output, with the aim of making this information available in a human-readable way rather than hidden within code. The latest report from ACCA (the Association of Chartered Certified Accountants), Explainable AI, addresses explainability from the perspective of practitioners, i.e. accountancy and finance professionals. Head of Business Insights Narayanan Vaidyanathan said: 'It is in the public interest to improve understanding of XAI, which helps to balance the protection of the consumer with innovation in the marketplace.'

The complexity, speed and volume of AI decision-making often obscure what is going on in the background (the 'black box'), making the model difficult to interrogate. Explainability, or the lack of it, affects the ability of professional accountants to understand AI outputs and exercise scepticism. In a recent ACCA survey, 54% agreed with this statement, more than double the proportion who disagreed.