The dramatic recent uptick in investment in artificial intelligence has led to a torrent of AI applications. Continued advances promise to produce autonomous systems that will perceive, learn, decide and act on their own. However, the effectiveness of these systems is sometimes limited by the machine’s current inability to explain decisions and actions to human users.
Large enterprises face a growing set of challenges that demand more intelligent, autonomous systems. AI deployed in financial tools such as loan assessment, or in self-driving cars, often relies on deep learning with neural networks. Although extremely powerful, these techniques are often inscrutable, and it is difficult to explain their reasoning in a human-readable format. If we cannot establish culpability, audit an algorithm, or verify how it works, who is responsible when something goes wrong?
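To make this concrete, here is a minimal, hypothetical sketch (not RecommenderX's actual method) of one common XAI technique: training a shallow, interpretable surrogate decision tree to mimic an opaque model's predictions, so that approximate decision rules can be read out in plain terms. The feature names ("income", "debt_ratio", etc.) are illustrative stand-ins for loan-assessment inputs.

```python
# Sketch: explaining a "black box" classifier with an interpretable surrogate.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy loan-style data; in practice these would be real applicant features.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)

# An opaque model whose internal reasoning is hard to articulate directly.
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A shallow, readable surrogate trained on the black box's own predictions.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# The surrogate's rules give a human-readable (approximate) decision trail.
rules = export_text(
    surrogate, feature_names=["income", "debt_ratio", "age", "history"]
)
print(rules)
```

The surrogate is only an approximation of the black box, but its if/then rules are exactly the kind of auditable decision trail that regulators and end users can inspect.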
Explainable AI (XAI) will be essential if companies are to understand, appropriately trust, and effectively manage an emerging generation of AI machine partners.
Last week at Money2020 Europe, our Co-Founder & CTO, Kevin McCarthy, shared his insights into XAI and its importance to the financial services sector ahead of the introduction of GDPR next year.
RecommenderX is building XAI capabilities into its Recommendation as a Service (RaaS) platform so that users can understand and trust its outputs: we're not dealing with a black box! Our explanation technologies expose the decision trail.
If you’d like to find out more about XAI and its importance during this golden age of AI, get in touch! We’d love to hear from you (email@example.com).