New software could make it easier to “adopt and trust” AI systems that set insurance premiums

By Canadian Underwriter | October 6, 2017 | Last updated on October 30, 2024

New software developed at the University of Waterloo (UWaterloo) in Ontario could make it easier to adopt and trust powerful artificial intelligence (AI) systems that set insurance premiums, generate stock market predictions and assess who qualifies for mortgages, the university said in a press release on Friday.

The software is designed to analyze and explain decisions made by deep-learning AI algorithms, providing key insights needed to satisfy regulatory authorities and give analysts confidence in their recommendations.

“The potential impact, especially in regulatory settings, is massive,” suggested Devinder Kumar, lead researcher and a PhD candidate in systems design engineering at UWaterloo. “If you can’t provide reasons for their decisions, you can’t use those state-of-the-art systems right now.”

Deep-learning AI algorithms essentially teach themselves by processing and detecting patterns in vast quantities of data, the release explained. As a result, even their creators don’t know why they come to their conclusions.

To develop a program capable of explaining deep-learning AI decisions, researchers first created an algorithm to predict next-day movements on the Standard & Poor’s 500 stock index. That system was trained with three years of historical data and programmed to make predictions based on market information from the previous 30 days.
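The release does not include code or architectural details, but the setup it describes is a standard windowed time-series predictor: roughly three years of daily index data, with each prediction based on the previous 30 days. The following is a minimal, illustrative sketch of that kind of model, not the researchers' actual system; the synthetic data, the make_windows helper, the feature ordering and the small fully connected network are all assumptions made for the example.

```python
# Illustrative sketch only: the release does not describe the researchers'
# actual architecture, so this uses a small fully connected network over a
# flattened 30-day window of daily features (open, high, low, close, volume)
# to classify next-day index movement as up or down.
import numpy as np
from sklearn.neural_network import MLPClassifier

WINDOW = 30          # days of market history per prediction (from the release)
N_FEATURES = 5       # assumed ordering: open, high, low, close, volume

def make_windows(daily, window=WINDOW):
    """Slice a (days, features) array into overlapping windows and
    label each window 1 if the next day's close rises, else 0."""
    X, y = [], []
    for t in range(window, len(daily) - 1):
        X.append(daily[t - window:t].ravel())          # flatten the 30 x 5 block
        y.append(int(daily[t + 1, 3] > daily[t, 3]))   # column 3 = close
    return np.array(X), np.array(y)

# Stand-in for ~3 years of S&P 500 daily data (random here; real data in practice).
rng = np.random.default_rng(0)
daily = rng.normal(size=(3 * 252, N_FEATURES)).cumsum(axis=0)

X, y = make_windows(daily)
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X[:-100], y[:-100])                 # hold out the last 100 days
print("held-out accuracy:", model.score(X[-100:], y[-100:]))
```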

Explanatory software called CLEAR-Trade was then developed to examine those predictions and produce colour-coded graphs and charts highlighting the days and daily factors – index high, low, open and close levels, plus trading volume – the AI system relied on most. Those insights would enable analysts to use their experience and knowledge of recent world events to determine whether deep-learning AI decisions actually make sense.
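The release does not explain how CLEAR-Trade computes its attributions, so the sketch below (continuing from the example above and reusing its model, X, WINDOW and N_FEATURES) stands in with a generic occlusion-sensitivity approach: perturb each day/feature cell of one 30-day window, measure how much the predicted probability of an "up" day changes, and plot the result as a colour-coded heat map of the days and factors that mattered most.

```python
# Illustrative occlusion-style attribution, continuing from the sketch above.
# This is NOT CLEAR-Trade's actual method (the release does not detail it);
# it simply measures how the predicted probability of an "up" day shifts when
# each day/feature cell is replaced by its mean value across all windows.
import matplotlib.pyplot as plt
import numpy as np

window_flat = X[-1:]                                  # most recent 30-day window
base_prob = model.predict_proba(window_flat)[0, 1]    # P(next day up)

sensitivity = np.zeros((WINDOW, N_FEATURES))
for day in range(WINDOW):
    for feat in range(N_FEATURES):
        perturbed = window_flat.copy()
        idx = day * N_FEATURES + feat
        perturbed[0, idx] = X[:, idx].mean()          # occlude one cell
        new_prob = model.predict_proba(perturbed)[0, 1]
        sensitivity[day, feat] = abs(base_prob - new_prob)

plt.imshow(sensitivity.T, aspect="auto", cmap="viridis")
plt.yticks(range(N_FEATURES), ["open", "high", "low", "close", "volume"])
plt.xlabel("days before prediction")
plt.title("Which days and factors moved the prediction most")
plt.colorbar(label="|change in P(up)|")
plt.show()
```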

“If you’re investing millions of dollars, you can’t just blindly trust a machine when it says a stock will go up or down,” said Kumar, who expects to start field trials of the software within a year. “This will allow financial institutions to use the most powerful, state-of-the-art methods to make decisions.”

The ability to explain deep-learning AI decisions is expected to become increasingly important as the technology advances and regulators require financial institutions to give the people affected by automated decisions the reasons behind them, UWaterloo suggested in the release.

While the stock market was used for development purposes, Kumar said the explanatory software is applicable to predictive deep-learning AI systems in all areas of finance.

Kumar, who collaborated with professors Alexander Wong of UWaterloo and Graham Taylor of the University of Guelph, will present the research at the two-day Conference on Vision and Imaging Systems at UWaterloo at the end of October.

Canadian Underwriter