Artificial Intelligence on Wall Street Will Be Great, Until It Isn’t

AI holds enormous potential for success, but also for market failure, and we need to analyze the ramifications of that dichotomy further before accepting its widespread use on Wall Street.

(TNS) — Until recently, artificial intelligence has struggled to gain a foothold on Wall Street. No longer.

In the last few years, large investment banks like Goldman Sachs and JP Morgan have hired artificial intelligence specialists away from academia and put them in charge of their internal AI divisions. Financial technology start-ups have begun using machine-learning algorithms to model credit ratings and detect fraud. And hedge funds and high-frequency traders are using AI to make investment decisions.

Politicians are starting to take notice. In mid-October, the House Financial Services Committee's newly formed Task Force on Artificial Intelligence held hearings on how AI could raise data privacy concerns in the financial industry. In June, Sen. Elizabeth Warren called on federal regulators to crack down on "algorithmic discrimination" by financial institutions, noting that financial technology companies often charge minorities higher interest rates.

Artificial intelligence also could fundamentally change the way that our financial system works. And until we understand how those changes could play out, we will be ill-equipped to deal with them. In the last decade, the broader field of artificial intelligence has made remarkable strides. We have seen AI beat the world's best players of "Jeopardy" and the ancient board game Go, identify unknown genes related to Lou Gehrig's disease and power driverless cars around the streets of Phoenix. These achievements have been enabled by better algorithms, more powerful computers and ever-bigger data sets.

For many reasons, the rise of artificial financial intelligence on Wall Street should be applauded. It is a good thing if we can find ways to deploy capital more efficiently, identify risk more accurately or simply make money faster. It can smooth the gears of commerce and, at least theoretically, raise all boats.

But every new tool has its quirks and its risks, and AI is no exception.

The problems with AI in finance stem from the way AI algorithms work. Today, when people talk about AI, they are really talking about a specific field of computer science known as machine learning. Machine-learning algorithms are fed large amounts of information and predict future events by identifying patterns in that information. Data sits at the base of the whole system: it is what the algorithms learn from, and it is what drives their predictions.
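As a rough sketch of that pattern-matching loop (all of the data, feature names and model choices below are invented for illustration and are not drawn from any real lender or bank):

    # Minimal sketch: a machine-learning model "learns" by finding patterns in
    # historical data, then extrapolates those patterns to new cases.
    # All data here is randomly generated for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical features for 1,000 past loans ([income score, debt ratio])
    # and whether each one defaulted.
    X_history = rng.normal(size=(1000, 2))
    y_defaulted = (X_history[:, 1] - X_history[:, 0] + rng.normal(scale=0.5, size=1000)) > 0

    # Fit the model: this is the "identify patterns in the information" step.
    model = LogisticRegression().fit(X_history, y_defaulted)

    # Score a new applicant purely from patterns found in the old data.
    new_applicant = np.array([[0.2, 1.1]])
    print("Predicted default probability:", model.predict_proba(new_applicant)[0, 1])

The point is not the particular model but the dependence: every prediction the system makes is only as good as the historical data it was fit on.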

But the very features of AI that have allowed it to be so successful in other arenas also make it dangerous when applied to the financial world. These threats mirror the problems behind the last financial crisis, when complex derivatives and poorly understood subprime mortgages sent the world into a deep recession, and they must be taken seriously.

For one, AI could lead to financial bubbles growing bigger or lasting longer by feeding the flames of irrational exuberance. Machine-learning algorithms rely on large data sets to make predictions about the world.

If the data used to make these predictions is outdated or incomplete, financial chaos could ensue.

Imagine training an AI on a data set of stock market returns from 1992 to 2000. It might conclude that tech stocks always outperform non-tech stocks, because that was true over that span, and it would have no information from after the dot-com bubble burst in 2000 that would alter this conclusion. AI algorithms trained on such skewed data might well pour yet more money into tech stocks, inflating the bubble even further.
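To see how that skew plays out mechanically, here is a toy sketch in which the returns are fabricated so that tech beats everything else in every training year, mimicking the dot-com run-up; nothing below is real market data:

    # Toy illustration of training-data bias. Returns are fabricated so that
    # tech outperforms in every year the model sees (1992-2000); the bubble
    # bursting is deliberately left out of the training window.
    import numpy as np

    years_train = np.arange(1992, 2001)            # the only years the model sees
    tech_train = np.full(len(years_train), 0.25)   # fabricated 25% annual tech returns
    other_train = np.full(len(years_train), 0.08)  # fabricated 8% for everything else

    # "Model": overweight whichever sector had the higher average historical return.
    best_sector = "tech" if tech_train.mean() > other_train.mean() else "other"
    print("Learned rule: always overweight", best_sector)

    # Out-of-sample reality the model never saw: the crash after the bubble burst.
    tech_next_year = -0.35
    print("Strategy return in the unseen year:", tech_next_year if best_sector == "tech" else 0.08)

A human analyst might recognize the training window as an anomaly; the algorithm, by construction, cannot.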

AI optimists would say that, sure, AI has limitations, but responsible decision-makers are aware of them and will respond appropriately. AI is simply another tool in the toolshed.

But because AI algorithms are so complex and data-dependent, it is extremely hard to understand how they work. The spread of complex, inscrutable financial instruments was at the root of the 2007-08 financial crisis and may well be at the root of the next.

We learned from the last crash that when something is hard to understand, such as the collateralized debt obligations that packaged together collections of risky subprime mortgages in a way that purported to make them safe, it is also hard to second-guess. If financial decision-makers are handed an AI recommendation that offers a clear "answer" and purports to rest on millions of pieces of information, they will be hard-pressed to ignore it. The AI could become not so much a tool as a crutch.

Perhaps most importantly, we are not sure how AI algorithms will interact with each other in the jungles of Wall Street. In capital markets, stock prices depend heavily on the decisions of other participants in the market. If most of the participants are AI-driven, and they adopt broadly similar machine-learning strategies, they might create echo effects where they all pile into (or out of) a stock at a moment's notice. Flash crashes might become more frequent as a result.
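A crude simulation, with purely hypothetical parameters and no connection to any real market, shows how identical momentum rules can turn a small dip into a cascade:

    # Crude herding sketch: every agent follows the same momentum rule
    # ("sell after a down move"), so a small dip snowballs into a crash.
    # All parameters are arbitrary and purely illustrative.
    n_agents = 1000
    impact_per_seller = 0.00005      # hypothetical price impact of one selling agent
    prices = [100.0, 99.8]           # a small initial dip

    for _ in range(10):
        last_move = prices[-1] - prices[-2]
        sellers = n_agents if last_move < 0 else 0   # identical strategies react identically
        prices.append(prices[-1] * (1 - impact_per_seller * sellers))

    print([round(p, 2) for p in prices])
    # Once the first dip appears, every agent sells on every step and the
    # price compounds downward by about 5% per step.

With a thousand independent strategies, the selling pressure would be spread out; with a thousand copies of the same strategy, it arrives all at once.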

This is particularly troubling given the rise of simple yet devastatingly effective adversarial strategies that attempt to fool AI algorithms into behaving in unexpected ways. For example, one study found that affixing a few small black and white stickers to a stop sign tricked an image-recognition algorithm into consistently misclassifying it.

That kind of vulnerability is a major problem for self-driving cars, but in the financial world it could wreak havoc. Bad actors might spread financial information known to make investment algorithms misfire, or intentionally manipulate data to hide fraud.

Warren was right to call attention to the issues of AI in finance, and federal regulators would be well-advised to take her concerns seriously. This is about much more than just a game of "Jeopardy" or Go. It is about ensuring that technology makes finance better, fairer and more efficient for all.

©2019 the Los Angeles Times. Distributed by Tribune Content Agency, LLC.
