Head of British standards-setting body warns of rise in AI risk to financial markets
The global algorithmic trading market is expected to grow to US$18.16 billion by 2025 from US$8.79 billion in 2016 – but are the technology, artificial intelligence and electronic trading systems involved regulated strictly enough?
International capital markets have become much stronger and safer since the financial crisis a decade ago thanks to tighter regulation and the recapitalisation of banks around the world.
But the rise of technology, artificial intelligence (AI) and electronic trading is creating a whole set of new hazards, according to the head of the world’s leading standards watchdog for fixed income.
Although regulation in many countries has become more sophisticated, it has mainly focused on rebuilding the financial system by improving the liquidity of banks and setting a framework for the size and scope of their business activities.
The changes were not designed, however, to address day-to-day market behaviour that “comes within the culture of an organisation”, said City of London veteran Mark Yallop, chairman of the FICC Markets Standards Board (FMSB), who was speaking in Hong Kong during the 9th annual Pan Asian Regulatory Summit.
The London-based FMSB has considerable clout. It was set up in 2015 by the Bank of England and UK Treasury in response to the Libor interest rate and foreign exchange rigging scandals, and its work is aimed at improving global systems and controls in future.
Its membership of banks, asset managers, brokers, exchanges and trading platforms is a veritable who’s who of the financial world, including Bank of America Merrill Lynch, Barclays, Deutsche Bank, Goldman Sachs, HSBC, Standard Life Aberdeen and UBS, all of which are normally expected to follow its guidelines.
Earlier this year, the FMSB carried out a study of 390 cases of financial market misconduct stretching back 225 years which Mark Carney, governor of the Bank of England, said was “fundamental to identifying the causes of misconduct and to finding ways to reinforce the collective memory of the market about what constitutes acceptable conduct and practice”.
Seven broad types of financial misconduct were identified: price manipulation, inside information, circular trading, reference price influence, collusion and information sharing, improper order handling and misleading customers.
The review found that patterns of bad behaviour had been repeated across national borders and asset classes, as well as being adapted to new technologies.
“Technology is not new – it has been a feature of markets for years, and as such there is a corresponding body of evidence of conduct malpractice in the screen-based trading environment,” the review said. “These behaviours are not new – they are known behaviours that have adapted to new media.”
Now Yallop says these are being magnified in the world of computer trading, with automation significantly speeding up biased financial transactions and allowing them to be conducted across multiple markets at once.
“Electronic trading has huge advantages for cutting costs and improving transparency. It also, however, has dangers that have yet to be recognised or properly addressed. Not enough thought has gone into protecting ourselves against those dangers,” Yallop said.
Computer programmes using artificial intelligence can also be coded, sometimes unintentionally, to set the parameters of orders and execute financial transactions based on their own market learning, which may diverge from human judgment and in turn harm clients, he added.
The use of electronic trading has increased enormously in recent years, yet some of these systems lie outside the reach of regulatory oversight.
Fixed income and foreign currency markets, for instance, which account for a daily turnover of US$15 trillion globally, are not regulated markets.
The global algorithmic trading market is expected to grow to US$18.16 billion by 2025 from US$8.79 billion in 2016, according to estimates from Ireland-based Research and Markets, potentially creating room for electronic forms of market abuse that can go undetected for long periods.
“There is a key risk to the global economy if something were to happen that would clearly be a technical trigger disrupting the credit markets. The US market is driven 80 per cent by machines,” said Christopher Wood, equity strategist at CLSA.
“In one minute it would spread through the AI, passive, pseudo passive, smart beta strategies so they call it the ‘terminator risk’.”
Athena, an algorithmic, high frequency trading firm based in New York, used complex computer programmes to manipulate the closing prices of thousands of Nasdaq-listed stocks over six months, according to the US Securities and Exchange Commission in 2014.
Swift Trade, a Canadian investment firm, engaged in a form of manipulative trading activity known as “layering” in 2015.
This caused a succession of small price movements in a wide range of shares on the London Stock Exchange from which Swift Trade was able to profit. The trading activity involved tens of thousands of orders, was repeated on many occasions and was conducted in many different shares.
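The layering pattern described above can be sketched in a deliberately toy form. Everything below – the prices, sizes and simplified order book – is invented purely for illustration and is not drawn from the Swift Trade case:

```python
# Toy illustration of "layering": a trader stacks non-bona-fide buy
# orders below the best bid to create a false impression of demand,
# sells into the resulting uptick, then cancels the layered orders.
# All figures here are hypothetical.

def best_bid(book):
    """Highest outstanding buy price in the toy order book."""
    return max(price for price, side, _ in book if side == "buy")

# Genuine resting orders: one bid at 100.00, one offer at 100.05.
book = [(100.00, "buy", 500), (100.05, "sell", 500)]

# Step 1: layer several small buy orders just below the best bid.
layered = [(round(100.00 - 0.01 * i, 2), "buy", 100) for i in range(1, 6)]
book.extend(layered)

# Step 2: other participants see apparent buying pressure; here we
# model their reaction as the best bid ticking up by one cent.
book.append((100.01, "buy", 200))

# Step 3: the manipulator sells at the inflated bid ...
sale_price = best_bid(book)

# Step 4: ... and cancels the layered orders before they can trade.
book = [order for order in book if order not in layered]

print(sale_price)
```

Repeated across many shares and tens of thousands of orders, each one-cent capture of this kind compounds into the kind of profit the regulators described.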
Last year’s “flash crash” in pound sterling, which saw the currency nosedive by 9 per cent against the dollar, was attributed to algorithmic trading and complex trading positions rather than the actions of an experienced trader. It produced extreme prices and disrupted trades needed by individuals, companies and asset managers.
Yallop was UK CEO of UBS from 2013-14. Before that he was group COO of ICAP from 2005-11, and spent 20 years at Deutsche Bank, where he was group COO from 2002-04. He has been a member of numerous financial services industry bodies.
He added last week that private sector members of the wholesale industry, such as banks and borrowers, need to take greater responsibility for identifying misconduct and delivering solutions that safeguard the interests of the financial markets.
“The sense of how to behave [in business practices] existed in the past, but has somehow now been lost,” he added.
“And recreating that sense of appropriate behaviour can only be delivered by those organisations themselves.”