This is quite the mess. I’m reading lobbyist Jeff Connaughton’s book “The Payoff: Why Wall Street Always Wins”, and he goes into great detail about his stint as chief of staff to then-Sen. Ted Kaufman (D-DE) as they tried in vain to get the SEC to address this very issue. Quite the eye opener: basically, the SEC doesn’t do a damned thing unless Wall Street lets them. Add to that the fact that these trades are so complicated, so esoteric, that even the traders don’t always understand them – let alone the regulators. It’s a casino, and brokers object to even a 50-millisecond delay. Kaufman did predict a flash crash, and the SEC did take notice – by finally agreeing to ask Wall Street for data about these trades. Good luck with that!
Why is it important? Because it means the retail trader (you) will never, ever get accurate information about the markets. It’s being manipulated by these high-speed trades and you’ll never really know.
That’s why this Chicago Fed study is important:
The Chicago Federal Reserve paper, How to Keep Markets Safe in the Era of High-Speed Trading, rattled off a laundry list of recent high-frequency trading debacles, including Knight Capital. Yet in spite of these increasingly frequent stock market disasters, even basic risk controls are not implemented. Why? The firms claim they would slow down their trading systems.
Industry and regulatory groups have articulated best practices related to risk controls, but many firms fail to implement all the recommendations or rely on other firms in the trade cycle to catch an out-of-control algorithm or erroneous trade. In part, this is because applying risk controls before the start of a trade can slow down an order, and high-speed trading firms are often under enormous pressure to route their orders to the exchange quickly so as to capture a trade at the desired price.
While the paper focuses on these events, it also contains a textbook example of really bad software engineering. The Chicago Fed found code that wasn’t even tested: changes are literally being made on the fly on live production servers, putting not just those trades but the entire system at risk.
Another area of concern is that some firms do not have stringent processes for the development, testing, and deployment of code used in their trading algorithms. For example, a few trading firms interviewed said they deploy new trading strategies quickly by tweaking old code and placing it into production in a matter of minutes.
Perhaps financial organizations should consider hiring some real engineers who know a thing or two about software design instead of doing what they are doing. No engineer worth their salt, including those specializing in advanced mathematics, would ever change algorithms on the fly on a live server handling billions of dollars.
The study also found that the financial mathematics itself, the algorithms, was often poorly designed and out of whack.
Chicago Fed staff also found that out-of-control algorithms were more common than anticipated prior to the study and that there were no clear patterns as to their cause. Two of the four clearing BDs/FCMs, two-thirds of proprietary trading firms, and every exchange interviewed had experienced one or more errant algorithms.
The report reveals astounding irresponsibility and engineering incompetence. Can you imagine someone in a nuclear facility implementing software changes on the fly? Can you imagine air traffic control developers refusing to put in safety and error checks, claiming that would slow down real-time air traffic routing?
The Chicago Fed does give some recommendations in their report:
- Limits on the number of orders that can be sent to an exchange within a specified period of time
- A “kill switch” that could stop trading at one or more levels
- Intraday position limits that set the maximum position a firm can take during one day
- Profit-and-loss limits that restrict the dollar value that can be lost
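None of these controls is exotic. As a rough sketch of what the Chicago Fed is asking for, here is a minimal pre-trade risk gate in Python implementing all four: an order-rate limit, a kill switch, an intraday position limit, and a daily loss limit. The class name, thresholds, and method names are all illustrative assumptions, not taken from any real trading system or from the paper itself.

```python
import time
from collections import deque

class PreTradeRiskGate:
    """Illustrative sketch of the four risk controls the Chicago Fed recommends.

    All names and thresholds here are hypothetical, chosen for the example.
    """

    def __init__(self, max_orders_per_sec=100, max_position=10_000,
                 max_daily_loss=1_000_000.0):
        self.max_orders_per_sec = max_orders_per_sec
        self.max_position = max_position          # intraday position limit
        self.max_daily_loss = max_daily_loss      # profit-and-loss limit
        self.order_times = deque()                # timestamps of recent orders
        self.position = 0                         # signed net position today
        self.realized_pnl = 0.0
        self.killed = False                       # the "kill switch" state

    def kill(self):
        """Kill switch: block all further orders immediately."""
        self.killed = True

    def record_fill(self, qty, pnl_delta):
        """Update position and P&L after a fill; trip the kill switch on a loss breach."""
        self.position += qty
        self.realized_pnl += pnl_delta
        if self.realized_pnl <= -self.max_daily_loss:
            self.kill()

    def allow(self, qty, now=None):
        """Return True only if an order for qty passes every control."""
        if self.killed:
            return False
        now = time.monotonic() if now is None else now
        # Order-rate limit: keep only timestamps from the last second.
        while self.order_times and now - self.order_times[0] > 1.0:
            self.order_times.popleft()
        if len(self.order_times) >= self.max_orders_per_sec:
            return False
        # Intraday position limit, checked on the would-be post-trade position.
        if abs(self.position + qty) > self.max_position:
            return False
        self.order_times.append(now)
        return True
```

Each check is a handful of integer comparisons, a matter of microseconds on modern hardware. That is the delay the industry says it cannot afford.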