In a number of recent articles I've outlined proposed solutions to some of the problems created by ultra-high-speed HFT. Here is a summary of them all in one place. Unfortunately we cannot solve all problems at once - the ones I am trying to address here are:
The solutions have a simple theme: execution venues should allow liquidity providers to have more control over their orders. In this vein, the exchanges would implement:
The execution venues would use a TimeMatch-based order book (instead of the current price-time priority model), allowing the liquidity provider to choose the time-frame over which a short auction will operate every time a trade is matched.
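To make this concrete, here is a toy sketch (in Python) of how such a matching step might work. To be clear, the mechanics, names and numbers below are my own illustration rather than a specification: the idea is simply that when an incoming order crosses a resting order, the venue holds the trade open for the resting provider's chosen window, and then matches on price rather than on who arrived first within that window.

```python
from dataclasses import dataclass

@dataclass
class RestingOrder:
    side: str          # 'buy' or 'sell'
    price: float
    size: int
    auction_ms: int    # provider-chosen auction window - the TimeMatch parameter

def run_time_match(resting, contras):
    """Illustrative TimeMatch step: 'contras' is the contra interest seen while the
    auction window is open, as (arrival_ms, price, size) tuples. Anything arriving
    inside the window competes on price; arrival order within the window is ignored."""
    if resting.side == 'sell':
        eligible = [c for c in contras
                    if c[0] <= resting.auction_ms and c[1] >= resting.price]
        eligible.sort(key=lambda c: c[1], reverse=True)   # highest bids first
    else:
        eligible = [c for c in contras
                    if c[0] <= resting.auction_ms and c[1] <= resting.price]
        eligible.sort(key=lambda c: c[1])                 # lowest offers first
    fills, remaining = [], resting.size
    for _arrival_ms, price, size in eligible:
        if remaining == 0:
            break
        take = min(remaining, size)
        fills.append((price, take))
        remaining -= take
    return fills

resting = RestingOrder('sell', 10.00, 5_000, auction_ms=50)
contras = [(1, 10.00, 1_000),    # arrives 1 ms into the window
           (30, 10.02, 3_000),   # arrives later but at a better price - wins anyway
           (80, 10.05, 5_000)]   # arrives after the 50 ms window closes - ignored
print(run_time_match(resting, contras))   # [(10.02, 3000), (10.0, 1000)]
```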
The liquidity provider can also specify the minimum size in which each parcel of their order can execute - e.g. this would prevent 1-share orders from executing against them. Whilst this is beneficial to the retail trader, who would otherwise get stuck with an (expensive) single-share fill, it also prevents algorithms from gaining insight into what others are doing.
Note that this allows 'all-or-nothing' orders, by specifying the minimum execution size as the size of the entire order.
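A quick toy example of the minimum-execution-size check itself (the function and numbers are made up purely to illustrate the rule):

```python
def executable_quantity(resting_size, min_exec_size, incoming_size):
    """A resting order only trades if the incoming order can fill at least
    min_exec_size of it; setting min_exec_size equal to the full resting size
    turns the order into an all-or-nothing order."""
    fill = min(resting_size, incoming_size)
    return fill if fill >= min_exec_size else 0

print(executable_quantity(10_000, 500, 1))         # 0     - a 1-share 'ping' cannot execute
print(executable_quantity(10_000, 500, 2_000))     # 2000  - big enough parcel, trades normally
print(executable_quantity(10_000, 10_000, 6_000))  # 0     - all-or-nothing, partial fill refused
print(executable_quantity(10_000, 10_000, 12_000)) # 10000 - all-or-nothing, fully filled
```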
A lot of the asymmetry in algorithmic tools lies in the fact that achieving low latency is a capital-intensive exercise. Constant refinement, upgrades and improvements are required to keep an algorithm running at top speed - since all your competitors are constantly improving as well. At any given time there are always a select few in the market with the best implementation of a particular type of algorithm. By 'best' I really mean fastest - so we are assuming these algorithms are fairly simple and stupid beasts that largely compete on their speed. Note that this competition benefits no one but the firms themselves, so there is no commensurate improvement in market quality as a result of this battle.
Allow lit and dark liquidity to co-exist - don't banish the dark stuff to the dark pools! This is very controversial, and I need to do some more writing on it - however the liquidity provider is king. If he doesn't get what he wants, he goes somewhere else (read: fragmentation). He can get darkness elsewhere, just at a further cost to the market, so give it to him in the central market. More on this later..
One way to 'level this playing field' is to provide a perfect (from a speed point of view) implementation to everyone, on an equal-access basis. This can be achieved by implementing algorithms within the exchange, and allowing trading firms to place these order types - which are then run on the exchange's computers. This is not co-location, and it does not require co-location by the firm. It is simply a matter of a firm, instead of specifying a simple limit or market order, sending an algorithmic order. The exchange's computers then run the algorithm, creating whatever limit and/or market orders result from the particular strategy.
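To illustrate (and this is only a sketch - the order type, field names and strategy below are invented for the example), an algorithmic order could be little more than a strategy name plus its parameters, which the exchange's engine then expands into ordinary child limit orders:

```python
from dataclasses import dataclass

@dataclass
class AlgoOrder:
    """A hypothetical exchange-hosted algorithmic order: the firm submits the
    strategy name and parameters; the exchange's servers generate the child orders."""
    symbol: str
    side: str            # 'buy' or 'sell'
    total_size: int
    strategy: str        # e.g. 'TWAP', 'ICEBERG', 'STOP'
    params: dict

def expand_twap(order):
    """One possible exchange-side expansion of a TWAP-style order: split the parent
    evenly across a number of intervals, one child limit order per interval."""
    slices = order.params.get("slices", 10)
    limit = order.params["limit_price"]
    base, extra = divmod(order.total_size, slices)
    children = []
    for i in range(slices):
        child_size = base + (1 if i < extra else 0)
        if child_size:
            children.append((i, child_size, limit))   # (interval, size, price)
    return children

parent = AlgoOrder("XYZ", "buy", 50_000, "TWAP", {"slices": 8, "limit_price": 10.50})
print(expand_twap(parent))   # eight child orders of 6,250 shares at 10.50
```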
This capability would have three key impacts:
The sorts of basic algorithms that could be useful:
Of course all these algorithms would need to come with a suite of features to prevent them from being 'sniffed out' by other algorithms, since their capabilities would be widely known. Features such as time/volume randomisation and thresholds would allow them to avoid detection, by ensuring they behave slightly differently in every instance. All algorithms would be dark - obviously you don't want information leaking about stop orders, or about which underlying strategies are being used during an execution.
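As a rough illustration of the randomisation idea (the parameters and numbers are invented), the exchange-side scheduler could jitter both the slice sizes and the gaps between them, so no two runs of the same strategy look alike:

```python
import random

def randomised_schedule(total_size, horizon_secs, avg_slice):
    """Return a list of (delay_secs, slice_size) pairs: slice sizes and the gaps
    between them are randomised around their averages, so the child-order pattern
    is different in every instance and harder for other algorithms to fingerprint."""
    avg_gap = horizon_secs * avg_slice / total_size
    schedule, remaining, elapsed = [], total_size, 0.0
    while remaining > 0 and elapsed < horizon_secs:
        slice_size = min(remaining, max(1, int(random.uniform(0.5, 1.5) * avg_slice)))
        delay = random.uniform(0.5, 1.5) * avg_gap
        schedule.append((round(delay, 2), slice_size))
        remaining -= slice_size
        elapsed += delay
    if remaining > 0:
        schedule.append((0.0, remaining))   # flush whatever is left at the horizon
    return schedule

# e.g. work 50,000 shares over 5 minutes in roughly 2,000-share slices, with jitter
print(randomised_schedule(50_000, 300, 2_000))
```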
Extra effort would be needed (over and above what exchanges usually provide) to also make these algorithms very safe. Given that they never need to compete on speed (they operate within the exchange's computers), this is a very good environment in which to implement a lot of intelligence to prevent the kind of non-linear behaviour that currently causes massive, nonsensical spikes in the market.
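For example (purely illustrative thresholds), every child order a hosted algorithm emits could be passed through simple exchange-side sanity checks - a price band and an order-rate throttle - before it ever reaches the book:

```python
def within_price_band(child_price, reference_price, max_deviation=0.10):
    """Reject any child order more than max_deviation (here 10%) away from a
    reference price, so a runaway strategy cannot print nonsensical levels."""
    lower = reference_price * (1 - max_deviation)
    upper = reference_price * (1 + max_deviation)
    return lower <= child_price <= upper

def within_rate_limit(orders_sent_last_second, max_per_second=50):
    """Pause the algorithm, not the market, if it starts spraying orders."""
    return orders_sent_last_second < max_per_second

print(within_price_band(10.40, 10.00))   # True  - inside the 10% band
print(within_price_band(14.00, 10.00))   # False - rejected before it hits the book
```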
Would it be totally crazy to suggest that firms be able to create their own algorithms to run on the exchange's servers? Now, there are many areas of concern here, but stick with me!
One impact this would have (if implemented correctly) would be full diversity of algorithmic activity, ensuring a good spread of behaviour in the market - which leads to lower volatility. It would also give all participants equal access to algorithmic tools, meaning no technology war to be the fastest. Every firm would now be 'the fastest', and would compete on the intelligence of its algorithms rather than on outright speed. Speed is dangerous because it leads to poor and fragile behaviour - since safety and good strategy cost performance.
For those worrying about algos seriously screwing with the markets in this scenario, well, what can I say but that's a valid concern! But it may be possible to manage the risks. The algorithms would obviously be heavily sandboxed and prevented from impacting the performance of the exchange itself beyond reasonable limits. I don't think it's impossible to implement this in a workable fashion - but your quality control would need to be extremely high. The up-side is that because it's fully visible to the exchange, you actually have the ability to control the process, rather than trusting all the "Knight Capital"-esque firms to do their software deployments correctly.
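A very rough sketch of what that sandboxing could look like (the class, limits and interface are all hypothetical): the firm's algorithm only ever talks to a restricted API, and the sandbox enforces hard budgets before anything reaches the matching engine:

```python
import time

class SandboxViolation(Exception):
    """Raised when a hosted algorithm exceeds its resource budget."""

class AlgoSandbox:
    """Hypothetical sandbox for a firm-supplied algorithm running on exchange
    hardware: it exposes only a restricted order-entry interface and enforces
    hard limits on order rate and total run time."""
    def __init__(self, max_orders_per_sec=20, max_runtime_secs=3_600):
        self.max_orders_per_sec = max_orders_per_sec
        self.max_runtime_secs = max_runtime_secs
        self.start = time.monotonic()
        self.window_start = self.start
        self.orders_in_window = 0

    def place_order(self, side, price, size):
        now = time.monotonic()
        if now - self.start > self.max_runtime_secs:
            raise SandboxViolation("runtime budget exhausted")
        if now - self.window_start >= 1.0:
            self.window_start, self.orders_in_window = now, 0
        self.orders_in_window += 1
        if self.orders_in_window > self.max_orders_per_sec:
            raise SandboxViolation("order rate limit exceeded")
        # A real venue would now hand the order to the matching engine; here we
        # simply record that it passed the sandbox checks.
        return {"side": side, "price": price, "size": size, "status": "accepted"}
```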
[to be continued..]