Tuesday, July 5, 2011

The evolutionary arms race in the realm of HFT algos

The history of life is littered with evidence of evolutionary arms races, in which one organism (or group of organisms) develops some advantage over its competitors, predators, or prey, an advantage that is soon countered by the disadvantaged group. The dance has continued from the earliest days of life until the present, and is presumed to be ongoing. Indeed, it is one of the central selective pressures driving evolution--by eliminating the losers of the race.

Not being an evolutionary biologist, I was thinking in particular of asymmetric races, in which the competing organisms adopt different methods, rather than of symmetric races.

My interest in such things stems from having a son (and many, many relatives) with G6PD deficiency, a relatively common enzyme deficiency--selected for in humans most likely because it confers some resistance to malaria. The chief drawback of the condition is that exposure to certain foods (and medications) can trigger massive destruction of red blood cells.

How does such a condition appear? Like most genetic conditions, it is most likely a random mutation that hangs around because it is selected for in malarial environments.

Plants have developed toxins over evolutionary history, and one such class of toxins causes destruction of red blood cells. Mammals (among other animals) have developed enzymes that break down these toxins, and the breakdown products are now beneficial. In fact we call these toxins "antioxidants".

Ironically, the enzyme G6PD apparently plays a role in the life cycle of the malaria parasite, as those who have the deficiency and who become infected with malaria typically carry lower parasite loads.

In the digital realm, the concept of the evolutionary arms race has been around since about 1980, and such races are most commonly observed in the ongoing battle between computer viruses and antivirus products.

They also appear in the realm of high-frequency trading (HFT), although since no one is eager to publish manuals on how their specific systems work, it is a little harder to see how.

Years ago we used to see stops getting busted on in-demand shares--which we soon recognized as a sign that the targeted stocks would soon see gains. It always happened during a quiet trading time, usually after about 11 in the morning: suddenly all the bids would be hit until a massive stop-loss was triggered and picked up. I remember in 2002 seeing CDE-N knocked down 30% in a matter of minutes, followed by a massive pick-up of some sucker's stop-loss, followed by furious action as the price bounced back. That passed for HFT in those days.

One of the modern approaches to HFT is latency arbitrage, whereby some entities are able to see more up-to-date buy and sell orders than the general public sees, and use this information either to scoop up the market with an arbitraged advantage or to withdraw orders only to replace them moments later at a higher price. For instance, you may be trying to buy shares in company ABC across several markets. Because each market has a different time delay, your order reaches one exchange first; the moment it appears there, the remaining offers at your buy price on the other exchanges are cancelled and resubmitted at a higher price.
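To make the mechanics concrete, here is a toy sketch in Python of quote fading under unequal venue latencies. The venue names, latencies, and the fast trader's reaction time are invented for illustration; this is not a description of any real firm's system.

# Toy sketch of latency arbitrage via quote fading; all names and numbers are invented.

VENUES = {
    # venue: one-way latency (ms) from the buyer's order router to the venue
    "EXCH_A": 2.0,
    "EXCH_B": 5.0,
    "EXCH_C": 9.0,
}

# Lump the fast trader's market-data and reaction delays into a single figure.
FAST_TRADER_REACTION_MS = 1.0

def naive_route(sent_at_ms=0.0):
    """Send the same buy order to every venue at once; report which quotes survive."""
    arrivals = {venue: sent_at_ms + lat for venue, lat in VENUES.items()}
    # The fast trader sees the fill at the fastest venue and pulls/reprices
    # its resting offers everywhere else shortly afterwards.
    fade_time = min(arrivals.values()) + FAST_TRADER_REACTION_MS
    return {
        venue: "filled at the quoted price" if t <= fade_time else "quote pulled, repriced higher"
        for venue, t in arrivals.items()
    }

if __name__ == "__main__":
    for venue, outcome in naive_route().items():
        print(f"{venue}: {outcome}")

In this toy setup only the fastest venue fills at the displayed price; the slower venues have already repriced by the time the order gets there.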

Recently, RBC announced a new program called "Thor", meant to combat latency arbitrage. The idea was that RBC would monitor the latency for all markets and use that info to ensure their orders arrive on all markets simultaneously.
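RBC hasn't published Thor's internals, but the basic idea of latency-equalized routing can be sketched as follows. The venue names, latencies, and function names are hypothetical: each send is simply held back so that, given the measured latencies, every order should land at roughly the same instant.

# Sketch of latency-equalized routing (the idea behind a Thor-style router).
# Venue names and one-way latencies (in milliseconds) are assumed values.

VENUES = {
    "EXCH_A": 2.0,
    "EXCH_B": 5.0,
    "EXCH_C": 9.0,
}

def schedule_sends(venue_latency_ms):
    """Hold back the sends to faster venues so every order arrives together.

    The slowest venue is sent to immediately; each faster venue is delayed
    by the difference between its latency and the slowest one's.
    """
    slowest = max(venue_latency_ms.values())
    return {venue: slowest - lat for venue, lat in venue_latency_ms.items()}

if __name__ == "__main__":
    for venue, hold_ms in sorted(schedule_sends(VENUES).items(), key=lambda kv: kv[1]):
        arrival = hold_ms + VENUES[venue]
        print(f"{venue}: hold {hold_ms:.1f} ms, arrives at t={arrival:.1f} ms")

With fixed, accurately measured latencies, every order in this sketch lands at the same instant, so there is no window in which to cancel and reprice the far-side quotes.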

Well, what's an HF arbitrageur to do? Why not try quote stuffing? A flood of quotes on a single stock can slow down the reporting on an entire channel, so it can be used to vary the latencies by random amounts, making it more difficult for a program like Thor to work. If the latency to each market starts varying randomly, choosing the appropriate lags for Thor becomes impossible.
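Continuing the same toy model, here is a rough illustration of why: if each venue's latency wanders randomly around its measured average (the jitter figure is invented), the scheduled sends no longer arrive together, and the arrival spread reopens the window for quotes to be pulled.

import random

# Toy illustration: random latency jitter (say, from quote stuffing) defeats a
# schedule built from average latencies. All figures are invented.

AVG_LATENCY_MS = {"EXCH_A": 2.0, "EXCH_B": 5.0, "EXCH_C": 9.0}
JITTER_MS = 4.0  # each venue's latency now wanders by up to +/- 4 ms

def schedule_sends(venue_latency_ms):
    """Same latency-equalizing schedule as above, built from the averages."""
    slowest = max(venue_latency_ms.values())
    return {venue: slowest - lat for venue, lat in venue_latency_ms.items()}

def simulate_arrivals(seed=1):
    random.seed(seed)
    holds = schedule_sends(AVG_LATENCY_MS)
    arrivals = {}
    for venue, hold_ms in holds.items():
        actual_latency = max(AVG_LATENCY_MS[venue] + random.uniform(-JITTER_MS, JITTER_MS), 0.0)
        arrivals[venue] = hold_ms + actual_latency
    return arrivals

if __name__ == "__main__":
    arrivals = simulate_arrivals()
    for venue, t in sorted(arrivals.items(), key=lambda kv: kv[1]):
        print(f"{venue}: arrives at t={t:.1f} ms")
    spread = max(arrivals.values()) - min(arrivals.values())
    print(f"arrival spread: {spread:.1f} ms -- plenty of time to pull the remaining quotes")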
