Introduction to Algorithmic Trading
Financial institutions have been rapidly increasing their use of digital technology since the 1970s. Competitive markets have forced them to adopt the latest technology in many operations that used to be manual. Trading is one of them: the good old ‘gut feeling’ is becoming less and less useful, giving way to clusters of CPUs and GPUs, ‘Big Data’ and so-called ‘Algorithmic Trading’.
What is Algorithmic Trading?
Algorithmic trading is the use of electronic platforms to enter trading orders via an algorithm that executes pre-programmed trading instructions, whose variables may include the timing, price or quantity of the order. Trades are executed with marginal or no human intervention; the algorithms require investors to first specify their goals in terms of mathematical variables.
The overall trend over the past few decades has been towards quantitative and algorithmic approaches to investing, and there seems to be no reversal on the horizon. Algorithmic trading already plays a major role in the financial markets, and there is a good chance it will play an even bigger one in the future.
Depending on the investor’s needs, customized algorithms range from simple to highly sophisticated; computers implement trades by following exactly the prescribed instructions. Algorithmic trading has become possible thanks to the fully electronic infrastructure of stock trading systems. The term ‘black box trading’ describes very advanced systems operating with minimal or no human intervention.
Pros and Cons of Algorithmic Trading
Algorithmic trading has several advantages over trading carried out by a human trader. Computer systems have a much shorter reaction time and reach a higher level of reliability. The decisions made by a computer system rely on the underlying strategy with a specified set of rules, which makes those decisions reproducible. Consequently, back-testing and improving the strategy by varying the rules become possible. Algorithmic trading also ensures objectivity in trading decisions and is not exposed to subjective influences. When trading many different securities at the same time, one computer system can substitute for many human traders. Observing and trading a large variety of securities becomes possible without employing hundreds of traders.
Nevertheless, automated trading still requires monitoring. Altogether, it results in better performance of the investment strategy as well as lower trading costs.
It is challenging to automate the whole process from investment decision to execution, and there is always a trade-off between system stability and robustness on the one hand and complexity on the other. The less complex the system, the more solid its operation; on the other hand, lack of complexity may lead to lower profitability.
Algorithmic trading has often been linked to increasing volatility in the markets: according to some, algorithms may produce snowball effects and so-called ‘flash crashes’. Another aspect of the discussion about algorithmic trading is its effect on liquidity. These trading strategies may improve liquidity if they act as market makers, but if they create extra imbalance the effect may be just the opposite. There are scientific findings indicating that algorithmic trading improves liquidity and enhances the informativeness of quotes. It helps reduce transaction costs and risk, improves entry speed, increases trade control and reduces the bid/ask spread.
Automated Trading System
The concept of an automated trading system refers to a computer trading program that automatically submits trades to a given exchange. Depending on the trading frequency, the speed of data may significantly influence whether the trading system is profitable or not. An automated trading system consists of many layers, all of which must be taken into consideration. First of all, the trading system must have a reliable market data feed. This data is fed to algorithms to determine whether current market conditions match those that, according to a back-test, would likely have been profitable trading opportunities (Alpha Model). If the back-test supports the market conditions, risk management is taken into account: does this trade fit the current portfolio’s risk profile, and what effect does it have on the portfolio’s total market exposure (Risk Model)? Assuming the potential trade ticks the two boxes above, it can be executed on the market. Before executing an order, the strategy should also specify when the trade should be exited if the market turns against it, but also when to close a profitable trade (Transaction Cost Model).
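The layered decision flow described above can be sketched in a few lines of Python. This is only an illustrative skeleton under assumed names (`Signal`, `run_cycle` and the three callables are hypothetical, not part of any real trading library): one cycle takes market data, asks the alpha model for an opportunity, lets the risk model veto it, and only then hands the signal to execution.

```python
class Signal:
    """A hypothetical trade signal produced by the Alpha Model."""
    def __init__(self, symbol, direction, quantity):
        self.symbol = symbol
        self.direction = direction
        self.quantity = quantity

def run_cycle(market_data, alpha_model, risk_model, execute):
    """One decision cycle: data feed -> Alpha Model -> Risk Model -> execution."""
    signal = alpha_model(market_data)   # Alpha Model: is there a likely-profitable setup?
    if signal is None:
        return None                     # no opportunity found in current conditions
    if not risk_model(signal):
        return None                     # rejected: does not fit the portfolio's risk profile
    return execute(signal)              # execution layer sends the order (with exit rules)

# Toy components, purely for illustration:
alpha = lambda data: Signal("XYZ", "buy", 100) if data["price"] < 10 else None
risk = lambda sig: sig.quantity <= 500
execute = lambda sig: f"order sent: {sig.direction} {sig.quantity} {sig.symbol}"
```

In a real system each layer would be a substantial component (feed handlers, position limits, order routing); the point here is only that each layer can block the trade before it reaches the market.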
The result of a strategy will be the returns of those trades over time, along with performance measures. Just to scratch the surface, there are two main kinds of strategies in quantitative trading: mean-reversion and momentum. If the price of an asset is relatively low or high with respect to a reference, for example the average historical price of a share, and it is expected to come back to that reference, strategies exploiting this situation are called mean-reversion strategies. For example, it is possible to build a strategy around two stocks that are highly correlated over time. If their prices move in opposite directions (making the difference between them greater), one buys one stock and sells the other for the same amount of money, expecting that movement to reverse. Momentum strategies, in contrast, follow a trend in a certain direction.
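A common way to turn the two-stock idea above into a concrete rule is to standardize the spread between the pair and trade when it deviates unusually far from its mean. The sketch below assumes this z-score formulation (the function name and the ±2 entry threshold are illustrative choices, not from the text):

```python
import statistics

def zscore_signal(spread_history, entry_z=2.0):
    """Mean-reversion signal on a pair's price spread.
    Returns -1 (sell the spread), +1 (buy the spread) or 0 (no trade)
    based on how far the latest spread sits from its historical mean."""
    mean = statistics.fmean(spread_history)
    std = statistics.pstdev(spread_history)
    if std == 0:
        return 0                      # flat history: no meaningful deviation
    z = (spread_history[-1] - mean) / std
    if z > entry_z:
        return -1                     # spread unusually wide: sell A, buy B
    if z < -entry_z:
        return 1                      # spread unusually narrow: buy A, sell B
    return 0
```

A momentum rule would do the opposite: enter in the direction of the recent move rather than against it.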
Pitfalls to avoid
In the strategy research process there are numerous pitfalls, leading to strategies that perform well on historical data but are useless with real-time prices.
The main issues may be:
- Stock splits and dividends that are not properly adjusted for
- Missing data and abnormal differences between periods
- Survivorship bias: testing only companies that are still ‘alive’
- Look-ahead bias: using data that had not yet been generated during the period analyzed
- Overfitting: tuning parameters to fit the historical data
To begin with, data sources can contain errors. Because dividends are discounted in the stock price, past data is usually adjusted so that it does not show big price drops when, in reality, the money has simply moved to the investor’s account. Prices also have to be adjusted when the company decides to split a share into multiple shares. Reliable data comes at a cost paid to the provider, which must be discounted from the final result of the strategy. Another important associated cost, and a typical pitfall when not included in the strategy result, is the transaction cost: the price paid to the broker to send orders to the market. For this reason, strategies with fewer trades and larger returns per trade are better at avoiding this cost. Survivorship bias arises when strategy research does not include companies that went bankrupt. Another important pitfall is overfitting: the strategy performs well only on the historical data used for tuning but underperforms in live conditions. To avoid this bias, the strategy has to be tested on additional, ‘out-of-sample’ data before going live. Last but not least, look-ahead bias is using data that was not available in the test period, for example a financial ratio that is only published at the end of the year.
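The out-of-sample discipline mentioned above comes down to one rule for time series: split the data chronologically, tune only on the earlier part, and validate on the later part. A minimal sketch (function name and the default split fraction are my own choices for illustration):

```python
def train_test_split_series(prices, test_fraction=0.3):
    """Chronological split for back-testing: tune strategy parameters on the
    earlier 'train' slice only, then validate on the unseen later slice.
    Never shuffle a time series -- shuffling leaks future information
    into the tuning step (a form of look-ahead bias)."""
    test_size = int(len(prices) * test_fraction)
    cut = len(prices) - test_size
    return prices[:cut], prices[cut:]

# Example: 20 daily prices, last 25% held out for validation.
train, test = train_test_split_series(list(range(20)), test_fraction=0.25)
```

If the tuned strategy's performance collapses on the held-out slice, that is a strong hint the parameters were overfitted to the training history.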
Traders use a variety of ratios to measure and compare the performance of different strategies. This process is present throughout the cycle of testing and tuning, to ensure the performance will still be good when the strategy executes real trades.
Most often, the media or investment fund reports show returns to tell how an investment has performed, but returns alone say little about the characteristics and risk of the investment.
Things like long periods of negative returns, or high returns followed by big drops, are not always expected by investors. To take that into consideration, a few measures have been developed:
Sharpe Ratio – measures the excess return of the portfolio (above the risk-free rate) relative to the portfolio’s volatility. The ratio measures just the alpha component of the total return. It is important to note that the Sharpe Ratio does not distinguish between positive and negative returns in the volatility.
Sortino Ratio – a modification of the Sharpe Ratio in which only the downside volatility penalizes the ratio.
Drawdown measures – like ‘max drawdown’, the maximum percentage decrease in capital encountered in the period, or ‘max drawdown period’, the longest period in which the strategy was losing money.
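The three measures above translate directly into code. A minimal sketch, using population volatility and a risk-free rate of zero by default (per-period, unannualized; real implementations usually annualize and may use sample deviation instead):

```python
import math

def sharpe_ratio(returns, risk_free=0.0):
    """Mean excess return divided by the volatility of ALL excess returns."""
    excess = [r - risk_free for r in returns]
    mean = sum(excess) / len(excess)
    var = sum((r - mean) ** 2 for r in excess) / len(excess)
    return mean / math.sqrt(var) if var > 0 else float("inf")

def sortino_ratio(returns, risk_free=0.0):
    """Like Sharpe, but only downside deviations penalize the ratio."""
    excess = [r - risk_free for r in returns]
    mean = sum(excess) / len(excess)
    downside_var = sum(min(r, 0.0) ** 2 for r in excess) / len(excess)
    dd = math.sqrt(downside_var)
    return mean / dd if dd > 0 else float("inf")

def max_drawdown(equity_curve):
    """Largest peak-to-trough percentage drop in the capital curve."""
    peak, worst = equity_curve[0], 0.0
    for value in equity_curve:
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst
```

For example, an equity curve of 100 → 120 → 90 → 110 has a max drawdown of 25% (the fall from the 120 peak to the 90 trough), even though the investment ends up above its starting value.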
Understanding the concept of risk is crucial for creating a good risk management strategy. Risk can be defined as the uncertainty of outcomes. In the financial literature, risk is the likelihood of losses resulting from unexpected events related to movements in the market. Extreme events (like crashes) may have a low probability of occurring, yet cause a big loss. These low-probability events are all the more intractable because they are usually hard to anticipate. Algorithmic trading is exposed to different types of risk, which can be broadly categorized into two main types: market risk (related to market behavior) and operational risk (e.g. infrastructure failure).
As financial markets become more competitive, financial institutions and private investors have started to turn to automated trading to gain a competitive advantage. With the ability to communicate electronically with exchanges and other electronic trading venues, it is now possible to construct automated trading systems that analyze changing market data and place orders when certain criteria are met. These systems can be customized to execute almost any trading strategy. Different people might use algorithmic trading systems to achieve different objectives, although most systems contain the same basic layers:
- Trading signal generation
- Risk management
- Automated trade execution