A method for de-trending asset prices

Financial market prices and return indices are non-stationary time series, even in logarithmic form. This means not only that they drift, but also that their distributions change over time. The main purpose of de-trending is to mitigate the effects of non-stationarity on estimated price or return distributions. De-trending can also support the design of trading strategies. The simplest basis for estimating trends is to subtract moving averages. The key challenge is to pick an appropriate averaging window, which must be long enough to detect a trend and short enough to make the de-trended data stationary. A neat method is to pick the window based on a kurtosis criterion, i.e. choosing the window length that brings the ‘fatness of tails’ of the de-trended data closest to what it would be under a normal distribution.
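As an illustration, the window search can be sketched in a few lines. The candidate windows and random-walk data below are hypothetical; the criterion simply picks the window whose de-trended series has sample excess kurtosis closest to zero, the value expected under a normal distribution:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: approximately 0 for normally distributed data."""
    x = x - x.mean()
    return (x**4).mean() / (x**2).mean() ** 2 - 3.0

def detrend(log_prices, window):
    """Subtract a trailing moving average of the log price series."""
    ma = np.convolve(log_prices, np.ones(window) / window, mode="valid")
    return log_prices[window - 1:] - ma

def pick_window(log_prices, candidates):
    """Choose the window whose de-trended data have excess kurtosis closest to zero."""
    return min(candidates, key=lambda w: abs(excess_kurtosis(detrend(log_prices, w))))

# illustration on a synthetic random walk with drift (hypothetical parameters)
rng = np.random.default_rng(0)
log_prices = np.cumsum(0.0005 + 0.01 * rng.standard_normal(2000))
best = pick_window(log_prices, [10, 21, 63, 126, 252])
```

A trailing average is used to avoid look-ahead; a centered average would require the same care with edge effects.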


Tradable economics

Tradable economics is a technology for building systematic trading strategies based on economic data. Economic data are statistics that – unlike market prices – directly inform on economic activity. Tradable economics is not a zero-sum game. Trading profits are ultimately paid out of the economic gains from a faster and smoother alignment of market prices with economic conditions. Hence, technological advances in the field increase the value generation or “alpha” of the asset management industry overall. This suggests that the technology is highly scalable. One critical step is to make economic data applicable to systematic trading or trading support tools, which requires considerable investment in data wrangling, transformation, econometric estimation, documentation, and economic research.


FX trading strategies based on output gaps

Macroeconomic theory suggests that currencies of countries in a strong cyclical position should appreciate against those in a weak position. One metric for cyclical strength is the output gap, i.e. the production level relative to output at a sustainable operating rate. In the past, even a simple proxy of this gap, based on the manufacturing sector, seems to have provided an information advantage in FX markets. Empirical analysis suggests that [1] following the output gap in simple strategies would have turned a trading profit over the long term, and [2] the return profile would have been quite different from that of classical FX trading factors.
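A minimal sketch of such a strategy, with a hypothetical gap proxy and a naive long/short rule (a real analysis would use point-in-time manufacturing data per country):

```python
import numpy as np

def output_gap_proxy(log_ip, window=24):
    """Gap proxy: log industrial production minus its trailing moving average."""
    ma = np.convolve(log_ip, np.ones(window) / window, mode="valid")
    return log_ip[window - 1:] - ma

def cross_sectional_positions(gaps):
    """Long currencies with above-median gap, short the rest, zero net exposure."""
    signs = np.where(gaps > np.median(gaps), 1.0, -1.0)
    return signs - signs.mean()   # enforce a dollar-neutral book

# hypothetical month-end gap estimates for four currencies
gaps = np.array([0.012, -0.004, 0.020, -0.015])
pos = cross_sectional_positions(gaps)
```

The de-meaning step makes the book market-neutral even when the cross-section of gaps is lopsided.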


Crowded trades: measure and effect

One measure of the crowdedness of trades in a portfolio is centrality. Centrality is a concept of network analysis that measures how similar one institution’s portfolio is to its peers by assessing its importance as a network node. Empirical analysis suggests that [1] the centrality of individual portfolios is negatively related to future returns, [2] mutual fund holdings become more similar when volatility is high, and [3] the centrality of portfolios seems to reflect lack of information advantage. This evidence cautions against exposure to crowded trades that rely upon others’ information leadership or are motivated by widely publicized persuasive views.
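Eigenvector centrality on a portfolio-similarity network can be sketched as follows. The holdings matrix is hypothetical, and similarity is taken as the cosine between portfolio weight vectors, which is one of several reasonable choices:

```python
import numpy as np

def portfolio_centrality(holdings):
    """Eigenvector centrality of funds in a portfolio-similarity network.

    holdings: (n_funds, n_assets) matrix of portfolio weights.
    """
    unit = holdings / np.linalg.norm(holdings, axis=1, keepdims=True)
    sim = unit @ unit.T            # cosine similarity between portfolios
    np.fill_diagonal(sim, 0.0)     # no self-links in the network
    c = np.ones(len(sim))
    for _ in range(100):           # power iteration for the leading eigenvector
        c = sim @ c
        c = c / np.linalg.norm(c)
    return c

# three similar funds and one contrarian fund (invented weights)
H = np.array([[0.5, 0.5, 0.0],
              [0.6, 0.4, 0.0],
              [0.5, 0.4, 0.1],
              [0.0, 0.1, 0.9]])
centrality = portfolio_centrality(H)
```

The contrarian fund in the last row receives the lowest centrality score, i.e. it sits in the least crowded position.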


Treasury basis and dollar overshooting

Safe dollar assets, such as Treasury securities, carry significant convenience yields. Their suitability for liquidity management and collateralization means that they provide value over and above financial return. The dollar exchange rate clears the market for safe dollar-denominated assets. Hence, when the convenience value of such assets turns positive the dollar appreciates above its long-term equilibrium, similar to classical exchange rate overshooting. Changes in convenience yields are common responses to financial crises, monetary policy actions, and regulatory changes. A proxy for such fluctuations is the Treasury basis, the difference between an actual Treasury yield and the yield on a synthetic counterpart based on foreign-currency yields and FX hedges. There is empirical support for the link between the Treasury basis and the dollar exchange rate.
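Under covered interest parity, the synthetic dollar yield is the foreign yield plus the annualized forward premium of the foreign currency, so the basis can be computed as below. The numbers are hypothetical, and sign conventions differ across the literature:

```python
import math

def treasury_basis(us_yield, foreign_yield, spot, forward, tenor_years):
    """Treasury basis: actual Treasury yield minus a synthetic dollar yield
    built from a foreign-currency yield and an FX hedge.

    spot and forward are quoted as dollars per unit of foreign currency.
    """
    # synthetic (hedged) dollar yield: foreign yield plus annualized forward premium
    synthetic = foreign_yield + math.log(forward / spot) / tenor_years
    return us_yield - synthetic

# hypothetical 1-year inputs: 4% Treasury yield, 2% foreign yield,
# forward pricing the foreign currency at a 1.5% annualized premium
basis = treasury_basis(0.04, 0.02,
                       spot=1.10, forward=1.10 * math.exp(0.015),
                       tenor_years=1.0)
```

A positive basis here (50 basis points) would indicate that actual Treasuries yield more than their synthetic counterpart.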


The quantitative path to macro information efficiency

Financial markets are not information efficient with respect to macroeconomic information because data are notoriously ‘dirty’, relevant economic research is expensive, and establishing stable relations between macro data and market performance is challenging. However, statistical programming and packages have prepared the ground for great advances in macro information efficiency. The quantitative path to macro information efficiency proceeds in three stages. The first is elaborate in-depth data wrangling that turns raw macro data (and even textual information) into clean and meaningful time series whose frequency and time stamps accord with market prices. The second stage is statistical learning, be it supervised (to validate logical hypotheses) or unsupervised (to detect patterns). The third stage is realistic backtesting to verify the value of the learning process and to assess the commercial viability of a macro trading strategy.
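The first stage can be illustrated with a small point-in-time alignment routine: each market day is assigned the latest figure released on or before that day, which prevents look-ahead bias. The dates and figures are invented for the example:

```python
import bisect
from datetime import date, timedelta

def align_to_market_days(release_dates, values, market_days):
    """Point-in-time alignment: on each market day, use the most recent figure
    whose release date is on or before that day (avoids look-ahead bias)."""
    aligned = []
    for d in market_days:
        i = bisect.bisect_right(release_dates, d) - 1   # latest release <= d
        aligned.append(values[i] if i >= 0 else None)   # None before first release
    return aligned

# hypothetical monthly releases mapped onto daily market dates
releases = [date(2020, 1, 15), date(2020, 2, 14)]
figures = [1.2, 1.5]
days = [date(2020, 1, 14) + timedelta(days=k) for k in range(35)]
series = align_to_market_days(releases, figures, days)
```

Keying the alignment on release dates rather than reference periods is what makes the resulting series usable in a realistic backtest.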


Reinforcement learning and its potential for trading systems

In general, machine learning is a form of artificial intelligence that allows computers to improve the performance of a task through data, without being directly programmed. Reinforcement learning is a specialized application of (deep) machine learning that interacts with its environment and seeks to improve the way it performs a task so as to maximize its reward. The computer employs trial and error. The model designer defines the reward but gives no clues as to how to solve the problem. Reinforcement learning holds potential for trading systems because markets are highly complex and quickly changing dynamic systems. Conventional forecasting models have been notoriously inadequate. A self-adaptive approach that can learn quickly from the outcome of actions may be more suitable. A recent paper proposes a reinforcement learning algorithm for that purpose.
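The trial-and-error loop can be illustrated with tabular Q-learning on a toy mean-reverting environment. This is a stylized sketch, not the paper's algorithm; states, actions, and rewards are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy environment: state 0 = price below trend, state 1 = price above trend.
# Actions: 0 = short, 1 = flat, 2 = long. With mean reversion, the expected
# next-period return is positive in state 0 and negative in state 1.
def step(state, action):
    ret = (0.01 if state == 0 else -0.01) + 0.005 * rng.standard_normal()
    reward = (action - 1) * ret          # position in {-1, 0, +1} times return
    next_state = int(rng.integers(0, 2)) # state evolves exogenously in this toy
    return reward, next_state

Q = np.zeros((2, 3))                     # tabular action values
alpha, gamma, eps = 0.1, 0.9, 0.1
state = 0
for _ in range(20000):
    # epsilon-greedy: mostly exploit the current best action, sometimes explore
    action = int(rng.integers(0, 3)) if rng.random() < eps else int(np.argmax(Q[state]))
    reward, next_state = step(state, action)
    # Q-learning update toward reward plus discounted best next-state value
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```

With enough iterations the learned policy goes long when the price is below trend and short when above, without the rule ever being programmed explicitly.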


The low-risk effect: evidence and reason

The low-risk effect refers to the empirical finding that within an asset class higher-beta securities fail to outperform lower-beta securities. As a result, “betting against beta”, i.e. running leveraged portfolios of longs in low-risk securities versus shorts in high-risk securities, has been profitable in the past. The empirical evidence for the low-risk effect is indeed reported to be strong and consistent across asset classes and time. The effect is explained by structural inefficiencies in financial markets, such as leverage constraints for many investors, a focus on the performance of portfolios against benchmarks, institutional incentives to enhance beta and – for some investors – a preference for lottery-like securities with high upside risks.
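A stylized sketch of the “betting against beta” construction: the long leg in low-beta assets is levered up and the short leg in high-beta assets is de-levered so that each leg carries an ex-ante beta of one. The betas below are hypothetical:

```python
import numpy as np

def beta(asset_returns, market_returns):
    """OLS beta of an asset versus the market."""
    cov = np.cov(asset_returns, market_returns)
    return cov[0, 1] / cov[1, 1]

def bab_weights(betas):
    """Long low-beta assets levered to beta one, short high-beta assets
    de-levered to beta one, so the combined ex-ante market beta is zero."""
    betas = np.asarray(betas, dtype=float)
    low, high = betas < np.median(betas), betas >= np.median(betas)
    w = np.zeros_like(betas)
    w[low] = 1.0 / (betas[low].mean() * low.sum())      # levered long leg
    w[high] = -1.0 / (betas[high].mean() * high.sum())  # de-levered short leg
    return w

# hypothetical betas for four securities in one asset class
w = bab_weights([0.5, 0.7, 1.2, 1.6])
```

Because each leg is scaled to unit beta, the strategy isolates the low-risk effect rather than a directional market view.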


How to build a quantamental system for investment management

A quantamental system combines customized high-quality databases and statistical programming outlines in order to systematically investigate relations between market returns and plausible predictors. The term “quantamental” refers to a joint quantitative and fundamental approach to investing. The purpose of a quantamental system is to increase the information efficiency of investment managers, support the development of robust algorithmic trading strategies and to reduce costs of quantitative research. Its main building blocks are [1] bespoke proprietary databases of “clean” high-quality data, [2] market research outlines that analyse the features of particular types of trades, [3] factor construction outlines that calculate plausible trading factors based on theoretical reasoning, [4] factor research outlines that explore the behaviour and predictive power of these trading factors, [5] backtest outlines that investigate the commercial prospects of factor-based strategies, and [6] trade generators that calculate positions of factor-based strategies.
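The six building blocks can be wired into a single pipeline, sketched below as a hypothetical skeleton in which every function is a stub standing in for a full research outline and the data series is invented:

```python
# Hypothetical skeleton of the six building blocks; every body is a placeholder.

def load_databases():                 # [1] bespoke databases of clean data
    return {"xm_equity_carry": [0.2, -0.1, 0.3]}   # invented series

def market_research(db):              # [2] features of a type of trade
    return {"universe": sorted(db)}

def construct_factors(db):            # [3] plausible trading factors
    return {name: series for name, series in db.items()}

def research_factors(factors):        # [4] behaviour and predictive power
    return {name: sum(s) / len(s) for name, s in factors.items()}

def backtest(factors):                # [5] commercial prospects
    return {name: len(s) for name, s in factors.items()}  # placeholder stat

def generate_trades(factors):         # [6] positions of factor-based strategies
    return {name: (1 if sum(s) > 0 else -1) for name, s in factors.items()}

db = load_databases()
factors = construct_factors(db)
positions = generate_trades(factors)
```

The point of the separation is that blocks [1]-[4] are reusable research infrastructure, while [5] and [6] are strategy-specific.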


Analyzing global fixed income markets with tensors

Roughly speaking, a tensor is an array (a generalization of a matrix) of numbers that transform according to certain rules when the array’s coordinates change. Fixed-income returns across countries can be seen as residing on tensor-like multidimensional data structures. Hence, a tensor-valued approach allows identifying common factors behind international yield curves in the same way as principal components analysis identifies key factors behind a local yield curve. Estimated risk factors can be decomposed into two parallel risk domains, the maturity domain and the country domain. This achieves a significant reduction in the number of parameters required to fully describe the international investment universe.
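The idea can be sketched with a simulated yield-change tensor driven by one common time factor with separate country and maturity loading vectors. Factoring each mode unfolding separately recovers the two loading vectors with C + M parameters instead of the C × M a flattened PCA would need; the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
T, C, M = 500, 4, 6                       # time periods, countries, maturities

# synthetic yield changes: common time factor times country and maturity loadings
f = rng.standard_normal(T)                # common factor
a = np.array([1.0, 0.8, 0.6, 0.9])        # country loadings
b = np.linspace(1.0, 0.5, M)              # maturity loadings
X = np.einsum("t,c,m->tcm", f, a, b) + 0.1 * rng.standard_normal((T, C, M))

# mode unfoldings: factor each risk domain separately
Xc = X.transpose(0, 2, 1).reshape(-1, C)  # rows stacked over time x maturity
Xm = X.reshape(-1, M)                     # rows stacked over time x country
country_factors = np.linalg.svd(Xc - Xc.mean(0), full_matrices=False)[2]
maturity_factors = np.linalg.svd(Xm - Xm.mean(0), full_matrices=False)[2]
```

The leading right singular vector of each unfolding aligns (up to sign) with the true loading vector of that domain, which is the sense in which the two risk domains can be estimated in parallel.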
