A Simplified Approach to Understanding the Kalman Filter Technique

31
The Kalman Filter is a time series estimation algorithm that has long been applied extensively in engineering and, more recently, in finance and economics. However, presentations of the technique are often intimidating despite the relative simplicity of the underlying algorithm. This paper presents the Kalman Filter in a simplified manner and works through an example application of the algorithm in Excel. This scaled-down version of the Kalman filter can be introduced in the (advanced) undergraduate classroom as well as the graduate classroom.
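The paper's worked example is in Excel, but the same scaled-down filter can be sketched in a few lines of code. This is an illustrative scalar version for a random-walk state observed with noise; the parameter values (process variance q, measurement variance r, initial guesses) are arbitrary choices, not the paper's.

```python
def kalman_filter(observations, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed with noise.
    q: process-noise variance, r: measurement-noise variance (illustrative)."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        # Predict: the random-walk mean carries over; uncertainty grows by q.
        p = p + q
        # Update: the Kalman gain blends the prediction with the observation.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

smoothed = kalman_filter([1.2, 0.9, 1.1, 1.0, 1.3])
```

The two-step predict/update loop is the entire algorithm in the scalar case, which is what makes a spreadsheet implementation feasible.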


Factors on Demand: Building a Platform for Portfolio Managers, Risk Managers and Traders

32
We introduce "factors on demand", a modular, multi-asset-class return decomposition framework that extends beyond the standard systematic-plus-idiosyncratic approach. This framework, which rests on the conditional link between flexible bottom-up estimation factor models and flexible top-down attribution factor models, attains higher explanatory power, empirical accuracy and theoretical consistency than standard approaches.

We explore applications stemming from factors on demand:
- The joint use of a statistical model with non-idiosyncratic residual for return estimation and a cross-sectional model for return attribution
- The optimal hedge of a portfolio of options, even when the investment horizon is close to the expiry and thus the securities are heavily non-linear
- The "on demand" feature of FoD to extract a parsimonious set of dominant attribution factors/hedges that change dynamically over time
- Accommodating global and regional models in the same platform so that they give rise to the same, consistent risk numbers
- Point-in-time style analysis, as opposed to the standard trailing regression
- Risk attribution to select target portfolios to track the effect of incremental alpha signals on the allocation process

Fully commented code supporting the above case studies is available at MATLAB Central File Exchange under the author's page.
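The author's supporting code is in MATLAB; as an independent toy illustration (not that code), one building block such a platform combines is a cross-sectional attribution regression that maps one period of asset returns onto factor exposures. Exposures and returns below are made up.

```python
import numpy as np

# 4 assets, 2 factors (e.g. a market and a style exposure; illustrative).
B = np.array([[1.0, 0.2],
              [1.0, -0.1],
              [1.0, 0.5],
              [1.0, 0.0]])
r = np.array([0.010, 0.004, 0.015, 0.006])

# Cross-sectional OLS: factor returns f minimize ||r - B f||.
f, *_ = np.linalg.lstsq(B, r, rcond=None)
residual = r - B @ f          # idiosyncratic part of this period's returns

# Attribution for a portfolio w: factor contributions plus residual.
w = np.array([0.25, 0.25, 0.25, 0.25])
factor_contrib = (w @ B) * f
total = factor_contrib.sum() + w @ residual   # equals w @ r by construction
```

The identity in the last line (factor contributions plus residual recover the portfolio return exactly) is what makes such a decomposition usable for attribution.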

Can the VIX Signal Market's Direction? An Asymmetric Dynamic Strategy

33
The article shows statistically that the VIX implied volatility index is an important driver of future S&P 500 returns. The statistical analysis is performed by means of a regression based on dummy variables in order to circumvent the difficulties posed by the lack of linearity between the variables. The results are then used to construct an automated procedure that signals daily whether to invest in the S&P 500 or to stay out of the market. Finally, we test the quality of the signal by implementing an asymmetric buy-and-hold strategy on the S&P 500 with a three-month horizon. Our results show that the strategy outperforms a long-only strategy on the same index, thus confirming a widespread belief among traders.
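A dummy-variable regression of the kind described reduces, in its simplest single-dummy form, to comparing conditional mean returns across VIX regimes. A minimal sketch with a made-up threshold and toy data (not the article's specification):

```python
def dummy_regression(vix, fwd_returns, threshold=25.0):
    """OLS of forward returns on [1, D], where D = 1 when VIX > threshold.
    With a single dummy, the slope equals the difference in conditional means."""
    high = [r for v, r in zip(vix, fwd_returns) if v > threshold]
    low = [r for v, r in zip(vix, fwd_returns) if v <= threshold]
    alpha = sum(low) / len(low)            # mean forward return, low-VIX days
    beta = sum(high) / len(high) - alpha   # dummy coefficient
    return alpha, beta

vix = [15, 32, 18, 40, 22, 35]
fwd = [0.01, 0.04, 0.00, 0.05, 0.01, 0.03]
alpha, beta = dummy_regression(vix, fwd)
signal = "invest" if beta > 0 else "stay out"
```

The dummy formulation sidesteps any assumption of a linear VIX-return relationship, which is the motivation the abstract gives for it.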

Which Trend Is Your Friend?

34
Managed-futures funds and CTAs trade predominantly on trends. There are several ways of identifying trends, either using heuristics or statistical measures often called “filters.” Two important statistical measures of price trends are time series momentum and moving average crossovers. We show both empirically and theoretically that these trend indicators are closely connected. In fact, they are equivalent representations in their most general forms, and they also capture many other types of filters such as the HP filter, the Kalman filter, and all other linear filters. Further, we show how these filters can be represented through “trend signature plots” showing their dependence on past prices and returns by horizon. Our results unify and broaden a range of trend-following strategies, and we discuss the implications for investors.
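The claimed equivalence can be checked directly in a small example: a simple-moving-average crossover is a fixed linear filter on past returns, so computing it from prices and from return weights must agree. Window lengths and prices below are illustrative.

```python
def crossover_direct(prices, fast, slow):
    """SMA(fast) - SMA(slow) computed straight from prices."""
    sma = lambda w: sum(prices[-w:]) / w
    return sma(fast) - sma(slow)

def crossover_from_returns(prices, fast, slow):
    """The same signal as a weighted sum of past returns (its "trend
    signature"): the weight on the return at lag k is
    (slow - 1 - k)/slow - max(0, fast - 1 - k)/fast."""
    rets = [prices[i] - prices[i - 1] for i in range(1, len(prices))]
    total = 0.0
    for k in range(slow - 1):
        w = (slow - 1 - k) / slow - max(0, fast - 1 - k) / fast
        total += w * rets[-1 - k]   # lag k = 0 is the most recent return
    return total

prices = [100, 101, 103, 102, 105, 107, 106, 109]
a = crossover_direct(prices, fast=2, slow=5)
b = crossover_from_returns(prices, fast=2, slow=5)
```

The weights decay with lag, which is why the crossover behaves like a (truncated) time series momentum signal with a particular horizon profile.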

Wavelet Improvement in Turning Point Detection Using a Hidden Markov Model

35
The Hidden Markov Model (HMM) has been widely used in regime classification and turning-point detection for econometric series since the seminal paper by Hamilton (1989). The present paper shows that when an HMM is used to detect turning points in cyclical series, detection accuracy suffers when the data exhibit high volatility or combine multiple types of cycles with different frequency bands. Moreover, outliers are frequently misidentified as turning points. The present paper shows that these issues can be resolved by methods based on wavelet multiresolution analysis. By providing both frequency and time resolution, the wavelet power spectrum can identify the process dynamics at various resolution levels. We use a Monte Carlo experiment to show that the detection accuracy of HMMs improves substantially when they are combined with the wavelet approach. Further simulations demonstrate the excellent accuracy of this improved HMM method relative to two other change-point detection algorithms. Two empirical examples illustrate how the wavelet method can be applied to improve turning-point detection in practice.
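As a rough sketch of the pre-processing idea (not the paper's implementation), one Haar multiresolution level separates a series into a smooth approximation and high-frequency detail coefficients; turning points can then be sought in the denoised approximation rather than the raw data. This toy version uses the mean/half-difference form of the Haar transform.

```python
def haar_level(x):
    """One Haar decomposition level: pairwise means (approximation) and
    pairwise half-differences (detail); len(x) must be even."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

noisy = [0.0, 1.1, 1.9, 3.2, 4.1, 2.8, 2.1, 1.0]
approx, detail = haar_level(noisy)

# Turning point: a sign change in consecutive differences of the smoothed
# (approximation) series.
turns = [i for i in range(1, len(approx) - 1)
         if (approx[i] - approx[i - 1]) * (approx[i + 1] - approx[i]) < 0]
```

In a full multiresolution analysis this step is iterated on the approximation, giving the band-by-band view of the cycle structure that the abstract describes.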


Rise of the Machines: Algorithmic Trading in the Foreign Exchange Market

36
We study the impact of algorithmic trading in the foreign exchange market using a long time series of high-frequency data that specifically identifies computer-generated trading activity. Using both a reduced-form and a structural estimation, we find clear evidence that algorithmic trading causes an improvement in two measures of price efficiency in this market: the frequency of triangular arbitrage opportunities and the autocorrelation of high-frequency returns. Relating our results to the recent theoretical literature on the subject, we show that the reduction in arbitrage opportunities is associated primarily with computers taking liquidity, while the reduction in the autocorrelation of returns owes more to the algorithmic provision of liquidity. We also find evidence that algorithmic traders do not trade with each other as much as a random matching model would predict, which we view as consistent with their trading strategies being highly correlated. However, the analysis shows that this high degree of correlation does not appear to cause a degradation in market quality.
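One of the two efficiency measures, the frequency of triangular arbitrage opportunities, is mechanical to compute: convert through three currencies and compare with the direct rate, net of a cost threshold. The rates and cost below are made up for illustration.

```python
def triangular_arbitrage(eur_usd, usd_jpy, eur_jpy, cost=0.0001):
    """Round-trip profit per EUR of the synthetic EUR->USD->JPY conversion
    versus the direct EUR->JPY rate; zero if within the cost threshold."""
    synthetic = eur_usd * usd_jpy
    profit = synthetic / eur_jpy - 1
    return profit if abs(profit) > cost else 0.0

no_arb = triangular_arbitrage(1.10, 150.0, 165.0)   # consistent rates
arb = triangular_arbitrage(1.10, 150.0, 164.0)      # mispriced cross rate
```

Counting how often such a check fires in tick data is one way an efficiency measure of this kind can be operationalized.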

Statistical Arbitrage: Medium Frequency Portfolio Trading

37
Medium frequency trading strategies comprise trading activities that do not require market microstructure analysis on one side but depend significantly on market impact on the other. The most important difference from high frequency trading is the ability to analyze large amounts of data using complex algorithms. Portfolio management in this setting is a dynamic process combining signal (alpha) discovery with optimal execution at the level of trade scheduling. We used closing-price and trading-volume time series for the S&P 500 companies that have been in the index since at least the beginning of 2008. In this paper we present signal generation approaches as well as the optimization of portfolio transactions. Formally, the performance of medium frequency statistical arbitrage strategies is much better than that of their benchmarks, but it is very sensitive to the quality of the trading engine and optimization software. In this minor revision we added the results of out-of-sample tests and explanations of terms and methodology.
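A common statistical-arbitrage signal of the kind such strategies build on is a mean-reversion z-score of price against a trailing window. This is a generic sketch, not the paper's signal; the window and entry threshold are arbitrary.

```python
def zscore_signal(prices, window=5, entry=1.0):
    """Return 'sell'/'buy'/'hold' for the latest price versus its trailing
    window: a large positive z-score means rich vs recent history."""
    hist = prices[-window - 1:-1]              # trailing window, excluding today
    mean = sum(hist) / len(hist)
    var = sum((p - mean) ** 2 for p in hist) / len(hist)
    z = (prices[-1] - mean) / (var ** 0.5)
    if z > entry:
        return "sell"    # expect reversion down
    if z < -entry:
        return "buy"     # expect reversion up
    return "hold"

sig = zscore_signal([10, 10.2, 9.9, 10.1, 9.8, 11.0])
```

In practice such a raw signal is only the alpha-discovery half; the abstract's point is that scheduling and execution quality determine how much of it survives.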

Multifactor Model of Growth and Z Score for Projecting Stock Return and Evaluating Risk

38
A growing body of literature has examined and noted significant anomalies, in the form of empirical regularities, in stock returns. These phenomena contradict well-established paradigms of finance and have puzzled many financial researchers. To contribute to this field of study, this paper investigates two anomalies, namely the Z score and sales growth effects, in the United States equity market. Applying time-series regressions, the analysis provides evidence that these two anomaly variables exist in the US securities market. As an implication of this study, the three factors (market, Z score and sales growth) can be used to guide portfolio selection.
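The abstract does not restate its Z score definition; assuming it refers to Altman's (1968) Z score, that published formula is straightforward to compute from balance-sheet items. The inputs below are illustrative numbers, not data from the paper.

```python
def altman_z(wc, re, ebit, mve, sales, ta, tl):
    """Altman (1968) Z score from working capital (wc), retained earnings
    (re), EBIT, market value of equity (mve), sales, total assets (ta)
    and total liabilities (tl)."""
    return (1.2 * wc / ta + 1.4 * re / ta + 3.3 * ebit / ta
            + 0.6 * mve / tl + 1.0 * sales / ta)

z = altman_z(wc=50, re=120, ebit=80, mve=400, sales=600, ta=500, tl=250)
# Common reading: z > 2.99 is the "safe" zone, z < 1.81 the "distress" zone.
```

Sorting stocks on such a score is how a Z-score factor can be formed for the time-series regressions the paper applies.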

From Correlation to Granger Causality

39
The paper focuses on establishing causation in regression analysis in observational settings. Simple static regression analysis cannot establish causality in the absence of a priori theory on possible causal mechanisms or of controlled and randomized experiments. However, two regression-based econometric techniques, instrumental variables and Granger causality, can be used to test for causality given some assumptions. The Granger causality technique is applied to a time-series data set on energy and economic growth from Sweden spanning 150 years to determine whether increases in energy use and energy quality have driven economic growth. I show that the Granger causality technique is very sensitive to variable definition, the choice of additional variables in the model, and sample periods. Better results can be obtained by using multivariate models, defining variables to better reflect their theoretical definitions, and using larger samples. The better-specified models with larger samples are more likely to show that energy causes output growth, but it is also possible that the relationship between energy and growth has changed over time. Energy prices have a significant causal impact on both energy use and output, while there is no strong evidence that energy use causes carbon and sulfur emissions despite the obvious physical relationship.
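The paper's tests are multivariate; as a minimal sketch of the underlying idea, a bivariate Granger test compares the residual sum of squares of an autoregression of y with and without lags of x via the usual F statistic. The synthetic data below, in which x leads y by one period, are purely illustrative.

```python
import numpy as np

def granger_f(y, x, lags=1):
    """F statistic for H0: past x adds nothing to an AR(lags) model of y."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = len(y) - lags
    lag = lambda s, j: s[lags - j: len(s) - j]
    # Restricted model: constant plus own lags; unrestricted adds lags of x.
    Xr = np.column_stack([np.ones(n)] + [lag(y, j) for j in range(1, lags + 1)])
    Xu = np.column_stack([Xr] + [lag(x, j) for j in range(1, lags + 1)])
    rss = lambda X: float(np.sum(
        (y[lags:] - X @ np.linalg.lstsq(X, y[lags:], rcond=None)[0]) ** 2))
    rss_r, rss_u = rss(Xr), rss(Xu)
    return ((rss_r - rss_u) / lags) / (rss_u / (n - 2 * lags - 1))

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.empty(200)
y[0] = 0.0
for t in range(1, 200):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

f_xy = granger_f(y, x)   # large: past x predicts y
f_yx = granger_f(x, y)   # small: past y does not predict x
```

The paper's sensitivity findings translate here into how strongly the F statistic moves with the lag length, the added regressors, and the sample used.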

Wavelet Multiresolution Analysis of High-Frequency Asian FX Rates

40
FX pricing processes are nonstationary and their frequency characteristics are time-dependent. Most do not conform to geometric Brownian motion, since they exhibit a scaling law with a Hurst exponent between zero and 0.5 and fractal dimensions between 1.5 and 2. This paper uses wavelet multiresolution analysis, with Haar wavelets, to analyze the nonstationarity (time-dependence) and self-similarity (scale-dependence) of intra-day Asian currency spot exchange rates. These are the ask and bid quotes of the currencies of eight Asian countries (Japan, Hong Kong, Indonesia, Malaysia, Philippines, Singapore, Taiwan, Thailand), and of Germany for comparison, for the crisis period May 1, 1997 - August 31, 1997, provided by Telerate (the U.S. dollar is the numéraire). Their time-scale-dependent spectra, which are localized in time, are observed in wavelet-based scalograms. The FX increments can be characterized by the irregularity of their singularities. These degrees of irregularity are measured by homogeneous Hurst exponents. These critical exponents are used to identify the fractal dimension, relative stability and long-term dependence of each Asian FX series. The invariance of each identified Hurst exponent is tested by comparing it at varying time and scale (frequency) resolutions. It appears that almost all FX markets show anti-persistent pricing behavior. The anchor currencies, the D-mark and the Japanese yen, are ultra-efficient in the sense of being the most anti-persistent. The Taiwanese dollar is the most persistent, and thus the most predictable, most likely due to administrative control.
FX markets exhibit these non-linear, non-Gaussian dynamic structures, long term dependence, high kurtosis, and high degrees of non-informational (noise) trading, possibly because of frequent capital flows induced by non-synchronized regional business cycles, rapidly changing political risks, unexpected informational shocks to investment opportunities, and, in particular, investment strategies synthesizing interregional claims using cash swaps with different duration horizons.
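A Hurst exponent can be estimated in several ways; as a hedged sketch of one standard method (variance scaling of increments, not the paper's wavelet-based estimator), the standard deviation of tau-step differences grows like tau**H, so H is the slope in log-log coordinates. A Brownian random walk should give H near 0.5; H below 0.5 indicates the anti-persistence the abstract reports.

```python
import math
import random

def hurst(series, taus=(1, 2, 4, 8)):
    """Slope of log std(tau-step increment) vs log tau: the Hurst exponent."""
    pts = []
    for tau in taus:
        diffs = [series[i + tau] - series[i] for i in range(len(series) - tau)]
        m = sum(diffs) / len(diffs)
        sd = (sum((d - m) ** 2 for d in diffs) / len(diffs)) ** 0.5
        pts.append((math.log(tau), math.log(sd)))
    # OLS slope of log(sd) on log(tau).
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return (sum((px - mx) * (py - my) for px, py in pts)
            / sum((px - mx) ** 2 for px, _ in pts))

random.seed(1)
walk = [0.0]
for _ in range(2000):
    walk.append(walk[-1] + random.choice([-1.0, 1.0]))
h = hurst(walk)   # a random walk should give H near 0.5
```

The wavelet approach in the paper serves the same purpose but localizes the estimate in both time and scale, which this global estimator cannot do.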

