Robust Resampling Methods for Time Series

We study the robustness of block resampling procedures for time series. We first derive a set of formulas to characterize their quantile breakdown point. For the moving block bootstrap and for subsampling, we find a very low quantile breakdown point. A similar robustness problem arises in relation to data-driven methods for selecting the block size in applications. This renders inference based on standard resampling methods useless even in simple estimation and testing settings. To solve this problem, we introduce a robust fast resampling scheme that is applicable to a wide class of time series settings. Monte Carlo simulations and sensitivity analysis for the simple AR(1) model confirm the dramatic fragility of classical resampling procedures in the presence of contamination by outliers. They also show the better accuracy and efficiency of the robust resampling approach under different types of data constellations. A real data application to testing for stock return predictability shows that our robust approach can detect predictability structures more consistently than classical methods.
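To make the fragility concrete, the following sketch (illustrative only, not the paper's procedure; all function names are ours) implements a moving block bootstrap for the sample median. Because each observation appears in up to `block_len` overlapping blocks, a single large outlier contaminates many resamples, which is the mechanism behind the low quantile breakdown point.

```python
import random

def moving_block_bootstrap(series, block_len, rng=None):
    """Resample a time series by concatenating randomly chosen
    overlapping blocks, preserving short-range dependence."""
    rng = rng or random.Random(0)
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    while len(out) < n:
        out.extend(rng.choice(blocks))
    return out[:n]

def bootstrap_median_quantile(series, block_len, reps=500, q=0.95):
    """Bootstrap distribution of the sample median; return its q-quantile."""
    rng = random.Random(42)
    medians = sorted(
        sorted(moving_block_bootstrap(series, block_len, rng))[len(series) // 2]
        for _ in range(reps)
    )
    return medians[int(q * (reps - 1))]
```

For a statistic less robust than the median (e.g. the mean, or extreme quantiles), far fewer contaminated blocks are needed to break the bootstrap distribution, which is what the breakdown-point formulas quantify.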


Nonlinear Time Series Modeling: An Introduction

Recent developments in nonlinear time series modelling are reviewed. Three main types of nonlinear model are discussed: Markov Switching, Threshold Autoregression and Smooth Transition Autoregression. Classical and Bayesian estimation techniques are described for each model. Parametric tests for nonlinearity are reviewed, with examples from the three types of model. Finally, forecasting and impulse response analysis are developed.
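As a minimal illustration of the threshold autoregression class, the sketch below (our own toy example, not from the review) simulates a two-regime SETAR(1) model in which the autoregressive coefficient switches depending on whether the previous observation is below a threshold.

```python
import random

def simulate_setar(n, phi_low=0.8, phi_high=-0.5, threshold=0.0,
                   sigma=1.0, seed=0):
    """Simulate a two-regime SETAR(1): y_t = phi * y_{t-1} + e_t, where
    phi = phi_low if y_{t-1} <= threshold, else phi_high."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = phi_low if y[-1] <= threshold else phi_high
        y.append(phi * y[-1] + rng.gauss(0.0, sigma))
    return y
```

Smooth transition models replace the hard regime switch with a continuous weighting function, and Markov switching models let an unobserved Markov chain, rather than the lagged level, drive the regime.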

Ratio Analysis and Equity Valuation

This paper outlines a financial statement analysis for use in equity valuation. Standard profitability analysis is incorporated, and extended, and is complemented with an analysis of growth. The perspective is one of forecasting payoffs to equities. So financial statement analysis is presented first as a matter of pro forma analysis of the future, with forecasted ratios viewed as building blocks of forecasts of payoffs. The analysis of current financial statements is then seen as a matter of identifying current ratios as predictors of the future ratios that drive equity payoffs. The financial statement analysis is hierarchical, with ratios lower in the ordering identified as finer information about those higher up. To provide historical benchmarks for forecasting, typical values for ratios are documented for the period 1963-1996, along with their cross-sectional variation and correlation. And, again with a view to forecasting, the time series behavior of many of the ratios is also described and their typical "long-run, steady-state" levels are documented.
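The hierarchical idea — ratios lower in the ordering giving finer information about those higher up — can be illustrated with the standard DuPont identity, in which return on equity is decomposed into finer ratios. This is a generic textbook decomposition, not the paper's exact ordering.

```python
def dupont_roe(net_income, sales, assets, equity):
    """Decompose return on equity into margin x turnover x leverage
    (the standard DuPont identity)."""
    margin = net_income / sales      # profitability per unit of sales
    turnover = sales / assets        # efficiency of asset use
    leverage = assets / equity       # financial leverage
    return margin, turnover, leverage, margin * turnover * leverage
```

Forecasting each component ratio separately, then recombining them, is the pro forma logic the paper describes: drivers lower in the hierarchy feed forecasts of the ratios above them.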

Applications of Least Mean Square (LMS) Algorithm Regression in Time-Series Analysis

In this paper we present a very brief description of the least mean square algorithm, with applications to the analysis of economic and financial time series. We present some numerical applications: forecasts of the Gross Domestic Product growth rate of the UK and Italy, forecasts of S&P 500 stock index returns, and finally an examination of the day-of-the-week effect in the FTSE 100 over a short period. A full programming routine written in the MATLAB software environment is provided for replication and further research applications.
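The paper provides a MATLAB routine; as a language-neutral reference, here is a minimal pure-Python sketch of the LMS recursion (our own simplified version): at each step the filter predicts the target from a short input window and nudges the weights in the direction that reduces the squared error, w ← w + μ · e · u.

```python
def lms_filter(x, d, order=2, mu=0.01):
    """Least mean square adaptive filter: predict d[t] from the last
    `order` values of x, updating weights by the LMS rule."""
    w = [0.0] * order
    preds, errs = [], []
    for t in range(order, len(x)):
        u = x[t - order:t]                        # input window
        y = sum(wi * ui for wi, ui in zip(w, u))  # prediction
        e = d[t] - y                              # prediction error
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
        preds.append(y)
        errs.append(e)
    return w, preds, errs
```

The step size μ trades off convergence speed against stability: too small and the filter adapts slowly to structural change; too large and the weight updates diverge.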

Time-Localized Wavelet Multiple Regression and Correlation: Eurozone Stock Markets Across Scales and Time

This paper extends wavelet methodology to handle comovement dynamics of multivariate time series via moving weighted regression on wavelet coefficients.

The concept of wavelet local multiple correlation is used to produce one single set of multiscale correlations along time, in contrast with the large number of wavelet correlation maps that need to be compared when using standard pairwise wavelet correlations with rolling windows. Also, the spectral properties of weight functions are investigated and it is argued that some common time windows, such as the usual rectangular rolling window, are not satisfactory on these grounds.
The method is illustrated with a multiscale analysis of the comovements of Eurozone stock markets during this century. It is shown how the evolution of the correlation structure in these markets has been far from homogeneous both along time and across timescales, featuring an acute divide at about the quarterly scale. At longer scales, evidence from the long-term correlation structure can be interpreted as stable perfect integration among Euro stock markets. On the other hand, at intra-month and intra-week scales, the short-term correlation structure has clearly been evolving over time, experiencing a sharp increase during financial crises that may be interpreted as evidence of financial 'contagion'.
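The role of the weight function can be sketched as follows: a time-localized correlation computed with a smooth (here Gaussian) moving window rather than the rectangular rolling window the paper criticizes. This toy version (all names ours) works directly on the raw series and omits the wavelet decomposition; in the paper the same weighted correlation is applied scale by scale to wavelet coefficients.

```python
import math

def gaussian_weights(m, sigma_frac=0.25):
    """Tapered window weights: smoother spectral behaviour than a
    rectangular window of the same length."""
    c = (m - 1) / 2.0
    s = sigma_frac * m
    w = [math.exp(-((i - c) ** 2) / (2 * s * s)) for i in range(m)]
    total = sum(w)
    return [wi / total for wi in w]

def weighted_corr(x, y, w):
    """Correlation of x and y under normalized weights w."""
    mx = sum(wi * xi for wi, xi in zip(w, x))
    my = sum(wi * yi for wi, yi in zip(w, y))
    cxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y))
    return cxy / math.sqrt(vx * vy)

def rolling_weighted_corr(x, y, m=21):
    """Time-localized correlation: weighted correlation in a moving window."""
    w = gaussian_weights(m)
    return [weighted_corr(x[t:t + m], y[t:t + m], w)
            for t in range(len(x) - m + 1)]
```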


Automated Trading with Genetic-Algorithm Neural-Network Risk Cybernetics: An Application on FX Markets

Recent years have witnessed the advancement of automated algorithmic trading systems as institutional solutions in the form of autobots, black boxes or expert advisors. However, little research has been done in this area with sufficient evidence to show the efficiency of these systems. This paper builds an automated trading system which implements an optimized genetic-algorithm neural-network (GANN) model with cybernetic concepts and evaluates its success using a modified value-at-risk (MVaR) framework. The cybernetic engine includes a circular causal feedback control feature and a purpose-built golden-ratio estimator, which can be applied to any form of market data in the development of risk-pricing models. The paper uses Euro and Yen forex rates as data inputs. It is shown that the technique is useful as a trading and volatility control system for institutions, including central bank monetary policy, as a risk-minimizing strategy. Furthermore, the results are achieved within a 30-second timeframe for an intra-week trading strategy, offering relatively low-latency performance. The results show that risk exposures are reduced by four to five times, with a maximum possible success rate of 96%, providing evidence for further research and development in this area.

On the Continuous Limit of GARCH

GARCH processes constitute the major area of time series variance analysis, so the limit of these processes is of considerable interest for continuous time volatility modelling. The continuous time limit of the GARCH(1,1) model is fundamental for limits of other GARCH processes, yet it has been the subject of much debate among econometricians. The seminal work of Nelson (1990) derived the GARCH(1,1) limit as a stochastic volatility process, uncorrelated with the price process. But a subsequent paper by Corradi (2000) derived the limit as a deterministic volatility process, and several other contradictory papers followed. We reconsider this continuous limit, arguing that because the strong GARCH model does not aggregate in time it is incorrect to consider its limit. Instead it is legitimate to use the weak definition of GARCH, which does aggregate in time. This model differs from strong GARCH by defining the discrete time process on the best linear predictor of the squared errors, rather than the conditional variance itself. We prove that its continuous limit is a stochastic volatility model with correlated Brownian motions in which both the variance diffusion coefficient and the price-volatility correlation are related to the skewness and kurtosis of the physical returns density. Under certain assumptions our limit model reduces to Nelson's GARCH diffusion.
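For reference, a minimal simulation of the strong GARCH(1,1) recursion discussed above (our own sketch, not the paper's weak-GARCH construction): sigma²_t = omega + alpha·eps²_{t-1} + beta·sigma²_{t-1}, with unconditional variance omega / (1 - alpha - beta) when alpha + beta < 1.

```python
import random

def simulate_garch11(n, omega=0.05, alpha=0.05, beta=0.90, seed=1):
    """Simulate strong GARCH(1,1): eps_t = sigma_t * z_t with z_t ~ N(0,1),
    sigma_t^2 = omega + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)   # start at unconditional variance
    eps = []
    for _ in range(n):
        e = (var ** 0.5) * rng.gauss(0.0, 1.0)
        eps.append(e)
        var = omega + alpha * e * e + beta * var
    return eps
```

Nelson's limiting argument shrinks the time step of exactly this recursion; the paper's point is that the strong model's lack of temporal aggregation makes that shrinking step ill-posed, whereas the weak-GARCH definition survives it.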

Extreme Risk and Fractal Regularity in Finance

As the Great Financial Crisis reminds us, extreme movements in the level and volatility of asset prices are key features of financial markets. These phenomena are difficult to quantify using traditional models that specify extreme risk as a rare event. Multifractal analysis, whose use in finance has expanded considerably over the past fifteen years, reveals that price series observed at different time horizons exhibit several forms of scale invariance. Building on these regularities, researchers have developed a new class of multifractal processes that permit extrapolation from high-frequency to low-frequency events and generate accurate forecasts of asset volatility. The new models provide a structured framework for studying the likely size and price impact of events more extreme than those historically observed.

An Improved Moving Average Technical Trading Rule

This paper proposes a modified version of the widely used price and moving average cross-over trading strategies. The suggested approach (presented in its 'long only' version) is a combination of cross-over 'buy' signals and a dynamic threshold value which acts as a dynamic trailing stop. The trading behavior and performance of this modified strategy differ from the standard approach, with results showing that, on average, the proposed modification increases the cumulative return and the Sharpe ratio of the investor while exhibiting a smaller maximum drawdown and shorter drawdown duration than the standard strategy.
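A toy version of a long-only crossover rule with a trailing stop can be sketched as follows. The specific exit rule here (a fixed fraction below the running maximum since entry) is our illustrative assumption, not the paper's dynamic threshold.

```python
def sma(prices, n):
    """Simple moving average; None until n observations are available."""
    return [sum(prices[t - n + 1:t + 1]) / n if t >= n - 1 else None
            for t in range(len(prices))]

def crossover_with_trailing_stop(prices, n=5, stop_frac=0.05):
    """Long-only rule: enter when price is above its n-period SMA; exit
    when price falls stop_frac below the running maximum since entry.
    Returns the position (0 or 1) at each time step."""
    ma = sma(prices, n)
    position, peak, signals = 0, 0.0, []
    for t, p in enumerate(prices):
        if position == 0:
            if ma[t] is not None and p > ma[t]:
                position, peak = 1, p        # buy signal
        else:
            peak = max(peak, p)              # update running maximum
            if p < peak * (1 - stop_frac):
                position = 0                 # trailing stop hit
        signals.append(position)
    return signals
```

The trailing stop is what changes the drawdown profile relative to the plain crossover: the position is cut as soon as the price retreats from its post-entry peak, rather than waiting for the price to cross back below the moving average.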

Fundamental Information in Technical Trading Strategies

Technical trading strategies assume that past changes in prices help predict future changes. This makes sense if the past price trend reflects fundamental information that has not yet been fully incorporated in the current price. However, if the past price trend only reflects temporary pricing pressures, the technical trading strategy is doomed to fail. We demonstrate that this failure can be avoided by using financial statements as additional sources of information.

We implement a trading strategy that invests in stocks with high past returns and high operating cash flows. This combination strategy yields a 3-factor alpha of 15% per year, which is much higher than that of the pure momentum strategy that invests in stocks with high past returns without considering operating cash flows. The combination strategy outperforms the momentum strategy in almost all years. The outperformance can be traced back to a higher probability of picking outperforming stocks. These are stocks that yield high future cash flows and hardly ever delist due to poor performance. The combination strategy is easily implemented: the information used is publicly available, the stocks chosen are liquid, and even high transaction costs do not erode the outperformance.
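The screening logic described above can be sketched as a double sort (our own simplified version; the paper's portfolio construction details differ): go long only stocks that rank highly on both past return and operating cash flow.

```python
def combination_picks(stocks, k):
    """Select stocks in the top k by past return AND the top k by
    operating cash flow; stocks are dicts with 'name', 'past_return'
    and 'cash_flow' keys. Order follows the past-return ranking."""
    by_ret = sorted(stocks, key=lambda s: s["past_return"], reverse=True)[:k]
    by_cf = {s["name"] for s in
             sorted(stocks, key=lambda s: s["cash_flow"], reverse=True)[:k]}
    return [s["name"] for s in by_ret if s["name"] in by_cf]
```

The intersection is what filters out momentum stocks whose past price trend reflects only temporary pricing pressure: high past returns unsupported by cash flows fail the second sort.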

