Wednesday, March 6, 2013

A volatility filter using historical vol

We have been looking at a way to improve risk-adjusted returns by using a volatility filter. Although we could use the VIX or an equivalent, it turns out that historical volatility works just as well, if not a little better.

You can see part 1 here: Digging into the VIX, and part 2 here: What can we use VIX for?

Although the mean return is roughly zero however we slice things, the distribution of returns is wider at higher readings of our relative volatility measure. High volatility begets high volatility, at least for our purposes.

By staying out during periods of higher relative volatility, we aim to reduce drawdowns and the volatility of our returns, leading to better risk-adjusted results.

A plus of using HV over an external measure like the VIX is that it can be computed for any underlying. This means the filtering technique can be applied to whatever it is we are trading.
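The original source for this post is in R, but as a rough sketch of the filter's input, annualized historical volatility can be computed from daily closes along these lines (a minimal Python illustration; the function name, the log-return approach, and the 252-day annualization factor are my assumptions, not the post's code):

```python
import numpy as np

def historical_vol(closes, window=63):
    """Annualized rolling historical volatility from daily closes.

    Entries before the window first fills are NaN. Uses log returns
    and a 252-trading-day annualization factor.
    """
    closes = np.asarray(closes, dtype=float)
    log_rets = np.diff(np.log(closes))
    hv = np.full(len(closes), np.nan)
    for i in range(window, len(closes)):
        # sample std of the trailing `window` daily log returns, annualized
        hv[i] = log_rets[i - window:i].std(ddof=1) * np.sqrt(252)
    return hv
```

Because it only needs a price series, the same function works for SPX, RUT, NDX, or anything else with daily closes.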


Below is a table with two comparisons. The first compares the HV filter to buy and hold. Although performance is generally better, we still get some pretty big drawdowns.

The second adds a 200-period moving average, which is a reasonably strong way of protecting against the downside. Again we can see lower volatility and smaller drawdowns with the addition of the vol filter.
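Mechanically, the comparisons in the table come down to applying an in-or-out signal to daily returns and measuring the resulting equity curve and drawdown. A frictionless Python sketch of that bookkeeping (the function names are mine, not the post's R code):

```python
import numpy as np

def equity_curve(closes, signal):
    """Frictionless equity curve for a binary signal.

    Hold the underlying on days where the signal was 1 at the prior
    close, sit in cash (zero return) otherwise. Lagging the signal by
    one day avoids look-ahead bias.
    """
    closes = np.asarray(closes, dtype=float)
    rets = np.diff(closes) / closes[:-1]
    held = np.asarray(signal, dtype=float)[:-1]  # yesterday's signal sets today's exposure
    return np.concatenate([[1.0], np.cumprod(1 + rets * held)])

def max_drawdown(curve):
    """Largest peak-to-trough decline, as a negative fraction."""
    curve = np.asarray(curve, dtype=float)
    peak = np.maximum.accumulate(curve)
    return ((curve - peak) / peak).min()
```

With `signal` all ones this reproduces buy and hold, so the two columns of the table fall out of the same code path.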

I used a three-month (63-day) lookback for our relative volatility measure. I haven't really dug into what happens when volatility remains elevated for extended periods of time.

I also did not experiment much with the threshold for where we draw the line on 'high' relative volatility. I use 0.6 as the cutoff because I originally split things into quintiles when making the first charts.
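Putting the pieces together, the rule described above (63-day HV, a 0.6 cutoff on relative volatility, and a 200-period moving average) might be sketched as follows. This is Python rather than the post's R, and the expanding-window percentile rank is my assumption about how 'relative volatility' is scored; the original code may define it differently:

```python
import numpy as np

def vol_filter_signal(closes, hv_window=63, ma_window=200, vol_cutoff=0.6):
    """Return 1 (invested) or 0 (in cash) for each day.

    Invested only when price sits above its 200-period SMA AND today's
    63-day historical vol ranks below the 0.6 quantile of all HV
    readings seen so far (an expanding-window rank -- an assumption).
    """
    closes = np.asarray(closes, dtype=float)
    log_rets = np.diff(np.log(closes))
    n = len(closes)

    # rolling annualized historical volatility
    hv = np.full(n, np.nan)
    for i in range(hv_window, n):
        hv[i] = log_rets[i - hv_window:i].std(ddof=1) * np.sqrt(252)

    signal = np.zeros(n, dtype=int)
    for i in range(max(hv_window, ma_window), n):
        sma = closes[i - ma_window + 1:i + 1].mean()
        past = hv[hv_window:i + 1]
        # fraction of history at or below today's vol reading
        rank = (past <= hv[i]).mean()
        if closes[i] > sma and rank < vol_cutoff:
            signal[i] = 1
    return signal
```

Both the lookback and the cutoff are exposed as parameters, which makes it easy to probe the sensitivities mentioned above.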

I also ran this for RUT and NDX over the same period.

These results are all frictionless and don't factor in dividends, return on cash, etc. I don't consider this viable as a standalone indicator, but rather something that can be used alongside other factors like rotational strategies, or as a potential tool if you are looking for lower volatility.

The source is up here, feel free to have a play around with it and see how you go.

Thanks for reading, till next time.


  1. This was very interesting, thanks for writing it up and posting! The combination of HV and SMA really seemed to achieve good returns while tempering the downside...I'm wondering what the strategy would have done in more normal times, i.e., during a period that didn't include the 2008 crash. Not that we have "normal" anymore. :-)

  2. Hey thanks! I did do some testing of different time frames and different underlyings, and how it goes is kinda mixed. In general it does usually give a better Sharpe, but often the final return isn't as good as a straight MA (or even straight B&H), and is sometimes negative. I don't really like Sharpe as a metric, but everyone knows it and how it works.

    It is quite dependent on the lookback period and threshold. Longer lookbacks (9-12 months and up) do tend to improve its performance. I haven't checked, but I'm sure the threshold for high vol can be tweaked.

    Again, it's not really something I think works as a standalone indicator, but it can be used in conjunction with rotational/relative strength strategies, position sizing, or as a feature for machine learning input.

  3. Unfortunately I am not able to run the code.

    R says:

    > perfdata <- lapply(symenv, comp_rets)
    NDX ..Error in FUN(.subset_xts(data, (i - width + 1):i), ...) :
    unused argument(s) (fill = NA)

    What am I doing wrong?

    Thank you for helping!

    1. Hey

      It looks like an error in one of the calls to rollapply. I was getting similar errors when I was writing it; there was something going on between the quantmod and PerformanceAnalytics packages.
      The solution for me was to update my packages (quantmod and PerformanceAnalytics, plus zoo and xts as well) and it all worked from there. Try that and let me know how it goes.

    2. Now everything seems to work fine - Thank you!

