Saturday, December 17, 2011
December Ramblings
It's been a little quiet here of late. I have relocated from Berlin to Sydney, which has taken up most of my attention, but I am starting to feel a little more settled. It's great to be back with familiar faces and places after a few years out in the wilderness.
I have enrolled in a postgraduate Math/Statistics degree which I am very much looking forward to. It starts next year so I have been revising my undergraduate math material and slowly making my way through the free Stanford Machine Learning course material which I have been finding excellent.
I have still been testing out ideas, but most have been dead ends, so not much to report there. Instead I would like to get on my soapbox about two things: the strange contradiction in some of those who describe themselves as "fiscally conservative", and a general rumination on democracy and efficiency.
Not so conservative
I don't really get how one can claim to be fiscally conservative yet in the same breath advocate lower taxes. If you sit down and think about it, surely the conservative view would be to gather as much tax revenue as one could. I don't think many private sector accountants would advocate limiting one's revenue sources.
I'm not really for super high taxes (say >= 55%), but very low taxes (<= 30%) seem like a bad idea. Public services cost money to run, and society as a whole benefits from good infrastructure, health care and education systems. An exception might be Switzerland, but having lived there for a while: yes, on the face of it taxes are low, but everything else is expensive. There are a lot of compulsory things like insurance and licenses for this and that, so you end up paying anyway. This in turn stratifies society, as only the relatively wealthy can take up the benefits of life in a modern western society.
There are a bunch of countries with very high taxes, but they are mostly well managed and great places to live, better than the US by pretty much any quantifiable metric. I remain a great fan of Sweden, which to my mind has struck a good balance between social policy and equality while remaining a fundamentally capitalist economy.
Democracy and the worst case
People discuss the inefficiencies and weaknesses of democracy as if they would magically disappear under some other system, or as if private enterprise were some beacon of expediency that governments should aspire to.
From what I have seen, large private enterprises are very bureaucratic and stifling, every bit as much as your favourite government department. The difference is that public institutions have the light of transparency shining down on them, whilst private institutions remain just that, private, so their waste and inefficiency is largely unexposed.
It is very hard to get things done in a large organisation, and to an extent I believe that is by design. It is a safety mechanism against worst case scenarios. Opportunities may well be missed due to inefficient processes, but not all ideas are good ones, and long, drawn out approval processes also kill the bad ideas that could take a company down. There are exceptions, but as a generalisation I think there is a degree of truth to it.
When people glorify private enterprise as a model of efficacy and efficiency, they presuppose the aim of an organisation is to reach some optimal maximum, i.e. attaining the best case. And though I agree it is important to strive for good results and outcomes, for large organisations I think it is more important to ensure worst case outcomes never eventuate (I'm sure we can all think of examples).
While I was in Europe I had the opportunity to visit Auschwitz in Poland, one of the most infamous concentration camps of WW2. There is a train line that runs right up to the entrance of the gas chambers used to murder millions of people over the course of the war. The powers of the time clearly thought this was a great idea.
There are many problems with democratic political systems, and it is true they will likely never reach some optimal best case. We have to accept that most of the time we probably won't achieve the best case, but at least we avoid the worst case. History shows us the worst case can get very nasty, so all in all I think it's a reasonable trade off.
Hope you and your families all enjoy Christmas and the holiday season.
Friday, October 14, 2011
Trading Mean Reversion with Augen Spikes
One of the more interesting things I have come across is the idea of looking at price changes in terms of recent standard deviation, a concept put forward by Jeff Augen. The gist is to express a close to close return as a multiple of the standard deviation of recent price moves. If returns were normally distributed you would expect moves of less than 1 standard deviation approximately 68% of the time. It's probably no surprise that actual returns don't appear to be normally distributed, so larger spikes are relatively frequent.
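As a toy illustration (made-up numbers, not from the book): if the trailing 20 day standard deviation of log returns is 1.5% and yesterday's close was 100, then a close today at 103 is a spike of roughly two standard deviations.
#toy illustration of the spike calculation (made-up numbers)
prev_close <- 100
close_today <- 103
sd20 <- 0.015 #trailing 20 day sd of log returns
(close_today - prev_close) / (prev_close * sd20) #= 2, a two standard deviation move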
The plan
Augen uses it to gain an edge in detecting where option volatility is mispriced, but I believe the concept is very useful and can be used in other areas. I originally thought it might be useful as a volatility filter by highlighting "abnormal" spikes in the VIX but I'm yet to have much success with that. Another idea was that it might be useful for swing trading in these mean-reverting times, which turned out to be more fruitful.
The basic signal is to go short when an upwards spike greater than 1 standard deviation occurs, and to go long when a downwards spike greater than 1 standard deviation occurs (i.e. spike values greater than +1 or less than -1 respectively). For reference I compared it to trading daily mean reversion (going short after an up day, going long after a down day), and to fading RSI(2) extremes of 90/10.
How'd it go?
Overall it performed well, with the caveat that it was only effective in markets where mean reversion strategies in general are effective. Interestingly, using the default parameters of a 20 day standard deviation with a 1 day lookback has outperformed both Daily Mean Reversion and RSI(2) over the last few years, both of which have fallen a bit flat during the same period.
Scaling into positions based on the size of the spike also seemed to be effective in producing good returns in absolute terms.
Final thoughts
Like most mean reversion strategies it has an abysmal backtest if you start from the 1970s. I am still fascinated by the regime change from follow-through to mean reversion that took place around 1999/2000, which Marketsci has covered in some depth. Maybe one day I will figure it out.
The code for the comparison of the equity curves is below. I think this is a very useful way of looking at price changes, and I will continue to investigate ways it can be put to good use.
require(quantmod)
require(PerformanceAnalytics)
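#slideapply: apply FUN over a trailing window of n observations (NA until a full window is available)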
slideapply <- function(x, n, FUN=sd) {
v <- c(rep(NA, length(x)))
for (i in n:length(x) ) {
v[i] <- FUN(x[(i-n+1):i])
}
return(v)
}
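#augenSpike: k day price change divided by (prior close * trailing n day sd of k day log changes)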
augenSpike <- function(x, n=20, k=1) {
prchg <- c(rep(NA,k), diff(x, k))
lgchg <- c(rep(NA,k), diff(log(x), k))
stdevlgchg <- slideapply(lgchg, n, sd)
stdpr <- x * stdevlgchg
#shuffle things up one
stdpr <- c(NA, stdpr[-length(stdpr)])
spike <- prchg / stdpr
return(spike)
}
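#helpers: perf prints max drawdown then annualised stats, retsum builds an equity curve from log returns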
perf <- function(x) { print(maxDrawdown(x)); table.AnnualizedReturns(x) }
retsum <- function(x) { return(exp(cumsum(na.omit(x))))}
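#benchmark 1: daily mean reversion - short after an up day, long after a down day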
getSymbols("SPY", from="2000-01-01")
spy <- SPY["2000/2011"]
spy$up_day <- ifelse(ROC(Cl(spy)) >=0, 1, 0 )
sig <- Lag(ifelse(spy$up_day == 1, -1, 1))
ret <- ROC(Cl(spy)) * sig
dmr_eq <- retsum(ret)
plot(dmr_eq)
perf(ret)
#[1] 0.3217873
# SPY.Close
#Annualized Return 0.1131
#Annualized Std Dev 0.2203
#Annualized Sharpe (Rf=0%) 0.5137
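#benchmark 2: fade RSI(2) extremes - long below 10, short above 90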
spy$rsi2 <- RSI(Cl(spy), 2)
sig <- Lag(ifelse(spy$rsi2 < 10, 1, ifelse(spy$rsi2 > 90, -1, 0)))
ret <- ROC(Cl(spy)) * sig
rsi_eq <- retsum(ret)
plot(rsi_eq)
perf(ret)
#[1] 0.1819428
# SPY.Close
#Annualized Return 0.0963
#Annualized Std Dev 0.1198
#Annualized Sharpe (Rf=0%) 0.8038
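#Augen spike, k=1: short after a spike above +1 sd, long after a spike below -1 sd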
aus <- augenSpike(as.vector(Cl(spy)), k=1)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, -1, ifelse(spy$spike < -1, 1, 0)))
ret <- ROC(Cl(spy)) * sig
k1_eq <- retsum(ret)
plot(k1_eq)
perf(ret)
#[1] 0.1379868
# SPY.Close
#Annualized Return 0.1066
#Annualized Std Dev 0.1152
#Annualized Sharpe (Rf=0%) 0.9256
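#same signal using 2 day (k=2) price changes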
aus <- augenSpike(as.vector(Cl(spy)), k=2)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, -1, ifelse(spy$spike < -1, 1, 0)))
ret <- ROC(Cl(spy)) * sig
k2_eq <- retsum(ret)
plot(k2_eq)
perf(ret)
#[1] 0.2134826
# SPY.Close
#Annualized Return 0.1091
#Annualized Std Dev 0.1433
#Annualized Sharpe (Rf=0%) 0.7608
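#scaled version: position size proportional to the size of the spike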
aus <- augenSpike(as.vector(Cl(spy)), k=1)
spy$spike <- aus
sig <- Lag(ifelse(spy$spike > 1, (-1 * spy$spike), ifelse(spy$spike < -1, abs(spy$spike), 0)))
ret <- ROC(Cl(spy)) * sig
k1_scaled_eq <- retsum(ret)
plot(k1_scaled_eq)
perf(ret)
Sunday, October 2, 2011
Jeff Augen Volatility Spike Code in R
[Update: I have updated this so the number of days used for standard deviation can be passed as a parameter, you can find the code at Trading Mean Reversion with Augen Spikes ]
Jeff Augen has written many excellent books on options trading, including The Volatility Edge in Options Trading, in which he presents a novel way of looking at a security's price movement as a function of its recent standard deviation.
I believe it's a very useful way of looking at price moves, so I implemented the following, which I believe matches the method as described in the book.
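#slideapply: apply FUN over a trailing window of n observations (NA until a full window is available)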
slideapply <- function(x, n, FUN=sd) {
v <- c(rep(NA, length(x)))
for (i in n:length(x) ) {
v[i] <- FUN(x[(i-n+1):i])
}
return(v)
}
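#augenSpike: 1 day price change divided by (prior close * trailing n day sd of log changes)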
augenSpike <- function(x, n=20) {
prchg <- c(NA, diff(x))
lgchg <- c(NA, diff(log(x)))
stdevlgchg <- slideapply(lgchg, n, sd)
stdpr <- x * stdevlgchg
#shuffle things up one
stdpr <- c(NA, stdpr[-length(stdpr)])
spike <- prchg / stdpr
return(spike)
}
An example of how to use it with quantmod:
require(quantmod)
getSymbols('SPY')
sp <- SPY['2010/2011']
asp <- augenSpike(as.vector(Cl(sp)))
sp$spike <- asp
barplot(sp['2011']$spike, main="Augen Price Spike SPY 2011", xlab="Time Daily", ylab="Price Spike in Std Dev")
This gives a bar chart of the daily spikes for 2011, measured in standard deviations.
If you want to verify it has been implemented correctly (and I won't hold it against you), you can use the following, which is based on the example data given in the book. You will need the slideapply function from above, which applies a function over a sliding window of a vector.
aub <- data.frame(c(47.58, 47.78, 48.09, 47.52, 48.47, 48.38, 49.30, 49.61, 50.03, 51.65, 51.65, 51.57, 50.60, 50.45, 50.83, 51.08, 51.26, 50.89, 50.51, 51.42, 52.09, 55.83, 55.79, 56.20))
colnames(aub) <- c('Close')
aub$PriceChg <- c(NA, diff(aub$Close))
aub$LnChg <- ROC(aub$Close)
aub$StDevLgChg<-slideapply(aub$LnChg, 20, sd)
aub$StdDevPr <- aub$Close * aub$StDevLgChg
pr <- aub$StdDevPr
pr <- c(NA, pr[-length(pr)])
aub$Spike <- aub$PriceChg / pr
aub
Which for me at least gives the same data as printed. Let me know if you find it useful or find any errors.
Adding a volatility filter with VIX
We saw in the basic system how we could add a factor, namely the 200 day moving average, to improve its overall performance. You could spend a lot of time playing with different moving averages and different combinations of crossovers if you are so inclined, but it's fairly easy to see they only work well in strongly trending markets.
Instead of looking for further optimisation through price, what other factors might be of use in improving risk adjusted returns? And, more importantly for now, how can we represent them in R?
For this example I will use the VIX as a proxy for overall market volatility. When the VIX is high (for some definition of high), uncertainty reigns, and for a long only system it's probably better to wait it out. We will quantify this as: when the VIX is under its 50 day moving average, volatility is low enough to risk our equity in the hope of gains.
Implementing this in R is quite straightforward: we just generate a second lagged signal vector and take its product with the 200 day moving average signal vector.
The results are better: the extra factor improves the risk adjusted return, though on the whole the system isn't something I would put my own money into. At the very least it clearly gives a better result than buy & hold. Hopefully you can see the benefit of researching orthogonal factors as inputs.
As an aside, you could think of the work by Mebane Faber as introducing additional factors through the use of different asset classes and the relative performance of each. A relative performance filter, plus a price based filter like the 200 day moving average, provides very solid overall performance. Looking for different factors you can model and use is probably going to be more fruitful than testing, say, a 250 day SMA against a 200 day SMA. There is only so much any one factor can give.
require(quantmod)
require(PerformanceAnalytics)
getSymbols(c('SPY', '^VIX'), from='1999-01-01')
SPY$ma200 <- SMA(Cl(SPY), 200)
VIX$ma50 <- SMA(Cl(VIX), 50)
spy <- SPY['2000/2011']
vix <- VIX['2000/2011']
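#long only when SPY closes above its 200 day MA and the VIX closes below its 50 day MA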
sig <- Lag(ifelse(Cl(spy) > spy$ma200, 1, 0))
vix_sig <- Lag(ifelse(Cl(vix) < vix$ma50, 1, 0))
vf_sig <- sig * vix_sig
vf_ret <- ROC(Cl(spy)) * vf_sig
vf_eq <- exp(cumsum(na.omit(vf_ret)))
maxDrawdown(vf_ret)
#[1] 0.1532796
table.AnnualizedReturns(vf_ret)
# Annualized Return 0.0084
# Annualized Std Dev 0.0757
# Annualized Sharpe (Rf=0%) 0.1110
Thursday, September 29, 2011
A simple moving average system in R
I am currently working my way through the Stanford Machine Learning course as well as trying to get a bit more familiar with the statistical language R.
I am also interested in using tools to help make better trading decisions. I am not particularly sold on pure systematic/algorithmic trading systems, and have spent a reasonable chunk of the last few years investigating them. Price tells an important story, and one that is easily absorbed by looking at a chart. It's not so easy to quantify that into an algorithm, and there are some structural and practical issues that bother me about systematic trading.
But hey, it's fun to play with, and you never know when it might come in handy.
The base system, for now, goes long when price is above the 200 day moving average. Mebane Faber at Cambria Investment Management has some solid research around moving average systems, available in his paper A Quantitative Approach to Tactical Asset Allocation. It's usually worth watching what everyone else is watching, if only because that's what everyone else is watching, and for equities at least the 200 day moving average is not uncommon.
Why long only? Well that is what the political economy wants, and who am I to argue.
Running the performance stats with and without the moving average (the code below prints them), we objectively reduced drawdowns (some very large, hairy drawdowns) and increased the average annual return. But in absolute terms it is nothing to write home about.
I'm using numbers like the Sharpe ratio not because I believe it's particularly great, but because it is fairly common and well understood. I am using it as a relative measure to determine whether modifications lead to improvements, namely a higher risk adjusted return, which Sharpe reflects at least on some level.
The code to build and compare the two equity curves is below. It is based on the version published by Joshua Ulrich in this post, which I am very grateful for.
Next we'll take a look at adding a simple volatility filter using VIX.
require(quantmod)
require(PerformanceAnalytics)
#get the data and fill out the MA
getSymbols('SPY', from='1999-01-01')
SPY$ma200 <- SMA(Cl(SPY), 200)
#lets look at it from 2000 to 2011
spy <- SPY['2000/2011']
#our baseline, unfiltered results
ret <- ROC(Cl(spy))
eq <- exp(cumsum(na.omit(ret)))
#our comparision, filtered result
ma_sig <- Lag(ifelse(Cl(spy) > spy$ma200, 1, 0))
ma_ret <- ROC(Cl(spy)) * ma_sig
ma_eq <- exp(cumsum(na.omit(ma_ret)))
maxDrawdown(ret)
maxDrawdown(ma_ret)
table.AnnualizedReturns(ret)
table.AnnualizedReturns(ma_ret)
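If you want to eyeball the two curves, plotting the equity objects built above does the job (a quick sketch, following the same approach as the other posts):
#plot the unfiltered and MA filtered equity curves for comparison
plot(eq, main="SPY buy & hold 2000-2011")
plot(ma_eq, main="SPY with 200 day MA filter 2000-2011")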
Sunday, September 25, 2011
Data Mining Tools
Sometimes you're not really sure about what you want to do, or you have a data set you just want to throw a bunch of stuff at and see what sticks. I've found the following useful:
Rapid Miner This is a fully functional data mining tool. You create a processing workflow via drag and drop components, typically defining an input source, some processing and an output source. It supports everyone's favourite machine learning/AI techniques like SVMs, neural nets, KNN etc. Thomas Ott at Neural Market Trends has some great RapidMiner tutorials that can help you get up to speed.
Eureqa This is a tool which takes a data set and tries to find the function that describes it best. It is a bit more lightweight in terms of functionality compared to Rapid Miner, but it can also be quite handy. It is Windows based and I have not used it extensively, however it did seem to do what it says on the box.
Book Review: The French Revolution: A Very Short Introduction
All work and no play makes Jack a dull boy, so I decided to expand my reading circles and got a copy of The French Revolution: A Very Short Introduction by William Doyle. I am a big fan of the Very Short Introduction series; they are usually a great overview of topics you would like to know more about but have limited time or a short attention span for.
I found the book well written and overall easy to follow, though the prose was slightly heavy in places. The book details the underlying causes, the specific events in chronological order, and the aftermath and rise of Napoleon, giving some context to Napoleon's reign, which as an ignorant Australian I had never quite understood.
There were two main things I found interesting. Firstly, the causes were largely economic: France was burdened with crippling debts and its people were suffering from rising food prices. The leadership of the day was an absolute monarchy, seen to be isolated and out of touch with the travails of day to day common life. The country was experiencing a financial crisis due to protracted wars, which remained unresolved because the monarchy was beholden to the nobility and clergy, preventing any chance of effective resolution. In practical terms resolution would have meant tax reform, which both the nobility and the church were opposed to. I just can't think of a modern parallel.
The second was its impact on the subsequent 200 years. It gave rise to the concept of the sovereignty of a nation of people, rather than a monarch. It is a reminder that the political structures we have today are still relatively new, and perhaps have not yet reached some optimal maximum. It also showed how the concepts from the Declaration of the Rights of Man and Citizen have found their way into places like the Universal Declaration of Human Rights and formed the basis for many of the liberal democracies we enjoy today.
I really enjoyed reading it, and at 150 pages it is not too great a commitment.