Monday, May 27, 2013

BEER!![8] – The Great Australian Beer Spectapular


Last Saturday I went to the Great Australian Beer Spectapular at the Royal Exhibition Building in Melbourne, and here is what I tasted:

What I did while drinking them: Compared tasting notes with my friend, largely consisting of ‘this is really good’ or ‘this was a bad choice’.

What I did after drinking them: Downed a few glasses of red wine at my father-in-law’s 60th.

Sunday, May 26, 2013

AFL Power Rankings: Round 9 2013

RISING UP

The Gold Coast Suns have been the big improver of 2013, and they picked up some more ranking points on the weekend with a good performance at the MCG against Hawthorn. Meanwhile, Essendon reversed its form drop of the past few weeks with a comfortable win in the Dreamtime match against Richmond, moving up from 11th to 9th.

FALLING DOWN

The Tigers swapped places with the Bombers, losing ranking points for the fourth time in their past six matches. However, the largest drop in ranking points for the round belonged to St. Kilda, who lost to the 16th-ranked Bulldogs.

ALSO OF NOTE

Geelong, Fremantle, Sydney, and West Coast all gained a couple of ranking points after large wins on the weekend. Along with the Hawks and Adelaide, they have made up the top six ranked teams after six of the nine rounds this season.

1 (Last week: 1) Hawthorn 32.0 (35.5)
2 (2) Geelong 24.6 (22.3)
3 (3) Fremantle 22.7 (20.8)
4 (4) Sydney 21.6 (19.1)
5 (5) Adelaide 17.5 (17.9)
6 (6) West Coast 16.3 (14.0)
7 (8) Carlton 10.9 (11.0)
8 (7) Collingwood 9.2 (12.5)
9 (11) Essendon 7.4 (4.6)
10 (10) North Melbourne 6.4 (5.8)
11 (9) Richmond 5.6 (9.6)
12 (12) St. Kilda -3.7 (2.1)
13 (13) Port Adelaide -11.2 (-8.8)
14 (15) Gold Coast -17.9 (-22.5)
15 (14) Brisbane -18.1 (-18.4)
16 (16) Western Bulldogs -28.0 (-33.7)
17 (17) Melbourne -54.5 (-53.2)
18 (18) Greater Western Sydney -64.9 (-61.8)

Saturday, May 25, 2013

The Wooden Finger Five: May 2013


More than thirty years after the Vapors, Phoenix has done a version of ‘Turning Japanese’ that is actually tasteful. Easily the catchiest song off their new album, ‘Entertainment’ may well be the best Phoenix tune since ‘Too Young’. If they had titled their album after this rather than the (still good) instrumental track ‘Bankrupt’, they’d probably sell twice as many copies. (OK, it didn’t work for Gang of Four …)

I still haven’t seen ‘Searching For Sugar Man’, but I don’t mind the soundtrack. This song reminds me a bit of my favourite album ever, Love’s ‘Forever Changes’ – the lyrics make very little sense, but the sound is beautiful. I feel like I should know who ‘Molly MacDonald’ and ‘Willy Thompson’ are.

The best Vampire Weekend moment ever:

“If I can’t trust you, then DAMN YOU HANNAH!!!
There’s no FUTURE, there’s no ANSWER!
Though we live on the US dollar
YOU AND ME, we got our own sense of time …”
Easily the most impassioned lyric that Ezra Koenig has ever uttered.


This month I finally worked out how to create ringtones from my CD collection. For years I’d wanted a ringtone of this song, because of these lyrics:
“You’ve got a nerve to be asking a favour!/You’ve got a nerve to be calling my number!”

How cool is that? It would be even better if there were a recent ex to assign it to. Speaking of which, I also assigned special ringtones to my other half and my mother.

Up to this point, Miles Kane seemed like someone who had traded off the fame of his Frozen Simian friends. This tune though is a winner – it’s still relying on nostalgia for the ‘60s, but the ‘la … la la la la la la’ refrain is irresistible.

Monday, May 20, 2013

AFL Power Rankings: Round 8 2013

RISING UP

Fremantle take Sydney’s spot in third place after holding the Swans to a draw at the SCG. The Dockers have been in very good form since the middle of 2012.

FALLING DOWN

Essendon drop further to 11th after losing at home to the 14th-ranked Lions. Could the Dons’ lowly ranking compared to their ladder position look like ‘genius’ after all?

ALSO OF NOTE

Hawthorn has now taken top spot on the AFL ladder, to go along with its #1 ranking.

1 (Last week: 1) Hawthorn 35.5 (38.9)
2 (2) Geelong 22.3 (24.4)
3 (4) Fremantle 20.8 (18.8)
4 (3) Sydney 19.1 (20.6)
5 (6) Adelaide 17.9 (15.5)
6 (5) West Coast 14.0 (15.6)
7 (8) Collingwood 12.5 (11.3)
8 (7) Carlton 11.0 (11.3)
9 (9) Richmond 9.6 (10.5)
10 (11) North Melbourne 5.8 (4.7)
11 (10) Essendon 4.6 (7.6)
12 (12) St. Kilda 2.1 (3.1)
13 (13) Port Adelaide -8.8 (-10.5)
14 (14) Brisbane -18.4 (-22.5)
15 (15) Gold Coast -22.5 (-24.1)
16 (16) Western Bulldogs -33.7 (-32.2)
17 (17) Melbourne -53.2 (-56.2)
18 (18) Greater Western Sydney -61.8 (-65.6)

Wednesday, May 15, 2013

Comparison of AFL Power Rankings Systems

When I created my AFL Power Rankings in 2011, I did not know of any other similar ranking systems for AFL. Basically my purpose was to create a system which gave a better indication of the relative strength of each team than the AFL ladder did. Ladder positions, particularly early in the season, can be determined to a large extent by which other teams each team has played, and they can also obscure each team’s recent form. My system was never really intended to predict future results – though it could well be used for that purpose – but to give what I thought was a better assessment of past results than the ladder did.

My rankings system, as I assume is the case with most ranking systems, is not at all intended to indicate that the ladder is meaningless. There can be no denying that a team would rather win the premiership than be #1 on some rankings system. What ranking systems are meant to do is to give a better indication of the ‘actual’ strength of each team. If team A has a 65 per cent chance of winning a match, and team B has a 35 per cent chance of winning, then team A would be considered the stronger team. But that does not mean team A will win – by definition, team B has a non-zero chance of winning the match. There are many, many cases during the season where the team assessed as being weaker will win, including in the Grand Final. In constructing the rankings system I intended to look beyond any particular match, and take account of the evidence from many matches over a period of time, to get a better assessment of how strong a particular team has been.
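Just to put rough numbers on that point (my back-of-the-envelope arithmetic, not anything from the rankings themselves): even if every favourite were correctly assessed, upsets would still be routine over a season.

```python
# Back-of-the-envelope arithmetic, ignoring byes: suppose every match had a
# 65% favourite, and that assessment was exactly right.
p_upset = 0.35          # chance the 'weaker' team wins a given match
games_per_round = 9     # 18 teams -> 9 matches per round
rounds = 23             # length of the 2013 home-and-away season

print(f"Expected upsets per season: {p_upset * games_per_round * rounds:.0f}")
# -> 72

# And a 65% favourite sweeps five straight meetings only about one time in nine:
print(f"P(favourite wins 5 straight) = {0.65 ** 5:.1%}")  # -> 11.6%
```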

My rankings system depends only on these factors: the final margin for each match, where each match was played, the strength of the teams in each match, and how recent each match was. My reasons for choosing these factors were outlined here, but essentially I chose them because they seemed to me the main factors that football watchers use when adjusting the worth of each result. But while people often mentally adjust for these factors, they (myself included) would rarely have an ‘objective’, quantifiable means of doing so. Thus my ranking system was really a way of adding some ‘objective’ rules to the subjective judgments that fans such as myself make.
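To make those four factors concrete, here is a minimal sketch of a rating in that family. The constants and the exact form of each adjustment are my assumptions for illustration – I’m not reproducing the actual formula here – but the ingredients (margin, venue, opponent strength, recency) are the ones listed above.

```python
from dataclasses import dataclass

HOME_BONUS = 10.0   # hypothetical home-ground adjustment, in points
DECAY = 0.9         # hypothetical weight decay per match into the past

@dataclass
class Match:
    margin: float           # final margin from the rated team's perspective
    opponent_rating: float  # opponent's rating going into the match
    at_home: bool

def rating(matches: list[Match]) -> float:
    """Recency-weighted average of venue- and opponent-adjusted margins.
    matches[0] is the most recent match."""
    total, weight_sum = 0.0, 0.0
    for i, m in enumerate(matches):
        adjusted = m.margin - (HOME_BONUS if m.at_home else -HOME_BONUS)
        adjusted += m.opponent_rating   # beating a good team is worth more
        w = DECAY ** i                  # more recent matches carry more weight
        total += w * adjusted
        weight_sum += w
    return total / weight_sum

# e.g. a 30-point home win over a +15-rated side, preceded by a 10-point
# away loss to a -5-rated side:
print(rating([Match(30, 15.0, True), Match(-10, -5.0, False)]))  # ~16.1
```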

A good example of where people make adjustments to the ladder is the premiership betting. A team might start the season 3-0 and be first or second on the ladder, but if they had a mediocre season last year then they might not be too far from mid-range in the premiership betting. Indeed, you could argue that the premiership betting might be the best ranking system of them all, because it reflects the collective assessments of many football followers, including those with possibly more accurate models than mine. One thing that convinced me that my ranking system might be OK was that it gave results that were not too far away from the premiership betting market. My hunch though is that there are enough people in the betting market who are shifted by emotion for it to react too quickly to shifts in form, and that there are systems out there which can beat the market, even if only by a little.    

Since then I have found out about other ranking systems, including those used at AFL Footy Maths, and just recently (though it is an old system) at Footy Forecaster. I can’t find the formula for Footy Forecaster’s rankings but it does not look like it would be all that different from mine given how close the ranking points for each team are under each system. If I had found this system before I devised my rankings I might never have bothered to create my own. Indeed, it might be that the Footy Forecaster rankings would be close to what I got if I fixed up the logical flaws in my system that always slightly bothered me. For example, the sum of the ranking points across teams in my system does not add up to zero, whereas they do in the Footy Forecaster system. However, the main difference in the systems to me appears to be the adjustment for home ground advantage, which looks to be considerably less for Footy Forecaster.       
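For what it’s worth, the zero-sum property is straightforward to impose after the fact by re-centring the ratings around their mean – a sketch, with made-up numbers:

```python
def recentre(ratings: dict[str, float]) -> dict[str, float]:
    """Shift every team's rating by the mean so the ratings sum to zero."""
    mean = sum(ratings.values()) / len(ratings)
    return {team: r - mean for team, r in ratings.items()}

print(recentre({"A": 10.0, "B": 4.0, "C": -8.0}))
# -> {'A': 8.0, 'B': 2.0, 'C': -10.0}, which sums to zero by construction
```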

For the AFL Footy Maths system, my understanding is that the main factors determining each team’s rankings are the same as my own. Again, if I had found it before creating my own I might not have bothered with mine. One main difference I have noticed is that teams tend to move slightly more quickly around the ranking positions in my system – I don’t know if this is a good or a bad thing, but in any case the rates of movement are not that different since the Footy Maths system underwent its renewal. I think another main difference (though I stand to be corrected) is that in my system each match that is played changes the worth of previous results. For example, in my system, if team A beats team B by 60 points and team B’s average net margin is -20 points then this is a very good result for team A. If team B then gets beaten by 60 points again the next week it is still a very good result for team A, but less so. And if team B keeps getting beaten by 60 points every week then team A’s margin of victory eventually comes to be considered par for the course.
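Here’s a toy version of that example. The definition of a result’s ‘worth’ (margin plus the opponent’s average net margin at the time) is a deliberate simplification of mine, not the rankings’ actual formula:

```python
# Toy version of the team A / team B example. The 'worth' of a result is
# taken here as the margin plus the opponent's average net margin, so a
# 60-point win over a side that usually loses by 20 is 'worth' +40.
def result_worth(margin: float, opp_avg_net_margin: float) -> float:
    return margin + opp_avg_net_margin

b_margins = [-20.0]   # team B's average net margin is -20 when A beats them
print(f"At the time: worth of A's 60-point win = "
      f"{result_worth(60, sum(b_margins) / len(b_margins)):+.1f}")

for week in range(1, 5):      # B then keeps losing by 60, week after week
    b_margins.append(-60.0)
    b_avg = sum(b_margins) / len(b_margins)
    print(f"{week} week(s) later: worth = {result_worth(60, b_avg):+.1f}")

# Output: +40.0, then +20.0, +13.3, +10.0, +8.0 ... the same 60-point win
# drifts towards 'par for the course' as B's average sinks to -60.
```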

And then there’s Roby…

Roby’s rankings appear on the Big Footy forum, where, rightly or wrongly, they are routinely subjected to some pretty hefty criticism. One thing I will say is that if Roby does actually calculate all the factors he says he does then maintaining his system must mean a hell of a lot of work. By contrast, my own system takes about 10 minutes to update. Another thing is that Roby does use his system for betting. Based on his description (and again I might be wrong), only backward-looking information determines his rankings, and not forward-looking information. For example, one might try to forecast expected future performance based not only on past performance, but also on forward-looking factors such as future age profile, and … well, I can’t think of anything else at the moment. The result of using only backward-looking information is that, unless things stay the same forever and ever, there will inevitably be errors in his predictions (same with my system). If you also used forward-looking information which accurately predicted how a team’s form would develop into the future, you could reduce these errors. But anyway it could still be the case that Roby’s prediction errors are lower than anybody else’s.

Roby claims the intention of his rankings is to get a better understanding of how close each team is to winning a premiership. Presumably then this means that the team ranked #1 is considered the most likely to win a premiership. This, of course, does not mean that team will certainly win the premiership, or even that it is likely to. The current premiership favourite on the betting market, Hawthorn, typically has odds of around $3.25 to win the premiership, which means it is generally considered far more likely that it will not win the premiership this year than that it will.

Having said that, Roby’s phrasing is a bit unfortunate, because there will come a point in the season where multiple teams have 0 per cent chance of winning the premiership (16 will have no chance by Grand Final day). But unless I am missing something fundamental, I think you can also say that his rankings are just meant to be an indication of how well each team would be expected to perform relative to other teams in a game on neutral turf, with no injuries, with the same number of days break, etc., so I won’t be overly pedantic on that point.   

Roby’s rankings collate and model data on: final margins; score differentials over the course of the game; the team’s expected performance based on team selection, form, home ground advantage, breaks, travel, and historical matchups; and previous/current ranking position(s), in-game injuries, umpiring decisions, and weather conditions. As I said previously, my own rankings and the rankings over at AFL Footy Maths are based on final margins, home ground advantage, and strength of opponent. If the other factors Roby includes are found to have a significant impact on performance, if he can assess those factors accurately, and if he is at least as accurate as I am on the factors that I include, then his ranking system will be more accurate. But how significant are those factors likely to be, and how accurately can they be assessed? I don’t have the evidence – Roby may or may not, but I don’t think he’s shared it – so the following represents my best guesses.

Score differentials over the course of the game: Factoring this in is saying that not only the final margin matters but also how you get there – being up by 80 points at three quarter time and then winning by 40 points is more dominant than being even at three quarter time and then winning by 40. A fair proportion of the football following public would agree with this. I don’t know if, empirically, it helps in terms of predicting future performance, but it doesn’t seem unreasonable that it could, and you can quantify it.  

Team selection: I’d say this could be similar to the ‘Full Strength Indicator’ that Champion Data produces.  One has to make some judgment calls as to what a team’s best line-up is, but if you have an accurate method of rating players this could be useful.

Breaks and travel: Well, coaches often seem to think these things matter. Again, I don’t know if, empirically, they are right, but it’s not an unreasonable possibility, and these things are easily quantified.

Historical matchups:  I suppose this means that, after accounting for the current strength of each team, if a team does better against a particular team than it has done in the past then its performance is rated more highly. For example, if Hawthorn broke its losing streak against Geelong it would gain points beyond those from beating a team of Geelong’s calibre. That makes sense, though I don’t know if each team has enough ‘bogey’ teams for this to make much difference.

In-game injuries: Now how do you quantify this? If I was to do it, I would need to know the difference in ability between the player who was injured and the player who replaced them in that position. I guess I’d also need some way of quantifying the effect, if there is one, of other players dropping their performance because they are more tired. And in the case where the player stayed on the ground, I would need to know what the reduction in their capacity to perform was (or I could just ignore these cases). Not impossible to do, but not easy.

But I don’t think this factor would make a lot of difference. Because of the way injury news is phrased – for example, ‘So-and-so is OUT!’ – it brings to mind a big gaping hole, but a player who is injured is replaced, and unless you lose a star the drop-off in quality will not be that sharp. In any case, that player is only one of that team’s 18 players on the field. Losing three or four players in one game might well have a big impact on the outcome of that game, but that wouldn’t happen to a particular team too often. Out-of-game injuries would be more important, but that is presumably covered under team selection.

Umpiring decisions: I guess that umpiring decisions could have a significant impact on the outcome of a game, maybe less so over a season as I’d expect errors to even out. But quantifying the impact is a very difficult exercise. You basically have to work out what is the expected scoring impact of each decision, based on where the ball is on the ground, the type of decision, and so on.
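To give a feel for the shape of that exercise, here is an entirely hypothetical sketch – the zones and the expected-points values are invented, just to show what quantifying a single decision might look like:

```python
# Hypothetical expected points of having possession in each zone.
EXPECTED_POINTS = {"forward_50": 2.5, "midfield": 0.8, "defensive_50": 0.2}

# Mirror zone: a free kick here takes possession the opponent would have
# had at the opposite end of the ground.
MIRROR = {"forward_50": "defensive_50", "midfield": "midfield",
          "defensive_50": "forward_50"}

def free_kick_swing(zone: str) -> float:
    """Expected scoring swing of a free kick received in `zone`: the points
    you gain from possession plus the points the opponent is denied."""
    return EXPECTED_POINTS[zone] + EXPECTED_POINTS[MIRROR[zone]]

print(free_kick_swing("forward_50"))   # -> 2.7, on these made-up numbers
```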

Besides this, any judgment about umpiring decisions will be subjective. Now I have no reason to believe that Roby is a bad judge of umpiring decisions, and it sounds like he does carefully review them (that also sounds like a hell of a lot of work). But try explaining this to other football followers, most of whom will swear their team was crucified by the umpires last weekend. (I swear that Richmond has been crucified by the umpires for the past thirty years.) Adding in the effects of umpiring decisions could make the model more accurate, but it’s a tough sell.

In the end then, all of the factors that Roby has added make sense, but without knowing the empirical evidence for their inclusion, and how they affect the final rankings, it is hard to comment on their worth. Roby could help out here, and could certainly help the credibility of his rankings, by revealing this information. On the other hand, if I put in as much effort to generate rankings as Roby seems to, and those rankings actually did generate betting profits over the long run, I’d probably be reluctant to reveal that information as well.

Thanks to John Murray and AFL Footy Maths for the discussions which led to this post.

Monday, May 13, 2013

AFL Power Rankings: Round 7 2013

RISING UP

They may not have jumped many spots in the rankings this week, but Adelaide and Gold Coast had the largest gains in ranking points. Adelaide had a mammoth win on the road against the GWS Giants, while Gold Coast had a big win away against Melbourne. Gold Coast now has its best rating ever.

FALLING DOWN

Conversely, while they did not fall down any spots – because they did not have anywhere left to fall – Melbourne and GWS dropped significantly further away from the rest of the competition in terms of ranking points.

ALSO OF NOTE

Essendon is now 10th. The reasons for the disparity between the Bombers’ 2013 ladder position and their ranking position were covered here, and here, and were also discussed over at AFL Footy Maths. Essentially, given the way these ranking systems currently work, Essendon’s five weeks of very good performances this season are offset to a large extent by its run of bad performances from late last season, even if that evidence is decreasing in importance by the week. Every rankings system has its outliers … and indeed, the Dons may yet prove not to be one.

(And while I shouldn’t keep pointing out ‘weird’ things with my own rankings, the gain in points for Carlton and the loss in points for St. Kilda might be noticed. What I’ll say there is that, as other teams’ ratings change, the ‘worth’ of the Blues’ and Saints’ recent performances is adjusted, and those adjustments outweighed the impact of the Saints’ win this week, which was slight given the close margin. In the Saints’ case, their past four opponents before Carlton dropped in ‘worth’ this week, while Carlton was largely helped by the Crows’ improvement.)

1 (Last week: 1) Hawthorn 38.9 (38.3)
2 (2) Geelong 24.4 (24.4)
3 (3) Sydney 20.6 (24.2)
4 (4) Fremantle 18.8 (18.5)
5 (5) West Coast 15.6 (15.4)
6 (7) Adelaide 15.5 (11.0)
7 (8) Carlton 11.3 (11.0)
8 (6) Collingwood 11.3 (13.9)
9 (10) Richmond 10.5 (7.8)
10 (9) Essendon 7.6 (10.0)
11 (11) North Melbourne 4.7 (5.6)
12 (12) St. Kilda 3.1 (5.4)
13 (13) Port Adelaide -10.5 (-8.0)
14 (14) Brisbane -22.5 (-21.8)
15 (15) Gold Coast -24.1 (-29.0)
16 (16) Western Bulldogs -32.2 (-31.2)
17 (17) Melbourne -56.2 (-50.3)
18 (18) Greater Western Sydney -65.6 (-59.5)

Thursday, May 9, 2013

The AFL Power Rankings and the Effect Of The Season Break


In last week’s ‘AFL Power Rankings’, I noted that the Bombers’ fall in ranking from 6th to 9th may be a bit controversial given that they had just won their sixth straight game to start the 2013 season. Essendon also fell down the rankings over at AFL Footy Maths, moving into 12th spot.

On Twitter, Greg Jericho suggested that perhaps both ranking systems needed to change their weightings for matches in the previous season. It is certainly something I have thought about before. Currently, in my rankings system, a team’s result from four matches ago carries more weight than its result from five matches ago, but it makes no difference whether those matches were one week apart or six months apart. There are some reasons to think that a team’s performance may change by more over a season break than within a season – players age over a season break by as much as they do within a season, players leave and new players are recruited, and so on. What I need to find out is whether the relationship between a team’s performance at the end of one season and its performance at the start of the next is very different from the variation in a team’s performance within a season. There are two possibilities, each with its own implication:
 
1) They are very different – and that means a match should, all other things equal, carry less weight if it was played in the previous season than if it was played in the current season.

2) They are not very different – and therefore the weight of a match should only depend on how many matches ago it was for the team.

To test this, I summed up the net margin for each team over its first five matches of a season, and then compared that to the net margin for each team over its next five matches of that season, and to the net margin for each team over its last five (home and away) matches of the previous season. Obviously net margins will depend on which teams that team plays, but I’m assuming that for the purpose of this comparison those effects will even out across teams. I only did this for each of the seasons from 2008 to 2012 because, strange as it may seem, I don’t have a spreadsheet of the result of every match in AFL history lying around, and so I had to undergo the tedious exercise of entering in the results for the seasons prior to the advent of the Power Rankings. That does not give a large sample size, so all that is at stake here for the moment is how far I am willing to pursue this question further.
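In code, the comparison looks something like the sketch below, assuming a `results` table of home-and-away matches with columns `season`, `round`, `team` and `net_margin` (one row per team per match – the table layout and column names are my assumption):

```python
import pandas as pd

def five_match_sums(results: pd.DataFrame, season: int) -> pd.DataFrame:
    """Per team: net margin summed over the first five and next five matches
    of `season`, and over the last five matches of `season - 1`."""
    out = {}
    for team, games in results.sort_values(["season", "round"]).groupby("team"):
        cur = games.loc[games["season"] == season, "net_margin"]
        prev = games.loc[games["season"] == season - 1, "net_margin"]
        out[team] = {"first5": cur.iloc[:5].sum(),
                     "next5": cur.iloc[5:10].sum(),
                     "last5_prev": prev.iloc[-5:].sum()}
    return pd.DataFrame(out).T

# Then, for each season:
# sums = five_match_sums(results, 2012)
# print(sums["first5"].corr(sums["next5"]))       # within-season correlation
# print(sums["first5"].corr(sums["last5_prev"]))  # across the season break
```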

Correlations of AFL teams’ net margins over first five matches of season with next five matches of season and last five (home and away) matches of previous season

Season    Correlation with next five    Correlation with last five
          matches of season             matches of previous season
2012      0.46                          0.51
2011      0.61                          0.44
2010      0.36                          0.38
2009      0.74                          0.58
2008      0.65                          0.27
As you can see, over the past five seasons there is, on average, a greater correlation between a team’s net margin over its first five matches of the season and its next five matches of the season than with its last five (home and away) matches of the previous season. Again, it’s only five seasons of observations – it might be that the previous fifty seasons show the reverse pattern! – but perhaps the question of the effect of the AFL season break on performance is something I should give further thought to. If only I had a spreadsheet of every VFL/AFL result ...

Sunday, May 5, 2013

AFL Power Rankings: Round 6 2013

RISING UP

With the mid-range teams being closely bunched together, Collingwood, Adelaide and Carlton all jump up a couple of spots. Collingwood comfortably beat St. Kilda, Carlton thrashed Melbourne by even more than the Demons usually get beaten, and Adelaide had a close game against the #1-ranked Hawks.

FALLING DOWN

With the mid-range teams being closely bunched together, Richmond drop from 7th to 10th after a sizable loss to Geelong. That movement in the rankings should be fairly uncontroversial.

What will be considerably less uncontroversial to those who stumble across these rankings is Essendon falling from 6th to 9th, given that:

- Essendon just won their sixth straight game to start the season, and are now one of only two undefeated teams.
- Essendon has beaten three of the teams above it in the rankings this season: Collingwood (easily), Adelaide (easily, and away to boot), and Fremantle (away).
- Adelaide lost, Essendon won, and Adelaide still moved above Essendon in the rankings.

Essendon has built up a substantial number of ranking points this season due to its excellent start, but it is still only considered by the rankings system to be mid-range because it was pretty bad in the second half of 2012. And the reason it moved below Adelaide and Carlton this week is because it was relatively more lacklustre against GWS (yes, a 39-point win at home against GWS is considered lacklustre) than Carlton was against Melbourne or Adelaide was against Hawthorn.

If the game against GWS is the bursting of Essendon’s bubble then the Dons’ current ranking might look like genius in a few weeks’ time. If it is not, then I imagine it’ll look pretty stupid. Regardless, if Essendon beat Geelong next week, you can bet it’ll be rated a lot more highly on this blog.

ALSO OF NOTE

Geelong and Sydney have broken from the pack, and the rankings now consider them clearly the best teams in the land after the Hawks.

1 (Last week: 1) Hawthorn 38.3 (37.2)
2 (3) Geelong 24.4 (19.7)
3 (2) Sydney 24.2 (22.2)
4 (4) Fremantle 18.5 (17.8)
5 (5) West Coast 15.4 (13.6)
6 (8) Collingwood 13.9 (10.7)
7 (9) Adelaide 11.0 (9.6)
8 (10) Carlton 11.0 (9.1)
9 (6) Essendon 10.0 (13.3)
10 (7) Richmond 7.8 (10.7)
11 (12) North Melbourne 5.6 (5.0)
12 (11) St. Kilda 5.4 (8.1)
13 (13) Port Adelaide -8.0 (-9.5)
14 (14) Brisbane -21.8 (-21.7)
15 (15) Gold Coast -29.0 (-29.8)
16 (16) Western Bulldogs -31.2 (-31.0)
17 (17) Melbourne -50.3 (-49.4)
18 (18) Greater Western Sydney -59.5 (-63.9)

Thursday, May 2, 2013

Film Review: The Never Ending Story


On a recent plane flight home (and actually, on TV recently as well) I watched ‘The Never Ending Story’ for the first time since I was a kid. This was significant because, when I was a kid, ‘The Never Ending Story’ scared the bejesus out of me. I can’t precisely remember where or how I saw it (I think it was at a school friend’s house), but wherever it was I was only half-watching it, so my lasting impression was of a series of disturbing, nightmarish images. Watching it again as an adult, the movie was of course nowhere near as scary. Still, it reminded me of what gave me the willies as a child – consider these scenes:

a) The giant Rock Biter appears: As a kid I didn’t like giants, and I especially didn’t like giants reaching out and eating things – it made me feel helpless. So the introduction of a giant that was grabbing and munching up the landscape didn’t exactly make me feel comfortable.

b) The horse drowning: I didn’t actually remember this scene until I watched the movie again, and it’s possible I didn’t see it the first time. But if I did, I think a sad horse sinking into the swamp would likely have left me feeling depressed for the entire weekend.

c) The gate to the Southern Oracle: In this scene, a knight tries to pass two giant fearsome-looking sphinxes, which proceed to shoot beams out of their eyes and obliterate him. Freaking terrifying. The Southern Oracle crumbling to bits after speaking creepily to Atreyu just made it worse.

d) Atreyu meets Gmork in Spook City: If there is a more disturbing scene in a so-called children’s film than Atreyu looking at spooky pictures of scenes from the movie to date, and then seeing the picture of the wolf-like Gmork in the shadows … and then one second later, actually seeing Gmork in the shadows! … I can’t immediately think of it.* Though I admit that, watching it again, it wasn’t nearly as dark as it had been in my memory for twenty years.

e) Fantasia is destroyed: The image of Fantasia reduced to nothing more than a few shards of rock floating in space is the final dark, depressing, devastating exclamation mark on an already dark and depressing film. It’s quite possible I stopped watching the movie altogether at this point – indeed, since I didn’t remember the ending at all, it’s quite probable.

Frankly, the only characters that didn’t freak me out were Falcor, the two boys, and maybe the Empress (though I can’t be sure of that last one). But I loved the theme song! (Though even here, my memories were faulty – I thought there was this incredibly romantic line ‘Don’t close your eyes or she may fade away’ when in fact it’s ‘Show no fear for she may fade away’.) And I’m glad I watched it again (and again); otherwise I might still have been cowering away from this movie in my 50s.

*I did think of it later – the room of heads in ‘Return To Oz’. I definitely stopped watching that movie at that point.