Sunday, June 30, 2013

AFL Power Rankings: Round 14 2013


Richmond has been the in-form team over their past five games, as shown in my nifty new summary table below. Even though they have played relatively weak opposition, the Tigers have been winning by large enough margins to gain over 10 ranking points and a spot in the top four. As the nifty new summary table shows, Richmond has made all of its gains for the season in its past five games.


St. Kilda meanwhile continues to slip further away. As the nifty new summary table shows they have lost the most ranking points of any team this season (over five goals worse), with half of that loss coming in their past five games.


Did I mention I had a nifty new summary table? Apart from each team’s ranking points as of this week, it also includes the change in points over the past week, over the team’s past five games, and over the season so far. Taking the example of Essendon - because they’ve been the example for everything in the 2013 rankings - the Dons have made massive gains over the season to date, but have cooled off a bit in their past five games, though they racked up a few points last week from their win over the Eagles in Perth. (It’s all made possible thanks to my new best friend, the VLOOKUP function in Excel.)  
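For anyone curious how the summary table's columns are put together, here is a rough Python sketch of the same lookup-and-subtract logic that the VLOOKUP formulas do in Excel: for each team, look up its ranking points at earlier rounds and take differences. All of the numbers below are hypothetical placeholders, not the actual rankings.

```python
# Hypothetical illustration of the summary-table columns. The round-14
# figures (and the season-start figures) are made up for the example.
points_by_round = {
    # round: {team: ranking points at that round}
    0:  {"Essendon": -8.0, "Richmond": 4.0},   # season start (hypothetical)
    9:  {"Essendon": 7.5,  "Richmond": 4.5},   # five games ago (hypothetical)
    13: {"Essendon": 6.3,  "Richmond": 14.7},  # last week
    14: {"Essendon": 8.0,  "Richmond": 15.0},  # this week (hypothetical)
}

def summary_row(team):
    """One team's row: current points plus change over the week,
    the past five games, and the season to date."""
    now = points_by_round[14][team]
    return {
        "points": now,
        "week": round(now - points_by_round[13][team], 1),
        "last_5": round(now - points_by_round[9][team], 1),
        "season": round(now - points_by_round[0][team], 1),
    }

for team in ("Richmond", "Essendon"):
    print(team, summary_row(team))
```

The dictionary lookup `points_by_round[13][team]` plays the role of VLOOKUP: find the row for a team in an earlier round's table and pull out its points.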

The Finger Points Outwards - No. 61

VIDEO GAMES: A new series of books on classic video games.

VIDEO GAMES: How to play the coin sound in Super Mario Bros.

TELEVISION: Which show portrays marriage worse: Mad Men or Game of Thrones?

BASKETBALL: Bill Simmons on the career of NBA legend Tim Duncan.

BASKETBALL: LeBron James has two NBA championships – what are his odds of catching Jordan’s six?

ECONOMICS: A wake up app that economists would love.

FASHION: I am not a massive fan of tattoos, but I do like these ones.

Tuesday, June 25, 2013

Why Aren't LeBron's Heat Winning Heaps More Than LeBron's Cavs?

In 2009-10, LeBron James and the Cleveland Cavaliers won 61 out of 82 games, and in 2008-09 they won 66 games. When James left the Cavs to join NBA All-Stars Dwyane Wade and Chris Bosh at the Miami Heat, some thought the Heat might win over 70 games. While the Heat have won a lot of games, their winning percentage has not really been any higher than the Cavs' in the couple of seasons before LeBron's departure: they won 58 games in 2010-11, 46 out of 66 games in the lockout-shortened 2011-12 season, and 66 games in 2012-13. The Heat have won two NBA championships, but they have not steamrolled opponents in the way one might have expected given they have two All-Stars and the best player in the game. Why?

To answer this, let's look at the 'wins produced' by each player on the Cavs and the Heat during these seasons. 'Wins produced' is a formula popularised and in part devised by economist Dave Berri to measure the worth of each NBA player. If you don't think this formula means anything, you may as well stop reading here. But assuming it is useful, let's see what it says.

Here are the wins produced by the Cavs in 2008-09 and 2009-10 (courtesy of NBA Geek):

And here are the wins produced by each Heat player from 2010-11 to 2012-13:

LeBron has been pretty consistent, although he produced a bit more during the Cavs years, when he was at an age at which players (according to the wins produced metric) are more likely to be at their peak. Dwyane Wade is clearly a better 'second banana' than anyone on the Cavs was. But the difference between Wade and the 'second banana' Cavs (Delonte West in 2008-09, Anderson Varejao in 2009-10) does little more than offset the difference between 'peak Cavs LeBron' and 'slightly past peak Heat LeBron'.

Chris Bosh, meanwhile, does not produce that much more than a bunch of other Heat players, at least not over the past two seasons. Hence, even though he is an All-Star, the wins produced metric does not rate him any higher than, say, Mo Williams on the Cavs (though Williams was named an All-Star too). And the rest of the squads are pretty even.

Have a look at the 2008-09 Cavs and the 2012-13 Heat, which both won 66 out of 82 games. LeBron and West on the Cavs produced as much as LeBron and Wade on the Heat. Williams, Varejao, Ben Wallace, and Wally Szczerbiak on the Cavs collectively produced a couple more wins than Ray Allen, Bosh, Shane Battier, and Mario Chalmers.  But the Heat made up that difference  through the rest of their squad, including Mike Miller and Udonis Haslem.

So, in summary, according to the wins produced metric, LeBron's Heat don't win heaps more than LeBron's Cavs because: LeBron isn't quite as productive on the Heat as he was in his last two seasons with the Cavs; Wade is clearly better than the Cavs' second-best player, but only by enough to offset the drop-off in LeBron's production; and Bosh is pretty much only an average NBA player on the Heat.

Sunday, June 23, 2013

AFL Power Rankings: Round 13 2013

Port Adelaide move up another spot after beating the #2-ranked Sydney Swans. Having fallen a long way behind the pack during 2012, the Power have climbed only two spots in 2013, but they have gained 14 ranking points.
Sydney and Geelong both drop back a few points this week after losing to relatively lowly-ranked teams in Port and Brisbane, with the Cats moving down a spot to 4th.
I’m calling the Bulldogs just Western this week because there’s nothing much else to say, and watching them play on the weekend reminded me that I’m still annoyed that they changed to that silly name from Footscray.  
1 (Last week: 1) Hawthorn 28.9 (Last week: 31.4)
2 (2) Sydney 24.8 (29.3)
3 (4) Fremantle 21.9 (21.9)
4 (3) Geelong 19.7 (22.9)
5 (5) Richmond 14.7 (14.1)
6 (6) West Coast 12.8 (12.6)
7 (7) Collingwood 10.7 (12.0)
8 (8) Carlton 9.0 (9.7)
9 (10) Essendon 6.3 (7.0)
10 (9) North Melbourne 6.3 (7.4)
11 (11) Adelaide 5.8 (6.7)
12 (13) Port Adelaide -8.6 (-12.2)
13 (12) St. Kilda -11.3 (-9.6)
14 (14) Gold Coast -17.3 (-16.0)
15 (15) Brisbane -18.9 (-21.2)
16 (16) Western  -26.9 (-25.5)
17 (17) Melbourne -58.3 (-60.3)
18 (18) Greater Western Sydney -69.5 (-68.9)

Friday, June 21, 2013

The Evolution of the AFL Leading Goalkicker

In May last year, I looked at which was the most impressive goalkicking season in VFL/AFL history. This was based on each season's leading goalkicker's "Goal Efficiency Rating", which was calculated as:

Goal Efficiency Rating = Goals kicked by player in home-and-away season / [Average goals per match for home-and-away season (all teams) * Number of matches in home-and-away season]

Hence, this adjusts for players in earlier seasons having fewer games in which to score goals, and for scoring in earlier seasons being less common. (Based on the GER, I said that Gordon Coventry's 1929 was the most impressive.)
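The GER formula translates directly into code. The inputs in the example below are illustrative only (chosen so the rating comes out near the 30 per cent mark), not verified 1929 figures.

```python
# Goal Efficiency Rating, transcribed from the formula above.
def goal_efficiency_rating(player_goals, avg_goals_per_match, num_matches):
    """Player's home-and-away goals as a share of the goals an 'average'
    match would produce across the whole home-and-away season."""
    return player_goals / (avg_goals_per_match * num_matches)

# Illustrative (not verified) inputs: 124 goals in an 18-match season in
# which the average match produced 23 goals in total.
print(round(goal_efficiency_rating(124, 23.0, 18), 3))  # roughly 0.3
```

A rating of 0.3 means the player kicked the equivalent of 30 per cent of an average match's goals, every match, all season.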

One point I made in that post was that the drop in reliance on one player (generally the full-forward) to kick goals seems to have been happening for a while, but it hadn't really been noticed due to the higher scoring. Well, a year on, it occurred to me to show this graphically.

The graph above shows that the GER was rising from 1897 and peaked in about the 1930s/40s. Since then it's been on a downward trend (though of course spiking up when great goalkickers like John Coleman and Peter Hudson were playing). But because scoring reached its peak in the 1980s/90s, and there were more games, players like Jason Dunstall, Tony Lockett, and Gary Ablett routinely beat the goalkicking totals of forwards past*, although those players were also outstanding for their era. Now the reliance on a primary goalkicker looks to be as low as it's ever been. The game has changed, and at this rate, Coventry's feat of scoring 30 per cent of an average match's goals looks like it will take a long time to beat.

*Scoring spiked up again in the year Buddy Franklin kicked 102 (in 2008).

Thursday, June 20, 2013

Mid-Year AFL All-Australian Team 2013

As always, there are absolutely no statistics or arguments to back these selections up, but this is who I think would make the AFL All-Australian team at this point:

B: Scott Thompson (NM), Jake Carlisle (Ess), Michael Hibberd (Ess)
HB: Andrew Walker (Carl), Harry Taylor (Geel), Grant Birchall (Haw)
C: Patrick Dangerfield (Adel), Jobe Watson (Ess), Daniel Hannebery (Syd)
HF: Lindsay Thomas (NM), Nick Riewoldt (StK), Steve Johnson (Geel)

F: Jeff Garlett (Carl), Josh Kennedy (WC), Jarryd Roughead (Haw)

R: Will Minson (WB), Scott Pendlebury (Coll), Gary Ablett (GC)

I: Kieren Jack (Syd), Kane Cornes (PA), Jarrad McVeigh (Syd), Nat Fyfe (Fre)

One idea I had was to name a "Freakin' Awesome" team. Basically, there would only be two selection criteria for this team:

1) The team as a whole should be able to beat any AFL team. Thus, if you have Majak Daw and all of his freakin' awesomeness on the team you probably need either Gary Ablett or Scott Pendlebury (both also awesome) to make sure you get the win.

2) If anyone asks you why such-and-such player is on the team, you just say "Because they're freakin' awesome!"

But strangely enough, for a team that should be selected on instinct, I need to think a bit more about this before season's end. Having said that, Ablett, Pendlebury, Dangerfield, Steve Johnson, and Harry Taylor would be walk-up starts.

Sunday, June 16, 2013

AFL Power Rankings: Round 12 2013


With two good wins against West Coast and Adelaide, Richmond rise to the top of the mid-range pack. But with only seven ranking points separating 5th from 11th, it doesn’t take much to move back down again, as evidenced by …   


… the Adelaide Crows, who have crashed from 5th to 11th after last week’s thrashing from Sydney, and then being beaten comfortably this week by the Tigers. The Crows were last year’s big improvers, but they are coming closer to returning to negative ranking point territory.   


Not much really. With six teams having the week off there weren’t too many movements in the rankings.

1 (Last week: 1) Hawthorn 31.4 (Last week: 33.3)
2 (2) Sydney 29.3 (30.9)
3 (4) Geelong 22.9 (23.2)
4 (3) Fremantle 21.9 (23.8)
5 (7) Richmond 14.1 (11.1)
6 (6) West Coast 12.6 (12.4)
7 (5) Collingwood 12.0 (14.2)
8 (9) Carlton 9.7 (8.6)
9 (10) North Melbourne 7.4 (8.1)
10 (11) Essendon 7.0 (6.7)
11 (8) Adelaide 6.7 (9.1)
12 (12) St. Kilda -9.6 (-8.7)
13 (13) Port Adelaide -12.2 (-14.4)
14 (14) Gold Coast -16.0 (-14.6)
15 (15) Brisbane -21.2 (-22.1)
16 (16) Western Bulldogs -25.5 (-26.1)
17 (17) Melbourne -60.3 (-60.8)
18 (18) Greater Western Sydney -68.9 (-65.9)

Wednesday, June 12, 2013

Graphic Novels You Would Like If You Weren’t Too Chicken To Read Them: Miracleman

Alan Moore’s ‘Miracleman’ (known as ‘Marvelman’ in the UK, but changed in the US for obvious Marvel Comics-related reasons) is the ‘lost’ classic of superhero comics, yet still one of the most important series ever. Various legal disputes have meant that it has remained out of print in any form for roughly two decades – except online, which is where I finally got to read it. Before Moore’s own ‘Watchmen’ and Frank Miller's ‘The Dark Knight Returns’ deconstructed superheroes, Moore and his various artists showed us what might happen if superpowered beings really lived among us.

The main character of the series is Mike Moran, a normal middle-aged guy with an expanding gut and dreams of flying. Turns out that Mike was once the godlike Miracleman, and one day when he says his ‘magic word’ his alter-ego is suddenly reborn. Moore uses the character of Miracleman (who originally began as a hero in the 1950s) to turn just about every superhero trope on its head. Miracleman’s former sidekick, Kid Miracleman, has grown up, and become a brutal sociopath. Miracleman’s ‘adventures’ also get re-evaluated as the series progresses, and he learns they may not be quite as ‘real’ as he remembers. Fantasy is actually a major theme in the series, as the power and glamour of saying his magic word and becoming instantly super-powered begin to consume Mike’s ordinary life. There is also a superhuman childbirth in issue #9 - but be warned that it does not spare you the gory details.        

And then there is the infamous issue #15 …  in the case of this issue I really mean that you might be too chicken to ever read it. Kid Miracleman returns with a vengeance, and takes out his loathing on London’s populace, which given his godlike powers, has catastrophic and gruesome results. Miracleman, who has been off-planet, returns to stop him, and their ensuing battle shows what might well happen if two guys with superpowers fought it out in the midst of a major city. It is horrifying to read, but it is also one of the greatest single issues ever. Arguably just as good is issue #16 - Moore’s final issue – which deals with the drastic steps that Miracleman takes in the aftermath of the massacre to ensure that the world is protected going forward. Many subsequent series would take up Moore’s ideas about what might happen if superheroes decided that the best way to protect ordinary humans was to rule them.
‘Watchmen’ will always be, to my mind, Alan Moore’s greatest achievement (if one can forget the movie), but ‘Miracleman’ is not far behind it. Unlike ‘Watchmen’, the story does not fit into neat little boxes – it’s messy and sprawling, and it changes style and tone quite a bit during its 16-issue run, but there is no doubt it will stick in the memory of anyone who reads it. If the legal rights cannot get sorted out, make sure to read it online, and you’ll feel very differently when it comes time to watch the next superhero summer movie blockbuster.

Monday, June 10, 2013

AFL Power Rankings: Round 11 2013


Sydney shoots up to a clear 2nd after thrashing a well-rated team in Adelaide on its home turf. The reigning premiers are now only a few ranking points away from taking the top spot.


Conversely, Adelaide falls back to the pack following its big loss. After being the big riser last week, North Melbourne return to whence they came, dropping from 6th back down to 10th after losing against the Gold Coast Suns.


Six teams had the bye this week, but teams can change in ranking points even if they do not play. This is because the ‘worth’ of their previous results is re-adjusted as the teams that did play this week rise or fall in value.
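The post doesn't spell out the rankings formula, so the following is only a hypothetical sketch of the general idea: if the worth of a past result is always measured against the opponent's current rating, then an idle team's points shift whenever its past opponents are re-rated.

```python
# Hypothetical sketch only - not the actual Power Rankings formula.
def result_worth(margin, opponent_rating):
    """Worth of a past result, revalued against the opponent's
    current rating rather than its rating at the time."""
    return margin + opponent_rating

# Team A beat Team B by 10 points back when B was rated 5.0:
print(result_worth(10, 5.0))  # 15.0
# B then plays well during A's bye and is re-rated to 8.0,
# so A's old win is now worth more, even though A didn't play:
print(result_worth(10, 8.0))  # 18.0
```

Under a scheme like this, averaging each team's revalued results every week moves idle teams up or down automatically.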

1 (Last week: 1) Hawthorn 33.3 (Last week: 33.3)
2 (4) Sydney 30.9 (23.7)
3 (3) Fremantle 23.8 (24.5)
4 (2) Geelong 23.2 (24.6)
5 (7) Collingwood 14.2 (12.2)
6 (8) West Coast 12.4 (12.0)
7 (9) Richmond 11.1 (11.3)
8 (5) Adelaide 9.1 (16.7)
9 (10) Carlton 8.6 (10.6)
10 (6) North Melbourne 8.1 (12.2)
11 (11) Essendon 6.7 (6.1)
12 (12) St. Kilda -8.7 (-8.1)
13 (13) Port Adelaide -14.4 (-13.7)
14 (14) Gold Coast -14.6 (-17.6)
15 (15) Brisbane -22.1 (-21.8)
16 (16) Western Bulldogs -26.1 (-25.5)
17 (17) Melbourne -60.8 (-57.5)
18 (18) Greater Western Sydney -65.9 (-66.6)

Saturday, June 8, 2013

The Wooden Finger Hundred: June 2013

Alright, there’s no way to pretend this post isn’t self-indulgent … This weekend, Triple J are counting down the Hottest 100 songs of the past 20 years, as voted by aging Gen Xers. And I saw a blog post this morning with 10 personal favourites, so what the hell… I’m going to post my favourite 100 songs from the past 20 years. There’s no way I’m going to go through and justify all my selections, but most of my top favourites were discussed a few years back. And in the spirit of Triple J countdowns there’s a Powderfinger song in there, and an Australian song is #1.  

100. Stillness Is The Move – Dirty Projectors
99. My Girls – Animal Collective
98. Better Than Sunday – Ladyhawke
97. Summertime – The Sundays
96. 19-2000 – Gorillaz
95. Lover, You Should’ve Come Over – Jeff Buckley
94. Apply Some Pressure – Maximo Park
93. This Day – The Sleepy Jackson
92. These Are The Ghosts – The Bees
91. The Modern Age – The Strokes
90. Staring At The Sun – TV On The Radio
89. Mykonos – Fleet Foxes
88. Take Me Out – Franz Ferdinand
87. Maps – Yeah Yeah Yeahs
86. Where Did All The Love Go? - Kasabian
85. I’m His Girl – Friends
84. Lost! – Coldplay
83. The Rat – The Walkmen
82. Empire State Of Mind – Jay-Z and Alicia Keys
81. No One Knows – Queens Of The Stone Age
80. Caring Is Creepy – The Shins
79. Take Your Mama Out – Scissor Sisters
78. Wrecking Bar (Ra Ra Ra) – The Vaccines
77. Shampain – Marina & The Diamonds
76. Some Might Say – Oasis
75. Clocks – Coldplay
74. The Same Boy You’ve Always Known – The White Stripes
73. The Day We Caught The Train – Ocean Colour Scene
72. Two Weeks – Grizzly Bear
71. Mistaken For Strangers – The National
70. Homesick – The Vines
69. Chelsea Dagger – The Fratellis
68. Ambulance – Blur
67. Changing The Rain – The Horrors
66. Pennyroyal Tea – Nirvana
65. Monoliths – Lotus Plaza
64. Common People – Pulp
63. King Of The Rodeo – Kings Of Leon
62. Catch The Sun – Doves
61. Yuko & Hiro – Blur
60. Little Secrets – Passion Pit
59. Seven Nation Army – The White Stripes
58. Feel The Love – Cut Copy
57. Angels – Robbie Williams
56. Wires – Athlete
55. Blue Jeans – Blur
54. Fallen Angel – Elbow
53. When You Were Young – The Killers
52. Hate To Say I Told You So – The Hives
51. Sister Surround – The Soundtrack Of Our Lives
50. The Scientist – Coldplay
49. Hey Ya! – Outkast
48. Breaking Into Heaven – The Stone Roses
47. House Of Jealous Lovers – The Rapture
46. Madder Red – Yeasayer
45. Alice Practice – Crystal Castles
44. This Is Hardcore – Pulp
43. Champagne Supernova – Oasis
42. Dreaming Of You – The Coral
41. Neighbourhood #2 (Laika) – The Arcade Fire
40. The Cedar Room – Doves
39. Lazarus – The Boo Radleys
38. Are You Gonna Be My Girl? – Jet
37. Velvet – The Big Pink
36. Jenny Was A Friend Of Mine – The Killers
35. Don’t Look Back In Anger – Oasis
34. Time To Pretend – MGMT
33. Up The Bracket – The Libertines
32. The Bends – Radiohead
31. A New Decade – The Verve
30. Hurt – Johnny Cash
29. Ruby – Kaiser Chiefs
28. Jigsaw Falling Into Place - Radiohead
27. L.E.S. Artistes – Santogold
26. Club Foot – Kasabian
25. Rebellion (Lies) – The Arcade Fire
24. 2080 - Yeasayer
23. Hoppípolla – Sigur Ros
22. Sunsets – Powderfinger
21. Last Goodbye – Jeff Buckley
20. Sulk – Radiohead
19. Being Bad Feels Pretty Good – Does It Offend You Yeah?
18. Howl – Black Rebel Motorcycle Club
17. Whatever – Oasis
16. The Golden Age – Beck
15. Animal Nitrate - Suede
14. C’mon C’mon – The Von Bondies
13. Get Free – The Vines
12. History – The Verve
11. Tongue Tied – Grouplove
10. Midnight City – M83
9. Getting Away With It (All Messed Up) - James
8. Live Forever – Oasis
7. Goddess On A Hi-Way – Mercury Rev
6. Hysteria – Muse
5. So Haunted – Cut Copy
4. Hard To Beat – Hard-Fi
3. Float On – Modest Mouse
2. You Only Live Once – The Strokes
1. Shark Fin Blues – The Drones


Friday, June 7, 2013

What Has Happened To St. Kilda and Fremantle?

In late 2011, St. Kilda’s coach Ross Lyon left the Saints to coach the Fremantle Dockers. In 2011, the Saints made the finals and the Dockers missed out. However, since then Fremantle has performed better than St. Kilda. The Dockers made the finals in 2012 while St. Kilda missed out. And after round 10 in 2013, Fremantle is in the top four and St. Kilda is in the bottom four. Does this mean that Ross Lyon’s defection is responsible for the turnaround in fortunes? Well, it may be part of the reason, but as always, the explanation does not appear to be quite as simple as that.

Let’s have a look at the performance of each of these teams’ players from 2011 to 2013. I am going to use each player’s average SuperCoach score as a measure of his performance; it’s debatable how good that indicator is as a measure of a player’s worth, but I can’t think of a better one. Similarly, I am going to use each team’s total average SuperCoach score per game as a measure of how well that team played each year. I am also going to measure the total average SuperCoach score per game across that team’s ‘best 22’, where the ‘best 22’ are simply those 22 players with the highest average SuperCoach scores that year. That helps to quantify the impact of injuries on a team’s performance. This method pays no attention to a player’s position, but I think the conclusions you can draw from it are somewhat useful.
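As a rough illustration of the method just described (with entirely made-up SuperCoach averages), the two team-level measures can be computed like this:

```python
# Sketch of the method above, using hypothetical player averages.
def team_average(scores):
    """Average SuperCoach score across the whole squad."""
    return sum(scores) / len(scores)

def best_22_average(scores):
    """Average over the 'best 22' - the 22 players with the highest
    individual average SuperCoach scores."""
    best = sorted(scores, reverse=True)[:22]
    return sum(best) / len(best)

# Hypothetical squad of 26 players' average SuperCoach scores.
squad = [112, 108, 105, 101, 99, 97, 95, 94, 92, 90, 89, 88,
         86, 85, 84, 82, 81, 80, 78, 76, 75, 74, 70, 65, 60, 55]

print(round(team_average(squad), 1))
print(round(best_22_average(squad), 1))
```

The gap between the two numbers is the injury signal: a team whose whole-squad average sits well below its 'best 22' average was presumably fielding its lesser players a lot.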

Alright, this is the performance of each St. Kilda player from 2011 to Round 10 2013 (click to enlarge):

And here is the performance of each Fremantle player:

These numbers suggest that, even though St. Kilda made the finals in 2011 and Fremantle did not, Fremantle’s lower performance was due to injuries, and its ‘best 22’ was ‘better’ than St. Kilda’s ‘best 22’. Top Dockers Aaron Sandilands, David Mundy, and Michael Barlow all missed large chunks of 2011, while pretty much all the top Saints played almost all of the games.

In 2012, Fremantle’s average score improved as Mundy and Barlow returned, although Sandilands and Nathan Fyfe still missed a lot of games. But the performance of their ‘best 22’ also improved, with Paul Duffield, Clancee Pearce, Matthew De Boer, Michael Johnson, Christopher Mayne, and Luke McPharlin all being big improvers. Did these players improve because of Ross Lyon’s coaching? Meanwhile, St. Kilda regressed a bit, but their ‘best 22’ improved from 2011, suggesting injuries had an impact. Sam Fisher, Jason Gram, Clinton Jones, Ben McEvoy, and even Nick Riewoldt missed a few more games in 2012 than they did in 2011.

In 2013, Fremantle has improved even further, with Fyfe only missing two games to date and Barlow improving, although Matthew Pavlich has missed a lot of the season so far. Their ‘best 22’ has fallen back a bit, but that could be attributed to Sandilands’ complete absence in 2013. St. Kilda's ‘best 22’ however has fallen back further. The players who have deteriorated most include Nick Dal Santo, Sean Dempster, Jason Blake, Lenny Hayes, and Stephen Milne. Could they be suffering without Ross Lyon there to coach them? Maybe, but it’s quite possible that the drop in these players’ performance is just due to them being relatively old. (Not that being relatively old means a player will necessarily regress – by contrast Riewoldt has improved.) The Saints also lost one of their top performers to free agency in Brendon Goddard before the 2013 season.

In summary then, these figures suggest that Fremantle had the capability to be better than St. Kilda even before Ross Lyon joined them, but injuries played an important part in holding them back. Since then, quite a few Dockers have improved, which could be due to Lyon’s coaching. Meanwhile, a few Saints have regressed in 2013, but that could primarily be down to them getting ‘old’.

Monday, June 3, 2013

AFL Power Rankings: Round 10 2013


North Melbourne hop from 10th to 6th after a large win against St. Kilda, even if big wins against the Saints do not count for as much as they used to. The Roos’ close losses in 2013 are considered much more favourably by the Power Rankings than they are by the AFL ladder.  


West Coast drop from 6th to 8th after losing at home to the Tigers. Carlton drop even further, from 7th to 10th, but that is because the teams around it had strong performances against better competition, not because the Blues performed below expectations. Having said that, GWS is so diabolically bad that even a 16-goal win against it was considered about par. 


As late as Round 18 in 2012, Fremantle were ranked 12th. But after a blistering end to 2012 (in which they gained over five goals worth of ranking points) and a very good start to 2013 (including a win on the weekend in Adelaide) the Dockers are now a dash away from second. Yes, the Dockers. Yes, Fremantle.

I’ve done a preliminary player-by-player analysis to figure out how the Dockers and St. Kilda have gone in such different directions so quickly after Ross Lyon went west. Depending on how interesting I think it is, I might share the results on here later this week. Essentially it looks like the Dockers’ best 22 were better than the Saints’ best 22 even before Ross Lyon crossed over, but injuries to top players kept Freo back in 2011. Since then, some of the Dockers players have improved, which may or may not be due to Lyon’s coaching. Meanwhile the older players for the Saints have fallen down a bit in 2013 … and of course, they lost a certain top free agent.

1 (Last week: 1) Hawthorn 33.3 (Last week: 32.0)
2 (2) Geelong 24.6 (24.6)
3 (3) Fremantle 24.5 (22.7)
4 (4) Sydney 23.7 (21.6)
5 (5) Adelaide 16.7 (17.5)
6 (10) North Melbourne 12.2 (6.4)
7 (8) Collingwood 12.2 (9.2)
8 (6) West Coast 12.0 (16.3)
9 (11) Richmond 11.3 (5.6)
10 (7) Carlton 10.6 (10.9)
11 (9) Essendon 6.1 (7.4)
12 (12) St. Kilda -8.1 (3.7)
13 (13) Port Adelaide -13.7 (-11.2)
14 (14) Gold Coast -17.6 (-17.9)
15 (15) Brisbane -21.8 (-18.1)
16 (16) Western Bulldogs -25.5 (-28.0)
17 (17) Melbourne -57.5 (-54.5)
18 (18) Greater Western Sydney -66.6 (-64.9)

Rapid Reaction to the 2013 Annual Wage Review

Today, the Fair Work Commission announced that, as a result of its 2013 Annual Wage Review, minimum wages in awards would increase by 2.6 per cent on the first full pay period after 1 July 2013. Similar to the post-game NBA Rapid Reactions I read over at the ESPN website, here are my first thoughts about the Commission’s decision. Views are obviously my own – while I did work at Fair Work Australia in the past, it has been a year since I left, and so I can say I had absolutely bupkis to do with this Annual Wage Review.

- As it did in 2011 and 2012, the Minimum Wage Panel at FWC increased all minimum wages in awards by the same percentage amount (as opposed to the same dollar amount). This means the ratios of the higher award rates of pay to the lower award rates of pay remain the same. Back in March, I wondered if the Panel would continue this practice, and said it would be interesting to see how the recent research on award-reliant employees on higher rates of pay would affect the Panel’s views. But as far as I can see that research isn’t mentioned in the latest decision, and contrary to its past two decisions, the Panel doesn’t discuss why it chose one form of adjustment over the other. On that basis it appears that, while the Panel is charged with adjusting minimum wages, percentage increases are taking on a look of permanency.

- The superannuation guarantee rate will increase each year up to 2019. The Panel says at paragraph 360 that ‘the increase in modern award minimum wages and the [national minimum wage] we have awarded in this Review is lower than it otherwise would have been in the absence of the [superannuation guarantee] rate increase’. But it has not applied a ‘direct, quantifiable, discount’ (para. 359), contrary to some parties’ proposals. Nevertheless, one would expect that, even if there were an explicit ‘discount’, it would be no more than 0.25 per cent, which is the increase per year in the superannuation guarantee rate in both 2013 and 2014. That suggests that, even if the Panel had ignored the increase in the SG rate, the minimum wage increase would still have been lower than 3 per cent.

- The 2.6 per cent increase is just higher than consumer price inflation over the year to the March quarter 2013 of 2.5 per cent. Over the same period the increase in the living cost index for employee households – which accounts for changes in interest charges – was 1.7 per cent. The Reserve Bank of Australia forecasts consumer price inflation over the year to June 2014 of 2-3 per cent. So the increase in minimum wages is broadly in line with both past and expected changes in the cost of living. But as I noted a few months back, if minimum wages just move in line with living costs we would expect that, in the long run, the gap between minimum wages and wages in the rest of the market will further increase, as long as labour productivity is growing.

- The Panel makes note of the growing gap between minimum wages and wages in the rest of the market, saying that it may have implications for the economy and social cohesion (para. 425). That concern for social cohesion may or may not be well-founded – the minimum wage is still above 50 per cent of median earnings – but minimum wage growth has certainly been outpaced by growth in median and average earnings over recent years. The Panel notes that the tax-transfer system can play a role in alleviating the impact of earnings inequality (para. 426), and asks parties to give consideration as to how the tax-transfer system should be taken into account in minimum wage fixation in next year’s review. Nevertheless, even if one saw changes in the tax-transfer system that raise the disposable incomes of low-paid households as a perfect substitute for changes in minimum wages, paragraph 395 shows that minimum wage households have still been receiving lower increases in income than other households over the past decade.

- Something a little weird: in paragraph 40, the Panel says that ‘this Review has not convinced the Panel to alter its position from previous reviews that a modest increase in minimum wages has a very small, or even zero, effect on employment’. But if that is the case, why have a minimum wage increase that is so far below the average increase in wages if the growing gap is possibly a concern? Now I’m not personally saying that minimum wages don’t have a significant effect on employment, but that is what FWC are saying, and therefore by its own logic, it seems a little weird that it is so cautious. Left as it is, it seems like there is some unstated reason why the Panel has gone for a sub-3 per cent increase.

Those are my initial thoughts – the last two points in particular seem to me to indicate that FWC has awarded a lower increase than its own arguments would suggest. Its discussion of a ‘favourable’ economic outlook (paras. 320, 332) lends further weight to this. Perhaps the Panel’s true train of thought was that it had awarded a 2.9 per cent increase in 2012, and economic conditions are generally considered to have deteriorated since then so it had to go lower this year. But that’s purely speculation. Until next year …