Hawthorn extend their lead at the top after smashing the Bombers over the weekend. The Saints edge back up to third, while the Eagles and Pies close the gap on fourth-placed Adelaide after both had massive wins.
(And for those who are wondering: how can the Tigers move further up after losing to Carlton?! ... Don't overthink the ranking positions. Essentially, the Tigers and Blues were close to even before their game on the weekend, there was very little between them in that game, and so they are still close to even after it.)
1 (1) Hawthorn 46.9 (39.9)
2 (2) Sydney 30.5 (29.5)
3 (4) St. Kilda 21.4 (17.1)
4 (3) Adelaide 19.3 (19.2)
5 (5) West Coast 18.4 (12.5)
6 (6) Collingwood 18.2 (12.5)
7 (7) Geelong 12.3 (10.2)
8 (8) North Melbourne 7.3 (6.2)
9 (10) Richmond 1.7 (0.4)
10 (9) Essendon 0.8 (5.0)
11 (11) Carlton 0.7 (0.2)
12 (12) Fremantle -3.6 (-6.0)
13 (13) Brisbane -14.1 (-10.8)
14 (14) Port Adelaide -21.1 (-20.2)
15 (15) Western Bulldogs -28.9 (-26.4)
16 (16) Melbourne -36.4 (-36.9)
17 (17) Gold Coast -46.8 (-47.1)
18 (18) Greater Western Sydney -74.1 (-70.2)
Sunday, July 29, 2012
The Most "Efficient" Premiership Winning Team In VFL/AFL History
Alright, Carlton and Essendon have the most VFL/AFL premierships, with 16 each, but looking at who has won the most premierships is not a very interesting blog post (and doesn't fill our quota of at least one mathematical formula per post). Another question is which team has been the most "efficient" at winning premierships? To answer this, I've devised the following formula:
Premiership Efficiency Rating = (Premierships Won / Number of Seasons Played) - Average Probability Of Winning Premiership
The average probability of winning the premiership is calculated as the average of one divided by the number of teams over all the seasons a club has played. In 1897, the probability of winning was 0.125 (1/8); in 2011 it was 0.059 (1/17). Hence, the average probability is lower for newer clubs because a far higher proportion of their seasons were played in a larger league.
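For the computationally inclined, here is a minimal sketch of how this rating can be computed, given a club's premiership count and the league size for each season it has played. The season counts and league sizes in the example (the Brisbane Lions treated as a new club from 1997, with 16 teams in the league until 2010 and 17 in 2011) are my assumption about how the Lions figure quoted further down is derived; they are not stated in the post itself.

def premiership_efficiency(premierships, league_sizes):
    # Premiership Efficiency Rating:
    # (premierships / seasons played) minus the club's average per-season
    # probability of winning, where that probability is 1 / number of teams.
    seasons = len(league_sizes)
    avg_probability = sum(1.0 / n for n in league_sizes) / seasons
    return premierships / seasons - avg_probability

# Illustrative check under the assumed season counts above: the Brisbane Lions
# counted from 1997, with 3 premierships in 15 seasons.
lions_league_sizes = [16] * 14 + [17]
print(round(premiership_efficiency(3, lions_league_sizes), 3))  # 0.138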
The "premiership efficiency rating" for each team is as follows:
Brisbane (Bears/Lions) 0.056
West Coast 0.056
Essendon 0.055
Carlton 0.051
Collingwood 0.042
Hawthorn 0.037
Adelaide 0.032
Melbourne 0.021
Richmond 0.011
Port Adelaide 0.004
Geelong -0.007
Fitzroy -0.013
North Melbourne -0.032
Sydney -0.052
Gold Coast -0.059
Fremantle -0.062
Western Bulldogs -0.067
St. Kilda -0.078
University -0.100
Brisbane and West Coast come out on top, narrowly ahead of Essendon, assuming that the Brisbane Lions are counted as a continuation of the Brisbane Bears. If they're not, then the Brisbane Lions come out way ahead, with a PER of 0.138, as they have won 3 premierships since they began in 1997 in a league no smaller than 16 teams.
One could apply this formula to other competitions, and the results would be more interesting for the American professional sports leagues, where there has been wide variation in the degree of expansion over time. Indeed, I first had this idea in relation to the Davis Cup in tennis, in which the US and Australia won a lot of titles when fewer countries competed (and the format was different). Depending on how enthusiastic I feel, I might look at other leagues at a later date.
Saturday, July 28, 2012
The Best Sporting Nation On Earth (Per Capita)
In their book "Soccernomics", Simon Kuper and Stefan Szymanski came up with a system for determining which countries have historically performed the best at sports given their population. They awarded points to countries based on their success in the following sports: rugby union, karate, cricket, baseball, basketball, women's soccer, men's and women's tennis, golf, chess, cycling, auto racing, the Summer and Winter Olympics, and the men's soccer World Cup. They then divided each country's total points by its population to work out each country's sporting efficiency rating.
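The per-capita step is a simple division: each country's points total divided by its population in millions. A trivial sketch, where the 20-point total is an illustrative number implied by Norway's figures in the list below rather than a total taken from the book:

def points_per_million(total_points, population_millions):
    # Kuper and Szymanski's efficiency measure: success points per million inhabitants.
    return total_points / population_millions

# Roughly 20 points spread over a population of 5 million gives Norway's
# table-topping 4 points per million (illustrative figures only).
print(points_per_million(20, 5))  # 4.0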
The winner? Norway, and by a considerable margin. Norway has been very successful at the Winter Olympics and women's soccer, and has a population of only five million. Sweden finished second on the table, Australia third, and New Zealand fourth.
Unsurprisingly, on the "raw" points total, the US came out as the most successful sporting nation.
The top 10:
Norway - 4 points per million inhabitants
Sweden - 1.22
Australia - 0.98
New Zealand - 0.97
United Germany - 0.91
Britain/England - 0.85
Hungary - 0.80
West Indies - 0.77
France - 0.67
Italy - 0.54
Sunday, July 22, 2012
AFL Power Rankings: Round 17 2012
The big mover this week in terms of positions is Adelaide, who jump from sixth to third after their big win against the Eagles, while the teams near them all lost this weekend. Another big mover is Geelong, who move back to seventh after their 11-goal win against Essendon.
The top three ranked teams are also now the top three teams on this season's ladder, though not in the same order: the Hawks are still ranked No. 1 here.
1 (1) Hawthorn 39.9 (38.0)
2 (2) Sydney 29.5 (28.8)
3 (6) Adelaide 19.2 (15.1)
4 (3) St. Kilda 17.1 (17.4)
5 (4) West Coast 12.5 (16.0)
6 (5) Collingwood 12.5 (15.1)
7 (9) Geelong 10.2 (5.3)
8 (8) North Melbourne 6.2 (6.5)
9 (7) Essendon 5.0 (9.1)
10 (11) Richmond 0.4 (-0.3)
11 (10) Carlton 0.2 (2.1)
12 (13) Fremantle -6.0 (-8.5)
13 (12) Brisbane -10.8 (-7.4)
14 (14) Port Adelaide -20.2 (-22.0)
15 (15) Western Bulldogs -26.4 (-27.5)
16 (16) Melbourne -36.9 (-35.0)
17 (17) Gold Coast -47.1 (-49.5)
18 (18) Greater Western Sydney -70.2 (-67.6)
Saturday, July 21, 2012
No More Economics Posts
As I mentioned in my last post, I've left Fair Work Australia now and am taking up a new position (starting 30 July). While I could probably post about a number of economics-related subjects that don't relate at all to my new position, rather than cause any stress I've decided the safest course is not to post anything further about economics or economic policy for now. ("Freakonomics"-style subjects such as baby names and cheating in sumo wrestling should still be OK ... not that I'm preparing any blog posts on cheating in sumo wrestling in the near future.)
I have pondered what to do with previous economics and policy-related posts. I'll leave them up for now (though I have added some disclaimers) and just draw a line at this spot.
There, done! Now I might go work on a "Dark Knight" post or something.
Thursday, July 19, 2012
Minimum Wages Research In Australia: 2006-2012
Over the past six years there has been a substantial amount of research relating to minimum wages in Australia. This started with the Australian Fair Pay Commission—the body established by the “Work Choices” legislation to set minimum wage rates—and continued when those wage-setting powers were transferred to Fair Work Australia in 2009. I worked for the AFPC from 2006 to 2009 and FWA from 2009 to 2012, and was involved in producing at least some of that research (as you can see from my name on the cover of some of the reports!). To mark the end of my involvement in minimum wage research, I thought I’d post a list of the research that I personally found the most interesting over that period, and which in my opinion is worth checking out (at least the abstracts) if you have an interest in the subject.
The first articles I’d recommend are not about minimum wages in Australia at all, but in a lot of respects they crystallise the main debates around the role of minimum wages. At the AFPC’s Minimum Wage Research Forum in 2008, the two keynote speakers were Stephen Machin from the UK and Richard Burkhauser from the US.
Professor Machin talked about the UK experience following the introduction of a National Minimum Wage in 1999, and how the setting of minimum wages had been very much an “evidence-based” process, informed by research into the economic effects of minimum wages in the UK. Professor Machin claimed that (at that time) there had been little evidence of negative effects of minimum wages on employment and hours, and that minimum wages can raise the wages and welfare of working families.
By contrast, Professor Burkhauser claimed that the negative effects of minimum wages on employment outweigh the movement out of poverty by those workers who are helped by the policy. Professor Burkhauser argued that an Earned Income Tax Credit is instead a far more effective way of ensuring those who work are not poor (an EITC is a tax refund for lower-income households: have a look on Wikipedia if you’re interested in the details).
Another interesting article from that Forum was Ian McDonald’s “macroeconomic perspective” on the setting of minimum wages. He argued that minimum wages in Australia should be adjusted by 4 per cent a year—2.5 per cent to account for changes in prices and 1.5 per cent for changes in productivity. Some might find this view unusual, but it’s worth a read to see how Professor McDonald backs up his proposal.
Of the non-Forum papers, NATSEM’s report on the interactions between wages and the tax-transfer system shows what would happen to the incomes of a range of low-wage households if minimum wages were raised, given that those households would face higher tax payments and reduced government transfers. It’s a bit out of date now (it was published in 2006), but it gives some indication of how much of a minimum wage increase households really get in their pockets.
I found Downes and Hanslow’s 2009 report interesting, if only because it’s been the only research commissioned by either the AFPC or FWA to model the macroeconomic impacts of increasing minimum wages. Of course all economic modelling faces limitations, and this is no exception, but it’s worth a look if you’re interested in models (and who isn’t?).
Also from that year, I was quite interested in Hahn and Wilkins’ report on the living standards of low-paid workers. They took a multidimensional approach to measuring living standards, so a person could be considered to have low living standards if they have low income and low wealth, or low income and low consumption, or maybe all of the above. It’s interesting to see how the incidence of low living standards amongst low-paid workers changes as the definition of “low living standards” changes.
From the FWA era, a report I read a number of times was Jocelyn Pech’s on the various ways of defining and measuring the relative living standards and needs of low-paid workers. How does one determine how well-off low-paid workers are? Well, there have been a number of ways that Australian researchers have done this: the report canvasses their approaches.
And of course you should read all of my research! OK, maybe not, but an article that I quite like is the one I did on labour market outcomes for low-skilled people in Australia (it’s the final article in the report). It brought together a number of ideas that had been circulating around my head about what the data relating to low-skilled employment can tell us. Also, I’d recommend the report I did with Tom Bolton on the distribution of earnings for employees earning minimum wages – like Blur’s “Song 2”, it seems to have had a lot of legs for something that was knocked out pretty quickly.
It would be remiss of me to write about Australian minimum wage research over the past few years without mentioning Dr Josh Healy’s mammoth work on the wages safety net of the Australian Industrial Relations Commission from 1993 to 2005. It takes a look at how minimum wages were decided over this period, and the implications of those decisions. More recently, Wilkins and Wooden examined minimum wage/award reliance as measured by the HILDA survey. The inclusion of an “award reliance” variable in the HILDA survey is a potentially exciting development for minimum wage research in Australia, because the HILDA survey has a large number of variables that can be cross-tabbed, and employees are tracked over time.
Of course, there are many more research reports to explore on the FWA website if you’re interested. And for those who just come here for the AFL Power Rankings, they’ll be back on Sunday.
Monday, July 16, 2012
AFL Power Rankings: Rounds 15 and 16 2012
OK, we have two weeks' worth of Power Rankings to catch up on. First, here is how things stood last week at the conclusion of Round 15:
1 (1) Hawthorn 37.5 (32.2)
2 (3) Sydney 23.8 (23.3)
3 (2) West Coast 22.0 (24.1)
4 (7) St. Kilda 16.8 (10.1)
5 (4) Collingwood 13.9 (16.5)
6 (8) Adelaide 10.5 (8.1)
7 (5) Geelong 10.0 (15.1)
8 (9) Carlton 9.1 (7.3)
9 (6) Essendon 8.4 (13.7)
10 (10) Richmond 5.1 (6.4)
11 (11) North Melbourne 1.9 (-0.9)
12 (12) Brisbane -6.1 (-5.3)
13 (13) Fremantle -8.5 (-10.5)
14 (14) Port Adelaide -19.8 (-18.0)
15 (15) Western Bulldogs -25.6 (-22.2)
16 (16) Melbourne -31.1 (-32.0)
17 (18) Gold Coast -53.3 (-57.6)
18 (17) Greater Western Sydney -62.0 (-57.3)
And here is how they stand at the end of Round 16:
1 (1) Hawthorn 38.0 (37.5)
2 (2) Sydney 28.8 (23.8)
3 (4) St. Kilda 17.4 (16.8)
4 (3) West Coast 16.0 (22.0)
5 (5) Collingwood 15.1 (13.9)
6 (6) Adelaide 15.1 (10.5)
7 (9) Essendon 9.1 (8.4)
8 (11) North Melbourne 6.5 (-0.9)
9 (7) Geelong 5.3 (10.0)
10 (8) Carlton 2.1 (9.1)
11 (10) Richmond -0.3 (5.1)
12 (12) Brisbane -7.4 (-6.1)
13 (13) Fremantle -8.5 (-8.5)
14 (14) Port Adelaide -22.0 (-19.8)
15 (15) Western Bulldogs -27.5 (-25.6)
16 (16) Melbourne -35.0 (-31.1)
17 (17) Gold Coast -49.5 (-53.3)
18 (18) Greater Western Sydney -67.6 (-62.0)
Some big changes there from a couple of weeks back - let's go through them:
- Hawthorn is still a clear first, extending its lead in Round 15 with its 162-point hiding of GWS. (The Hawks have climbed to fourth on the actual ladder and are not far off the top.)
- The real ladder leaders, the Sydney Swans, are now a clear second after their big win over the Eagles in Perth.
- St. Kilda have jumped to third. Boy, is that likely to raise some eyebrows! The Saints are perceived as currently just hanging on to a final eight spot. How best to explain this? St. Kilda are part of a close group of four with West Coast, Collingwood and Adelaide, having catapulted there after their huge win against the Bombers in Round 15 (conversely, that loss is partly what is keeping the Bombers out of that group). It may be surprising, but St. Kilda actually has a much higher average net margin than Collingwood over its past 22 games (the Magpies have rarely been beating teams by a lot). West Coast's average net margin is higher again, but last week's big loss at home drags them down. The Crows are winning big this season, but they have an easier draw than most teams. (A rough sketch of this kind of net-margin calculation appears after this list.)
- Essendon and North Melbourne round out the top eight - the Kangas jumping three spots after a big win against Carlton.
- Last year's top-ranked team and premiers, Geelong, are now ninth! Taking their past 13 games as a whole, they have negative ranking points; only their great 2011 form is keeping them in positive territory.
- The Giants are now clearly the bottom-ranked team in the AFL, having been annihilated in each of the past four weeks.
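For anyone curious about what an "average net margin over the past 22 games" calculation looks like in practice, here is a rough sketch. It illustrates only the general idea mentioned in the St. Kilda point above (margins averaged over a rolling window, with some adjustment for opponent strength); it is not the actual formula behind these rankings, and the margins and opponent ratings in the example are made up.

def rolling_rating(game_margins, opponent_ratings, window=22):
    # Average net margin over the last `window` games, with each margin
    # adjusted upwards for stronger opponents and downwards for weaker ones.
    # A sketch of the general idea only, not this blog's exact method.
    recent = list(zip(game_margins, opponent_ratings))[-window:]
    adjusted = [margin + opponent for margin, opponent in recent]
    return sum(adjusted) / len(adjusted)

# Made-up example: modest results against strong sides can outrate
# big wins against very weak ones.
print(round(rolling_rating([10, 15, -5], [20.0, 15.0, 25.0]), 1))     # 26.7
print(round(rolling_rating([60, 45, 50], [-40.0, -50.0, -45.0]), 1))  # 6.7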
Saturday, July 7, 2012
AFL Power Rankings: Round 15 2012
The rankings will be coming next week, as I'll be on a plane when this week's round finishes up. But it looks like, after this round, Collingwood will be closer on the actual AFL ladder to where they have been ranked here over the past couple of months.
Minimum Wage Claims: Has ACCI or the ACTU Been Closer?
Each year for the annual minimum wage reviews in Australia, a range of groups put in claims for how much they believe minimum wages should be increased by, including the Australian Council of Trade Unions (the ACTU) and the Australian Chamber of Commerce and Industry (ACCI). Typically the ACTU’s claim is above the eventual increase in minimum wages, and ACCI’s claim is below it. But which of these groups has been closer to the eventual outcome?
(Of course there are many other claims from other parties for annual wage reviews, but for the purposes of this post I am just going to focus on ACCI and the ACTU. Also, if I restrict the focus to those groups then I can just get the wage claims data from the Australian Financial Review website rather than having to look it up myself!)
Below is a graph of ACCI’s and the ACTU’s claims for the increase in the federal/national minimum wage for each year since 1997. Note that ACCI did not actually propose a figure in 2006 and 2008. For 2006 I have assumed a figure of $12.50; this was their claim in 2010, when there also had not been a minimum wage increase for over a year. For 2008 I assumed a figure of $10, which has been a common claim made by ACCI over the past decade. These were NOT the claims made by ACCI in either of these years; however, given that the eventual increases in 2006 and 2008 were both well above the amount that ACCI typically seeks, removing these years would skew the results.
To see whose claims have been closer to the actual outcomes, we can take the average absolute deviation of the claims from the actual outcomes since 1997. For the ACTU the average absolute deviation has been $10.11, and for ACCI it has been $10.92. Therefore, the ACTU’s wage claims have, on average, been slightly closer to the actual minimum wage increases than ACCI’s wage claims. If we take away 2006 and 2008, for which I have assumed “claims” for ACCI, then the average deviation is slightly lower for ACCI than for the ACTU (however, as mentioned above, the outcomes were much higher in these years than the increases that ACCI typically seeks).
Unsurprisingly, if one takes the average of the ACCI and ACTU claims for each year, the average deviation is much smaller (see graph below).
This average turns out to be a pretty good explanatory variable for the actual minimum wage increase each year. However, one should be careful not to make too much of this: since both series have been trending upwards over time (as wages and prices increase), this would inflate the apparent explanatory power. The correlation between the two series is actually pretty low, particularly during the period in which the Australian Industrial Relations Commission set minimum wages (1997 to 2005).
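The comparison above boils down to a mean absolute deviation of each group’s claims from the actual increases, plus the same calculation for the midpoint of the two claims. A minimal sketch of that calculation, using made-up dollar figures rather than the actual series from the AFR data:

def mean_absolute_deviation(claims, outcomes):
    # Average absolute gap between a group's claims and the actual increases.
    return sum(abs(c - o) for c, o in zip(claims, outcomes)) / len(claims)

# Hypothetical figures only (dollars per week), to show the calculation.
actu_claims = [28, 26, 30]
acci_claims = [10, 12, 14]
actual_increases = [17, 19, 26]
midpoint_claims = [(a + b) / 2 for a, b in zip(actu_claims, acci_claims)]

print(mean_absolute_deviation(actu_claims, actual_increases))      # 7.33...
print(mean_absolute_deviation(acci_claims, actual_increases))      # 8.67...
print(mean_absolute_deviation(midpoint_claims, actual_increases))  # 2.0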
Sunday, July 1, 2012
AFL Power Rankings: Round 14 2012
Plenty of movement this week ... the Hawks remain on top (with an increased margin), but the Eagles move up to 2nd after beating up on their easybeat (Gold Coast) by a little more than the Swans beat up on theirs (GWS). Collingwood and Geelong, who both weren't able to beat up on the lowly-ranked teams themselves, slip a little further back from the top three. But the Bombers, who did, move up from 9th to 6th, and are closing in on the Pies and Cats.
1 (1) Hawthorn 32.2 (29.4)
2 (3) West Coast 24.1 (21.8)
3 (2) Sydney 23.3 (22.4)
4 (5) Collingwood 16.5 (18.3)
5 (4) Geelong 15.1 (18.9)
6 (9) Essendon 13.7 (8.0)
7 (6) St. Kilda 10.1 (14.9)
8 (8) Adelaide 8.1 (9.2)
9 (7) Carlton 7.3 (10.1)
10 (10) Richmond 6.4 (6.5)
11 (11) North Melbourne -0.9 (-5.1)
12 (12) Brisbane -5.3 (-6.2)
13 (13) Fremantle -10.5 (-12.0)
14 (15) Port Adelaide -18.0 (-20.0)
15 (14) Western Bulldogs -22.2 (-16.5)
16 (16) Melbourne -32.0 (-29.9)
17 (18) Greater Western Sydney -57.3 (-55.5)
18 (17) Gold Coast -57.6 (-53.6)
(Next Sunday I'm flying out to Bangkok for a week, so the rankings will return on Monday 16th.)