Monday, December 10, 2018

AFL Statistics Series #2: Scoring a Behind – ‘Scoreboard Impact’ or a ‘Missed Opportunity’?

One point for ‘trying’

What is the ‘value’ of a behind in Australian rules? On the scoreboard it is of course one point scored for your team. That’s less than the six points for scoring a goal, but better than no points and potentially the difference between winning and losing.

A behind scored by an individual player is also recorded against the name of that player, as part of their contribution to the team’s score. In fantasy football leagues a behind makes a minor but positive contribution towards a player’s fantasy points total. So while it isn’t as good as a goal, it seems like something at least.

Is it a positive though? Every supporter has known the agony of his or her team losing a match through inaccuracy in kicking for goal. It’s even more agonizing when the players are missing shots for goal that are considered relatively easy.

Shots on goal are hard to come by, and six behinds are needed to obtain as many points as just one goal. Furthermore scoring a behind gives the ball back to the opposition for a kick into play, in contrast to the roughly ‘fifty-fifty’ chance a team has of getting the ball back again from the centre bounce that follows a goal.

With that in mind should we really consider kicking a behind favourably? Should a behind be seen more as ‘impacting the scoreboard’ or a ‘missed opportunity’?

How existing player rating systems credit behinds

As I said in my first post in this series, part of my thinking here is to perhaps arrive at a new ‘player rating’ system. Both the Australian Football League Fantasy and SuperCoach (Champion Data) ratings give a player a point for scoring a behind. HPN’s PAV system also gives positive credit to a player for any point he scores, though as with the other systems a behind will have only a minor effect on a player’s rating.

In the AFL Player Ratings system however a player can lose rating points for missing a shot at goal. Rating points in this system essentially depend upon how a player’s action affects a team’s expected score. For example, a player taking a shot from 15 metres out would in most cases be expected to score a goal, so scoring a behind instead means the outcome was a lot worse than expected. The ‘penalty’ for missing is less if the player was taking a more difficult shot – say, from 60 metres out.

Kicking a behind could then lead to a net negative effect on the player’s rating. In the extreme if the player who missed the shot on goal gets no credit at all for creating the shot then the player has only hurt his team. On this view, it would be a similar situation to a player undoing the good work of his teammates by kicking the ball to the opposition, and it is well-known that a player loses points for this under the SuperCoach rating system at least.

Reading through the explanation of the AFL Player Ratings system for the first time did alter my view of what a behind was worth. Or maybe it just returned me to a more intuitive state of being a fan in the stands watching a player on my team miss a shot on goal – shaking my head and cursing as he ‘blew’ all the hard work of my team getting the ball up the field for a scoring shot only to get one point out of it.

Attributing a team’s points to player accuracy

Players can help their team score by contributing to the creation of scoring shots, and by taking scoring shots. Crediting players for the former is going to take a bit of work. I think though it’s relatively easy to give credit for the latter.

OK, a player is taking a shot for goal – what is the change in the team’s expected points from him converting or missing the shot? Under the AFL Player Ratings system this depends upon the expected points for the position on the ground that the player takes the shot from. However I only have public data and I don’t know where the shot was taken from. Therefore, let’s define the extra points from scoring shots as follows:

Extra points created by player from taking scoring shots = Points scored by player – (League average points per scoring shot, excluding rushed behinds * Scoring shots by player)

In 2018 the league average points per scoring shot, excluding rushed behinds, was 3.85 points. Therefore, if a team creates a scoring shot and I don’t know where on the field the scoring shot was created from, I’m going to assume the value of creating one scoring shot is 3.85 points. These points can be attributed amongst the players who contributed to creating the shot, including perhaps to the player who took the shot itself.

But what is the value of simply taking the shot? That is, let’s ignore the player’s role in creating the shot, the metres gained from kicking to goal, and what happens after the score is kicked. Under the system above each successful shot on goal by a player adds 2.15 points, with the other 3.85 points going to the players who created the shot. On the other hand, if a player misses a shot he can be said to have subtracted 2.85 points from his team’s total.
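
The ‘extra points’ calculation above can be sketched in a few lines of code. The 3.85 figure is the 2018 league average points per scoring shot quoted earlier; the goal and behind tallies in the last example are made up for illustration.

```python
# Extra points a player adds purely by converting his scoring shots.
# 3.85 is the 2018 league average points per scoring shot (excluding
# rushed behinds), as quoted in the text above.
AVG_POINTS_PER_SHOT = 3.85

def extra_points(goals, behinds):
    """Points a player added (or subtracted) relative to an average converter."""
    points_scored = 6 * goals + 1 * behinds
    scoring_shots = goals + behinds
    return points_scored - AVG_POINTS_PER_SHOT * scoring_shots

# Each goal is worth +2.15 over the average shot; each behind is -2.85.
print(round(extra_points(1, 0), 2))    # 2.15
print(round(extra_points(0, 1), 2))    # -2.85
print(round(extra_points(40, 20), 2))  # hypothetical season tally: 29.0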

This simple system does ignore shots at goal that go out of bounds on the full, but I cannot get these from public statistics. Probably the bigger weakness though is that it does not account for the difficulty of the player’s shots on goal, as the AFL Player Ratings system does. For example, Lance Franklin converted shots on goal in 2018 at about the league average rate. Franklin though is well known for converting longer and more difficult shots than your average forward. (One might also argue that the difficulty of shots varies at a team level – i.e. some teams create better shots than others.)

If we just use this simple system though, which players created the most value last season from scoring shots converted, and which players lost the most value? This will depend upon the accuracy of the player’s shots and his volume of scoring shots. Hence, the top players in terms of extra points added from converting scoring shots in 2018 included leading goalkickers such as Ben Brown, Tom McDonald, Jack Riewoldt, Luke Breust, and Tom Hawkins (see table below).

The ‘worst’ goalkickers do not ‘destroy’ quite as much value as the best goalkickers create (see table below). According to this system Jarryd Lyons lost about 28 points for his team in 2018 through his inaccuracy, less than half of the 63 points Ben Brown created.

The line between a ‘good’ and a ‘bad’ converter is relatively thin. If the same calculations are done for 2017 there is little overlap with the 2018 lists of ‘best’ and ‘worst’ converters (though Ben Brown topped the list in both years), and Christian Petracca even flipped between the two.

On a per-game basis the points gained or lost simply from converting scoring shots may seem relatively small. Even Ben Brown is only credited with less than 3 extra points per game from his accuracy. However, in a league where a team scores on average 80-90 points per game – meaning that each of a team’s 22 players contributes on average about four points per game – goal accuracy can be quite significant.

It is even more significant in evaluating a player’s contribution to an individual game. A player scoring four or five behinds without scoring a goal could very well obliterate every positive contribution he has made for the game. Few ‘possession chains’ a team or a player is involved in result in scoring shots; ‘wasting’ those that do is significant.

VERDICT: Creating a scoring shot is valuable. Scoring one point rather than six points with that shot is generally a MISSED OPPORTUNITY.

Wednesday, October 31, 2018

AFL Statistics Series #1: Which Statistics Matter The Most (Apart From The Scoreboard)?


This is the first in a series of posts that I’ll do about statistics in the Australian Football League. The posts will be about which AFL statistics I think matter – that is, what I think they tell us about how AFL teams and players go about scoring and stopping the other team from scoring.

Yes, there is a lot of writing out there already about AFL that uses statistics and numbers, and a lot of good writing. This is just how all of those statistics make sense to me. I hope if you’re reading this you find something in here that’s useful for you too.

A lot of the thoughts I’m going to talk about here came about as a result of me trying to devise a method of rating AFL players, without having access to the detailed data that Champion Data use to devise their ratings. We may still get to that in the end. It turns out though that to work out how each player contributes to winning a game you need to first think about how teams go about winning them.

Points differential: the most important statistic of all (duh…)

In their AFL Prospectus 2018, Champion Data made this obvious but important point:

“… we are asked [:] What’s the most important stat? As respectfully as possible we answer with POINTS FOR. It’s the one stat that guarantees a win … We go on to explain it’s more about how you get to that point.”

Of course points for compared with points against is important. There is a position here though that may not be quite so obvious. Some would argue that only the win or the loss matter, and not the margin of victory or defeat. Margins though tend to be a better predictor of future performance. Close wins may bring exhilaration and relief, but in general a team should take more comfort out of a comfortable win than a close one.

Metres gained matters

Metres gained gets some bad press, perhaps because it sounds like ‘a stat too far’. An article on The Roar last year even went so far as to call the statistic ‘irrelevant’. The main argument was that it doesn’t take into account the outcome of the possession – a long kick to the opposition would be credited with many more metres than an effective handball backwards to a teammate. “While it is impressive to see a player average over 300 metres gained a match,” the author says, “the statistic is mostly empty in its meaning.”

The article makes some good points, but I disagree that metres gained is irrelevant. Indeed to me, there is hardly anything more fundamental to performing well at Australian rules football than gaining metres. When you’re watching your team, apart from when they’re actually in the action of kicking goals, what do you most want them to do? You want them to GET THE BALL CLOSER TO THEIR GOALPOSTS and GET THE BALL FAR AWAY FROM THE OTHER TEAM’S GOALPOSTS!

Now a critic of metres gained may point out that it isn’t so great if you kick 50 metres straight to the opposition. That’s true – ‘effective’ metres would probably be a better measure. We’ll get to more about keeping possession later.

Kicking the ball 50 metres to the opposition though isn’t necessarily a horrible outcome, despite what I will say later on about the value of turnovers. Now if the opposition run the ball down the field and score a goal off your turnover that is obviously a bad result. That worst-case scenario is relatively uncommon though – only about 10 per cent of possession ‘chains’ end in goals, and on average those ‘chains’ only last for about three disposals and gain about 45-50 metres. In other words, even if you kick 50 metres to the opposition it’s unlikely the other team will punish you by running the ball down the field and kicking a goal (obviously depending on where the ball is, and how badly you butcher the kick). More likely is that the ball may come back to your team within a few possessions, and back around where you started.

What about an effective kick across the ground that gains no metres but sets the team up for a shot at goal? Isn’t it true that metres gained isn’t very good at accounting for that? That cross-ground kick however is only valuable if the TEAM gains metres on a subsequent possession. If the opposition stops the ball before it goes any further then it’s just a kick across the ground that didn’t help much. The objection to metres gained here is more about crediting the total metres gained by the team to the individuals in that team – how much did that cross-ground kick help the team to score? – not about the value of the total metres gained by the team itself.

Metres gained matter. In each of the four AFL seasons since the statistic was made public in 2015 no other relatively commonly used statistic – except statistics directly related to scoring, e.g. score involvements and goal assists – has been more positively correlated with points differential (see table below).

Good teams like West Coast and Richmond had fewer disposals last year than their opponents, and lower teams like the Bulldogs and St. Kilda had more. Richmond were smashed in hit outs and clearances, and West Coast were behind on tackles. The higher teams though almost always had positive metres gained differentials over the season (see table below). It is one of the few relatively common statistics that you can reliably count on good teams being ahead in.
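
For anyone who wants to replicate the correlation claim, here is a rough sketch of the calculation. The team differentials below are invented for illustration; the real figures would come from season-level AFL data.

```python
# Pearson correlation between a statistic's season differential and points
# differential, computed from scratch. The six 'team' rows are made up.
import statistics

def correlation(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical season-long differentials for six teams.
metres_gained_diff = [900, 450, 120, -80, -400, -700]
points_diff = [350, 180, 60, -40, -150, -320]
print(round(correlation(metres_gained_diff, points_diff), 3))  # strongly positive
```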

[EDIT: Metres gained differential in a single match should, by definition, ALMOST reflect goal differential. Nevertheless, metres gained are still highly correlated with scoring.]    

Once you view gaining territory as an important indicator of a football team’s ability to score goals, the importance of some other AFL statistics falls into place. Inside 50 entries – a statistic that has been noted by others to be highly correlated with winning – indicate metres gained by measuring the number of times a team gets the ball past a particular point on the ground, one it has to pass over in order to score goals. An inside 40 or inside 30 measure would also indicate this.

(Rebound 50s indicate metres gained as well, but since they indicate metres gained in a team’s defensive part of the ground they are negatively correlated with winning. Put inside 50s and rebound 50s together and you get some of the way to a decent proxy for metres gained.)

This view also explains why the number of kicks that a team or player records is generally important for winning (other than, of course, kicking being the only way to score goals), and why one kick is generally more important than one handball – more kicks often lead to more territory gained. Conversely, it also explains why just amassing kicks sometimes does not lead to success; for example, two short kicks of 20 metres get a team no closer to goal than one long kick of 40 metres.

All of this is not to say, however, that an individual player’s contribution to their team’s performance should be primarily measured by metres gained. The AFL leaders in metres gained per game last year were Jayden Short, James Sicily, and Nathan Wilson. Nobody thinks that these are the very best players in the league. (Maybe they should … but probably not.) That is because what these players don’t do as much as some other players is win their teams the ball in the first place.

Turnovers are the main source of scoring shots

In Australian rules football, the significance of individual possessions in helping a team score can sometimes seem hard to work out. The ball can pass back and forth between teams several times before anyone has a legitimate chance to score. Teams can also have very different styles when in possession of the ball, with some teams preferring a ‘high-possession, low-risk’ game, and others preferring to be more direct.

Let’s try and simplify it then. Obviously if you have possession of the ball you are the only team that can score until your ‘chain’ of possessions is broken. We’ll call any unbroken sequence of possessions by a team – whether of one possession or ten – a ‘possession chain’.

Let’s say that a possession chain for a team can start in one of three ways:
  • the team gets a ‘clearance’ – i.e. it clears the ball from a ball-up, either from a stoppage or a centre bounce at the start of a quarter or after a goal;
  • the opposition turns the ball over; or
  • the opposition scores a behind, giving the team a kick into play.

Let’s also say that a possession chain ends in one of three ways:

  • the team scores a goal or behind – a successful (or at least partially successful) possession chain;
  • the team turns the ball over; or
  • the possession chain is ‘stopped’, due to a ball-up, or because the quarter is over, or because the scoreboard is on fire – basically any unsuccessful possession chain that does not result in the ball going directly back to the opposition.

(That may not be completely technically correct according to how Champion Data defines these terms, but I think it’s close enough for the purpose of my main point here.)

Therefore, for a team we will say that:

Possession chains started = Possession chains ended

Clearances + Opposition turnovers + Opposition behinds = Scoring Shots + Turnovers + Stopped Possession Chains

Or, more importantly:

Scoring Shots = Clearances + Opposition turnovers + Opposition behinds - Turnovers - Stopped Possession Chains

For example in 2018 premiers West Coast averaged 24.7 scoring shots per game. By the definition above they started 117.9 possession chains per game, from 36.6 clearances, 71.6 opposition turnovers, and 9.7 opposition behinds. Of the 93.2 unsuccessful possession chains they had per game, 67.7 of them were turnovers, and 25.4 were stopped possession chains.

The bottom team in 2018 Carlton averaged 17.9 scoring shots per game. By the definition above they started 109.7 possession chains per game, from 35.0 clearances, 62.3 opposition turnovers, and 12.4 opposition behinds. Of the 91.9 unsuccessful possession chains they had per game, 67.3 of them were turnovers, and 24.6 were stopped possession chains.
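
We can check the possession chain identity against the per-game averages quoted above for West Coast and Carlton. The small gaps against the quoted scoring shot figures are just rounding in the published averages.

```python
# Scoring shots implied by the possession chain identity:
# chains started (clearances + opposition turnovers + opposition behinds)
# minus unsuccessful chains (turnovers + stopped chains).
def scoring_shots(clearances, opp_turnovers, opp_behinds, turnovers, stopped):
    chains_started = clearances + opp_turnovers + opp_behinds
    return chains_started - turnovers - stopped

# 2018 per-game averages as quoted in the text above.
west_coast = scoring_shots(36.6, 71.6, 9.7, 67.7, 25.4)
carlton = scoring_shots(35.0, 62.3, 12.4, 67.3, 24.6)
print(round(west_coast, 1))  # 24.8, within rounding of the quoted 24.7
print(round(carlton, 1))     # 17.8, within rounding of the quoted 17.9
```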

What’s the main difference in the possession chains of those two teams? Clearances, behinds, and stopped possession chains are all similar. The main difference is West Coast started more chains through opposition turnovers – about the same difference as the difference in scoring shots.

Returning to our scoring shot formula above, let’s now look at scoring shot differentials, or scoring shots for the team less opposition scoring shots. With a bit of mathematics we can show that:

Scoring shot differential = Clearance differential – 2 * Turnover differential – Behinds differential – Stopped possession chains differential

Maybe it’s just me, but I found this really interesting when I worked it out. Turnovers are not only more common than those other components, they also count for twice as much in this equation. When there is a stoppage one team’s possession chain ends, but then each team has about a 50 per cent chance of starting the next possession chain. When the ball is turned over, one team’s possession chain ends and the other team’s begins.
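
The ‘factor of two’ on turnovers can be verified numerically. Using arbitrary made-up chain counts for two teams A and B, both sides of the differential equation give the same answer.

```python
# Arbitrary made-up per-game chain counts for teams A and B.
clear_a, clear_b = 38, 34  # clearances won by each team
to_a, to_b = 65, 70        # turnovers conceded by each team
beh_a, beh_b = 10, 12      # behinds scored by each team
stop_a, stop_b = 25, 24    # stopped possession chains for each team

# Each team's scoring shots from the possession chain identity:
ss_a = clear_a + to_b + beh_b - to_a - stop_a
ss_b = clear_b + to_a + beh_a - to_b - stop_b

lhs = ss_a - ss_b  # scoring shot differential, team A minus team B
rhs = ((clear_a - clear_b) - 2 * (to_a - to_b)
       - (beh_a - beh_b) - (stop_a - stop_b))
print(lhs == rhs)  # True: turnovers count double in the differential
```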

The importance of turnover differential can be seen when we compare how teams got their scoring shots in 2018 (see table below). Minor premiers Richmond were last by some margin in clearance differential, but they were way ahead in terms of (inverse) turnover differential. West Coast and Melbourne also rated highly in either causing opposition turnovers, or not turning the ball over themselves. Unsurprisingly, bottom teams Carlton and Gold Coast rated poorly in terms of turnover differential.

Turnover differential does not explain everything, as there are other ways to start possession chains. 2018 runners-up Collingwood gave up a lot of turnovers, but they were good at getting clearances. 2016 premiers the Western Bulldogs were fantastic at winning clearances. It is just that it is harder to get a high differential through clearances or stoppages than through turnovers, as there are fewer of them. Hence, clearance differential is less correlated with winning than turnover differential is.

In terms of valuing players this suggests that not only are players with high clearance numbers such as Tom Mitchell and Nat Fyfe important for starting possession chains, but so are defenders who get a high number of intercepts such as Alex Rance and Jeremy McGovern. The question then is how important the individual player who records the clearance or intercept is to the team getting possession. How much of the credit should go to the ruckman getting the hit out, or to the structure of the team defence?

Given how fundamental I said metres gained were, how does turnover differential relate back to that measure? If the main aim of Australian rules is to score by getting the ball close enough to your goal to do so, then the way to progress the ball down the ground is to get possession of the ball and keep it. The best possessions are those that help the team to gain a high number of metres with a relatively low risk of giving the ball back to the opposition, or of the ball being ‘stopped’. On defence, your aim is to stop the other team doing this.

Sounds simple, right? Well, it’s easier said than done. Also, what may be complex about Australian rules is the many ways you can go about doing this.

Goal accuracy: converting those shots at goal

Another statistic that I think is important is goal accuracy, though not quite in the same way as the other main statistics I have covered above.

Goal accuracy – the percentage of a team’s shots on goal that are goals (or of its scoring shots, if kicks out of bounds are not available) – is somewhat important to performance over an entire season. Last season, finals teams Sydney, Hawthorn, Collingwood and West Coast were good at it, but Richmond and GWS were not. In 2017 the leaders in goal accuracy were non-finalists Melbourne and the Brisbane Lions. Scoring shot creation is generally more important to performance over a season than scoring shot conversion, with teams being reasonably close over a season in terms of the rates at which they convert.

I would say goal accuracy matters more in changing the outcome of a single game, which is even more important if that game is a final. In a single game there is more variability in scoring shot conversion than over a season, and the results can swing the match significantly. We see this in some of the comparisons between final scores and ‘expected scores’ for a game.

To state the obvious, six behinds are needed to gain as many points as a single goal. Given that the difference between premiers West Coast and last-placed Carlton was only about seven scoring shots per game, you want to convert those chances when you get them. Or, given that goal accuracy tends to even out over the season, you want to convert those chances when they are most important. (West Coast almost didn’t in last year’s Grand Final, until Dom Sheed did.)

In Australian rules football the ways for teams to score are:

  • get possession of the ball, and keep it;
  • gain metres while you have the ball in order to get a shot at scoring; and
  • get as many of those shots at scoring through the goal posts as possible.

For defence the ways to stop the other team from scoring are:

  • try to get back possession of the ball, or otherwise force a stoppage;
  • if you haven’t got the ball back yet, stop the other team from getting the ball down the field far enough to have a shot at scoring; and
  • if the other team does get a shot at scoring, try and limit the chance that it is a goal.

This is essentially how I am going to talk about the value of AFL statistics in this series. I’ll mainly evaluate teams and players by how good they are at doing these things. I also plan to talk about how the evolving nature of statistics has given us a better picture over time of the effectiveness of teams and players. Some conclusions will be obvious, but some may be less so. In the end, we may yet even get to that 'player rating' system.

Sunday, September 30, 2018

AFL Power Rankings: Finals 2018

It was an even AFL season at the top, but when the West Coast Eagles had their two big forwards they were perhaps a touch above the rest.

In terms of top AFL teams, 2018 was a weird season. No team appeared to be as strong as the ‘great’, or even very good, sides of recent years, with Richmond the only team during the season to get close to that level. Arguably the best teams going into the finals series – Richmond, Melbourne, and Geelong – all had big losses that eliminated them from the finals, while ‘mid-range’ good sides Collingwood and West Coast emerged as the in-form teams in September.

The result is that it’s pretty hard to say, standing at the end of the 2018 season, who the strongest team of the year actually was. We do now know who the premiers were though.

Maybe the ‘top’ team was the Eagles all along

My ranking system, like many other AFL ranking systems, does not account for players missing through injury. Generally this probably doesn’t matter too much, unless a team has an ‘epidemic’ of them. There is enough evidence through the season, however, to say that this year’s premiers, the West Coast Eagles, were a much better side when their two big forwards – Josh Kennedy and Jack Darling – were available.

First, the main statistic: the Eagles were undefeated at 13-0 when both Kennedy and Darling played, and 6-6 when one of them was missing. Wins and losses can sometimes be a bit deceiving, because it depends on who you played, but that is enough games to suggest that something was different.

We can also look at the Eagles’ adjusted net margin – i.e. their net margin adjusted for estimated home ground advantage and opposition strength – to see how much better they were with a full forward line (see chart below). When West Coast were missing either or both of Kennedy and Darling their adjusted net margin in 2018 was a relatively mediocre +5. When both were in the side it was +28. That’s not historically dominant by any means, but it would have marked the Eagles out as the top side of 2018.
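
The with/without comparison above is essentially a grouped average of adjusted net margins. Here is a minimal sketch, using invented per-game margins chosen to match the quoted +28 and +5 averages.

```python
# Average adjusted net margin, split by whether both key forwards played.
# The (margin, availability) pairs are invented for illustration; only the
# +28 and +5 averages come from the text above.
def average_margin(games, both_available):
    margins = [m for m, both in games if both == both_available]
    return sum(margins) / len(margins)

# (adjusted net margin, both Kennedy and Darling available?)
games = [(45, True), (20, True), (19, True),
         (-2, False), (12, False), (5, False)]
print(average_margin(games, True))   # 28.0
print(average_margin(games, False))  # 5.0
```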

But perhaps that understates the Eagles a little as well. They struggled at times to put away bad sides even when they had their two big forwards, beating Carlton by only 10 points, St. Kilda by only 13 points at home, and Fremantle by only 8 points. (These are the negative bars you can see for the Kennedy/Darling games in the chart above.) Against three of the top sides though they had some amazing games, beating Richmond by eight goals, Melbourne by 11 goals in the Preliminary Final, and Collingwood by six goals in their first meeting at the MCG. So with a ‘full’ forward structure they could be really good when it counted most.

West Coast converted more efficiently up forward… except in the Grand Final

Further proof that Kennedy and Darling improved the Eagles’ performance is that the Eagles were considerably better in the area of the ground those two players could influence the most – namely, up forward.

As pointed out in an ABC article this week by the HPN guys, West Coast was averaging almost 2 points per inside 50 with Kennedy and Darling in the side, whereas without those two they were at around the league average of 1.54 points. In other words, with 50 forward entries they would be expected to kick about 100 points with their two big forwards in the side, compared to about 80 points without them. In the aforementioned matches against Melbourne, Richmond, and Collingwood they didn’t have a major advantage in getting the ball inside 50. The Eagles’ ‘forward efficiency’ loomed as the main way that they could win the Grand Final.

Except that it wasn’t … the Eagles could only kick 79 points from 63 forward entries on the weekend. Collingwood’s defence through the finals was fantastic – holding both Richmond and GWS to under 60 points – and it was almost enough to beat the Eagles as well.

Instead the Eagles won through weight of inside 50 entries, which were 63 to 48 in their favour. Norm Smith Medallist Luke Shuey and Dom Sheed dominated, amassing 70 possessions of which 33 were contested, 17 clearances, and 14 inside 50 entries (Sheed kicked the winning goal to boot). The Eagles’ midfield was fairly decent all season but Shuey shone on Grand Final day. Still, with one fewer goal I would be talking about how good Taylor Adams and Tom Langdon were yesterday instead ...
That’s it for the 2018 AFL season, and the 2018 rankings. I wouldn’t call it one of the most memorable seasons, but we did have some good matches – the Showdowns, Melbourne v Geelong in the home and away season, Saturday afternoon in Round 20, and fortunately the Grand Final itself. We also don’t really have a strong premiership favourite for next year, which should help to make things exciting. I’m going to rest now, and hopefully come back with some new ideas for 2019.

Tuesday, August 28, 2018

AFL Power Rankings: Round 23 2018

So at the end of the 2018 AFL home-and-away season Richmond is the top-ranked side, although only narrowly over Geelong and Melbourne.

The Cats and Demons closed the gap considerably over the final two weeks – Geelong from annihilating ‘bad’ teams, and Melbourne from finally beating good teams. The Tigers also slipped a bit, although whether that indicates teams are actually playing better against them or that Richmond was just relaxing a touch before the finals (or both) remains to be seen.

Outside of that, the only highly-ranked team that did not qualify for the finals is Essendon. The Bombers’ high ranking reflects their strong finish to the season, but ultimately their poor start cost them a finals spot.

Predicting the finals series

For the past three years I’ve used the end of home-and-away season rankings to predict how the finals series will play out. Basically this is done just by comparing who has the higher ranking points, adjusted for any estimated home ground advantage.
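
The head-to-head method can be sketched as a simple comparison. The ranking points and the size of the home ground adjustment below are assumptions for illustration, not the actual values used by the Rankings.

```python
# Pick the team with the higher ranking points after any home ground
# adjustment. The +8 home bonus and the ranking points are hypothetical.
HOME_ADVANTAGE = 8.0

def predict_winner(home_team, home_points, away_team, away_points,
                   home_advantage_applies=True):
    adj = HOME_ADVANTAGE if home_advantage_applies else 0.0
    return home_team if home_points + adj >= away_points else away_team

print(predict_winner("Richmond", 20.0, "Hawthorn", 12.0))  # Richmond
print(predict_winner("Sydney", 5.0, "GWS", 15.0))          # GWS
```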

In that time the Rankings have predicted five of the six Grand Finalists, and two of the three premiers. The miss – like every other AFL ranking system in existence – was the Western Bulldogs in 2016. These predictions however are purely for fun and interest’s sake, and not to coax you into betting your life savings.

Unsurprisingly, the Rankings pick Richmond to win the premiership for a second straight year. Unlike last year though I’d say the Tigers would still be considered the team with the best chance if one considered the likelihood of each team winning each potential match-up, not just the match-ups shown above. I’m picking West Coast, with two home finals from finishing second on the ladder, to be the other Grand Finalist.

Again though, I’ll emphasise three main potential problems with this straight head-to-head prediction method. The first is that just one result going against the prediction can change the outlook considerably. The second point, related to the first, is that our assessment of each team will change as we progress through the finals series.

And the third is that it doesn’t show how close the teams are in likelihood to winning. According to the Rankings there is basically nothing between Melbourne and Geelong. Melbourne should be seen as almost as likely as Geelong to progress as the Cats do above.

I’ll be back again after the finals series finishes with the final rankings for 2018. How soon after, like last year, may depend upon how far my Tigers go.

Sunday, August 19, 2018

AFL Power Rankings: Round 22 2018

You have to beat the good sides to win the premiership. Based on their form so far against the likely finalists, which teams should be a bit worried?

Melbourne’s bizarre season 

Today Melbourne qualified for its first AFL finals series since 2006 with an excellent win against the West Coast Eagles. That win aside though, Melbourne’s struggles against the top sides in 2018 have been well noted. They’ve had twelve wins and one loss against teams that are currently in the bottom half of the ladder. Against teams in the top half they’ve had seven losses and just the one win.

Prior to the Eagles match the gap in the Demons’ relative performances against top and bottom half teams was one of the largest in VFL/AFL history. (See this helpful graph from Insight Lane – by a nice coincidence I had already chosen the topic for this week’s post when this insight was given.) Usually good performances against good sides and not-so-good sides tend to go together. Strangely, two of the largest gaps ever in relative performance have come in the past two seasons – last year it was Port Adelaide who confounded rating systems by destroying ‘bad’ teams and capitulating against the good ones.

Melbourne has been highly rated by my ranking system for most of the season, because of its ability to annihilate lower teams. This has left me feeling more and more sheepish as the Demons have racked up losses against the top sides.

The best way to explain Melbourne’s high ranking is this: the Rankings take into account performances against the whole league, not just the top half. It is true that the Rankings have tended to overestimate Melbourne’s chances against the top teams this season. By the same token though, they have tended to underestimate the Demons’ victory margins against the lower teams.

It’s beating the top teams that matters now – which likely finalists should be worried?

However, as we get to the finals series, it’s only a team’s performances against the teams that remain that matter. Based on this, once they get past the joy of qualifying for their first finals series in twelve years the Demons should be a bit concerned about their ability to progress.

Melbourne’s average adjusted net margin this season against the other likely finalists is -8 points. After adjusting for opposition strength and home ground advantage, this makes them the equivalent of a below-average side when they come up against the best (see table below). Richmond, Hawthorn, and Collingwood all won by large margins against the Demons, and Sydney beat Melbourne last week on its home ground.
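The kind of calculation behind a figure like that can be sketched as follows. Everything here is illustrative: the match margins, the opponent ratings, and the flat 12-point home-ground bonus are assumed numbers for the sake of the example, not the Rankings’ actual inputs or method.

```python
# Hypothetical opponent strength ratings, in points above average.
OPPONENT_RATING = {"Richmond": 20, "Hawthorn": 8, "Sydney": 10}
HOME_ADVANTAGE = 12  # assumed flat home-ground bonus, in points

# (opponent, raw margin from the team's perspective, played at home?)
matches = [
    ("Richmond", -46, False),
    ("Hawthorn", -67, True),
    ("Sydney", -9, True),
]

def adjusted_margin(opponent, margin, at_home):
    # Credit the team for facing a strong opponent...
    adj = margin + OPPONENT_RATING[opponent]
    # ...and debit (or credit) the home-ground edge.
    adj -= HOME_ADVANTAGE if at_home else -HOME_ADVANTAGE
    return adj

adjusted = [adjusted_margin(*m) for m in matches]
average_adjusted_net_margin = sum(adjusted) / len(adjusted)
print(average_adjusted_net_margin)
```

A negative average on this measure says that, even after giving credit for the quality of the opposition, the team has performed below par against the sides it will meet in September.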

For the other likely finalists, Collingwood’s inability to beat the top eight teams has also been well noted. The Magpies have only beaten one top eight team: Melbourne (although they came close to beating Sydney). Their best performances have come against the next tier of sides that are just outside the eight – Port Adelaide, North Melbourne, Essendon, and Adelaide. Against the likely finalists they have been the equivalent of an average side. Collingwood and Melbourne are two of the better sides overall, but their ladder positions have been helped a little by friendly fixtures.

Less well noted are GWS’ struggles against the better sides earlier this season. The Giants were thrashed by Geelong, and were also well beaten by West Coast. They have also lost to Sydney twice. Some of those performances may have in part been affected by injuries, but that may not be much comfort to the Giants as injuries have recently hit them again.

Who has done well against the best?

Minor premiers-elect Richmond has performed the best overall against the top teams. The Tigers have had some struggles outside of Victoria, but they have beaten every team they have played in their home state. Unless they lose in the first week of the finals they will play at the MCG for the duration of the finals series.

Also doing well against the top teams are Sydney, West Coast, Hawthorn, and Geelong, although for somewhat different reasons.

Sydney beat the Eagles both at home and away and has also beaten GWS twice this season. Most notably the Swans have had excellent form away from home, beating Geelong, Hawthorn, and Melbourne.

West Coast’s best result for the season was their big win against Richmond at home. They have also shown they are capable of winning in Victoria against good opposition by beating Hawthorn and Collingwood there.

Hawthorn thrashed Melbourne, and also easily beat Collingwood in the first match of the season. The Hawks beat Geelong twice (albeit narrowly), and have generally been close in their losses against the top sides.

Geelong is the second-highest team on the Rankings but will likely finish in seventh or eighth spot on the ladder, in part because of their tough fixture. The Cats’ form against the top teams has been fine: they thrashed GWS, and have been good or at least OK in most of their matches.

You may step it up though

So should we just be waiting for the seasons of Melbourne and Collingwood to come to an abrupt end? Of course not. Last year, for example, we saw Richmond play considerably better against teams in the finals series than they had done earlier in the season, including a massive (home ground advantage aside) 124-point turnaround against Adelaide. The Western Bulldogs substantially stepped up their performances in September the year before.

Melbourne and Collingwood are good teams. They may even be capable of matching it with the best teams. They just haven’t given a whole lot of evidence during the season that they can yet.