Wednesday, August 28, 2013
Monday, August 26, 2013
Different Wage-Setting Methods and the Responsiveness of Wage Growth in Australia
In Australia, annual wage inflation, as measured by the Wage Price Index, has tended to stay around 3-4 per cent over the past fifteen years (see graph below). By ‘wage inflation’ I mean wage growth absent any changes in hours worked or in workforce composition, which is what the WPI is designed to measure. This might be considered a fairly narrow range for wage inflation; by contrast, while annual consumer price inflation has often been around 2-3 per cent over the same period, it has been somewhat more volatile, even excluding the period (the second half of 2000) in which the Goods and Services Tax was introduced.
From this, it looks like wage growth in Australia is not all that responsive to changes in the economy, at least in the short term. I’ll leave it up to the individual reader to decide whether that is a ‘good’ or a ‘bad’ thing. The main aim of this post is not to evaluate how ‘responsive’ wages should be, but to try to give a sense of the contribution that each wage-setting method makes to the overall variability of wage growth.
In line with the Australian Bureau of Statistics’ Employee Earnings and Hours survey, employees in Australia can be roughly divided into three groups by wage-setting method: awards, collective agreements, and individual arrangements. There are also working proprietors of incorporated businesses, but they make up only a small percentage of employees.
According to the Employee Earnings and Hours survey, over the past fifteen years, around 15-20 per cent of employees in Australia at any one time have been paid at the award rate.
Award rates include the ‘minimum wage’, but also minimum rates of pay for
employees across a range of job classifications. Wages for these awards are (typically)
set ‘centrally’ once per year through an annual wage review, which is currently
undertaken by the Fair Work Commission. With some exceptions, over the past
fifteen years, the increases to the national minimum wage have tended to be
between 2.5 and 4 per cent each year (see graph below). However, until the past few years, the relevant commissions awarded ‘flat dollar’ increases, which provide lower percentage increases for employees on higher award rates of pay. Given this, and assuming that award-reliant employees are bunched towards the lower award rates of pay, award-reliant employees have probably tended to receive increases of between 2 and 3 per cent each year on average.
Over the past fifteen years, around 40 per cent of employees
have at any one time had their wages set by collective agreements. Enterprise
bargaining agreements were introduced in the early-1990s, allowing for agreements
on the wages and conditions of employment to be struck between an individual
employer and a group of employees. Even so, the average annualised wage increases for these agreements – at least for registered federal collective agreements – have almost always been between 3.5 and 4.5 per cent in aggregate (see graph above). Hence, wage increases for collective
agreements appear to be the least ‘responsive’, or possibly more correctly the
least volatile, of the three wage-setting methods.
One might argue that if you strip out from the WPI the wage increases for collective agreements and awards, then what is left - essentially wage increases for individual arrangements - is the part that ‘best’ reflects labour market changes. To illustrate this, I have calculated year-ended increases for an ‘adjusted WPI’, as follows:
Adjusted WPI (i.e. year-ended increase for individual arrangements) =
[year-ended increase for the WPI
− (proportion of employees on awards) × (year-ended increase for awards)
− (proportion of employees on collective agreements) × (year-ended increase for collective agreements)]
/ (1 − proportion of employees on awards − proportion of employees on collective agreements)
Note that the calculations are a little rough, but I don’t think that detracts from the main points:
- The proportions of employees on awards and collective agreements are taken from the biennial Employee Earnings and Hours survey (i.e. they change only every two years, although one could interpolate the changes between surveys).
- The year-ended increase for awards is the most recent percentage increase awarded for the minimum wage by the relevant federal commission, adjusted downwards by a quarter in the years when there were ‘flat dollar’ increases for awards.
- The year-ended increase for collective agreements is the average annualised wage increase for current registered federal collective (enterprise) agreements. This series excludes state collective agreements, and it does not show exactly what wage increases for these agreements were over the past year, given that agreements can run for multiple years. If I’m ever in a situation where I need to refine this methodology further I’ll tinker with this series, but for now I think it will do.
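To make the calculation concrete, here is a minimal sketch of it in Python. The function adjusted_wpi and the example inputs are mine, purely for illustration; they are not actual ABS or Fair Work Commission figures.

```python
# A minimal sketch of the 'adjusted WPI' calculation described above.
# The example inputs are illustrative only, not actual data.

def adjusted_wpi(wpi_growth, share_awards, awards_growth,
                 share_collective, collective_growth):
    """Back out year-ended wage growth for individual arrangements.

    Growth rates are year-ended percentage increases; the shares are the
    proportions of employees on each wage-setting method (taken from the
    Employee Earnings and Hours survey).
    """
    share_individual = 1 - share_awards - share_collective
    return (wpi_growth
            - share_awards * awards_growth
            - share_collective * collective_growth) / share_individual

# Illustrative example: 3.5% WPI growth, with 17% of employees on awards
# receiving 2.5% increases and 40% on collective agreements receiving 4.0%.
print(adjusted_wpi(3.5, 0.17, 2.5, 0.40, 4.0))  # about 3.4 per cent
```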
As you can see from the graph above, the ‘adjusted WPI’ was
around 2 per cent in year-ended terms at the start of the millennium, then
moved up to around 5 per cent when the labour market was tight in 2005-2006,
and has moved back down to the 2-3 per cent range following the global
financial crisis. This series seems to me to be more reflective than the
‘ordinary’ WPI of the relative strength or weakness in the labour market over
this period. Given this, it could be an easier series to forecast, although
obviously to get an aggregate wage growth forecast you would have to add the
effects of awards and collective agreements back in. The adjusted WPI’s correlation with employment growth (lagged four quarters) is a touch stronger than the ordinary WPI’s. Where both fall down is in the period when employment growth plummeted following the height of the GFC, which might well reflect downward ‘stickiness’ of wages.
While I said that I’m not interested here in evaluating
whether a particular level of ‘responsiveness’ in wages is ‘good’ or ‘bad’, it
would be interesting to see how employers with collective agreements respond to
economic shocks compared to employers paying award rates or with individual
arrangements with their employees. I am sure there is plenty of evidence on the effects of various pay-setting methods on activity, productivity, or prices, but I wonder if anyone has specifically looked at the short-term responses of employers to economic shocks based on their pay-setting method.
If your pay-setting method means that wages do not move around much, are you
more likely to change employment instead, or does it make little difference? Are
relatively stable wage increases harmful or helpful? I’ll leave those questions
for someone else to consider.
Sunday, August 25, 2013
AFL Power Rankings: Round 22 2013
RISING UP
In a week in which little happened in the Power Rankings, Fremantle was the only team to move up a spot, shifting from 4th to 3rd. The Dockers are up there with the Cats, Hawks, and Swans as one of the power teams of the AFL competition. Indeed, you could say the rankings system ‘underrates’ Freo because it expresses a team’s worth in terms of net margin. This might be thought to disadvantage teams like the Dockers, which choke the life out of an opposition, rather than teams like the Hawks, which score (and concede) more freely.
FALLING DOWN
With the Dockers being the only team to move up a spot this means the Swans were the only team to move down a spot. But to highlight them here seems a bit unfair to the Swans, who came up against the #1 ranked team on its home turf. So let’s highlight GWS instead, who lost the most ranking points this week, and now look certain to spend the entire 2013 season ranked stone motherless last.
ALSO OF NOTE
Apart from Richmond intruding in Round 14, Geelong, Hawthorn, Fremantle, and Sydney have been the top four teams in the rankings since Round 4, and they will now finish the home and away season in the top four spots. This is in contrast to the actual AFL ladder, which somehow thought Essendon was one of the top four teams up until a few weeks ago, and even thought Port Adelaide belonged there early in the season. The lesson, as always: put your trust in the rankings and you won’t be led astray. And just ignore the fact that it still had West Coast at #6 eight weeks ago.
Saturday, August 24, 2013
Taking Higher Account of Recent Form: Exponential Weightings In The AFL Power Rankings
In the Wooden Finger Depot’s AFL Power Rankings, the more
recent a game is the more weight it carries. But some might think that recent
games do not carry enough weight; for example, it took a while for the rankings
to recognise that Port Adelaide and Essendon (until recently) had stepped up
this season, and that St. Kilda, Adelaide, and West Coast had taken a step
back. What would happen then if recent games were given even more weight – what
if an exponential function was used to assign weightings instead?
In the rankings, each of a team’s past 22 games
is given a weight. Currently, the weight is a linear function of how recent the
game is, as shown in the graph above (most recent games are on the right). To
derive an exponential function, I made the arbitrary choice to reduce the
weight on the least recent game by a quarter, and then found the multiplicative
factor that got the weights to add up to 1, give or take 0.0005.
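For what it’s worth, here is a rough sketch in Python of how that search might go. The linear baseline (weights of 1, 2, …, 22 scaled to sum to 1) is an assumption on my part rather than the exact series used in the rankings, and the bisection is just one way of finding the multiplicative factor.

```python
# A rough sketch of deriving the exponential weights. The linear baseline
# (1, 2, ..., 22 scaled to sum to 1) is an assumption, not necessarily the
# exact series used in the rankings.

N_GAMES = 22

# Assumed linear weights, oldest game first.
linear = [i / sum(range(1, N_GAMES + 1)) for i in range(1, N_GAMES + 1)]

# Cut the weight on the least recent game by a quarter ...
w_oldest = 0.75 * linear[0]

def weight_sum(factor):
    """Sum of the exponential weights w_oldest * factor**i over the 22 games."""
    return sum(w_oldest * factor ** i for i in range(N_GAMES))

# ... then search for the multiplicative factor that gets the weights to add
# up to 1, give or take 0.0005 (a simple bisection does the job).
lo, hi = 1.0, 2.0
while True:
    factor = (lo + hi) / 2
    total = weight_sum(factor)
    if abs(total - 1) < 0.0005:
        break
    lo, hi = (factor, hi) if total < 1 else (lo, factor)

exponential = [w_oldest * factor ** i for i in range(N_GAMES)]
print(f"factor = {factor:.4f}, weights sum to {sum(exponential):.4f}")
```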
What
happens to the rankings after I do this? Well, essentially, what you get is an
indication of which teams have hit form over the past few weeks. In general,
and not surprisingly, the teams that have been in form over the past few weeks
are also the teams that have been in form over the entirety of the season; that
is, the exponential rankings are not too different from the ordinary linear rankings. But there are some big changes; the Western Bulldogs have been a much
better team recently than they have been over the long haul (up in Carlton/Adelaide/Port
Adelaide territory), while Essendon has been much worse (essentially not too
much better than St. Kilda).
I’m
going to stick with the good old linear weights, but what do people think? Do
the exponential weightings look like they give a better indication of where
teams are at?
Sunday, August 18, 2013
AFL Power Rankings: Round 21 2013
RISING UP
It is a familiar feline feeling at the top of the rankings this week,
as Geelong complete the long climb back to #1. The Cats were the #1 ranked team
after 2011, and kept that spot for the first six rounds of 2012, before
plummeting as far down as 9th by Round 16. But Geelong has remained
in the top four throughout all of 2013, typically swinging between 2nd
and 3rd, and with its beating of the Eagles in Perth, it becomes the
third team to take the top spot in the past three weeks. Anyone prepared to
pick a premiership favourite?
FALLING DOWN
The Eagles moved into the top eight last week after kicking the boot
into the Bombers, but were in turn given a beating in the West this week by
the Cat Attack, Geelong.
(And even if you believe wins against Essendon aren’t worth as much at the
moment, the
Eagles wouldn’t have been ranked too much differently.) Take away St. Kilda
and Melbourne, and West Coast has taken the largest step back in 2013.
ALSO OF NOTE
I
have a long memory, or at least a memory that lasts a few weeks, and I took a
larger-than-usual interest in the Essendon-North Melbourne game this week. This
is because of the debate a few weeks back on the Big
Footy forums, which suggested that my rankings had rated North too high and
Essendon too low because they rely too heavily on margins. Instead, it was
postulated, North’s ‘inability’ to win close games in 2013, and Essendon’s ‘ability’
to win them (yes, note those quotation marks) had a psychological effect which
should contribute to the Bombers being ranked higher. Of course, by the very
fact that I’m mentioning this, you can tell that North blew Essendon to kingdom come on the weekend. For a purely numbers-based system, I can sometimes harbour quite an emotional attachment to it. At this rate, don’t count me out of buying a North scarf sometime soon.
Wednesday, August 14, 2013
Those AFL Power Rankings Again: What If An Opponent’s Worth Was Weighted Towards More Recent Games?
In my AFL Power Rankings, which you could not have failed to notice if you have read these pages before, a team’s ranking is a weighted sum of
its ‘relative adjusted net margins’ over its past 22 games. The weights are
higher for more recent games, and a team’s relative adjusted net margin for a
game is its net margin (points for less points against), adjusted for home
ground advantage, and adjusted again for the ‘worth’ of its opponent.
However, one thing that has always slightly bothered me about this formula since its inception is that while a team’s performances are weighted towards more recent games, the ‘worth’ of its opponent is not. The ‘worth’ of its opponent is simply the average of that opponent’s net margin, adjusted for home ground advantage, over the opponent’s past 22 games. Therefore, if an opponent is on the rise, a team will not get as much credit for its performance against it as it would if the opponent’s worth were weighted towards more recent games (think the Bulldogs at the current point), and will get ‘too much credit’ if the opponent is falling (think St. Kilda).
Tonight, I decided to finally see what would happen if I altered this. Under my ‘alternative system’, each team’s adjusted net margin (i.e. its worth as an opponent) is weighted according to the same formula as its relative adjusted net margin usually is. (Note that I’m not planning to switch systems mid-season.) You can see the difference this would make in the table below. I have also included each team’s adjusted net margin under each system to show how its worth as an opponent varies.
First compare columns C and D (which I should probably have made columns A and B). Teams that are improving in form, such as North Melbourne, Collingwood, and the Western Bulldogs, are considered harder opponents under the alternative system (column C) than under the current system. Conversely, teams that are deteriorating in form, such as St. Kilda, Carlton, and Essendon, are considered easier opponents under the alternative system.
Now compare columns A and B, which compare ranking points under the alternative system with those under the current system. You can see there is not a lot of difference in each team’s ranking points. Adelaide moves up a spot, because its most recent opponent, North Melbourne, is considered tougher (it has a higher adjusted net margin) under the alternative system. (Every team’s ranking points also move up a bit - I won’t go into the details, but essentially it’s because the alternative weighting system lowers the effects of last year’s finals series.) Gold Coast and the Bulldogs also switch spots, with the Bulldogs’ win against Carlton last week not rated as highly under the alternative system. But overall I’d say this shouldn’t be keeping anyone up at night … ahem, not that it was.
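To make the two definitions of an opponent’s ‘worth’ concrete, here is a minimal sketch in Python. It is not the actual spreadsheet behind the rankings: the weights list is assumed to be the same 22 recency weights (oldest game first, summing to 1) used for a team’s own performances, and the step that turns a game’s own margin and the opponent’s worth into a ‘relative adjusted net margin’ is left out.

```python
# A minimal sketch of the two ways of calculating an opponent's 'worth',
# not the actual rankings spreadsheet.
# adjusted_margins: the opponent's net margins over its past 22 games,
#   already adjusted for home ground advantage, oldest game first.
# weights: assumed to be the same 22 recency weights used for a team's own
#   games, oldest game first, summing to 1.

def worth_current(adjusted_margins):
    """Current system: a simple (unweighted) average over the past 22 games."""
    return sum(adjusted_margins) / len(adjusted_margins)

def worth_alternative(adjusted_margins, weights):
    """Alternative system: weighted towards the opponent's more recent games."""
    return sum(w * m for w, m in zip(weights, adjusted_margins))

def ranking_points(relative_adjusted_margins, weights):
    """A team's ranking points: the weighted sum of its relative adjusted
    net margins over its past 22 games."""
    return sum(w * m for w, m in zip(weights, relative_adjusted_margins))
```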
Later on, I might try showing the effects of different weighting systems on performances. For example, what if even more weight was given to recent games? There might be a few other experiments I run before next season.
Sunday, August 11, 2013
AFL Power Rankings: Round 20 2013
RISING UP
Things are close up the top of the rankings, and after losing the #1 position for a few weeks, Hawthorn is back in the spot it held for 30-odd weeks over most of 2012 and 2013. The Hawks got a slight bump in ranking points from thrashing St. Kilda, while the previous top-ranked team Sydney dropped a few points from losing at ‘home’ (well, at least in New South Wales) to Collingwood. (And yes, even if ANZ Stadium was considered neutral ground for the Swans, they would have lost the top spot anyway.) Geelong also moves just below the Hawks after Port Adelaide pegged back its lead this week at Simonds Stadium.
FALLING DOWN
Carlton and Essendon take the biggest hits this week, with the Blues losing to the relatively lowly Bulldogs, while the Bombers got thrashed at home by the relatively average Eagles. Both teams are now ranked exactly where they were (10th and 12th) at the start of the season.
ALSO OF NOTE
The top seven ranked teams could be said to have broken away from the rest. Six of those teams – also the top six teams on the AFL ladder – are now certain to be playing in the finals this year. Sadly for the seventh team, North Melbourne, its results haven’t lived up to its ranking.
Thursday, August 8, 2013
The Finger Points Outwards - No. 64
Sunday, August 4, 2013
AFL Power Rankings: Round 19 2013
RISING UP
In a week without much movement in the rankings, probably Collingwood’s gain of six ranking points after belting the Bombers is the most notable development. The Magpies haven’t been so highly rated since their Queen’s Birthday massacre of Melbourne.
FALLING DOWN
Essendon have obviously fallen, and could be as low as 12th soon; the Dons really haven’t had a ‘very good’ performance since the Dreamtime match (though their win in Perth was OK). But Hawthorn’s fall is also notable, with the Hawks moving down to 3rd, a position they haven’t held since … they last lost to Richmond.
ALSO OF NOTE
After breaking clear last week, all of the top three teams fell back a little towards the pack. Sydney fell by the least, and are now looking like the favourite for the flag according to this blog, not that top ranking got Hawthorn over the line last year.
The Wooden Finger Five: August 2013
The big development in terms of music listening over the past month is that I’ve now subscribed to Spotify. This has had the following consequences:
- my data allowance per month now seems woefully inadequate;
- I haven’t listened to any album more than twice in the past 30 days;
- since I can now listen to all the albums I couldn’t afford as a kid, I’ll probably never talk about a song that was released after 1995 on this blog again.
Here are the tracks that I’ve listened to most
since I got access to the whole history of Western rock:
Suede’s ‘Dog Man Star’ was the first album I put on when I signed in to Spotify; I’d heard for years that it was a landmark for Britpop and much better than its more famous ‘Animal Nitrate’-sporting predecessor. ‘… Pigs’ is as catchy and exciting as any of the best Suede singles, and unlike many of the other tracks on ‘Dog Man Star’ it doesn’t try to beat you over the head with its bored sex and drug references. If I ever buy another album again in my life, it could be this one.
Believe it or not, Black Grape’s ‘It’s Great When You’re Straight … Yeah!’ got 10/10 from the New Musical Express upon its release. And it probably is better than anything the Happy Mondays (whose infamy was perhaps greater than their ability, as much as I enjoy the Mondays) ever produced. This track is pretty much made by two of Shaun Ryder’s more memorable lyrics – the first, ‘Jesus was a black man/Nah, Jesus was Batman/No … that was Bruce Wayne’, is completely bonkers, but it’s hard for any pop culture nerd to resist. The second, ‘Don’t talk to me about heroes/Most of these men sing like serfs’, should really be used on some sports show, as the ultimate antidote to/celebration of mindless football hooliganism.
The Kinks in the late-‘60s were nearly as good as the Beatles, with their albums ‘The Village Green Preservation Society’ and ‘Arthur’ (from which ‘Shangri-La’ is taken) being right up there with ‘The White Album’ and ‘Revolver’. If ‘Shangri-La’ just had that slow, regal part about the Citizen Kane-like central figure it would still be a great track. But it’s when it kicks into that swinging London beat near the end that it becomes one of Ray Davies’ best-ever tunes.
Who on earth are the Longpigs? My only memory of them (and I’m not even certain it was them) is when the lead singer told the Herald-Sun’s Hit Magazine that his favourite band was REM because it took them over a decade to become the biggest band in the world, before adding that he hoped it didn’t take his band that long. Well obviously that never ended up happening, but don’t dismiss the Longpigs – they produced some good tracks, and for those who wished that Radiohead had actually produced a proper follow-up to ‘The Bends’ this might do the trick.
OK, this one I definitely didn’t discover through Spotify; this has always been my favourite Elvis Costello track, even more so than the ultra-fashionable ‘Pump It Up’ and ‘Alison’. I liked it even as
a kid, when my Dad played his Elvis cassette, and I sang along to the line ‘And
I would rather be anywhere else than here today… ’ My best friend, with whom I
was playing a not-so-enthralling board game in my room at the time, was
understandably wondering whether he should be insulted, until I assured him that
I was just singing along to the music coming from the stereo. Thank God this song
didn’t come on ten years later with my girlfriend in the bedroom.