Applying the Win Production Functions to 2009 to 2011

In the previous blog I came up with win production functions for the AFL: ways of estimating a team's winning percentage on the basis of the difference between the scoring shots it produces and those it allows its opponents to create, and the difference between the rate at which it converts those scoring shots and the rate at which its opponents convert theirs.
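Since the fitted coefficients live in the previous post, here is only a minimal sketch of the general shape of such a function, with purely illustrative coefficient values: it maps the two differentials onto an expected win percentage.

```python
def expected_win_pct(ss_diff_per_game, conv_rate_diff, b_ss=0.018, b_conv=2.0):
    """Estimate a team's win percentage from its two differentials.

    ss_diff_per_game : scoring shots produced minus conceded, per game
    conv_rate_diff   : own conversion rate minus opponents', as a proportion
    b_ss, b_conv     : illustrative coefficients only; the fitted values
                       are in the previous post and not reproduced here
    """
    pct = 0.5 + b_ss * ss_diff_per_game + b_conv * conv_rate_diff
    return min(1.0, max(0.0, pct))  # clamp to the [0, 1] range


# A team creating 5 more scoring shots per game and converting 2 percentage
# points better than its opponents projects to a little above 50%
print(expected_win_pct(5, 0.02))
```

A perfectly balanced team (both differentials zero) comes out at exactly 50% under this form, which is the sanity check any candidate win production function should pass.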

First, let's apply those win production functions to this year's ladder as it stands now after 9 rounds.

(As usual, a larger version is available on clicking.)

The block in the middle uses the win production function fitted to the entire expanse of VFL/AFL history, and the block on the right uses the win production function fitted only to the seasons from 1980 to 2010.

Looking first at the middle block, the overall fit to this year's ladder is very good, particularly when you consider that we're only 9 rounds into the season, so the statistical variability is still quite high. The mean absolute deviation (MAD) between expected and actual win percentage is under 10%, and that between expected and actual games won is less than 1 game.

The largest positive deviation belongs to the Gold Coast, whose scoring shot and conversion record suggests it should have won only 5% of its games, or about 2 games fewer than it has. I've commented previously over on the Team Dashboards blog about the remarkable yield that the Suns have reaped from their mediocre statistics.

Geelong owns the next-largest deviation. Its scoring shot and conversion record suggests it should have won only 78% of its games, or about 1.6 games fewer than it has. Its two narrow victories in Rounds 8 and 9 attest to the reasonableness of that assessment.

Essendon have the largest negative deviation, having won about 17% or 1.5 games fewer than their statistics suggest they deserve. They've drawn with Carlton and lost by 5 points to the Swans, so, again, there seems to be some merit in the suggestion that they could be higher on the ladder. The Lions can also consider themselves underachievers on the basis of their statistics. Their win percentage is 16% under what their statistics imply, or about 1.3 games.

Using instead the win production function fitted to more recent seasons, little changes. Overall, the fit is a tad poorer, but the team-by-team picture is very similar.

Next, let's look at the final home-and-away season ladder for 2010.

Again, the middle block uses the win production function fitted to 1897 to 2010, and the rightmost block uses the win production function fitted to 1980 to 2010.

The fits, now based on the 22 games of a full season, are much better, with the average MAD for win percentage around 5-6% and that for games won just 1-1.5 games across an entire season. The Roos and Port Adelaide can each claim to have wrung the largest excess of wins from their statistics, with each winning about 3 games more than their relative scoring and conversion performances indicate.

No team won more than 2 games fewer than it was entitled to. Carlton deserved about 2 more wins, and the Dogs, Cats, Dees and Eagles deserved about 1.5 more.

Had the final eight been based on deserved rather than actual wins (using the win production fitted to 1897 to 2010), the top 4 would have been the same, but differently ordered with the Cats minor premiers, the Pies 2nd, the Dogs 3rd, and the Saints 4th. The bottom 4 would also have had the same composition but a different order, with the Hawks 5th, Carlton 6th, the Swans 7th, and Fremantle 8th. (If we instead used the win production function fitted to 1980 to 2010, Carlton and Sydney swap places.)
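The re-ranking just described is simply a sort on deserved rather than actual wins. As a sketch, with invented deserved-win tallies for the 2010 top four (not the real figures):

```python
# Invented deserved-win tallies for the 2010 top four (not the real figures)
deserved_wins = {"Cats": 17.5, "Pies": 17.1, "Dogs": 15.2, "Saints": 14.8}

# Ladder ordered by deserved wins, highest first
deserved_ladder = sorted(deserved_wins, key=deserved_wins.get, reverse=True)
print(deserved_ladder)  # ['Cats', 'Pies', 'Dogs', 'Saints']
```

In practice ties would need a tiebreaker, such as percentage, just as the actual ladder uses.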

Finally, let's apply the win production functions to the 2009 final home-and-away ladder. 

Here the fit is extraordinarily good. The average MAD for win percentage is under 4% and that for games won is a measly 0.8 games across a 22-game season. The teams in 1st and 2nd, St Kilda and Geelong, are the only ones to have won more than 1 game more than they should have (plus the Lions, if we use the win production function fitted to 1980 to 2010), while West Coast, Sydney and Melbourne are the only teams to have won more than 1 game fewer.

The deserved final eight is almost unchanged from the actual final eight, with the only change being the Lions and Carlton swapping 6th and 7th.

In summary, the win production functions seem to do a good job of predicting the wins that are likely to be generated from a team's relative scoring shot and conversion performances.