With four rounds of the home-and-away season remaining, four teams currently outside the 8 are mathematically still capable of making the Finals. Time then to put some probabilities against those possibilities.
In previous years' simulations I've taken the short-cut of modelling game outcomes by coming up with an expected game margin and then generating random Normals with mean equal to that expected margin and a standard deviation of about 37 points. That approach makes it easy to simulate the winner and margin of each game, but it requires additional simplifying assumptions to split teams that finish level on competition points. This year, percentages will almost certainly feature in the determination of final ladder positions, so I've implemented a slightly more complex and realistic methodology in which I simulate the scores of the two teams in each game as bivariate Normals.
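For concreteness, the margin-only short-cut can be sketched as follows. This is a minimal Python illustration, not the actual code used in previous years; the 37-point standard deviation is the figure quoted above, while the 12-point expected margin is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(42)

# Old approach: simulate game margins directly as univariate Normals.
expected_margin = 12.0   # illustrative expected home-team margin, in points
margin_sd = 37.0         # approximate historical SD of game margins

# Draw 10,000 simulated margins for this one game.
margins = rng.normal(loc=expected_margin, scale=margin_sd, size=10_000)

# A positive margin means the home team wins that simulated game.
home_win_prob = np.mean(margins > 0)
```

Note that this yields winners and margins but no team scores, which is why equal-points teams can't be split on percentage without further assumptions.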
The first step in that approach was to fit a bivariate Normal to historical score data, for which purpose I used the sem function from R's lavaan package, applied to all games from the start of season 2000 to the end of Round 17 of season 2012. Team MARS Ratings were the only inputs. (I'd love to use TAB Bookmaker probabilities, but if I knew those in advance - well, let's just say I wouldn't be writing this blog ...)
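lavaan's sem fits the two score equations jointly, but because both equations share the same regressors, a back-of-envelope equivalent is an ordinary least squares fit per equation followed by estimating the residual variances and covariance. A hedged Python sketch of that idea, using fabricated ratings data since the historical MARS data set isn't reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the historical data (the real fit used MARS
# Ratings for all games from season 2000 to Round 17 of 2012).
n = 500
home_rating = rng.normal(1000, 20, n)
away_rating = rng.normal(1000, 20, n)
X = np.column_stack([np.ones(n), home_rating, away_rating])

# Synthetic scores generated to be roughly consistent with the fitted
# equations quoted in the post, plus Normal noise.
home_score = 193.665 + 0.341 * home_rating - 0.436 * away_rating + rng.normal(0, 25.7, n)
away_score = 153.281 - 0.393 * home_rating + 0.330 * away_rating + rng.normal(0, 25.7, n)

# One OLS fit per equation; with identical regressors in each equation
# these coefficient estimates coincide with the joint (SUR-style) fit.
beta_home, *_ = np.linalg.lstsq(X, home_score, rcond=None)
beta_away, *_ = np.linalg.lstsq(X, away_score, rcond=None)

# Residual variance-covariance matrix of the two score equations.
resid = np.column_stack([home_score - X @ beta_home,
                         away_score - X @ beta_away])
vcov = np.cov(resid, rowvar=False)
```

On real data the diagonal of vcov corresponds to the two estimated score variances and the off-diagonal to their covariance.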
These equations emerged:
- Home Team Score = 193.665 + 0.341 x Home Team MARS Rating - 0.436 x Away Team MARS Rating
- Away Team Score = 153.281 - 0.393 x Home Team MARS Rating + 0.330 x Away Team MARS Rating
The estimated variances were, for the Home Team Score, 659.173, and for the Away Team Score, 659.914; and the estimated covariance was -65.644. (These imply that the team score standard deviations are about 25.7 points per game, and the correlation between team scores is about -0.1).
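Putting the fitted equations and the estimated variance-covariance matrix together, a single game can be simulated as draws from a bivariate Normal. A Python sketch, where the two MARS Ratings (1010 and 995) are illustrative values rather than any team's actual Rating:

```python
import numpy as np

rng = np.random.default_rng(1)

# Expected scores from the fitted equations for a hypothetical pairing.
home_mars, away_mars = 1010.0, 995.0
mu_home = 193.665 + 0.341 * home_mars - 0.436 * away_mars
mu_away = 153.281 - 0.393 * home_mars + 0.330 * away_mars

# Estimated variance-covariance of the two team scores, as quoted above.
cov = np.array([[659.173, -65.644],
                [-65.644, 659.914]])

# Simulate the game 1,000 times as bivariate Normal score pairs.
scores = rng.multivariate_normal([mu_home, mu_away], cov, size=1_000)

home_win_prob = np.mean(scores[:, 0] > scores[:, 1])
home_margins = scores[:, 0] - scores[:, 1]
```

Because each draw is a pair of scores rather than just a margin, every simulated season carries the points for and against needed to compute percentages.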
Using these equations and teams' current MARS Ratings allows me to come up with an expected score for each team in each of the remaining 36 games of the home-and-away season, and to simulate actual scores.
Here are those expected scores for each game, along with the simulated probabilities of the various possible results (across 1,000 simulations).
(So uncharitable is the simulation process that it doesn't think GWS could beat St Kilda in Round 22 even if the game were played 1,000 times. Even if that's statistically accurate, I can't see the TAB offering 1,000/1 odds on GWS anytime soon.)
Feeding these expected scores and the variance-covariance information - which I've assumed to be the same for every game - into a bivariate Normal generator in Excel yields the following simulated ladder position data:
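Simulated scores make the percentage tie-break mechanical: within each simulated season, teams are ordered first on competition points and then on percentage (100 times points for divided by points against). A small sketch of that ordering step, with fabricated team names and season totals:

```python
# Hypothetical per-team totals from one simulated run of the remaining
# fixtures (names and numbers are illustrative only).
teams = {
    # name: (competition points, points for, points against)
    "Team A": (60, 2100, 1900),
    "Team B": (60, 2050, 1950),
    "Team C": (56, 2200, 1800),
}

def ladder_key(item):
    name, (comp_points, pts_for, pts_against) = item
    percentage = 100.0 * pts_for / pts_against  # the AFL 'percentage' tie-break
    return (comp_points, percentage)

# Highest competition points first; percentage splits teams level on points.
ladder = sorted(teams.items(), key=ladder_key, reverse=True)
```

Repeating this ordering across all simulated seasons gives the ladder-position probabilities reported below.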
Sydney are estimated as better than even money chances of running out minor premiers, with Adelaide as about 3/1 shots, Hawthorn as 4/1 chances, and Collingwood as about 19/1 chances.
Adelaide, Collingwood, Hawthorn and Sydney are the only teams with a (simulated) guaranteed Finals berth, though Geelong and the Eagles are also virtual certainties, both with probabilities exceeding 90%. The Roos and Fremantle are the only other teams with better than even prospects of playing in September, with the Dons rated as slightly worse than even-money chances. Other teams carrying non-zero hopes are St Kilda (21%), Carlton (7%), and Richmond (0.8%).
Adelaide, Hawthorn and Sydney are also virtually guaranteed top 4 finishes, with Collingwood the overwhelming favourite to grab the other spot. Geelong and West Coast are the only other teams with simulated odds shorter than 10/1 to wind up with the double-chance. Essendon, Fremantle or the Roos could also finish 4th, but only under extraordinary circumstances.
The battle for the spoon is a two-team race, with GWS rated about three times as likely as Gold Coast to finish in last place.