Simulating the Finalists for 2015 After Round 14

It seems I'm projecting the likely Finalists earlier and earlier each season, but the competition is at such an interesting juncture that I thought this week might be the right time to start this year.

The technical basis on which I'll be simulating the remainder of the season is the same this year as it was last year although, in light of C_Marg's solid margin-predicting performance so far, I've decided to use its predictions for the margins in the remaining games rather than the model I created last year, which took MARS Ratings as inputs.

That means this week I'll be using the predicted margins in the table at right as my simulation inputs. These predictions use the same methodology as C_Marg to come up with game margins, drawing on the current difference in the teams' ChiPS Ratings, an allowance for HGA and any Interstate Advantage, and a component capturing the teams' recent form (measured by the changes in their ChiPS Ratings over their last two games).
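Purely as an illustration of that general structure (and definitely not the actual C_Marg specification, whose coefficients I'm not reproducing here), a prediction of this kind might be sketched as follows, with all of the weights and the HGA allowance being hypothetical placeholders:

```python
# Illustrative sketch only: the shape follows the description above (Ratings
# difference + HGA/Interstate allowance + recent form), but every coefficient
# here is a hypothetical placeholder, not C_Marg's actual value.

def predicted_margin(home_rating, away_rating,
                     hga=6.0, interstate_adv=0.0,
                     home_recent_change=0.0, away_recent_change=0.0,
                     rating_weight=0.7, form_weight=0.3):
    """Predicted home-team margin (in points) from ChiPS-style inputs.

    home_recent_change / away_recent_change are the changes in each team's
    Rating over its last two games, standing in for 'recent form'.
    """
    rating_component = rating_weight * (home_rating - away_rating)
    form_component = form_weight * (home_recent_change - away_recent_change)
    return rating_component + hga + interstate_adv + form_component
```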

C_Marg is, as we've seen already this season, more willing to make extreme margin predictions, a characteristic that we can see reflected here in, for example, the predictions for the Crows v Lions game in Round 21 and, more dramatically, the Hawks v Lions game in Round 22.

As shown, the margin predictions can be converted to victory probabilities (here by assuming that margins are Normally distributed with a mean equal to the predicted margin and a standard deviation of 36 points). Those probability estimates can then, more conveniently, be arranged as a matrix from which we can readily calculate the expected number of wins for every team across their remaining nine games.
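In code terms that conversion is just the Normal CDF evaluated at the predicted margin scaled by the 36-point standard deviation, and summing a team's probabilities across its remaining fixtures gives its expected wins. A minimal sketch (the use of scipy here is an implementation choice for illustration only):

```python
from scipy.stats import norm

MARGIN_SD = 36  # assumed standard deviation of game margins, in points

def win_probability(predicted_margin, sd=MARGIN_SD):
    """P(home team wins), with margins taken as Normal(predicted_margin, sd)."""
    return norm.cdf(predicted_margin / sd)

def expected_wins(predicted_margins, sd=MARGIN_SD):
    """Expected wins across a team's remaining games, given its predicted margins."""
    return sum(win_probability(m, sd) for m in predicted_margins)

# e.g. a team predicted to win by 18 points gets norm.cdf(18/36), or about 0.69
```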

Hawthorn, for example, is expected to win about seven of the nine, while Carlton is expected to win only a bit over two of the nine.

SIMULATIONS

The predicted margin for each game is used in each simulation to produce a simulated game score. Briefly, the steps for each game in a replicate are as follows:

  1. Generate a simulated Total Score for the game by assuming that the Total Score is a (discretised) Normally distributed random variable with mean 183.4 points and standard deviation 31.7 points.
  2. Generate a simulated Margin by assuming, as noted above, that the Margin is a (discretised) Normally distributed random variable with mean equal to the predicted margin and standard deviation of 36 points.
  3. Calculate the Home Team Score as half the sum of the simulated Total Score and the simulated Margin (rounded up if necessary).
  4. Calculate the Away Team Score as the simulated Total Score less the Home Team Score calculated in Step 3.

This process is repeated for every remaining game of the season and the results are combined with team performances to date to arrive at a simulated final ladder. The entire exercise is then repeated 100,000 times.
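For the curious, here's a minimal sketch of Steps 1 to 4 for a single game using the distributional parameters above (the rounding details are illustrative rather than a precise statement of what I run):

```python
import numpy as np

rng = np.random.default_rng()

TOTAL_MEAN, TOTAL_SD = 183.4, 31.7  # assumed Total Score distribution (points)
MARGIN_SD = 36                      # standard deviation of the Margin (points)

def simulate_game(predicted_margin):
    """Simulate one game, returning (home_score, away_score)."""
    total = int(np.rint(rng.normal(TOTAL_MEAN, TOTAL_SD)))          # Step 1
    margin = int(np.rint(rng.normal(predicted_margin, MARGIN_SD)))  # Step 2
    home = int(np.ceil((total + margin) / 2))                       # Step 3
    away = total - home                                             # Step 4
    return home, away

# One replicate simulates every remaining game this way, adds the results to
# the ladder to date, and the whole exercise is repeated 100,000 times.
```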

The results, shown firstly as manhattan charts (or 'dinosaurs walking past high fences' charts as I think they might more accurately be called this week), are these:

The contrasting variability in the final ladder positions of the different teams is striking. Adelaide, the Cats, the Roos and the Dogs, for example, have a relatively wide range of almost equally-likely finishes, while teams like the Lions, Dockers, Hawks and Eagles have a much narrower palette.

Viewed instead as a heat-map, those results look like this:

From this we can determine that the most likely finishes for each team are:

  • Fremantle 1st
  • West Coast 2nd
  • Hawthorn 3rd
  • Sydney 4th
  • Collingwood and Richmond 5th
  • Western Bulldogs 7th
  • Adelaide, Geelong and the Kangaroos 9th
  • GWS and Port Adelaide 12th
  • St Kilda 13th
  • Melbourne 14th
  • Essendon 15th
  • Gold Coast 16th
  • Carlton 17th
  • Brisbane Lions 18th

We can also estimate each team's likelihood of finishing 1st, finishing in the Top 4, making or missing the Finals, and collecting the Spoon. Those estimates appear in the table below, and their congruence with current TAB prices is assessed in the right-hand side of that table, with incongruent assessments shown as shaded prices indicating that the price represents a positive wagering expectation. (Note that, in keeping with previous seasons, I wouldn't rate bets with an edge below 5% as representing genuine assessed "value", which leaves only the three prices shown in green as carrying worthwhile edge according to the simulations. Note also that the TAB is not fielding a Spoon market at this stage.)

Fairly obviously, the two teams about which the simulations - and, by implication, ChiPS Ratings - differ most significantly from the TAB's assessments are the Eagles and the Dogs. The simulations have the Eagles finishing in the Top 4 over 25% of the time, which implies that their current $4.50 price carries a +22% edge. They also have the Eagles making the Finals in about 24 of every 25 replicates, which implies a +10% edge in their $1.15 price for a spot in the 8.

The simulations also have the Dogs making the 8 in 58% of replicates, which implies a +29% edge in their $2.25 price for achieving that result.
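For reference, the edges quoted here are just the simulated probability multiplied by the TAB's decimal price, minus one. A quick sketch using the Eagles' Finals figure as an example:

```python
def edge(probability, price):
    """Expected profit per unit staked at a given decimal price."""
    return probability * price - 1

# Eagles to make the Finals in about 24 of every 25 replicates, priced at $1.15
print(f"{edge(24/25, 1.15):+.1%}")  # +10.4%, comfortably above the 5% 'value' threshold
```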

Finally, we can use the simulations to review the likely composition of the Top 2, Top 4 and Top 8 teams. 

Fremantle and West Coast finish 1st and 2nd in the largest proportion of replicates (almost 1 in 5), though Fremantle and Hawthorn fill these spots in that order almost as often.

Fremantle, Hawthorn, West Coast and Sydney, in some order, make up all 10 of the 10 most-common Top 4s, the most common ordering of all being Fremantle, Hawthorn, West Coast and Sydney.
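These composition counts are nothing more than tallies across the 100,000 simulated final ladders. A minimal sketch of how such a tally might be made (the list-of-ladders data structure is purely an illustrative choice):

```python
from collections import Counter

def composition_counts(simulated_ladders, n=4, ordered=True):
    """Tally Top-n combinations across simulated final ladders.

    simulated_ladders: an iterable of team-name lists, each ordered 1st to 18th.
    ordered=True counts exact orderings; ordered=False counts unordered sets.
    """
    def key(ladder):
        return tuple(ladder[:n]) if ordered else frozenset(ladder[:n])
    return Counter(key(ladder) for ladder in simulated_ladders)

# e.g. composition_counts(ladders, n=2).most_common(5) lists the five most
# frequent 1st/2nd pairings and the number of replicates in which each occurred.
```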

It's a little early to be assessing the most common Top 8s, with even the current notionally "most likely" combination occurring in fewer than 1 replicate in 1,000. So, consider the following table a list of some of the more probable finishes only; we should look for stronger candidates to emerge in coming weeks.