2019: Simulating the Finals Series After Round 14

So far this season the simulations have ended at the conclusion of the home and away season. Today we'll extend the simulations to include the entirety of the Finals series.

In simulating Finals, we'll use a methodology similar to the one we've been using to simulate home and away games (which you can review here), with one exception: for the purposes of determining what standard deviation to use when we perturb the underlying team Ratings, we'll assume that all Finals are played on a single, arbitrary day in September, namely the 7th.

To be honest, that decision is mostly about minimising the changes I need to make to the existing script in order to simulate Finals, and the practical implication is that we're implicitly assuming we're only a little more uncertain about teams' MoSHBODS Ratings for all of the Finals than we are about them for the last week of the home and away season. There's no doubt I need to review how I've modelled the relationship between time and Rating uncertainty this year - my sense is that I've incorporated too much variability for games more than about 8 weeks away - and that review will include deciding how to model uncertainty for Finals when those Finals are still quite distant. If nothing else, the life of the predictive modeller is one of doubt and incremental adjustment.
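For concreteness, here's a minimal Python sketch of that perturbation step. To be clear, this is not the actual simulation script: the square-root growth of uncertainty with time, the scale factor, and the dates are all purely illustrative assumptions.

```python
import numpy as np
from datetime import date

rng = np.random.default_rng(42)

# Illustrative assumption: Rating uncertainty grows with the square
# root of the number of days until the game is played. Neither this
# functional form nor the 0.1 scale factor is the actual MoSHBODS
# parameterisation.
def rating_sd(game_date: date, as_at: date, scale: float = 0.1) -> float:
    days_ahead = max((game_date - as_at).days, 0)
    return scale * np.sqrt(days_ahead)

# Treat every Final as if it were played on 7 September, so that all
# Finals share a single perturbation standard deviation.
AS_AT = date(2019, 6, 23)       # notional post-Round 14 date
FINALS_DATE = date(2019, 9, 7)

def perturb_rating(rating: float) -> float:
    return rating + rng.normal(0.0, rating_sd(FINALS_DATE, AS_AT))

print(perturb_rating(10.5))     # one perturbed Rating draw for a Final
```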

Anyway, to apply this methodology we need to determine venues for Finals contests, for which purpose I've assumed, as I did last year, that:

  • All Victorian teams play their home Finals at the MCG

  • GWS play home Finals at Sydney Showground

  • Sydney play home Finals at the SCG

  • The Brisbane Lions play home Finals at the Gabba

  • Gold Coast play home Finals (sic) at Carrara

  • Adelaide and Port Adelaide play home Finals at the Adelaide Oval

  • Fremantle and West Coast play home Finals at Perth Stadium

For the Grand Final, a quick logic check is performed and, provided the year is before 2058, it is assumed to be played at the MCG. (I thought that joke was pretty good last year, and I haven’t changed my mind in the interim).
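In code, those venue rules amount to little more than a lookup table plus the Grand Final check. Here's a minimal sketch; the team labels and the set of Victorian teams are my assumptions about how a script might encode them.

```python
# Home Finals venues for the non-Victorian teams, per the list above.
HOME_FINAL_VENUE = {
    "GWS": "Sydney Showground",
    "Sydney": "SCG",
    "Brisbane Lions": "Gabba",
    "Gold Coast": "Carrara",
    "Adelaide": "Adelaide Oval",
    "Port Adelaide": "Adelaide Oval",
    "Fremantle": "Perth Stadium",
    "West Coast": "Perth Stadium",
}
VICTORIAN = {"Carlton", "Collingwood", "Essendon", "Geelong", "Hawthorn",
             "Melbourne", "North Melbourne", "Richmond", "St Kilda",
             "Western Bulldogs"}

def finals_venue(home_team: str, is_grand_final: bool = False,
                 year: int = 2019) -> str:
    if is_grand_final:
        # The "logic check" from the post: the MCG hosts the Grand
        # Final provided the year is before 2058.
        assert year < 2058, "Grand Final venue unknown beyond 2057"
        return "MCG"
    if home_team in VICTORIAN:
        return "MCG"
    return HOME_FINAL_VENUE[home_team]
```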

Applying the methodology to the 50,000 final ladders produced by our original simulations yields the following results.

The height of each bar provides an estimate of the probability that the team will exit in a specified week of the Finals. Geelong, for example, are about a 45% chance of exiting in a Preliminary Final. The internal colouring of each bar reflects the team's final home-and-away ladder position in the replicates where it went out in that particular week.

So, for example, in those replicates where Geelong wins the Flag, well over half of them come in simulated seasons where they were projected to finish 1st in the home and away season (ie the red portion of the rightmost bar represents over one half of the height of that bar).

In the top section of the chart we use a common y-axis for all teams, which makes it harder to discern some of the details for teams with smaller Finals chances. The bottom section of the chart addresses this by representing the same data but with different y-axes for each team.
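If you're wondering how a chart like that is assembled from raw simulation output, here's a rough Python sketch. The replicate-level data frame is hypothetical; the actual simulation output will be shaped differently.

```python
import pandas as pd

# Hypothetical replicate-level output: one row per team per replicate,
# recording final home-and-away ladder position and Finals exit week.
replicates = pd.DataFrame({
    "replicate":  [1, 1, 2, 2, 3, 3],
    "team":       ["Geelong", "GWS"] * 3,
    "ladder_pos": [1, 2, 1, 3, 2, 1],
    "exit_week":  ["Won Flag", "Prelim", "Prelim",
                   "Semi", "Won Flag", "Lost GF"],
})

# Bar heights: share of replicates in which each team exits each week.
heights = (replicates.groupby(["team", "exit_week"]).size()
           .div(replicates["replicate"].nunique())
           .unstack(fill_value=0))

# Bar colouring: within each team/exit-week bar, the split by final
# home-and-away ladder position.
colouring = pd.crosstab([replicates["team"], replicates["exit_week"]],
                        replicates["ladder_pos"], normalize="index")
print(heights)
print(colouring)
```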

WEEK OF ELIMINATION IN FINALS

In this next chart we look at teams' chances of the various Finals finishes, ignoring their home and away ladder positions (ie we focus solely on the heights of the bars in the previous chart). The numbers shown inside a cell are the proportion of simulation replicates (multiplied by 100) in which the specified team went out in the specified week of the Finals.

We see here that, if we define the season in terms of the six events listed above plus "Miss the Finals", the most-likely finishes for each team are estimated to be:

  • Lose in a Preliminary Final: Geelong, Collingwood and GWS

  • Lose in a Semi Final: West Coast

  • Lose in an Elimination Final: Adelaide, Brisbane Lions, Port Adelaide, Fremantle, and Richmond

  • Miss the Finals: all other teams
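Mechanically, those most-likely finishes are just row-wise maxima of the table behind the chart. Here's a small sketch, using made-up shares that roughly echo the results above rather than the actual simulation output.

```python
import pandas as pd

# Assumed shape: rows are teams, columns are exit events, and values
# are the share of the 50,000 replicates ending that way. These
# numbers are illustrative only.
shares = pd.DataFrame(
    {"Elim Final":    [0.05, 0.20],
     "Semi Final":    [0.15, 0.30],
     "Prelim Final":  [0.45, 0.20],
     "Lost GF":       [0.15, 0.10],
     "Won Flag":      [0.20, 0.05],
     "Missed Finals": [0.00, 0.15]},
    index=["Geelong", "West Coast"])

cell_values = (shares * 100).round(1)  # the numbers shown in each cell
most_likely = shares.idxmax(axis=1)    # modal finish for each team
print(most_likely)
# Geelong       Prelim Final
# West Coast    Semi Final
```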

In terms of the Flag, we have Geelong just slightly more likely to win it than GWS or Collingwood, with West Coast and Adelaide roughly on the next line of betting.

GRAND FINAL PAIRINGS

In this final chart we look at all of the Grand Final pairings that occurred in at least one of the simulation replicates. The numbers shown inside a cell are the proportion of simulation replicates (multiplied by 100) in which the team named in the row defeated the team named in the column in the Grand Final.
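Building that matrix from the replicates is essentially a one-liner with a crosstab. A minimal sketch, using a few made-up Grand Final results:

```python
import pandas as pd

# Hypothetical replicate-level Grand Final results, one row per replicate.
gf = pd.DataFrame({
    "winner": ["GWS", "Geelong", "GWS", "Collingwood"],
    "loser":  ["Geelong", "GWS", "Collingwood", "Geelong"],
})

# Cell (row, column): percentage of all replicates in which the row
# team beat the column team in the Grand Final.
pairing_pct = pd.crosstab(gf["winner"], gf["loser"], normalize=True) * 100
print(pairing_pct.round(2))
```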

We see that the most common Grand Final has GWS beating Geelong, though that's about as likely as the converse result. Each of these occurred in just over 5% of replicates.

Because the Finals are still quite some time away, as noted earlier, we're injecting quite a lot of variability into the underlying team Ratings. That moves all contests nearer to 50:50 propositions, which is why many of the values for Team A defeating Team B are close to those for Team B defeating Team A in the table above. This will change as the Finals draw nearer and the level of Rating uncertainty diminishes.
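You can see that flattening effect with a toy calculation. Here I assume the win probability is a normal CDF of the rating gap (an illustrative link function and match-day standard deviation, not the ones MoSHBODS uses) and average it over increasingly noisy Rating draws.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative link: win probability as a normal CDF of the rating
# gap, with a 36-point on-the-day standard deviation (an assumption).
def win_prob(gap, match_sd=36.0):
    return norm.cdf(gap / match_sd)

gap = 12.0                        # Team A's true Rating edge
print(win_prob(gap))              # probability with no Rating uncertainty

# Perturb the gap as the simulations do, then average: the extra
# variance drags the average probability back towards 0.5.
for rating_sd in (5.0, 20.0, 50.0):
    draws = rng.normal(gap, rating_sd, size=100_000)
    print(rating_sd, win_prob(draws).mean())
```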

(Note that zeroes in the chart represent pairings that did occur at least once but in less than 0.05% of replicates.)