2019 : Simulating the Final Ladder After Round 13

The results of the latest 50,000 simulations appear below.

(If you’re curious about the methodology used to create them, you can start here.)


After last week’s results there are now 10 teams with (roughly) a 1 in 2 or better shot at the Finals, and five teams with (roughly) a 1 in 3 or better shot at a Top 4 finish.

The movements in terms of Expected Wins this week were fairly small, with no team registering an increase or decrease larger than 0.6 of a win.

In terms of estimated probability of making the Finals, the changes were generally small, too, with the largest being:


  • Port Adelaide (down 12% points)

  • Hawthorn (down 9% points)


  • Fremantle (up 13% points)

  • Essendon (up 10% points)

GWS saw a 10% point increase in its Top 4 prospects, and Port Adelaide a 9% point decline.

Geelong remain almost 80% chances for the minor premiership, Collingwood about 1 in 9 chances, and GWS about 1 in 12 chances.


The detailed view of each team’s estimated probability of finishing in each of the 18 possible ladder positions appears below. Blank cells represent ladder finishes that did not occur even once in the 50,000 simulations, while cells showing a value of 0 represent estimated probabilities below 0.05%.
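The tabulation behind a heatmap like this can be sketched in a few lines of Python. This is only an illustration, not the blog's actual code, and it assumes a hypothetical data layout in which each simulation replicate is stored as a full final ordering of team names:

```python
from collections import Counter

def position_probabilities(sims, teams):
    # sims: list of orderings, where sims[i][0] is the minor premier
    # in simulation i, sims[i][1] the runner-up, and so on.
    counts = {team: Counter() for team in teams}
    for ordering in sims:
        for pos, team in enumerate(ordering, start=1):
            counts[team][pos] += 1
    n = len(sims)
    # Convert counts to estimated probabilities. Positions a team never
    # occupied simply have no entry - the "blank cells" in the heatmap.
    return {team: {pos: c / n for pos, c in ctr.items()}
            for team, ctr in counts.items()}
```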

The heatmap suggests a loose hierarchy of teams is beginning to form:

  • For 1st: Geelong

  • For 2nd & 3rd: Collingwood & GWS

  • For 4th to 10th: Adelaide, West Coast, Fremantle, Brisbane Lions, Richmond, Port Adelaide, and Essendon

  • For 11th to 15th: St Kilda, Hawthorn, North Melbourne, Western Bulldogs, Sydney

  • For 16th: Melbourne

  • For 17th & 18th: Carlton & Gold Coast


This week’s simulations suggest that the results in Round 13 had relatively little effect on the likelihood of any particular team playing Finals given 11, 11.5 or 12 wins.

Similarly, there was little effect on the likelihood of any particular team finishing Top 4 given 13, 13.5 or 14 wins.
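Conditional probabilities like "chance of playing Finals given 11 wins" can be estimated directly from the replicates by filtering on the conditioning event. A minimal sketch, assuming a hypothetical per-team record of (final wins, made finals) pairs, one per simulation:

```python
def finals_prob_given_wins(records, wins):
    # records: list of (final_wins, made_finals) pairs for one team,
    # one per simulation replicate (a hypothetical data layout).
    # Draws count as half a win, so wins may be e.g. 11.5.
    matching = [made for w, made in records if w == wins]
    # Return None if the conditioning event never occurred.
    return sum(matching) / len(matching) if matching else None
```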


This week, for the first time this season, we’ll take a quick look at what the simulations suggest are the most-likely combinations of teams finishing in key positions.

For Top 2s, we have the situation as shown at right, which sees a Geelong-Collingwood 1-2 finish as, narrowly, the most likely, occurring in about 1 simulation in 3.

Just behind it is the Geelong-GWS finish, which arose in 30% of the simulations. After that, the estimated probabilities fall off fairly rapidly, dropping as low as 1% for the tenth-most common pairing of Geelong and the Brisbane Lions.

Across all 10 of the combinations shown here, seven different teams appear at least once: Geelong, Collingwood, GWS, Adelaide, West Coast, Fremantle, and the Brisbane Lions. Combined, the 10 combinations represent 93% of all of the simulations.

Looking next at Top 4s, we find a much greater diversity of potential outcomes, with even the most-likely combination appearing in only just over 1 simulation in 25, that being a Geelong-Collingwood-GWS-Adelaide finish.

In the next-most likely combination, GWS and Collingwood trade 2nd and 3rd to give a Geelong-GWS-Collingwood-Adelaide ordering.

Across all 10 of the combinations shown here, six different teams appear at least once: Geelong, Collingwood, GWS, Adelaide, West Coast, and Fremantle. Combined, the 10 combinations represent only 28% of all of the simulations.

Whilst, clearly, it would be possible to look at the most-frequent Top 8s as well, there’s not a lot to be gained by doing that, since even the most-frequent combination amongst them occurred in only 15 of the 50,000 simulations (ie 0.03%). There’s just not enough power in a sample even as large as 50,000 to differentiate amongst options with probabilities that small.

(For curiosity’s sake, I also looked at the complete orderings of all 18 teams in the simulations. Across the 50,000 simulations, there were 49,998 unique orderings, with just two orderings occurring twice each.)
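Counting the most-frequent ordered Top 2 or Top 4 combinations is a simple matter of tallying the leading slice of each simulated ordering. A sketch only, again assuming the hypothetical layout where each replicate is a full final ordering of team names:

```python
from collections import Counter

def top_k_combinations(sims, k, n_most=10):
    # Tally ordered top-k finishes (so Geelong-GWS-Collingwood-Adelaide
    # is counted separately from Geelong-Collingwood-GWS-Adelaide).
    counts = Counter(tuple(ordering[:k]) for ordering in sims)
    n = len(sims)
    # Return the n_most most common combinations with their
    # estimated probabilities.
    return [(combo, c / n) for combo, c in counts.most_common(n_most)]
```

The same Counter over full 18-team tuples gives the uniqueness check mentioned above: with `k=18`, almost every ordering appears exactly once.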


Next we’ll analyse how likely it is that key positions on the final home-and-away ladder will be determined by percentage because the teams in those positions finish tied on competition points.

This week, 4th and 5th are separated by percentage in about 44% of the simulations (down 2% points), and 8th and 9th are separated by percentage in about 53% of the simulations (down 1% point). As well, 8th and 10th are separated by percentage in about 25% of the simulations, and 8th from 11th in about 10%.

There’s also about a 1 in 12 possibility that 7th and 10th finish equal on wins. Wouldn’t that make for a heckuva finish?
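Estimating how often two ladder positions are separated only by percentage amounts to counting the replicates in which the teams in those positions finish level on competition points. A hedged sketch, assuming a hypothetical layout where each replicate records the final competition-points totals in ladder order:

```python
def separated_by_percentage(sims_points, pos_a, pos_b):
    # sims_points[i] is the list of competition-points totals in final
    # ladder order for simulation i (positions are 1-indexed).
    # If the two positions carry equal points, only percentage
    # separates them on the ladder.
    n = len(sims_points)
    tied = sum(1 for pts in sims_points
               if pts[pos_a - 1] == pts[pos_b - 1])
    return tied / n
```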


One way of measuring how much uncertainty there is in the competition is to use the Gini measure of concentration, commonly used in economics for measuring income inequality. Here, it can be applied in two ways: to quantify the spread of each team's estimated final ladder positions across the 50,000 simulation replicates, and to quantify the concentration in the probabilities across all the teams vying for any given ladder position.
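For reference, the Gini coefficient of a probability vector can be computed with a standard sorted-cumulative formulation (this is a textbook version, not necessarily the calculation used here):

```python
def gini(probs):
    # Gini concentration of a non-negative vector, e.g. one team's
    # estimated chances across the 18 ladder positions. Values near 0
    # mean the mass is spread evenly (high uncertainty); values near
    # (n-1)/n mean it is concentrated in one position (high certainty).
    xs = sorted(probs)  # ascending order
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    total = sum(xs)  # assumed non-zero
    return (2 * cum) / (n * total) - (n + 1) / n
```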

This week saw the uncertainty about the final ordering of the teams decrease substantially, with the Gini coefficients rising to around 0.58 to 0.59, a season high.

At the team level, we saw reductions in uncertainty for every team, with the largest relative reductions for Fremantle, Hawthorn, North Melbourne and Adelaide.

Essendon, Port Adelaide, Richmond and the Brisbane Lions now have the most uncertainty, while Geelong, Gold Coast, Carlton, and Collingwood still have the most certainty about their eventual ladder finishes.

Next, if we adopt a ladder position viewpoint, we see that 1st and 18th remain the positions with the narrowest range of likely occupants, with about 90% chances that either of two teams will occupy them at season’s end, whilst positions 7th through 12th still have the widest range of possible tenants.

Every ladder position, however, became at least a little more certain about who will occupy it come the end of the home-and-away season.


Here are the updated assessments of the 30 most-important games between now and the end of the home-and-away season. (See this blog for details about how these are calculated.)

This week, the Top 11 and all but two of the Top 25 are common to last week’s list.

In terms of the temporal distribution of these games, we now have:

  • Rounds 14 to 17: 9 games from 33 (27%)

  • Rounds 18 to 20: 10 games from 27 (37%)

  • Rounds 21 to 23: 11 games from 27 (41%)

We see only small changes this week in each team’s level of involvement in the games assessed as important:

  • 9 games involving Richmond (up 2)

  • 8 games involving Port Adelaide (no change)

  • 7 games involving Brisbane Lions (no change) or Essendon (up 2)

  • 5 games involving Fremantle (no change)

  • 3 games involving West Coast (down 1), North Melbourne (no change), or St Kilda (no change)

  • 2 games involving Adelaide (no change), Collingwood (up 2), GWS (up 1), Sydney (up 1), or Western Bulldogs (no change)

  • 1 game involving Carlton (no change), Geelong (down 1), Gold Coast (up 1), Hawthorn (down 2), Melbourne (down 1)

We can again see the common sense of this list when we compare it with the simulated probabilities for teams finishing in 8th or 9th, which are:

  • 23%: Richmond

  • 22%: Brisbane Lions, Port Adelaide, and Essendon

  • 20%: Fremantle

  • 16%: West Coast

  • 15%: St Kilda and Hawthorn

  • 14%: Adelaide

  • 10%: North Melbourne

No other team has an estimated probability of finishing 8th or 9th higher than 7%.