The latest simulations of the home and away season appear below.
(If you’re curious about the methodology used to create them, you can start here.)
I could not have picked a worse year to adopt a more conservative simulation methodology. Given how compressed the span of true team abilities appears to be this season, it's hard to know just how absurd a range of 8 to 14.4 for Expected Wins really is.
Regardless, here’s what we have:
The week’s biggest losers in terms of Expected Wins, according to the simulations, were West Coast (down 1.6 Expected Wins), Western Bulldogs (down 1.4), and GWS (down 1.2), while the biggest gainers were Port Adelaide (up 1.5), Fremantle (up 1.3), Carlton (up 1.2), and Collingwood (up 1.0).
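A team's Expected Wins figure is simply its average final win total across the simulation replicates. A minimal sketch of that calculation, using made-up per-game win probabilities rather than the actual team-rating model behind the simulations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the blog's 50,000 season simulations: each
# replicate simulates a team's 10 remaining games independently with
# made-up win probabilities, then adds them to the wins banked so far.
n_replicates = 50_000
p_win_each_game = np.full(10, 0.55)  # illustrative only
wins_so_far = 8

sim_wins = wins_so_far + rng.binomial(
    1, p_win_each_game, size=(n_replicates, 10)
).sum(axis=1)

# Expected Wins = mean final win total across all replicates
expected_wins = sim_wins.mean()
print(round(expected_wins, 1))  # ≈ 13.5, i.e. 8 + 10 × 0.55
```

With 50,000 replicates the Monte Carlo error on the mean is tiny (under 0.01 wins here), which is why week-to-week movements of a full win or more reflect genuine news rather than noise.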
Perhaps more importantly, looking at the simulations through the lens of making the Finals, we have:
West Coast (down 28 percentage points)
Western Bulldogs (down 21 percentage points)
GWS (down 12 percentage points)
Melbourne (down 11 percentage points)
Port Adelaide (up 28 percentage points)
Fremantle (up 21 percentage points)
Collingwood (up 15 percentage points)
St Kilda (up 15 percentage points)
Essendon (up 13 percentage points)
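These finals probabilities come straight out of the same replicates: a team's chance of making the Finals is just the share of simulated seasons in which it finishes in the top 8. A minimal sketch, using a made-up ladder-position distribution purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the simulation output: one team's final
# ladder position in each of 50,000 simulated seasons (1 = top,
# 18 = bottom; a uniform draw, which no real team would have).
positions = rng.integers(1, 19, size=50_000)

# The top 8 make the Finals, so the probability is just a proportion
finals_prob = (positions <= 8).mean()
print(f"{finals_prob:.1%}")  # close to 8/18, about 44%, for this toy draw
```

The same trick gives the probability of any ladder outcome (top 4, 1st, 18th): count the qualifying replicates and divide by 50,000.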
TEAM AND POSITION CONCENTRATION
One way of measuring how much uncertainty there is in the competition is to use the Gini measure of concentration, commonly used in economics for measuring income inequality. Here it serves two purposes: to quantify the spread of each team's estimated final ladder positions across the 50,000 simulation replicates, and to quantify the concentration of probabilities across all the teams vying for any given ladder position.
In the context of a team, a Gini coefficient of 1 implies that the team has only one feasible final ladder position, while a Gini coefficient of 0 implies that all 18 ladder positions are equally likely. Since no team has, as yet, mathematically locked in any ladder position, and none are capable of finishing in every one of the possible ladder positions with equal probability, no team has a Gini coefficient of exactly 0 or 1.
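The Gini calculation itself is straightforward. Here's a minimal sketch using the standard mean-absolute-difference form (the blog's exact formula isn't stated; note that under this form, complete concentration on a single position gives (n-1)/n ≈ 0.94 rather than exactly 1, which some implementations rescale away):

```python
import numpy as np

def gini(p):
    """Gini coefficient of a probability vector p (sums to 1).

    0 = all outcomes equally likely; values near 1 = concentrated
    on a single outcome. Standard mean-absolute-difference form.
    """
    p = np.asarray(p, dtype=float)
    # Mean absolute difference over all ordered pairs of entries
    mad = np.abs(p[:, None] - p[None, :]).mean()
    return mad / (2 * p.mean())

# Illustrative (made-up) distributions over the 18 ladder positions
uniform = np.full(18, 1 / 18)  # every position equally likely
concentrated = np.zeros(18)
concentrated[0] = 1.0          # locked into 1st

print(round(gini(uniform), 3))       # 0.0
print(round(gini(concentrated), 3))  # 0.944, i.e. (n-1)/n
```

Feeding each team's vector of estimated ladder-position probabilities through a function like this yields the team Gini coefficients; feeding each ladder position's vector of team probabilities through it yields the position Gini coefficients.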
But, as we can see in the table at right, Geelong and Carlton are the teams whose Gini coefficients are highest, reflecting the fact that they have a relatively narrow set of likely final ladder positions (Geelong is now estimated to be a 33% chance for 1st and Carlton a 23% chance for 18th).
Conversely, Fremantle, Adelaide and the Brisbane Lions are the teams whose Gini coefficients are smallest, signalling that they have the broadest range of possible ladder finishes. Fremantle are estimated to have a 7% or greater chance for every ladder position from 6th to 13th, Adelaide likewise from 7th to 14th, and the Brisbane Lions likewise from 5th to 12th.
We'll continue to calculate each team's Gini coefficient over the course of the remainder of the season to track the speed at which potential ladder finishes narrow or widen.
In the context of a ladder position, a Gini coefficient of 1 implies that only one team could feasibly finish in that position, while a Gini coefficient of 0 implies that all teams are equally likely to finish in that position. Again, because no ladder position has been secured by any team and because no position is capable of seeing every team occupy it with equal probability, no Gini coefficient is exactly 0 or 1.
Looking at the ladder position Gini coefficients, which also appear in the table, we see that 18th and 1st are the positions with the narrowest range of likely occupants at season's end (as we just saw, Geelong and Carlton have already started measuring up curtains for those positions), whilst positions 6th through 12th have the widest range of potential occupants.
For each of the ladder positions 6th through 12th there are at least four - and as many as seven - teams with 8% or greater chances of occupying it come the end of Round 23, so, here too, the Gini coefficient values seem to make practical sense.
In summary, it's the bottom six and top five ladder positions that are most determined, and the positions from 6th to 12th that are least determined.
Here are the updated assessments of the 30 most-important games between now and the end of the home-and-away season. (See last week’s blog for details about how these are calculated.)
Thirteen of the 30 shown here were also estimated as being in the top 30 last week, so there's a reasonable level of stability in the list. That said, there isn't yet a huge difference in Weighted Average Importance values as we move further down the list: the game in 30th place has a Weighted Average Importance of about 1.1%, while the game in 100th place comes in at 0.9%.
The main message: don’t read too much into these importance ratings just yet. They’re likely to favour late-season games for a while, because those are the games we’re most uncertain about, and they therefore carry the greatest potential to materially affect the finals estimates of both teams in the contest with roughly equal likelihood.