Simulating the Finalists for 2015 After Round 20

The ChiPS System, on whose opinions I've been basing this year's Finalist simulations, thought Fremantle would go down to the Eagles last weekend, and also that Hawthorn would win. Both of those predictions proved accurate and, as a result, ChiPS has significantly reassessed the Minor Premiership chances of those three teams, as well as the Top 4, Top 8 and Spoon chances of all the teams.

Fremantle is now rated a 60% chance by ChiPS for the Minor Premiership, an assessment somewhat below that of the TAB, which at $1.30 rates them as between about 70 and 75% chances, depending on the assumption you make about the overround embedded in that price.
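The arithmetic behind that 70-to-75% range can be sketched as follows. This is a simple proportional-margin model of overround removal, not necessarily the exact adjustment the TAB applies; the 5% and 10% overround figures are illustrative assumptions.

```python
# Convert a quoted decimal price to an implied win probability,
# removing an assumed level of overround (bookmaker margin).

def implied_probability(price: float, overround: float = 0.0) -> float:
    """Implied probability for a decimal price.

    overround is the total margin assumed to be embedded in the market,
    e.g. 0.05 for 5%. A simple model scales the raw inverse price down
    by (1 + overround) to recover a 'fair' probability.
    """
    return (1.0 / price) / (1.0 + overround)

raw = implied_probability(1.30)         # no overround adjustment: ~77%
low_or = implied_probability(1.30, 0.05)   # assuming 5% overround
high_or = implied_probability(1.30, 0.10)  # assuming 10% overround

print(round(raw, 3), round(low_or, 3), round(high_or, 3))
```

Under those assumptions the $1.30 price maps to roughly 70 to 73% after the margin is stripped out, consistent with the range quoted above.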

That's just over a 16 percentage-point reduction in Freo's chances, the beneficiary of that reduction being, mostly, the Hawks, whose probability increased by just over 13 percentage points. That assessment makes the Hawks value at their current $4.75 price for the Minor Premiership. West Coast also saw its probability for the top spot increase, but by only about 3 percentage points.

In the race for a Top 4 spot, only two teams enhanced their chances last weekend: the Hawks, who moved to near-certainties, and the Dogs, who moved from 15% to 33% chances. Richmond (down 12 percentage points) and Sydney (down 5) were the weekend's big losers, though Richmond does appear to represent value at $8 for a Top 4 finish.

Adelaide made the largest move in the Final 8 market, their victory seeing their chances lift by over 22 percentage points (much to my relief after my recent Adelaide Advertiser coverage). They currently look value at the TAB's price of $1.28. The Dogs and the Roos also made non-trivial improvements to their Top 8 chances, the Dogs now becoming near-certainties and the Roos almost 88% chances.

Geelong (down 21%), GWS (down 10%) and Collingwood (down 6%) saw the largest falls in their Top 8 chances, though the Giants at $11 seem slightly generously priced.

In the only other market tracked here, the Spoon market, there was a big shake-up, with the Lions' victory over Carlton seeing those two teams now assume rough equal favouritism for an 18th-placed finish.

ChiPS' input matrix for the 100,000 simulation replicates appears below.

The usual Dinosaur Chart and Heat Map follow and depict the ever-diminishing range of ladder finishes that each team might reasonably entertain. Adelaide, Geelong, GWS, the Kangaroos, Richmond and the Dogs (and maybe Sydney and Collingwood) are the only teams with genuine prospects for four or more different positions, all other teams being limited now to just two or three.

TOP 2s, 4s AND 8s

In the latest round of simulations, the most common 1-2 finish saw Fremantle take out the Minor Premiership and Hawthorn finish as Runners Up. That result appeared in about 36% of replicates. Next most common was a Fremantle / West Coast finish, which appeared in just under a quarter of all replicates, slightly more often than a Hawthorn / Fremantle finish, which appeared in about 22% of all replicates.

The Dogs and Swans made very unexpected appearances in 2nd place in a handful of replicates, the most unlikely of which involved a Fremantle / Sydney 1-2 finish and which cropped up in only 2 of the 100,000 replicates. 

Both of the two most-common Top 4s saw Fremantle finish as Minor Premiers and the Swans grab 4th, with 2nd and 3rd places taken by the Hawks and the Eagles in one order or the other. These finishes each appeared in about 1 replicate in 8.

The third-most common Top 4, which occurred only slightly less often, saw the Dogs take 4th behind Freo, Hawthorn and West Coast, in that order. 

A number of the less-likely Top 4s had Richmond taking 4th spot, though none of these combinations appeared in more than about 1 replicate in 20.

Turning, lastly, to Top 8s, we find that there is still no ordering with a probability exceeding 2%, and six different orderings with probabilities exceeding 1%. Fremantle finishes top in six of the 10 most-common orderings, Hawthorn 2nd in six, West Coast 3rd in six, Sydney 4th in seven, the Bulldogs 5th in five, Richmond 6th in seven, the Roos 7th in eight, and Adelaide 8th in eight.
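Tallies of this kind fall straight out of the replicate data. The sketch below shows the counting logic on a tiny fabricated set of replicates; the team abbreviations and orderings are illustrative stand-ins for the 100,000 ChiPS replicates, not actual simulation output.

```python
# Count the most common finishing orderings, and how often each team
# fills a given ladder position, across simulation replicates.
from collections import Counter

# Each replicate is an ordering of the Top 8 (fabricated examples)
replicates = [
    ("FRE", "HAW", "WCE", "SYD", "WBD", "RIC", "NTH", "ADE"),
    ("FRE", "HAW", "WCE", "SYD", "WBD", "RIC", "NTH", "ADE"),
    ("HAW", "FRE", "WCE", "SYD", "RIC", "WBD", "NTH", "ADE"),
    ("FRE", "WCE", "HAW", "SYD", "WBD", "RIC", "NTH", "ADE"),
]

ordering_counts = Counter(replicates)
total = len(replicates)

# Estimated probability of each distinct ordering, most common first
for ordering, count in ordering_counts.most_common(3):
    print(ordering, round(count / total, 2))

# How often each team finishes in a particular position, e.g. 1st
first_place = Counter(rep[0] for rep in replicates)
print(first_place.most_common(1))
```

The same pattern, applied position by position, yields the "Fremantle tops six of the 10 most-common orderings" style of summary above.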


Again this week we look, firstly, at inter-team dependencies by inspecting the simulation replicates to see how often Team A makes the Finals depending on whether Team B makes or misses the Finals.

One curiosity of this week's analysis is that the Crows' chances of making the Final 8 are actually enhanced by the Pies' making the Final 8 because, to do so, the Pies would need to defeat both Geelong and Richmond. In net terms, that turns out to be beneficial to the Crows, though it's so unlikely that it's mostly of academic interest.

In reality, Adelaide's chances are far more bound up with Geelong's, the Crows' prospects falling to 45% should the Cats sneak in.

Geelong's fate is about equally tied to Adelaide's, the Kangaroos', Richmond's and the Dogs', the Cats' probability climbing should any of them miss out. Realistically though, the Cats might only hope to take the Crows' or the Roos' spot in the Final 8.

GWS's hopes are most linked to the Roos', and the Roos are most at risk from Geelong and GWS amongst the Finals aspirants with non-trivial chances of making and missing the Finals.
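Conditional probabilities like these are just counts over the relevant subset of replicates. A minimal sketch, using fabricated replicates and treating each one as the set of teams making the Final 8:

```python
# Estimate P(Team A makes the 8 | Team B makes / misses the 8) by
# counting over simulation replicates. Data below is fabricated.

replicates = [
    {"ADE", "GEE"}, {"ADE"}, {"GEE"}, {"ADE"}, {"ADE", "GEE"}, {"ADE"},
]

def conditional_finals_prob(a, b, b_makes, reps):
    """P(a in Final 8, given b's Final-8 status equals b_makes)."""
    relevant = [r for r in reps if (b in r) == b_makes]
    if not relevant:
        return float("nan")
    return sum(a in r for r in relevant) / len(relevant)

p_given_in = conditional_finals_prob("ADE", "GEE", True, replicates)
p_given_out = conditional_finals_prob("ADE", "GEE", False, replicates)
print(round(p_given_in, 2), round(p_given_out, 2))
```

Note that when the conditioning event is rare (as with a Pies Top 8 finish), the `relevant` subset is small and these estimates carry considerable sampling error, which is part of why the Collingwood result above is best treated as a curiosity.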


Lastly, we'll look at the extent to which each of the 27 remaining games might affect the composition of the Final 8, using the methodology I first described in this post from a few weeks back and then refined by adding a Weighted Impact Index in this post from a few weeks later.

In the comments on last week's Finals simulations blog, Nick raised a possible anomaly in the results, which got me thinking about the sampling variability of the Raw and Weighted Impact Indexes. They're based on 10,000 simulations - my script is not yet optimised sufficiently to allow me to run many more than this in a sensible timeframe - so for games where the home team is, say, a 10% chance, the estimate of a team's chances of making the Final 8 contingent on the home team winning is likely to be based on about 1,000 simulations, for which the standard error could be as high as 1.5 percentage points. As the home team's probability moves nearer 50%, this standard error will decrease, but could still be as much as half that amount. In the context of assessing the impact of a single game on an individual team, that's a non-trivial standard error.
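Those standard error figures follow from the usual binomial formula, SE = sqrt(p(1 - p)/n), evaluated at the worst case p = 0.5:

```python
# Worst-case standard error of a proportion estimated from n simulations
import math

def binomial_se(p: float, n: int) -> float:
    """Standard error of a proportion p estimated from n trials."""
    return math.sqrt(p * (1.0 - p) / n)

# Home team a 10% chance: conditional estimates rest on only ~1,000 of
# the 10,000 simulations, so the SE can reach ~1.6 percentage points
se_small = binomial_se(0.5, 1_000)

# Home team near 50%: the conditional sample grows to ~5,000, roughly
# halving the worst-case standard error
se_large = binomial_se(0.5, 5_000)

print(round(se_small * 100, 2), round(se_large * 100, 2))
```

The square-root dependence on n also explains why simply running more replicates is an expensive fix: quartering the standard error requires sixteen times as many simulations.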

Accordingly, this week, rather than ranking each of the remaining 27 games, I've instead included a broader assessment of each game's impact, rating them as either Low, Medium, High or Very High. Whilst sampling error might bounce a single game's ranking around, it's less likely to change its impact rating on this 4-point scale.

So, here are the ratings:

(Note that I've removed any reference to the Raw Impact Index now. For reasons that I touched on in earlier blogs, it's an inferior measure, the more so now when I think about the effects of sampling error.)

Four games across the remaining rounds, then, have been assessed as having Very High impact. These are the games where flipping the result from a home win to a home loss has the greatest aggregate absolute impact on each of the Finals aspirants' chances, adjusting the raw absolute impacts to account for the relative likelihood of a home win versus a home loss.
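One plausible formalisation of that calculation is sketched below. It's not necessarily the exact Weighted Impact Index described in the earlier posts - in particular, weighting by the probability of the less likely outcome is an assumption on my part - and all the probabilities are fabricated for illustration.

```python
# Sketch of a weighted impact measure for a single game: the aggregate
# absolute swing in each Finals aspirant's Top 8 probability when the
# game's result is flipped, downweighted when the flip is improbable.

def weighted_impact(p_home_win, probs_if_win, probs_if_loss):
    """Aggregate absolute change in Final-8 chances across teams,
    weighted by the chance of the less likely ('flipped') outcome."""
    raw = sum(abs(w - l) for w, l in zip(probs_if_win, probs_if_loss))
    # A flip that is nearly impossible should carry little weight
    return raw * min(p_home_win, 1.0 - p_home_win)

# Final-8 chances of three (hypothetical) aspirants under each result
probs_if_home_win = [0.90, 0.40, 0.25]
probs_if_home_loss = [0.70, 0.65, 0.10]

index = weighted_impact(0.60, probs_if_home_win, probs_if_home_loss)
print(round(index, 3))
```

Games could then be bucketed into the Low/Medium/High/Very High ratings by thresholding this index, which, as noted above, is more robust to sampling error than a strict ranking.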

This week we've only one game assessed as having Very High impact, the Saints v Cats game on Saturday. We've also two games assessed as having High impact (GWS v Sydney, and the Kangaroos v Fremantle) and two as having Medium impact.

Next week has two games rated as Very High impact and one as High impact, while the final round has one Very High and two High impact games. Every other game in Rounds 22 and 23, however, is rated Low impact so, on balance, I think it's fair to say that Round 21 is likely to have the greatest overall influence on the composition of the Final 8.