2015 - Team Ratings After Round 14

The two losing favourites were severely punished by ChiPS over the weekend, not so much because they lost, but because they did so by such huge margins.

Essendon, who lost by 110 points, fared worst, dropping a remarkable 12.7 Rating Points (RPs) and falling six places to 14th as a result. The Saints, who were the beneficiaries of those 12.7 RPs, climbed four places into 12th on the back of them.

The Roos, who lost by only half as much as the Dons, still surrendered 8.7 RPs, this loss costing them one Ranking position and slipping them to 7th.

Port Adelaide were the only other team to move by multiple spots this week on ChiPS, they climbing three places into 8th after snatching 2.5 RPs from Sydney despite losing by 10 points.

On MARS, the moves were slightly smaller and less dramatic, though still significant by its standards. Essendon shed 7.6 RPs and fell one place into 13th, while St Kilda acquired those 7.6 RPs and climbed two places into 14th. The Roos, meantime, handed over 6.1 RPs to the Suns but retained 10th spot, their contribution enough to lift the Suns one place into 16th.

Only three other teams moved on the MARS Rankings, GWS climbing a spot into 12th, Melbourne falling two spots into 17th, and Carlton dropping a single spot into 15th.

That leaves only a single team Ranked by ChiPS and MARS more than two places differently: the Roos, which ChiPS Ranks 7th and MARS Ranks 10th. As such, the rank correlation between the two Systems remains high and now stands at +0.967.

The correlation between ChiPS' and MARS' Team Ratings also remains high at +0.955 which, when squared, implies that over 90% of the variability in these two Systems' Ratings is shared.
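Since the post leans on the link between a correlation coefficient and "shared variability", here's a minimal Python sketch of that calculation. The ratings used are hypothetical, not the actual ChiPS or MARS figures; only the +0.955 correlation comes from the post.

```python
# The square of a correlation coefficient gives the proportion of variance
# that two series share. The ratings below are illustrative only.

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical Ratings for five teams under two Systems
chips = [1025.3, 1012.8, 998.4, 990.1, 973.6]
mars  = [1030.1, 1015.2, 1001.7, 985.9, 970.2]

r = pearson(chips, mars)
shared_variance = r ** 2  # proportion of variability the two Systems share

# At the correlation quoted in the post, +0.955, the shared variance is
# 0.955 squared, or about 91% - hence "over 90%".
print(round(0.955 ** 2, 3))
```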

Colley, Massey and ODM

Of all the Ranking Systems used here on MoS, MARS re-Ranked the fewest teams this week at only six, while ChiPS and ODM re-Ranked eight, Colley re-Ranked nine, and Massey re-Ranked 10 (though more on that a bit later).

Colley shifted only one team by more than two spots, dropping the Roos by three places into 11th after their loss to the lowly Suns. The same was true of both Massey and ODM, and for the same team, they both dropping the Roos four places from 8th to 12th.

In aggregate, ChiPS, Colley, Massey and ODM all moved their Team Rankings nearer the competition ladder ordering this week, as reflected in the respective rank correlation figures. Colley (+0.983) remains most aligned with the ladder, while ChiPS (+0.905) is least aligned, though still strongly so.

Amongst the five Systems, the Rankings of Massey and ODM remain most alike, they differing only in their Rankings for the Hawks and the Eagles, and for the Cats and Port. ChiPS' and Colley's Rankings, however, are most different (+0.889), though MARS' and Colley's are almost as different (+0.897).

ODM Offensive and Defensive Rankings

I discovered an interesting and slightly disturbing feature of ODM Component Ratings this week: they're sensitive to the number of games that a team has played relative to other teams. The sensitivity doesn't appear to extend to overall ODM Ratings, however.

The sensitivity cropped up when I initially decided to treat the Adelaide v Geelong game as if it didn't happen (which, as you know, it didn't), and noticed that the ODM Offensive Rankings of both teams dropped and the Defensive Rankings of both climbed, all to an extent that didn't seem reasonable (as many as 7 spots in the case of the Cats' Offensive Ranking). Including the game as a notional 88-all draw instead restored sanity to the Rankings, so this was the solution I eventually adopted.

But, since I use the same result data set to calculate Massey, Colley and ODM Ratings, this decision turned out to have the side effect of altering two of the Massey Rankings, flipping the Cats and Port. I figure that this was a small price to pay for the convenience of using a single data set for these Systems, the obvious alternative being a clumsy set of runs and re-runs of my scripts to include and exclude the non-existent game in order to extract the different Ratings and Rankings for each of the Systems.
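The workaround described above can be sketched in a few lines of Python. This is a hypothetical version assuming results are held as a simple list of game records; the field names, team pairings and scores (other than the 88-all draw, which comes from the post) are illustrative, not the actual round's data.

```python
# Sketch of the adjustment: before computing ODM (and, as a side effect,
# Massey and Colley) Ratings, add the cancelled fixture back in as a
# notional draw so that every team has played the same number of games.

results = [
    # Illustrative game records, not actual Round 14 scores
    {"home": "Hawthorn", "away": "Essendon", "home_score": 120, "away_score": 60},
    {"home": "Gold Coast", "away": "Kangaroos", "home_score": 90, "away_score": 70},
]

def with_notional_draw(results, team_a, team_b, score=88):
    """Return a copy of the results with an added notional draw between
    team_a and team_b, scored score-all (88-all in the post)."""
    return results + [
        {"home": team_a, "away": team_b,
         "home_score": score, "away_score": score}
    ]

adjusted = with_notional_draw(results, "Adelaide", "Geelong")
```

Because the notional game is a draw, it contributes nothing to either team's points differential, but it equalises the games-played counts that the ODM Component calculations appear sensitive to.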

(There's no effect on ChiPS or MARS, by the way, as I calculate these separately.)

Anyway, the conclusion is that users of the ODM System need to be mindful of this potential pitfall and should adjust their approach accordingly. I can't find any discussion of the different games-per-team issue on the internet, but it might be a well-known characteristic that I've missed, or it could instead be that my implementation of the ODM algorithm induces this behaviour. I think the latter's unlikely, though correspondence is welcomed on this topic.

After making the required adjustment to the data, ODM ended the week with the same Top 5 Defensive Teams as last week, this list comprising:

  1. Fremantle
  2. Sydney
  3. West Coast
  4. Hawthorn
  5. Richmond

There was some movement in ODM's Top 5 Offensive Teams, however, with a reversal of the teams in positions 3 and 4, though this was possible only after Geelong's 5th spot had been preserved by the sanity-restoring data adjustment:

  1. Hawthorn (no change)
  2. West Coast (no change)
  3. Collingwood (up from 4th)
  4. Kangaroos (down from 3rd)
  5. Geelong (no change)

Amongst the teams now sitting in Finals positions on the ladder, seven have a Defensive Ranking closer to their ladder position than their Offensive Ranking, while only one (West Coast) has an Offensive Ranking nearer its ladder position than its Defensive Ranking.

Most of the Systems Ranked six of this week's winners higher than they Ranked their opponents, the exception being the Offensive Component of the ODM System, which Ranked seven winners higher.

That leaves ChiPS and MARS tied in the lead, the ODM Offensive Component eight tips behind, ODM itself nine tips behind, Massey 10 tips behind, the ODM Defensive Component 11 tips behind, and Colley 15 tips behind.
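The tipping tally above works by crediting a System whenever it Ranks a game's winner above the loser. A minimal sketch, with made-up Rankings and results rather than any System's actual figures:

```python
# A System "tips" the winner of a game whenever it Ranks the winner higher
# (i.e. a numerically lower Rank) than the loser.

def tips_for_system(rankings, games):
    """Count games in which the System Ranked the winner above the loser.
    rankings maps team -> Rank (1 = best); games is a list of
    (winner, loser) pairs, with drawn games excluded by the caller."""
    return sum(1 for winner, loser in games
               if rankings[winner] < rankings[loser])

# Hypothetical round of results as (winner, loser) pairs
games = [
    ("Fremantle", "Collingwood"),
    ("Sydney", "Port Adelaide"),
    ("Gold Coast", "Kangaroos"),   # an upset under this System's Rankings
]

# Hypothetical Rankings for the teams involved
rankings = {"Fremantle": 1, "Sydney": 2, "Collingwood": 5,
            "Port Adelaide": 8, "Kangaroos": 7, "Gold Coast": 16}

print(tips_for_system(rankings, games))  # tips 2 of the 3 winners
```

Running this count for each System's Rankings week by week, and accumulating, produces season tallies like those quoted above.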