2018 Strength of Schedule: A Post-Season Review

Back in October of 2017 we looked at the then-upcoming 2018 AFL season and estimated the strength of schedule for all 18 teams, based on the MoSHBODS Ratings and Venue Performance Values (VPVs) that prevailed at the time.

In this blog post we'll use the same methodology, but replace the static, start-of-season MoSHBODS data with the dynamic Ratings and VPVs that each team's opponents actually carried into their games. That allows us to assess who, in hindsight, had an easier or more difficult schedule than we estimated initially.
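
To make that comparison concrete, below is a minimal sketch in Python of the calculation, using invented numbers and illustrative column names rather than the actual MoSHBODS data, and with the venue term simplified to the opponent's VPV (only a rough stand-in for how MoSHBODS handles venue effects). For each team we average, across its games, the opponent's Rating plus (optionally) the opponent's VPV, once using start-of-season values and once using the values carried into each game.

```python
import pandas as pd

# Illustrative fixture data, one row per game from a given team's perspective.
# Numbers and column names are invented; they are not the actual MoSHBODS values.
fixture = pd.DataFrame({
    "team":     ["Sydney", "Sydney", "Sydney"],
    "opponent": ["West Coast", "North Melbourne", "Hawthorn"],
    "opp_rating_pre_season": [2.1, -4.3, 1.0],  # opponent Rating at the start of the season
    "opp_rating_pre_game":   [5.6,  0.8, 3.2],  # opponent Rating carried into the game
    "opp_vpv_pre_season":    [1.5, -0.5, 0.2],  # opponent VPV at the start of the season
    "opp_vpv_pre_game":      [2.0,  0.1, 0.4],  # opponent VPV carried into the game
})

def schedule_strength(df, rating_col, vpv_col=None):
    """Average per-game schedule strength: opponent Rating, optionally plus opponent VPV."""
    strength = df[rating_col] + (df[vpv_col] if vpv_col else 0)
    return strength.groupby(df["team"]).mean()

# "What We Thought Pre-Season" versus "What We Think Now"
expected = schedule_strength(fixture, "opp_rating_pre_season", "opp_vpv_pre_season")
actual   = schedule_strength(fixture, "opp_rating_pre_game",   "opp_vpv_pre_game")
print((actual - expected).round(2))  # positive values mean a tougher-than-expected schedule
```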

The summary results of that analysis appear in the table below, with the new estimated schedule strengths recorded in the block headed "What We Think Now", and those we reported in the earlier blog in the block headed "What We Thought Pre-Season".

What we find is that:

  • If we include venue effects, in retrospect St Kilda had the toughest home and away schedule, and one that was significantly tougher than we estimated pre-season (by about 2.5 points per game).
  • Excluding venue effects, it was Sydney who had the toughest home and away schedule, and one even further above our pre-season estimate (by about 3.5 points per game).
  • Other teams that had harder schedules than we estimated pre-season were the Western Bulldogs, Adelaide, and the Brisbane Lions.
  • Teams that had significantly easier schedules than we estimated pre-season were West Coast, GWS, Hawthorn, and Melbourne.

Overall, there was a considerable difference between how we ranked the teams' schedules pre-season and how we'd rank them now, knowing the actual pre-game ratings and VPVs. The rank correlation coefficient for the Total Effective SOS data is only +0.51, and for the Aggregate Opponent Ability Only data just +0.21.
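
These read like Spearman rank correlations between the pre-season and in-hindsight schedule strength columns; a quick way of computing one, again with invented numbers rather than the figures from the table, is sketched below.

```python
from scipy.stats import spearmanr

# Illustrative only: Total Effective SOS (points per game) for five imaginary teams,
# as estimated pre-season and as re-estimated with actual pre-game data.
pre_season  = [1.8, 0.9, -0.4, 2.2, -1.1]
post_season = [4.3, 0.5,  0.1, 1.9, -2.0]

rho, _ = spearmanr(pre_season, post_season)
print(round(rho, 2))  # rank correlation between the two orderings
```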

We can decompose the differences between the "actual" (knowing the true pre-game ratings) and "expected" (using the pre-season ratings) estimates of schedule strength by looking at the team-by-team results for each opponent.

(Note that, in this analysis, we'll look only at opponent ability and ignore VPVs. The differences we're investigating then are those shown in the rightmost column in the table above.)
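
As a sketch of that decomposition, again in Python and with invented numbers: each game contributes the gap between the opponent's actual pre-game rating and their pre-season rating, and summing those gaps by opponent shows which opponents drove a team's overall difference.

```python
import pandas as pd

# Illustrative per-game data for a single team, opponent ability only.
# Ratings are invented, not the actual MoSHBODS values.
games = pd.DataFrame({
    "opponent":          ["West Coast", "West Coast", "Hawthorn", "Carlton"],
    "pre_season_rating": [2.1, 2.1, 1.0,  -8.0],   # what we expected the opponent to carry in
    "pre_game_rating":   [5.6, 7.3, 3.2, -14.5],   # what the opponent actually carried in
})

# Each game's contribution to the actual-minus-expected gap; opponents met twice
# contribute two games' worth.
games["gap"] = games["pre_game_rating"] - games["pre_season_rating"]
by_opponent = games.groupby("opponent")["gap"].sum().sort_values(ascending=False)
print(by_opponent)  # positive values = opponents who were stronger than expected pre-season
```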

Here we can see, for example, that the very large difference between Sydney's actual and expected strength of schedule is largely attributable to the unexpected strength of West Coast, North Melbourne, Hawthorn and GWS, all of whom Sydney played twice, and also of Melbourne, Richmond, Collingwood and Fremantle, all of whom they played once.

St Kilda's unexpectedly strong schedule is due largely to the same teams. St Kilda, though, met Richmond and Melbourne twice, which served to increase its difference relative to Sydney's, but met North Melbourne and Collingwood at times when their ratings were not as high as when those teams faced Sydney, which served to reduce it.

The Western Bulldogs and Fremantle also had relatively large differences, though both were reduced by virtue of playing an unexpectedly weak Carlton twice.

Moving to the teams with the smallest differences, we see that GWS, Hawthorn and Melbourne all benefited from facing Sydney, Adelaide, St Kilda, the Western Bulldogs and Carlton at least once, and at times when those teams' ratings were significantly lower than our pre-season estimates (the exception being GWS' Round 1 clash with the Western Bulldogs, when there'd been no time for a gap to emerge between the Dogs' actual and expected rating).

As a final observation, it's interesting to note that, of the eight teams with the smallest differences between actual and expected schedule strength, six made the finals.