An Analysis of Strength of Schedule for the Men's 2021 AFL Season
It feels slightly surreal that it’s only been a little over 12 months since last we did it, but it’s time to, once again, review the schedule for the men’s footy season ahead.
To do this, we’ll use the same methodology as we used last year (with one small adjustment described later), details of which appear in this post from 2015 and this one from the year before. We’ll again use the latest MoSHBODS Team Rating System to provide the required estimates of relative team ability and venue effects.
The 2021 AFL Fixture, as usual, has all 18 teams playing 22 of a possible 34 games, each missing 6 of the home and 6 of the away clashes that an all-plays-all full schedule would entail. There is, again, a bye week for every team, the byes having been accommodated in 2021 by playing only 6 games in each of Rounds 12 through 14.
THE RULE OF THIRDS
In determining the 108 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 198 home-and-away games, using the ladder positions of 2020 after the Finals series as the measure of that ability.
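As a quick check on that arithmetic, here is a minimal sketch in Python using only the figures quoted above:
```python
# All-plays-all, home and away: each of 18 teams hosts the other 17 once.
teams = 18
full_schedule_games = teams * (teams - 1)        # 306 games
fixtured_games = teams * 22 // 2                 # 22 games per team, 2 teams per game: 198
excluded_games = full_schedule_games - fixtured_games
print(full_schedule_games, fixtured_games, excluded_games)   # 306 198 108
```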
This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 42 of the 72 (or about 58%) of the possible pairings are included in the schedule. That's the same number as we had in the 2018, 2019 and (original) 2020 Fixtures.
By contrast, 23 of the 30 (or about 77%) of the possible pairings between the Top 6 teams are included, while 44 of the 72 (or about 61%) of the possible pairings between the Top 6 and the Middle 6 teams are included. Those are also the same proportions as we had in the 2019 and (original) 2020 Fixtures.
There are also 21 of a possible 30 pairings pitting teams from the Middle 6 against one another, 22 of a possible 30 pairings pitting teams from the Bottom 6 against one another, and 46 of a possible 72 pairings pitting a team from the Middle 6 against a team from the Bottom 6.
In total, 132 of the 198 contests (or about 67%) involve teams from different thirds based on final official ladder positions last season. That’s down 4 games on what was fixtured for the 2019 and 2020 seasons.
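For anyone wanting to reproduce that breakdown from a fixture list, the tally is just a count over games keyed by the thirds of the two participants. The handful of games and the team-to-third mapping below are illustrative stand-ins only, not the full 2021 data:
```python
from collections import Counter

# Illustrative inputs only: a few (home, away) games and a partial mapping of
# teams to their ladder-based thirds. The real fixture has 198 games and all
# 18 teams would appear in the mapping.
third = {"Richmond": "Top 6", "Geelong": "Top 6",
         "Melbourne": "Middle 6", "Carlton": "Middle 6",
         "North Melbourne": "Bottom 6", "Adelaide": "Bottom 6"}

games = [("Richmond", "Geelong"), ("Richmond", "Adelaide"),
         ("Melbourne", "Carlton"), ("Carlton", "North Melbourne")]

# Count games by the (unordered) pair of thirds involved.
pair_counts = Counter(tuple(sorted((third[home], third[away]))) for home, away in games)
print(pair_counts)
```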
MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE
Next we’ll use MoSHBODS' opinions about team strengths and venue effects to provide some answers to the following questions about the 2021 schedule:
How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?
How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-round competition?
The first thing we need in order to estimate a team's schedule strength is a measure of their opponents' underlying abilities. For this purpose we'll use MoSHBODS’ 2021 pre-Round 1 Team Ratings, which are set by taking 65% of their final 2020 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below.
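In code, that pre-season adjustment is just a multiplicative shrink towards zero. A minimal sketch, with made-up Ratings standing in for the MoSHBODS values:
```python
# Hypothetical final 2020 Ratings; the real values come from MoSHBODS.
final_2020_rating = {"Team A": +20.0, "Team B": -12.0}

# 2021 pre-Round 1 Ratings: 65% of the final 2020 Ratings, which shrinks the
# spread of Ratings towards zero by 35%.
pre_round1_2021 = {team: round(0.65 * r, 1) for team, r in final_2020_rating.items()}
print(pre_round1_2021)   # {'Team A': 13.0, 'Team B': -7.8}
```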
This year, most team rankings on MoSHBODS are similar to the ordering based on ladder finish, with no team ranked more than three places differently by the two methods.
In the context of the AFL's competition "thirds", only four teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:
Collingwood: Top 6 based on Ladder / Middle 6 based on MoSHBODS
Western Bulldogs: Middle 6 based on Ladder / Top 6 based on MoSHBODS
Fremantle: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS
Hawthorn: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS
Note, however, the small difference in Combined Rating between Hawthorn in 12th and Fremantle in 13th (just 0.02 points), which means that, but for a goal or two here and there, the Hawks could easily have finished in the Bottom 6, and Fremantle in the Middle 6, based on Ratings.
The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:
Top 6: Average +10.5 Points / Range 9.3 Points (2020: +7.7 / 7.8)
Middle 6: Average -0.1 Points / Range 11.1 Points (2020: +1.4 / 9.3)
Bottom 6: Average -10.4 Points / Range 12.0 Points (2020: -9.1 / 22.3)
We can see that the Top 6 teams are, on average, stronger than those from last year, and the Middle and Bottom 6 teams weaker.
Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 21 points (+10.5 less -10.4). That’s about 4 points more than last season. Also, the spread of Ratings is - again, but less so than was the case last season - greater in the Bottom 6 than in either the Top 6 or the Middle 6. So it's relatively more important exactly who you play from the Bottom 6 than who you play from the other 6s.
VENUE PERFORMANCE VALUES
MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.
The current Venue Performance Values are summarised in the table below for all of the venues at which a team appears at least once during the 2021 home-and-away season. For details on how these have been calculated, refer to this blog.
(Interestingly, Gold Coast play at 11 different venues in 2021, and GWS and St Kilda play at 10. Collingwood plays at only 6.)
Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a +5.3 Points better team than their underlying +17.8 Points Rating would suggest when playing at Kardinia Park.
In previous Strength of Schedule blogs, I’ve included only the VPV of a team’s opponents in the Strength calculations, but this year I’ve rethought that practice and decided to include the VPV of the team itself as well, because I think this better encapsulates the full venue effect.
To provide a concrete example of why I think this is the case, imagine moving a Geelong v Essendon game from Kardinia Park to Docklands assuming that the VPV numbers in the table above apply. Under the old methodology, that would have no effect on Geelong’s estimated Strength of Schedule, because Essendon’s VPV at both venues is about the same at around minus 0.7 to 0.8 points, and we would ignore Geelong’s VPV at those venues. But, Geelong is estimated to be about a 6-point better side at Kardinia Park compared to Docklands - a fact which, arguably, seems worthy of inclusion in the Strength calculation. The fixture with that game at Kardinia Park is surely an easier fixture than the one with that same game at Docklands.
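To make that concrete in code, here is a minimal sketch of the per-game calculation under the old and new approaches. The VPVs are approximate readings of the figures mentioned above, Geelong’s Docklands VPV is an assumption (about 6 points below its Kardinia Park value), and Essendon’s Rating is a placeholder since it cancels out of the venue comparison.
```python
# Approximate Venue Performance Values for the example (points).
vpv = {
    ("Essendon", "Kardinia Park"): -0.8,   # roughly -0.7 to -0.8, as noted above
    ("Essendon", "Docklands"):     -0.7,
    ("Geelong",  "Kardinia Park"): +5.3,   # quoted earlier
    ("Geelong",  "Docklands"):     -0.7,   # assumed: about 6 points below Kardinia Park
}
essendon_rating = 0.0                      # placeholder; any value gives the same comparison

def game_strength(opp_rating, opp, team, venue, include_own_vpv=True):
    """One game's contribution to `team`'s Strength of Schedule."""
    strength = opp_rating + vpv[(opp, venue)]
    if include_own_vpv:
        strength -= vpv[(team, venue)]     # a venue that suits the team makes the game easier
    return strength

for venue in ("Kardinia Park", "Docklands"):
    old = game_strength(essendon_rating, "Essendon", "Geelong", venue, include_own_vpv=False)
    new = game_strength(essendon_rating, "Essendon", "Geelong", venue)
    print(venue, round(old, 1), round(new, 1))
# Old approach: -0.8 vs -0.7 (the venue barely matters)
# New approach: -6.1 vs  0.0 (the Kardinia Park fixture is about 6 points easier)
```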
The main drawback that I can see from this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, Essendon), and to decrease it for teams that have relatively high VPVs at all venues. If a team seems to enjoy no home ground advantage anywhere, is it reasonable to therefore assess them as having a more difficult schedule and, conversely, if a team seems to carry something like a home ground advantage with them wherever they play, is it reasonable to therefore assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, for this year at least, I’m answering “yes” to both those questions.
One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows, I’ll provide the data to allow you to do exactly that.
Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, Sydney, Essendon and Fremantle are all about 8-point worse teams, and most other interstate teams are about 5- to 7-point worse teams, with Geelong the major exception. (Gold Coast are about a 2-point worse team at the Gabba, but you can’t really attribute that to the travel.)
STRENGTH OF SCHEDULE
After performing the necessary calculations for all 22 games for every team, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.
(See the STRENGTH OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)
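The calculation behind those estimates can be sketched as follows, with fixture, rating and vpv standing in for the MoSHBODS outputs (the names and data structures here are hypothetical, not the actual implementation); setting include_venue=False gives the opponent-ability-only figures discussed later.
```python
def strength_of_schedule(team, fixture, rating, vpv, include_venue=True):
    """Sum, over a team's fixtured games, of opponent Rating plus the net venue
    effect (opponent's VPV at the venue minus the team's own VPV there),
    split into home and away components. Larger values = tougher schedule."""
    home_strength = away_strength = 0.0
    for game in fixture:                       # each game: {"home": ..., "away": ..., "venue": ...}
        if team == game["home"]:
            opponent, at_home = game["away"], True
        elif team == game["away"]:
            opponent, at_home = game["home"], False
        else:
            continue                           # game doesn't involve this team
        strength = rating[opponent]
        if include_venue:
            strength += vpv[(opponent, game["venue"])] - vpv[(team, game["venue"])]
        if at_home:
            home_strength += strength
        else:
            away_strength += strength
    return home_strength, away_strength
```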
In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here.
If you take a closer look at the VPV table above you'll see that, in general, the VPVs of non-Victorian teams at Victorian venues are less negative than the VPVs of Victorian teams at non-Victorian venues. That might be partly attributable to the fact that the non-Victorian teams play more often at the relatively small number of Victorian venues (and so are more familiar with them) whilst, in contrast, the Victorian teams play less often at any given interstate ground.
The result of this is that the Aggregate Net Venue Performance values in the table above tend to be highly negative for non-Victorian teams when playing at home, but not unusually positive for those same teams when playing away. The overall impact of this is that non-Victorian teams are generally assessed as having easier schedules, if venue effects are included.
But, let’s return to the Strength of Schedule table and look a bit closer.
Based on each team’s home fixtures, the teams with the five most difficult schedules (including net venue effects) are:
Melbourne (+12.7)
Collingwood (-6.8)
North Melbourne (-21.2)
Essendon (-36.4)
Carlton (-38.3)
If we ignore venue effects, such is the underlying strength of the teams they face at home, Collingwood rank 1st, Brisbane Lions 2nd, Melbourne 3rd, Port Adelaide 4th, and North Melbourne 5th. On this metric, Essendon would slide to 17th (partly because they have a negative VPV at Docklands and the MCG, which makes their fixture seem more difficult when venue effects are included), Carlton to 15th (partly because they have a negative VPV at Docklands), and Richmond to 11th (partly because they also have a negative VPV at Docklands).
Those with the easiest home schedules (including net venue effects) are:
West Coast (-153.5)
Sydney (-121.4)
Adelaide (-119.2)
Fremantle (-101.7)
GWS (-96.1)
Were we to ignore venue effects, West Coast and Sydney would remain in the bottom five in terms of home schedule difficulty, while Adelaide, Fremantle and GWS would move into 10th, 8th, and 7th respectively, such is the benefit for these three teams of playing at their home grounds. Western Bulldogs, Essendon and Carlton would move into the bottom five.
The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds. That is, indeed, the case for every team, and to a remarkably similar aggregate extent.
Based on their away fixtures, the teams with the five most difficult schedules (including net venue effects) are:
St Kilda (+109.8)
West Coast (+92.7)
Gold Coast (+86.1)
Essendon (+86.0)
Fremantle (+81.5)
Ignoring venue effects would leave West Coast, Gold Coast, and Fremantle in the Top 5, but drop Essendon to 6th, and St Kilda to 8th. Moving into the Top 5 would be GWS and Port Adelaide.
Those with the easiest away schedules (including net venue effects) are:
Hawthorn (+19.1)
Collingwood (+42.6)
Melbourne (+50.4)
Geelong (+52.0)
Brisbane Lions (+55.6)
Ignoring venue effects would see Hawthorn and Geelong remaining among the five easiest, joined by Adelaide, Richmond, and Carlton. Collingwood, Melbourne, and Brisbane Lions, whilst no longer in that group, would still be in the preferred half of the 18 teams.
Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including net venue effects):
Tough Schedules: St Kilda, Melbourne, Essendon, North Melbourne
Slightly Harder Schedules: Collingwood, Carlton, Richmond, Gold Coast
Average Schedule: Western Bulldogs
Slightly Easier Schedules: Port Adelaide, Fremantle, Brisbane Lions, GWS
Easy Schedules: Geelong, Sydney, Hawthorn, West Coast, Adelaide
Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out:
Essendon and North Melbourne have more difficult schedules than might be expected for teams in the Bottom 6
Brisbane Lions, Geelong and Port Adelaide have easier schedules than might be expected for Top 6 teams
West Coast has a slightly easier, and Melbourne a slightly harder schedule than might be expected for Middle 6 teams
To investigate whether some of these disparities might be attributable mainly to net venue effects, I have, as mentioned, included a couple of columns on the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the Ratings of the teams played, ignoring venue effects).
Looked at through this lens, we see that:
Essendon’s and the Western Bulldogs’ fixtures appear much easier
Melbourne’s, Carlton’s, and Port Adelaide’s fixtures appear somewhat easier
Brisbane Lions’ and Adelaide’s fixtures appear much harder
GWS’ and West Coast’s fixtures appear somewhat harder
Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 127 points across the season, which is about 6 points per game.
A 6-point advantage turns a game with an otherwise 50% victory probability into one with about a 57% probability, which converts to about 1.5 extra expected wins across a 22-game season. If, instead, we assume a 25% (or 75%) average probability without the advantage, then the 6-point advantage is worth about 1.2 extra expected wins a season.
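Those conversions follow from treating game margins as roughly normally distributed around the expected margin. The sketch below uses a standard deviation of about 36 points, which is an assumption of mine rather than a figure quoted here, but it reproduces the numbers above:
```python
from scipy.stats import norm

SIGMA = 36.0                                   # assumed SD of game margins, in points

def win_prob(expected_margin):
    return norm.cdf(expected_margin / SIGMA)

def extra_wins(base_prob, advantage_pts, games=22):
    base_margin = SIGMA * norm.ppf(base_prob)  # expected margin implied by the base probability
    return games * (win_prob(base_margin + advantage_pts) - base_prob)

print(round(win_prob(6), 2))                   # ~0.57 win probability from a 50% start
print(round(extra_wins(0.50, 6), 1))           # ~1.5 extra expected wins across 22 games
print(round(extra_wins(0.25, 6), 1))           # ~1.2 extra expected wins across 22 games
```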
STRENGTH OF MISSING SCHEDULE
We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 22 of a possible 34 games.
The table below summarises the missing games in the 2021 Fixture, denoting with H's those missed games that would have been home games for a team, and with A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2020 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.
Richmond, for example, fail to play two of the other Top 6 teams twice during the season, missing out on Port Adelaide at home, and Collingwood away. Geelong misses both Port Adelaide and Collingwood at home. Collingwood and Port Adelaide fare best amongst the Top 6 teams, both playing only 7 of a possible 10 games against other teams from the Top 6.
Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.
The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.
On this measure, St Kilda’s schedule is furthest away (in a detrimental sense) from what it would have enjoyed in an all-plays-all home-and-away fixture, Richmond’s is second-furthest, and Brisbane Lions’ third-furthest (though Geelong’s is almost as distant). Conversely, Essendon’s schedule is furthest away in a beneficial sense, Sydney’s second-furthest, and Adelaide’s third-furthest (though North Melbourne’s is almost as far).
As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.
By adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the earlier Strength of Schedule Actual table.
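That equivalence holds provided the Ratings sum to (roughly) zero across the 18 teams, as the following small numerical check illustrates with made-up Ratings:
```python
import random

random.seed(1)
ratings = [random.gauss(0, 10) for _ in range(18)]
mean = sum(ratings) / len(ratings)
ratings = [r - mean for r in ratings]            # centre so the Ratings sum to zero

own = 0                                          # the team whose schedule we're examining
others = [i for i in range(18) if i != own]
not_played_twice = others[:12]                   # in a 22-game season, 12 opponents are met only once

# Full 34-game schedule: every other team twice.
full_aggregate = 2 * sum(ratings[i] for i in others)
missed_total = sum(ratings[i] for i in not_played_twice)          # the 'Total' column
actual_aggregate = full_aggregate - missed_total                  # Aggregate Opponent Ability Only

net_impact = missed_total + 2 * ratings[own]     # 'adding back' the effect of not playing itself twice
print(round(net_impact, 6) == round(-actual_aggregate, 6))        # True
```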
CONCLUSION
If we weight the two components of schedule strength - say average quality of the opposition faced at 75%, and net venue effects at 25% - we might summarise the teams' relative Strength of Schedules as follows (with teams’ final 2020 ladder positions shown in brackets):
Tough Schedules: St Kilda (5th)
Harder Schedules: North Melbourne (17th), Melbourne (9th), Collingwood (6th)
Slightly Harder Schedules: Carlton (11th), Richmond (1st), Gold Coast (14th)
Roughly Average Schedules: Essendon (13th), Brisbane Lions (4th)
Slightly Easier Schedules: Fremantle (12th), GWS (10th), Port Adelaide (3rd), Geelong (2nd)
Easier Schedules: Western Bulldogs (8th), Sydney (16th), Hawthorn (15th), West Coast (7th), Adelaide (18th)
Relative to the AFL’s intentions, you could make a case based on this listing that:
North Melbourne and Gold Coast (and maybe St Kilda) were hard done by
Geelong and Port Adelaide (and maybe Brisbane Lions, West Coast and Western Bulldogs) did better than they might have expected
As I said last year, Strength of Schedule is quite a challenging metric to define and measure, so you’ll get a better sense of it and be more able to make your own assessment by reading a variety of approaches.