The 2017 AFL Draw: Difficulty and Distortion Dissected

I've seen it written that the best blog posts are self-contained. But as this is the third year in a row where I've used essentially the same methodology for analysing the AFL draw for the upcoming season, I'm not going to repeat the methodological details here. Instead, I'll politely refer you to this post from last year, and, probably more relevantly, this one from the year before if you're curious about that kind of thing. Call me lazy - but at least this year you're getting the blog post in October rather than in November or December.

So, let's get into it.

The 2017 AFL Draw, released, as is now custom, in late October, once again has all 18 teams playing 22 of a possible 34 games, each missing 6 of the home and 6 of the away clashes that an all-plays-all full schedule would entail. The bye rounds have been handled a little differently this year, with Round 9 having 8 rather than 9 games, Rounds 11 and 13 having 6 games each, and Round 12 having 7 games, but in all other superficial respects 2017 looks a lot like 2016.

In determining the 108 games to be excluded the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 198 home-and-away games, using the ladder positions of 2016 after the Finals series as the measure of that ability.

So, this year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 42 of the 72 possible pairings (or about 58%) are included in the schedule. By contrast, 46 (or about 77%) of the 60 possible pairings that pit one Top 6 team against another, or one Bottom 6 team against another, are included.

More broadly, as you can see from the table, teams within each third play more games against one another than they do against teams from the two other thirds.

Excluding games, however it's done, almost inevitably imbalances a draw in that the combined strength of the opponents faced by any one team across the entire home-and-away season will differ from that of every other team. At face value, the AFL's methodology for trimming the draw seems likely to exacerbate that imbalance - and deliberately so - especially for teams in the top and bottom thirds who will face quite different mixes of team abilities.

In reality, the actual effect of the AFL's schedule truncation on the variability of team schedule strength depends on the degree to which last year's final ladder positions reflect the true underlying abilities of the teams, the spread of ability within each "third" of the competition, and the relative magnitude of venue effects in enhancing or depressing these abilities. Those are the things, of course, that the MoSSBODS Team Rating System is designed to estimate.

This year we'll use MoSSBODS' opinions to answer the following questions about the schedule:

  1. How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?
  2. How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-round competition?


The first thing we need in order to estimate a team's schedule strength is a measure of their opponents' underlying abilities. For this purpose we'll use MoSSBODS 2017 Team Ratings, which are set by taking 70% of the final 2016 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below.
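For concreteness, that preseason regression can be sketched as follows. Note that the 2016 figures used here are illustrative placeholders, not the actual MoSSBODS values:

```python
# A sketch of the preseason regression applied to the Ratings (in Scoring
# Shots). The 2016 figures below are illustrative, not actual MoSSBODS values.

CARRYOVER = 0.70  # share of a final Rating retained at the start of the next season

def preseason_rating(final_rating_2016: float) -> float:
    """Regress a final 2016 Rating towards zero to give the 2017 starting Rating."""
    return CARRYOVER * final_rating_2016

final_2016 = {"Adelaide": 8.3, "Western Bulldogs": 5.2, "Brisbane Lions": -7.9}
start_2017 = {team: round(preseason_rating(r), 2) for team, r in final_2016.items()}
```

The 70% carryover compresses the spread of Ratings while preserving each team's rank, which is why the regression alone can't move a team between "thirds".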

This year sees a few teams ranked more than a couple of spots differently by MoSSBODS compared to their official final ladder position. Adelaide, for example, will start the season as the top-rated MoSSBODS team despite finishing 6th on the final ladder, while the Western Bulldogs find themselves reigning Premiers, but ranked only 4th on MoSSBODS.

In the context of the AFL's competition "thirds", however, no team would be placed in a different third were MoSSBODS to be used rather than the final ladder in defining the boundaries.

The average Combined Rating of teams from each of the thirds is as follows:

  • Top 6: +4.35 Scoring Shots
  • Middle 6: +0.14 Scoring Shots
  • Bottom 6: -4.49 Scoring Shots

So, ignoring Venue Effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is almost 9 Scoring Shots (SS), or a little over 5 goals. That's about 2 SS, or a bit over a goal, more than the equivalent Top-to-Bottom difference last season.

MoSSBODS also provides estimates of how much better or worse teams, on average, play at each venue. These estimates are known as Venue Performance values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for a given team.

The current Venue Performance values are summarised in the table below for all of the venues being used sometime during 2017. Note that teams need to have played a minimum number of games at a venue before their Venue Performance value is altered from zero (shown as dashes in the table below to improve readability).

Venue Performance values are, like Ratings, measured in Scoring Shots, and are added to a team's underlying MoSSBODS Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong, on average, is a +0.31 SS better team than their underlying +4.08 SS Rating when playing at Docklands.

One final adjustment is made to the estimated strength of an opponent, this one to reflect the relative impact of any significant travel on the two teams. If a team needs to travel interstate (or overseas) to play a game then a -3 SS adjustment is made to its underlying rating. So, for example, while Adelaide has a +2.32 SS Venue Performance value at Docklands, once the Travel Penalty is taken into account they are actually assessed as a -0.68 SS poorer team when playing at this venue.
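Putting those pieces together, a single opponent's assessed strength for a game is its Rating, plus its Venue Performance value, less the Travel Penalty where it applies. Here's a minimal sketch of my reading of that calculation (not MoSSBODS' actual code):

```python
TRAVEL_PENALTY = 3.0  # SS deducted when a team must travel interstate or overseas

def effective_rating(rating: float, venue_performance: float, travels: bool) -> float:
    """A team's assessed strength (in SS) for a single game: its underlying
    Rating, plus its Venue Performance value at the ground, less the Travel
    Penalty if the game requires significant travel."""
    return rating + venue_performance - (TRAVEL_PENALTY if travels else 0.0)

# Adelaide at Docklands, per the example above: a +2.32 SS Venue Performance
# value, but interstate travel applies, leaving a net venue adjustment of
# -0.68 SS relative to Adelaide's underlying Rating.
net_adjustment = effective_rating(0.0, 2.32, travels=True)
```

A team's Strength of Schedule is then just this quantity summed over the opponents it meets in its 22 games.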

After performing this calculation for all 22 games for every team, we arrive at the Strength of Schedule calculations below, within which larger positive values represent more difficult schedules.

The Travel Penalties in the Strength of Schedule calculations work, as you'd expect, to produce net negative Strength of Schedule scores for each team's Home games taken as a whole, and net positive Strength of Schedule scores for the Away games. Taken together, the two Aggregate Nett Travel Penalty columns provide a measure of how kind or cruel the schedule is to a team in terms of overall travel.

On this measure, the Gold Coast, Adelaide, Port Adelaide and Melbourne fare best, their aggregates all coming in at a 9 SS reduction in the overall Strength of Schedule. Carlton, alone, fare worst, their aggregate a 6 SS increase in overall Strength of Schedule. For every other team, the aggregates lie between -3 and +3 SS, so I think it's fair to say that the draw balances this aspect of a national competition fairly well.

In total, Hawthorn is assessed as having the most difficult draw, GWS the second-most difficult, and Fremantle the third-most. Interestingly, GWS and Hawthorn were in the Top 3 last season as well. One contributor to the elevated schedule strengths for both Hawthorn and GWS is that they both play 8 of a possible 10 fixtures against other teams from the Top 6 - a burden they share with Sydney and Geelong.

Adelaide are assessed as having the easiest draw, Gold Coast the second-easiest (they had the easiest last year), followed by Port Adelaide and Richmond. Adelaide's riches include the nett travel benefit mentioned earlier, as well as the fact that they play only 7 of a possible 10 fixtures against teams within their third.

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, three teams stand out: Fremantle and, to a lesser extent, Essendon, because of the relative difficulty of their draws given they come from the bottom third, and Adelaide because of the relative ease of its draw given they come from the top third. 

The difference between the hardest and easiest schedules this year amounts to about 23.5 Scoring Shots across the season, which is a tick over 1 Scoring Shot or 3.7 points per game assuming a 53% Conversion rate. That's roughly 0.5 Scoring Shots per game smaller than the difference was assessed at last season. An average advantage of 3.7 points per game is equivalent to about 0.7 to 0.9 extra wins across a 22-game season, depending on the average ability of the teams faced.
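As a quick check on that arithmetic, the Scoring Shot to points conversion can be sketched as follows, assuming goals are worth 6 points and behinds 1:

```python
GOAL, BEHIND = 6, 1  # point values

def points_per_scoring_shot(conversion_rate: float) -> float:
    """Expected points from one Scoring Shot, given the share of Scoring
    Shots converted into goals rather than behinds."""
    return conversion_rate * GOAL + (1 - conversion_rate) * BEHIND

# At a 53% conversion rate one Scoring Shot is worth about 3.65 points,
# which is why an edge of roughly 1 SS per game translates to the
# 3.7-odd points per game quoted above.
per_game_points = points_per_scoring_shot(0.53)
```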

If we exclude the teams with the two easiest and hardest schedules, however, the difference shrinks to just 0.6 SS or a little over 2 points per game. That represents less than half a win across the season.


Even a 34-round, all-plays-all draw wouldn't produce identical Strength of Schedule estimates for every team. There are a couple of reasons for this:

  • No team would play itself and so, relative to all other teams, would be "missing" the impact of a home and away fixture against itself.
  • Teams differ in their relative abilities at some away venues. For example, Collingwood has a positive Venue Performance value for games played at Kardinia, while Essendon has a negative Venue Performance value for that ground. As such, there's an irreducible difference between the schedule strengths of Collingwood and Essendon from this source alone.

As such, if we were to estimate what I'll call the Strength of Missing Schedule, what we'll get won't simply be the difference between what we've already calculated and some common, overall all-plays-all Strength of Schedule figure. Instead, we'll get an estimate of the extent to which the deliberate imbalance in the schedule has benefited or harmed each team relative to what its own best possible Strength of Schedule could be given the inherently distorting differences listed above.

So, let's do that, assuming that all of the unplayed games would have been played on a team's most commonly used home ground. That means Carlton, Collingwood, Hawthorn, Melbourne and Richmond are assumed to play all of their missing home games at the MCG; Essendon, the Kangaroos, St Kilda and the Western Bulldogs at Docklands; Fremantle and West Coast at Subiaco; Adelaide and Port Adelaide at Adelaide Oval; Gold Coast at Carrara; Sydney at the SCG; GWS at the Sydney Showground; Geelong at Kardinia Park; and the Brisbane Lions at the Gabba.
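Under those home-ground assumptions, the Strength of Missing Schedule is just the same per-game calculation summed over the 12 excluded fixtures. A sketch, with the fixture list as a hypothetical input:

```python
TRAVEL_PENALTY = 3.0  # SS deducted from a travelling team

def missing_schedule_strength(missed_games):
    """Sum opponents' effective strengths over the excluded fixtures.

    missed_games: iterable of (opponent_rating, opponent_venue_performance,
    opponent_travels) tuples - one per excluded game, with venues fixed at
    each side's most commonly used home ground as described above."""
    total = 0.0
    for rating, venue_perf, travels in missed_games:
        total += rating + venue_perf - (TRAVEL_PENALTY if travels else 0.0)
    return total

# Hypothetical: two missed games, one hosting a strong side with a positive
# Venue Performance value here, one away against a weaker, travelling side.
example = missing_schedule_strength([(4.1, 1.0, False), (-2.5, 0.0, True)])
```

Larger positive values again mean a more difficult (missing) schedule, so a team whose excluded games skew towards strong opponents has been done a favour by the truncation.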

The table below summarises the missing games, denoting with H's those games missed that would have been home games for a team, and with A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2016 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

The Western Bulldogs, for example, fail to play three of the other Top 6 teams twice during the season, missing out on Geelong, Hawthorn and Adelaide at home. That makes them the only team in the Top 6 to have more than two of their home games against their peers excised from the schedule. Geelong, Hawthorn and Adelaide, by contrast, miss none of their home games against peers. Adelaide also miss three away games against their peers for which they would otherwise incur the Travel Penalty.

Overall though, Geelong suffers most from the schedule truncation, especially in relation to what would otherwise be its full suite of home fixtures. It misses out on home games against 4 of the Bottom 6 teams, including both Essendon and the Brisbane Lions.

GWS fares next-worst, though in its case more for the away games it doesn't get to play, which include 5 of the 9 lowest MoSSBODS-rated teams.

Least disadvantaged are the Brisbane Lions, mostly on account of the away games they miss, which include three against teams with Top 5 MoSSBODS Ratings. The Gold Coast are next-least disadvantaged, they too mostly on account of missed away games, which include four against teams with Top 8 MoSSBODS Ratings.

The differences in the Strengths of Missing Schedules are much greater than those in the Strengths of Schedule. They span a range of almost 38 SS, which equates to a little over a goal a game and 1.2 to 1.5 wins per season.

What's very apparent in this table is the alignment between the teams' final ladder positions from 2016 and the extent to which they have been penalised by the truncation of the draw. The teams with the five easiest missing schedules all come from the top third of the ladder, while those with the four most difficult missing schedules come from the bottom third.

Which is, of course, exactly as the AFL would have intended.