Quantifying Imbalances in the AFL Draw Across Recent History

More and more often now, I'm being offered interesting suggestions for analyses by followers on Twitter, and today's blog is another example of this.

Recently, one such follower was wondering about the revealed difficulty of Hawthorn's schedule this year (ie how good were the teams Hawthorn faced at the time they met them, not as assessed at the start of the season) and another was musing about the extent to which any draw imbalances tended to even out - or at least reduce - over a sequence of seasons. I thought it might be interesting to bring these two notions together and look at revealed schedule strengths across multiple seasons.

This analysis will allow us to investigate how "even" the schedule has been for different teams across those seasons, and which teams have endured the least and most challenging schedules during that period.

Firstly though, we need a working definition of draw imbalance. 

A draw or schedule might be said to be imbalanced over some period if the average strength of the opponents faced during that period, assessed at the venues where those games were played, varies across teams.

Given that definition, imbalance in the AFL schedule is a nearly inevitable practical consequence of the decision not to employ an all-plays-all home-and-away regular season. While it might, in some seasons, be theoretically possible to find a schedule in which teams play only 22 of a possible 34 games and yet still all face identically challenging sets of opponents, in practice this is almost certainly an impossibility.

And, in any case, it's overtly not the intent of the AFL to construct balanced schedules for teams within a season - the draw is biased towards including games played between teams of roughly equal demonstrated abilities from the previous season (eg 14th vs 15th), and biased against probable mismatches (eg 1st vs 18th). If you're going to discard over one-third of all possible matchups (108 of 306), the AFL would contend, better to eliminate games that are less likely to result in narrow victory margins. 

If we can't expect to see parity within a single season, we might, though, expect to see something closer to it across multiple seasons - certainly for teams whose abilities have waxed and waned across the full range during the period. So, let's investigate that.


Any notion of balance requires a quantification of team abilities and for this purpose I'll be employing the MoSSBODS Team Rating System, which estimates teams' offensive and defensive abilities on the basis of the Scoring Shots they create and concede, and the quality of the attacks and defences against which they do so.

We'll look at the 2000 to 2015 Home-and-Away seasons and ask: what was the average Combined Offensive and Defensive Rating of all the opponents faced by a given team, assessed at the time those opponents were played?

But schedule strength is not just about which teams were played, but also about where they were played, because teams' performances are indisputably affected by venue.

Now MoSSBODS, as well as estimating teams' underlying abilities, estimates the enhancement or inhibition of those abilities that occurs when they play at different venues. It makes this assessment not just for each team at its home ground or grounds, but for every team at every ground where it has played at least 30 games (before that threshold is reached, the assumed effect is zero). These measures are called Venue Performance Values in MoSSBODS.

Also, MoSSBODS adds 3 Scoring Shots to the Rating of a team when it hosts a team from another State. This is known as the Travel Penalty in MoSSBODS and is added to the Venue Performance Values for the competing teams to arrive at a Net Venue Effect.
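To make the arithmetic concrete, here's a minimal sketch of how a Net Venue Effect might be assembled from Venue Performance Values and the Travel Penalty. The function name and all the figures are hypothetical illustrations, not MoSSBODS' actual implementation:

```python
# Illustrative sketch of a Net Venue Effect calculation, taken from the
# away team's point of view. All figures are hypothetical; MoSSBODS'
# actual Venue Performance Values are estimated from game history.

TRAVEL_PENALTY = 3.0  # Scoring Shots added to a host facing an interstate team


def net_venue_effect(home_vpv, away_vpv, away_is_interstate):
    """Net Venue Effect from the away team's viewpoint, in Scoring Shots.

    home_vpv: home team's Venue Performance Value at this ground
    away_vpv: away team's Venue Performance Value at this ground
    """
    effect = away_vpv - home_vpv
    if away_is_interstate:
        # The host gains 3 SS, so the visitor's net position worsens by 3
        effect -= TRAVEL_PENALTY
    return effect


# e.g. a visitor rated -1.0 at this ground, facing a host rated +2.0 there,
# and travelling from another State
print(net_venue_effect(2.0, -1.0, True))  # -6.0
```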

We will incorporate these aspects of the scheduling into our assessment of schedule strength by calculating the average Net Venue Effect (from the opponent's viewpoint) for all games played between any pair of teams. Put another way, the only element of the MoSSBODS Rating System we'll be excluding from our assessment of the schedule strength of a particular team will be MoSSBODS' assessment of that team's own underlying strength when playing at a neutral venue.

So, we have as our measure:

Average Venue-Adjusted Opponent Strength = Average Opponent Combined Rating + Average Net Venue Effect

The units for this measure, as they are for all elements of MoSSBODS, are Scoring Shots (SS).
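As a sketch of the calculation for a single team, assuming we had per-game records of the opponent's Combined Rating at the time of the match and the Net Venue Effect from the opponent's viewpoint (the records and field names here are invented for illustration, not MoSSBODS output):

```python
# Hypothetical game records for one team, all values in Scoring Shots (SS).
games = [
    {"opp_combined_rating": 2.9, "opp_net_venue_effect": 2.4},
    {"opp_combined_rating": -0.6, "opp_net_venue_effect": -1.0},
    {"opp_combined_rating": 1.3, "opp_net_venue_effect": 0.2},
]

avg_opp_rating = sum(g["opp_combined_rating"] for g in games) / len(games)
avg_nve = sum(g["opp_net_venue_effect"] for g in games) / len(games)

# The schedule-strength measure is simply the sum of the two averages
avg_venue_adjusted_opp_strength = avg_opp_rating + avg_nve
print(round(avg_venue_adjusted_opp_strength, 2))  # 1.73
```

A positive value means the team's schedule has been tougher than the all-team average of 0 SS; a negative value means it has been easier.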


The table at right provides the summary details for every team for the 2000 to 2015 period.

It's sorted based on each team's Average Venue-Adjusted Opponent Strength, which reveals that the Western Bulldogs, on this measure, have faced the most difficult revealed home-and-away schedule across the 16 seasons. On average, venue-adjusted, their opponents have enjoyed a 0.8 Scoring Shot (SS) advantage per game. This compares with the all-team average of 0 SS.

Now in some sense, that advantage is not entirely of the Dogs' opponents' doing, since the Average Net Venue Effect is an amalgam of:

  • The Dogs' own heightened (or otherwise) abilities when playing at their home ground
  • The Dogs' abilities when playing at their opponents' home grounds
  • Their opponents' abilities when playing at the Dogs' home ground

The Dogs, like the four teams immediately below them on this list, suffer from the fact that they share a home ground with other Victorian teams. This means that, when facing those teams, they'll not enjoy the net benefit enjoyed by a team such as Geelong, which plays at a venue where its opponents have less experience and will therefore probably carry negative Venue Performance Values.

At the bottom of this list are the teams with the easiest schedules. It's interesting to note that, the Cats aside (who are known to be formidable at Kardinia Park but who also play a lot of games at Docklands and the MCG and play slightly better than expected at these venues too), the teams there are non-Victorian teams.

They're not there because of the 3 Scoring Shot Travel Penalty - they suffer that about as often as they enjoy it - but because they do relatively better, Travel Penalty aside, when playing away than do their opponents when travelling to these teams' home grounds. There seems to be a clear advantage in having a home ground that is exclusively, or almost exclusively, your own.

Another interesting aspect of this table is the correlation between the teams' average strengths and the unadjusted average strength of the opponents they face. As alluded to earlier, the AFL tries to skew the draw to ensure that strong teams meet other strong teams more often than they meet weak ones, and that weak teams tend to face other weak teams. This being the case, you'd expect a correlation between team and opposition strength, and that is indeed what we see if we correlate the data in the third and fourth columns of the table: the overall correlation is +0.42. That's not huge, but it's clearly non-zero.

That correlation disappears - indeed, reverses - once we incorporate Net Venue Effects, however, which is something for which, I'd argue, it's more difficult to hold the current AFL schedulers accountable. This aspect of the imbalance was locked in, perhaps inadvertently, when the decision to move to shared home grounds for Victorian teams was taken.

We can dive a little deeper into some of the imbalances by looking at the statistics for every possible team pairing, which is what we do in the table that follows (click on it for a larger version).

Each cell in this chart contains three numbers: the average opponent strength and average Net Venue Effect across all games played between the teams named in the row and column, and, in brackets, the number of times they've played. A cell's colour is based on the sum of the average opponent strength and Net Venue Effect: the larger that sum, the redder the cell (meaning the opponent was "tougher"); the smaller it is, the greener the cell (meaning the opponent was "easier").

According to MoSSBODS, the Dogs' challenging draw has been especially attributable to their encounters with Geelong, who've been Rated +2.92, on average, prior to their games, and who have enjoyed a +2.43 average Net Venue Effect advantage over the Dogs - a combination of the Cats' relative strength at Docklands and Kardinia, and the Dogs' lesser strength at both of those venues. The Dogs have also faced significant venue-adjusted opponents in Sydney, Collingwood and Adelaide.

In general, you can get a sense for how lucky or unlucky a team has been when facing a particular opponent by running your eye down the first set of numbers in any particular Opponent team column. For example, if we look at West Coast we can see that Port Adelaide have played them when, on average, their Combined Rating has been -0.56 SS, while Richmond has played them (albeit 3 times fewer) when, on average, their Combined Rating has been +1.29 SS. This is another thing that the AFL schedulers can't do much about, especially when teams perform much better or worse than their previous season's performances would have foreshadowed.

Similarly, you can get a sense of the relative abilities of a team at and away from home against different opponents by scanning across a row. If we review the Dogs' row, for example, we can see that they suffer a Net Venue Effect deficit against 15 of the 17 teams, which suggests that they don't travel well, don't perform at home especially well relative to most opponents, or a combination of both.

I'll finish the analysis of this table by pointing out the major imbalances for each team (excepting GWS and Gold Coast) in terms of pure matchup counts across the 16 seasons:

  • Western Bulldogs: faced Melbourne 27 times / Carlton only 18 times
  • Carlton: Collingwood 32, Essendon 31 / West Coast and Western Bulldogs 18 
  • Richmond: Essendon 29 / Adelaide, Geelong, West Coast and Brisbane Lions 20
  • North Melbourne: Geelong and the Brisbane Lions 26 / Essendon 19 
  • Melbourne: Western Bulldogs 27 / Essendon 18
  • Fremantle: West Coast 32 / Sydney and Western Bulldogs 19
  • Hawthorn: Geelong 26 / St Kilda 18
  • Essendon: Collingwood 32, Carlton 31 / Adelaide, Geelong, North Melbourne 19, Melbourne 18 
  • Collingwood: Essendon and Carlton 32  / Port Adelaide 19
  • Sydney: Brisbane Lions 27 / Fremantle 19
  • St Kilda: Fremantle, Richmond, Carlton 25 / Hawthorn 18 
  • Port Adelaide: Adelaide 32 / Collingwood 19
  • Brisbane Lions: Sydney 27 / St Kilda, Collingwood, Fremantle, Melbourne and Richmond 20
  • West Coast: Fremantle 32 / Carlton 18
  • Geelong: Sydney, Hawthorn and North Melbourne 26 / Essendon 19
  • Adelaide: Port Adelaide 32 / Essendon and Carlton 19

There are some quite substantial differences in that list, driven in many cases, of course, by the desire for "local derbies" or to perpetuate (or instigate) "traditional rivalries".


Another of my Twitter followers has suggested that he believes a natural "epoch" in the AFL lasts for about eight years so, for the final piece of analysis, I've replicated the previous table but using data only from the home-and-away seasons of 2008 through 2015.

The ordering isn't all that different, except that, most notably:

  • North Melbourne now have the 10th-hardest draw, not the 4th-hardest
  • Hawthorn now have the 11th-hardest draw, not the 7th-hardest
  • Essendon now have the 3rd-hardest draw, not the 8th-hardest
  • Collingwood now have the 5th-hardest draw, not the 10th-hardest
  • Sydney now have the 6th-hardest draw, not the 11th-hardest

No other team's schedule difficulty ranking changes by more than 3 places.


For me, the key conclusion from all this analysis is that, across the last 16 (or even 8) seasons, teams have endured schedules of varying average strength. Not all of this difference can be attributed to a team's generally above- or below-average ability across this period and the AFL's desire to match them with teams of similar talents.

Other factors that contribute to the variability in schedule strength are:

  • teams' relative abilities at each other's home grounds, and the tendency of shared home ground arrangements to reduce the average "home ground advantage"
  • variability in team abilities across and within seasons
  • the AFL's desire to include certain fixtures in the schedule every year, regardless of the relative abilities of the teams involved

Some of the differences between teams' schedule strengths are clearly material. The gap between the teams at the top and bottom of the list, the Dogs and the Crows, for example, is almost 2 SS per game, or a little over a goal. That's enough to account for a sizeable difference in expected winning rates. But, as I said earlier, that's not entirely due to pure scheduling - some of it comes down to how well or poorly the Dogs and the Crows perform at and away from home.

Such imbalances will never go away, I'd suggest, but identifying and quantifying them is an important component of assessing any team's performance.

Of course all of this analysis is founded on the assumption that the MoSSBODS System does a reasonable job of estimating team strengths and venue effects. If anyone reading this has suggestions for other ways that schedule imbalance might be measured (or even defined), I'd be very keen to hear about them.