The 2016 AFL Draw, released in late October, once again sees teams playing only 22 of the 34 games required for an all-plays-all, home-and-away competition. In determining which 12 games - 6 at home and 6 away - a given team will miss, the League has, in the interests of what it calls "on-field equity", applied a 'weighted rule': a mechanism for reducing the average disparity in ability between opponents, using the final ladder positions of 2015 as the measure of that ability.
So, for example, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 41 of the 72 (or about 57%) possible pairings are included in the schedule. By contrast, 23 of the 30 (or about 77%) possible pairings between Top 6 teams are included.
This year, as last, we'll be analysing the draw in two ways to answer the following questions:
- How hard is the schedule that each team faces?
- How much has the 'weighted rule' helped or hindered a team's chances relative to a complete 34-round competition?
The first assessment is sometimes referred to as a Strength of Schedule assessment, and a number of quality blogs have already made their own such assessments (for example, Troy Wheatley at The Wooden Finger Depot, and @arwon and @capitalcitycody at Hurling People Now) as have, of course, the mainstream outlets (for example, ABC Sport, The Age and FOX Sports).
Some of these sources have also incorporated or made reference to the second assessment, with another quality blog, The FootyMaths Institute, making it a central pillar of its analysis.
All of which is to point out that there's already been a great deal of quality analysis of and commentary about the 2016 AFL Draw, so I write this blog acknowledging that body of work and not expecting to stun anyone with a startlingly overlooked insight. (By the way, if you're aware of any other analyses I should also link to, let me know and I'll include them.)
STRENGTH OF SCHEDULE
The first input into any Strength of Schedule calculation is a quantitative assessment of each opponent's ability, for which purpose I'll be using the final 2015 MoSSBODS Team Ratings, adjusted as they will be for the start of a new season by taking 70% of their end-of-year values.
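The regression to the season-start values is simple enough to sketch in a few lines of Python. This is my own illustrative framing, not MoSSBODS code; the only figure taken from the post is the 70% carryover factor, and the example input is made up.

```python
# Illustrative sketch of the season-start regression described above.
# MoSSBODS Ratings are expressed in Scoring Shots above/below average,
# so shrinking toward zero means shrinking toward an average team.

CARRYOVER = 0.70  # per the post: teams retain 70% of their end-of-year Rating

def season_start_rating(end_of_year_rating: float) -> float:
    """Regress an end-of-year Rating toward the all-team average of zero."""
    return CARRYOVER * end_of_year_rating

# A (hypothetical) team that ended 2015 rated +10.0 Scoring Shots
# starts 2016 rated about +7.0.
print(season_start_rating(10.0))
```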
Under these Ratings, which you can see in the table at right, most teams are ranked similarly to their official final ladder positions, the notable exception being Fremantle, which MoSSBODS Ranks 9th despite their official 3rd-place finish.
Fremantle aside, the only team with a MoSSBODS Ranking more than two places different from its ladder finish is Port Adelaide, which MoSSBODS has 6th, and the ladder only 9th.
The Ratings themselves represent a team's ability in terms of Scoring Shots above or below average (for a game played on a neutral venue), so West Coast, for example, is Rated about 6.2 Scoring Shots above an average team, and about 3.6 Scoring Shots better than Sydney.
MoSSBODS, as well as providing an assessment of teams' underlying abilities, also estimates how much better or worse teams play at particular venues. These estimates are known as Venue Performance adjustments, the current values for which are summarised in the table below for all of the venues being used sometime during 2016. Note that teams need to have played a minimum number of games at a venue before their Venue Performance adjustment is altered from zero (shown as dashes in the table below to improve readability).
These Venue Performance adjustments are added to an opponent's underlying MoSSBODS Rating in the Strength of Schedule calculation. So, for example, if Fremantle faced Adelaide at Subiaco, Adelaide's +2.17 Rating would be reduced by 1.91 to account for the fact that Adelaide have, historically, not performed as well at Subiaco as their underlying ability at the time would have suggested.
One final adjustment is made to the estimated strength of an opponent, this one to reflect the relative impact of any significant travel on the two teams. If both or neither team needs to travel interstate (or overseas), no further adjustment is made; if one side travels while the other does not, a 3 Scoring Shot penalty is applied to the travelling team.
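Putting the pieces together, the effective strength of a single opponent in a single fixture can be sketched as below. The function name and argument framing are mine, not MoSSBODS code, and the sign conventions for travel follow my reading of the description above; the Adelaide-at-Subiaco numbers are the ones quoted in the post.

```python
# Sketch of the per-game opponent-strength calculation (all in Scoring Shots).
# Summing this over all 22 fixtures gives a team's Strength of Schedule.

TRAVEL_PENALTY = 3.0  # applied only when exactly one side travels

def opponent_strength(opp_rating, opp_venue_adj, opp_travels, team_travels):
    """Effective strength of one opponent in one fixture."""
    strength = opp_rating + opp_venue_adj   # venue adjustment added to Rating
    if opp_travels and not team_travels:
        strength -= TRAVEL_PENALTY          # opponent hampered by travelling
    elif team_travels and not opp_travels:
        strength += TRAVEL_PENALTY          # our own travel makes the game harder
    return strength

# Fremantle hosting Adelaide at Subiaco: Adelaide's +2.17 Rating, less its
# -1.91 Subiaco adjustment, less the penalty for Adelaide's travel.
print(opponent_strength(2.17, -1.91, opp_travels=True, team_travels=False))
```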
After performing this calculation for all 22 games for every team, we arrive at the Strength of Schedule estimates below, in which larger positive values represent more difficult schedules.
The travel penalties in the Strength of Schedule calculations work, as you'd expect, to produce net negative Strength of Schedule scores for each team's Home games taken as a whole, and net positive Strength of Schedule scores for the Away games.
In total, the Roos are assessed as having the most difficult draw, GWS the second-most difficult, and Hawthorn the third-most. Geelong have the easiest draw, followed by Port Adelaide and the Gold Coast. The difference between the hardest and easiest schedules amounts to about 34 Scoring Shots across the season, which is about 1.5 Scoring Shots or 5.5 points per game assuming a 53% Conversion rate.
STRENGTH OF MISSING SCHEDULE
So, we see that the Roos, Giants and Hawks face the toughest 22-game schedules of all the teams, and the Cats, Power and Suns the easiest. We might wonder to what extent these and other teams' Strength of Schedule estimates were determined by the average ability of a team's potential opponents when playing at home or away, and to what extent the imbalanced draw played a role.
One way of quantifying the effect of the imbalance is to calculate what I'll call a Strength of Missing Schedule estimate, which is calculated in an identical fashion to the earlier Strength of Schedule estimate, assuming that all of the unplayed games would have been played on a team's most commonly used home ground. That means Carlton, Collingwood, Hawthorn, Melbourne and Richmond are assumed to play all of their missing home games at the MCG; Essendon, the Kangaroos, St Kilda and the Western Bulldogs at Docklands; Fremantle and West Coast at Subiaco; Adelaide and Port Adelaide at Adelaide Oval; Gold Coast at Carrara; Sydney at the SCG; GWS at the Sydney Showground; Geelong at Kardinia Park; and the Brisbane Lions at the Gabba.
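The bookkeeping for the missing fixtures can be sketched as follows: take the full home-and-away double round robin, remove the fixtures actually scheduled, and assign each remaining home game to the home team's most common ground. The function and its names are my own, and the team/venue mapping shown is only a small illustrative subset of the one described above; each resulting fixture would then be priced exactly as in the Strength of Schedule calculation.

```python
from itertools import permutations

PRIMARY_HOME = {                 # subset of the mapping given in the post
    "Hawthorn": "MCG",
    "Geelong": "Kardinia Park",
    "Sydney": "SCG",
}

def missing_fixtures(teams, scheduled):
    """All (home, away, venue) triples absent from the actual schedule."""
    full = set(permutations(teams, 2))   # every ordered home/away pairing
    return [(h, a, PRIMARY_HOME[h]) for (h, a) in sorted(full - scheduled)]

# Toy example: suppose only Hawthorn v Geelong (at the MCG) was scheduled;
# the other five possible pairings among these three teams are "missing".
print(missing_fixtures(PRIMARY_HOME, {("Hawthorn", "Geelong")}))
```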
The table below summarises the missing games, denoting with H's those missed games that would have been home games for a team, and with A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2015 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.
Hawthorn, for example, miss playing only two of the other Top 6 teams twice during the season, foregoing only away games against Fremantle and Adelaide. They also miss out on four home games against teams from the Middle 6, a fate shared only by West Coast and, curiously, Gold Coast, although the remainder of the Suns' missing schedule more than makes up for this, as we'll see.
These distortions in the Hawks' schedule leave them as the team most disadvantaged by the imbalanced schedule, a little ahead of Adelaide, the only team in the Top 6 that plays each team in the Bottom 6 only once, and the Kangaroos, the only team to miss out on home games against 3 of the bottom 4 sides.
Least disadvantaged are Gold Coast, who play home games against all 5 other teams in the Bottom 6 and play neither of the top 2 teams twice; Carlton, who play none of the top 4 teams twice and only 2 of the top 13 teams twice; and St Kilda, who play only 1 of the top 7 teams twice and none of the top 3 twice.
It's worth noting in passing that Essendon are the only team that plays none of the Top 6 teams twice. Still, they have only the fifth-easiest schedule.
At the start of this blog I linked to a range of other sources containing their own Strength of Schedule (SOS) and Strength of Missing Schedule (SMOS) estimates. In the tables that follow I've compared these estimates.
Here, firstly, is the SOS comparison, where HPN refers to the Hurling People Now blog, which provides two estimates of SOS; WFD refers to the Wooden Finger Depot website; and MoS refers to this website.
The colour-coding of the numbers hints at a broad agreement of opinion amongst us number-crunchers, the level of which is quantified in the correlations that appear below the table. These all lie in the +0.64 to +0.92 range and, bar a handful, are around +0.75 or higher. That's convergent validity in action.
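For readers wanting to reproduce this kind of agreement check, a plain Pearson correlation between each pair of raters' numbers is all that's involved. The figures below are made-up stand-ins, not the published estimates, and the helper function is my own.

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical SOS estimates from three raters for the same five teams
sos = {
    "MoS": [10.2, 6.1, -3.0, -8.4, 1.5],
    "HPN": [9.0, 7.2, -2.1, -7.8, 0.4],
    "WFD": [11.5, 5.0, -4.2, -9.1, 2.2],
}

raters = list(sos)
for i, a in enumerate(raters):
    for b in raters[i + 1:]:
        print(f"{a} vs {b}: {pearson(sos[a], sos[b]):+.2f}")
```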
Curious, then, is ABC Sport's assessment of the Blues as "Losers" in the scheduling, and FOX Sports' characterisation of the Dees in the same manner. FOX's tagging of the Tigers as "Winners" is also at odds with the other assessments.
Lastly, here are three SMOS estimates, these coming from this blog (MoS), the FootyMaths Institute (FMI), and a third derived as the difference between two of the estimates on the Hurling People Now (HPN) site.
The level of agreement here is particularly high, all three pairwise correlations coming in at +0.94.
One particularly striking feature of this chart is how the colour-coding generally reflects the teams' ladder finishes from last year (which is the basis on which the rows are sorted), with higher-placed teams suffering more from the truncated and imbalanced scheduling.
The standout exceptions are Geelong and Port Adelaide, whose SMOS estimates are more in keeping with a team that finished several ladder positions lower than they actually did. If you cast your eye back to the SOS estimates you'll see that the same is true there.
When you recall that both teams were genuine Finals hopes until the last few weeks of the 2015 home-and-away season, that makes these apparently favourable draws seem even more fortuitous.