An Analysis of Strength of Schedule for the Men's 2026 AFL Season
The men’s AFL fixture for 2026 was released last* week and, as is tradition, we’ll analyse it to see which teams we think have done better or worse than others.
In particular, we’ll look for answers to two questions:
Which teams fared best and which worst in terms of the overall difficulty of the teams they face in their schedule (the Strength of Schedule analysis), before and after adjusting for venue effects? Our measure here will be the MoSHBODS Combined Ratings of a team’s opponents, which we’ll adjust for venue effects in the supplementary analysis
Which teams fared best and which worst in terms of the matchups they missed out on, given that only 23 games of a possible 34 in an all-plays-all fixture are played (the Impact of Missing Schedule analysis)? Our measure here will be how much more or less likely each team would be to finish in the Top 10 ladder positions were the missing parts of the fixture actually played.
(* the final set of simulations took days to run - I won’t get into why)
THE FIXTURE
The 2026 AFL Fixture has all 18 teams playing 23 of a possible 34 games, each missing either 6 of the home and 5 of the away clashes, or 5 of the home and 6 of the away clashes, that an all-plays-all full schedule would entail (except for Hawthorn and Western Bulldogs, who meet twice in the coming season with Hawthorn the home team on both occasions).
There are two bye weeks for every team, these having been accommodated in 2026 by playing only five games in Round 0, eight games in Rounds 4 and 13, and seven games in Rounds 2, 3, 12, and 14 through 16. That means teams will have played the same number of games next season only from the end of Round 16 onwards.
THE RULE OF THIRDS
In determining the 99 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 207 home-and-away games, using the ladder positions from 2025 after the Finals series as the measure of that ability.
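(For the arithmetically inclined, a couple of lines of Python confirm those game counts:)

```python
n_teams = 18
all_plays_all = n_teams * (n_teams - 1)  # 306 games in a full home-and-away all-plays-all schedule
fixtured = n_teams * 23 // 2             # 207 games actually played
print(all_plays_all - fixtured)          # 99 games excluded
```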
Those ladder positions are used to split the teams into three groups of six, with teams playing more games with teams in their own third than with teams from the two other thirds.
Previously, teams were fixtured to twice play against no more than three of the five other teams in their third, but this year this was increased to four, which has lifted the number of home and away games played between teams from the same third from last year’s 138 to 142 of a possible 180 (about 79%).
This year, again, only 44 of the 72 (or about 61%) possible pairings that would pit a team from last year's Top 6 against a team from the Bottom 6 are included in the schedule.
By contrast, 48 of the 60 (or about 80%) of the possible pairings between the Top 6 teams are included, while 46 of the 72 (or about 64%) of the possible pairings between the Top 6 and the Middle 6 teams are included (note that this includes the two games between Hawthorn and Western Bulldogs that are both designated as Hawthorn home games).
There are also 46 of a possible 60 pairings (77%) pitting teams from the Middle 6 against one another (up 2 on last year), 48 of a possible 60 pairings (80%) pitting teams from the Bottom 6 against one another (the same as last year), and 46 of a possible 72 pairings (64%) pitting a team from the Middle 6 against a team from the Bottom 6 (also the same as last year).
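(If you’d like to check the percentages just quoted, a short snippet reproduces them from the included and possible counts:)

```python
# (included, possible) pairing counts as quoted above
pairings = {
    "Same third (all)":    (142, 180),
    "Top 6 v Top 6":       (48, 60),
    "Top 6 v Middle 6":    (46, 72),
    "Top 6 v Bottom 6":    (44, 72),
    "Middle 6 v Middle 6": (46, 60),
    "Middle 6 v Bottom 6": (46, 72),
    "Bottom 6 v Bottom 6": (48, 60),
}
for label, (included, possible) in pairings.items():
    print(f"{label}: {included}/{possible} = {included / possible:.0%}")
```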
HOW TO MEASURE THE STRENGTH OF AN OPPONENT
Measuring the strength of the schedule that a team faces across the home and away season requires a measure of the relative ability (or ‘strength’) of each of their opponents. Some analyses that you’ll see will use a team’s competition points from the previous home and away season as the measure of their ability, while others will instead use the team’s percentage (points for divided by points against).
To simplify the analysis and recognising the imbalanced nature of the draw, some will only consider these measures for the teams faced twice in the season, implicitly assuming that the combined strength of playing each opponent once is the same for all teams. That clearly isn’t true for two reasons:
An opponent playing at home is usually harder to defeat (ie stronger) than when playing away
By definition a team doesn’t play itself and so the collective ability of all opponents is different for every team
As a way of dealing with that first point, some analyses look to adjust for where games are played, for example increasing the estimated strength associated with those games where a team is forced to play interstate or away.
We will, instead and as usual, measure each team’s relative ability based on its MoSHBODS Combined Rating, and we will adjust for venue effects based on the Venue Performance Values (VPVs) for each team at each venue, which measure how much above or below expectation a team tends to perform at a given venue once you adjust only for the ratings of the two teams. They are generally positive for a team’s home grounds and negative for away grounds.
Analyses will use the same methodology as we used last year, details of which appear in this post from 2015 and this one from the year before. The Venue Performance Values will be those calculated as at the start of the 2026 season, and the Team Ratings will be those that will apply for the first game of the 2026 season (which is 65% of what they were at the end of the 2025 season).
CALCULATING STRENGTH OF SCHEDULE
The metric that we’ll use to estimate Strength of Schedule is the combined strength of the teams met across the 24 rounds of the home and away season, adjusted for venue effects for both teams.
The estimated difficulty of a single game will be:
Excluding Venue Effects: Opponent’s Combined Team Rating
Including Venue Effects: Opponent’s Combined Team Rating + Opponent’s VPV for the game venue - Own VPV for the game venue
The second value will generally be higher than the first (ie the game will be assessed as more difficult) if the game is played at an opponent’s home ground instead of ours because VPVs for teams’ home grounds tend to be positive and those for away grounds negative, especially if the venue is interstate.
Higher values of either estimate imply a more difficult game in the fixture.
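As a minimal sketch, and using placeholder numbers rather than values from the tables below, the two estimates can be computed as follows:

```python
def game_difficulty(opp_rating, opp_vpv=0.0, own_vpv=0.0, include_venue=True):
    """Estimated difficulty of a single fixtured game, in rating SDs (1 SD ~ 25 points).

    Excluding venue effects: opponent's Combined Rating.
    Including venue effects: opponent's Combined Rating
                             plus the opponent's VPV at the venue
                             minus our own VPV at the venue.
    """
    if not include_venue:
        return opp_rating
    return opp_rating + opp_vpv - own_vpv

# Placeholder example: a +0.50 rated opponent met at their home ground,
# where their VPV is +0.20 and ours is -0.10 (an away trip for us)
print(game_difficulty(0.50, include_venue=False))          # 0.50
print(game_difficulty(0.50, opp_vpv=0.20, own_vpv=-0.10))  # 0.80
```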
The first thing we need for this metric is a measure of each team’s underlying abilities, and these appear in the table below.
Recall that Ratings are now measured in historical team score standard deviations, which at present maps 1 SD to about 25 points.
So, for example, the Western Bulldogs’ Offensive Rating of +0.6 means that they would be expected to score about 15 points more than an average team when playing an average team at a neutral venue.
This year there are again some teams that are ordered very differently based on their MoSHBODS Ratings compared to their ladder finish, with the biggest differences being for:
Western Bulldogs: 1st on MoSHBODS and 9th on the Ladder
Melbourne: 10th on MoSHBODS and 14th on the Ladder
(I mention in passing that the correlation between the teams’ Combined Ratings and their percentages from the 2025 Home and Away season is about +0.94, which lends some credibility to using percentage as a measure of relative team ability).
In the context of the AFL's competition "thirds", only four teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:
Western Bulldogs: Middle 6 based on Ladder / Top 6 based on MoSHBODS
Adelaide: Top 6 based on Ladder / Middle 6 based on MoSHBODS
Melbourne: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS
St Kilda: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS
The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:
Top 6: Average +0.48 SDs / Range 0.38 SDs
Middle 6: Average +0.12 SDs / Range 0.93 SDs
Bottom 6: Average -0.6 SDs / Range 1.15 SDs
We can see that the Top 6 teams from the final 2025 ladder are, on average, slightly stronger than those from the Middle 6, and that those from the Middle 6 are, on average, substantially stronger than those from the Bottom 6.
Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 1.08 SDs or about 27 points.
With relatively large spreads in the ratings across the Middle and Bottom thirds - the equivalent of about 4 goals in the Middle 6, and almost 5 goals in the Bottom 6 - it's quite important which of the teams from these thirds a team plays.
VENUE PERFORMANCE VALUES
MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.
The Venue Performance Values, calculated as they would be on day 1 of the 2026 season, are summarised in the table below for all of the venues at which a team appears at least once sometime during the 2026 home-and-away season. For details on how these have been calculated, refer to this blog.
(Interestingly, this year it’s the turn of Gold Coast to play at 12 different venues, and GWS, Brisbane Lions, and North Melbourne at 11 different venues. In contrast, Port Adelaide, Carlton, Collingwood, and West Coast play at only eight different venues.)
Venue Performance values are, like Ratings, measured in Standard Deviations (SDs), and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a 0.42 SDs better team at Kardinia than their underlying +0.67 SDs Rating alone would imply.
Estimate number 2 for calculating game strength includes a team’s own VPV as well as that of its opponent. I do that because I think, on reflection, this better encapsulates the full venue effect, although prior to 2021, only opponents’ VPVs were included.
To reiterate the rationale for this by way of a concrete example, imagine moving a Brisbane Lions v Western Bulldogs game from the Gabba to Carrara assuming that the VPV numbers in the table above apply. Under the old methodology where a team’s own VPV was ignored, that change of venue would have virtually no effect on Brisbane’s estimated Strength of Schedule, because Western Bulldogs’ VPV at both venues is about the same at around minus 0.23 to 0.27 SDs, and we would ignore Brisbane’s VPV at those venues. But, Brisbane is estimated to be almost a 10-point better side at the Gabba compared to Carrara - a fact which seems worthy of inclusion in the Strength calculation. The fixture with that game at the Gabba is surely an easier one for Brisbane than the one with that same game at Carrara.
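Here’s that comparison as a short sketch. All the numbers are illustrative placeholders chosen to be consistent with the description above (the Bulldogs’ rating and Brisbane’s Carrara VPV, in particular, are mine, not values read from the table):

```python
# Brisbane's home game v Western Bulldogs, moved between the Gabba and Carrara.
dogs_rating = 0.60                                  # placeholder Combined Rating
dogs_vpv  = {"Gabba": -0.25, "Carrara": -0.25}      # roughly equal at both venues
lions_vpv = {"Gabba":  0.38, "Carrara": -0.02}      # ~0.4 SDs (about 10 points) apart

for venue in ("Gabba", "Carrara"):
    old = dogs_rating + dogs_vpv[venue]                      # pre-2021: own VPV ignored
    new = dogs_rating + dogs_vpv[venue] - lions_vpv[venue]   # current: net venue effect
    print(venue, round(old, 2), round(new, 2))

# Old method: 0.35 at both venues, so the move has no effect on Brisbane's SoS.
# New method: -0.03 at the Gabba v 0.37 at Carrara - the Gabba game is the easier one.
```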
The main drawback that I can see from this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, North Melbourne), and decreases the estimated schedule strength for teams that have relatively high VPVs at all venues. If a team seems to enjoy no home ground advantage anywhere, is it reasonable to therefore assess them as having a more difficult schedule and, conversely, if a team seems to play relatively equally well at all venues, is it reasonable to therefore assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, again for this year, I’m answering “yes” to both those questions.
One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows I’ll provide the data to allow you to do exactly that.
Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate (technically, ‘out of region’) travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, all interstate teams (except Geelong) are about 0.25 to 0.3 SDs worse, or about 6.25 to 7.5 points. Gold Coast are about a three-point worse team at the Gabba, but you can’t really attribute that to the travel.
Generally speaking, the interstate travel component of VPVs is fairly uniform across teams because:
only the last 8.5 years of data is included in VPV calculations
a team needs to have played 65 games at a venue in that 8.5 year window before the regularisation towards the default of -0.26 SDs for out of region contests is completely discontinued and the calculation becomes based solely on the team’s actual performance relative to expectation
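The precise blending MoSHBODS uses isn’t detailed here, but a minimal sketch of the general idea, with a linear blend that is my assumption for illustration only, might look like this:

```python
def regularised_vpv(observed_vpv, games_at_venue,
                    default=-0.26, full_sample=65):
    """Shrink an observed out-of-region venue effect towards the -0.26 SD default,
    with the shrinkage fully discontinued once games_at_venue reaches 65.
    The linear blend used here is an assumption for illustration only."""
    weight = min(games_at_venue / full_sample, 1.0)
    return weight * observed_vpv + (1 - weight) * default

print(round(regularised_vpv(0.10, 10), 2))  # -0.20: still mostly the default
print(round(regularised_vpv(0.10, 65), 2))  # 0.10: purely the observed performance
```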
The breakdown of home and away games for each team in terms of ‘in region’ or ‘out of region’ is shown in the chart below (click on it to access a larger version).
A few things stand out in this chart:
GWS play only 9 games in region again in 2026: their 8 home games at Sydney Showground and their away clash with Sydney at the SCG
Gold Coast play only 10 games in region in 2026: their 9 home games at Carrara and their away clash with Brisbane Lions at the Gabba
Only 4 of Carlton’s games are out of region for their opponents, and only 6 of Collingwood’s, Geelong’s, Richmond’s, and Western Bulldogs’
In contrast, 13 of Fremantle’s and Gold Coast’s games are out of region for their opponents
2026 STRENGTH OF SCHEDULE
After performing the necessary calculations for all 23 games for every team, taking into account who each team plays as well as where, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.
(See the IMPACT OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)
Note that the numbers shown in these tables are all aggregates and so will include either 11 or 12 games in the home and the away data, depending on the particular team’s fixture.
In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here, with Carlton the outliers because of their small and negative VPVs at the grounds where they play their home games.
Based solely on each team’s home fixtures, the teams with the five most difficult schedules including net venue effects are:
Carlton (+1.63, 11 games)
Essendon (+0.16, 12 games)
Melbourne (-0.65, 11 games)
St Kilda (-0.98, 11 games)
Hawthorn (-1.44, 12 games)
If we ignore venue effects, such is the underlying strength of the teams they face at home that Brisbane Lions rank 1st (from 13th!), Hawthorn 2nd, Carlton 3rd, Essendon 4th, and Gold Coast 5th. On this metric, St Kilda would slide to 8th, and Melbourne to 9th.
Those with the easiest home schedules including net venue effects are:
Geelong (-6.52, 12 games)
Western Bulldogs (-5.68, 11 games)
GWS (-5.51, 12 games)
Port Adelaide (-5.34, 12 games)
Adelaide (-5.00, 12 games)
Here, the disproportionate impacts of interstate travel on teams’ home schedules are apparent. Brisbane Lions, for example, face interstate teams in every home game except when they play Gold Coast, Port Adelaide and Adelaide likewise except when they play each other, and West Coast likewise except when they play Fremantle. We also see the significant impact of recognising Geelong’s relatively large estimated VPV for Kardinia.
Were we to ignore venue effects, Western Bulldogs, Geelong, and GWS would remain in the bottom five in terms of home schedule difficulty, with Port Adelaide moving to 13th, and Adelaide to 10th.
The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds. That is, indeed, the case for all teams, though least of all for Hawthorn.
Based on their away fixtures, the teams with the five most difficult schedules including net venue effects are:
Western Bulldogs (+5.46, 12 games)
Gold Coast (+4.68, 12 games)
Geelong (+4.30, 11 games)
Adelaide (+4.12, 11 games)
Essendon (+3.80, 11 games)
Ignoring venue effects would see Adelaide exiting the top five (to 8th) and Essendon (to 7th), their places taken by Collingwood (3rd) and Richmond (5th).
Those with the easiest away schedules including net venue effects are:
Hawthorn (+0.63)
Brisbane Lions (+1.20)
St Kilda (+1.86)
Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure, and summarising the results including net venue effects, we have:
Tough Schedules: Carlton and Essendon
Above Average Schedules: Melbourne, Collingwood, and Gold Coast
Average Schedule: St Kilda, Sydney, North Melbourne, Richmond, Fremantle, Western Bulldogs, Hawthorn, and Adelaide
Below Average Schedules: West Coast, Geelong, and GWS
Easy Schedules: Port Adelaide and Brisbane Lions
Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out:
Essendon and Melbourne have more difficult schedules than might be expected for teams in the Bottom 6
GWS has an easier schedule than might be expected for a Middle 6 team
Hawthorn, Adelaide, Geelong, and Brisbane Lions have easier schedules than might be expected for Top 6 teams
Only seven of the 18 teams appear in the same third when ranked using the final ladder and (inverse) ranked using this Strength of Schedule metric
To investigate whether some of these disparities might be attributable mainly to net venue effects, I have included a couple of columns on the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the ratings of the teams played, ignoring venue effects).
Looked at through this lens, we see that:
Melbourne’s, St Kilda’s, North Melbourne’s, Richmond’s, and GWS’s fixtures appear somewhat easier
Brisbane Lions’, Geelong’s, West Coast’s, Hawthorn’s, Fremantle’s, and Gold Coast’s fixtures appear somewhat harder
Nine of the 18 teams appear in the same third when ranked using the final ladder and (inverse) ranked using this Strength of Schedule metric, which ignores venue effects.
It’s interesting to note that excluding venue effects and including only the figures for games involving teams played twice produces a rank ordering of the teams on MoSHBODS Strength of Schedule very similar to the one you get if you measure the strength of teams based on their percentage in the home and away phase of the 2025 season, just as Max Laughton has done for foxsports.
No team is ranked more than five places differently by the two methods (Fremantle is 4th for Max and 9th for MoSHBODS) and 14 teams have rankings that differ by no more than a single spot.
Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 8.4 SDs or about 210 points across the season, which is just over 9 points per game.
A 9-point advantage turns a game with an otherwise 50% victory probability into one with about a 61% probability, which converts to about 2.5 extra expected wins across a 23-game season.
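For those curious about the conversion, here’s a minimal sketch using a normal model of game margins; the 33-point margin standard deviation is an assumption chosen to roughly reproduce the 61% figure, not a published MoSHBODS parameter:

```python
from statistics import NormalDist

margin_sd = 33   # assumed SD of game margins in points (illustrative only)
advantage = 9    # per-game points difference between hardest and easiest schedules

win_prob = NormalDist(0, margin_sd).cdf(advantage)  # ~0.61
extra_wins = (win_prob - 0.5) * 23                  # ~2.5 extra expected wins
print(round(win_prob, 2), round(extra_wins, 1))
```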
DETAILED GAME-BY-GAME NET VPVs AND STRENGTH OF SCHEDULE
The table below (click on it to access a larger version) provides a full breakdown of the Strength of Schedule calculation for every fixtured game. The top row of figures records the Combined Rating of the relevant team, and the numbers in the body of the table the Net VPVs for each contest.
So, for example, when Brisbane Lions play Geelong at home at the Gabba, the Opponent component of the SoS calculation for the Lions is the +0.67 Rating of the Cats, and the Venue component is -0.52, which is Geelong’s VPV at the Gabba of -0.13 less the Lions’ VPV of 0.38. So, the contribution of this game to the Strength of Schedule for the Lions is +0.67-0.52, which is +0.15, signalling that the Cats would start favourites by about 4 points.
If we perform this same calculation for each of the teams the Lions play at home, and then add the results, we get -4.98, which you’ll see is the same number for the Lions that’s in the earlier table.
When, instead, Brisbane Lions play Geelong away at Kardinia Park, the Opponent component of the SoS calculation for the Lions is still the +0.67 Rating of the Cats, but the Venue component is now +0.66, which is Geelong’s VPV at Kardinia of +0.42 less the Lions’ VPV there of -0.24. So, the contribution of this game to the Strength of Schedule for the Lions is +0.67+0.66, which is +1.33, signalling a tough game for the Lions.
If we now perform this same calculation for each of the teams the Lions play away, and then add those results, we get +1.2, which you’ll also see is the same number for the Lions that’s in the earlier table.
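Expressed as a couple of lines of arithmetic, those two worked examples and their relationship to the aggregates are:

```python
# Per-game SoS contribution = opponent Rating + opponent VPV - own VPV (in SDs)
home = 0.67 + (-0.13) - 0.38   # Lions v Cats at the Gabba: 0.16 from these rounded inputs (+0.15 in the text)
away = 0.67 + 0.42 - (-0.24)   # Lions v Cats at Kardinia:  +1.33
print(round(home, 2), round(away, 2))

# Summing such contributions over all of a team's home (or away) games
# reproduces the aggregates in the earlier table (-4.98 and +1.20 for the Lions).
```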
(NB The two cells highlighted in green are those for Hawthorn at home and Western Bulldogs away, both of which include the Team Rating of the other twice to account for the fact that they meet twice with Hawthorn as the home team)
IMPACT OF MISSING SCHEDULE
We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 23 of a possible 34 games.
The table below summarises the missing games in the 2026 Fixture, denoting with H's those games missed that would have been home games for the team named at left of the row, and as A's those that would have been away games for the team named at left of the row.
Note that I've ordered the teams on the basis of their final 2025 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.
Here are a few interesting points that you can glean from this table:
The matchups between teams in the Top 6 are surprisingly poorly spread:
Hawthorn and Adelaide play only two of the other Top 6 teams twice
Brisbane Lions and Gold Coast play three of the other Top 6 teams twice
Geelong and Collingwood play four of the other Top 6 teams twice
Four of the six teams in the Bottom Third fail to play five of the Top 6 teams twice during the season, the exceptions being Melbourne and Essendon who miss only four of the Top 6.
North Melbourne play only two of the Top 12 teams twice, and also play four of the five other Bottom 6 teams twice.
To measure the impact of the missing schedule we will estimate the difference in the simulated probability of making the Top 10 and the Top 8 between that obtained using the actual 24-round fixture and that obtained using a 35-round fixture in which all teams meet all other teams home and away (except the Dogs and Hawks). What we’re measuring here is the difference between an idealised all-plays-all season and a 24-round season modelled on the actual fixture.
We assume that all missing games are played at the home team’s most common home ground in the actual season, which this year are as per the table at left.
For this measure, a larger difference in the 35- vs 24-round probabilities for favourable ladder positions is interpreted as a more disadvantageous schedule.
The table at right provides information about how each team fared in the 2,500 all-plays-all simulations versus the 10,000 simulations based on the actual fixture. With samples of this size, we should treat any difference under about 2% points as possibly being explained by random variation.
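That threshold is consistent with treating roughly two standard errors of the difference between two independently simulated proportions as the noise floor, as a quick calculation shows:

```python
from math import sqrt

n_full, n_actual = 2500, 10000   # simulation counts for the two fixtures
p = 0.5                          # worst case for the variance of a proportion

# Standard error of the difference between two independent simulated proportions
se_diff = sqrt(p * (1 - p) / n_full + p * (1 - p) / n_actual)
print(round(2 * se_diff * 100, 1))  # ~2.2 percentage points at two standard errors
```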
We see that Gold Coast, Fremantle, Carlton, Collingwood, Western Bulldogs, and Geelong all sit inside the Top 7 on both metrics, though in different orders.
Conversely, GWS, Richmond, North Melbourne, and Port Adelaide form the Bottom 4 teams on both metrics.
SUMMARY
The table below summarises the team rankings on a number of the metrics.
It’s interesting to note that the team ordering based on aggregate schedule strength ignoring venue effects is quite similar to the team ordering based on the impact of the missing schedule.
CONCLUSION
A final team classification appears in the table below. It primarily uses estimated Schedule Strength including venue effects (left to right) and estimated Schedule Strength excluding venue effects (bottom to top) with annotations in brackets to describe the Missing Schedule impacts.
(NB An earlier version of this chart had Adelaide in the same cell as Western Bulldogs and Sydney)
Forced to make a choice, I would prefer the left-to-right ordering, but the alternative bottom-to-top ordering is there for anyone who feels that venue effects should be excluded.
