As the 2016 AFL season proper looms and the window for more leisurely analyses slowly closes, today we'll wander across the expanse of footy history, this time using MoSSBODS Team Ratings to decide which of the 1,442 teams that have played VFL/AFL football have been the big improvers, and which the big decliners (well ... you find an antonym then).
Specifically, we'll be comparing teams' end-of-season Offensive, Defensive and Combined Ratings across consecutive seasons and identifying those teams whose Ratings moved furthest, in either direction. For the purposes of today's analysis, a team will only be considered to be the "same" team if it played under the same name in two consecutive seasons (excepting that North Melbourne will be considered North Melbourne even when playing as the Kangaroos).
So, as well as losing from consideration all the teams from the first season, we also lose each team in its own first season. In total, we wind up with 1,413 teams that have both a "this season" and a "last season".
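The pairing logic just described can be sketched in a few lines. This is a minimal illustration, not the actual MoSSBODS code: the data layout and the Adelaide Combined values are hypothetical, while the Geelong figures are the Offensive Ratings quoted later in the piece.

```python
# Illustrative (team, season, rating) tuples; the Adelaide values are made up.
ratings = [
    ("Geelong", 1989, 8.6),
    ("Geelong", 1990, 0.8),
    ("Adelaide", 1993, 2.0),
    ("Adelaide", 1994, -3.0),
]

by_key = {(team, season): r for team, season, r in ratings}

# A team only qualifies if the same name appears in the previous season too,
# which is what drops first-season teams from consideration.
changes = {
    (team, season): r - by_key[(team, season - 1)]
    for (team, season), r in by_key.items()
    if (team, season - 1) in by_key
}

# Largest declines first
biggest_declines = sorted(changes.items(), key=lambda kv: kv[1])
```

Sorting the year-on-year differences in both directions yields the "largest declines" and "largest increases" lists that follow.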
LARGEST RATING DECLINES
We'll start with a downbeat analysis and look at the 10 teams whose Offensive Rating declined most across consecutive seasons.
Top of that list is the 1990 Geelong side, whose Offensive Rating fell from an exceptional +8.6 in 1989, when the Cats finished Runners Up, to a just-above-average +0.8 in 1990, when they finished 10th of 14 teams while generating Scoring Shots at a rate only about 1.5 per game higher than the all-team average.
Next is the 1994 Adelaide side, whose declining Offensive powers saw it slip from a respectable 5th of 15 teams in 1993, to 11th in 1994. So poor was their 1994 Offense that they out-generated, in Scoring Shot terms, only the teams finishing 12th through 14th on the ladder, and not even the Swans, who finished last.
Third is the 2001 Essendon side, who, on the face of it, are entitled to feel a little aggrieved by MoSSBODS' assessment. After all, they did finish as Minor Premiers and Runners Up in 2001, the year of their alleged decline. But their raw scoring numbers do suggest a fairly steep reduction in Offensive ability, at least in the manner that MoSSBODS measures it. The 2000 team registered 741 Scoring Shots (which is what MoSSBODS uses for its Ratings, not actual Score) in 22 home-and-away games, while the 2001 version managed only 653, which represents about a 12% decline. That changed them from a team that generated 6 more Scoring Shots per game than the average team to one that generated just over 3 more Scoring Shots per game than the average. As well, in the three Finals of 2000 they recorded 118 Scoring Shots, but in the three Finals of 2001 just 71, which is about a 40% decline.
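Those percentage figures follow directly from the raw Scoring Shot tallies quoted above; as a quick check:

```python
def pct_decline(before, after):
    """Percentage fall from one tally to the next."""
    return 100 * (before - after) / before

# Essendon's home-and-away and Finals tallies, 2000 vs 2001
home_and_away = pct_decline(741, 653)  # about 12%
finals = pct_decline(118, 71)          # about 40%
```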
On the scoreboard, the Dons overcame their more modest ability in generating Scoring Shots by converting at the extremely high level of 58% during the home-and-away season.
Three other teams on the list also appear despite making Finals appearances in the years of their measured decline: the 1997 North Melbourne, 1998 St Kilda, and 1988 Melbourne teams.
The last of those is perhaps the most interesting one, the 1988 Melbourne team losing in the Grand Final to finish the season with a -2.3 Offensive Rating. The raw scoring data again bears out this assessment, the Demons' 573 Scoring Shots in the home-and-away season being over 30 Scoring Shots lower than the all-team average for that year and only the 10th-highest of the 14 teams.
Next, let's look at large declines in Defensive ability.
Here it's the Hawks of 1944 heading up the list, a feat rendered all the more impressive by the fact that it was achieved in an 18-game home-and-away season.
The 1943 Hawks team, while not fortress-like in defence, were competent enough, allowing only 377 Scoring Shots across 15 home-and-away games, which was just fractionally below the all-team average for that season of 25.3 Scoring Shots per game and good enough to see them finish 5th. In contrast, the 1944 Hawks team leaked 32.4 Scoring Shots per game against an all-team average of just 25.5 Scoring Shots per game. As a result, they finished second-last, ahead only of a Cats team that conceded a massive 620 Scoring Shots.
And speaking of defensively challenged Cats teams, the 1915 Geelong side sits second on our list. Their 1914 predecessors were actually quite competent at restricting opposition Scoring Shots, allowing just 289 of them in 18 home-and-away season games (16.1 per game compared to an all-team average of 19.4 per game) and reaching as far as the Semi-Final where they bowed out to South Melbourne in a low-scoring, inaccurate contest 5.14 to 5.7. In the slightly shorter 1915 season, the Cats allowed 392 Scoring Shots in 16 games, a 24.5 per-game average that was over 5 Scoring Shots per game higher than the all-team average of 19.3. These Cats finished last.
The 1987 Fitzroy team, third on the list, present as a similar case, their predecessors conceding just 588 Scoring Shots in 22 games at an average of 26.7 per game, a little below the all-team average of 28.1 Scoring Shots per game. Their season ended with a loss to the Hawks in the Preliminary Final. Come the next season, Fitzroy allowed over 100 more Scoring Shots, lifting its per game average to 31.3, while the all-team average rose only slightly.
Of the remaining teams on the list, only one was a finalist: the 1996 Carlton team. As Premiers in the previous season they'd allowed only 481 Scoring Shots in 22 games, an average of just 21.9 Scoring Shots per game, more than 4 fewer than the all-team 26.0 average. In 1996, the Blues allowed almost an additional 3 Scoring Shots per game (24.7) while the all-team average fell marginally. They bowed out in a heavy Semi-Final loss to the Brisbane Bears where the Scoring Shot tally was 40-23 in favour of the Bears.
For the final analysis of precipitous declines we'll look at Combined Ratings, which are simply the sum of Offensive and Defensive Ratings.
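Since the Combined Rating is defined as the simple sum of the two components, both expressed in Scoring Shots relative to an average team, it can be sketched as:

```python
def combined_rating(offensive, defensive):
    # Both components are in Scoring Shots relative to an all-team-average
    # side, so their sum is on the same scale.
    return offensive + defensive

# Generic illustrative values, not figures from any particular team
example = combined_rating(1.0, 2.3)
```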
The top three teams on this list are teams we've seen before from the list of 10 largest Defensive declines, while five of the bottom six on this list appear on the list of 10 largest Offensive declines. Empirically, teams' Offensive and Defensive Ratings do tend to positively co-vary - though far more strongly in some years than in others - so it's not entirely surprising that the list of teams with the 10 largest Combined declines should contain such a large proportion of teams from the lists of declines in the constituent parts of the Combined Rating.
Two teams, though, are new to the mix, the first being the 2004 Hawthorn team, which sits higher on the list. The Hawks' 2003 predecessors were not an especially strong team, finishing that year 9th on the ladder with a +1.4 Combined Rating, having registered about as many Scoring Shots as they conceded (531 versus 529, or about 24.1 per game), and missing a spot in the Finals by 4 competition points, although their relatively poor percentage of 100.6 meant that, practically, a win and a draw would have been needed.
The 2004 Hawks registered almost 100 Scoring Shots fewer (438) and allowed over 130 Scoring Shots more (660) than the 2003 side, which saw the 2004 edition finish second-last on the ladder and with a Combined Rating of -9.2, the second-worst of any team in this list.
Our other new entrant is the 1987 Collingwood team, which registered over 130 fewer Scoring Shots than the 1986 team (498 versus 631 in 22 games) and conceded exactly 100 more (685 versus 585). Their performances in 1987 knocked 10 Scoring Shots off their Combined Rating, taking it from a mildly above-average +1.9 to a lowly -8.1. On the competition ladder they fell from 6th in 1986 to 3rd-last in 1987.
LARGEST RATING INCREASES
Okay, time to move from the Dark Side into the Light. Let's look at teams that were most-improved, firstly in terms of their Offensive Rating.
You might expect that this list would be almost entirely Premiers and Runners Up, but while that's true of half the list and two more are other flavours of Finalist, the team at the top of the list finished only 8th of 12 teams in its season. What's going on there, then?
That team, the 1982 Melbourne side, registered 31.7 Scoring Shots per game, which was about 0.5 per game higher than the all-team average. That's good, especially so for a team finishing in the lower third of its peers, and was enough to lift its Offensive Rating to +1.9 Scoring Shots. What makes this result so commendable though, is the improvement relative to the previous season in which the 1981 version recorded more than 200 fewer Scoring Shots (just 484). That means the 1982 side cranked up Scoring Shot production by over 44% per game.
Second on the list is the 2007 premiers, the Cats, who lifted their Offensive Rating from -1.3 to +6.7 Scoring Shots after registering an additional 160 Scoring Shots in the 2007 home-and-away season compared to 2006 (692 versus 532), an increase of almost 8 Scoring Shots per game.
Third is the 1942 South Melbourne team, who played only 15 home-and-away games in that season (curiously, five teams that year played only 14 home-and-away games - go figure) but still managed to generate 26 more Scoring Shots than they did in the previous season when they played 3 more games. So, while the all-team average was dipping from 27.3 to 26.8 Scoring Shots per game, South Melbourne's was climbing from 24.6 to 31.2 Scoring Shots per game.
The two other non-Finalist teams on the list are the 1944 North Melbourne team whose Offence went from being diabolical in 1943 (19.9 Scoring Shots per game against a 25.3 all-team average) to creditable in 1944 (26.9 Scoring Shots per game against a 25.9 all-team average), and the 1992 Carlton team who started from a slightly higher base in 1991 (24.2 Scoring Shots per game against a 28.5 all-team average) but lifted its production by a little less to 29.2 Scoring Shots per game.
Next - which I'm sure you've already guessed - is the list of greatest Defensive Rating increases, top of which by some considerable margin is the 1999 Brisbane Lions team. They lost the Preliminary Final to the Kangaroos after a home-and-away season in which they conceded just 21.2 Scoring Shots per game, almost 5 fewer than the all-team average. By contrast, in 1998 they conceded almost 200 more Scoring Shots, or over 9 per game more.
In second is the 2011 West Coast side, another losing Preliminary Finalist, moving from a 2010 home-and-away season in which they finished last and conceded 635 Scoring Shots in 22 games, to a 2011 season in which they conceded just 450 Scoring Shots in the same number of games.
That lifted the Eagles' Defensive Rating by 8.4 Scoring Shots, just a fraction higher than the increase recorded by the 1956 St Kilda team. Having finished second-last, the Saints seem an odd inclusion in this list, but the 1955 side conceded 562 Scoring Shots in 18 games (31.2 per game) while the 1956 version allowed only 375 Scoring Shots in the same number of games (20.8 per game). What prevented the 1956 Saints from finishing higher on the competition ladder was their inability to generate Scoring Shots: they managed to eke out just 345 across the entire season.
The Hawthorn 1929 and Melbourne 1907 teams share similar stories - of lifting highly negative Defensive Ratings at the end of one season back to figures nearer 0 at the end of the next. Those negative Ratings were achieved in both cases by leaking Scoring Shots at a rate more than 7 Scoring Shots per game higher than the all-team average in their respective seasons.
Post-improvement, the Hawks nonetheless finished 3rd-last, and the Dees second-last, both having significantly reduced the number of Scoring Shots conceded per game compared to the previous season, but both still also conceding them at a rate about 1.5 per game above the all-team average.
Last on the list are the 1997 Premiers, Adelaide, who appear because they lifted their Defensive Rating from -4.4 to +2.3. That's clearly noteworthy, but what's curious about them is that a Premier could finish a season with just a +3.3 Combined Rating.
Looking at the Crows' scoring statistics for 1997 we find that they generated only about 100 more Scoring Shots than they conceded in the home-and-away part of the season (and finished only 4th on the competition ladder), and that they registered only 14 more Scoring Shots than their opponents in the four Finals they played, all of which they won. So, undeniably a better-than-average team, but by no means exceptional.
Finally, let's review the list of what might reasonably be called the greatest all-around improvers in VFL/AFL history, those who've lifted their Combined Rating by most.
The first six teams on the list come from one of the previous lists, four from the Defensive improvers list and two from the Offensive improvers list. This large level of overlap provides yet more evidence of the positively co-varying nature of Team Ratings - getting better Offensively tends to come with getting better Defensively; likewise for getting worse.
Of the four remaining teams, all but one have transformed a highly negative Combined Rating into a slight to moderately positive one.
The exception is the team that started with the most negative Combined Rating, the 1912 Saints whose 1911 predecessors finished second-last after generating fewer and conceding more Scoring Shots than any other team in the competition. The 1912 St Kilda team was superior to the 1911 team Defensively (though still below-average), but far superior Offensively, generating over 50% more Scoring Shots per game than the team of the previous year.
SOME FINAL THOUGHTS
It seems self-evident that teams from eras in which fewer games were played must find it harder to make these lists of extremes because the number of opportunities they have to shift Ratings is more limited than those for teams playing in longer seasons. And, indeed, of the 45 unique teams appearing in any of the earlier lists, only 7 are teams that played prior to World War II and 28 are teams that played in 1980 or later.
The chart below records the average absolute change in Ratings for teams from every season and reveals that, while it's true that average absolute Rating changes have been generally higher since about 1980, they've not been dramatically so. It's important to recognise too that any such increase might not be solely down to the lengthening of the seasons allowing always-present season-to-season variability in team abilities to reveal itself, but might also be attributable, at least partly, to the levelling function of the draft.
In any case, the chart below also provides some context for the size of the Rating changes shown for teams in each of the top 10s above. Many of those changes are three to four times the size of a "typical" change shown in the chart.
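The quantity the chart plots, the average absolute Rating change per season, is straightforward to reproduce: group the season-to-season changes by season and average their absolute values. A minimal sketch with made-up change values:

```python
# Hypothetical (team, season) -> change in Combined Rating vs previous season
changes = {
    ("Geelong", 1990): -7.8,
    ("Fitzroy", 1990): 2.1,
    ("Carlton", 1990): -0.4,
}

# Collect the absolute changes for each season ...
per_season = {}
for (team, season), delta in changes.items():
    per_season.setdefault(season, []).append(abs(delta))

# ... then average them to get the chart's per-season figure
avg_abs_change = {s: sum(v) / len(v) for s, v in per_season.items()}
```

A top-10 change of 8 or more Scoring Shots against a typical per-season average of 2 to 3 is what "three to four times the size of a typical change" means in practice.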