Building Your Own Team Rating System

Just before the 2nd Round of 2008 I created MAFL's Team Ratings System, MARS, never suspecting that I'd still be writing about it 5 years later. At the time, I described MARS in the newsletter for that week in a document still available from the Newsletters 2005-2008 section of this website (it's linked under the "MAFL The Early Years" menu item in the navigation bar on the right of the screen). Since then, MARS, as much to my surprise as I think to anyone's, has been a key input to the Line Funds that have operated in each of the ensuing years.

How Good Are Hawthorn, How Poor GWS?

Without the benefit of emotional and chronological distance it's often difficult to rate the historical merit of recent sporting performances. MAFL's MARS Ratings, whilst by no means the definitive measure of a team's worth, provide one objective basis on which to assess the teams that ran around in 2013.

Game Statistics and the Dream Team

Today, a new voice comes to MAFL. It's Andrew's. He's the guy who uncovered the treasure-trove of game statistics on the AFL website and he's kindly contributed a blog taking you through some of the analyses he's performed on that data. Let me be the first to say "welcome mate".

I have lurked on the sidelines of MAFL and Matter of Stats for a couple of years and enjoyed many conversations with Tony about his blogs. I found myself infected with curiosity and so, with gratitude to Tony, here's my newbie blog post.

Current Teams' All-Time MARS Rankings

I've looked previously at the best and worst AFL teams of all time and, whilst none of the current crop of teams is vying for either of those honours as at the end of Round 11 in the 2013 season, two (GWS and Melbourne) are in the 30 lowest-rated teams ever and one (Hawthorn) is in the 50 highest-rated teams ever.

Which Teams Fare Better as Favourites?

In this blog, the next in a series in which I've been exploring the all-time MARS Ratings I created for every game from the start of 1897 to the end of the 2012 season, I'll be looking at how well each team has performed depending on the relative strength of its opponent, as measured by its MARS Rating. So, for example, we'll consider how well Collingwood tends to do when playing a team it is assessed as being much stronger than, a little stronger than, about as capable as, and so on.
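For readers who'd like to follow along at home, here's a minimal sketch of that bucketing idea in Python. The DataFrame `games` and its column names are hypothetical, and the bucket boundaries are purely illustrative rather than the cutoffs used in the analysis itself.

```python
import pandas as pd

def win_rate_by_opponent_strength(games: pd.DataFrame) -> pd.DataFrame:
    """Win rate per team, bucketed by relative opponent strength.

    Assumes hypothetical columns: 'team', 'team_rating',
    'opponent_rating', and 'won' (1 if the team won, 0 otherwise).
    """
    games = games.copy()
    # Positive gap means the team is rated higher than its opponent.
    games["rating_gap"] = games["team_rating"] - games["opponent_rating"]
    # Illustrative boundaries only; the actual cutoffs may differ.
    bins = [-float("inf"), -20, -5, 5, 20, float("inf")]
    labels = ["much weaker", "a little weaker", "about as capable",
              "a little stronger", "much stronger"]
    games["bucket"] = pd.cut(games["rating_gap"], bins=bins, labels=labels)
    # Average of 'won' within each bucket is the win rate.
    return games.groupby(["team", "bucket"])["won"].mean().unstack()
```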

The Greatest Upset Victories in VFL/AFL History (1897-2012)

The Suns' victory over the Saints in the first round of the 2013 season was heralded as an "upset win" for the Suns and one of the greatest in their short history. Undoubtedly their win was unexpected, but even the bookmakers rated the Suns as only $3.75 chances - a price implying a winning probability of roughly 1/3.75, or about 27%, before allowing for the bookmaker's margin - so it was hardly a Halley's Comet-like occurrence.

Team MARS Ratings Performance By Decade and Overall: 1897 to 2012

In the previous blog on the topic of all-time MARS Ratings I explained the process I used to derive team Ratings across history and then identified those teams that had achieved the highest (Essendon) and lowest (Fitzroy) MARS Ratings ever. We know then which teams have burned brightest - and which flickered dimmest - across VFL/AFL history. In this blog I want to explore more extended bursts of talent, or the apparent lack of it.

Setting an Initial Rating for GWS

Last season I set Gold Coast's initial MARS Rating to the all-team average of 1,000 and they reeled off losses of 70 points or more in each of their first three outings, making a mockery of that Rating. Keen to avoid repeating the mistake with GWS this year, I've been mulling over my analytic options.

MARS Ratings: How Important Are Teams' Initial Ratings?

It's been a few years since I chose the key parameters for the MARS Ratings System, which I selected on the basis that they maximised the predictive accuracy of the resulting System. One of those parameters - the percentage carryover of team Ratings from one season to the next - determines each team's initial MARS Rating for the season.
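By way of illustration, here's a minimal sketch of how a percentage carryover might translate one season's final Rating into the next season's initial Rating. It assumes the regress-to-the-mean form common to Elo-style systems; the 50% figure is purely illustrative, not the actual MARS parameter.

```python
def initial_rating(final_rating: float, carryover: float = 0.5,
                   mean_rating: float = 1000.0) -> float:
    """Season-opening Rating under a percentage-carryover scheme.

    Assumes the regress-to-the-mean form used by many Elo-style
    systems; the default 50% carryover is illustrative only.
    """
    return mean_rating + carryover * (final_rating - mean_rating)

# Example: a team finishing on 1,040 with a 50% carryover opens on 1,020.
print(initial_rating(1040.0))  # -> 1020.0
```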

A Competition of Two Halves

In the previous blog I suggested that, based on winning percentages when facing finalists, the top 8 teams (well, actually the top 7) were of a different class to the other teams in the competition.

Current MARS Ratings provide further evidence for this schism. To put the size of the difference in an historical perspective, I thought it might be instructive to review the MARS Ratings of teams at a similar point in the season for each of the years 1999 to 2010.

(This also provides me an opportunity to showcase one of the capabilities - strip-charts - of a sparklines tool that can be downloaded for free and used with Excel.)

[Figure: Spread of MARS Ratings by Year, 1999-2010]

In the chart, each row shows the MARS Ratings that the 16 teams had as at the end of Round 22 in a particular season. Every strip in the chart corresponds to the Rating of a single team, and the position of that strip reflects the team's Rating - the further to the right the strip sits, the higher the Rating.

The red strip in each row corresponds to a Rating of 1,000, which is always the average team Rating.
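Why is the average always 1,000? Here's a toy illustration, assuming - as in Elo-style systems - that Rating updates are zero-sum, with the winner gaining exactly the points the loser gives up.

```python
# Toy illustration: any zero-sum update leaves the average untouched.
ratings = {"A": 1030.0, "B": 1010.0, "C": 990.0, "D": 970.0}

def apply_result(ratings, winner, loser, points=6.0):
    # The winner gains what the loser loses (zero-sum assumption).
    ratings[winner] += points
    ratings[loser] -= points

apply_result(ratings, "D", "A")  # an upset: D beats A
average = sum(ratings.values()) / len(ratings)
print(average)  # -> 1000.0, unchanged by the update
```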

While the strips provide a visual guide to the spread of MARS Ratings for a particular season, the data in the columns at right offer another, more quantitative view. The first column is the average Rating of the 8 highest-rated teams, the middle column the average Rating of the 8 lowest-rated teams, and the right column is the difference between the two averages. Larger values in this right column indicate bigger differences in the MARS Ratings of teams rated highest compared to those rated lowest.

(I should note that the 8 highest-rated teams will not always be the 8 finalists, but the differences in the composition of these two sets of eight teams don't appear to be material enough to prevent us from talking about them as if they were interchangeable.)
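For the record, the three summary columns are simple to compute. Here's a minimal sketch using made-up Ratings for one season's 16 teams (the values shown are hypothetical, not actual MARS Ratings):

```python
import pandas as pd

# Hypothetical end-of-Round-22 Ratings for the 16 teams of one season.
ratings = pd.Series(
    [1035.2, 1028.7, 1021.4, 1015.9, 1010.3, 1006.8, 1003.1, 1001.5,
     998.2, 995.6, 992.3, 989.7, 986.4, 982.1, 978.5, 954.3])

top8 = ratings.nlargest(8).mean()      # average of the 8 highest-rated teams
bottom8 = ratings.nsmallest(8).mean()  # average of the 8 lowest-rated teams
gap = top8 - bottom8                   # the right-hand column in the chart
print(round(top8, 1), round(bottom8, 1), round(gap, 1))
```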

What we see immediately is that the difference in the average Rating of the top and bottom teams this year is the greatest that it's been during the period I've covered. Furthermore, the difference has come about because this year's top 8 has the highest-ever average Rating and this year's bottom 8 has the lowest-ever average Rating.

The season that produced the smallest difference in average Ratings was 1999, which was the year in which 3 teams finished just one game out of the eight and another finished just two games out. That season also produced the all-time lowest-rated top 8 and highest-rated bottom 8.

While we're on MARS Ratings and adopting an historical perspective (and creating sparklines), here's another chart, this one mapping the ladder and MARS performances of the 16 teams as at the end of the home-and-away seasons of 1999 to 2010.

[Figure: MARS and Ladder History, 1999-2010]

One feature of this chart that's immediately obvious is the strong relationship between the trajectory of each team's MARS Rating history and its ladder fortunes, which is as it should be if the MARS Ratings mean anything at all.

Other aspects that I find interesting are the long-term decline of the Dons, the emergence of Collingwood, Geelong and St Kilda, and the precipitous rise and fall of the Eagles.

I'll finish this blog with one last chart, this one showing the MARS Ratings of the teams finishing in each of the 16 ladder positions across seasons 1999 to 2010.

[Figure: MARS Ratings Spread by Ladder Position, 1999-2010]

As you'd expect - and as we saw in the previous chart on a team-by-team basis - lower ladder positions are generally associated with lower MARS Ratings.

But the "weather" (ie the results for any single year) is different from the "climate" (ie the overall correlation pattern). Put another way, for some teams in some years, ladder position and MARS Rating are measuring something different. Whether either, or neither, is measuring what it purports to measure - relative team quality - is a judgement I'll leave in the reader's hands.
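One way to put a number on the climate-versus-weather distinction is the rank correlation between ladder position and Rating, overall and season by season. Here's a minimal sketch, assuming a hypothetical DataFrame `df` with columns `season`, `ladder_position` (1 being top) and `rating`:

```python
import pandas as pd

def climate_and_weather(df: pd.DataFrame):
    """Spearman correlation of ladder position with Rating.

    Returns the pooled ("climate") correlation and a per-season
    ("weather") series. Negative values are expected: lower ladder
    numbers (better positions) should go with higher Ratings.
    """
    climate = df["ladder_position"].corr(df["rating"], method="spearman")
    weather = df.groupby("season").apply(
        lambda g: g["ladder_position"].corr(g["rating"], method="spearman"))
    return climate, weather
```

Seasons whose "weather" correlation sits well below the pooled "climate" figure are the ones where the ladder and the Ratings disagree most.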

MARS Ratings of the Finalists

We've had a cracking finals series so far and there's the prospect of even better to come. Two matches that stand out from what we've already witnessed are the Lions v Carlton and Collingwood v Adelaide games. A quick look at the Round 22 MARS Ratings of these teams tells us just how evenly matched they were.

[Figure: Round 22 MARS Ratings of the Finalists, 2000 to 2009]

Glancing down to the bottom of the 2009 column tells us a bit more about the quality of this year's finalists.

As a group, their average rating is 1,020.8, which is the 3rd highest average rating since season 2000, behind only the averages for 2001 and 2003, and weighed down by the sub-1,000 rating of the eighth-placed Dons.

At the top of the 8, the quality really stands out. The top 4 teams have the highest average rating for any season since 2000, and the top 5 teams are all rated 1,025 or higher, a characteristic also unique to 2009.

Someone from among that upper echelon had to go out in the first 2 weeks and, as we now know, it was Adelaide, making them the highest MARS-rated team to finish fifth at the end of the season.

[Figure: MARS Ratings of the Finalists (continued)]

(Adelaide aren't as unlucky as the Carlton side of 2001, however, who finished 6th with a MARS Rating of 1,037.9.)