The latest Team Dashboard appears below but, before that, I’m going to present a table showing how each team ranks on every metric from that main Team Dashboard.
It’s very early days, so we shouldn’t draw too many conclusions from what we see here, but we can take the opportunity to review what it will show each week.
Teams in the table are ordered by their current competition ladder position. You can see at a glance on which of the dashboard metrics each team is doing relatively well or poorly in comparison to its ladder position.
For example, Geelong, who sit 1st on the ladder, are a surprising 10th on Q1 performances, while Carlton, who lie fourth-last on the ladder, are an equally surprising 5th on Q3 performances.
The columns here map in a fairly obvious way to columns on the main Team Dashboard except, perhaps, for the last column, which shows the number of expected wins for each team based on the MoS Win Production Function.
At the foot of the table are the rank correlations between the teams' ladder positions and their rankings on the Dashboard metric in question:

A correlation of +1 implies perfect agreement between the two sets of rankings: the team 1st on one metric is 1st on the other, and so on.
A correlation of -1 implies perfect inversion: the team 1st on one metric is last on the other, the team 2nd on one is second-last on the other, and so on.
A correlation of 0 implies that there is no relationship at all between the two sets of rankings (so knowing a team's ranking on one metric would give you no information by which to infer its ranking on the other).
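For readers curious about the mechanics, the rank correlations described above can be computed with the standard Spearman formula for untied rankings. This is just an illustrative sketch using made-up rankings for a five-team competition, not the actual MoS calculation or data:

```python
# Illustrative sketch (not MoS code): Spearman rank correlation between two
# sets of (untied) rankings, using rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).

def spearman_rho(ranks_a, ranks_b):
    """Rank correlation for two equal-length lists of untied ranks."""
    n = len(ranks_a)
    d_sq = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))

ladder   = [1, 2, 3, 4, 5]   # hypothetical ladder positions
same     = [1, 2, 3, 4, 5]   # a metric ranking identical to the ladder
inverted = [5, 4, 3, 2, 1]   # a metric ranking that perfectly inverts it

print(spearman_rho(ladder, same))      # 1.0  (perfect agreement)
print(spearman_rho(ladder, inverted))  # -1.0 (perfect inversion)
```

A metric ranking that mostly, but not exactly, matches the ladder ordering produces a value between 0 and +1, which is how the +0.8 and +0.16 figures discussed below should be read.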
We can see, then, for example, that there is a high level of agreement between teams' ladder positions and their rankings on Scoring Shots Recorded and Conceded, since the relevant correlation coefficients are around +0.8. Conversely, there is only a weak relationship between the teams' ladder positions and their rankings on Q1 performances, since the correlation coefficient there is only +0.16.