PlayerRatings And Team Quality

Published on March 09, 2015 by Martin Eastwood

Introduction

Last week I introduced PlayerRatings, a mathematical model I’ve been working on over the past few months to quantify the ability of individual footballers.

Aggregable

One of the nice characteristics of this approach is that player ratings can be aggregated together to create team ratings that correlate strongly with the team's performance. For example, Figure One below shows the relationship between the average rating of a team's players and how many points the team achieved in the English Premier League over each of the past eight seasons.

[Figure One: average team PlayerRating versus English Premier League points, past eight seasons]

The r² value for this correlation is 0.73, meaning that roughly 73% of the variance in how many points a team achieves is captured by the average rating of its players. To put this number into context, Total Shot Ratio (TSR), which is widely used within the analytics community to assess a team's performance, has an r² value of around 0.68 when correlated with points.

This means that something as simple as taking a team's average PlayerRating potentially provides us with more information than its Total Shot Ratio does. There is also scope to improve on this further, as there is no doubt a more elegant solution than simply taking the average, e.g. factoring in substitute appearances, injuries, opposition quality and so on.**
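As a rough sketch of the aggregation and correlation steps, the Python snippet below computes a minutes-weighted average rating for one team and the r² between aggregated team ratings and points. All of the arrays and values here are illustrative placeholders, not output from the actual model.

```python
import numpy as np

# Hypothetical player ratings and league minutes for one team's season
# (placeholder values, not real PlayerRatings output).
ratings = np.array([128.0, 135.5, 141.2, 118.7, 130.1])
minutes = np.array([3200, 2900, 3400, 1500, 2750])

# Minutes-weighted average rating for the team (see the addendum).
team_rating = np.average(ratings, weights=minutes)

# With one aggregated rating and one points total per team-season,
# the quoted r2 is just the squared Pearson correlation coefficient.
team_ratings = np.array([141.0, 133.0, 125.0, 118.0, 111.0, 106.0])  # illustrative
points = np.array([86, 79, 70, 61, 50, 38])                          # illustrative
r = np.corrcoef(team_ratings, points)[0, 1]

print(f"team rating: {team_rating:.1f}, r2: {r ** 2:.2f}")
```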

Baselines

Since PlayerRatings correlate well with points, we can use this data to set approximate baselines for the quality of squad needed to win the league, qualify for Europe, avoid relegation and so on (Figure Two).

[Figure Two: average PlayerRating baselines by final league position]

For example, over the past few seasons the Premier League champions have had an average PlayerRating of around 141 +/- 11, teams reaching the Champions League through positions two, three or four have averaged 133 +/- 11, teams finishing in positions five to seventeen have averaged 111 +/- 7, and relegated teams have averaged 106 +/- 4.
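For anyone wanting to reproduce this kind of baseline, the calculation is just a mean and standard deviation of aggregated team ratings grouped by where each team finished. The sketch below assumes a pandas DataFrame with hypothetical column names (season, position, team_rating); the toy numbers are placeholders, not the figures quoted above.

```python
import pandas as pd

# Hypothetical team-season data: final league position and aggregated rating.
df = pd.DataFrame({
    "season":      [2013, 2013, 2013, 2013, 2014, 2014, 2014, 2014],
    "position":    [1, 3, 10, 19, 1, 3, 10, 19],
    "team_rating": [143.0, 131.5, 112.3, 104.2, 139.8, 128.6, 109.9, 107.1],
})

# Band each finishing position: champions, Champions League places (2-4),
# mid-table (5-17), relegated (18-20).
def band(pos):
    if pos == 1:
        return "champions"
    if pos <= 4:
        return "champions_league"
    if pos <= 17:
        return "mid_table"
    return "relegated"

df["band"] = df["position"].apply(band)

# Mean and standard deviation of team rating per band give the baselines.
baselines = df.groupby("band")["team_rating"].agg(["mean", "std"])
print(baselines)
```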

So, as it stands this season, the only teams whose PlayerRatings look good enough to win the league are Chelsea and Manchester City (no surprises there then). At the bottom of the table, though, Hull, Sunderland, Aston Villa, Queens Park Rangers, Leicester, Crystal Palace and West Bromwich Albion all have team ratings that fall within the typical range of relegation candidates.
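As a rough illustration of how these baselines can be applied, the snippet below checks a team rating against the champion and relegation bands quoted above. The ratings passed in are placeholders, not values from the model.

```python
# Baseline bands from Figure Two (mean +/- one standard deviation).
CHAMPION_BAND = (141 - 11, 141 + 11)      # 130 to 152
RELEGATION_BAND = (106 - 4, 106 + 4)      # 102 to 110

def classify(rating):
    """Label a team rating against the historical baseline bands."""
    if rating >= CHAMPION_BAND[0]:
        return "title contender"
    if rating <= RELEGATION_BAND[1]:
        return "relegation candidate"
    return "mid-table"

# Hypothetical current-season ratings, for illustration only.
print(classify(134.0))   # -> "title contender"
print(classify(108.5))   # -> "relegation candidate"
```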

This is reassuring as it matches the clear lack of parity we see in the English Premier League - there are far fewer teams capable of winning the league than there are at risk of relegation, which is consistent with previous observations I've made. I'm really interested in the distributions of these different groups so I'm planning to look at this in more detail over the coming weeks.

Conclusions

As well as allowing individual players to be quantified, PlayerRatings can also be aggregated together to create team ratings that correlate more strongly with points achieved over the course of a season than TSR does. These team ratings could then be used to assess squad quality or evaluate whether potential signings are worthwhile. For example, will that expensive new striker on a five-year contract actually increase your squad's average rating and push it closer to the level required to challenge for Europe?
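As a final sketch of that kind of what-if evaluation, the snippet below compares the minutes-weighted squad average before and after adding a prospective signing. The current squad's ratings and minutes, and the signing's assumed rating and minutes, are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical current squad: ratings and projected league minutes.
ratings = np.array([128.0, 135.5, 141.2, 118.7, 130.1])
minutes = np.array([3200, 2900, 3400, 1500, 2750])

current = np.average(ratings, weights=minutes)

# What-if: add a prospective signing with an assumed rating and the
# minutes he would be expected to play (placeholder values).
new_ratings = np.append(ratings, 138.0)
new_minutes = np.append(minutes, 3000)
with_signing = np.average(new_ratings, weights=new_minutes)

print(f"squad rating {current:.1f} -> {with_signing:.1f} with the signing")
```

A fuller evaluation would also account for whose minutes the new player displaces, since those minutes currently contribute to the weighted average too.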

Addendum

**To be a bit more detailed, it's actually the average rating of all the team's players who started a league fixture that season, weighted by the total number of minutes each player played.