My Mid-Season Analytic "Pro Forma" Review

Grandpabuzz

Registered User
Oct 13, 2003
910
0
Dallas, Texas
As some of you know, I’ve put out some posts the past few years providing a list of ratings for players in the league as well as mock drafts based on a similar methodology. Some examples of my posts include:

Mock Draft

Player Ratings

I’ve continued to refine the process over the years, but essentially what I do is gather data for each player based on the statistics I think are important, on an even-strength basis only: numbers that directly affect the outcome of the game (which should be the most important for any player). I then put the numbers through a formula that spits out a value attributed to the player’s performance. It is similar to PER for those familiar with basketball, and different from Corsi in that I focus more on actual contribution to the game’s end result rather than on shots. A more detailed explanation can be found here:

More Information
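To make the shape of this concrete, here is a minimal sketch of what a PER-style even-strength rating could look like. The stat choices, weights, and scaling below are all hypothetical placeholders for illustration, not the actual formula:

```python
# Hypothetical sketch of a PER-style even-strength rating.
# The weights (3.0, 2.0, 1.5) and the 50-centered scale are made up.
def even_strength_rating(goals, first_assists, ga_on_ice, es_minutes):
    """Turn raw even-strength counting stats into a single score.

    goals, first_assists, ga_on_ice: even-strength totals
    es_minutes: even-strength ice time in minutes
    """
    if es_minutes <= 0:
        raise ValueError("player needs even-strength ice time")
    # Credit offense, debit goals against while on the ice.
    raw = 3.0 * goals + 2.0 * first_assists - 1.5 * ga_on_ice
    # Normalize to a per-60-minute rate so ice time doesn't dominate.
    per60 = raw * 60.0 / es_minutes
    # Re-center onto a scale where ~50 is roughly league average.
    return 50.0 + 10.0 * per60
```

The key structural points it illustrates are the ones described above: only even-strength events count, offense is credited directly, goals against while on the ice are debited, and everything is rated per unit of ice time.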

Since defensemen, forwards, and goaltenders each play a different game, the number that comes out of the original formula is then adjusted so there can be a similar comparison rating among every player. This stage is not yet perfect, but it still provides a good idea of what the ultimate player ratings are. For example, this season the average forward rating is 54.2, the average defenseman rating is 56.7, and the average goalie rating is 56.0 based on even-strength minutes played.

I’ve collected statistics for each player over the past seven seasons, and for the most part, players tend to score fairly consistently each year. As an example, here are Crosby’s ratings:
2006 - 2007: 71.2
2007 - 2008: 76.8
2008 - 2009: 72.5
2009 - 2010: 77.9
2010 - 2011: 90.6
2011 - 2012: 79.9
2012 - 2013: 80.5

Because Crosby has been fairly consistent over these years, his expected rating, or “pro forma” rating, is a time-weighted average of 77.1. There are some players who, because of injury or old age, see sharp changes in their score over time, so for them I’ve instead had to use a different formula based on most recent performance to get their ultimate rating. An example would be Dany Heatley:

2006 - 2007: 73.4
2007 - 2008: 71.4
2008 - 2009: 65.2
2009 - 2010: 62.2
2010 - 2011: 53.7
2011 - 2012: 56.0
2012 - 2013: 55.3

As you can see, there is a huge drop-off starting in 2010, and as a result his pro forma rating is now 55.0. Generally, players that score above 60 are "top" players, players that score between 50 and 60 are "average" (or "solid" as you get closer to 60), and players below 50 are "poor." But those are just rough guidelines.
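The "pro forma" time-weighted average can be sketched as follows. The post doesn't specify the actual weighting scheme, so the linear recency weights here are just an assumption for illustration (they do not reproduce the 77.1 figure exactly):

```python
# Sketch of a time-weighted average over season ratings, oldest first.
# The linear 1..n recency weights are a hypothetical choice.
def pro_forma(ratings, weights=None):
    """Weighted average of season ratings; more recent seasons count more."""
    if weights is None:
        weights = list(range(1, len(ratings) + 1))  # 1 for oldest, n for newest
    total = sum(r * w for r, w in zip(ratings, weights))
    return total / sum(weights)

# Crosby's season ratings from the table above (2006-07 through 2012-13).
crosby = [71.2, 76.8, 72.5, 77.9, 90.6, 79.9, 80.5]
```

For sharp-decline cases like Heatley, the idea described above would correspond to passing weights that put nearly all the mass on the most recent seasons.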

For those interested, here are my current best active forwards/defensemen/goalies since the 2006-2007 season based on this method. Note that the score shown does not include results from this season.

[Image: career ratings for the best active forwards, defensemen, and goalies since the 2006-2007 season]


So based on this type of analysis, here are the top players and their respective ratings midway through this season versus their expected rating coming into the season. Players must have played an adequate number of minutes to be considered.

[Image: top players' mid-season ratings versus their expected ratings entering the season]



One thing to note is that prospects are an interesting case, because rookies are initially rated at 50. They are then “bumped” up to a rating based on their performance in junior leagues (via another set of formulas based on regression analysis of current players). No prospect can score above 60 coming into the league. This is obviously not always going to be accurate (Hertl's current rating is 71), but from the data I’ve seen, the vast majority of prospects enter the league with a rating between 50 and 60.
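The rookie rule described above (start at 50, bump up by the junior-league projection, hard cap at 60 on entry) could be sketched like this; the function name and the 50 floor are my assumptions:

```python
# Sketch of the rookie entry rule: 50 baseline, junior projection bumps the
# rating upward, capped at 60 coming into the league.
def rookie_entry_rating(junior_projection):
    """Clamp a junior-league projection into the allowed entry range."""
    return min(60.0, max(50.0, junior_projection))
```

Under this rule a Hertl-type season (playing at a 71 level) would still have entered the league rated 60, which is exactly the inaccuracy acknowledged above.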

Now, what I’ve also been able to do is take these ratings and consolidate the data, with all the players on each team and each player’s total ice time each game, to come up with a value for the entire team. Using these values I can then rank the teams against one another, as shown on the left side of the picture below. While not perfect, the current ranking very closely resembles the actual standings.
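One plausible way to roll player ratings up into a team value is to weight each player's rating by his share of team ice time. The exact aggregation isn't specified in the post, so treat this as an assumption:

```python
# Sketch: team value as the ice-time-weighted average of player ratings.
def team_value(players):
    """players: list of (rating, total_minutes) tuples for one team."""
    total_minutes = sum(minutes for _, minutes in players)
    if total_minutes <= 0:
        raise ValueError("team needs recorded ice time")
    return sum(rating * minutes for rating, minutes in players) / total_minutes
```

A nice property of this form is that a high-rated player who barely plays moves the team number very little, which matches the stated use of per-game ice time in the consolidation.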

The right side of the rankings predicts how the season will end for each team based on how the team has performed so far, players’ expected performance, and other factors such as average ice time, injuries, transactions, etc.

[Image: current team rankings and predicted end-of-season standings]


Using the same methodology, if I were to go back and predict the standings for this year assuming the “expected rating” for each player, and assuming that each team plays each of its players the same number of games/minutes, the standings would look like the table below. From here you can see that teams like Edmonton, Buffalo and New York are severely underperforming, while teams like Colorado, Tampa and Anaheim are overperforming.

[Image: predicted standings based on expected ratings with equal deployment]


Again, this is just an analysis based on data I find important and is by no means meant to be more than a supplement to other advanced metrics out there. Let me know if you have any questions or want to see other players’ current/historical ratings, etc.
 

MarkGio

Registered User
Nov 6, 2010
12,533
11
This is like how EA sports ranks players and teams?

What statistics do you find important?
 

Grandpabuzz

Registered User
Oct 13, 2003
910
0
Dallas, Texas
This is like how EA sports ranks players and teams?

What statistics do you find important?

A bit more complicated than that, as my ratings are derived from actual stats. The output is shown like EA, just because it's easier to read.

I use goals, certain assists, a defensive responsibility factor, time on ice and team performance (all even strength). It's kind of like a plus-minus stat, but it eliminates a lot of the limitations.
 

hatterson

Registered User
Apr 12, 2010
35,393
12,736
North Tonawanda, NY
A bit more complicated than that, as my ratings are derived from actual stats. The output is shown like EA, just because it's easier to read.

I use goals, certain assists, a defensive responsibility factor, time on ice and team performance (all even strength). It's kind of like a plus-minus stat, but it eliminates a lot of the limitations.

How do you determine the defensive responsibility factor or which assists to use?
 

Grandpabuzz

Registered User
Oct 13, 2003
910
0
Dallas, Texas
How do you determine the defensive responsibility factor or which assists to use?

Assists are basically first assists. Second assists do sometimes directly impact a goal that is scored, but a lot of the time it wouldn't be meaningful.

Defensive responsibility is subtracting an amount if a player was on the ice when a goal is scored against his team. The amount is adjusted based on total team defensive performance compared to the rest of the league: the worse a team's defense, the less impact that goal against has on the player (the magnitude is not as large as you may think). The concept is a bit circular, but the idea is to capture that a player's defensive performance may be part of a bigger team issue.
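A sketch of that adjustment, assuming goals-against per game as the team-defense measure and a simple ratio scaling (both are my assumptions, not the actual formula):

```python
# Sketch of the defensive-responsibility deduction: a goal against while on
# the ice costs an individual less on a team whose defense is bad overall.
def goal_against_penalty(base_penalty, team_ga_per_game, league_ga_per_game):
    """Scale the per-goal deduction by team defense relative to the league."""
    if team_ga_per_game <= 0:
        raise ValueError("team GA/game must be positive")
    # Worse team defense (higher GA/game) -> smaller scale -> smaller deduction.
    scale = league_ga_per_game / team_ga_per_game
    return base_penalty * scale
```

Note the circularity mentioned above lives in `team_ga_per_game`: the team figure is itself built from the same players being rated.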
 

hatterson

Registered User
Apr 12, 2010
35,393
12,736
North Tonawanda, NY
Assists are basically first assists. Second assists do sometimes directly impact a goal that is scored, but a lot of the time it wouldn't be meaningful.

Does that change at all for defensemen? I've seen some work that's hinted that secondary assists are more indicative of play from a dman versus a forward, although nothing conclusive that I recall.

Defensive responsibility is subtracting an amount if a player was on the ice when a goal is scored against his team. The amount is adjusted based on total team defensive performance compared to the rest of the league: the worse a team's defense, the less impact that goal against has on the player (the magnitude is not as large as you may think). The concept is a bit circular, but the idea is to capture that a player's defensive performance may be part of a bigger team issue.

Is there any adjustment for matchups and deployments or ice time?

For example, if I compare McClement and Kadri from last year for the Leafs. McClement was on ice for 22 GA and Kadri for 26. Based on ice time their GA/60 is basically equal. However, McClement had dramatically harder minutes (both from a deployment standpoint and a matchup standpoint) and, as anyone who watches the games will tell you, is a dramatically superior defensive player.
 

Grandpabuzz

Registered User
Oct 13, 2003
910
0
Dallas, Texas
Does that change at all for defensemen? I've seen some work that's hinted that secondary assists are more indicative of play from a dman versus a forward, although nothing conclusive that I recall.


No - there are goals where the secondary assist is important (i.e. an outlet pass to a forward who passes to another forward who scores), but my thought was that the last pass is what needed to happen to allow the other player to score the goal. There is too much variability, and there are too many options, involved with the outlet pass and secondary assist. However, it can be said that these secondary assists help create offensive pressure, which means there is a lower chance that the player is scored against. So in a way the secondary assist is indirectly captured in many cases through a better "defensive awareness" score.

Is there any adjustment for matchups and deployments or ice time?

For example, if I compare McClement and Kadri from last year for the Leafs. McClement was on ice for 22 GA and Kadri for 26. Based on ice time their GA/60 is basically equal. However, McClement had dramatically harder minutes (both from a deployment standpoint and a matchup standpoint) and, as anyone who watches the games will tell you, is a dramatically superior defensive player.

Ice time - yes. All the stats I use are based on ice time played.

For matchups - this is kind of tricky. I don't do direct adjustments (i.e. McClement got scored on by Crosby while Kadri got scored on by Craig Adams, and thus McClement should be penalized less), as I would have to make many assumptions about how to rate each player as a scorer, and I wanted the rating to be as closely related as possible to what actually happens on the ice, without arbitrary input from me. Not to mention this would be a circular approach for each player.

So what I did was consider that top forwards are given the most ice time so that they can score. As you go down the lines, the time that the forwards are given is shortened per game, and their responsibility is still to score but also to not be scored against. This is where you see the third line go against the first line. As a result, the less time on ice that a forward plays, the less impact a goal scored against has on his rating. I know this is not perfect, as you have players like Bergeron who play a lot of minutes and defend against top-tier scorers, and you have fourth-liners who won't be defending against Crosby and Malkin. However, I think in general the logic still applies. Moreover, the magnitude of the adjustment is not high, which is why you still see fourth-liners as "worst performers" in my list above.

Please note that I use the reverse formula for defensemen. So the more ice time a defender plays, the less impact a goal against will have, as the logic is that since the best forwards are playing the most minutes, the best defenders will also play the most minutes.
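The two ice-time adjustments described above (less TOI softens the penalty for forwards, more TOI softens it for defensemen) could be sketched together like this; the 20-minute reference and linear scaling are assumptions:

```python
# Sketch of the ice-time adjustment on the goal-against penalty.
# Forwards: less ice time -> smaller penalty (fourth-liners penalized less).
# Defensemen: the formula is reversed (top-pair minutes -> smaller penalty).
def toi_adjusted_penalty(base_penalty, toi_per_game, position,
                         reference_toi=20.0):
    """position is 'F' for forwards or 'D' for defensemen."""
    share = toi_per_game / reference_toi
    if position == "F":
        return base_penalty * share
    if position == "D":
        return base_penalty / share
    raise ValueError("position must be 'F' or 'D'")
```

Under this sketch a 10-minute-a-night forward takes half the deduction of a 20-minute forward, while a 25-minute defenseman takes a smaller deduction than a third-pair one, matching the logic above.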

One thing to keep in mind is that offensive results are weighted proportionally higher in the final rating. This all goes back to my entire theory of the project: a 1-0 score will win you a game. A 0-0 score will not.
 

nfld77

Registered User
Aug 13, 2007
1,666
427
Newfoundland
Excellent job you're doing, sir. I absolutely love hockey stats and feel I'm practically addicted to them. As a Bruins fan, I was wondering where defenseman Torey Krug and forward Reilly Smith fit in your performance ratings. Those two players are a big reason the Bruins are playing so well even with all the injuries. Congrats on an awesome job, and please keep it coming. If I can ever give you a hand (sort of an assistant), please PM me... Again, I love stats and would love to get involved.
 

Grandpabuzz

Registered User
Oct 13, 2003
910
0
Dallas, Texas
Excellent job you're doing, sir. I absolutely love hockey stats and feel I'm practically addicted to them. As a Bruins fan, I was wondering where defenseman Torey Krug and forward Reilly Smith fit in your performance ratings. Those two players are a big reason the Bruins are playing so well even with all the injuries. Congrats on an awesome job, and please keep it coming. If I can ever give you a hand (sort of an assistant), please PM me... Again, I love stats and would love to get involved.

Thanks - this is more of a hobby that I do for fun, but if it takes off somehow, I'll keep you in mind.

As for the Bruins, Reilly Smith has been the best performing forward this season. He has a season rating of 66.8 versus expectations of 52.0 (8th biggest positive surprise this season among forwards). As a reference, here are some of the other Bruin forward scores this season:
Reilly Smith: 66.8
Milan Lucic: 64.8
Jarome Iginla: 63.8
David Krejci: 58.1
Brad Marchand: 56.5
Except for Smith, all these players are within 10% of their expected score.

Krug is a bit more interesting. He has great offensive stats, but his defensive performance has really been lacking, especially the past month. As a result, his season score is only 51.2 (versus an expected score of 52.8). He reminds me a lot of Cody Franson, and their scores are pretty similar.

On the other hand, the rest of the Bruins' defense corps has been playing great. Except for Krug and Warsofsky, all the defensemen are rated above 63 so far this season - even McQuaid and Bartkowski, who had expected ratings of ~50 coming into this season.
 
