Speculation: LA Kings News, Rumors, Roster Thread part VII

Status
Not open for further replies.

crassbonanza

Fire Luc
Sep 28, 2017
3,264
3,137


I have developed such a distaste for expected goals. Beyond the metric's inability to account for actual on-ice situations, people believe it is an account of luck. The Avs, Cats and Canes are not just super lucky. Conversely, the Habs, Hawks and Flyers have not been unlucky. Fun fact with expected goals: from 2014-2017 the Kings led the entire league in xGF% by a fairly substantial margin, but only made the playoffs once and Lombardi/Sutter were fired.
 

Peter James Bond II

Registered User
Mar 5, 2015
3,657
5,441
Is there a metric for 'expected failure'? Maatta would be #1 and Brown would be in the top 10. It is true that Brown mixes in some 'unexpected success' moments. He actually does. But the 'expected failures' are 8x those.
 
  • Like
Reactions: Piston

Statto

Registered User
Sponsor
May 9, 2014
4,988
6,805
Just look at shooting %.
Yep. Like most stats it doesn't work on its own, so xG over- and under-performance needs looking at alongside S%. What xG does tell you (to some degree) is how well the systems are working and being executed, but even then other stats plus the eye test complete the picture. No stat on its own tells you much.
 

Raccoon Jesus

Todd McLellan is an inside agent
Oct 30, 2008
62,018
62,182
I.E.
I have developed such a distaste for expected goals. Beyond the metric's inability to account for actual on-ice situations, people believe it is an account of luck. The Avs, Cats and Canes are not just super lucky. Conversely, the Habs, Hawks and Flyers have not been unlucky. Fun fact with expected goals: from 2014-2017 the Kings led the entire league in xGF% by a fairly substantial margin, but only made the playoffs once and Lombardi/Sutter were fired.

Just like any other advanced stat it's people's over-belief in it as a 'tell-all' that makes it annoying.

I mean in our case it confirms an absolute ton--there's a huge dissonance between our control of the puck 5v5 and our finishing with the puck 5v5.
 

Chazz Reinhold

Registered User
Sep 6, 2005
9,023
2,683
The Stanley Cup
I have developed such a distaste for expected goals. Beyond the metric's inability to account for actual on-ice situations, people believe it is an account of luck. The Avs, Cats and Canes are not just super lucky. Conversely, the Habs, Hawks and Flyers have not been unlucky. Fun fact with expected goals: from 2014-2017 the Kings led the entire league in xGF% by a fairly substantial margin, but only made the playoffs once and Lombardi/Sutter were fired.

I'm not sure it's fair to say the expected goals models don't account for actual on-ice situations. Each model is built a little differently, but one similarity for all is that they're not simply based on shot location stripped of any context of what's occurring on the ice. Most models will account for what happened prior to the goal (e.g., rebound, rush chance, cross-ice puck movement, tip, power play, etc.). The private models that we're not privy to (except for occasional releases of stats) tend to account for more pre-shot occurrences, so you could certainly argue those are better (which is probably why at least 30 teams are using Sportlogiq's data to some extent or another). The public models aren't totally without value, however, in quantifying the quality of shots a team is generating or giving up in a game and over the course of a season.

In my opinion (I'm not a statistician so this is my layperson interpretation), a different way to describe expected goals would be a detailed expression of scoring chances for and against. Expected goals models have basically catalogued all shots and goals (including pre-shot occurrences on the ice) over thousands and thousands of games (i.e., from both skilled shooters and terrible shooters--yes, you, Austin Wagner) from the past decade or so and note the percentage of times in the course of history that shot has gone in (e.g., a point shot tipped in the slot from the inner right hash mark immediately after a rush chance has historically gone in 10% of the time, so it is worth .1 expected goals).
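The "catalogued history" intuition described above can be sketched in a few lines of Python. To be clear, real models use fitted regression or tree-based models rather than a flat lookup table, and the shot categories and conversion rates below are invented for illustration, not real league data:

```python
# Minimal sketch of the historical-lookup intuition behind xG.
# Rates below are made up for illustration only.

HISTORICAL_GOAL_RATE = {
    # (shot_type, location, off_the_rush): fraction of such shots
    # that have historically become goals
    ("tip", "slot", True): 0.10,
    ("wrist", "point", False): 0.02,
    ("wrist", "slot", False): 0.12,
}

def expected_goals(shots):
    """Sum each shot's historical conversion rate; unknown shot
    patterns contribute 0 in this toy version."""
    return sum(HISTORICAL_GOAL_RATE.get(shot, 0.0) for shot in shots)

shots = [("tip", "slot", True), ("wrist", "point", False), ("wrist", "slot", False)]
print(round(expected_goals(shots), 2))  # 0.24
```

So a tipped point shot from the slot off the rush is "worth" 0.1 xG because, historically, 10% of shots matching that pattern went in.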

To your point about luck versus skill, I won't dispute that the skill of the shooters on a team will have an impact on whether that team is over-performing or under-performing their expected goal totals. In that sense they can be useful, because over the course of the season teams that are over-performing their expected goal totals clearly have better finishing than teams that are under-performing their totals (e.g., the Kings). What I see when I look at the expected goal totals to date is that the Kings are doing a pretty good job generating the types of shots that historically go in the net, but that they suck at finishing. Some people might say, "well, duh, we already knew that," but I think it's useful to have a quantifying data point to highlight that (shooting percentage alone lacks all of the context of what's actually happening on the ice).

I can't pretend to summarize what the models do to attempt to account for/create proxies for actual on-ice situations so I'll just provide links with snippets to a few for those who are interested.

Moneypuck
Variables In Shot Prediction Model:

1.) Shot Distance From Net
2.) Time Since Last Game Event
3.) Shot Type (Slap, Wrist, Backhand, etc)
4.) Speed From Previous Event
5.) Shot Angle
6.) East-West Location on Ice of Last Event Before the Shot
7.) If Rebound, difference in shot angle divided by time since last shot
8.) Last Event That Happened Before the Shot (Faceoff, Hit, etc)
9.) Other team's # of skaters on ice
10.) East-West Location on Ice of Shot
11.) Man Advantage Situation
12.) Time since current Powerplay started
13.) Distance From Previous Event
14.) North-South Location on Ice of Shot
15.) Shooting on Empty Net
MoneyPuck.com -About and How it Works
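Models like MoneyPuck's fit a probability model over features like the ones listed above. A toy logistic-regression version over a few of those variables gives the flavor; the coefficients here are invented for illustration and are not MoneyPuck's actual fitted values:

```python
import math

# Toy logistic xG model over a few MoneyPuck-style features.
# Coefficients are invented for illustration only.
COEF = {
    "intercept":     -2.0,
    "distance_ft":   -0.05,  # farther shots are less dangerous
    "angle_deg":     -0.01,  # sharper angles are less dangerous
    "is_rebound":     1.0,   # rebounds convert far more often
    "man_advantage":  0.4,   # power-play shots convert more often
}

def xg(distance_ft, angle_deg, is_rebound=False, man_advantage=False):
    """Goal probability for one shot via the logistic function."""
    z = (COEF["intercept"]
         + COEF["distance_ft"] * distance_ft
         + COEF["angle_deg"] * angle_deg
         + COEF["is_rebound"] * is_rebound
         + COEF["man_advantage"] * man_advantage)
    return 1 / (1 + math.exp(-z))

# A rebound from the slot grades out far more dangerous than a
# clean shot from distance:
print(xg(15, 10, is_rebound=True))   # high-danger chance
print(xg(55, 30))                    # low-danger chance
```

A season total is then just the sum of these per-shot probabilities, which is what makes xGF/xGA comparable to actual GF/GA.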

Hockeyviz
My animating assumption is that all of the skaters (between six and eleven, in total, usually) are working together both to generate shots for their team and to suppress the generation of shots by the other team. In principle, I consider all of the skaters equally able to affect both processes, in the long run, even if a given skater (for either team) might be only minimally involved with a given shot. All of play in all three zones leading up to and including the decision by a given player to shoot the puck, I understand to be the product of the (combined) ability of all of the skaters, and I call the goal likelihood of shots generated, at the moment the decision to shoot is made, the "expected goal probability", or xG for short. Then, in addition to the xG of the pattern to which a given shot conforms, the shooter themself can, in principle, affect the goal likelihood of the shot, by shooting the puck skilfully, or perhaps (as I might) by woefully butchering whatever chances they and their teammates (and opponents) have conspired to generate. This ability I call "finishing impact" or "shooting talent". Finally, a goaltender can, in principle, affect the goal likelihood of a shot after it is taken, by sneakily interposing their body between the puck and the goal, or, (again as I might do) contriving to fail to do so.

This three-fold division (all of the skaters collectively produce the shot, the shooter shoots, and the goalie endeavours to save) is the animating idea behind the model I describe here. Even at this very basic level these choices have important consequences. For instance, a player who is skilled at receiving passes in (relatively) less-dangerous shooting positions and then skating untrammelled with the puck to more-dangerous locations will score more often for that reason, and that skill will appear in my accounting in their impact on xG (which will increase, since their team is taking more dangerous shots) and not in their finishing impact (which will presumably decrease, since they are shooting from locations where goals are easier to come by). Similarly, including goaltender effects only on shots already taken prevents us from making any estimate of goaltenders' impact on xG, conceded or generated, from, say, their tendency to handle the puck.

Throughout this article, when I say "shot" I will mean "unblocked shot", that is, goals, saves, and misses (including shots that hit the post or the crossbar). All shots in all strength situations that are taken against a team with a goaltender on the ice are considered.

...

I use a design matrix X for which every row encodes a shot with the following columns:
  • Two indicators for the shooter; (in 2019-2020 there were 879 shooters)
    • One for tip/deflections
    • Another for all other shot types
  • An indicator for the goaltender; (in 2019-2020 there were 86)
  • A set of geometric terms and shot types, described below;
  • An indicator for "rush shots", that is, in-zone shots for which the previous recorded play-by-play event is in a different zone and no more than four seconds prior;
  • An indicator for "rebound shots", that is, shots for which the previous recorded play-by-play event is another shot taken by the same team no more than three seconds prior;
  • An indicator for teams which are leading and another for teams which are trailing; to be interpreted as representing change in configurations surrounding shots compared to when teams are tied;
  • Four indicators for different skater strength situations:
    • SH for short-handed shots, that is, ones taken by a team with fewer skaters;
    • PPv3 for shots taken by a team with more than three skaters against a team with exactly three skaters;
    • PP for all other power-play shots, that is, ones taken by a team with more skaters;
    • 3v3 for play where both teams have exactly three skaters (mostly in overtime)
    All shots are assigned exactly one of the above indicators, which should all be understood as the change compared to a similar shot at even-strength, that is, all 4v4 and 5v5 shots gathered together.
  • An interaction term for shots which are slapshots and also on the power play of any kind; this term is meant to proxy for one-timers.
Model Description: Expected Goals Fabric
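The rush and rebound indicators in the Hockeyviz list are concrete enough to state as code. This sketch assumes a hypothetical play-by-play event shape (dicts with `zone`, `type`, `team`, and a time `t` in seconds), not any real feed's schema:

```python
def is_rush_shot(shot, prev_event):
    """Hockeyviz's rush indicator: an in-zone shot whose previous
    recorded event is in a different zone, no more than 4s prior."""
    return (prev_event["zone"] != shot["zone"]
            and shot["t"] - prev_event["t"] <= 4)

def is_rebound(shot, prev_event):
    """Hockeyviz's rebound indicator: the previous recorded event is
    another shot by the same team, no more than 3s prior."""
    return (prev_event["type"] == "shot"
            and prev_event["team"] == shot["team"]
            and shot["t"] - prev_event["t"] <= 3)

shot = {"zone": "off", "type": "shot", "team": "LAK", "t": 100}
print(is_rush_shot(shot, {"zone": "neu", "type": "carry", "team": "LAK", "t": 97}))  # True
print(is_rebound(shot, {"zone": "off", "type": "shot", "team": "LAK", "t": 98}))     # True
```

Everything the model "knows" about pre-shot play has to be encoded as binary columns like these, which is exactly the limitation discussed in this thread.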

Patrick Bacon (model is used for jfresh's visuals)
My research has led me to different conclusions on the predictive power of expected goals, but before I get into that, I want to address my issue with this line of thinking. A part of me wishes that we had stuck to calling expected goal models “Shot Quality” models instead, because I think that the term “Expected Goals” implies that these models are solely predictive in nature, which isn’t necessarily the case. Even if expected goal shares were completely useless for predicting future goals at the team levels, expected goals would still be extremely useful for describing past events and telling us which teams relied heavily on goaltending and shooting prowess, or were weighed down by poor shooting and goaltending, and even which shots the goaltender deserved most of the blame for, so I disagree with the premise that hockey fans should stop using expected goals at the team level if they are not as predictive as Corsi.

...

I accounted for the following variables in my model:
  • Shot distance and shot angle. (The two most important variables.)
  • Shot type.
  • The type of event which occurred most recently, the location and distance of this event, how recently it occurred, which team the perpetrator was, and the speed at which distance changed since this event. (The inclusion of the last variable was inspired by Peter Tanner of Moneypuck.)
  • Whether the shooting team is at home.
  • Contextual variables such as the score, period, and seconds played in the game at the time the shot was taken.
  • Whether the shooter is shooting on their off-wing. (For example, a right-handed shooter shooting the puck from the left circle is shooting from the off-wing, and a left-handed shooter shooting from the same location is not.)
A New Expected Goal Model That is Better Than Corsi at Predicting Future Goals and Wins Above Replacement 1.1 and Expected Goals 1.1: Model Updates and Validation
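Bacon's off-wing variable is a good example of how concrete these features are. A minimal sketch, assuming side labels given from the attacking team's perspective (a hypothetical data shape, not Bacon's actual code):

```python
def is_off_wing(handedness, side):
    """A right-handed shooter on the left side of the ice (or a
    left-handed shooter on the right side) is on their off-wing."""
    return ((handedness == "R" and side == "left")
            or (handedness == "L" and side == "right"))

# Bacon's example: a righty shooting from the left circle is on the
# off-wing; a lefty shooting from the same spot is not.
print(is_off_wing("R", "left"))   # True
print(is_off_wing("L", "left"))   # False
```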

Clear Sight Analytics (not a public model, but certain results are released publicly occasionally)
CSA’s proprietary methodology systematically catalogs every shot sequence resulting in a shot on goal for every game played in the NHL, using 34 individual standardized points of data, including:
  • Passer
  • Passer location
  • Shooter
  • Shooter location
  • Offensive situation (i.e., man advantage/even strength, odd man rush/settled offense and face offs)
  • Screens
  • Deflections
  • Broken plays

And of course RESULTS including:
  • Rebounds
  • Whistles
  • Goals
This proprietary methodology allows CSA to accurately categorize each and every shot sequence resulting in a shot on goal by type, creating the definitive measure of a scoring chance—the actual probability of scoring. CSA has analyzed more than 250,000 shot sequences resulting in a shot on goal—more than 8 million individual points of data, creating a new generation of team and player performance metrics that will change how you see the game.
https://www.csahockey.com/what-we-do
 

crassbonanza

Fire Luc
Sep 28, 2017
3,264
3,137
I'm not sure it's fair to say the expected goals models don't account for actual on-ice situations. Each model is built a little differently, but one similarity for all is that they're not simply based on shot location stripped of any context of what's occurring on the ice. Most models will account for what happened prior to the goal (e.g., rebound, rush chance, cross-ice puck movement, tip, power play, etc.). The private models that we're not privy to (except for occasional releases of stats) tend to account for more pre-shot occurrences, so you could certainly argue those are better (which is probably why at least 30 teams are using Sportlogiq's data to some extent or another). The public models aren't totally without value, however, in quantifying the quality of shots a team is generating or giving up in a game and over the course of a season.

I think one of my biggest complaints with these models being held as infallible is that the positioning of all skaters can play such a big role in scoring success, and they really can't account for that (and to be fair, they won't be able to until we have player tracking). I noticed that your Clear Sight link mentioned screens, so that would have been one of my examples. That has been one of my biggest complaints for years about these stats. I'm not sure how they are accounting for screens, but it is nice to see that added to the discussion.

To your point about luck versus skill, I won't dispute that the skill of the shooters on a team will have an impact on whether that team is over-performing or under-performing their expected goal totals. In that sense they can be useful, because over the course of the season teams that are over-performing their expected goal totals clearly have better finishing than teams that are under-performing their totals (e.g., the Kings). What I see when I look at the expected goal totals to date is that the Kings are doing a pretty good job generating the types of shots that historically go in the net, but that they suck at finishing. Some people might say, "well, duh, we already knew that," but I think it's useful to have a quantifying data point to highlight that (shooting percentage alone lacks all of the context of what's actually happening on the ice).

The thing about expected goals is that it really doesn't seem to deviate all that much from shots for. Let's take the 3-year stretch from 2018-2021. The top 3 teams in xGF% are Vegas, Carolina and Montreal, while the top 3 teams in SF% are Vegas, Carolina and Montreal. Last year the top 3 teams in SF% were Boston, Colorado and Florida, while the top 3 teams in xGF% were Colorado, Toronto and Boston (with Florida in 4th). So, it just seems to me to be a glorified shots-for that doesn't account for skill.
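There's a structural reason the rankings track each other: xGF% and SF% are the same arithmetic with different numerators. The team totals below are invented just to show the calculation, not actual numbers:

```python
def share_pct(for_total, against_total):
    """xGF% and SF% are both for / (for + against), as a percentage;
    only the inputs differ (weighted shots vs. raw shot counts)."""
    return 100 * for_total / (for_total + against_total)

# Invented season totals for one team, for illustration:
xgf, xga = 170.2, 140.1   # expected goals for / against
sf, sa = 1900, 1600       # shots for / against

print(round(share_pct(xgf, xga), 1))  # xGF%
print(round(share_pct(sf, sa), 1))    # SF%
```

Since xG is built from the same shot events SF% counts, just weighted by danger, a team's two shares only diverge when its shot quality differs a lot from its shot quantity.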
 
  • Like
Reactions: Chazz Reinhold

Raccoon Jesus

Todd McLellan is an inside agent
Oct 30, 2008
62,018
62,182
I.E.
The thing about expected goals is that it really doesn't seem to deviate all that much from shots for. Let's take the 3-year stretch from 2018-2021. The top 3 teams in xGF% are Vegas, Carolina and Montreal, while the top 3 teams in SF% are Vegas, Carolina and Montreal. Last year the top 3 teams in SF% were Boston, Colorado and Florida, while the top 3 teams in xGF% were Colorado, Toronto and Boston (with Florida in 4th). So, it just seems to me to be a glorified shots-for that doesn't account for skill.

Why would it? How do you score?

I agree that it doesn't account for skill and I'm not sure there's a model that really can--there will always be outliers. Ovy can have a bad shooting % season. Philip Danault can go on a heater. How would you differentiate between 'skill' and 'luck'? Edit: a lot of the more recent models use 3-year windows to try to eliminate some of that deviance but even so there are ebbs and flows to EVERYONE's careers.

There's a ton these models leave up to interpretation and I think they should be that way. The mistake isn't the stat, it's the application.

I don't see anything wrong with this stat. It tells me the Kings are expected to score more based on the chances they can generate, not that they SHOULD score more. It tells me they're leaving a LOT of offensive opportunities on the table and, in the past, someone like DL used that to, say, add Gaborik--you could exchange some of your 5% CF for a little less possession but a little more punch. In our case, it could be something as simple as *cough* adding Vilardi to a possession line, or finding a way to get more offense out of the Dmen.
 
Jun 30, 2006
5,536
2,219
Is there a metric for 'expected failure'? Maatta would be #1 and Brown would be in the top 10. It is true that Brown mixes in some 'unexpected success' moments. He actually does. But the 'expected failures' are 8x those.
Brown is on his last legs, I think he retires and becomes a Kings spokesperson for youth/charity etc.
 

crassbonanza

Fire Luc
Sep 28, 2017
3,264
3,137
Why would it? How do you score?

Haha, yeah I was more pointing out that despite all the work that has gone into developing these, you can get nearly the same amount of information by watching the shot tracker on a broadcast.

There's a ton these models leave up to interpretation and I think they should be that way. The mistake isn't the stat, it's the application.

I definitely agree that there is nothing wrong with the stat; it's another piece of information to discuss. I just don't believe it is infallible, and charts like the one JFresh posted can lead people to think there is luck at play rather than skill divides. It's kind of like the PDO discussion when that stat was more popular: at the end of the day it really just shows you that some teams are good and some teams are bad.
 

Raccoon Jesus

Todd McLellan is an inside agent
Oct 30, 2008
62,018
62,182
I.E.
Haha, yeah I was more pointing out that despite all the work that has gone into developing these, you can get nearly the same amount of information by watching the shot tracker on a broadcast.

I definitely agree that there is nothing wrong with the stat; it's another piece of information to discuss. I just don't believe it is infallible, and charts like the one JFresh posted can lead people to think there is luck at play rather than skill divides. It's kind of like the PDO discussion when that stat was more popular: at the end of the day it really just shows you that some teams are good and some teams are bad.



Yeah, I remember that discussion even when CF% was just starting to gain some steam... we were like, it's nothing profound. Darryl Sutter has been around for a century and the system was just 'outshoot the opposition and you'll win more often than not.' Obviously more recent stuff tries to parse that with shot location and other nuance, but at the end of the day it comes back to the same idea.

This forum is actually better than most at dealing with this stuff, believe it or not.

I can't even count how many times I see someone drop a JFresh card with no commentary and walk off like it was a mic drop.
 

unicornpig

Registered User
Dec 8, 2017
3,649
5,320
I really don't get why Tkachev is in the minors; he outplayed every Kings winger,
and I have a hard time believing that people like Grundstrom, Arvidsson or even Iafallo are an upgrade over Tkachev.
Because the head coach is a dinosaur.
 
  • Like
Reactions: dick341

Raccoon Jesus

Todd McLellan is an inside agent
Oct 30, 2008
62,018
62,182
I.E.
I just wonder how long till he heads home to Russia and till Vilardi/Frk get traded at this point.
 

KingsFan7824

Registered User
Dec 4, 2003
19,376
7,463
Visit site
Frk has to be like the backup QB of the Kings. He's 28, been through 3 organizations, none of them good at the time, has 120 NHL games to 303 AHL ones, but somehow the Kings are missing out by not playing him.

Frk getting traded? How often is he on waivers and nobody takes him? Traded to where?

Tkachev, same thing. He's 26, and was a wild card anyway. Sure, he's scoring in the AHL, but so is Tynan. Not many are clamoring for TJ Tynan.

They do have talent, and they're simply different from the same old same old that the Kings love to put on the ice, but maybe they're not NHL players. Well, how do you know if you don't play them? You need a spot in the lineup to play them. The bigger one-way contracts are going to play. The Kings, due to circumstance, are playing a handful of young players every game while trying to compete for a playoff spot. It's not going to be Brown sitting if Tkachev or Frk are in the lineup.
 
  • Like
Reactions: funky

Raccoon Jesus

Todd McLellan is an inside agent
Oct 30, 2008
62,018
62,182
I.E.
It was simply musing aloud about what's going to happen to the now-redundant parts that the Kings have devalued, not a cry for them to get into the lineup.
 

KINGS17

Smartest in the Room
Apr 6, 2006
32,381
11,264
Because the head coach is a dinosaur.

Hmmm, maybe.

 

kingsfan

President of the Todd McLellan fan club by default
Mar 18, 2002
13,384
1,032
Manitoba, Canada
I really don't get why Tkachev is in the minors; he outplayed every Kings winger,
and I have a hard time believing that people like Grundstrom, Arvidsson or even Iafallo are an upgrade over Tkachev.

You have a hard time believing a 26-year-old with four NHL games under his belt (and arguably only one good game of those four) isn't at least equal to, if not better than, guys like Arvidsson or Iafallo?

 
