Take a step back and listen to yourselves. If that were indeed true then it shouldn't make a difference if I use 1 year or 5 years or 9 years as long as they are the same years for each.
No, that's not accurate. If you're talking about an amalgam of different seasons, then the math changes. Assuming that a player who is ahead by 10% over 9 particular seasons in raw stats must necessarily be 10% ahead in adjusted stats means assuming that league scoring was at the exact same level in each of those seasons, which is precisely the variation adjusted scoring is intended to account for.
A player could build big "leads" by having his best years in high-scoring eras while the other player has his best years in low-scoring eras. If they have an equal number of best years against each other, of the same average margin, the first player will still come out ahead in raw totals, because it's easier to post big seasons in a high-scoring environment. Adjusted stats attempt to account for this.
To break this down to a very simple example:
Player A scores 60 goals in an 8.00 GPG environment. Player B scores 50 goals that same season. Player A is 20% better.
The next season, there's a shift in the scoring dynamics and the league now features 4.00 GPG. Player B scores 30 goals, and player A scores 25. Player B is 20% better.
By raw totals, Player A has 85 goals and Player B has 80, so Player A appears to be about 6% better, despite each player having one season of being 20% better in direct competition, and despite the two scoring the same share of league goals in total. If each player had one season of being the same amount ahead of the other, surely they should be considered equal overall.
Adjusting both seasons to a 6.00 GPG baseline gives A 45 and B 37.5 in the first season, and A 37.5 and B 45 in the second. That's 82.5 each, which in this hypothetical case seems the fairer result.
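The adjustment above can be sketched in a few lines of Python. This is just an illustration of the example, not any official adjusted-stats formula; the function name and the 6.00 baseline are my own choices, and the scaling is simply goals times (baseline GPG / that season's league GPG):

```python
BASELINE_GPG = 6.00  # arbitrary baseline chosen for the example

def adjusted_goals(goals, league_gpg, baseline=BASELINE_GPG):
    # Scale a season's goal total to the baseline scoring environment.
    return goals * baseline / league_gpg

# Season 1: 8.00 GPG league; Season 2: 4.00 GPG league
player_a = adjusted_goals(60, 8.00) + adjusted_goals(25, 4.00)  # 45.0 + 37.5
player_b = adjusted_goals(50, 8.00) + adjusted_goals(30, 4.00)  # 37.5 + 45.0

print(player_a, player_b)  # both come out to 82.5
```

Raw totals say 85 vs. 80; the environment-adjusted totals say the players are dead even, matching the intuition that one 20%-better season each should cancel out.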
Seriously...time to give yer heads a shake.
And this sort of crap doesn't help your case. If you have points make them.