Re: Mental Lapses
posted at 5/12/2011 2:44 PM EDT
In Response to Re: Mental Lapses
[QUOTE]In Response to Re: Mental Lapses : I think Bill's point requires much more exemplification. The example he does give shows a runner not playing the percentages but playing against the percentages: risking two outs to gain one base. If an out is worth a base, the man is risking two bases to gain one. That's a lousy percentage play, and ones like it will kill a team over the course of a season.
Posted by expitch[/QUOTE]
Here's my [i]exemplification[/i] (I absolutely love that word, which is why I italicized it).
Let's take this situation: man on 1st base, 1 out. A short blooper is hit between the right fielder and the 2nd baseman. If the runner runs immediately, he can make it to 3B (in reality, there are more variables here). If he waits for the ball to drop (or until he thinks he knows the ball will drop), he'll only make it to 2B. Makes sense so far, right?
We'll assume this situation is the early innings of a game, so scoring one run is not of the essence (a late-game situation would be completely different).
An "expected runs matrix" tells you how many runs were scored, on average, from each situation (baserunners and number of outs) through the end of the inning. It doesn't take lineup position into account (nor a host of other variables), but it illustrates my general point. Here are the run expectancies from 2005 (the most recent ones I could find):
1 out, runners on 1st and 2nd: .9143
1 out, runners on 1st and 3rd: 1.183
2 out, runner on 1st: .237
3 out, no runners on: 0 (this one I just threw in for fun, obviously)
Now, if the runner goes and the ball drops, there will be 1 out, runners on 1st and 3rd. If the ball is caught, there are 3 outs. Assuming it's 50/50 whether or not the ball drops, the "run expectancy" if the runner runs is:
1.183 (run expectancy if the ball drops) × .50 (chance the ball drops) + 0 (run expectancy if the runner is doubled off) × .50 (chance the ball is caught) =
.5915 expected runs
If the runner stays and the ball drops, there will be 1 out, runners on 1st and 2nd. If the ball is caught, there will be 2 outs, runner on 1st. So if the runner stays, the "run expectancy" is (using the same formula as above):
(.9143 × .5) + (.237 × .5) = .57565 expected runs
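The comparison above is just a weighted average over the two outcomes. Here's a minimal sketch of that arithmetic, using the same 2005 matrix values and the 50/50 drop estimate from the post (the function names are mine, just for illustration):

```python
# Run expectancies from the 2005 matrix quoted above.
RE_1OUT_1ST_3RD = 1.183    # 1 out, runners on 1st and 3rd
RE_1OUT_1ST_2ND = 0.9143   # 1 out, runners on 1st and 2nd
RE_2OUT_1ST = 0.237        # 2 out, runner on 1st
RE_3OUT = 0.0              # 3 outs, inning over

def expected_runs_if_runner_goes(p_drop):
    """Ball drops -> 1st and 3rd, 1 out; caught -> runner doubled off, inning over."""
    return p_drop * RE_1OUT_1ST_3RD + (1 - p_drop) * RE_3OUT

def expected_runs_if_runner_stays(p_drop):
    """Ball drops -> 1st and 2nd, 1 out; caught -> 2 outs, runner still on 1st."""
    return p_drop * RE_1OUT_1ST_2ND + (1 - p_drop) * RE_2OUT_1ST

print(round(expected_runs_if_runner_goes(0.5), 5))   # 0.5915
print(round(expected_runs_if_runner_stays(0.5), 5))  # 0.57565
```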
There is certainly much more to the argument. The moment at which the runner decides to "run" determines the "percentage" chance that he makes it to the base. Obviously all the "run expectancy" values change with the exact game situation (pitcher, lineup spot, etc.). I'm not even trying to argue that if there's a 50% chance the ball drops in this situation, the runner should run; I think it's really pretty inconclusive and game-situation dependent. Furthermore, having a runner try to "compute" all these percentages in real time would be extremely difficult.
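One way to see how sensitive the call is: using the same 2005 matrix values, you can solve for the break-even drop probability where going and staying have equal expected runs (set p × 1.183 = p × .9143 + (1 − p) × .237 and solve for p). This is just arithmetic on the numbers already quoted, not a claim about any real game:

```python
# Break-even probability that the ball drops, given the 2005 matrix values.
# Going: p * 1.183 + (1 - p) * 0.
# Staying: p * 0.9143 + (1 - p) * 0.237.
# Setting them equal and solving for p:
p_break_even = 0.237 / (1.183 - 0.9143 + 0.237)
print(round(p_break_even, 3))  # 0.469 -- roughly a 47% drop chance
```

Above about a 47% drop chance, going for 3B is the better percentage play under these (admittedly simplified) assumptions; below it, staying is.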
All I'm trying to exemplify here is that there are some situations in which getting doubled off is actually an OK "percentages play" and not a poor decision. It's really the same thing as going for an extra base on a single, since only one additional out is truly being risked (if the ball is caught, that's already one out anyway). The same goes for getting picked off.
My guess is that those "rules" were created because managers didn't trust their baserunners to make the right split-second decisions, so they simply told the runners to take the low-risk option. It's the same reason football teams almost always punt on 4th down in their own territory (and oftentimes even in the opponent's territory).