It occurred to me a few months ago that relief pitchers, everything else being equal, should have lower ERAs than starting pitchers. This is because your ERA should be lower if you enter an inning with 1 or 2 outs (instead of starting the inning with no outs). Now, in this day and age a lot of teams have a set-up man who will start the 8th inning, and the closer will start the 9th. But, still, there are definitely many times when relief pitchers will come into a game with 1 or 2 outs and men on base (or not). Remember, though, as far as the reliever's own ERA goes, it doesn't matter whether there are men on base or not. If he comes in with 3 men on base and 2 outs and gives up a grand slam, he's only charged with 1 run.
To start with, I’ll do a relatively simple experiment that demonstrates that entering with 1 or 2 outs will lower your ERA. For this demonstration I’ll simplify things. Suppose a pitcher only walks or strikes out batters; there are no hits or stolen bases. Simply put, the other team scores if there are 4 or more walks in an inning. Obviously a game is not this simple, but it’ll be fine for our demonstration purposes.
Suppose that a pitcher walks a batter 1 out of every x times. Another way to put it is that for each batter, the walk probability is B = 1/x and the strikeout probability is K = 1 – B. The question is, what's the expected number of runs scored in each inning? We'll call this E. Then:
E = 0 * probability(0 runs) + 1 * probability(1 run) + 2 * probability(2 runs) + …
E = 1 * probability(4 walks) + 2 * probability(5 walks) + …
E = 1 * (15B^4K^3) + 2 * (21B^5K^3) + … + r * ((r + 5) choose (r + 3)) * B^(r+3) * K^3 + …
E = ((K^3) / 2) * (sum for all r (from 1 to infinity) of (r * (r + 5) * (r + 4) * B^(r+3)))
Rather than trying some fancy math to simplify this, I used my Euclid calculator to calculate E for various values of B.
If B = 1/2, then E = approximately 0.9375 (for an ERA of 8.44).
If B = 2/5, then E = approximately 0.374 (for an ERA of 3.36).
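If you'd rather not trust my series arithmetic, here's a small Python sketch that just truncates the sum directly. The function name is my own; the probability of exactly r runs is the chance of r + 3 walks occurring before the 3rd strikeout, which is the ((r + 5) choose 2) * B^(r+3) * K^3 term from the equation above:

```python
from math import comb

def expected_runs_full_inning(B, terms=200):
    """Expected runs in a full inning of the simplified model.

    Exactly r runs means r + 3 walks occur before the 3rd strikeout,
    which happens with probability C(r + 5, 2) * B^(r+3) * K^3.
    Truncating at `terms` is fine since B < 1 makes the tail negligible.
    """
    K = 1 - B
    return sum(r * comb(r + 5, 2) * B ** (r + 3) * K ** 3
               for r in range(1, terms + 1))

for B in (1 / 2, 2 / 5):
    E = expected_runs_full_inning(B)
    print(f"B = {B}: E = {E:.4f}, ERA = {9 * E:.2f}")
```

Running this reproduces the two values above (E ≈ 0.9375 and E ≈ 0.374, with ERA = 9E).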
Suppose that instead of entering with 0 outs, the pitcher enters with 2 outs. Then the expected number of runs (F) in that 1/3 of an inning (so the corresponding ERA is 27F) is:
F = 0 + 1 * (B^4K) + 2 * (B^5K) + … + r * B^(r+3) * K + …
F = K*(sum for all r (from 1 to infinity) of r * B^(r+3))
If B = 1/2, then F = approximately 0.125 (for an ERA of 3.375).
If B = 2/5, then F = approximately 0.043 (for an ERA of 1.152).
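The same truncation trick checks the 2-outs case. As a sanity check, this series actually has a simple closed form: using the standard identity sum of r * x^r = x / (1 – x)^2, the sum collapses to F = B^4 / K. The function name below is again my own:

```python
def expected_runs_two_outs(B, terms=200):
    """Expected runs entering with 2 outs in the simplified model.

    Exactly r runs means r + 3 walks occur before the (single remaining)
    strikeout, which happens with probability B^(r+3) * K.
    """
    K = 1 - B
    return sum(r * B ** (r + 3) * K for r in range(1, terms + 1))

for B in (1 / 2, 2 / 5):
    F = expected_runs_two_outs(B)
    # Closed form: K * sum(r * B^(r+3)) = B^4 / K, via sum(r*x^r) = x/(1-x)^2
    assert abs(F - B ** 4 / (1 - B)) < 1e-9
    print(f"B = {B}: F = {F:.4f}, ERA = {27 * F:.3f}")
```

This reproduces F ≈ 0.125 and F ≈ 0.043, and multiplying by 27 (since F covers a third of an inning) gives the ERAs of 3.375 and 1.152.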
So, clearly, there’s a huge difference between entering with 0 outs and entering with 2 outs in this simplified example.
Of course, in a real game, the difference will probably not be as dramatic. For example, a home run scores a run whether there are 0 outs or 2 outs. So if you did the same experiment I just did, but only allowing a pitcher to have strikeouts or home runs, then the ERA should be the same regardless of the number of outs.
I’m not sure how much difference there is in actual practice. I am sure it is significant, but I don’t know how much. A wild guess would be that if the average ERA is 4.50 when entering with 0 outs, then it might be 3.50 when entering with 1 out and 2.50 when entering with 2 outs. But that’s just a wild baseless guess. If someone with more patience than me wanted to figure this out, they could simply scour the box scores of MLB games and crunch the data. You’d probably have to go through a whole season to get statistically meaningful data. This could also be done with some sort of simulation program to simulate a real baseball game. That sounds like fun. Maybe I’ll do that sometime.
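As a modest first step toward that simulation idea, here's a Monte Carlo sketch of the simplified walk-or-strikeout model (not a real baseball game), which lets you estimate expected runs for any number of outs already recorded, including the 1-out case I only guessed at above. The function name and parameters are my own invention:

```python
import random

def simulate_runs(B, outs_already=0, trials=200_000, seed=1):
    """Monte Carlo estimate of expected runs in the simplified model.

    Each batter walks with probability B, otherwise strikes out; a run
    scores on every walk once the bases are loaded (walks 4, 5, ...).
    The half-inning starts with `outs_already` outs and empty bases.
    """
    rng = random.Random(seed)
    total_runs = 0
    for _ in range(trials):
        outs, walks = outs_already, 0
        while outs < 3:
            if rng.random() < B:
                walks += 1
                if walks >= 4:
                    total_runs += 1
            else:
                outs += 1
    return total_runs / trials

# These should land near the series results: E ≈ 0.9375 with 0 outs
# and F ≈ 0.125 with 2 outs, when B = 1/2.
for outs in (0, 1, 2):
    print(f"{outs} outs: {simulate_runs(0.5, outs_already=outs):.4f}")
```

Extending this toward a real game would mean adding hits, home runs, and baserunner advancement, but even this toy version confirms the pattern: expected runs drop sharply as the number of inherited outs goes up.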
Note: I’ve only checked the above equations a couple of times, so it’s entirely possible I made some errors; feel free to point them out if you spot any.