Thinking about offense in April

I’m working on a USA Today Sports Weekly piece, and there sure is a lot of interesting stuff that happens in April.

We know that relievers have tossed an increasingly large fraction of innings over the years. In the 1970s, relievers tossed about 20% of all innings; in recent years, the figure has been closer to 24%. That may not seem like much, but with a little over 43,000 innings pitched league-wide last year, those extra 4 percentage points represent more than 5,000 outs recorded by relievers instead of starters.
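As a quick back-of-the-envelope check, here is a minimal sketch using the approximate figures above rather than actual league totals:

```python
# Back-of-the-envelope version of the claim above, using the approximate
# figures from the text rather than actual league data.
total_innings = 43_000      # a little over 43,000 innings league-wide last year
share_1970s = 0.20          # relievers' share of innings in the 1970s
share_recent = 0.24         # relievers' share in recent years

extra_innings = total_innings * (share_recent - share_1970s)
extra_outs = extra_innings * 3          # three outs per inning

print(f"Extra relief innings: {extra_innings:,.0f}")  # ~1,720
print(f"Extra relief outs:    {extra_outs:,.0f}")     # ~5,160
```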

It’s interesting to break relief pitching down a little further, comparing all MLB games through the end of April against the remainder of the regular season. Here is the percentage of innings pitched by relievers under those two conditions:

[Figure: Percentage of innings pitched by relievers, April vs. rest of season, by year]

Both lines trending up over the years confirm what we already knew: relievers are throwing more and more innings, in April and in the rest of the season alike. Incidentally, the April spike in 1995 is due to the delayed start of that season after the 1994-95 players’ strike: a small sample of games, plus pitchers who had very little spring training to get ready.

Relievers have tended to be more effective than starters, especially as bullpen specialization (the ninth-inning closer, the eighth-inning setup man, the LOOGY) has increased. Because relievers’ roles are better defined, they can warm up more efficiently and don’t have to worry about leaving anything in the tank. That means harder throwing, more strikeouts, and less offense.

Take a look at the ratio of the two lines in the plot above, which quantifies how much more of the pitching relievers do in April versus the rest of the season. For example, if in a given year relievers threw 30% of April innings and 20% of innings the rest of the season, that would register as 150% for that year, since their April share was 1.5 times their rest-of-season share (on a percentage basis).
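In code, the calculation behind the next plot looks like this, a minimal sketch using the hypothetical 30%/20% figures from the example rather than real data:

```python
# Ratio of relievers' April innings share to their rest-of-season share,
# using the made-up 30% / 20% example from the text.
april_share = 0.30   # relievers' share of April innings
rest_share = 0.20    # relievers' share of post-April innings

ratio = april_share / rest_share
print(f"April-to-rest relief ratio: {ratio:.0%}")  # 150%
```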

[Figure: Ratio of relievers’ April innings share to their rest-of-season share, by year]

Throughout the 1970s, relievers’ share of April innings was about 1.45 times their share during the rest of the season (again, on a percentage basis). In the 1980s, it was about 1.40. In the 1990s, when offense spiked, so did relievers’ innings (starters were getting knocked out like crazy), but over the 2000s the value dipped as low as 1.30, and it has hovered around that level in recent seasons.

This makes sense for a couple of very different reasons:

  • First of all, there are only so many innings. It’s hard to imagine starters averaging less than 6 IP per start, league-wide. So whereas they once averaged 8 IP per start, that slid down to 7, then down to 6.5, and now is approaching 6. But it just can’t fall much lower. So, we’ll likely never reach the point where relievers are tossing 40% or more of innings over any significant period of time (the sketch after this list shows the arithmetic).
  • Secondly, consider the patterns of relief pitching over the years. When complete games were still common (think 1950s through 1970s on these graphs), pitchers at the start of the season were a little less likely to toss a complete game because they weren’t yet stretched out. So, in years gone by, relievers were more likely to appear in April than later in the season, hence the larger percentage difference during those eras. Once we got into the 1980s, complete games became less common overall, and relievers started appearing more consistently throughout the season, April and otherwise. Now that the complete game has, for all intents and purposes, gone the way of the dodo, relievers appear all the time, season-long, and the difference between April and the rest of the season has diminished.
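
To make the first point concrete, here is a rough sketch of how starter workload translates into the bullpen’s share of innings, assuming nine-inning games and ignoring extra innings:

```python
# Rough relationship between starter workload and the bullpen's share of innings.
# Assumes a nine-inning game; extra innings and shortened games are ignored.
def reliever_share(ip_per_start: float, innings_per_game: float = 9.0) -> float:
    """Fraction of innings left to relievers when starters average ip_per_start."""
    return 1 - ip_per_start / innings_per_game

for ip in (8.0, 7.0, 6.5, 6.0, 5.4):
    print(f"{ip:.1f} IP/start -> relievers throw {reliever_share(ip):.0%}")
# 8.0 -> 11%, 7.0 -> 22%, 6.5 -> 28%, 6.0 -> 33%, 5.4 -> 40%
```

In other words, starters would have to average under about 5.4 innings per start before relievers reached 40% of the workload.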

Do you realize what this means? It means the notion that “pitchers are ahead of hitters early in the season” is wrong. The more likely explanation is that relievers were used relatively more in April in years gone by, and were more effective, hence the lower April offense; now that relief usage differs much less between April and the other months, there is little difference between April offense and that of the rest of the season.

11 Comments
Doug
Editor
8 years ago

Pitchers (starters and relievers combined) have posted a better ERA in April than for the whole season in 53 of 86 seasons (62%) since 1930. So that would certainly support the conventional wisdom that pitchers are ahead of hitters in the early going.

OTOH, for the most recent past, with its larger share of innings thrown by relievers, it’s a different story: a better ERA in April than for the whole season in only 10 of 22 seasons since 1994. And in 7 of those 10 seasons, the April ERA was better than the season ERA by less than 0.1 runs.
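
A minimal sketch of how such a season-by-season tally works; the ERA pairs here are made-up placeholders, with the real values coming from a source like Baseball-Reference:

```python
# Count seasons in which the league-wide April ERA beat the full-season ERA.
# The (April ERA, full-season ERA) pairs below are illustrative placeholders.
era_by_season = {
    1994: (4.51, 4.62),
    1995: (4.70, 4.45),
    1996: (4.40, 4.68),
}

better_in_april = sum(
    1 for april_era, season_era in era_by_season.values() if april_era < season_era
)
print(f"April ERA better in {better_in_april} of {len(era_by_season)} seasons")
```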

brp
8 years ago
Reply to  Doug

I had read a long time ago that offense is generally lower in April and Sept/Oct anyway. Something to do with the ball not travelling or carrying as well when the weather is colder.

Doesn’t seem to be much of a factor here, though.

Doug
Editor
8 years ago

Best active pitchers in April, relative to rest of season, with a minimum of 100 career IP in March/April.

Rk  Player          Split        G   ERA   ERAtot  Diff   W   L  W-L%  IP     IPtot
1   Ricky Romero    April/March  19  2.63  4.09   -1.46   9   4  .692  130.0  794.0
2   Jaime Garcia    April/March  20  2.17  3.38   -1.21  10   3  .769  124.2  535.0
3   Doug Fister     April/March  20  2.43  3.57   -1.14   9   5  .643  126.0  860.2
4   Zack Greinke    April/March  43  2.26  3.25   -0.99  22   7  .759  263.0  1771.2
5   Andrew Cashner  April/March  31  2.51  3.47   -0.96   5   9  .357  107.2  540.0
6   Yu Darvish      April/March  16  …
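
A note for reading the IP columns above: box scores record partial innings in thirds, so “124.2” means 124⅔ innings. A hypothetical helper (not part of the Play Index output) to convert:

```python
# Convert box-score innings-pitched notation (x.0, x.1, x.2 = thirds of an
# inning) into a true decimal number of innings.
def ip_to_innings(ip: float) -> float:
    whole = int(ip)
    thirds = round((ip - whole) * 10)  # 0, 1, or 2 outs into the inning
    return whole + thirds / 3

print(ip_to_innings(124.2))  # 124.666... (124 2/3 innings)
print(ip_to_innings(130.0))  # 130.0
```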

Ken S.
8 years ago

“So whereas they once averaged 8 IP per start, that slid down to 7, then down to 6.5, and now is approaching 6.”

The average IP/start in 1913, the earliest season for which Retrosheet has such splits, was 7.18. Are there estimates somewhere for years earlier than this? It would be interesting to know when it was last 8 innings.

Doug
Editor
8 years ago
Reply to  Ken S.

According to B-R’s Play Index, we’ve already crossed below the 6 IP per start threshold and haven’t been above 7 IP since the 1940s.
[Chart: Average IP Per Start, 1930-2015]

Dr. Doom
8 years ago
Reply to  Andy

I don’t know; in the last 70 years, we’ve lost one inning per start. When you hear people talk about it, they certainly make it SEEM like we’ve lost 2-3 innings in the last 40 years. I find it fascinating that the change is as small as it is. Also, there’s been very little change since the mid-90s, so things may be stabilizing. I wonder when a manager is going to come along and change bullpen usage again. I imagine it’s only a matter of time until someone decides that a roster might be more valuable with 15 position players and 10…

Paul E
8 years ago
Reply to  Andy

Andy,
Check out the increase in complete games from 1970 to 1971 (22% to 28%). Obviously, 1970 evidenced a great offensive explosion, but it’s kind of odd that the DH (1973 onward) didn’t increase that percentage of CGs at all.

Dr. Doom
8 years ago
Reply to  Paul E

I guess I don’t find it that surprising. In a non-DH league, there’s an incentive for a pitcher to “complete” an inning, especially if his spot is due up in the lineup the next half-inning, so you don’t end up wasting a reliever to get just one out or, even worse, watching that reliever hit. In a DH league, there’s no such incentive, so I would guess mid-inning pitching changes are more common. It probably all washes out in the end. No research on that; it’s just a guess.

Doug
Editor
8 years ago
Reply to  Dr. Doom

This is one reason why the difference in average start length isn’t greater.
[Chart: Percent of Starts of 1 IP or Less, 1913-2015]
Don’t know whether we have a more uniform caliber of starting pitcher in recent years or just more patient managers, but those really short starts in years past served to balance out all those complete games that we so seldom see today.