“Baseball altered the rules [in 1969] in response to the decline in offense, lowering the mound, and if Major League Baseball wants something other than general parity and games with fewer runs, it will probably have to revisit this — perhaps lowering the mound again, or changing the composition of the ball.”
Kepner details the numbers and possible causes:
“[The] batting average, .251, is the lowest since 1972, the year before the creation of the designated hitter. … The 4.13-runs-per-game average is a full run lower than in 2000 … and the slugging percentage, .390, is at its lowest point since 1992. … Teams encourage talented pitchers to throw harder for shorter stretches in the bullpen rather than pace themselves as starters.”
How you view today’s scoring depends largely on what seems normal to you. So, full disclosure: I grew up in the 1970s. My first vivid memory of a stretch of MLB games is the 1972 ALCS, which averaged two runs per team in a full-distance thriller, with eight of the 20 total runs scored in extra innings. And even though Oakland broke my Tigers-fan heart, I watched the A’s nip the Reds in the ’72 World Series, scoring 2.3 runs per game, with three runs or fewer in all four wins.
Still, I’m no big fan of routine 2-1 games. My sense of normal was formed more by the rest of that decade, starting with the AL’s introduction of the designated hitter, and by the ’80s. For the brand of baseball on display, my favorite year was 1980. (Check out the batting and pitching leaders.)
Anyway … In the first 20 years of the DH — 1973-92, or the “pre-PED era” — the combined scoring average was 4.26 runs per game. Seven of those seasons averaged less than 4.15 R/G, and just one reached 4.50 (homer-happy ’87).
1978: Jim Rice, Dave Parker, and 4.1 runs per game
Was there a cry for more offense after 1978 averaged just 4.10 R/G? I recall it as an exciting and entertaining year: There were three close pennant races. Two worthy batting champs, in Rod Carew (.333, on the heels of his .388) and MVP Dave Parker (.334-30-117). Jim Rice topped 400 total bases, the first to do so since Bad Henry was young. Ten guys cracked 30 homers; George Foster reached 40 HRs and 120 RBI for the second straight year. Twenty-one players swiped 30 bags, with four over 50; Ron LeFlore stole 68, and scored 126 runs. Ron Guidry’s 1.78 ERA stood out, as the next best were 2.27 and 2.36. J.R. Richard fanned 303, but no one else topped 260. Starters completed just under one-fourth of their games; only Knucksie Niekro reached 300 innings.
There was no sense of pitchers ascendant. No one that I know of called for changes in the game after 1978 — probably because 4.10 runs per game was about average to that point in the DH era.
What are we really talking about?
I think the subtext of today’s gripes is not runs, per se, but the dearth of sequential offense: low batting averages, and ever-soaring strikeout rates. Some even mourn the drop in total home runs since the PED era — even though the overall HR percentage is still higher than all but seven seasons before 1994, and homers as a percentage of batted balls are higher than any pre-PED season save ’87.
But if we do push for more scoring in general, let’s take care to aim for the kind of baseball we like to watch. If strikeouts are part of the problem, how would juicing the ball or trimming the mound discourage the “grip’n’rip” batting style?
This modern approach to batting is broadly blamed for the high K rate, and that’s surely a big factor. But it’s also quite clear that today’s pitchers, facing those hitters, are more effective in shorter stints — and teams are exploiting that fact more than ever. This year, relief outings of one inning or less account for 21% of all innings pitched. In 1989, that figure was 8%. This year, just past midseason, 145 pitchers already have 20 or more such appearances, and have averaged (in those games) 9.2 strikeouts per nine innings — well above the overall relief average of 8.5 SO/9, and fully 25% above the starters’ rate of 7.4 SO/9.
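The strikeout comparison above is simple arithmetic on the quoted rates. A quick sketch, using the season-to-date figures cited in the text (assumed as given, not recomputed from raw play-by-play data):

```python
# SO/9 rates quoted above for this season, just past midseason.
short_stint_so9 = 9.2   # relievers with 20+ outings of one inning or less, in those games
relief_so9 = 8.5        # all relief innings
starter_so9 = 7.4       # all starter innings

# How much more often do short-stint relievers fan batters than starters?
edge_over_starters = (short_stint_so9 - starter_so9) / starter_so9
print(f"Short-stint relievers strike out {edge_over_starters:.0%} more batters per nine than starters")
# 9.2 / 7.4 works out to roughly a quarter more strikeouts per nine innings
```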
Note the results for a pitcher’s first batter faced in a game, with relievers comprising three-fourths of the total this year:
- First batter: 22% Ks … .243 BA … .683 OPS
- All others: 20% Ks … .252 BA … .708 OPS
What’s more, the percentage of “first batters faced” has risen from 7% of all batters in 1988 to 10% in 2013.
How did we get here?
The average number of relievers per team game began rising sharply in the early ’90s, commonly blamed on the increase in scoring. But although scoring began its steady decline in 2008, there’s been no drop in relief outings since then. And it’s easy to see why:
- 1988 — 1.75 relievers … scoring avg. 4.14 R/G … relief ERA 8% less than starters
- 1993 — 2.27 relievers … scoring avg. 4.60 R/G … relief ERA 5% less than starters
- 1998 — 2.46 relievers … scoring avg. 4.79 R/G … relief ERA 9% less than starters
- 2003 — 2.67 relievers … scoring avg. 4.73 R/G … relief ERA 9% less than starters
- 2008 — 2.92 relievers … scoring avg. 4.65 R/G … relief ERA 8% less than starters
- 2013 — 2.93 relievers … scoring avg. 4.17 R/G … relief ERA 12% less than starters
Scoring is now where it was in 1988. Relievers’ share of total innings has grown only from 29% to 33%. But there are 67% more relief appearances now, as their average duration has shrunk from 6.4 batters to 4.4. As a result, their edge in effectiveness is greater than ever.
So, what should be done?
Perhaps there’s a way to discourage so many pitching changes. One idea for the AL (although the union would fight it) is to link the DH to the starting pitcher: Once the starter exits, the reliever must take a spot in the batting order. He wouldn’t have to take the DH’s spot, as rules already permit the DH to move to a fielding position, with the pitcher batting in place of the exiting fielder. This would be some deterrent to pitching changes generally, but especially to mid-inning changes, as skippers would have to make multiple lineup decisions without knowing what game situation they’d face in their next at-bats. It also might help swing the roster balance back towards position players, as deeper benches would be needed to keep the relievers from having to bat.
A modest idea for the NL, to discourage mid-inning pitcher changes, would be to ban the “double-switch” solely in those situations: A pitcher brought in mid-inning would have to bat in the spot occupied by the current pitcher. This might not have as big an impact as the DH innovation, but it would target situations in which a fresh pitcher has the greatest drag on scoring, whether by gaining a platoon edge (see Doug’s recent study), or simply by being at top strength.
Other rules could be tried to restrict mid-inning changes. So far this year, 28% of all relief stints have lasted two outs or fewer, and 16% have lasted two batters or fewer. Both have doubled in frequency since 1989. What if a reliever had to stay for at least three outs or three batters, with exemptions for injury and perhaps one free exemption per game? As a bonus, such limits would reduce one of the dullest aspects of watching a major league game.
If the goal is to get scoring levels closer to historical norms, the best approach might not be to tinker with the balls, bats or mounds, but to encourage a return to how the game was played before 1990. I’m no grandpa grouching “get off my lawn!” I just think that curbing short relief stints would lift the number of balls hit onto the lawn. It might pressure starters into a little more pacing, to go a little deeper into games. Whether the score winds up 4-2 or 8-5, our game should be more about batted balls, fielding and running, and less of this endless parade of fresh arms blowing 95-mph smoke past a couple of batters.
Now, here’s where you point out what I’ve overlooked.