1917-18: Strikeouts on the wane … but why?

Since 1890, these are the largest annual changes in strikeout percentage:

  • 1893, -38%
  • 1901, +32%
  • 1903, +21%
  • 1946, +20%
  • 1918, -17%

And these are the largest annual declines in strikeout percentage since 1890:

  • 1893, -38%
  • 1918, -17%
  • 1899, -10%
  • 1917, -9%

The three largest changes in SO% correspond with major rule changes. In 1893, the pitching distance was increased to 60′ 6″, and strikeouts plummeted. In 1901, the foul-strike rule was adopted by the National League, and the American League followed suit in 1903, with each move bringing a large rise in strikeouts.

The 1946 spike likely has much to do with the postwar return of the regular players — notably Bob Feller, whose 348 Ks in 1946 were one shy of the modern record and accounted for 22% of the raw SO increase over 1945 — and of regular ball-making materials. It still represents an 8% rise compared to 1940, which in turn was the highest rate since 1916. Hal Newhouser, perhaps the most notable holdover from wartime, averaged 5.4 SO/9 in 1944 and 6.1 in ’45, then surged to 8.5 SO/9 in 1946.

But what caused the large drop in SO% in 1917-18?


By 1903 both leagues had adopted the foul-strike rule, and from then through 1916, the MLB strikeout percentage ranged from 9.5% to 10.8%, with no annual change greater than 7%. But in 1917 the rate dipped by 9% — the 4th-biggest drop in modern times — to reach its lowest mark since 1902. A further plunge of 17% in 1918 — the biggest drop outside of the sea change brought by the modern pitching distance — lowered the rate to 7.7%, the lowest since either league counted fouls as strikes. That was 25% below the 1916 rate, and 23% below the 1903-16 average.

Here’s a chart, for you visual types. I’m told that 1917 is purple and 1918 is red:

MLB SO Pct 1904-33


The effect was seen broadly:

  • The AL rate fell by 22% from 1916-18, the NL rate by 27%.
  • Out of 97 batters with 200+ PAs in both 1916 and 1918, 70% saw their SO rate fall at least 20% over the two years, while just 3% had a rise of 20% or more. A dip of at least 10% was 11 times as common as a similar increase (80% vs. 7%).
  • Out of 61 pitchers with 50+ IP in both 1916 and 1918, 72% saw their SO rate fall at least 20% over the two years, while just 5% had a rise of 20% or more. A dip of at least 10% was 10 times as common as a similar increase (84% vs. 8%).
  • In 1916, the top 10 marks in SO/9 ranged from 5.9 to 4.7; the 1918 range was 5.2 to 3.6. And note the progression of the 1916 SO/9 leaders; every one declined in 1917 and again in 1918:
Rk  Player             1916  1917  1918
 1  Larry Cheney        5.9   4.4   3.7
 2  Walter Johnson      5.6   5.2   4.5
 3  Allen Russell       5.5   4.7   3.4
 4  Lefty Williams*     5.5   3.3   2.6
 5  Harry Harper*       5.4   5.0   2.9
 6  Tom Hughes          5.4   4.9   n/a
 7  Elmer Myers         5.2   3.9   1.6
 8  Bullet Joe Bush     4.9   4.7   4.1
 9  Claude Hendrix      4.8   3.4   3.3
10  Fred Anderson       4.7   3.8   3.1
11  Dutch Leonard*      4.7   4.4   3.4
12  Al Mamaux           4.7   2.3   n/a
13  Rube Marquard*      4.7   4.5   3.4
14  Babe Ruth*          4.7   3.5   2.2
“n/a” = less than 70 IP
Table provided by Baseball-Reference.com.

But what was the cause? I’ve been searching all day, but I haven’t found it yet.

The size and breadth of the SO drop suggest some change in basic conditions. If the ban on doctored baseballs and the practice of keeping a fresher ball in play could be dated to 1917-18, that alone might explain the decline in strikeouts. But every source I can find dates those changes to the period of 1919-21, and especially to the death of Ray Chapman in August 1920.

The 1918 season was shortened by a month due to World War I, but I can’t see how the lack of September baseball by itself would affect strikeout rates. And while some players did lose time to WWI service or to defense-related jobs, the numbers were nothing like what was to come in the next war.

During 1917-18, there was no change in the official strike zone definition, nor any other significant rule changes to the game on the field.

I can’t rule out changes in visibility at existing stadiums, but no team changed parks from 1916 to ’18.

It’s interesting that the drop in strikeouts was not matched by a significant rise in batting average. The 1916 BA was .2476, rising to .2538 in 1918, a gain of just 2.5%. By comparison, the 1893 drop of 38% in SO% was accompanied by a 14% rise in BA, and the 1899 drop of 10% in SO% came with a 4% rise in BA. It’s also curious that home-run percentage fell by 25% from 1916 to 1918, but slugging average was virtually unchanged.

Whatever the causes, the 1918 strikeout rate became the “new normal.” The MLB SO% stayed between 6.9% and 8.2% from 1918 through 1933.

If you have any ideas on the cause of the 1917-18 drop in strikeouts, let us hear them!

50 thoughts on “1917-18: Strikeouts on the wane … but why?”

  1. 1
    Ed says:

    Don’t know but the decline was much larger in the NL (3.6 to 2.8; about 22%) than in the AL (3.3 to 2.9; about 12%).

    • 11
      John Autin says:

      Ed — The AL split the decline more evenly across the two seasons. But from 1916 to 1918, the AL strikeout rate fell by 22%. You’ve listed just the 1917-18 rates.

      • 13
        Ed says:

        Yeah sorry John. I originally just skimmed the post and missed the fact that you were talking about a 2-year decline. Anyway, if you look at my #6, the decline from 1917-1918 was clearly a systematic decline. I haven’t looked at 1916-1917 but I would guess that it would show the same thing. Puzzling to say the least. I’ve done some Google searches and don’t see any mentions of this phenomenon. Nor do I see it mentioned in Bill James’ Historical Abstract.

  2. 2
    Insert Name Here says:

    Apparently, there is something called “the Babe Ruth theory” for why the Deadball Era ended (http://en.wikipedia.org/wiki/Dead-ball_era#The_end_of_the_dead-ball_era), which claims that Babe Ruth’s success inspired other players to change their hitting methods, ending the Deadball Era through the increased power numbers brought along by Ruthian hitting. However, the Babe did not become a star as a batter until 1918, so this wouldn’t have affected 1917 strikeout rates, although it could have helped the low rates to become the norm.

    • 9
      Hartvig says:

      Yeah, but that doesn’t exactly explain why strikeouts went down. In fact, I know I’ve read quotes from about as far back as I can remember reading about baseball (the early 60’s) of people lamenting high strikeout rates, and some about Ruth himself- who did manage to lead the league in strikeouts 5 times and finish 2nd another 7- especially in debates about whether Ruth or Ty Cobb was the greatest ever.

      I have long believed that one of the biggest changes in baseball over time has been a gradual increase in the overall talent level, especially at the bottom end of the spectrum. There is simply no way in the current era- regardless of what you may say about Jeff Mathis- that a player the caliber of Bill Bergen essentially holds a full time position for over a decade. And as awful as Bergen and his career OPS+ of 21 were, the only thing that made him unique was how long he was able to get away with it, not how bad he was. The further back you go the more ridiculously overmatched players who had no business playing MLB you will find and the more playing time they will get. It’s my belief that over time improvements in scouting and the development of the farm systems have significantly reduced the number and playing time of these way-out-of-their-league players.

      But even if what I believe about player quality is true, that does nothing to explain such a sudden drop in strikeouts in such a short period of time unless there’s some way to figure out who the worst players in the league were in 1915 & 16 and who replaced them in 1917 & 18. And even if that turns out to be the explanation- and I kind of doubt that it will- it still would do nothing to explain why all these improved players appeared so suddenly, since I’m aware of no big changes in scouting or the farm systems around that time.

      I do have to admit that the fact that strikeout rates were so much higher for most of the deadball era than they were in the 1920’s and even into the 30’s came as a complete surprise to me.

      • 22
        Insert Name Here says:

        Re: player quality and worst players:

        From 1915 to 1916, there was an increase in quality of players due to the folding of the Federal League, which had most of the worst players. The number of players with -1.0 WAR or less decreased by over 70% (down from 27 to 8), although the number of players with negative WAR only decreased by less than 40%. The eight players with -1.0 WAR or less in 1916 were Eddie Mulligan, Lee King, Doug Baird, Zip Collins, Doc Johnston, Jack Tobin (yes, the same Jack Tobin who was a star RF in the early 1920s), High Pockets Kelly (yes, the HOFer), and Fritz Mollwitz (sadly, no, Fritz was not his given first name). Here’s how they fared over the next two years:

        – Mulligan and King, the two worst players of 1916 by WAR, did not play in 1917 nor in 1918 (Mulligan’s 1916 SO%: 14.5%! King’s, however, was 9.6%)
        – Kelly and Collins played very few games in 1917, and none in 1918 (Collins’ 1916 SO%: 14.2 %! Kelly’s 1916 SO%: 28.6%!!)
        – Tobin did not play in 1917, but was a full-time OF in 1918, showing improvement (122 games, .277/.349/.338, 110 OPS+, 1.9 WAR, 4.7 SO% DOWN from 7.7 SO% in 1916)
        – Johnston did not play in 1917, but returned full-time for the second half of the 1918 season, with poor results (74 games, .227/.301/.286, 70 OPS+, -0.8 WAR, 6.1 SO% DOWN from 9.4 SO% in 1916)
        – Baird played full-time in both 1917 and 1918, and drastically improved (his totals for 1917-18: 229 games, .252/.312/.356, 105 OPS+, 2.9 WAR/season, 4.5 WAR/162 games, 12.2 SO% UP from 10.7 SO% in 1916)
        – Only Mollwitz continued to struggle in obscurity during both seasons (his totals for 1917-18: 155 games, .266/.303/.322, 88 OPS+, 0.2 WAR/season, 0.9 WAR/162 games, 5.0 SO% DOWN from 6.5 SO% in 1916)

        But 1917 brings its own class of players with -1.0 WAR or less: Lee Magee, Chuck Wortman, Jake Pitler, a long-past-his-prime Frank Schulte, Pickles Dillhoefer, Chick Shorten, and Dave Shean. Here are my little analyses for each of them:

        – Pitler and Dillhoefer were rookies, and Shean had not played in the big leagues since 1912. Despite their terrible seasons, they had below-average SO%’s of 5.5% (Pitler), 4% (Dillhoefer), and 8.1% (Shean)
        – The other four (Magee, Wortman, Schulte, and Shorten) had played about as many games in 1916 as they did in 1917. Here are their 1916 SO%’s vs their 1917 SO%’s:

        – Magee: 5.3% in 1916, went UP to 7.3% in 1917
        – Wortman: 8.6% in 1916, went UP to 10.5% in 1917
        – Schulte: 12% in 1916, went UP to 12.6% in 1917
        – Shorten: 6.5% in 1916, went DOWN to 5.3% in 1917

        I thought we had something there, but after seeing those numbers for Magee, Wortman, and Schulte, I’m confused again.

        Well, I hope this helps somebody.

        • 23
          William J. says:

          The strikeout rates in the Federal League were similar to the NL and AL. Also, the league had folded in 1915, so the impact would have been noticeable in 1916. I do, however, think MLB’s attempt to win back fans may have played a role (see comment below).

          • 31
            Insert Name Here says:

            Alright, throw out the first two sentences of that paragraph, but don’t throw the baby out with the bathwater; the rest of my comment still seems relevant to me.

            However, some sort of league effort(s) to win back fans could have been involved…


        • 46
          Hartvig says:

          Insert Name Here-

          I’m still as confused as I was before as well but that doesn’t mean I don’t appreciate the fine work you did. And ANY mention of Pickles Dillhoefer is a sure way to make certain that today will be just a little bit better than it might otherwise have been.

          This is almost as fun a distraction as the Hall of Greats are- my only regret is that they’re not being done sitting around a big table over a few beers.

          Well done again, JA

          • 50
            Insert Name Here says:

            Agreed… “Pickles Dillhoefer” is easily one of my all-time favorite baseball nicknames. “Fritz Mollwitz” (actual first name Frederick, hence “Fritz”) is pretty good too!

    • 16

      I was thinking something similar, INH. The graph John presents above destroys the narrative that hitters took pride in making contact and never striking out until Ruth started swinging for the fences and accepting the strikeouts with the home runs, thereby changing most hitters’ approaches.

  3. 3
    Mike L says:

    John A, the US entered WWI in April of 1917. In the beginning, we sent a smaller force “over there,” but after the Selective Service Act later that year, the armed forces grew to a total of close to 5 million. If you look at the rosters for 1918, there are a significant number of players who played only partial seasons. Here’s a link to WWI players

    • 17
      John Autin says:

      Mike L, the WWI link is appreciated. But I still maintain that there was a quantum difference between the two world wars in the number of top MLB players who missed significant time.

      When you start going through the players listed on that page you provided, you see that a LOT of them missed no significant time. From 1917-19, Ty Cobb missed just 32 games; Speaker missed 17 games. Harry Heilmann missed the last 42 games of 1918. This is nothing like Ted Williams, Joe DiMaggio, Bob Feller and countless others missing 3 full years during WWII.

      My point is not at all to compare the valor and service of the two generations — merely to compare the MLB games missed.

  4. 4
    Mike L says:

    John A (or someone talented), can you graph or otherwise track Ks vs HR’s for 1916, 1917, and 1918?

    • 19
      John Autin says:

      Mike L re: Ks vs. HRs, here are the MLB percentages for those 3 years:

      1916 — SO 10.29%, HR 0.414%
      1917 — SO 9.32%, HR 0.361%
      1918 — SO 7.73%, HR 0.309%

      Change in rates:
      1917 vs. 1916 — SO down by 9.4%, HR down by 12.8%
      1918 vs. 1917 — SO down by 17.1%, HR down by 14.4%
      1918 vs. 1916 — SO down by 24.9%, HR down by 25.4%

      But this positive correlation between SO and HR was shattered in the next few years:
      – From 1918 to 1920, the HR% more than doubled, while the SO% barely changed (down less than 1%).
      – From 1920 to 1922, the HR% jumped another 65%, while the SO% fell by 6.1%.
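      For anyone who wants to check the arithmetic, the year-over-year changes above fall straight out of the raw SO% and HR% figures quoted in this comment. A minimal sketch (the rounding matches the figures quoted above):

```python
# Year-over-year changes in MLB strikeout and home-run rates, 1916-18.
# SO% and HR% (share of plate appearances) are the figures quoted above.
rates = {
    1916: {"SO": 10.29, "HR": 0.414},
    1917: {"SO": 9.32,  "HR": 0.361},
    1918: {"SO": 7.73,  "HR": 0.309},
}

def pct_change(old: float, new: float) -> float:
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100

for start, end in [(1916, 1917), (1917, 1918), (1916, 1918)]:
    so = pct_change(rates[start]["SO"], rates[end]["SO"])
    hr = pct_change(rates[start]["HR"], rates[end]["HR"])
    print(f"{end} vs. {start} -- SO {so:+.1f}%, HR {hr:+.1f}%")
```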

  5. 5
    e pluribus munu says:

    John, I have no ready suggestions. Your figures took me entirely by surprise – not the 1917-18 ones, but the basic fact that strikeouts were so much lower in the early lively-ball era than at the height of the dead-ball era (I just went to B-R to check them). I think I’ve always passively believed, as a complement to the theory mentioned by Insert@2, that just as Ruth and the lively ball caused a quantum leap in HRs from 1920, the same was true of K’s (since Ruth’s numbers alone seem to suggest this). Although I knew that the SO numbers for Mathewson, Johnson, Alex, etc. were high and that league-leader numbers in the years prior to Feller were pretty low, I had never really noted that my intuitions for the ’20s-’30s embraced a contradiction. I appreciate the corrective and I’ll enjoy thinking about reasons for the WWI-era drop until someone else comes up with a plausible answer.

    • 26
      MikeD says:

      I wonder if it is a case of batters adapting but pitchers not? Clean balls, better-manufactured balls, banning of spitballs, etc. may have removed an advantage pitchers had built in for years. Couple that with a change in approach by hitters, and suddenly hitters had progressed while the pitchers were still clinging to their old style. Yeah, I’m making this up, but just throwing it out for thought.

      Related, but moving forward to the 1960s, Bill James theorized that the decrease in hitting leading to the Year of the Pitcher in 1968 started after 1961, when MLB expanded the strike zone to levels never before seen. The impact wasn’t immediate, but over time pitchers began to take greater control of the more generous strike zone, and umpires continued to widen it. In essence, pitchers and umpires were unknowingly working together to decrease hitting. Some believe a reverse form of this took effect throughout the 1990s, with the strike zone continuing to shrink.

      Of course, none of this would explain the rapid decrease that John noted in 1917-18, except to suggest that whatever caused it was permanent. A slight blip up in 1919, and then back down to ever-lower percentages through 1925.

  6. 6
    Ed says:

    Just looking at the decline from 1917-1918 this is what I found:

    1) Strikeouts were stable throughout 1917 and then dropped off immediately in 1918.

    2) The dropoff affected virtually every team. There were a few exceptions (Indians, Red Sox and Browns) but those teams tended to have low K rates to begin with, so perhaps they couldn’t get much lower.

    3) Every team played in the same ballpark both years and the decline occurred in virtually every park.

    4) Most of the umpires were the same in 1917 and 1918 and almost all of them show a pattern of decline.

    I’d say the results from 1917-1918 suggest some sort of systematic change throughout baseball but I have no idea what it could be.

    • 12
      John Autin says:

      “1) Strikeouts were stable throughout 1917 and then dropped off immediately in 1918.”

      Ed, I’m not quite sure what you were going for there, but the numbers show something different. As noted in the post, the SO% dropped 9% in 1917 (compared to ’16) and then another 17% in 1918 (compared to ’17). It’s like, one big step one year, and then another two big steps the next year.

      • 14
        Ed says:

        Sorry I wasn’t clear there. If you look at the monthly splits for 1917, you don’t see anything unusual. Strikeouts per 9 innings were fairly stable throughout the year. So the drop that occurred in 1918 was an immediate drop. It wasn’t something that was building up from the prior year.

  7. 7
    Mike L says:

    I’m going to throw one more wild-card out there. Rubber shortages during WWI.

  8. 8
    Brent says:

    Equipment related, somehow, due to the War? Something about the baseballs that made it harder to throw hard, or break as much? That is all I can come up with.

    I guess twilight comes later in the day in July and August than September (obviously). So maybe not playing games in September made the general lighting better? I am really guessing here.

  9. 10
    birtelcom says:

    Splendid post, John — prompting a new look at baseball of the era. In 1911, 7.16 plate appearances per game ended in a walk or strikeout instead of a batted ball. By 1916, before the precipitous drops in Ks that you identify, that number was already down to 6.66 (perhaps Satan had taken control of the game?). In 1918, it was 5.75, then 5.74 in 1919 and 5.70 in 1920. Although I don’t have much non-statistical evidence for this, I wonder whether a developing understanding that doctored pitches (spit balls, emery balls, etc.) were bad for the game, bad for pitchers’ arms, bad for control, bad for fielders, etc. was already spreading around the league and reducing the use of such pitches even before they actually began to be banned in 1919-1920. There certainly were critiques of doctored-pitch pitching circulating in these years.

    • 15
      John Autin says:

      Thanks, birtelcom.

      Adding to your speculation that doctored pitches may have been already declining by 1917-18 (before the formal ban), I *think* I’ve read, somewhere, that the practice of keeping a fresher, cleaner ball in the game was already building up steam *before* the Chapman tragedy.

      But I just can’t find a citation for that. I’m pretty sure the matter is discussed in Mike Sowell’s excellent “The Pitch that Killed,” but I’ve misplaced my copy, and the selected pages that are available online don’t touch on that matter.

      • 21
        Richard Chester says:

        I remember reading somewhere that the practice of replacing balls frequently was begun at the start of the 1920 season. I’ll see if I can track down where I saw it.

      • 30
        no statistician but says:

        A question and a couple of comments.

        Were the dirty-ball pitchers actually noted for strikeouts? Does anyone know?

        After looking at the league leaders in the Teens and Twenties, I got the impression that power pitching died out as an approach, except for Dazzy Vance, who put up numbers in the Twenties that would fit in with those in the Teens.

        Obviously a shift was taking place in batting at the same time, as you cite at #27 below. The new star players such as Hornsby, Sisler, Ruth, and Heilmann, who came into the league in the mid-teens, may have had some kind of an impact on the batting side that influenced others. Just an idea.

  10. 20
    Ed says:

    I just skimmed through the 1918 and 1919 Reach Guides. I didn’t see anything mentioned regarding rule changes, playing condition changes, etc. The guide does say that in 1918, 55% of American League players and 64% of National League players were enlisted in the service. So it does seem like there was a fairly large “war effect,” at least for that season. Though I’m not sure why that would lead to a decline in strikeout rates. And as John’s graph shows, the lower rates were maintained even after WWI.

    If anyone is interested in perusing the guides, here’s the link.


    Near the bottom left is a link to download them as a PDF or an EPUB.

  11. 25
    William J. says:

    This was an interesting question, so I did a little digging.

    The details are in the link below, but the gist is the leagues may have gerrymandered the strike zone a little to help boost offense.


    • 28
      MikeD says:

      William, interesting as always.

      My gut is that the powers that be did “something” to alter the game. Rapid changes in hitting or pitching usually can be attributed to a significant change. An announced rule change, pushing back the pitcher’s mound, introduction of new baseballs. Yet not all changes may be made public, or recorded in the league minutes. This may be one of those cases where the leagues decided to make a change with the goal of increasing interest, but the world was never told.

      That leads to another interesting question involving Babe Ruth. It’s generally believed his go-for-broke HR swing changed the game as he dragged MLB into the HR age. He was out-homering entire teams. It’s interesting that the first year he led the league in HRs was 1918. It was a small number (just 11), but he was also still a pitcher, hitting part time. 1918, of course, is the year John noted for the big drop in strikeouts. Something changed, and Ruth, a pitcher who had an ability to hit, may have grasped it quicker than other players since he performed on both the defensive (pitcher) and offensive (hitter) sides. Perhaps it wasn’t Ruth who dragged MLB into the HR age; rather, he was the first success story built on the unannounced changes MLB had made, leading to a drop in strikeouts and the beginning rise of HRs. It would take a few years for the rest of baseball to do what Ruth was already doing.

    • 29
      John Autin says:

      William J — Thanks for the link and the cite, and kudos on your own well-done post.

      Tener’s campaign is interesting on a number of levels. For one, his theory that batter patience and pitcher reticence tends to decrease scoring is not one that I’ve ever seen supported, and is certainly at odds with the modern consensus. (But I do agree with him that more “free-swinging batsmen … mean more fun for the fan.”)

      But I think the notion that Tener’s campaign may have affected the action in 1917-18 is problematic. For one thing, while strikeouts were plummeting, walks remained constant, accounting for 7.6% of PAs in both 1916 and 1918 (and 7.5% in 1917). Also, scoring rose just slightly, from 3.56 R/G in 1916 to 3.59 and then 3.63 in 1918. That’s just 2% over the two years.

      • 33
        Doug says:


        I suspect Tener was thinking more of the mindset of the hitter. If you’re always playing the waiting game, it can be tough to be aggressive when you finally get a good pitch to hit. Or, if you force yourself to be aggressive, you end up chasing sub-optimal pitches. But if there are more hittable pitches being thrown, the batter’s mindset will change from being cautious to being aggressive, and he will thus be a freer swinger and take better cuts.

        Interesting that the perceived remedy (and the effect) then was the same as in 1969 – if the pitchers are too dominant, make it tougher for them by shrinking the target (officially or unofficially) and changing the comfort zone (e.g. clean ball in 1920, lower mound in 1969).

      • 35
        William J. says:

        Thanks and ditto.

        I wondered about the relatively stagnant BB rate, but then realized walks have always been more stable than strikeouts. Whereas BB/9 has varied from 2.35 to 4.09 since 1901, K/9 has ranged from 2.72 to 7.56. Also, the correlation coefficient for the two rates since 1901 is pretty low, so what impacts one won’t necessarily change the other.

        As for the slight increase in scoring, we also know there is little correlation between run scoring and strikeouts, so Tener’s ideas probably shifted some strikeouts to balls out of the zone being put into play earlier in the count, which won’t have much of an impact on offense. I wish we had a record of total pitches thrown in the two seasons, but that data is not available.

  12. 27
    John Autin says:

    I do think there’s a bit of misconception about the rise in offense at the start of the “live-ball” era. The effect of HRs tends to be overassessed, while the impact of increased batting average tends to be undercounted.

    Taking 1918 as the base year, the last year before Ruth’s first HR record, scoring rose sharply for 3 straight years — from 3.63 R/G to 3.88, then 4.36, then 4.85 in 1921 — and then leveled off for several years.

    So from 1918 to 1921:
    – Runs per game rose by 34% (3.63 to 4.85).
    – BA rose by 15% (.2538 to .2910).
    – Hits per game rose by 20% (8.41 to 10.08).
    – Total bases per hit rose by 8% (1.281 to 1.384).
    – HR per game rose by more than 200% (0.12 to 0.38).

    If you just look at the HR percentage, it seems like an explosion. But at 0.38 HR/G, home runs were still relatively rare, as reflected in the modest 8% rise in TB/H. Over 71% of all team-games in 1921 had no HRs, and just over half of all games had no HRs by either side.

    1929 brought the next great surge in HRs, reaching a new record high of 0.55 HR/G. And that level held pretty steady until WWII; the average for 1929-41 was 0.55 HR/G. That’s 45% more HRs than in 1921 — yet scoring in this period averaged 4.90 R/G, just 1% more than in 1921. The credit for run production was shifting toward HRs, but the number of runs barely changed.

    Comparing 1921 to the average of 1929-41, we find that:
    – Hits per game fell a little (10.08 to 9.68);
    – Doubles per game went up a bit (1.62 to 1.74); and
    – Triples per game fell a good bit (0.55 to 0.40).
    In a sense, 3Bs were traded for HRs: The sum of the two is pretty steady from 1921-41, and by the end of the period the numbers have been reversed — 3Bs down from 0.55 in 1921 to 0.35 in 1941, HRs up from 0.38 in 1921 to 0.53 in 1941.

    Clearly, there are a lot of different things going on in the first 20 years of the “live-ball” era. Home runs are an important part of the story, but by no means all of it.
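    As a sanity check on the figures above, the 1921 TB/H number can be rebuilt from the per-game component rates quoted in this comment (10.08 hits, 1.62 doubles, 0.55 triples, 0.38 HR). A minimal sketch:

```python
# Reconstruct 1921 total bases per hit from the per-game rates quoted above.
hits, doubles, triples, homers = 10.08, 1.62, 0.55, 0.38
singles = hits - doubles - triples - homers

total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
tb_per_hit = total_bases / hits
print(f"1921 TB/H = {tb_per_hit:.3f}")  # within rounding of the 1.384 quoted above
```

    The modest ratio also shows why a 200%+ jump in a rare event (HRs) moved TB/H by only 8%: home runs were still a small slice of total hits.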

  13. 32
    John Autin says:

    I’d like to put to bed the notion that WWI cost a significant number of top players a significant number of games:

    I looked at 1916’s top 65 position players by Wins Above Replacement — about 4 per team — and counted their games played in 1917-18. I did the same for a non-wartime control group (1920 to 1921-22), and then for a WWII group (1942 and 1943-44).

    For convenience, I measured games played as a percentage of the MLB median number of team-games (including ties) in each season.

    First World War

    1916’s top 65 position players played an average of 75% of available games in 1917-18, with a median of 83% of available games. Looking at 1918 alone, that same group of 65 top players averaged 68% of available games, but with a median of 84%.

    Those figures are barely different from the non-wartime control group. 1920’s top 65 position players in WAR averaged 74% of the schedule for 1921-22 combined, with a median of 87%. For 1922 alone, they averaged 71% of available games, with a median of 89%.

    Putting the WWI and control numbers side by side:
    1916 to 1917-18 — Average 75%, median 83%.
    1920 to 1921-22 — Average 74%, median 87%.

    1916 to 1918 — Average 68%, median 84%.
    1920 to 1922 — Average 74%, median 87%.

    Second World War

    1942’s top 65 position players in WAR averaged 50% of available games in 1943-44, with a median of 49%. And for 1944 alone, forget it — they averaged 39% of available games, with a median of 0% (i.e., more than half did not play at all in 1944).

    Putting the WWI and WWII numbers side by side:
    1916 to 1917-18 — Average 75%, median 83%.
    1942 to 1943-44 — Average 50%, median 49%.

    1916 to 1918 — Average 68%, median 84%.
    1942 to 1944 — Average 39%, median 0%.

    I didn’t run the pitchers, but it seems pretty clear that the impact of WWII on MLB was vastly greater than that of WWI.

    Yes, some players did lose time to WWI, and the war did take a month off the schedule in 1918. But for the purposes of this strikeout study, the shortened schedule is basically irrelevant. And on a league-wide basis, the games lost by top players, as a percentage of team games actually played, was pretty small.

    • 34
      Brooklyn Mick says:

      John, I disagree that the shortened schedule is irrelevant, and, quite possibly, the shortened season might be the anomaly you’re looking for.

      In 1917 there were 2494 games played compared to 1918, when they played only 2032. That’s a pretty significant difference. To say that the strikeout ratio would have held steady throughout a full season is supposition at best.

      If we’re going to hold fast to the ratios regardless of games played, then why not credit Matt Williams with 62 homers in the strike-shortened 1994 season?

      • 36
        William J. says:

        The shortened schedule was probably irrelevant because monthly strikeout rates in the past did not vary much, so there’s no reason to think the lost September would have had an impact.

      • 37
        John Autin says:

        Mick, taking your idea to heart, here’s my best attempt to adjust for the fact that the 1918 schedule ended in early September:

        Based on the monthly split data for the 2 years before and after 1918, the combined strikeout rate in September/October was 5.3% higher than the composite for the other months.

        In 1918, 80% of a standard schedule had been completed by the end of August. The strikeout rate through August was 7.82%.

        Suppose we populate the hypothetical last month of 1918 with PAs equal to 1/4 of the April-August total, and estimate the last month’s SO% at 5.3% higher than the April-August rate.

        The resulting hypothetical totals would increase the 1918 SO% from the actual 7.73% up to a hypothetical 7.91%. Not a big difference.

        The change for 1918 compared to 1917 would then fall from the actual 17.1% to a hypothetical 15.1%. The two-year change for 1918 compared to 1916 would fall from the actual 24.9% to a hypothetical 23.1%. The effect is minimal.

        I just can’t see how any reasonable projection for the canceled final month of 1918 would significantly change the picture of a sharp drop in strikeouts in 1918 and for the two-year period of 1917-18.
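        The adjustment described in this comment can be sketched in a few lines of Python, using only the figures quoted above (April–August SO% of 7.82%, a hypothetical September with PAs equal to 1/4 of the April–August total, and a September rate 5.3% relatively higher). The tiny gap versus the comment's 7.91% is presumably rounding of the inputs:

        ```python
        # Sketch of the full-season adjustment in comment #37.
        # All inputs are the figures quoted in the comment itself.
        so_rate_apr_aug = 0.0782   # 1918 SO% through August
        sep_pa_share = 0.25        # hypothetical Sept PAs as a fraction of Apr-Aug PAs
        sep_rate_bump = 1.053      # Sept SO rate relative to the Apr-Aug rate

        # Weight the two periods by plate appearances.
        hyp_full_season = (so_rate_apr_aug
                           + sep_pa_share * so_rate_apr_aug * sep_rate_bump) / (1 + sep_pa_share)

        print(f"{hyp_full_season:.2%}")  # 7.90%, vs. the actual 7.73%
        ```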

        • 38
          John Autin says:

          Further to my #37:

          In order to shrink the 1918 drop in strikeouts (compared to 1917) down to a still-significant 10%, we would have to project the hypothetical last-month SO% at 10.66% — or 36% above the combined rate for April through August.

          It’s just not reasonable to imagine that the canceled month played a significant role in 1918’s curtailed strikeout rate.
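          The break-even calculation in #38 can be run the same way, backing out the 1917 rate from the actual 1918 rate (7.73%) and the actual 17.1% drop; the small difference from the comment's 10.66% is again input rounding:

          ```python
          # Reverse of the #37 adjustment: what hypothetical September SO%
          # would shrink the 1918 drop (vs. 1917) to 10%?
          rate_1917 = 0.0773 / (1 - 0.171)      # back out the 1917 rate, ~9.32%
          target_1918 = rate_1917 * (1 - 0.10)  # full-season 1918 rate for a 10% drop

          # Solve (apr_aug + 0.25 * sept) / 1.25 = target for the September rate.
          sept_needed = (target_1918 * 1.25 - 0.0782) / 0.25

          print(f"{sept_needed:.2%}")  # 10.68%, roughly 36% above the 7.82% Apr-Aug rate
          ```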

      • 39
        Brooklyn Mick says:

        Yeah, you’re right John and William. Even in the 21 September games that were played, the SO% was only 2.6. I hadn’t thought to check the monthly splits. I retract my statement.

      • 40
        John Autin says:

        Mick — Not to pile on (especially after your retraction), but it’s worth noting that the analogy to Matt Williams ’94 doesn’t work. There’s a quantum difference in statistical significance between projecting one player from 115 games up to 162 games (a bump of 41%), and projecting 9 players from 2032 games to 2494 games (a bump of 23%). A player can easily get hot or cold for a month or two, but that virtually never happens with the major leagues as a whole.
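        The two "bumps" being compared work out as follows (straight proration from the game counts given above):

        ```python
        # Proportional projection bumps from the comment above.
        player_bump = 162 / 115 - 1   # one player, 1994 strike-shortened season
        league_bump = 2494 / 2032 - 1 # two whole leagues, 1918 vs. 1917 schedules

        print(f"{player_bump:.0%}, {league_bump:.0%}")  # 41%, 23%
        ```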

    • 42
      Mike L says:

      John A, does it matter (for strikeouts at least) whether more top players were taken out by WWII than by WWI? Wouldn’t you just assume that removing major-league-quality players in either era would increase strikeouts?

      • 44
        John Autin says:

        Mike L — If we assume that the reduction in talent level was roughly equal between hitters and pitchers, then why would we assume any particular direction of change in SO%?

        And for what it’s worth, stats during the Second World War show the opposite of what you suggest. The SO% declined a little bit over the duration. The SO rate was 9.16% in 1941, 8.87% in 1942. By 1945 it was down to 8.49%. Then in 1946 it shot up to 10.16% — the biggest one-year increase outside of the foul-strike rule.

        If those years suggest anything, it would be that a decline in talent level reduces strikeouts.
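        The wartime rates quoted above, expressed as relative changes (note that 1942 to 1945 spans three seasons, so that step isn't a single-year change):

        ```python
        # Relative changes in the WWII-era SO% figures cited above.
        rates = {1941: 9.16, 1942: 8.87, 1945: 8.49, 1946: 10.16}

        years = sorted(rates)
        for prev, cur in zip(years, years[1:]):
            change = rates[cur] / rates[prev] - 1
            print(cur, f"{change:+.1%}")  # small declines, then the big 1946 spike
        ```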

        • 45
          Mike L says:

          Never leave a good argument (especially when you are probably wrong). I would have expected more high-quality players to have served in WWII than in WWI, because the need was greater and because the societal pressure was greater. WWI was largely perceived as Europe’s war. My dad, who served in the Philippines, told me that there were two men in his company who had real physical disabilities and were still drafted. Also, they drafted men up to age 39, and it was considered suspect to be classified 4-F. Presidents’ sons served (and one died). As to the 1946 numbers, I wonder if an alternate explanation couldn’t be found: players had essentially been out of the game for three or more years, some still had their names but not the same degree of talent, and maybe 1946 was somewhat akin to spring training, where the pitchers start out ahead of the hitters.
          But, in all fairness, I can’t really find fault with any of your arguments, so I am going to return to my WWI rubber shortage and change to the baseball suggestion I made above.

          • 49
            Hank G. says:

            “so I am going to return to my WWI rubber shortage and change to the baseball suggestion I made above.”

            But how would a temporary shortage lead to a permanent change in the strikeout rate? I think the fact that the change coincided with WWI is a red herring. Whatever the change in conditions was, it had to be ongoing.

  14. 41
    John Autin says:

    Meanwhile (he said, in a shameless attempt to reach 50 comments), what about that 20% strikeout spike in 1946?

    • 43
      birtelcom says:

      Well you certainly need 50 comments to validate the interest of your post — and I’ll say that again 8 more times if I have to.

      • 48
        Hartvig says:

        I’m afraid that I have to take a stand against introducing PECs (Performance-Enhancing Comments) into the “game”.

        I’ll stick with my usual gibberish, thank you.

  15. 47
    John Autin says:

    Another look at the top strikeout pitchers of 1916 and what they did in 1917-18:

    There were 31 pitchers who notched at least 100 SO in 1916. The group averages (with 1918 projected to a full season):

    1916 — 136 SO in 279 IP, 4.38 SO/9.
    1917 — 102 SO in 239 IP, 3.84 SO/9.
    1918 — 67 SO in 188 IP, 3.22 SO/9.

    So while the SO rates were declining, it might also be significant that this high-SO group had 33% fewer innings in 1918 compared to 1916, even after prorating their 1918 numbers to a full schedule.

    But we don’t know how much of that is a war effect and how much is ordinary wear and tear on pitchers. Virtually any group of pitchers that is selected as tops in some counting stat will show a decline in both workload and in that counting stat two years later.

    So I made a control group of the 1920 top 31 in pitcher strikeouts, and followed them through 1921-22.

    1920 — 104 SO in 272 IP, 3.46 SO/9.
    1921 — 70 SO in 202 IP, 3.14 SO/9.
    1922 — 65 SO in 190 IP, 3.07 SO/9.

    The control group pitched 30% fewer innings two years down the road, which is similar to the WWI group (33% fewer IP).
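    The workload comparison comes straight from the average IP figures above (1918 already prorated to a full season):

    ```python
    # Two-year declines in average IP for the two groups of top-SO pitchers.
    wwi_decline = 1 - 188 / 279    # 1916 leaders, 1916 -> 1918 (prorated)
    ctrl_decline = 1 - 190 / 272   # 1920 control group, 1920 -> 1922

    print(f"{wwi_decline:.0%}, {ctrl_decline:.0%}")  # 33%, 30%
    ```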

    So I’m not going to pursue the notion that the hypothetical loss of a handful of strikeout artists to WWI service might have skewed the 1918 SO rate.

    P.S. Having run this little study, I now wish I’d used a different control year, since both Eddie Cicotte and Lefty Williams were among the top SO pitchers of 1920 and then were banned. But I’m not going to run it again — I might accidentally pick a year when two top pitchers blew out their arms.
