Have pitchers become more fragile?

Are today’s pitchers all pansies? There are a lot of people who seem to think so. As the number of innings a starter is expected to pitch has fallen over the past few decades, many critics have piled on, decrying what looks like some serious coddling.

Baseball has seen a steady fall in complete games, innings per start, and even games started throughout its history, and each generation has been able to call the next fragile and weak. Just take a look at this graph of the league-leading total in innings pitched since the first professional baseball season in 1871:

[Graph: league-leading innings pitched total by season, 1871 to present]

We can easily define some rough eras in terms of pitcher usage:

  • 1871-1892, when pitchers were regularly throwing over 600 innings a year.
  • 1893-1900, a period of steep decline, bottoming out in 1900 when Joe McGinnity led the National League with “just” 343 innings pitched.
  • 1901-1917, which saw a few 400-inning pitchers and league leaders regularly approaching that number.
  • 1918-1956, when the league leader in innings averaged a little over 300 innings pitched.
  • 1957-1961, a five-year period in which no one threw even 300 innings.
  • 1962-1979, when 300-plus innings seasons came back into fashion.
  • 1980-2007, a period of slow but steady decline in innings pitched.

There are four decline phases here, two of which are easily explicable. In the late 19th century, the demands on pitchers increased as overhand pitching was legalized and the rules were changed to make the batter-pitcher matchup more of a battle. Innings pitched by starters fell after the deadball era because of an increase in run scoring, which meant pitchers had to exert themselves more and face more batters per inning.

The decline in innings pitched by the league leader in the 1950s is admittedly weird, but it might just be due to a changing of the guard. Robin Roberts, who led the league in innings pitched from 1951-1955, was entering his decline phase, while new stars like Don Drysdale, Juan Marichal, and Sandy Koufax were still a few years away.

But the decline since 1981 has no such simple explanation, at least not yet. The question I’d like to answer today is whether that decline is due to a change in playing conditions or managerial decisions, like in the late 19th century or after the deadball era, or if it is simply the result of a less durable generation of pitchers, like in the late ’50s.

So how do we answer that question? I think we can do it in much the same way that we did league adjustments. Basically, a pitcher’s durability doesn’t change from year to year (well, it does, but we’ll control for that). If Sandy Koufax pitches 350 innings one year and 300 the next, all else being equal, we have to assume that innings were 17% easier to accumulate in the previous year (350/300 ≈ 1.17).

So if we look at every pitcher’s changes in innings pitched from year-to-year, we can figure out just how much of the decline in innings is due to more fragile pitchers and how much of it is due to more difficult conditions.
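To make the chaining idea concrete, here is a minimal sketch in Python. The pitcher names and innings totals are made up, and simple averaging is just one way to combine the year-to-year ratios; the actual study may weight or aggregate them differently.

```python
from collections import defaultdict

# Hypothetical pitcher-season innings totals; the real study uses actual data.
seasons = [
    ("Pitcher A", 1970, 320), ("Pitcher A", 1971, 305),
    ("Pitcher B", 1970, 280), ("Pitcher B", 1971, 270),
    ("Pitcher B", 1972, 255),
]

# Index innings by (pitcher, year) so consecutive seasons are easy to pair up.
ip = {(p, y): innings for p, y, innings in seasons}

# For every pitcher with back-to-back seasons, take the ratio of this year's
# innings to last year's. A ratio below 1 suggests innings got harder to pile up.
ratios = defaultdict(list)
for (pitcher, year), innings in ip.items():
    prev = ip.get((pitcher, year - 1))
    if prev is not None:
        ratios[year].append(innings / prev)

# Average each year's ratios and chain them into a cumulative index, with the
# base year at 1.0. An index of 0.82 would mean a pitcher could be expected to
# throw 82% as many innings as an equally durable pitcher in the base year.
index = {}
running = 1.0
for year in sorted(ratios):
    running *= sum(ratios[year]) / len(ratios[year])
    index[year] = running

print(index)  # roughly {1971: 0.96, 1972: 0.91} for the toy data above
```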

There are some caveats. First of all, we only want to be looking at starters, because relief innings are a whole other story. So I’ve restricted my sample to pitchers who started in all of their appearances in a given city.


Secondly, we want to make sure that pitchers are used equally in both years that we look at. If a September call-up pitches well and is inserted into the starting rotation the next season, that doesn’t mean it has become easier to throw more innings; likewise, if a pitcher gets injured, that doesn’t mean it has become harder. So what I’ve done is look only at consecutive seasons in which a pitcher had about the same number of starts in both years, within a margin of error of +/- 2.
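Here is a small sketch of how those sample restrictions might be applied. The record fields (pitcher, year, team, G, GS, IP) are assumptions for illustration; the same-team check is just one reading of the “given city” restriction, and the strike-year list anticipates the omission described in the next paragraph.

```python
STRIKE_YEARS = {1980, 1981, 1993, 1994, 1995}  # strike-affected seasons, omitted below

def usable_pairs(records):
    """Return consecutive-season pairs that satisfy the sample restrictions.

    Each record is a hypothetical dict like:
    {"pitcher": "Pitcher A", "year": 1986, "team": "PHI", "G": 33, "GS": 33, "IP": 245}
    """
    # Keep only pure starters: every appearance was a start.
    starters = {(r["pitcher"], r["year"]): r
                for r in records if r["G"] == r["GS"]}

    pairs = []
    for (pitcher, year), cur in starters.items():
        prev = starters.get((pitcher, year - 1))
        if prev is None:
            continue
        # Skip strike-shortened seasons on either side of the pair.
        if year in STRIKE_YEARS or (year - 1) in STRIKE_YEARS:
            continue
        # One reading of the "given city" restriction: the pitcher stayed put.
        if prev["team"] != cur["team"]:
            continue
        # Require roughly equal usage: within +/- 2 starts of the prior year.
        if abs(cur["GS"] - prev["GS"]) <= 2:
            pairs.append((prev, cur))
    return pairs
```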

With those restrictions in place, we are ready to compare durability across the decades. However, because of all the restrictions, sample sizes only become really meaningful after 1970, so we’ll only be able to look at the past 35 years or so. As well, strike years can screw things up, so I’ve had to omit 1980-81 and 1993-95. With that said, let’s look at how pitchers’ innings have decreased over the years:

[Graph: chained year-to-year index of innings pitched, 1970 to present]

What this graph shows is that a pitcher from 1970 would be expected to throw only 82% as many innings today. In other words, it does appear that this decline in innings pitched has been at least partly due to a change in pitching conditions, and not due to lesser pitchers.

If we look at the graph closely, we find that the whole change occurred in the ’80s; since 1990, the graph has been almost perfectly flat. And yet, innings pitched by the league leader have continued to decline since then, though admittedly only by a bit. We may need to wait a few more years to see just how much that number will fall.

So now here’s the pertinent question: Why are pitchers throwing fewer innings now than they would have two decades ago? I have a few different theories:

  • This decline in innings pitched coincides pretty well with a startling inflation in baseball salaries. Perhaps as teams have invested more and more in pitchers, managers have been pressured to push them less, as injuries become costlier.
  • As run scoring has increased, perhaps innings have become longer, increasing the number of pitches thrown per inning and decreasing total innings pitched per start.
  • Or perhaps, just maybe, as hitters have improved, it has become tougher to be a pitcher. Each pitch requires more exertion and a pitcher’s best stuff can’t be saved for the opposing team’s best slugger because even the shortstop can hit the ball out of the park.

We can test these theories, and at some point I will, but I’ll leave that for another article. For now, we can rest easily knowing that today’s pitchers aren’t pansies—today’s game is just very different from the baseball your grandfather grew up with.

