Do Hitters Wear Down Over a Season?

As great as Albert Pujols has been, age plays a part in a hitter’s breakdown. (via Erik Drost)

While watching and listening to games this past postseason, I heard quite a few comments on players wearing down as the season dragged on. The narratives were all over the place. Are the older players able to pace themselves over the season because of experience? Do the younger players have youth on their side and continue producing? Is there a difference? When I went searching for an answer, the results surprised me.

To see how hitters perform as the season wears on, I compared their production in the first half (1H) to the second half (2H). One issue with these comparisons is the decline in the offensive environment that started after the 2005 ban on performance-enhancing drugs. To help adjust for this drop in offense, I used wRC+, which is adjusted for the league and season run environment. The adjustment is not so important for the 1H-to-2H comparisons, but I also compared one year’s second half to the next season’s first half, where it is more valuable.
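For readers unfamiliar with the stat, the core idea of wRC+ can be sketched in a few lines. This is a deliberately simplified toy version: the real FanGraphs formula also includes park adjustments and league-specific baselines, and `simple_wrc_plus` and the sample rates below are hypothetical, not actual league figures.

```python
# Toy sketch of the wRC+ idea: runs created per plate appearance, indexed so
# that the league average is 100. The real stat also adjusts for park effects;
# this version only does the league scaling.

def simple_wrc_plus(player_wrc, player_pa, league_wrc_per_pa):
    """Runs created per PA, scaled so the league average is 100."""
    return 100.0 * (player_wrc / player_pa) / league_wrc_per_pa

# A hitter creating runs at 10 percent above a hypothetical league rate of
# 0.12 runs created per PA grades out at 110.
print(round(simple_wrc_plus(66.0, 500, 0.12)))  # 0.132 / 0.12 -> 110
```

The key property for this article is the normalization: because every season is scaled to its own run environment, a 110 in a high-offense year and a 110 in a low-offense year represent the same relative production.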

First, here is the change in hitter performance from the first half to the second half. (There’s a note on my methodology at the end of this article.)

The data are not perfectly smooth, but some trends can be noted. Hitters can expect to see some improvement from the season’s first half to its second half in their age 21 to 25 seasons. From age 26 to 34, some decline occurs, with even more from age 35 on. I weighted each of the changes, grouped them into three age brackets, and here are the results:

Age Range: Change in wRC+

  • 21 to 25: +2.1
  • 26 to 34: -2.1
  • 35 to 40: -5.1

We need to understand that each of these values should probably be a bit more positive because of survivor bias. A hitter who over-performs in the first half is more likely to get plate appearances in the second half than an identically talented player who struggled to start the season. The over-performing player will, on average, regress to his true talent level in the season’s second half, which gives the appearance that he is wearing down.
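That regression effect can be illustrated with a small simulation. This is a hedged sketch with made-up numbers: the talent level and noise size are assumptions for illustration, not estimates from the data.

```python
import random

# Survivor-bias illustration: every hitter here has identical true talent
# (wRC+ = 100), but only the ones who get lucky in the first half are assumed
# to keep their playing time. Their second-half numbers regress to 100, which
# looks like "wearing down" even though nobody's talent changed.

random.seed(1)
TRUE_TALENT, NOISE = 100.0, 15.0   # hypothetical half-season noise in wRC+

first_half = [random.gauss(TRUE_TALENT, NOISE) for _ in range(10000)]
survivors = [x for x in first_half if x >= 100]   # "hot starts" keep their jobs

# The survivors' expected second-half average is simply their true talent.
apparent_decline = sum(survivors) / len(survivors) - TRUE_TALENT
print(f"apparent 1H-to-2H decline among survivors: {apparent_decline:.1f}")
```

With these assumed numbers, the hot-start group shows a double-digit "decline" purely from regression, which is why the observed bracket values above should lean more positive than they appear.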

The above numbers do pass the smell test. Over the same time frame, all hitters have a 102 wRC+ in the season’s first half and a 97 wRC+ in the second half; overall league offense declines from the season’s first half to the second. The first thought to explain the change is that the average temperature might be warmer in the season’s first half. Higher temperatures should mean more offense, because for every 10-degree rise in temperature, a baseball will travel an additional four feet. But that’s not the case. Using Retrosheet data from 2005 to 2013 (2014 data are not available yet), the average temperature in the first half is 70.0 degrees; it jumps to 76.6 in the second half. On average, hitters should be performing better at the higher temperatures, but they aren’t.
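As a quick sanity check on the temperature claim, here is the arithmetic. The four-feet-per-10-degrees figure is the one quoted above; the half-season averages are the Retrosheet numbers from the text.

```python
# Back-of-the-envelope check: roughly four feet of extra fly-ball carry per
# 10-degree rise, applied to the first-half vs. second-half averages above.

FEET_PER_10_DEGREES = 4.0
first_half_temp, second_half_temp = 70.0, 76.6   # Retrosheet averages, 2005-2013

extra_carry = (second_half_temp - first_half_temp) / 10.0 * FEET_PER_10_DEGREES
print(f"extra fly-ball carry in the second half: {extra_carry:.1f} feet")
```

A 6.6-degree jump works out to roughly 2.6 extra feet of carry, so on physics alone the second half should be the more hitter-friendly half, making the observed decline more puzzling.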

Other than those in the 25-and-under group, most hitters see a small drop in ability over a season. But how do they produce in the first half of the next season? I ran the same process from the second half of Season One to the first half of Season Two. Here are the results:

Age Range: Change in wRC+

  • 21 to 25: -4.8
  • 26 to 34: -4.7
  • 35 to 40: -6.4

Talk about some negative numbers! The larger declines from 26 to 40 are fairly easy to understand. If a player is slowing down, losing eyesight and/or losing muscle mass, he would be expected to decline more over a nine-month span than over the three months between the first and second halves of the same season.

The decline for young players puzzles me. It is just a bit larger than the decline for the 26-to-34-year-old group. These players improved as the season went on, but declined from the end of one season to the start of the next. Colder weather could account for some of the effect, but not all of it. I don’t know the answer for sure, but here are some theories.

  • Teams have more time over the offseason to find out what works against certain new young hitters and can make changes going into the next season.
  • Younger hitters don’t take the offseason as seriously as older hitters. They don’t come into the season ready to play and struggle early on. Veterans who struggled in the past are going to put in more effort.
  • Survivor bias is even more of an issue because teams assumed the player they saw in the second half would be the same player the next year. They have no alternative ready when he regresses to a lower talent level to start the next season.

The sweet spot, after a player has shed his youthful innocence about what it takes to be a big leaguer but before his body begins letting him down, occurs around age 25. This drop can be seen in the overall wRC+ aging curve from the same time frame.

A couple of notes. First, aging curves used to rise and then fall, with a later peak, but recently I have noticed a change in the post-PED era, one that major league teams, like the Pirates, have also noticed.

Second, the aging curve was created with the delta method, weighting plate appearances (and innings pitched, for pitchers) by their harmonic means, as explained here. With this method there is a small survivor bias, which Mitchel Lichtman summarized in these internet pages back in 2009:

“… survivor bias, an inherent defect in the delta method, which is that the pool of players who see the light of day at the end of a season (and live to play another day the following year) tend to have gotten lucky in Year 1 and will see a “false” drop in Year 2 even if their true talent were to remain the same. This survivor bias will tend to push down the overall peak age and magnify the decrease in performance (or mitigate the increase) at all age intervals.”
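For the curious, the delta-method weighting can be sketched in a few lines. This is a hedged illustration: `delta_method` and the sample couplets below are made-up numbers, not the study’s actual data.

```python
# Delta method with harmonic-mean weighting. Each tuple is one hypothetical
# player's consecutive seasons: (PA year 1, wRC+ year 1, PA year 2, wRC+ year 2).
# The aging delta is the year-two minus year-one wRC+, weighted by the
# harmonic mean of the two seasons' plate appearances.

def harmonic_mean(a, b):
    return 2.0 * a * b / (a + b)

def delta_method(pairs):
    """Weighted average change in wRC+ across a list of season couplets."""
    weights = [harmonic_mean(pa1, pa2) for pa1, _, pa2, _ in pairs]
    deltas = [wrc2 - wrc1 for _, wrc1, _, wrc2 in pairs]
    return sum(w * d for w, d in zip(weights, deltas)) / sum(weights)

# Hypothetical age-30-to-31 couplets:
couplets = [(600, 110, 580, 104), (300, 95, 450, 92), (150, 120, 100, 105)]
print(f"average change: {delta_method(couplets):+.1f} wRC+")  # -6.0
```

The harmonic mean keeps any one lopsided season from dominating: a player with 600 PA one year and 100 the next gets far less weight than one with 350 PA in both, which is the behavior you want when the second season’s playing time is itself an outcome of performance.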

Age Range: Average change in wRC+

  • 21 to 25: +0.8
  • 26 to 34: -4.2
  • 35 to 40: -9.7

While the year-to-year numbers don’t add up exactly because of survivor bias, the trend is generally the same. Talent improves or holds roughly constant through age 25, hitters begin a slow decline from age 26 through age 34, and after age 34 the decline rate doubles.

Now, getting back to the original question: How do hitters perform as the season progresses? Generally, hitters 25 and younger improve during the season, while older hitters get worse. The reasons could be injuries or simply wearing down. The big surprise in aging happens between seasons, when players — who shouldn’t be experiencing any wear and tear on their bodies — get worse. Batters of all ages see their performance decline significantly from the end of one season to the start of the next. The reason is not obvious, but it could be a combination of weather, conditioning, regression, or teams having more time to prepare. All players eventually wear down; it is surprising to see this aging happen during the offseason instead of during the season.

Jeff, one of the authors of the fantasy baseball guide, The Process, writes for RotoGraphs, The Hardball Times, Rotowire, Baseball America, and BaseballHQ. He has been nominated for two SABR Analytics Research Awards for Contemporary Analysis and won in 2013 in tandem with Bill Petti. He has won four FSWA Awards, including one for his Mining the News series. He’s won Tout Wars three times, LABR once, and got his first NFBC Main Event win in 2021. Follow him on Twitter @jeffwzimmerman.
7 years ago

Pitcher velocity is lowest in April. Maybe it’s not the batters wearing down, but the pitchers warming up.

Mr Punch
7 years ago

I’m not sure the off-season decline is all that surprising. It’s six months, and not all players work out consistently; plus a fair number have surgery. The drop for young players is interesting, though. Might it be worthwhile to focus on this group, and to (for example) separate out rookies? There is, after all, a phenomenon known as the “sophomore slump.”

7 years ago

The weird result with the Season One second half to Season Two first half comparison makes me wonder if there’s some artifact of the technique (like strong survivor bias) that affects things. wRC+ is normalized within seasons, right? So maybe if you were to analyze the raw numbers between season halves, that weird drop in hitting among younger players would go away?

Really interesting piece.

7 years ago

If hitters decline within a season and between seasons, where is offense injected into the system? Is it all rookie hitters? Or reinvigorated hitters returning after a recovery?

Also would there be any value in removing April and September from the analysis? April’s usually strange due to weather and rusty players and September’s usually strange due to call-ups and strange motivations (getting a look at young players vs resting veterans vs clawing for a playoff spot). May-through-August seem to be more similar to each other.

7 years ago
Reply to  Jianadaren

I think that the survivorship bias gives the illusion that there is offense “leaking” from the pool of MLB players, since too much playing time ends up going to guys whose true talent is less than their performance in the previous time period.

There may be a similar bias at work for the guys who’ve underperformed their true talent level, and therefore only get plate appearances when an MLB team needs to dig into their farm. For example, Steve Pearce got very little playing time in 2013, which would reduce the weight that his improvement into 2014 carries in the study. These kinds of players represent an injection of offensive talent into the MLB player pool, but it’s possible that this impact gets understated by the weights used within the delta method.

7 years ago

Does this account for decreases in average offense that could be occurring during the year? For example, if umpires are changing how they call the bottom of the strike zone in the course of a season, wouldn’t that affect your 1st half to 2nd half results?

7 years ago
Reply to  Jeff Zimmerman

wRC+ is adjusted on a seasonal basis, not within the year. I’m talking about the offensive environment declining within a single year–going from a more hitter-friendly to more pitcher-friendly environment as the year goes on, due to umpires or something else.

7 years ago

Since wRC+ always averages out to 100 over a season, only ~20% of qualified hitters are age 21-25 (the group that does worse in the first half), young hitters are below average (wRC+ in the low 90s), and the average rookie is even worse, your results don’t really seem possible. You can’t have an almost league-wide year-over-year decline in wRC+ without an influx of new talent, but the young players suck, the first-half rookies suck, and the full season of rookies really sucks (from the 40-man refugees getting cups of coffee, I’d guess). Plus, you’re moving from the 2H, where they’re bad, to the next season’s 1H, where they’re better, and still claiming a near league-wide decline year to year. There’s just no way this is actually happening in season-normalized wRC+ terms. There’s no possible talent influx to offset that.

7 years ago
Reply to  Jeff Zimmerman

But it can’t be just that. You’re reporting that ALL surviving players (in aggregate) drop 4-6 points from the 2H of Year One to the 1H of Year Two. That’s only possible if players in the next season’s 1H who aren’t in the sample (rookies, and possibly guys returning from huge injuries, years abroad, or a season in the minors) completely offset that drop (and then some, since 1H wRC+ is 102), but 1H rookies aren’t even at 100 themselves. It doesn’t add up.

7 years ago

I raised a similar question to Tom C on BBTF. Keep in mind that it’s not just rookies entering the system but older players leaving it. Still, there is something counterintuitive about the results, and just looking at 1H vs. 2H, when you likely have more granular time periods available, also seems like it might be obscuring something.

Here is what I wrote over there:

There’s something strange to me about using wRC+ for this comparison, since it compares players to the league average. You really are not measuring whether hitters got better in an absolute sense but rather which age bands got better relative to the other ones. So it is a bit confusing to see a result that says that players of all ages got worse relative to league average. Does that mean that the new players coming into the league at the beginning of the season are so much better than the old guys who didn’t return, and everyone else looks worse relative to league average as a result? And basically the young guys improve in-season and then regress during the off-season, roughly maintaining a constant-to-slightly-improving talent level, while everyone else is getting worse in-season *and* during the off-season?

Other Dave
7 years ago
Reply to  Dave

I think this might be an important thing, the older players leaving. What is not captured here (I think) is that a bunch of (older) players are not coming back, and are replaced by rookies. The rookies might not be good, but they are (apparently) a lot better than the guys that disappear. Which makes some sense, if you are bad enough to not play again, the guy that replaced you is probably significantly better.

It doesn’t explain why everyone gets worse all the time at the individual level, but it may help explain it at a league level.

7 years ago
Reply to  Other Dave

I think you’ve nailed the issue on the illusion of offense “leaking” from the study. If the rookies and the Steve Pearces of the world are in the 95-105 wRC+ range when they debut, they boost the league level of offense vs. the guys who lose their jobs because of lousy hitting. And the guys losing their jobs drag the aging curve down by either (a) failing spectacularly over a small number of PAs or (b) producing clearly unacceptable offense (75-85 wRC+) over a long enough stretch that the team views the decline as real.

The more I think about it, the more Jeff’s results do make sense.

7 years ago

I think at least some of the data are mis-stated. The linked queries supporting the statement that the overall first-half wRC+ is 102 and second-half wRC+ is 97 are for 2014 only. These are FanGraphs links; there, the definition of first half changes each year (the All-Star break?), and there doesn’t seem to be a way to get 10-year totals in one query. When I ran a query for each year from 2005 to 2014 and averaged the results, it looks like the 10-year averages are about 100 wRC+ in both the first and second halves. I didn’t try to check other numbers.

7 years ago
Reply to  Joearthur

Also, to the extent the age-specific numbers are right, they represent a combination, in unknown proportions, of in-season aging effects and “wearing down.” These may be entirely aging effects, as I think Dave suggests…

Calvin Liu
7 years ago

Interesting idea, but it ignores at least two very important factors:
1) The book on each hitter. The younger the player, the less likely there is a book. As the season wears on, and into the next season, the book gets better. For older players the book also matters, as there is continuous adjustment between batters and pitchers. They’re not robots out there.
2) Growth of the hitter. See above.
Other comments on survivor bias are also quite relevant.

Cyril Morong
7 years ago

Interesting post. Do pitchers wear down? Should teams give more rest to some players? Should they call up some younger players in the 2nd half?

I looked at how league OPS in June, July and August compares to the OPS for the entire season, by decade, going back to 1914. It seems like the warm-weather effect might be getting a bit weaker over time, but there is no real clear pattern.

Rich Moser
7 years ago

I’ll take this thread in a different direction. To me, the biggest story of this season (2014) was the “decline” (downward expansion) of the strike zone as the season approached the playoffs. If this turns out to be some sort of annual phenomenon that couldn’t be measured until now, then it would be a major factor hurting the batters in the second half.