by Rany Jazayerli
March 3, 2004
http://www.baseballprospectus.com/article.php?articleid=2633
So to understand the methods we use to analyze pitcher usage, it's important
to appreciate that while every team in baseball today employs essentially the
same usage pattern--starting pitchers work in a five-man rotation, with four
or five days of rest between starts, and never relieve in between--that
usage pattern is far from the norm historically.
As recently as 30 years ago, starters were expected to start every fourth
day, with only three days of rest between starts. This does not appear to
have had a detrimental effect on the pitchers of that era; in fact, over half
of the 300-game winners of the live-ball era were in the prime of their
careers in the early 1970s.
There is no definitive proof that pitching in any kind of rotation is a
necessary ingredient for successful pitching staffs. Through the 1950s,
a starting pitcher would routinely be held back six or seven days so he could face a team he matched up favorably against, then return to the mound on just two days' rest for his next start.
There is no evidence that starting pitchers who relieve on their days off
between starts suffer adversely for doing so. Starting pitchers routinely
made 10 or 15 relief appearances a season for the better part of half a
century.
So if starting pitchers have been used in many different ways over the years,
and there's no hard evidence that any one usage pattern was more likely to
keep pitchers healthy, how do we determine whether a pitcher is being used in
a manner that's likely to get him hurt?
One thing we have learned is that for starting pitchers, how many days off
they get between starts does not seem to correlate with injury risk. This
series of articles carefully examined the track record of pitchers working in a four-man rotation vs. pitchers in a five-man rotation, and found that pitchers who worked in a four-man rotation stayed just as healthy as pitchers working every fifth day. It also showed that a pitcher working on three days
of rest is no less effective than when he works on four days of rest, and in
fact that he might have better command on less rest.
What seems to matter isn't how often a starter pitches, but how much he
pitches when he does take the mound. About five years ago, we unveiled a
system known as Pitcher Abuse Points (PAP for short) that attempted to
measure just how much is too much. The system is based on the following
principles:
While pitching is an inherently unnatural motion, throwing a pitch does not
necessarily do permanent damage to a pitcher's arm. It's only when fatigue
sets in (and a pitcher's mechanics start to waver) that continued pitching
can result in irreversible injury.
There is a certain number of pitches that a pitcher can throw before that
fatigue sets in.
Once a pitcher is fatigued, each additional pitch causes more damage, and
results in more additional fatigue, than the pitch before.
The original version of PAP operated under the assumption that fatigue set in
at 100 pitches, and after 100 pitches a starter was awarded Abuse Points for
each additional pitch. The number of points he received per pitch slowly
increased as he threw more pitches.
Two years later, Keith Woolner performed the definitive study that examined
the relationship between high pitch counts and injury risk. First, Woolner
looked at whether there was a relationship between high pitch counts and
decreased effectiveness over the pitcher's next few starts. What he found was
that, while the relationship was there, the formula for PAP needed to be
changed--that until that point, the system did not penalize pitchers enough
for really high pitch counts (120 and up) compared to a 105- or 110-pitch
outing.
Then using the new, refined formula for PAP, Woolner showed that there was,
indeed, a link between high PAP scores and future injury risk.
The way PAP scores are calculated is quite simple: take the number of pitches thrown in any given start and subtract 100. (If the pitcher threw fewer than 100 pitches, he automatically receives zero PAP for that outing.) The result is then cubed to arrive at the PAP score for that start:
100 pitches: (100 - 100)^3 = 0^3 = 0 PAP
105 pitches: (105 - 100)^3 = 5^3 = 125 PAP
115 pitches: (115 - 100)^3 = 15^3 = 3,375 PAP
130 pitches: (130 - 100)^3 = 30^3 = 27,000 PAP
As you can see, by this method a 130-pitch outing is eight times more
damaging than a 115-pitch start, and 216 times worse than throwing 105
pitches.
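To make the formula concrete, here is a minimal sketch of the PAP calculation in Python. The function name and interface are my own illustration; only the rule itself--cube the pitches beyond 100, zero otherwise--comes from the system described above.

```python
def pitcher_abuse_points(pitch_count: int) -> int:
    """PAP for a single start: zero through 100 pitches,
    then the cube of every pitch beyond 100."""
    return max(0, pitch_count - 100) ** 3

# Reproduce the worked examples above:
for pitches in (100, 105, 115, 130):
    print(f"{pitches} pitches -> {pitcher_abuse_points(pitches)} PAP")
# 100 pitches -> 0 PAP
# 105 pitches -> 125 PAP
# 115 pitches -> 3375 PAP
# 130 pitches -> 27000 PAP
```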
There's one other factor that needs to be considered when evaluating whether
a starting pitcher is throwing too many pitches. As first explored by Craig
Wright in his landmark book, The Diamond Appraised, starting pitchers under
the age of 25 appear to be particularly sensitive to how many innings they
are allowed to throw. Some of the most talented young pitchers of the last
forty years--from Gary Nolan and Don Gullett in the 1970s, to Dwight Gooden
in the 1980s, to the Mets' "young-guns" trio of Bill Pulsipher, Paul Wilson,
and Jason Isringhausen in the 1990s--went on to suffer career-threatening
injuries that prevented them from ever reaching their full potential.
Conversely, two of the most durable starters of our generation, Nolan Ryan
and Randy Johnson, weren't even full-time starters in the majors until they
turned 25.
Indeed, when 21-year-old Kerry Wood blew out his elbow, the spring after one
of the most exhilarating seasons ever by a rookie pitcher, it proved to be
the spark needed to convince major-league organizations that lowering the
pitch counts of their starting pitchers might prevent a significant number of
injuries--and save them millions of dollars in the process.
So to recap, here's everything we know about the usage of starting pitchers:
There is no evidence that the current system of employing a five-man rotation
is any better at accomplishing what it was created for--keeping pitchers
healthy--than the four-man rotation. It appears that most pitchers simply
don't need more than three days of rest between starts.
In the era of the four-man rotation, teams were able to get six or seven more
starts, and 50-75 more innings, out of their best starters than teams do
today.
Starting pitchers have, historically speaking, thrived without use of a fixed
rotation at all.
Starting pitchers have, historically speaking, been used as relievers between
starts without adverse consequences.
What seems to put starters at risk of injury is throwing too many pitches per
start.
Roughly speaking, "too many pitches" seems to translate to "over 100".
Once a pitcher hits his fatigue point, his risk of injury goes up very
quickly with each additional pitch.
Pitchers under the age of 25 are exquisitely sensitive to overuse.
The ideal usage pattern--something I think we'll see some teams try to
emulate over the next decade--would probably look something like this:
A reversion to a four-man rotation, giving a regular starter 40 to 41
starts over the course of a season.
More careful observation of pitch counts, with most pitchers probably
averaging about 90-95 pitches a start, and rarely going over 110 in any given
outing. Older, more established pitchers might average closer to 100 pitches
a start, with a soft limit of 120 pitches in an outing.
Judicious use of a starting pitcher on his standard throw day between starts
could net another seven or eight appearances and 10-15 innings over the
course of a season.
Individual starting pitchers throw fewer innings, and therefore have less
overall value, today than at any point in baseball history. But it doesn't
have to be that way. By following these guidelines, teams should be able to
safely get 280-290 innings out of their #1 starter, instead of the 220-230
innings we see today.
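As a rough sanity check on that projection, here is the back-of-the-envelope arithmetic as a short Python sketch. The innings-per-start figure is an assumption of mine, not a number from the article:

```python
# Back-of-the-envelope workload projection under the guidelines above.
STARTS = 40           # four-man rotation over a 162-game season
IP_PER_START = 6.8    # assumed average innings per start (my estimate)
RELIEF_IP = 12        # midpoint of the 10-15 relief innings cited above

total_innings = STARTS * IP_PER_START + RELIEF_IP
print(f"Projected innings: {total_innings:.0f}")  # 284, within 280-290
```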