For about as long as baseball has been played, players, coaches, and fans have pondered how to properly evaluate players. As times change, the methods of evaluating players have evolved with them.
Since pitchers tend to come under the most scrutiny of any position and are generally considered to be the most critical asset to a team’s success, the Earned Run Average (ERA) statistic was created.
So, what is ERA in baseball?
Earned Run Average, or ERA, is a statistic used to evaluate the number of earned runs that a pitcher, on average, allows per nine innings pitched. It is calculated by taking the pitcher’s total earned runs allowed, dividing by the number of innings pitched, and multiplying the result by nine.
Now, a pitcher’s ERA doesn’t tell the entire story, as different sets of circumstances can lead to wild swings in what is considered a “normal” ERA and likewise what is considered good or bad. We’ll go further in-depth to explain exactly what an ERA is and how it’s interpreted.
So let’s address the question at hand, shall we?
What Is a Pitcher’s Earned Run Average?
In baseball, the job description for a pitcher is rather simple: to prevent the other team from scoring. Naturally, the most commonly accepted stat for evaluating pitchers involves how effective the pitcher is at preventing runs, which ERA reflects.
A pitcher’s ERA reflects the average number of earned runs that the pitcher, based solely on his pitching, would allow per nine innings.
ERA is meant to separate the pitcher’s performance from that of the defense behind him so that he isn’t punished for poor fielding. As a result, there are two different types of runs that a pitcher can be charged with: earned and unearned runs.
To better understand the significance and calculation of ERA, we should probably explain what the difference is between the two.
What Are Earned and Unearned Runs in Baseball?
In the simplest terms, an earned run is a run that is deemed to be allowed solely due to the pitcher’s pitching ability, while an unearned run is a run that would not or likely would not have scored without the benefit of an error or passed ball.
Earned runs are by far the most common, accounting for over 92% of the 23,467 runs scored in the 2019 Major League Baseball (MLB) season. These runs are almost exclusively scored as the result of a combination of hits, walks, and hit batters, as well as well-timed outs.
On the other hand, unearned runs are much rarer, accounting for only 1,783 MLB runs in 2019. In broader terms, there were roughly three unearned runs scored over the span of every four MLB games in the 2019 season.
These runs are most commonly the result of a fielder committing an error.
Runs resulting from the pitcher’s own fielding errors also count as unearned; while those errors are the pitcher’s fault, they aren’t indicative of his pitching ability specifically.
There are rules differentiating earned and unearned runs. If a batter reaches base as a direct result of an error and later scores, his run will be unearned regardless of what happens afterward.
However, there is also a gray area with errors that advance a runner, but do not directly result in that runner reaching base.
In these cases, the official scorer is left to judge whether the run(s) would have scored anyways had the rest of the inning played out the same way without the error.
For example, let’s say a batter singles with one out and nobody on base, but the outfielder mishandles the ball, allowing the runner to advance to third on the play.
If the next batter were to hit a home run, then both runs are earned, as they would’ve scored anyway on the homer.
On the other hand, if the next batter instead hits a sacrifice fly that scores the run (or if he scores on a ground out, a wild pitch, or even a single) and the inning subsequently ends with no further baserunners, then the run becomes unearned if the official scorer believes that the run would not have scored without the error.
Because errors can extend innings, runs that score with two outs after an error is committed are generally counted as unearned.
Additionally, if an error takes place amid an inning with multiple runs, it is possible for some of those runs to be unearned while others are earned.
So with that out of the way, let’s get back to the main focus, ERA, and look at how you calculate a pitcher’s ERA.
How Do You Calculate ERA?
ERA can be a little tricky to calculate at first because there are three numbers to take into consideration, but once you get the hang of calculating it a few times it shouldn’t be too difficult.
A pitcher’s ERA is calculated by dividing the total number of earned runs allowed by the total number of innings pitched, then multiplying the total result by nine.
The resulting formula will look like this: ERA = (earned runs/innings) x 9
As an example, we’ll use a pitcher who has 50 innings pitched in a season and allowed 25 total runs, but only 20 earned runs. Because five of those runs were unearned, we discard them and use only the 20 earned runs and 50 innings pitched.
So, 20 divided by 50 is 0.4, and once you multiply that by nine, the result is 3.6. ERA is almost always denoted to two decimal places, so the pitcher would be listed as having a 3.60 ERA.
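The calculation above can be sketched in a few lines of Python. This is a hypothetical helper for illustration, not part of any official stat library:

```python
def era(earned_runs: float, innings_pitched: float) -> float:
    """Earned Run Average: earned runs allowed per nine innings pitched."""
    return earned_runs / innings_pitched * 9

# Worked example from the text: 20 earned runs over 50 innings pitched
print(f"{era(20, 50):.2f}")  # 3.60
```

One caveat if you pull numbers from a box score: partial innings are conventionally written as .1 and .2, which stand for one third and two thirds of an inning, so a line like “50.2 IP” would need converting to 50⅔ before plugging it into the formula.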
Now that we know how to calculate ERA, let’s look at how to apply that knowledge.
What Is a Good ERA in Baseball?
There is no set-in-stone standard for what a “good” earned run average in baseball is, as numbers fluctuate due to several factors, including elevation, quality of hitting/pitching in the league, ballpark dimensions, and other things.
That said, as an unofficial benchmark, in the 21st century, an ERA below 4.00 is considered good, an ERA below 3.00 is great, and one below 2.00 is exceptional. An ERA above 5.00 is generally considered poor.
In 2019, for example, the league-wide ERA in MLB was 4.49. By that measure, our theoretical pitcher’s 3.60 ERA represents a solid season, as it is almost 20 percent lower than the league average.
On the other hand, the lowest league-wide ERA of baseball’s so-called “Modern Era” (which began in 1901) came in 1908, when Major League Baseball as a whole recorded a 2.37 ERA. Had our theoretical pitcher recorded his 3.60 ERA in 1908, it would have been 52% higher than the league average.
In the past 20 seasons, the annual ERA leaders for all of MLB have had ERAs ranging from 1.66 to 2.77, with the average ERA for the major league leader in that span being 2.37.
For context, the league ERA in that span is 4.25, meaning that the league leader in ERA is on average about 44% lower than the league average.
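The league-average comparisons above reduce to simple percentage arithmetic. Here is a quick sketch, using the numbers from the text (the function name is ours, for illustration only):

```python
def pct_below_league(pitcher_era: float, league_era: float) -> float:
    """How far below the league average an ERA sits, as a percentage."""
    return (league_era - pitcher_era) / league_era * 100

# Our theoretical pitcher's 3.60 ERA against the 4.49 league mark in 2019
print(round(pct_below_league(3.60, 4.49)))  # 20

# The average league leader (2.37) against the league ERA of that span (4.25)
print(round(pct_below_league(2.37, 4.25)))  # 44
```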
This illustrates how important a strong ERA is because as stated earlier, the primary goal of a pitcher is to prevent the other team from scoring.
And when it comes to earning a low earned run average, National League pitchers have an advantage over their American League counterparts. This is because the National League doesn’t use the DH, meaning that pitchers in this league get to throw to other pitchers, who are almost always poor hitters.
When Did ERA Originate in Baseball?
Earned run average is a statistic that came about early in baseball’s history, born of the idea that pitchers needed to be assessed differently than the early method of just wins and losses.
Early baseball writer and statistician Henry Chadwick is credited with inventing the Earned Run Average statistic in the mid-to-late 1800s, though the exact year is unknown.
Chadwick believed that wins and losses were not true indicators of a pitcher’s effectiveness, so he sought another statistic meant to capture how effective a pitcher was at preventing runs from scoring.
As it turns out, Chadwick was generations ahead of his time, as wins and losses by a pitcher continued to be highly valued for many decades despite that statistic being more indicative of a team’s collective performance.
As relief pitching became more common in the early years of the 20th century, ERA gained traction because pitchers began appearing in games without factoring into wins and losses.
Earned run average became an official statistic of Major League Baseball in 1912, though ERA figures from previous years have been retroactively tabulated.
Now that you know what ERA is and how to interpret it, you should have a better idea if the pitcher you see on the mound is one you can trust or one you should worry about.
Odds and Ends
- The lowest ERA in a season (minimum 1 IP per team game) in MLB’s Modern Era (since 1901) is a 0.96 mark posted by Dutch Leonard of the Boston Red Sox in 1914. The lowest earned run average of the “Live-ball era” (since 1920) is the 1.12 ERA turned in by Hall of Famer Bob Gibson of the St. Louis Cardinals in 1968. In the 21st century, the lowest mark is Zack Greinke’s 1.66 earned run average for the 2015 Los Angeles Dodgers.
- The lowest career ERA of all time (minimum 1,000 innings pitched) belongs to Hall of Famer Ed Walsh, who recorded a 1.82 earned run average between 1904 and 1917. The lowest for a pitcher who pitched exclusively in the live-ball era (post-1920) is 2.21, accomplished by Mariano Rivera between 1995 and 2013. Clayton Kershaw has the lowest career earned run average of any active pitcher at 2.44.
- The highest ERA in a season by a qualified pitcher is a 7.71 mark posted by Les Sweetland of the 1930 Philadelphia Phillies. The Phillies also posted a 6.70 team earned run average that year, the worst by any team in Major League Baseball history.