The Invincible Pettitte and Lee

October 18, 2010

If you hate me or my writing, my anti-traditionalism or my affinity for newfangled statistics, you should be happy to know that at eight o’clock tonight I will be one of the most miserable people in New York. Because the majority of my grad school classes run from 7-9pm, I will be missing the first hour of Games 3 and 4 in the ALCS. I have two strategies to combat this horrific coincidence, neither of which will make anything better at all. First, I will look around the classroom for other students whose faces are just as contorted in displeasure as mine will be. Surely I will find someone with whom to exchange a despairing wince (I won’t, because my school is something like 98% female. Maybe that makes me a pig, but I’m pretty sure there are no serious baseball fans in my classes). When that fails, I will pull out my BlackBerry and hit “refresh” one thousand times until class ends, when I will scamper to the nearest television for what will probably be disappointing news.

Yes, “disappointing.” The Yankees face Cliff Lee tonight, a man who has transformed himself into one of the elite pitchers in baseball. He will probably pitch very well. But he is not invincible, as most of the newspapers and radio shows would have you believe. Lee pitched terribly in August and has occasionally pitched poorly in other months. Because pitching is, you know, difficult and unpredictable. If Roy Halladay can give up four runs to the Giants after no-hitting the Reds a week earlier, that means Lee can be beaten too, even if we are being told he is unbeatable.

More perplexing than the “Lee is unbeatable” narrative is its “Andy Pettitte is the clutchiest thing to ever walk the earth” counterpart. Lee has called Pettitte “the best postseason pitcher of all time,” an errant notion that I can condone because Lee is a professional baseball player and not a professional analyst. But it’s not just Lee who thinks this. Much of the local sports media – likely on account of Pettitte’s 19 playoff wins and his modest, likable personality – is echoing the idea that October is under his dominion.

Lost in this mythologizing is the fact that Pettitte pitches almost exactly the same in the playoffs as he does in the regular season. His postseason ERA is one one-hundredth of a run lower than his regular season ERA. He has allowed hits at exactly the same rate. His playoff home run rate is higher than his regular season’s. He strikes out fewer batters in October but walks fewer too, making his K/BB ratio almost exactly the same as in the regular season. Look at the numbers for yourself. Pettitte doesn’t step it up in October. It’s a myth.

Tonight, one of these myths will probably be crushed. I’m rooting for Lee’s, but even as a Yankee fan, the debunking of Pettitte’s playoff invincibility would be an acceptable silver lining in the long term.



The Relationship Between Fielding Effectiveness And Balls In Play

January 29, 2010

What you see before you is more than a funny looking picture. It is a symbol of my unrelenting devotion to truth, the physical embodiment of some of the data to be revealed in this post, and evidence that I am a Grade A dork. But let’s not focus on that last one. This thing is called a “boxplot,” and I know that because my good friend and occasional commenter told me so after creating it for me. This friend graduated from Vanderbilt with me in 2008, and has gone on to earn two master’s degrees – one in economics, and one in mathematics. He is also the person I turn to when I have a hypothesis, a ton of sports-related data in a spreadsheet, and not a blasted clue about what to do with it. So, R. Thomas, this boxplot’s for you.

Back to sports. If you are a baseball fan and you watch baseball games, I’m willing to wager my considerable life savings that you’ve heard an analyst talk about the importance of fielders “staying on their toes.” This usually happens when a pitcher is working quickly and accumulating outs on fly balls and ground balls. The analyst will talk about how keeping the fielders involved in plays increases the quality of the defense behind the pitcher, because it prevents fielders from getting distracted, dozing off, or stiffening up due to lack of movement. This saying is, at its core, a variation of the old “practice makes perfect” dictum. If a pitcher pitches in such a way that his defenders stay involved and get into a fielding rhythm, their defense improves. That’s what they say.

Of course, just a small amount of critical thinking reveals some serious holes in this logic. Supposing pitchers do have some control over how and where their pitches are hit (a huge and false supposition), wouldn’t leaning so heavily on their defense yield tired fielders? If pitchers do have this sort of control, shouldn’t they just remove any possibility of an error and try to strike batters out? Wouldn’t the increased number of balls in play create more chances for hits and fielding errors? There are a ton of problems with the idea that fielders field better when under constant fire, but that hasn’t stopped it from emerging almost daily during baseball season.

Well, this is me doing my small part to refute this erroneous claim. The first thing I did was acquire the number of balls in play allowed by the pitching staffs of every major league team since 1990. Then I looked up each team’s defensive efficiency, which is the rate at which balls put into play are converted into outs. Lastly, and with the necessary help of my aforementioned friend, I examined the relationship between the two via the correlation function in Excel. The results shed light on the extent to which the two variables (balls in play and defensive efficiency) are related. The closer the number is to +1.0, the more positive the relationship between the two is (if one goes up, the other goes up). The closer the number is to -1.0, the more negative the relationship is (if one goes up, the other goes down). A number near zero means there is little or no relationship at all. I’m sure this is fascinating, so I hate to tear you away from the riveting inner workings of statistical functions, but here are the results:

  • 2009: -0.42
  • 2008: -0.59
  • 2007: -0.32
  • 2006: -0.52
  • 2005: -0.56
  • 2004: -0.02
  • 2003: -0.15
  • 2002: 0.11
  • 2001: 0.13
  • 2000: -0.03
  • 1999: -0.11
  • 1998: -0.53
  • 1997: -0.50
  • 1996: -0.36
  • 1995: -0.50
  • 1994: -0.29
  • 1993: -0.53
  • 1992: 0.08
  • 1991: -0.49
  • 1990: -0.20

As you can see, only three (1992, 2001, 2002) of the past 20 seasons reveal some sort of positive relationship between the number of balls in play and defensive efficiency. And in all three cases, the positive correlation is quite weak. On the other hand, the 17 other seasons that I examined reveal a negative relationship between the two variables. With a few exceptions where the correlation is near zero (1999, 2000, 2003, 2004), the negative relationship is pretty pronounced. As a result, these numbers suggest fairly strongly that, at least in the last 20 years, defenses labor when more balls are put into play by hitters. This makes a great deal of sense given the very obvious and intuitive problems with the whole “keeping fielders on their toes” idea.

I’m sure this information is completely unsurprising to baseball teams and the (usually) very bright individuals who run them. By no means is this meant to be some sort of breakthrough that organizations can use to construct better teams. Instead, this post is targeted at those who believe (or have got into the habit of saying) that fielders perform better when they are put to regular work. I’ve heard broadcasters say that fielders are more comfortable when they remain active in the game, and I have no doubt that some players truly feel that way. But in the last 20 years, increased fielding activity has in no way boosted teams’ defenses. In fact, the opposite appears to be true.

I guess the key to good defense is just having plain old good fielders.

Modern Pitchers Are No Less Aggressive Than Their Historical Peers

August 12, 2009

During today’s Blue Jays-Yankees game, color commentator Paul O’Neill made an off-hand assertion that piqued my interest. I forget what exactly led to this comment, but O’Neill said:

“In today’s game, pitchers are afraid to throw the ball over the plate.”

His comment is not meant to be taken literally; he’s not saying that modern pitchers tremble at the thought of throwing a strike. In baseball-ese, however, O’Neill’s observation roughly translates to “modern pitchers throw fewer strikes and are less aggressive in attacking hitters than they used to be.” Although it was just a throwaway line, two reasons drove me to wonder whether it was actually true. The first is my heightened sensitivity to any statement – sports-related or not – that implies or asserts that things now are worse than things “back then.” While it’s true in some cases, I believe that such statements result from some combination of ignorance, envy, and insecurity. The second reason for my curiosity is the frequency with which I’ve heard this assertion over the years. O’Neill is merely the most recent in a long line of baseball analysts to have said this.

This was a difficult subject to research. FanGraphs’ Zone% statistic is, essentially, exactly what I was looking for. This statistic simply charts the percentage of a pitcher’s offerings that end up (or would have ended up) in the strike zone. Unfortunately, an unwieldy interface (or user incompetence), combined with only 35 years of available data, made this option infeasible. So, I decided to track Major League Baseball’s BB/9 statistic since 1901. Obviously, the lower the BB/9, the better the era’s pitchers’ control was. The results were fairly surprising, at least to me:

  • 2000s: 3.37
  • 1990s: 3.45
  • 1980s: 3.23
  • 1970s: 3.31
  • 1960s: 3.14
  • 1950s: 3.59
  • 1940s: 3.59
  • 1930s: 3.28
  • 1920s: 3.04
  • 1910s: 2.95
  • 1900s: 2.53
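For reference, BB/9 is nothing exotic: it’s just walks allowed, scaled to a nine-inning rate. A minimal sketch, with made-up league totals rather than actual historical figures:

```python
# BB/9: walks allowed per nine innings pitched.
# The sample totals below are illustrative, not real league data.
def bb_per_9(walks: float, innings_pitched: float) -> float:
    """Return walks per nine innings."""
    return walks * 9 / innings_pitched

# e.g. a hypothetical league that issued 15,000 walks over 43,000 innings
print(round(bb_per_9(15000, 43000), 2))  # 3.14
```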

Predictably, pitchers issued very few walks in the first quarter of the 20th century. What surprised me, however, was the apparent lack of control exhibited from 1940 to 1959. Many members of the sports media have levied O’Neill’s criticism against modern pitchers while pointing to this era as a time when pitchers were aggressive and threw strikes. Apparently that’s not true. Ultimately, these numbers reveal that – with the exception of the early 20th century – modern pitchers are no less aggressive or capable than their historical peers.

I Repeat: A “Five-To-Six Inning Pitcher” Is Not A Bad Thing

May 29, 2009

Consider this post my informal proposal to retire the phrase “he’s a five-to-six inning pitcher.” This phrase – used with some regularity in baseball circles – always carries a respectfully negative connotation. It’s intended to say, tactfully, “he’s not very good, but he’ll take his lumps and get you through nearly two-thirds of the game.” Most recently, ESPN’s Buster Olney used it to describe the Phillies’ Jamie Moyer:

With Moyer essentially a five-to-six inning pitcher these days, the last thing that the Phillies need is to acquire another starter who would consistently leave 9 to 15 outs on the table for the bullpen. 

Olney, like many other baseball writers, continually neglects the fact that the average starting pitcher in the major leagues is “a five-to-six inning pitcher.” Look at the average length of a pitcher’s start since 2000:

  • 2009: 5.80 IP
  • 2008: 5.80 IP
  • 2007: 5.79 IP
  • 2006: 5.82 IP
  • 2005: 5.99 IP
  • 2004: 5.85 IP
  • 2003: 5.86 IP
  • 2002: 5.85 IP
  • 2001: 5.91 IP
  • 2000: 5.91 IP
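The figures above are simple averages: total innings thrown by starters divided by games started. A quick sketch, using placeholder season totals rather than the actual league numbers:

```python
# Average length of a start: starter innings / games started.
# The sample totals below are illustrative, not real season data.
def avg_start_length(starter_innings: float, games_started: int) -> float:
    """Return average innings pitched per start."""
    return starter_innings / games_started

# e.g. a hypothetical season with 28,150 starter innings over 4,860 starts
print(round(avg_start_length(28150, 4860), 2))  # 5.79
```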

As you can see, a phrase that is meant to criticize politely actually describes an average performance. Furthermore, there are many, many teams in Major League Baseball that would love to have someone who is “essentially a five-to-six inning pitcher.” There’s good value in average starting pitching, believe it or not. Since average starting pitching is somewhere between five and six innings per start, I propose that we banish the critical usage of “five-to-six inning pitcher.” Such criticism would be valid in, say, 1954; pitchers threw 463 complete games that year. But in the modern game, this qualifier adds nothing.

Inexplicably, Home Runs Remain An Underrated Means Of Scoring

May 26, 2009

One of the more puzzling sentiments that has made its way into mainstream baseball analysis is the idea that home runs kill rallies. You don’t hear it in every game, or even most games, but when the opportunity presents itself, you can count on a broadcaster unleashing this bit of misinformation. For example, if a team loads the bases with no outs, and the batter hits a grand slam, it is likely someone will say “I’d rather have had a single to keep the line moving than a rally-killing homer.”

I hope the fallaciousness of this thinking is fairly self-evident. A home run is, by definition, the single best result a hitter can achieve during his at-bat. At the very least, it guarantees one run for his team. It often guarantees more. But it’s a guarantee, and that’s the most important point to remember and the very point that people forget when they proclaim certain home runs “rally-killers.” As a fan, it’s easy to understand the feelings behind such a statement. The bases are loaded, no one is out, and there’s all the promise in the world of an endless inning with lots and lots of scoring. When a player hits a home run and clears the bases, it just feels like the start of the inning all over again. Sure, multiple runs have scored, but now there’s no one on base. So, I understand the visceral reaction leading to the idea of rally-killing home runs. It’s important to understand, however, that the home run itself is the very rally that people fear has been killed.

I bring this up because Tampa Bay Rays pitcher Jason Isringhausen has introduced an apparent descendant of this misguided maxim. After helping blow a 10-0 lead over the Cleveland Indians, the Rays’ reliever offered this bit of thinking:

“The walks are unacceptable,” Isringhausen said. “I’d rather give up home runs than walk guys.”

Isringhausen’s preferences are his own choice, but if he’s intent on pitching effectively, then his choice is wrong. It’s wrong for the same reason that home runs as “rally-killers” is wrong. If hitting a home run is the best thing a hitter can do, it’s also the worst thing a pitcher can allow. A walk is bad, yes, but allowing a home run means that the opponent has instantly scored at least one run. That’s much, much worse than allowing a baserunner.

While wrong, Isringhausen’s statement is understandable. As a fan, it’s agonizing to watch your pitcher walk batter after batter. It’s a slow, painful death that wreaks havoc upon the nerves and grants an amplified feeling of powerlessness. Seeing your pitcher allow a home run, on the other hand, provides certainty. It’s the devil you know. Once the ball leaves the park, you know exactly what the score is going to be, and you can start to get on with your life. Walks don’t afford that luxury. So, once again, I comprehend the feelings behind a statement like Isringhausen’s. That doesn’t make him any less wrong.

Dispelling The Duke Myth

January 8, 2009

I have a secret that I would like to share with you. Once upon a time, when I was young and foolish, I was a fan of Duke basketball.

I am still ashamed of this. It is not because, like so many people, I now equate Duke with innate evil. I have no quarrels with Coach Mike Krzyzewski (you’re not going to believe me, but I spelled that right on my first try) and his ego. I have not joined the ranks of those who hate out of envy, who snarl out of insecurity, who mock out of fear. I simply left for Vanderbilt University, and the Commodores became my team. You see, I had no real college team growing up in New York. I enjoyed seeing St. John’s do well, but the late 1990s/early 2000s Red Storm did not capture the public imagination like earlier editions. There were no Walter Berrys, Boo Harveys, Mark Jacksons or Chris Mullins. Instead, there were Ron Artest, Zendon Hamilton, Erick Barkley and Bootsy Thornton. It was a perfectly fine group, but it failed to capture the city’s attention. I rooted dutifully for them, but without passion. This brought me to Duke.

During my time as a Duke fan, I was subjected to the taunts and jeers of the non-Duke world. This was fair. Friends and family wondered how I could lay claim to this team. I couldn’t. Duke was in North Carolina and I was from Manhattan, which made me a by-the-book bandwagoner. I understand that more now than I did then. There was, however, one popular barb that never sat well with me. Invariably, after defending my misguided loyalty for long enough, my opponent would dismissively say “well, Duke players never make it in the NBA anyway.” This sentiment has been popular in the last ten years of my life. I have heard it from friends, family, fans, and analysts, even after becoming a Vanderbilt fan. It was the common last resort against an unwavering Duke fan: “Duke players never make it in the NBA anyway.”

At the time, I was pretty sure this statement was unfounded. After doing some research, my suspicions have been confirmed. First, I somewhat arbitrarily chose the 20 best college basketball programs of the past decade or so. Twenty because 15 was not enough, and 25 was too many. A decade because I am positive I have heard this myth for the last ten years, and less sure about the preceding years. Then I recorded the current and active NBA players that came from these schools. Finally, I took down each player’s key career statistics. These numbers will show that the Duke Myth is just that – a myth.