LeBron James. Kobe Bryant. Who's better? If you're an analytics-minded NBA fan trying to make a case, there are a number of all-in-one statistics you can choose to bolster your argument: Win Shares per 48, Player Efficiency Rating (PER), Real Plus-Minus (RPM).
None of these metrics are perfect. Still, they do a fairly good job of quantifying a player's relative value and productivity—that is, on the offensive side of the floor. On defense, by contrast, basketball's smartest analytical minds have yet to come up with a useful, ubiquitous way to measure on-court aptitude.
Given that individual defensive performance is a pretty important factor in the outcome of games, this is a serious shortcoming.
You can forgive casual fans for mostly basing their defensive conclusions on subjective assessments gleaned from watching games. In other words, the eye test. Yet, as we've learned from offensive analytics, our eyes can deceive us, telling us that Andrew Wiggins is approaching elite status on the offensive end, when he's actually hovering near league average so far this season.
The potential for eye test defensive misevaluation is even greater. After all, when most of us watch the sport, we're watching the ball. We're barely paying attention to how defenders rotate and revolve around it, or how they close out, or how they lock and trail around off-ball screens.
So when your friend argues that Bryant actually deserved all nine of his selections to the NBA's All-Defensive First Team (later in his career, he certainly did not), they at least have an excuse. Interestingly enough, the people paid to look past the eye test—those aforementioned analytical minds—also have trouble quantifying defense.
Talk to basketball analysts like ESPN Insider's Kevin Pelton, and you'll find that an elusive, all-in-one defensive metric is something of a jigsaw puzzle whose pieces have yet to fit together—and may never. And that's despite recent innovations in regression analysis and the use of SportVU cameras in every arena to track player movement.
When I recently asked Pelton about the possibility of a holy grail defensive stat, he was pretty close to definitive with his answer.
"Probably not," he said.
Why is it so hard to quantify defense? For starters, counting stats don't tell us much. When people talk about defense, they'll often mention blocks, or steals, or—if they're looking a little closer—defensive rebounding percentage. Stuff that's palpable, because we can measure it. LeBron James' block of Andre Iguodala in Game 7 of the NBA Finals resonated not simply because it was an awesome defensive play but because it was an awesome defensive play that every basketball fan saw.
But those same obvious defensive stats can be misleading. Steph Curry led the league in steals last season, averaging 2.1 per game. Would anyone call him the best perimeter defender in the league? Curry might not even be the sixth- or seventh-best defender on his own dang team. Even when we consider advanced defensive counting stats, like defensive rebounding and steal percentage, there's no getting around the fact that a box score will never adequately tell us a player's actual defensive impact—which means all-in-one defensive metrics that solely rely on counting stats, like Basketball-Reference's Defensive Box Plus-Minus, are of little use.
ESPN's Defensive Real Plus-Minus (DRPM) goes deeper, attempting to ascribe a basic plus-minus number to a player, while also taking into account the teammates and opponents who share the court with him. It relies very little on box-score data, but it also illustrates just how difficult it is to come up with an all-in-one defensive rating. Pelton told VICE Sports that DRPM is "the democracy of defensive metrics: the worst except for all the others."
In other words, it's the best rating we currently have, but also has some serious limitations.
According to Dr. Stephen Ilardi, a clinical psychologist and neuroscientist who, along with Jerry Engelmann, helped create ESPN's Real Plus-Minus rating, the primary variable being tracked in the network's system is "what's happening to the bottom line: the score. It's what's happening to offensive and defensive efficiency for every single possession a player is on the court—controlling, mathematically, statistically, for the impact of teammates, which vary from lineup to lineup and by opponent. Also, adjusting for things like home court and so forth."
Since the end result of good defense is preventing points, DRPM looks primarily at the same things: Did a team score? Did a player score? Who was on the court when a team scored?
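To see the regression idea behind an RPM-style rating in miniature, consider a toy version: treat each stretch of game time as one observation, note which defenders were on the floor, and regress opponent scoring on those lineups. Everything below—the players, the stints, the numbers—is invented for illustration; the real model uses years of possession-level data and many more adjustments.

```python
import numpy as np

# Toy stint data (invented): each row is a stretch of game time, each
# column a defender; 1 means that player was on the floor. The target
# is the opponent's points per 100 possessions during that stint.
X = np.array([
    [1, 1, 0, 0],   # players A and B defend together
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
])
opp_pts_per_100 = np.array([98.0, 104.0, 101.0, 112.0, 103.0])

league_avg = opp_pts_per_100.mean()
y = opp_pts_per_100 - league_avg        # deviation from average defense

# Ridge regression: the penalty keeps ratings from blowing up for
# players who constantly share the floor -- the collinearity problem
# that plain plus-minus suffers from.
lam = 1.0
n_players = X.shape[1]
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_players), X.T @ y)

# Negative beta = fewer opponent points allowed = better defender.
for name, rating in zip("ABCD", beta):
    print(f"player {name}: {rating:+.1f} pts/100 vs. average")
```

The controlling-for-teammates step Ilardi describes is exactly what the regression does: each player's coefficient is his share of the scoring swing after the others on the floor have been accounted for.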
DRPM can produce valuable insights: Danny Green, a long-limbed San Antonio wing who has never made an All-Defensive team in his career, currently has the top DRPM among shooting guards. The Spurs give up fewer points when Green is on the court this season, and watching him hassle opposing guards with those long arms, most hardcore Spurs fans know why.
But DRPM can also miss the mark. Avery Bradley, an All-Defensive First Team selection last season, is ranked No. 91 (out of 95!) among shooting guards in DRPM. Some of this is because Boston's splashy summer free-agent signee, Al Horford, missed time early this season due to injury; without Horford's imposing presence around the rim, Bradley doesn't have as much carte blanche to hound opposing ball handlers, which negates one of his defensive calling cards. That doesn't mean Bradley has regressed that much as a defender since last season, just that his particular skillset is overlooked by DRPM.
Another issue with ESPN's metric? Publicly available DRPM data only covers the current season, even though Ilardi told VICE Sports that the measure gets "dramatically more accurate the more data we have."
(ESPN also has a "predictive RPM" that uses three preceding seasons of data to augment its accuracy, but it's currently not publicly available.)
According to both Ilardi and Pelton, the next logical step toward creating an all-in-one defensive metric is incorporating SportVU data with DRPM. For the past three years, all 29 NBA arenas have housed cameras that track every movement by every player over the course of games, measuring everything from average player speed and distance run to catch-and-shoot opportunities.
SportVU's player tracking is already being used by NBA.com to give fans some limited defensive measurements. One example is a rim-protection stat: opposing players' field-goal percentage at the rim while a particular defender is within five feet of both the basket and the player attempting the shot.
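That rim-protection filter is simple enough to sketch. The shot logs below are made up, but the computation mirrors the stat as described: keep only shots where the defender was within five feet of the rim and five feet of the shooter, then take the field-goal percentage on what's left.

```python
import math

# Hypothetical shot logs (all coordinates invented): where the shot
# went up, where the nearest rim protector stood, and whether it fell.
shots = [
    {"shooter": (2.0, 1.0), "defender": (1.0, 0.5), "made": False},
    {"shooter": (3.5, 2.0), "defender": (3.0, 1.0), "made": True},
    {"shooter": (2.5, 0.0), "defender": (9.0, 4.0), "made": True},   # defender too far away
    {"shooter": (1.0, 1.0), "defender": (2.0, 2.0), "made": False},
]
RIM = (0.0, 0.0)  # basket location in this toy coordinate system, feet

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Count only shots where the defender is within 5 feet of the basket
# AND within 5 feet of the shooter.
contested = [s for s in shots
             if dist(s["defender"], RIM) <= 5
             and dist(s["defender"], s["shooter"]) <= 5]
fg_pct = sum(s["made"] for s in contested) / len(contested)
print(f"FG% allowed at the rim: {fg_pct:.1%}")
```

Note what the filter throws away: the third shot doesn't count against anyone, because no defender was close enough to be credited or blamed.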
One of the first researchers to translate raw SportVU numbers for basketball fans, Harvard PhD candidate Andrew Miller, told VICE Sports that he concurs with Ilardi and Pelton.
(Miller was involved in an important defensive analytics paper unveiled at the 2014 MIT Sloan Analytics Conference. You might've read about it when analyst Kirk Goldsberry, now with the San Antonio Spurs, detailed their findings in a piece of his own on defensive shot charts.)
"I think incorporating information you get from not just box score events will make that DRPM more indicative of the true value of the player," Miller said. "So right now I think it's a good approximation of the right qualities, but I think the approximation could get better."
In the future, we'll see more defensive data thanks to SportVU cameras. "We essentially know … which defender is guarding which offensive player," said Charlie Rohlf, the product manager for STATS, the company behind the NBA's player-tracking cameras. "We have determined that algorithmically for the entire game."
This has allowed researchers to derive more granular, subtle defensive data than what's currently available to the public, like the aforementioned Defensive Shot Charts that Goldsberry wrote about but also "ball screens, post-ups, drives, isolations, whatever," according to Rohlf. The cameras track every movement on both sides of the ball, and Rohlf says that they "update who is guarding who every 25th of a second," which is plenty good enough to keep pace with all of the defensive switching and backline rotations that occur in the modern game.
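STATS' actual matchup algorithm is proprietary and far more sophisticated, but the crudest version of the idea—the baseline any "who is guarding whom" model starts from—is a nearest-defender assignment, recomputed every frame. The positions and names below are invented.

```python
import math

def assign_matchups(offense, defense):
    """Naive matchup guess for one frame of tracking data: pair each
    offensive player with the closest defender. (A real model would
    also weight things like ball and basket position; this is only
    the crude baseline idea.)"""
    matchups = {}
    for o_name, o_pos in offense.items():
        d_name = min(defense, key=lambda d: math.dist(defense[d], o_pos))
        matchups[o_name] = d_name
    return matchups

# One invented 25th-of-a-second frame, positions in feet.
offense = {"PG": (30.0, 25.0), "SG": (40.0, 10.0)}
defense = {"D1": (31.0, 24.0), "D2": (39.0, 11.0)}
print(assign_matchups(offense, defense))  # {'PG': 'D1', 'SG': 'D2'}
```

Run on every frame, even this naive version catches switches the moment the closest body changes—which is why updating 25 times a second matters.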
Discussing defensive abilities that SportVU might be able to quantify for fans in the not-too-distant future, Rohlf mentioned tracking player drives. Only forget the usual idea of recording whether an offensive player scored or not. No, SportVU will tell us what defenders specifically force driving offensive players to do.
"Does one particular drive defender force more kick-out passes versus finishes at the rim, and how does that affect point per possession for an opponent overall?" Rohlf said.
SportVU data can tell us whether it's better for a defender to force a kick-out, or challenge a shot at the rim, then overlay that datapoint—one the public isn't exposed to yet—with DRPM. As Ilardi said, more data makes DRPM more accurate. SportVU is a whole helluva lot more data.
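The comparison Rohlf describes is, at bottom, a points-per-possession split by drive outcome. With invented drive logs for one hypothetical defender, it looks like this:

```python
# Invented drive outcomes against one hypothetical defender: did the
# drive end in a kick-out pass or a finish attempt, and how many
# points did that possession produce?
drives = [
    ("kick_out", 0), ("kick_out", 3), ("kick_out", 0),
    ("finish", 2), ("finish", 0), ("finish", 2), ("finish", 2),
]

def ppp(outcome):
    """Points per possession for drives ending in a given outcome."""
    pts = [p for o, p in drives if o == outcome]
    return sum(pts) / len(pts)

print(f"kick-outs: {ppp('kick_out'):.2f} PPP")
print(f"finishes:  {ppp('finish'):.2f} PPP")
```

In this made-up sample, forcing the kick-out is the better outcome for the defense—exactly the kind of conclusion the tracking data could support at scale.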
Right now, there's nothing showing fans a player forced an opponent to pass rather than attempt a layup. There was no block, so how would we know the defender prevented a possible layup? In the future, player-tracking data may be able to show us just that, helping us see what isn't apparent to the naked eye.
"Incorporating that information and coming up with a way to have SportVU measure [defensive] responsibility for each player is still quite a ways away," Miller said. But Pelton said that "eventually," SportVU data "will be able to track when a player has made the right rotation, and how that has improved."
That said, better (and more) data isn't the only thing holding us back from an all-in-one defensive stat. Even SportVU cameras miss important components of defense.
"We are recording what did happen," Rohlf said. "We are not recording what should have happened. We cannot get into the coach's head to know what was on the scouting report and what in-game adjustments they had just made and what they just talked about in the timeout. We don't know what should have happened, or who should have guarded who."
That's an important distinction, one that will always escape the casual fan, even with SportVU data becoming more readily available.
"You're never going to capture everything," Pelton said. To illustrate, he offered the example of a defender rotating over and thwarting a shot attempt. "If I'm in the right spot at the right time, then there's not even stats recorded because there's no shot being taken if I'm deterring the shot," he said. "Because I've helped over to nail and taken away that opportunity."
Teams know what their players should be doing, but that information is proprietary. Moreover, raw SportVU data isn't publicly available, save for a few exceptions, like Miller and his peers working on defensive shot charts for an academic paper.
As such, teams may have a better chance of coming up with an all-in-one defensive metric in private than analysts do in public.
"It's up to the team analysts and the coaches and the general managers to make sense of [SportVU] data to fit their schemes and their needs and what they're looking for," Rohlf said. "To fill in that extra piece of the puzzle, which is highly classified amongst teams themselves."
To understand how all of this may come together—tracking data, analytics, deep knowledge of scheme and strategy—consider Al Jefferson's time in Charlotte. Pelton said that Jefferson was "a different defender in Charlotte than he had been in the rest of his career. That's another layer of complexity on the defensive end."
Big Al posted his two highest Defensive Win Share totals (an estimate of the number of wins added by a player due to his defense) and his lowest defensive ratings (points allowed per 100 possessions) in his seasons with the Hornets under coach Steve Clifford. Was he really a better defender in Charlotte, or was something else going on?
In Clifford's scheme, Jefferson dropped back on high pick-and-rolls while one of the weakside defenders "tagged" the roll man. That's how Clifford protected the slow-footed Jefferson, who used to get eaten alive on the same pick-and-roll actions while he was in Boston and Utah. (The same thing is happening to Jefferson this year in Indiana.)
Without careful film study or access to coaching game plans, there's simply a lot of stuff we don't know about defense on a team-to-team and player-to-player basis, especially because it's "five guys working on a string," as Pelton says, with everyone's movement playing off everyone else. To wit: while Clifford had Jefferson tagging the roll man, he had fellow 2013-14 Charlotte big man Josh McRoberts jumping out on ball handlers when he was guarding the screener.
"We, as outsiders, are not fully tuned in to the defensive scheme," Miller said. "We can speculate as to what the different roles of the players should have been and then based on those assumptions we can divvy up [influence] however we see fit.
"But really, we're missing a key bit of information: the defensive scheme and what the coach wants his team to execute."
If we didn't know that Jefferson was always supposed to drop back on the pick-and-roll, we might wrongly pin an uncontested layup by the roll man entirely on him—even though it was partially the fault of one of the weakside defenders.
"Without that information, we're going to be wrong," Miller said. "Or we're going to be incomplete, no matter what."
For now, DRPM is the best all-in-one defensive measure we have, and its rankings will improve as the season goes on and ESPN has more on/off data to use. As researchers like Miller render raw SportVU data into defensive metrics available to the public, those metrics will be folded into DRPM, and we'll come closer and closer to capturing the truth of a defender's impact in one clean number.
Still, for the foreseeable future, if you really want to get a feel for how effective a player is on defense, there's an old-fashioned way to go about it. Watch the freakin' games.