
Analytics: Assessing the San Diego Chargers Defense and John Pagano

Using the same methodology used for assessing the offense, we look at the San Diego Chargers defense and the performance of Defensive Coordinator John Pagano.


How bad are we again? (Photo: Stephen Dunn)

We all know the San Diego Chargers' defense last year was awful. This article breaks down that awfulness by player and position, but it also provides some insight into John Pagano's performance as defensive coordinator. That insight gives more ammunition (if any was needed) to those wishing to line Pagano up before the metaphorical firing squad. Some seem to want to do it literally, but this is a game, remember.

As a reminder, in my prior article I introduced some methods of transforming ProFootballFocus (PFF) data into metrics which, while based on PFF grades, allow better comparison across positions, players, and teams. The core methodological steps are as follows:

  • Normalizing player PFF grades so that, instead of a raw PFF points score, each player gets a score expressing how many standard deviations above or below the mean he was.
  • Using Solver and linear regressions to estimate the importance of each position to overall team defense (and offense). The metric for team defense is Football Outsiders' DVOA.
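The normalization step above can be sketched in a few lines. The grades below are made-up illustrative numbers, not actual PFF data:

```python
# Convert raw grades into z-scores: standard deviations above/below the mean.
def normalize(grades):
    """Return each grade expressed as standard deviations from the mean."""
    n = len(grades)
    mean = sum(grades) / n
    # population standard deviation
    std = (sum((g - mean) ** 2 for g in grades) / n) ** 0.5
    return [(g - mean) / std for g in grades]

raw = [12.5, -4.0, 3.1, -8.2, 0.6]   # hypothetical raw PFF grades
z = normalize(raw)
```

By construction the z-scores have mean 0 and standard deviation 1, which is what makes comparisons across positions and teams possible.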
Team Level Data
As described above, this shows how well the PFF metrics explain team-level performance. Going forward, the PFF-derived data will be called PFFM (PFF metrics). The XY scatter below plots DVOA (x) against PFFM (y) for both 2012 and 2013, with the Chargers highlighted:

The fit between DVOA and PFFM is looser than it was for the offense. Also, relative to other teams, PFFM for SD is actually a bit worse than expected (they are below the linear trend line).

The chart below shows the two metrics, both converted to standard deviations against all the other teams. Looked at this way, the metrics for 2013 look pretty similar.
So, as we all know, the San Diego defense was pretty bad. Many other metrics would probably say the same thing. Again, the team-level data is not all that interesting. One more step is necessary before showing the player-level detail. Below are the position weights produced by my regressions: the weights that do the best job of explaining how well the PFFM team position scores account for total team defense (measured in DVOA).

As a side note, the Solver values are rounded to the nearest round-ish percentages while the regression values are not, for no good reason. The data shown earlier is based on the regression values. To read the above chart: it says that the PFFM scores for a team's DE play explain 31.7% of the team's aggregate team defense score (in DVOA). If these weights were true and independent, that would be a profound finding. We don't really know how reliable these weights are; they are probably the subject of ongoing work, but hopefully interesting and helpful, caveats included. OK, so what does this say about player contribution to total team PFFM? Aside from Weddle, a lot of painful scores.
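The weight-fitting step can be sketched as an ordinary least squares problem: find the position weights that best map team-by-position PFFM scores onto team DVOA. All numbers below are fabricated for illustration (the real model uses 32 teams and the full set of defensive positions), and the target is built so the known weights can be recovered exactly:

```python
import numpy as np

# rows = teams, cols = position-group PFFM z-scores (e.g. DE, LB, CB, S)
X = np.array([
    [ 0.8, -0.2,  1.1,  0.3],
    [-0.5,  0.4, -0.9,  0.1],
    [ 1.2,  0.9,  0.2, -0.4],
    [-1.1, -0.6, -0.7,  0.5],
    [ 0.3, -0.8,  0.6, -0.2],
])
# pretend "true" DVOA-like target, constructed from known weights
y = X @ np.array([0.32, 0.18, 0.35, 0.15])

# least-squares estimate of the position weights
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With real, noisy data the recovered weights would only approximate the truth, which is why the reliability caveat above matters.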

The data is sorted by player PFFM and is intended to show the flow from the player score to the impact on the aggregate team score. We had a lot of bad scores at CB and, according to the weights shown above, CBs are very important to team defense.

Now that we have all the player scores linked to the team score, we can build a flow chart from last year, tracking the impact of the players that left, the free agents, the draftees, and the changes in the level of play of the existing roster. That flow chart is shown below:

So this chart is intended to flow as follows:
  • START PY: Last year's score was -0.38 (our weighted player scores were roughly 0.38 standard deviations below the mean). Not very good.
  • LOST: this is the value of the players that left. A blue bar connotes a positive score, so the players that left had bad scores, and their absence is counted as a gain to the team.
  • FA: the burgundy bar connotes a negative score. The players added did not play well (according to PFF). This is heavily the "contribution" of Mr. Cox.
  • DRAFT: we did not have a lot of draftees play on defense, basically Te'o. He was not very good, but his play did not have a huge impact on team metrics.
  • ROSTER: this is the change in the scores of the existing roster. The same players on the roster played much worse this year than last year.
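The flow chart above reads as a simple waterfall: last year's weighted team score plus each bucket's contribution gives this year's score. Only the -0.38 starting value comes from the article; the other bucket values below are placeholders for illustration:

```python
start_py = -0.38   # last year's weighted score (from the article)
lost     = +0.20   # departed players graded badly, so losing them helps (hypothetical)
fa       = -0.25   # free agents played poorly (hypothetical)
draft    = -0.05   # few defensive draftees played (hypothetical)
roster   = -0.40   # returning players regressed (hypothetical)

# this year's score is the starting score plus the four bucket changes
end_cy = start_py + lost + fa + draft + roster
print(round(end_cy, 2))  # prints -0.88
```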

So: we were bad, we got rid of a bunch of bad players, we replaced them with players who performed almost as badly, and the returning roster played materially worse. At this point we could drill down on the players, but comparing this data against other teams might provide some insight into how the front office and defensive coordinator did at assembling and utilizing defensive talent. So how did the Chargers do against other teams?


The teams are sorted by the total of the two scores: New Players (FA + Draftees) and Roster change. San Diego is next to last (highlighted in the red box); only CHI did worse. Looking at some of the top- and bottom-scoring teams, we see some interesting results.

  • BUF: Mike Pettine was named new coordinator for Buffalo, did a good job and parlayed that into the largely unwanted head coaching position in Cleveland.
  • NO: Rob Ryan took over in New Orleans and the unit responded very well, big positive gains from the existing roster.
  • CHI: Hired a new defensive coordinator (Mel Tucker) and their defense fell apart, driven mostly by vastly inferior play by existing roster.
So Pagano is far from the good coordinators and close to (seemingly) very bad ones. However, perhaps Pagano should not be blamed for the bad play of the free agents. To check this, I tracked the free agents signed by each team and added up their scores, looking for teams whose free agents saw improved scores. Again, Pagano looks bad.


I did look at the correlation between roster change and FA change, and it was weak (.083). I was hoping it would be strong, suggesting something in common among teams where both the free agents and the roster play worse. Even so, it is still pretty damning for Pagano that the roster regressed, as did the players brought in, namely Cox. I hope Pagano proves his detractors laughably wrong, but the data suggests that would be a miraculous turnaround. Hopefully, at that time, we do not hire the presumably recently fired Mel Tucker as his replacement.
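The correlation check mentioned above can be sketched as a Pearson r between each team's roster-change score and its FA-change score. The paired values here are invented; the article reports r ≈ .083 on the real data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

roster_change = [0.3, -0.5, 0.1, -0.2, 0.4]   # hypothetical team scores
fa_change     = [-0.1, 0.2, 0.3, -0.4, 0.0]   # hypothetical team scores
r = pearson_r(roster_change, fa_change)
```

A value near 0 (as with the real .083) means knowing a team's FA change tells you almost nothing about its roster change.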