The Super Bowl is the most-viewed TV event of the year: four hours with nearly 80 percent of America's TV viewers attentively watching not just the football game but the ads too. That's the ideal, anyway.
Sunday's Super Bowl, however, took a turn just before the Patriots, down eight points, began their final drive against the Eagles. Viewers bailed, producing a nearly three percent drop in viewership. If you watched the game, you know things didn't look good for the Pats at that point. The drop-off probably consisted of some mix of distraught Pats fans who had lost faith in their team and casual viewers who didn't really care what happened but assumed the Eagles had it. Regardless of who's responsible for it, I've dubbed the drop-off the "faithless rate."
The table below shows the faithless rates of three non-Boston/Philadelphia markets during Super Bowl LII based on Sorenson Media’s second-by-second viewer data. The rate shown is the difference between the viewership percentage at the peak of the Super Bowl (just as Tom Brady fumbled in the 4th quarter) and the viewership percentage just before the Patriots’ final drive of the game.
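The calculation described above can be sketched in a few lines. This is a minimal illustration, not Sorenson Media's actual methodology, and the viewership numbers and timestamps below are hypothetical placeholders:

```python
# Sketch of the faithless-rate calculation for one market, assuming
# second-by-second viewership shares (percent of the market's TVs).
# All numbers here are hypothetical, not Sorenson Media's real data.

def faithless_rate(viewership, peak_second, final_drive_second):
    """Difference, in percentage points, between the viewership share
    at the game's peak and the share just before the final drive."""
    return viewership[peak_second] - viewership[final_drive_second]

# Hypothetical per-second viewership shares for one market.
viewership = {
    11400: 46.0,  # assumed peak: Brady's 4th-quarter fumble
    12300: 43.2,  # assumed moment just before the Patriots' final drive
}

rate = faithless_rate(viewership, peak_second=11400, final_drive_second=12300)
print(f"Faithless rate: {rate:.1f} percentage points")
```

In practice you would scan the full second-by-second series to find the peak rather than hard-coding timestamps, but the metric itself is just this one subtraction.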
In hindsight, the faithless rate is an expected consequence of the game's flow. But it's also an interesting sports TV phenomenon because it's entirely unpredictable from one sporting event to the next; it all depends on how the game unfolds. Then there's the question: is a three percent drop in viewership enough to worry over? It depends on the event. For the Super Bowl, three percent represents millions of eyes. In the case of a regular-season college football game, it's much less significant. As a collective TV industry, I think we can all agree that the more faithful the fans, the better.
For the sake of viewership, here's to a neck-and-neck Super Bowl LIII.