Friday, July 1, 2011

True Strength: Viewing Levels 102-2 - How Non-TV Events Affect Viewing

Last time, we looked at how big "TV Events" tend to drive up viewing by a somewhat reliable percentage of their rating. This time, we're going to take a look at the other thing that impacts TV viewing levels: "Non-TV Events." These are external factors out in the (gasp!) "real world" that cause fewer people to watch TV. (I suppose some external factors like nationwide bad weather would actually drive up viewing, but I'm not sophisticated enough to be able to tackle that one just yet.)

These are particularly tough to tackle because of the "self-fulfilling prophecy" I mentioned a couple posts back: fewer people watch, so the networks make less and less of an effort to put big-rated programming in the low-viewing periods, so even fewer people watch. It's a vicious circle. It makes things a little more complicated than they are for the "TV Events" we just looked at. Viewing goes up for the Super Bowl mostly because... people are watching the Super Bowl. However, there's not one huge thing we can point to on the weekends.

What is a "Non-TV Event"?

So within the regular season, there are two easy sources of viewing declines: weekends and holidays. We looked at weekends a bit in the first post, where I showed that across the entire 2010-11 season, Friday and Saturday nights had about 18% less average adults 18-49 viewing than Sunday through Thursday (29.9 vs. 36.5). But how much of that is the "self-fulfilling prophecy"?

To try to parse out the two things going on without hopelessly overcomplicating things, we'll break up viewing levels into the "Broadcast Persons Using TV (PUT)" and the "Cable PUT." First a couple definitions: "Broadcast PUT" is the combination of all known broadcast ratings (in other words, the sum of all the individual show ratings we get when broadcast finals come out each day) and "Cable PUT" is everything else on TV (meaning I take the viewing levels (PUT) and subtract the "Broadcast PUT").*

*- It's not really "cable," because it includes cable plus all the broadcast numbers we don't know (including, most importantly, Univision and 10:00 local programming on Fox/CW). But that's quicker than saying "Cable Plus All The Broadcast Numbers We Don't Know PUT."
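These two bookkeeping definitions can be sketched in a few lines of code. The ratings and total PUT below are made-up placeholder numbers, not actual Nielsen figures:

```python
def broadcast_put(show_ratings):
    """Sum of all known individual broadcast show ratings for a night."""
    return sum(show_ratings)

def cable_put(total_put, bcast_put):
    """Everything else on TV: total viewing minus known broadcast viewing
    (really cable plus unreported broadcast like Univision)."""
    return total_put - bcast_put

# Illustrative night: known broadcast shows summing to 10.3 points
# out of 36.5 total points of adults 18-49 viewing.
bcast = broadcast_put([3.1, 2.4, 2.0, 1.5, 1.3])
print(cable_put(36.5, bcast))  # the remainder is the "Cable PUT"
```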

        B'Cast PUT   Cable PUT      PUT
Su-Th      10.28       26.27      36.54
Fr-Sa       5.03       24.88      29.91
Chg        -5.25       -1.39      -6.64
%Chg      -51.1%       -5.3%     -18.2%

So of those 6.64 points of viewing that are lost on the weekends, the vast majority (5.25) "come from" known broadcast ratings. Cable is a little weaker on the weekends, but not very much. But it's not simply a matter of saying "the real inherent disadvantage is 5% because cable 'tries' about as hard on Friday/Saturday as during the week." Cable is probably losing more than 5% (maybe a lot more), then recouping some of that from the people who still want to watch TV but not the bad broadcast options. As we explored last time, when there are big changes in viewing, most people don't come out of (or vanish into) "thin air." Most of them are "taken" from (or go to) other TV viewing.

As I've been hinting at, within most viewing declines, there are really two "events" in play:

1) The "weekend viewing disadvantage": Because it's the weekend, people are inherently less likely to watch TV. This would be the case even if everyone collectively "tried" harder with stuff like American Idol or Grey's Anatomy. In theory, it should affect broadcast and cable equally.

2) The "broadcast surrender": this is separate from any tendency not to watch because it's the weekend. It's just the decline within known broadcast ratings solely because they're not trying as hard.

How much of each is at play in a "negative event"? To try to get some sense of that, I decided to look back at the flip side: the "positive events" that we looked at last time. Those don't have any inherent viewing change; it's just a "broadcast surrender," except it's the reverse.

So here is a sampling of the big positive PUT-changers we looked at last time:

             B'cast PUT   Cable PUT      PUT
Super Bowl      +32.8       -16.3      +16.4
AFC Champ       +15.5        -7.4       +8.0
Oscar            +7.4        -4.8       +2.6
Idol Wed         +3.5        -1.7       +1.8
Grammys          +6.9        -5.1       +1.8
Idol Th          +1.8        -0.7       +1.1
AVERAGE         +12.1        -6.3       +5.8

As we said last time when we were just looking at the event itself and the change in PUT, it varies based on the type of event. The award shows, in particular, take much bigger chunks out of "cable" than out of "thin air" and skew these numbers a bit. But it seems like on average, it is pretty close to breaking down as follows: for every two-point spike in the "Broadcast PUT," about one point is taken directly from the typical cable numbers and about one point is taken from "thin air" (expressed as an increase in overall PUT).
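That roughly even split can be re-derived from the table of events above. This is just a sketch of that aggregation; the numbers are the ones in the table:

```python
# Each tuple is (broadcast PUT spike, cable PUT change, total PUT change)
# for the six "positive events" listed above.
events = {
    "Super Bowl": (32.8, -16.3, 16.4),
    "AFC Champ":  (15.5,  -7.4,  8.0),
    "Oscar":      ( 7.4,  -4.8,  2.6),
    "Idol Wed":   ( 3.5,  -1.7,  1.8),
    "Grammys":    ( 6.9,  -5.1,  1.8),
    "Idol Th":    ( 1.8,  -0.7,  1.1),
}

total_spike = sum(b for b, c, p in events.values())
from_cable  = -sum(c for b, c, p in events.values())
from_air    = sum(p for b, c, p in events.values())

print(f"Taken from cable:    {from_cable / total_spike:.0%}")
print(f"Taken from thin air: {from_air / total_spike:.0%}")
```

Aggregated across all six events, the split works out to about 53% from cable and 47% from thin air, close enough to even for these back-of-envelope purposes.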

So let's reverse that and say that a "broadcast surrender" of 2.0 points means that 1.0 stops watching TV and 1.0 moves over to cable. That means we have some sense of how to gauge both events.

Step One) "Weekend viewing disadvantage": knock the Broadcast PUT, the Cable PUT, and the overall PUT down by some constant percentage.

Step Two) "Broadcast surrender": After doing that, reduce the Broadcast PUT to its actual weekend level, then send half of the difference back into cable and half of it back into thin air.

At the end of those two steps, we should have a Broadcast PUT, a Cable PUT and an Overall PUT that are basically equal to their real weekend levels. By programming both of those steps into my spreadsheet, I can get a guesstimate for what the "constant" is for step one.
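The two steps above can be sketched in code, and the constant can actually be solved in closed form rather than by trial and error in a spreadsheet. This is my own reconstruction of the method, using the Su-Th vs. Fr-Sa numbers from the earlier table; the function names are made up:

```python
def weekend_disadvantage(bcast_reg, cable_reg, bcast_low, cable_low):
    """Solve for the Step One constant, assuming Step Two sends half of the
    broadcast surrender to cable and half to thin air. Requiring the
    post-Step-Two Cable PUT to equal its actual low-viewing level gives:
        (1 - c) * (cable_reg + bcast_reg / 2) = cable_low + bcast_low / 2
    """
    return 1 - (cable_low + bcast_low / 2) / (cable_reg + bcast_reg / 2)

def apply_two_steps(bcast_reg, cable_reg, bcast_low, cable_low):
    c = weekend_disadvantage(bcast_reg, cable_reg, bcast_low, cable_low)
    # Step One: knock everything down by the constant percentage.
    b1, c1 = bcast_reg * (1 - c), cable_reg * (1 - c)
    # Step Two: surrender the rest of the broadcast drop,
    # half into cable and half into thin air.
    surrender = b1 - bcast_low
    b2, c2 = b1 - surrender, c1 + surrender / 2
    return c, (b2, c2, b2 + c2)

c, final = apply_two_steps(10.28, 26.27, 5.03, 24.88)
print(f"Step One constant: {c:.1%}")   # about 12.8%
print(f"Final (bcast, cable, total): {final}")
```

By construction, the final three numbers land right on the actual weekend levels; the interesting output is the constant itself.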

                    B'Cast PUT   Cable PUT      PUT
Su-Th                  10.28       26.27      36.54
Step One (-12.8%)       8.96       22.91      31.87
Step Two               -3.93       +1.97      -1.97
Fr-Sa                   5.03       24.87      29.90

So the "magic number" we were seeking is about 12.8%. In other words, the "adjusted" viewing decline on the weekends (stripping out how much worse the weekend broadcast programming is) is about 12.8%.

Now, this is probably a vastly more complicated issue than I'm making it, and there's a good chance I'm making some faulty assumptions. One that springs to mind is that much of cable also "tries less hard" on the weekends. But I think the totality of cable is much less variable than broadcast, where the original hits are much bigger (and cover more of the schedule) relative to the syndicated/movie stuff. So we'll go with this for now. Perhaps there will come a day when we can break it down even better (and not have to worry so much about margin of error!)


Now that we've come up with a system, I'm just going to run through a few other times where viewing is noticeably lower. Probably the most interesting is Thanksgiving Eve, a holiday that has produced many a series low in recent years, yet many networks still seem willing to attack it. On Thanksgiving Eve in 2010, ABC and Fox were effectively all in regular originals, CBS had a Survivor clip show and NBC had some holiday and Biggest Loser specials. Not a full-strength programming evening on every network, but aside from sweeps and early fall, few evenings are anyway. So on a somewhat "good-faith" evening, the broadcast and cable drops are much closer to being in line.

          B'cast PUT   Cable PUT      PUT
Wed Reg       9.02       25.45      34.47
T'G Eve       7.68       23.19      30.87
Change      -14.9%       -8.9%     -10.4%

Running it through our two-step system we get:

                    B'Cast PUT   Cable PUT      PUT
Wed Reg                 9.02       25.45      34.47
Step One (-9.8%)        8.13       22.96      31.09
Step Two               -0.46       +0.23      -0.23
T'G Eve                 7.68       23.19      30.86

As you can see, this evening was much closer to "regular" midweek programming than anything on the weekends, so the "broadcast surrender" was much smaller and the "true viewing decline" was much closer to the actual viewing decline.

St. Patrick's Day is the only other example I can think of that's remotely decent, but I don't think it's quite as good. There were lots of originals on broadcast, but with many of the really big shows out of the picture: ABC's original night had a special Private Practice rather than Grey's Anatomy at 9:00, CBS' original night was NCAA basketball rather than the much higher-rated typical Thursday lineup, and NBC's 2.5 original hours suffered not just from the viewing drop but from the absence of The Office's halo effect. The broadcast dropoff from the "norm" is much greater here, but it's worth noting individual "regular" shows like American Idol and Wipeout were down some (but not a ton) from surrounding episodes.

              B'cast PUT   Cable PUT      PUT
Thu Reg          11.16       24.55      35.71
St. Paddy's       7.81       23.39      31.21
Change          -30.0%       -4.7%     -12.6%

                    B'Cast PUT   Cable PUT      PUT
Thu Reg                11.16       24.55      35.71
Step One (-9.4%)       10.11       22.24      32.35
Step Two               -2.30       +1.15      -1.15
St. Paddy's             7.81       23.39      31.20

It's tough to find one where the networks really aren't trying because most of the big holidays fell on weekends where the networks weren't trying anyway. I'll go with Christmas Eve, which fell on a Friday but was still much less viewed than even the usual Friday.

            B'Cast PUT   Cable PUT      PUT
Fri Reg         5.19       24.53      29.72
X-Mas Eve       4.11       22.22      26.32
Change        -20.9%       -9.4%     -11.4%

                    B'Cast PUT   Cable PUT      PUT
Fri Reg                 5.19       24.53      29.72
Step One (-10.5%)       4.65       21.95      26.60
Step Two               -0.54       +0.27      -0.27
X-Mas Eve               4.11       22.22      26.33

As with Thanksgiving Eve, the "real" drop and the "adjusted" drop are somewhat close together because there's not a huge gap in the amount of effort. The nets don't try that hard on Fridays to begin with, so them not trying that hard on Christmas Eve is not that big a deal.

Sidenote: Most of the holidays above take right around a 10% "adjusted" viewing drop. But with the exception of big "go out" holidays like St. Patrick's, viewing drops are usually bigger on the "Eves" than on the actual holidays themselves. Makes sense because of travel and the programming choices, I guess. Thanksgiving Day sees only a 7% drop in viewing from the typical Thursday (NFL Network football probably helps that). Christmas Day is virtually indistinguishable from the average Saturday (again, the lineup of big NBA games probably helps). New Year's Eve on a Friday took a drop similar to that of Christmas Eve, but then New Year's Day (with lots of bowl games) was actually very strong for a Saturday.

In my continued effort to keep these posts coming, I'll cut it off there. These first three posts have been all about the fairly rough viewing level approximations, but next time we'll get out of this margin-of-error muck and (gasp!) look at some actual ratings to try to see how some of this viewing level theory actually compares with ratings fluctuations in practice.


© 2009-2022. All Rights Reserved.