Tuesday, July 5, 2011

True Strength: Viewing Levels 103 - Viewing Across the Season


(Sorry! I said we'd get to actual ratings next time, but the thing I devised for the last post led to something else. I am pretty sure we'll get to actual ratings next time.)

Until I started doing my daily ratings spreadsheets, I really had no good sense of anything about viewing levels. But my assumption across the entire year was that they'd be high amid the hype of the fall, drop a bit as shows settled down from their big premieres, increase again as the weather got awful, then decrease again in the late spring as the weather warmed up. That certainly matches up with the ratings of even relatively "stable" shows, which frequently do well in the fall and winter, then decline in the spring.

Well, based on what we can use on an apples-to-apples basis (everything before the March 28 methodology change), that's... sort of close to the truth. It's kind of annoying that Nielsen made this change in the middle of a season, but they did... so here's the line graph of weekly averages:

[Line graph: weekly viewing-level averages across the season]

So what happens is this: viewing starts low, steadily increases (except for a couple of weeks in late December) up until Super Bowl week (the peak), then falls off a cliff until the methodology change, especially in the last two weeks (after Daylight Saving Time).
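To make the mechanics concrete, here's a quick Python sketch of the weekly-averaging step. The week labels and daily PUT values below are made up for illustration, since I can't reproduce Nielsen's daily figures here.

```
# Sketch of turning daily PUT levels into weekly averages.
# All values are hypothetical, not actual Nielsen data.
daily_put = {
    "Premiere week":   [54.2, 55.1, 53.8, 54.9, 52.7, 50.3, 51.6],
    "Super Bowl week": [58.9, 59.4, 58.2, 59.0, 57.1, 55.8, 63.5],
    "Post-DST week":   [52.0, 52.6, 51.4, 51.9, 50.2, 48.7, 49.5],
}

# Average the seven daily numbers for each week.
weekly_avg = {week: sum(days) / len(days) for week, days in daily_put.items()}
peak_week = max(weekly_avg, key=weekly_avg.get)

for week, avg in weekly_avg.items():
    print(f"{week}: {avg:.1f}")
print("Peak week:", peak_week)
```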
Now, I've averaged all those weeks together and run them through the same two-step process I used in the last post, in an attempt to strip out the heavy influence of what the broadcast nets are programming. This is my effort at a "tendency to view" metric, all programming being equal.
[Line graph: weekly "tendency to view" metric across the season]
It's a relatively similar (but smoother) trajectory, but it starts lower, actually peaks a little earlier (three weeks before the Super Bowl), then again declines considerably heading into the methodology change, and it would most likely have continued to decline.
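For the curious, here's a hypothetical Python sketch of the general idea. To be clear, this is not the actual two-step process from the last post; it just assumes the two steps amount to dividing out an index of each week's broadcast lineup strength and then rescaling the result back into comparable units. All the numbers are made up.

```
# Hypothetical programming adjustment; the real two-step process
# is in the last post and may differ in the details.
weeks = ["Premiere week", "Super Bowl week", "Post-DST week"]
avg_put = [53.2, 58.8, 50.9]          # weekly viewing-level averages (made up)
lineup_strength = [1.06, 1.15, 0.97]  # hypothetical lineup index (1.0 = average)

# Step 1 (assumed): divide out the lineup index so a stacked week
# doesn't register as a high "tendency to view" on its own.
adjusted = [put / idx for put, idx in zip(avg_put, lineup_strength)]

# Step 2 (assumed): rescale so the adjusted series has the same
# season average as the raw series, keeping the units comparable.
scale = (sum(avg_put) / len(avg_put)) / (sum(adjusted) / len(adjusted))
tendency_to_view = [a * scale for a in adjusted]

for week, value in zip(weeks, tendency_to_view):
    print(f"{week}: {value:.1f}")
```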

What does all of this mean? Well, the most interesting observation for me is that the broadcast networks premiere most of their stuff at a pretty low-viewed time of year. I guess I always felt like the viewers all magically come back for premiere week, but they really don't. The "tendency to view" is significantly below the full-season average (by "significantly below" I mean more than 3% below) for the first three weeks of the season, and it remains at least a little below average for three more weeks after that. Then there's a roughly 3.5-month period (starting with the beginning of November) that's above average, with the only real exception being the second half of December. Starting in mid-February, there's again a below-average tendency to view, and it's way below by the time the methodology changes. It makes me wonder if the midseason shows that premiere in January are actually kinda better off than the ones that premiere in September.
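Here's a quick Python sketch of that "significantly below" test, using illustrative (not real) weekly numbers:

```
# Flag weeks whose tendency-to-view is more than 3% below the
# full-season average. Values are illustrative, not actual data.
tendency = {"Week 1": 50.1, "Week 2": 50.6, "Week 10": 55.3, "Week 24": 49.0}
season_avg = sum(tendency.values()) / len(tendency)

for week, value in tendency.items():
    pct_vs_avg = (value / season_avg - 1) * 100
    label = ("significantly below" if pct_vs_avg < -3
             else "below" if pct_vs_avg < 0 else "above")
    print(f"{week}: {pct_vs_avg:+.1f}% vs. season average ({label})")
```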

As we compare viewing-level trends with ratings trends in the next few posts, the early fall will be one place where the two really don't match up. Many shows, and not just new ones, get their best ratings in the fall, when viewing is pretty low. Those big fall ratings seem to come largely at the expense of cable. For example, the first two weeks of the season have the two lowest cable PUT levels of the season, yet broadcast has its two strongest weeks outside of Super Bowl week. Viewing levels probably do matter, and that's why the ratings decline in the spring, but there's some second phenomenon (maybe we'll call it "early season hype") that boosts fall ratings. We'll see how it lines up starting next time.
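Here's one way to see that divergence in Python, again with made-up broadcast and cable PUT numbers rather than the real ones:

```
# Rank weeks two ways -- strongest broadcast PUT vs. weakest cable PUT --
# to spot the early-fall divergence. All values are hypothetical.
weeks = [
    ("Premiere week 1", 21.5, 26.0),  # (week, broadcast PUT, cable PUT)
    ("Premiere week 2", 21.1, 26.4),
    ("Super Bowl week", 23.0, 30.2),
    ("Post-DST week",   17.8, 29.1),
]

by_broadcast = sorted(weeks, key=lambda w: -w[1])  # strongest broadcast first
by_cable_low = sorted(weeks, key=lambda w: w[2])   # weakest cable first

print("Strongest broadcast weeks:", [w[0] for w in by_broadcast[:2]])
print("Lowest cable PUT weeks:   ", [w[0] for w in by_cable_low[:2]])
```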

