In the last couple of “Tuesdays with Coleman” blogs, I’ve revisited a Radio & Records column I wrote in 1999. The column introduced four scenarios I ran into when speaking with radio station managers about research projects. As I mentioned previously in this series, despite the dramatic changes in the radio industry over the last 18 years, my colleagues and I still encounter these scenarios, which prevent managers from getting the full benefit of conducting research on their stations.
Today, we’ll discuss Scenario 3 – Confusing Tactical and Strategic Research. In this scenario, a Rock station hires us to conduct a strategic study—at Coleman Insights, we call this a Plan Developer study—after years of relying on rudimentary music testing from another research provider. After those tests caused the program director to “yank” the station back and forth between newer and older and between softer and harder sounds, the general manager—who was frustrated with the station’s mediocre ratings performance—prevailed upon ownership to fund strategic research.
Among other findings, the study revealed that the best opportunity for the station was to focus its music mix on mainstream Classic Rock titles primarily from the 70s and 80s. The program director, however, would have none of this, claiming that he couldn’t get a lot of that material “to test.” He said that his testing revealed many of his station’s listeners liked Alternative titles from the 90s and Flashback songs from the 80s and bristled at our suggestion that these sounds should be eliminated from the library.
The PD was right—the people participating in his tests gave 90s Alternative and 80s Flashback titles high marks. However, he was wrong that his station should be playing them.
First, the design of his music testing sample was off-target from the real opportunity of focusing on mainstream Classic Rock from the 70s and 80s. Our Plan Developer revealed that the station should focus its music testing on men in their late 30s and 40s, while the PD was testing his music with a younger sample balanced between the genders. Second, the test list he prepared for each test was not focused on the sounds the Plan Developer suggested should be at the core of the station's strategy. Third, the Plan Developer revealed that the audience expected Classic Rock from the station and that if it could simply grow its images for Classic Rock, its performance should improve significantly. This was eye-opening for the general manager. The station had not done a strategic study in years, and its annual music tests did not include any measure of audience expectations, such as the Fit measurement Coleman Insights includes in its FACT360 Strategic Music Tests.
Long story short: Management instructed the PD to follow the Plan Developer's music recommendations, including bringing the station's future music tests in line with the strategy. The PD rebelled, eventually left of his own accord, and a new PD came on board and implemented "The Plan." She looks like a genius, as the station now consistently finishes in the top five in a very competitive market.
While all the research you deploy should have elements of strategy in it, when it is predominantly tactical in nature (i.e., rudimentary library tests, simple callout, listener surveys, etc.), the results should never be allowed to take your station off course. Stations that make their tactical research—especially their music testing—flow from their strategic research usually win.