
Three Thoughts on Strategic Research from a Taxi Ride

During a ride in a New York City taxi recently, I noticed the credit card machine displayed “No Surge Pricing. Curb: The #1 Taxi App.”

Here are three things that came to mind.

#1: I’d never heard of Curb.

So first things first, I looked up where Curb is available. It operates in a handful of cities, including New York, Chicago, Philadelphia, Washington DC, Miami, and Ft. Lauderdale. It’s new-ish, but not new, having launched in 2016.

I visit these cities for business and leisure, so this means one of two things. Either the brand a) didn’t make an impact on me or b) I haven’t taken a taxi in a really long time. Truth be told, it’s probably a little of both.

[Image: Curb taxi app. Photo credit: Shutterstock/Janet Julie Vanatko]

#2: “No Surge Pricing” is an interesting choice for a positioning statement.

My Lyft trip from Newark to Manhattan was $66. The ride back would have been $95. A taxi was $122. Yes, Uber and Lyft have surge pricing, but in my experience, they’re nearly always less expensive than a cab.

When you search for terms like “Common complaints from Uber riders” and “Common Lyft complaints,” you get a long list of items, including:

  • Failure to pick up customers
  • Drivers being late
  • Incorrectly charged
  • Distracted driver
  • Messy car
  • Getting lost
  • Surge pricing

My 20-year-old kid won’t take an Uber or Lyft because they fear the drivers aren’t screened properly, and they’re worried about their safety. I was surprised that this concern didn’t pop up more often in my search.

But this is an important lesson in the value of research. It’s certainly possible that Curb’s research indicated that frustration over surge pricing is a big problem in the cities it serves. To Curb’s credit, that message is on the machine in the cab, on the app, and on the website. Pick a message and be consistent. But do research to make sure it’s the right message.

#3: Taxis need an industry marketing campaign.

Many have said the same thing about the radio and podcasting industries. If you want consumers to think about using your product, you’d better remind them why they should.

It’s been my perception for many years now that Uber and Lyft are faster, cheaper, and easier than a taxi. Period. And while I don’t live in a city with Curb, I do visit them and take Uber and Lyft often.

Changing my perception isn’t going to be easy, because the taxi industry allowed the rideshare industry to run roughshod over it with better tech and more drivers.

A deep research study to understand consumer perceptions of taxis, Uber, and Lyft would be valuable. But beyond that, I wonder whether taxi images have slipped so far that a subtle message like “No Surge Pricing” is enough on its own.

I’m picturing a commercial with…oh, let’s see…a messy car that shows up late, a driver who keeps texting and getting lost, and an ending in which the customer gets a push notification from the credit card company that they’ve been overcharged.

That is, of course, if those negative images showed up as significant vulnerabilities in research.

Understanding your brand’s awareness levels and images (along with that of the competition) is at the core of what makes strategic perceptual research so effective.

Once you know the results, the execution of the strategic plan is where the real fun begins.

Why Does Tactical Research Need Strategic Research?

Tuesdays With Coleman

In the last couple of “Tuesdays with Coleman” blogs, I’ve revisited a Radio & Records column I wrote in 1999. The column introduced four scenarios I ran into when speaking with radio station managers about research projects. As I mentioned previously in this series, despite the dramatic changes in the radio industry over the last 18 years, my colleagues and I still encounter these scenarios, which prevent managers from getting the full benefit of conducting research on their stations.

Today, we’ll discuss Scenario 3 – Confusing Tactical and Strategic Research. In this scenario, a Rock station hires us to conduct a strategic study—at Coleman Insights, we call this a Plan Developer study—after years of relying on rudimentary music testing from another research provider. After those tests caused the program director to “yank” the station back and forth between newer and older and between softer and harder sounds, the general manager—who was frustrated with the station’s mediocre ratings performance—prevailed upon ownership to fund strategic research.


Among other findings, the study revealed that the best opportunity for the station was to focus its music mix on mainstream Classic Rock titles primarily from the 70s and 80s. The program director, however, would have none of this, claiming that he couldn’t get a lot of that material “to test.” He said that his testing revealed many of his station’s listeners liked Alternative titles from the 90s and Flashback songs from the 80s and bristled at our suggestion that these sounds should be eliminated from the library.

The PD was right—the people participating in his tests gave 90s Alternative and 80s Flashback titles high marks. However, he was wrong that his station should be playing them.

First, the design of his music testing sample was off-target from where the real opportunity existed: with fans of mainstream Classic Rock from the 70s and 80s. Our Plan Developer revealed that the station should focus its music testing on men in their late 30s and 40s; the PD was testing his music with a younger sample that was balanced between the genders. Second, the test list he prepared for each test was not focused on the sounds the Plan Developer suggested should be at the core of the station’s strategy. Third, the Plan Developer revealed that the audience expected Classic Rock from the station and that if it could simply grow its images for Classic Rock, its performance should improve significantly. This was eye-opening for the general manager. The station had not done a strategic study in years, and its annual music tests did not include any measure of audience expectations, such as the Fit measurement Coleman Insights includes in its FACT360 Strategic Music Tests.

Long story short: Management instructed the PD to follow the Plan Developer music recommendations, including bringing future music tests in line with the strategy. The PD rebelled and eventually left of his own accord, and a new PD came on board and implemented “The Plan.” She looks like a genius, as the station now consistently finishes in the top five in a very competitive market.

While all the research you deploy should have elements of strategy in it, when it is predominantly tactical in nature (e.g., rudimentary library tests, simple callout, listener surveys), the results should never be allowed to take your station off course. Stations whose tactical research, especially their music testing, flows from their strategic research usually win.