Forget What You Think You Know
After ten years of working with companies on content and content strategy, patterns emerge. That has given me some confidence. Some structure. And many content strategy best practices.
Every once in a while, though, best practices slap back.
Last year, I started working with a measurement-obsessed SaaS client on their brand story and content strategy. It was fantastic. Over several months, we discussed and built the content strategy, created a plan, and implemented changes quickly. Best of all, they shared the results with me every week.
That’s when the slap happened. My forward momentum came to a halt. Confidence wavered. Nothing, or practically nothing, was working the way we intended.
The Obvious Suspects
When I was first introduced to the brand, I could immediately see that we had a whole lot of opportunity for content optimization:
- The brand story was foggy.
- The product offerings, cost, features, and benefits were unclear.
- The call to action was not immediately apparent.
- It felt like there wasn’t much reason to believe in them.
- And worse, people were complaining that they didn’t get what they expected.
Content Strategy Begins! A Fresh Start
So we began to unravel and untangle, audit and analyze. We took a closer look at the audience and their needs, complaints and expectations. We followed the common paths to purchase. We outlined the business messaging priorities.
The main goal? To improve product clarity and transparency, strengthen brand reputation and loyalty … all while increasing conversions.
In alignment with the content strategy, we started to apply web best practices one by one.
Battles Lost, Takeaways a Mystery
Each week, I heard the most surprising things. Content optimizations that clearly should have worked, and had worked in the past for other clients, did not. Clarity and transparency were decreasing conversions.
Other changes that, logically speaking, shouldn’t have made a difference, gave them a bump.
The Hypothesis Cyclone of Death
As the results came in, the mystery grew. In the company meetings, theories started cycloning as to why changes had the effect that they did, and what they could do to fix it. Brand values and ethics were at stake.
This was bad. Even with only one controlled change per week (A/B testing with significant traffic volume during peak season would be ideal, but we were not operating that way), there are so many variables. World events, the economy, local weather, competitive influence … anything can give you a false negative. Or, if the negative result is in fact accurate, there are still many reasons it could be so.
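To see why low traffic makes weekly results so hard to trust, here's a minimal sketch of a two-proportion z-test, the standard check behind most A/B tools. The numbers are illustrative, not the client's: the same real lift that registers as significant at peak-season volume disappears into noise at modest weekly traffic.

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference between two
    conversion rates likely real, or just noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the assumption of no true difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# A 20% relative lift (5% -> 6% conversion) at low weekly traffic:
z_small = z_test(50, 1000, 60, 1000)
# The same lift at ten times the volume:
z_large = z_test(500, 10000, 600, 10000)

# At the usual 95% confidence threshold, |z| must exceed 1.96.
print(abs(z_small) > 1.96)  # False — the real lift looks like noise
print(abs(z_large) > 1.96)  # True  — now it's detectable
```

The point isn't the formula; it's that at low volume, a genuine win and a random wobble produce the same weekly numbers, which is exactly the fuel a hypothesis cyclone runs on.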
Theories are good for creative, but not great for content strategy.
I needed to do something quick.
Back Up, Back Up!
Instead of fixating on the negative results and trying to fix them, we took a step back, looked at the history of the site and the positive results so far, and considered why those things were working.
A whole new story quickly emerged.
We had been operating under the assumption of a balanced audience of shopping types. We figured some wanted to dive right into the service, some wanted an overview, and some wanted all the details.
While it’s true that the audience is composed of different shopper types, it was not true that it was balanced. MOST of the incoming audience wanted to dive right in. And we were creating and optimizing obstacles.
Content Strategy Best Practices Are a Good Start
Best practices are awesome. I love to learn and build from different experiences. But it’s important not to get too stuck on them. Things that worked in the past may not work at all for your audience or situation in the future.
For this SaaS client, once we began following the lead of happy clients and positive outcomes, we were able to expand and optimize more effectively. Do we still run into mysteries? Absolutely. But the measurable results are steadily ticking upward and the brand is not compromising its reputation to do so.