Good questions for customer interviews
July 16, 2020
To create content that performs, it’s critical for writers to understand how customers perceive your product, what problems they use it to solve, and what problems, more generally, they have.
(In shorter words: writers need empathy.)
A great way to do this is by talking to customers. In the vast majority of companies we’ve worked with, content producers, and often even marketers more generally, don’t do this. Common objections include:
- Not having time
- Not wanting to bother customers
- Not knowing what to ask
The first two are easy to deal with. The time you invest in building empathy with customers pays for itself in the time you’ll save by creating the right content from the start, because you really understand where your customers are coming from. And as for not bothering customers – most customers would love to talk to you about your product, as long as they feel like you’re listening. Relationships with your customers, like trust, are a positive-sum game.
So what if you don’t know what to ask? Here are a few good questions to start with.
Tell me about yourself
This is a quick one, but you want to make a connection with the person you’re interviewing. What’s their role, how long have they been in it? How do they describe the company they work at? Useful context for understanding where they’re coming from.
Before you start the interview, you should also take a minute to tell them who you are and your role at your company. And some general tips: show your engagement with the interviewee, and take it slow since often the most interesting information comes after a moment of reflection.
What do you like about your work? What do you dislike?
Start listening here for things they want to do more of – sources of value for them in their work – and pain points. If you’re interviewing an SDR who uses your product and they tell you that the most unpleasant part of their day is filling out reports, you know that the ability of your product to automate reporting is useful to them.
Tell us how you heard about our product
Did they hear about it from a friend, see it on social, or read some interesting article that led them to it? In any case, this is critical data for your content distribution strategy, and you’ll want to share it with your demand gen team, too. (Or maybe your customer can’t remember how they heard about it; that’s useful information, too.)
For extra points with your marketing ops team, see if your customer’s memory of where they heard about it is the same as what’s reflected in your lead gen analytics.
What are you using it to do?
This is the first question that will really help you home in on what you should be writing about. Not only does this answer tell you what problems your software solves, but more importantly, it tells you how your customer thinks about the problem your software solves.
You might get a really simple answer: “We use your software to manage our inventory.” But that gives you a chance to dig a little bit deeper. “Tell me more about that process”, “How did you do that before you started using our software?” or similar questions will reveal details about the customer’s goals and experiences that you can use to write more useful content for them.
What’s your favorite part of our product?
Share this answer with your product team. But also, the language here is useful for your landing pages and calls to action.
“What is your favorite part of using our CRM?” “Well, I love how flexible it is. You can basically do anything with this software.”
When this customer (and others like them) say they love the flexibility of your software, that can lead to:
- A series of how-to articles on all the interesting things you can do
- A survey of top things that your CRM is used to do
- The use of the “flexibility” theme in your ads, product page copy, and landing pages
If you had a magic wand, what would you change?
Share this answer with your product team, too. From a content perspective, the opportunity here is to know what to avoid. For example, is your search painful to use? Maybe take “powerful search” out of your landing page bullets.
If you were to describe our product to a colleague, what would you say?
This is a key question for your organic search strategy. How do your users talk about your app? What do they think it is? You can target this phrase in search. But also, if you’re hearing answers that are not what your product does, that gives you an opportunity to write content that helps users better understand what they should be using it for.
“I usually describe it as a tool that makes it easier to schedule appointments.”
Of course that means you should make sure you’re targeting how to use your product to make appointment-setting easier. But it also means that you can write some other posts about how your product maximizes customer satisfaction, or facilitates customer engagement via instant videoconferencing.
Any other comments?
Letting your customer talk will give you some useful insights. They may also have a problem that you can easily solve for them – a misunderstanding about how a feature works, for example, or lack of awareness about something that’s available to them.
Investing in customer interviews will generate a huge payback for your content marketing efforts. Even one interview every week or two will help you build empathy for your customer. Becoming a skilled user interviewer takes time, and there are lots of places to learn more; Nielsen Norman Group is a go-to for us.
Building trust with your site visitors
July 15, 2020
Marketers spend a lot of time figuring out how to get attention. We pay for it on Google and LinkedIn and Instagram, we think about whether our search results will end up at the top of someone’s screen when they are trying to solve a problem, and we send lots and lots of email with cute subject lines. We use lots of fun gimmicks – sometimes ones that have nothing to do with your brand or what you offer – just for the attention and the chance to continue the conversation.
It’s competitive, and the competition is exhausting. Attention is zero-sum. There’s a fixed amount, and to get attention for yourself, you need to take it from someone else.
Sometimes looking for attention works, and it’s certainly an important thing to know how to do.
But there’s a big part of the story that marketers often miss – building trust. Trust is needed for any transaction to take place, and the world’s biggest brands are often those that are the most trusted. Trust is:
Positive-sum. When you build trust with a customer, it’s easier to build more – versus attention, which you get a limited amount of. And when I build trust with a customer, it doesn’t mean there’s less trust for you. In fact, we can both benefit at the same time.
A way of reducing friction. When you build trust, the right customers go from wondering, “how can I avoid buying from this person?” to “I bet this person can solve my problem, and I want to work with them to make that happen.”
Key to customer satisfaction, especially for complex products like software, where the path to getting value is long and circuitous, and only begins with the sale.
So how can we focus on building trust?
Make promises, then keep them
The major way to build trust is to make promises, and then keep them. This manifests itself in a lot of ways, including consistency, value, and authenticity. Some examples:
Explain clearly what your product does on your front page. When companies make vague pronouncements there, it’s a missed opportunity to make a promise about what you deliver, and how. It also often confuses visitors or turns them off.
Here’s an interesting example. Is the heading clear? Should their subhead be the heading? If you had never heard of this company, would this make you want to buy?
Make it easy to access useful information. This can mean a lot of different things, from having a clear and consistent navigation bar, to having clear topics in your content library that are navigable and correspond with your visitors’ problems.
The content library from Nielsen Norman Group, a UX consultancy.
Be consistent with your brand. This can mean simple stuff like making sure your design is up to date and helps your user navigate, rather than getting in their way. More generally, it means making sure that all your communications and all of your brand personality works together (though that’s a topic for another post.)
Price honestly and fairly. Is it easy to cancel? If a user isn’t getting value from your service, can you charge them less or automatically switch them to a lower tier?
We love this “maintenance plan” for a service we recently canceled (left). It’s not available until you try to cancel – which is super-easy, by the way – but it gives the user an option other than “we’re going to delete all your data.” By way of comparison, does knowing how hard it is to cancel the New York Times (right) make you want to sign up?
Make outbound touches useful and relevant to your prospect. Personalization works in outreach. Why? Because it creates trust that someone’s reaching out to you for a reason, and has done their research.
Is this useful personalization? How could it be improved?
What would happen if you viewed your goal as creating trust rather than getting attention? Slower growth maybe, at first. But ultimately – much more durable, valuable relationships with your customers and prospects.
A/B testing for startups
July 14, 2020
Generally, we tell clients that they need 1,000 conversions for a reliable A/B test, for each variation they’re going to test. (There are actual calculators you can use, too, but this is a rule of thumb.)
That means if you have, for example:
- A landing page with 50,000 visitors every month, and a 5% conversion rate (= 2,500 conversions)
- Or an ad with 250,000 impressions per month, and a 1% click-through rate (= 2,500 clicks, the conversion event for an ad test)
You’re going to be able to run 1 reliable test each month. Of course, there’s a lot more to the story, and we recommend using a calculator like this one to know for sure. One other big factor is the size of the uplift; if you have larger differences between the test group (say they have a 10% conversion rate) and the control group (say they have a 1% conversion rate), the difference is also easier to detect. There are other parameters you can play with as well.
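If you want to sanity-check the rule of thumb yourself, here’s a rough sketch of the standard two-proportion sample-size formula that calculators like these implement. The function name and example rates are our own illustration, and this is a simplified approximation, not a substitute for a proper calculator:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variation(p_control, p_variant, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation to reliably detect the
    difference between two conversion rates (two-sided test)."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)   # significance threshold
    z_power = nd.inv_cdf(power)           # desired statistical power
    p_bar = (p_control + p_variant) / 2   # pooled average rate
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_power * (p_control * (1 - p_control)
                              + p_variant * (1 - p_variant)) ** 0.5) ** 2
    return ceil(numerator / (p_control - p_variant) ** 2)

# Detecting a lift from 5% to 6% takes roughly 8,000 visitors per variation;
# a bigger lift (5% to 10%) needs only a few hundred.
```

Notice how strongly the answer depends on the size of the uplift – which is exactly why big, bold tests are practical at lower traffic levels when incremental ones aren’t.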
But the overall point is that it takes a lot of traffic. As you grow larger, you can run more tests! A million views of your homepage every month, with a 5% conversion rate, means 50,000 conversions – enough for roughly 25 two-variation tests a month – and you can really get into things like buttons, form fields, copy, and more.
But what if you don’t have that much traffic? Is an A/B test still worthwhile, and how can you make it count?
Maximizing the usefulness of A/B testing
Given how few reliable A/B tests most marketers can run at a time, we suggest a few important practices to make sure they’re effective.
1) Test big. Testing slightly different landing page copy, or button size, or font color, is interesting! But ultimately, these tests often yield smaller improvements that take a long time to show up. Worse, by the time you’ve completed the test, or shortly thereafter, you’re embarking on a redesign or a new campaign that means you have to throw out your test and start again.
Instead, test an entirely different landing page design across all of your landing pages simultaneously. Try a completely different value prop on your homepage. Hide or show pricing in your nav bar. Hide or show live chat. Try to make big changes, see what happens, and use the results as evidence not just for marginal improvements in performance, but for significant changes in how you talk about, position, or promote your product.
2) Test all the way through. Your ads are a great place to test – super-easy to try different messaging, instant learning about what resonates, and usually, something like click-through rate is a faster test than form conversions.
But in addition to testing click-through rates, you probably have a goal of converting your visitor. So you need to test conversions as well to see if your ad copy is simply drawing in lower-intent visitors more efficiently, or if it’s truly doing a better job at positioning you to prospects who would be interested. (You don’t have to A/B test your landing page in addition to your ad, though testing an ad in combination with a landing page might give you a more powerful signal.)
(If you’re an ecommerce business, this is a lot simpler, of course – and effective ecommerce tests do generally track all the way through to revenue. This point is directed mostly at B2B companies with a more complex sales cycle.)
3) Get the fundamentals in place before you test. A/B testing is useful, but talking directly to customers – and perhaps even showing them a landing page and soliciting their feedback – might be worth prioritizing. (And that approach will definitely give you more useful feedback.) There may be other fundamentals you need to work on first, too. How’s your design? Is your page showing up in search? Does it have a clear value proposition?
4) If you are going to A/B test, do it as a program, instead of as a one-off. Bake it into your process to always test your email subject lines, for example, and then choose the winner as the final send. By doing this, you’ll get better at testing, you’ll learn more, and your ultimate results will be a lot better.
5) Don’t hack your own test. Choose a timeline or an endpoint for the test – let’s say 1,000 conversions – and don’t stop until you reach it. Ending a test early as soon as a desirable outcome appears, even one that looks statistically significant at that moment, is a major reason marketers get false results from their A/B testing programs.
6) Track your test. We don’t just mean keeping track of the results of the test, though of course that’s important! We also mean – what did you learn from each test? Why did you run it? What did you expect to see (your hypothesis), and what actually happened? This can add another layer of learning, since you don’t just learn from the test, you see how it compared with your thought process before you ran the test.
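The “fixed endpoint” rule in point 5 can be made concrete with a standard two-proportion z-test, evaluated once when the test hits its planned endpoint rather than repeatedly as data trickles in. This is our own minimal sketch; the function name and example numbers are hypothetical:

```python
from statistics import NormalDist

def ab_test_p_value(conv_a, visitors_a, conv_b, visitors_b):
    """Two-sided p-value for the difference between two conversion rates,
    computed once at the planned endpoint of the test."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)  # pooled rate
    std_err = (pooled * (1 - pooled)
               * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 50/1,000 control vs. 70/1,000 variant looks like a 40% relative lift,
# but the p-value is about 0.06 -- not significant at the usual 0.05 bar.
```

A result like this is exactly the kind a peeking marketer would be tempted to declare a winner partway through; running the calculation only at the agreed endpoint keeps you honest.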
What kinds of A/B tests are useful?
In general, A/B tests should focus where learning will be most beneficial – and that isn’t necessarily where you have the most conversions.
- For example, if you have a page that lets users sign up for a demo, test 2 different versions of the page, with different value propositions, perhaps a description of what happens during the demo, social proof, and so on.
- Consider A/B testing different page templates, not just individual blog posts or landing pages.
- Make A/B testing part of an ongoing program, particularly for marketing emails, email outreach, and, if you have enough conversions, for paid advertising.
A/B testing is a powerful method for improving performance, but if you have less data, there are techniques you can use to really make your A/B tests count. In addition to ensuring a statistically valid test, make your tests bigger – more significant changes, a more thorough view of the entire sales funnel, and more consistent testing as part of the work you do every day.