If you wish you had a better understanding of your customers’ needs, I can relate. Just over a year ago, my co-founder and I realized we knew very little about our customers and how they engaged with our self-tracking app, Exist.
At the time, we’d been running our startup Hello Code for about two years. We knew that gathering feedback was a smart move—how else could we know who was using Exist, and why, and how we could make it better? But after spending hours on video calls with some of our users, we weren’t any wiser about what we should do. Every customer seemed to have a different idea of what made our product useful, and they had very little in common.
We figured we had nothing to lose if we tried a user survey. Setting it up as a simple, free Google Form, we sent a link to the survey to all of our users, and hoped that enough people would fill it in to get useful data.
Unlike the one-on-one calls, the survey was a clear success, so much so that it's now become an annual ritual. We've gained deep insights into the needs and behavior of our customers, and I haven't had to jump on a video call since.
To help you leverage the annual survey as a client communication tool, I thought it would be useful to explore some of the broad lessons we’ve learned about gathering user feedback, and the tactics that have helped us get the most out of the results.
The Annual Format is Great for Identifying Business Priorities
In between our annual surveys, we gather feedback in a variety of ways. We have an exit survey for users who suspend or delete their accounts. We get feedback emails from users when they’re using our app. And we have a public roadmap on Trello, where users can suggest and vote on new features, and tell us why they’re supporting various suggestions.
While all of this feedback is invaluable, it’s very different from having hundreds of users providing feedback at the same time. Our annual user survey aggregates lots of responses to the exact same questions, so we can easily evaluate trends in the choices our users make.
Apart from giving us a lot of feedback at once, this process makes it much easier to see what our priorities should be. A handful of users might have taken the time to ask about a particular feature or integration they'd like us to build, but many people who complete our surveys haven't spoken to us at all over the past year. Suddenly we can see that a much bigger proportion of our user base wants something we'd only heard a few requests for.
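As a hypothetical sketch (we ran our survey through Google Forms, so these answer options and counts are invented, not our real data), this is roughly how aggregating identical answers to the same question turns scattered opinions into a ranked list of priorities:

```python
from collections import Counter

# Invented multiple-choice answers to a single survey question,
# e.g. "Which integration should we build next?"
responses = [
    "Google Fit", "Apple Health", "Google Fit", "Spotify",
    "Google Fit", "Apple Health", "Google Fit", "Todoist",
]

# Counting identical answers ranks requests by popularity,
# which is much harder to do with one-off feedback emails.
tally = Counter(responses)
for answer, count in tally.most_common():
    print(f"{answer}: {count}")
```

The same tally over hundreds of responses is what makes the annual format so much clearer than individual emails trickling in over the year.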
I would add one caveat, though: aggregated feedback only makes your priorities obvious once you have a lot of users. This can be really frustrating early on, when you don't have many, but having a strong vision for your product and understanding the need you're fulfilling will keep you going until you have enough users for their aggregated feedback to be more helpful.
Free-Form Comments Reveal Helpful Surprises
We have two types of questions in our surveys: multiple choice questions, where the user can generally choose just one answer, and open-ended questions, where the user can type a message.
The multiple choice questions allow us to test our hunches and find out more about particular aspects of the product and our users' behavior. Occasionally the answers surprise us: our hypotheses are proven wrong, or the second most popular answer catches us off guard even when we guessed the winner correctly.
But we find even more surprises in the open-ended questions.
When users can type whatever they like, they often tell us things we wouldn’t expect. For instance, they might tell us about why our product is useful for them or how they use it. Often they’ll share a use case with us that we hadn’t thought of, and even though it might not be the most popular use case, it’s valuable to know how different types of people can get value out of your product.
Our users will also use these open-ended comments to bring up features they want us to add, or different ways they’d like to use our product that we haven’t asked about in the multiple-choice questions. It’s a chance for users to share their ideas, and it’s just as useful as the multiple-choice questions that help us prioritize future feature development.
When you write up a survey, don't assume it's too much effort for users to tell you what they think of your product in an open text field. If a customer is already taking the time to complete your survey, chances are good they won't find it hard to quickly type out a thought or idea they've been hanging onto. You might be surprised at how much feedback you get in free-form text boxes, and how useful it can be.
Wording and Structure Affect How Users Respond
We spent a lot of time wording our questions carefully last year, and even more this time around. We’ve learned that the way we word a question affects how our users will answer it, so we try to be as clear as possible about what we’re asking.
For instance, our product Exist has a mood tracking feature that lets users rate their day from 1–5, and add an optional note. This year we asked about their usage of this feature in two parts:
- How often do you rate your mood?
- How often do you add a note?
You can see from the responses that the number of users who add a note every day is almost half the number that rate their mood every day:
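As an illustrative sketch with made-up numbers (our real counts came straight out of the Google Forms charts), putting the two questions side by side makes the gap obvious:

```python
# Invented response counts for the two related questions; the real
# figures came from our Google Forms results, not this script.
rate_mood = {"Every day": 120, "Most days": 60, "Rarely": 20}
add_note = {"Every day": 55, "Most days": 50, "Rarely": 95}

# Comparing the same option across both questions shows that far
# fewer users add a note daily than rate their mood daily.
for option in rate_mood:
    print(f"{option}: rate={rate_mood[option]}, note={add_note[option]}")

daily_ratio = add_note["Every day"] / rate_mood["Every day"]
print(f"Daily note-takers are {daily_ratio:.0%} of daily mood-raters")
```

Had we asked about "mood tracking" as a single question, this split would have been invisible.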
Although we think of the note as just one part of the mood tracking feature, we realized it was important to ask these separately after speaking to a user earlier in the year who mentioned that he rarely adds a note to his mood rating. Our users think about our product differently than we do, and it’s important to reflect that when gathering feedback, rather than assuming we’re all on the same page.
We guessed early on that we needed to word our questions carefully in order to get the most useful answers, but something we didn't think of initially was what those questions were telling our users. You don't tend to think of a question as a way to give someone information; it's a solicitation of information, after all.
But our questions did just that. They gave our users information about us. They told our users how we thought about our product, what plans we were considering for its future, and what we thought its strengths and weaknesses were. We didn’t realize this until our users mentioned it in their feedback with comments like, “You seem to be thinking your product is best for X, but I’m only interested in Y. I hope you don’t move away from doing Y, because that’s what I’m paying for.”
Needless to say, this was something we needed to know.
The questions you choose to ask, how you ask them, and the order you put them in say a lot about your thought process. Trying your survey out on a friend who knows your product first can give you an idea of what kind of impression you're giving survey-takers.
And if you're new to writing survey questions, it's helpful to remember that users can't answer questions you don't ask. If you want to know something but you're not sure how to write a question that extracts the information you need, start with a small test batch of users. Be open about what you want to find out, and listen to how they explain their feedback to you. Using the same language your users do will make your questions clear, and you'll get useful feedback in return.
Customers Are Hungry For Feedback
We use our blog and email list to publicly respond to our surveys every year. We generally jump on a few small changes or features immediately—low-hanging fruit that have proven to be popular requests—and get those done while survey responses continue to trickle in. This gives us small wins, things we can action to improve the product quickly, as well as some progress to show when the survey is over and we write up a blog post that details the results.
In our blog post we share our future plans based on the most popular requests, our own vision for the product, and some of our most interesting findings.
For instance, this year we learned that a majority of our users have one of our mobile apps installed, and most of them use our iOS app. This is important for us to know, because my co-founder Josh handles our web and Android development and works on Exist full-time, while I'm our iOS developer and make much slower progress, since I'm still working a day job. Insights like this help us figure out when we need to shuffle things around to make sure we're moving as fast as possible on what our users want most.
Sharing our plans and what we’ve learned from the survey lets our users know we listened to them and used their feedback to make decisions. We generally make plans for around a year ahead, which coincides nicely with the next survey we send out.
Having a deeper understanding of your customers can help you prioritize the growth of your business, increase client retention and repeat business, and improve quality and profitability across the board.
If you want to experiment with a user survey of your own, keep these insights in mind. An annual survey may not suit every type of business, but trial and error helped us make surveys an efficient way to get immediate, useful answers to the burning questions we had about how to improve our product.
Just avoid sending out user surveys too often—or your customers might get sick of filling them in.