Listen to the podcast here:
How To Gain Powerful Insights From Customer Testing
I want to talk about how you gain powerful insights from customer testing. Bella Scena has completed a six-week alpha test process. I want to share some of the lessons learned, the process we used as we went through that test, and why we did it in the first place. It’s common to hear that someone’s launched a new product. You see product announcements all the time. The game becomes, “How do I get people to use my product and give me feedback?” It’s as if we’ve placed a want ad that says, “Software company desperately seeking feedback.” My inner practical engineer says, “It feels pretty gross and nebulous to do that.” Feedback is an ugly thing without context. What are you supposed to do with it? I’d been thinking about this whole process of gaining feedback and how to use it for testing, because we’ve been using human-centered design as our full methodology. I thought about how I could apply that to the feedback process as well.
That question sat with me for quite a while, and then Aaron Keller sent me an interesting article written by Rahul Vohra of Superhuman. Thank you, Aaron, for sparking the thought behind the methodology I used for the alpha test. The article walked through the methodology Superhuman used to measure product-market fit. I was inspired by this whole concept of measuring your testing, because that’s what he was getting at. How do you actually measure that you are on the right track? How do you know what to add? How do you know what to do as you iterate? I found that a fascinating concept. The underlying premise is: how do you make sure you’re creating a product that customers will crave, that they will love and will not let you take away from them? That’s what I wanted to go after. Out of that article sprang the idea for how I could run my alpha test. I ran a closed, structured, six-week alpha test to prepare my product for the market, and I used the five W’s: Who, What, Where, When and Why.
Who Am I Trying To Reach?
How did I do this, what went into each step, and how could you do something similar yourself? The first thing I asked myself was who I was actually trying to reach with this alpha test. That made me think a lot about my target market. Who was I planning to market to initially as part of my go-to-market strategy? I knew that I wanted to target solopreneurs and small businesses for my initial product as I continued to iterate, refine and double down on the product. As I looked for people to participate in the test, I made sure I had a representative percentage of solopreneurs and small companies versus large companies.
A little over 40 people participated in my test in total. Forty was a good number because people are busy. Not everybody has time to complete every test, and you want to make sure you’re going to get a representative sample of feedback. Thirty percent of my testers were solopreneurs, 55% were from small companies and 15% were from large companies. I knew this would give me a representation of the types of problems to make sure I had problem-solution fit for what my product was solving. The other thing that’s important to me is that I’m going after women solopreneurs and women-owned small businesses. It’s absolutely a target market for me. I have a real heart for having women involved in the software development process, especially after having been a woman in tech for many years.
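The segment split above is easy to tally programmatically. Here is a minimal illustrative sketch; the tester list below is invented to mirror the rough 30/55/15 split of a 40-person pool, not the actual tester data.

```python
from collections import Counter

# Hypothetical tester pool mirroring the split described above:
# ~40 testers at roughly 30% solopreneur, 55% small company, 15% large company.
testers = (
    ["solopreneur"] * 12
    + ["small company"] * 22
    + ["large company"] * 6
)

counts = Counter(testers)
total = len(testers)
for segment, n in counts.items():
    # Print each segment's share of the total pool.
    print(f"{segment}: {100 * n / total:.0f}% ({n} of {total})")
```

A check like this makes it obvious at a glance whether the pool you recruited actually matches the go-to-market segments you intended to represent.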
I’m extremely proud of having had so many women involved in the design of the product. At the two most crucial stages of this product, women were heavily included. When we did our reviews and were deciding what features to have, over 50% of the people that participated were women. As a percentage of testers, 45% were female and 55% were male. I’m extremely proud of that, because I would dare you to go ask other companies what percentage of their tester pool are women. For me, this was hugely important and something I focused on very much in the alpha test.
The next thing you have to think about is how you’re going to find these people. I wasn’t just paying money for a user-testing service; I needed something more intensive than that. About 25% of my group had participated in previous tests. They’ve been with me this whole time, probably around ten or eleven of them, along with others I’d met along the way and during some of the last rounds of testing. These are all people who had indicated an interest in the product. It wasn’t me asking; they had all asked me to participate. The rest of the group were what I’d call second-degree connections: people that didn’t know me that well, many of whom came from the pool of people already involved in testing.
I would do something simple. I would say, “Do you know anyone else that you think would like to participate in this test?” Sure enough, all sorts of people came in. 75% of my testers had never seen the product in its previous iterations. This was extremely important to me because it meant they didn’t come with any implicit biases and they weren’t going to be nice just because they knew Amber. I could be 100% certain of that. It’s nice if people are nice, but it’s all about bias again and how we eliminate that bias as founders to make sure we’re absolutely building the right product. That’s the who of the target audience: a little over 40 people.
What Am I Trying To Learn?
The next question to ask is, “What are you trying to learn during the alpha?” I asked myself, “If I’ve got to do this alpha, what do I want to know?” A lot of that came back to those initial markets. As I first take my product to market, I want to have solopreneurs and small businesses, and if I’m going to do that, I can’t hand-onboard every solopreneur and every small business. I have to make sure we’re able to do self-onboarding and that the process is reasonably easy. That’s the first thing I wanted to measure. Could people get through those processes or would they need help?
The second thing I wanted to understand was around some components of the product itself. As a refresher, my product is Bella Scena. Bella helps you go from deliberate to done through an integrated calendar and to-do list. Instead of having a to-do list versus a calendar, it’s a to-do list and calendar. I wanted to test this whole concept of integrating your to-do list with your calendar. It’s different, and there aren’t many products out there that do it. There are a few that do pieces of it, but not where the calendar is the central component managing it. I wanted to see how well that took. Did people get the concept? How easy was it to navigate the design, and what did they think of that functionality? I wanted to understand what percentage couldn’t live without Bella based on this to-do list and calendar component alone.
The third thing I wanted to understand was our instant meeting feature. The instant meeting is essentially a digital notebook, a giant notepad where you can take your own notes. It saves them right on the calendar for you so you don’t lose them again. You can tag to-dos and follow-ups, and you can collaborate with other people on it as well. It’s a super-powerful feature. I’ve been using it myself and I’ve found it extremely helpful. I wanted to see whether people found it usable enough in its current form for the MVP, and what else we needed to add to it. Yet again, I would measure what percentage couldn’t live without the product now that they had seen that feature as well. Each step of the way, I was trying to measure and understand how well we were doing overall with the product. That was the what: those three things I wanted to measure.
Where Do I Do The Test?
The next one was where do you do the test? On the surface that might seem obvious, but having a tech background, you have to know what your product infrastructure looks like. We had set up Development, QA and Production environments because I come from an ERP background, and there you always have Dev, QA and Prod, but not everyone does it that way. Maybe you just have a development and a production environment. Are your testers going to be in a development system? Where are you going to put them, and what does that mean for their data? From a where perspective, I decided I needed a production environment stood up and ready to go. I was going to take the time and effort to set that up now so that I wouldn’t have to do it later. For me, the where meant getting my production environment set up, getting all my support processes set up, and then starting to practice release management. The where was important because I knew I was going to have to start incorporating changes moving through my systems. It’s part of that growing-up process as an entrepreneur: things we have to learn to do and release.
It also helped me make sure my support processes were actually working and usable, and showed me what else I needed to add. A key thing I learned here in the where is that I didn’t have a lot of support documentation. I hadn’t had time to write it. Along the way, I’ve been making sure to get that documentation written and those videos captured so initial users will have a better experience once we hit the beta. That was all the thinking that went into where.

When Am I Doing The Test?

Let’s talk about the next W: when. “When are they going to do this testing?” Everybody is busy. “How much time do they have to do it?” I wanted to have structured testing. I’m a practical engineer; I’m not one of these free-for-all people. I measure things and know what I’m doing.
I also added the caveat, “Try other stuff and let me know what you think.” Each time I released a test, I’d have a structured set of exercises I’d have people go through, and then say, “Play with things and send me feedback.” I wanted to give people an out, because you always have the curious sort. They’re going to push every button, try every little thing possible, and probably do far more than you expected them to do, which is completely wonderful and what you want, but you have to be prepared for it. I had a mix of primarily structured exercises, but I also left the door wide open for any unstructured feedback and observations testers had.
Why Did I Do This?
That takes us through when. Why did I do this? At the heart of measuring all of this, what’s going on here? There’s a lot of time and effort involved when you’re trying to make the decision, “Do you just go to market and try to get people in there?” I wanted to make sure I understood where I was as I was getting ready to go to market. I wanted to know what percentage of people were going to find it incredibly helpful as is. I originally thought I had to have my whole integrated calendar component, my to-do lists, my instant meetings, my meetings, everything there before I could release it. As the first user, I realized there was a tremendous amount of value in just the to-do list, calendar and instant meeting. I wanted to test that premise, because my original idea of what I needed to include in my MVP was wrong. I was insisting on being too perfect and having too much in my MVP. It’s a common mistake for a lot of founders.
This was going to help me get comfortable with not having to be quite so perfect and going to market sooner, as opposed to continuing to invest in building out this whole other component. How did they rate it? This is the best part: lessons learned. “What did you actually find out, Amber? Give us some numbers.” I was pleasantly surprised by the feedback that I got. I thought people would be hard on the designs because we were at an early stage. We’re at about version four of the product. It turned out we were closer than I thought in a lot of areas, though there’s always room for improvement.
Let’s break down a couple of these. Take our to-dos and the way we had set up our to-do list and integrated it with the calendar. We had some different concepts in how we had built that. 90% of the people that tested it gave us between a one and a three on the “it’s not hard at all” scale. In other words, it’s not a problem to use this functionality. That was pleasantly surprising. I thought more would struggle with it. I knew we had worked hard on making it intuitive, but you don’t know what’s going to happen out in the wild.
As for the instant meetings, that digital notebook concept, we had put the bare bones there and I knew it needed some work. 100% rated it super easy to use, a one or a two on “this is not difficult at all.” I was surprised to see such a high percentage rate it that way. About 50% came back and said one of the components, how our to-dos worked within it, needed a little work. There were improvements they wanted, and I was happy to hear that. It was great to get that feedback. About half the people said it was super easy and they could use it right out of the gate. They could figure it out without my having to spend a lot of time explaining it. It was interesting to see this feedback for what would happen as people used the product.
Overall, I was hoping to hit somewhere around seven out of ten if I was lucky with my MVP. It’s an early product, so I knew I was also going to be doubling down and using this as an extremely helpful way to gather feedback. If the scores were terrible, I wasn’t going to be offended. I’d probably be disappointed, because we’ve been at it for a couple of years, but you can’t be offended. You’re out in the wild; people are using it. After the second exercise, I started measuring the percentage of people that basically said, “Don’t take it away,” asking, “How likely are you to recommend this product to a friend?”
After the second exercise, just with the way the to-do list worked with the calendar, 70% of people gave me a seven or better. I was shocked. They found what we were doing with integrating to-do lists into the calendar that helpful. They would absolutely tell a friend to use this. I was thrilled to see the numbers come in that high that early. Of course, we still have work to do, but it felt good to see that all the thought, work and care we put into it was working. After the third exercise, I was up to 83% of people giving me a seven or more out of ten on my MVP. 83% of the people that had tried it out, used all this functionality and given me feedback said yes, they’d pretty much recommend this to a friend.
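A readout like “what percentage gave a seven or better out of ten” is simple to compute from raw survey responses. Here is a minimal sketch; the function name and the sample scores are my own invention for illustration, not the actual survey data.

```python
def percent_at_or_above(scores, threshold=7):
    """Percentage of survey scores at or above the threshold (0-10 scale)."""
    if not scores:
        return 0.0
    hits = sum(1 for s in scores if s >= threshold)
    return 100.0 * hits / len(scores)

# Hypothetical responses to "How likely are you to recommend this to a friend?"
exercise_two = [7, 8, 5, 9, 7, 6, 8, 10, 4, 7]
print(f"{percent_at_or_above(exercise_two):.0f}% rated it 7 or better")
```

With these made-up numbers, the sketch prints “70% rated it 7 or better,” the same kind of threshold metric described above. Re-running it after each exercise is one way to watch the score climb from round to round.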
In the end, what I learned is that we’re ready to go to market. I can go to market confident that I have something good here. All the groundwork we’ve laid is ready. Not only that, the group providing all that feedback is telling me how to improve my scores. They’re telling me how I could get to higher numbers. I was looking for consistency in that feedback; when I found consistent feedback, I knew what features I was building next. There’s a lot of noise as an entrepreneur. How do you decide where to put your focus and your efforts? For me, it’s about doubling down on creating a product that you won’t let me pry out of your hands and that you will happily tell all of your friends makes a huge difference for you. That’s the kind of product I want to build. This is how I’m working my way toward that, and I’ll continue gathering feedback as part of our beta test as well.
That’s my five steps for how I run an alpha test. It was an incredible process to go through. If you’re wondering what I did beyond the Superhuman article, this is it. I’m sure I’ll write about it in the future, but this is what I did. I can’t say it came out of any structured methodology other than what made sense as I navigated it for my company and my company’s goals. Hopefully it gives you some good food for thought as you think about how we measure things and how we know things are working. That’s hard. There’s a lot of noise, and it’s hard to find the signal. Running this structured alpha test was a way for me to find the signal in all the noise that goes with building a company. If you’d like to learn more about me, you can follow me @BeingWonderly on Twitter, follow my company Wonderly Software Solutions on LinkedIn, or connect with me on LinkedIn. I’m pretty easy to find. That’s it for this episode. It’s time to go be wonderly.
- Bella Scena
- Article by Rahul Vohra
- @BeingWonderly on Twitter
- Wonderly Software Solutions on LinkedIn
- Amber Christian on LinkedIn