As marketers, we often can't tell whether a campaign will land before launch. We typically rely on click-through rates and conversions after going live, but by then the time and money have already been spent. Sure, there are proactive testing methods, like A/B testing, that can help you pivot mid-campaign, but even the most telling A/B tests only give you metrics. In other words, if one design outperforms the other by 10%, would you really know why?
What if you could test every aspect of your campaigns quickly and directly with your target audience before going live? What if you could see and hear them interacting with your messaging and designs before ever launching it to the market?
It may sound too good to be true. At least that’s what I thought when I joined UserTesting in July earlier this year.
Rarely do we have the time or budget to get focus group feedback on ads and content that need to be turned around in weeks—sometimes days. UserTesting, however, is an easy, fast, and straightforward way to get the feedback you need to launch campaigns with confidence.
Don’t believe me? Here’s a brief use case of how I, a marketer and novice tester, used UserTesting to make my campaigns as impactful as possible. This example focuses on a campaign targeting marketing folks in financial services.
Building a test for the first time
Without question, I’m not a researcher. Nonetheless, the process was quite simple. I used one of UserTesting’s templates to create a test in about 20 minutes.
Here’s a snapshot of the test:
- Included 12 people in the UK who held marketing roles within financial services
- Asked what they thought of UserTesting’s website, including the messaging and content from blogs and other resources, to understand whether it resonated with them
- Asked follow-up questions that helped me understand:
- How to best position our offerings based on the audience’s challenges
- What type of content to use in the campaign based on their learning preferences
Despite the niche audience, all the results came back within 48 hours of launching the test.
Turning ‘learnings’ into actions
Surprisingly, the test results were easy to digest. Each contributor’s video feedback was under 16 minutes long. What made it even easier was UserTesting’s suggested sentiment feature, which highlights sections of the video with positive and negative sentiment so I could skip straight to the most important feedback.
In total, it took me about 90 minutes to analyse the results from every video.
Here are some examples of what I learned and how I turned them into actionable insights.
1. People in financial services are skeptical that the UserTesting platform can meet their demographic needs
Several people raised a concern about whether the UserTesting Contributor Network (panel) could reach their exact customers or target market.
Listen to one person share their feedback:
With this feedback, I then knew I would need to ensure the campaign messaging made it clear that people can use the UserTesting platform to reach their target audience in three different ways:
- Through the UserTesting Contributor Network
- By sending the test to their own database (a custom panel)
- By sending anyone a secure link
2. Marketers in financial services need more context to understand their data
The marketers in financial services mentioned needing more context for their data. To be honest, I suspect this is the case in every industry, but that’s just my untested hypothesis.
Nonetheless, when I asked these marketers what they would like to understand better about their customers, one person explained that it was tough to understand the type of person who comes in through digital channels.
Take a listen.
This feedback gave me another pain point to address in the campaign content. It may even be another hook for my messaging. Essentially, the data is only providing so much information. It shows what a user is doing, but doesn’t say much about who they are and what makes them tick.
In other feedback, the marketer wanted to know more about why a user didn’t make a purchase from them. Again, they have data showing where a prospective customer might have dropped off, but it doesn’t give them context on why.
Hear what he had to say.
3. Insight into preferred content types
What type of content do different audience segments prefer to consume? We can obviously see the number of downloads on our websites, but what about new audiences that aren’t already engaged with our content?
Clear themes emerged in the feedback and nearly all marketers in the test were frustrated with long-form written content such as ebooks or whitepapers.
The majority stated that they preferred to learn and digest content through watching videos or through content that’s shorter in length and more scannable.
So I can deduce that video and visual content will be the clear winners in this campaign for this audience, with some bite-sized resources thrown into the mix.
Now that I understand more about my target audience, it’s time to get stuck in and craft the campaign messaging and test some ads! This was much easier than I thought it was going to be. As I said, I’m definitely not a researcher and have never conducted my own research before, but I was really surprised at the speed and quality of feedback.
Next up, I’ll share results from how I tested ads to gain maximum conversions from the right leads.