What are different messaging processes that you do? And which ones work best to help team/stakeholders come to consensus faster?
If you have differing opinions across your org, you definitely need to bring some data to the table. Testing is a critical part of developing successful messaging, and testing or market research does not have to cost an arm and a leg.
Even if you don’t have a huge budget, you can still do some amount of testing to gather insights that help highlight what resonates with your market. You also want to do a mix of these to get a holistic view of what works and what doesn’t across various channels. Here are a few ideas to get you thinking:
- Test in different marketing channels:
  - A/B testing on your website is probably one of the best ways to test messaging if you get a decent amount of traffic to your site.
  - Test it on social channels to see what people respond to most; this can be free.
  - A/B test email copy. Most companies are already doing email marketing, so pick a few emails to test your messaging in.
  - SEM ads: if you are already running paid ads, just test your messaging against a control. SEM ads are typically quick to implement and can give you a wealth of data.
  - If you have a SaaS product, test your new messaging in-product.
  - If you have an outbound sales motion, test your messaging in your sales outreach emails or call scripts.
- There are DIY market research products out there that are affordable (shameless plug, but as a PMM, I regularly use Audience, our online DIY panel, which is very affordable for quick projects like this). We used it to develop our own messaging internally; very meta, I know.
- Focus groups. You will likely spend a little on incentives here, but run a few focus groups with a mix of current customers and potential customers; the feedback you get from these interviews is usually worth every penny. I have also seen a few apps that help you source very specific audiences for user research.
Lastly, another approach we took was analyst briefings. Analysts are experts in your space, so leverage their opinions (along with the data from all your other tests) to build consensus internally.
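Once one of the A/B tests above has run (website, email, or SEM), it helps to confirm that the difference between variants is statistically meaningful before bringing it to stakeholders as "data." A minimal sketch of a two-proportion z-test in plain Python (the traffic and conversion numbers below are hypothetical):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of two message variants.

    Returns the z statistic and a two-sided p-value.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: variant A (control) vs. variant B (new messaging)
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```

A result like this is much harder to argue with in a messaging review than "variant B felt stronger."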
Two-part question? Two-part answer :)
Messaging Processes & Consensus
The type of messaging we do varies depending on whether we’re looking at our entire product, a feature or set of features, or a campaign or sales motion.
In most cases, we’re likely to draft up a mini-brief to make sure we have strategic alignment. You can find an example mini-brief template in my PMM Hub template.
For net new products, or those that shift the paradigm of our product, we likely need to allocate time to naming. My impossibly talented colleague Laura, who leads our brand team, has codified her process in this Namestorming doc, which you can copy and use.
Both of these help drive alignment, but when it comes to overall consensus for naming and messaging, we’re likely to dedicate some time in a “Catalyst” meeting at Coda to drive decisions. It’s common for these meetings to have a pre-read, which includes an opportunity to log sentiment and questions, and to vote on questions or feedback ahead of time so that we can focus on the top issues/concerns.
Driving consensus requires a company-wide set of guiding principles and governance for decision-making. The way you confirm messaging decisions shouldn’t look wildly different from how you confirm product, budget, headcount, etc. decisions.
You can see how we put our principles into practice in the Coda Meeting Starter Kit.
A/B Testing & Alternatives
Never underestimate the power of talking to your customers, community and colleagues.
While A/B testing might be costly, the only cost of sitting in on customer calls is your time. If you have 2-3 options for how you want to frame an upcoming product or feature release, jump on a few calls and try describing the feature with a different option on each call. Take notes on the types of questions you get, the level of excitement, etc.
We’re fortunate to have a pretty vibrant community on community.coda.io, where we’ve announced a few betas this year for bigger product updates like infinitely nested pages, forms, and attachments. The volume and content of responses to invitations to participate in betas help to identify key value props and resonant language, and your beta participants can also be engaged as customers in the above example.
While your colleagues might have some biases (they’re likely power users of your product!), it’s still helpful to poll them if you have some options. At Coda, we pretty frequently use a voting table and/or sentiment tracker to get feedback at scale on proposed messages, campaign strategies, feature sequencing, problems to solve, and more.
While none of these provide the objective, data-driven insights of a well-articulated A/B test, they can help point you in the right direction so that any tests you do conduct are a good use of resources.
I find it helpful to put together a simple exec-level deck that encapsulates the key messaging, how I got there, and how it'll show up. I use that deck to organize my own thoughts concisely, as well as to socialize it with and gather feedback from stakeholders.
1. Key messaging - this is usually a one-slider with a headline message and 2-3 supporting or key points. It's just a way to directly answer "what are we saying?"
2. How I got there - key audience insights, industry trends, data points or considerations (e.g., sales feedback) that drove the messaging (and show off all your hard work and thought leadership!)
3. Even if it's not fully baked, a very simple channel plan with a timeline (and even some mocks, if your brand team is so willing) can do wonders to help stakeholders get on board and excited.
A/B testing is always helpful if you're in a good position to do it right! However, I often find that some marketers A/B test entirely different messages in a way that introduces too many variables and doesn't make for a true test.
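Part of doing it right is changing one element at a time (say, just the headline) and sizing the test before you run it, so you know whether you even have the traffic to detect a difference. A rough sketch using the standard two-proportion sample-size formula (the 5% baseline and 1-point lift below are hypothetical, with the usual 0.05 significance level and 80% power baked in):

```python
import math

def sample_size_per_variant(baseline, mde):
    """Approximate visitors needed per variant to detect an absolute
    lift of `mde` over a `baseline` conversion rate.

    Assumes a two-sided alpha of 0.05 and 80% power.
    """
    z_alpha = 1.96  # z-score for alpha = 0.05, two-sided
    z_beta = 0.84   # z-score for 80% power
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / mde ** 2)

# Hypothetical: 5% baseline conversion, hoping to detect a 1-point lift
print(sample_size_per_variant(baseline=0.05, mde=0.01))
```

If the number that comes back is far beyond your realistic traffic, that's a signal to lean on the qualitative methods above instead of running an underpowered test.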