“How do you measure the health of a community?” is one of the core questions every community professional will eventually be asked.

At CMX, one of the core ways we measure community health over time is with a survey. Our most recent community health survey ran in March 2021. Our goal was to hear from as many CMX members as possible about their experience and take away actionable next steps for the team. This survey also gives us a baseline to track member satisfaction over time; we’re planning to run it on a biannual basis.

In this post, we’re going to walk you through our full process, from the project brief and survey design to promotion and analysis of the results.
We’ll also share the actual survey we used. Feel free to use this as a starting point for designing your own community health surveys! For an example of our survey results, see the full results of the CMX Community Health Survey here.
First, some context. The CMX Community is the world’s largest and most passionate network of community professionals. With three online community spaces for members to connect, 25+ community-run events every month, training and research, a job board, CMX Summit, and the Community Industry Awards, CMX is a multifaceted, mature community.

With so many community activities, it’s important that we keep a pulse on what’s working, what’s not, and where we should focus our efforts. Let’s dive in!
At CMX, we begin every project with a brief. This first step gives us the opportunity to strategize and answer questions like, “Why is this project important?” “How will we measure success?” and “What deliverables do I need to have?” You can check out my full Satisfaction Survey brief here.

We had three goals for this survey:

1. Hear from as many CMX members as possible about their experience.
2. Take away actionable next steps for the team.
3. Establish a baseline for tracking member satisfaction over time.
For the timeline, I laid out all the steps I wanted to take and put them in a calendar for myself. Project managers always tell you to work backwards from the due date, so I did! Our quarter closes at the end of April, so I knew I wanted the survey to be completed and ready for analysis by then.

To see the full breakdown of my timeline, check out my project brief.
Then I began! I used Miro to create a mind map and brainstorm the flow of the survey. I knew I needed to ask specific questions about each space, but had to keep the survey relevant to all members.
In the end, I broke the survey down into four sections:

1. Core questions about the overall member experience
2. Programming: which community programs and spaces members use
3. Demographics
4. A fun section
I spent four weeks brainstorming, mapping out, building, and getting feedback on the survey layout. Here’s how each section breaks down:
All respondents answered the same questions up front. These core questions asked about the member experience in the CMX Community overall. We also asked about Bevy, and how Bevy’s acquisition of CMX had impacted members’ experience in the community.

To get specific in the programming section, we asked respondents to tell us which community programs and spaces they use or participate in. These responses would inform the layout for the rest of the survey, so the programming questions would be relevant to each member.

At CMX, we are committed to building a diverse, equitable, and inclusive space for all community professionals. To better serve our community and establish a baseline as we continue to work towards an inclusive community culture, we included a section on demographic information. This data will also be used to inform programming and other initiatives aimed at creating an inclusive culture, like resource groups for specific audiences.

Last but not least, the fun section! I waffled back and forth about whether to include this section. But in the end, I had to: to put a smile on your faces, and to collect information for some new brand images.
For more on survey design, check out 7 Steps to Create Effective Surveys and Collect Community Feedback.
Finally, you’ll need to promote your survey to drive responses. My goal for this survey was 300 respondents. This was determined by using a sample size calculator, targeting a 95% confidence level and a 5% margin of error.

Our survey was open for four weeks. During that time, we posted in all of our community spaces four times. The survey also appeared in the CMX Weekly each week. I sent two direct emails to CMX Community members, and sent about 100 direct messages in Slack and Facebook. Since this survey was only open to CMX Community members, we didn’t run any external campaigns (like social media).

For this survey, I made sure to track responses using UTM links and Google Analytics. Although the survey was anonymous, we still wanted to see which methods of engagement drove the most responses.
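If you’re curious what a sample size calculator is doing under the hood, here’s a minimal sketch using Cochran’s formula, the standard calculation behind most online calculators. This is an illustration, not the specific tool we used, and the population figure of 1,300 members is purely hypothetical, shown only to demonstrate how a finite community shrinks the target.

```python
import math

def required_sample_size(confidence_z=1.96, margin_of_error=0.05,
                         p=0.5, population=None):
    """Cochran's formula for the sample size a survey needs.

    z = 1.96 corresponds to a 95% confidence level, and p = 0.5 is the
    most conservative assumption about how responses will split.
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        # Finite population correction: smaller communities need
        # fewer responses to hit the same confidence/error targets.
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# 95% confidence, 5% margin of error, very large population:
print(required_sample_size())                   # 385

# Same targets for a hypothetical community of 1,300 members:
print(required_sample_size(population=1300))    # 297
```

For a very large population the formula asks for 385 responses; once you apply the finite population correction for a community numbering in the low thousands, the target lands near 300.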
After the survey closed, I pulled and analyzed the data, built reports, and wrote this case study! In this stage, our design team helped put together the charts and graphs, as well as social media assets.

We had 293 people respond to the survey. At a 95% confidence level, this means our margin of error is about 6%. Although not every member filled out the survey, we can be fairly confident that this sample of responses reflects the broader community.

The data from this survey gave us a baseline that we can track over time, and we hope to see ongoing improvement the next time we run this survey. We're actively working to make CMX a more vibrant, welcoming, and inclusive community.

To see the results from this first CMX Satisfaction Survey and read how the CMX team is implementing your feedback, check out this blog post!
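The quoted margin of error comes from running the same confidence math in reverse: given the 293 responses actually collected, compute the half-width of a 95% confidence interval for a proportion, again with the conservative p = 0.5 assumption. A quick sketch:

```python
import math

def margin_of_error(n, confidence_z=1.96, p=0.5):
    """Margin of error for a sample proportion at the given confidence level."""
    return confidence_z * math.sqrt(p * (1 - p) / n)

# 293 responses at a 95% confidence level:
print(round(margin_of_error(293) * 100, 1))  # 5.7 (i.e., about 6%)
```

So with 293 responses, the worst-case margin of error is roughly 5.7%, which rounds to the "about 6%" figure above.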