@Mark Mitchell has a saying that’s gotten popular around here. He says “If your process is set in stone, it may as well be your tombstone.”

That’s why I love working with him; he’s got all sorts of nuggets of wisdom to share!

I think that quote resonates strongly when talking about VoC (voice of the customer). It’s so important to hear what customers are saying and to implement updates based on their feedback.

So… please share what methods you’ve found most successful to capture/implement customer feedback!

 

PS if you’re interested in chiming in on the topic, join our livestream!

Just trying to be like YOU, @emaynez!

 

I’m a huge fan of our CSAT process, and how @ellibot built an integration that notifies me in Slack whenever we get a new submission. It gives me a chance to review the feedback and schedule time with the customer to hear what worked well for them, and what could have gone better.

All you need to do is ask, then add that to your pool of decision making. Not every piece of feedback gets implemented, but everything is discussed and considered 100%.
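A notification like the one described above could be wired up with a Slack incoming webhook. This is just a minimal sketch: the submission fields (`customer`, `score`, `milestone`, `comment`) and the function names are hypothetical, and the real payload from a survey tool like GUIDEcx would look different.

```python
import json
from urllib import request


def format_csat_alert(submission: dict) -> dict:
    """Build a Slack incoming-webhook payload from a (hypothetical) CSAT submission."""
    score = submission["score"]
    # Flag low scores loudly so they get a follow-up call
    emoji = ":white_check_mark:" if score >= 4 else ":rotating_light:"
    text = (
        f"{emoji} New CSAT from {submission['customer']}: "
        f"{score}/5 on '{submission['milestone']}'"
    )
    if submission.get("comment"):
        text += f"\n> {submission['comment']}"
    return {"text": text}


def post_to_slack(webhook_url: str, payload: dict) -> int:
    """POST the payload to a Slack incoming webhook; returns the HTTP status."""
    req = request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.status
```

In practice you’d point `post_to_slack` at a webhook URL for the channel your team watches, so every submission surfaces without anyone polling the survey tool.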


Customer feedback is some of my favorite data to work with! Being able to trend things such as NPS and CSAT over time is critical to analyzing your onboarding process. It’s also fun to think of creative ways to get feedback from customers. Overall, most respondents to a CSAT survey just give a score without leaving any written feedback. Trying new things like injecting humor or asking in a different way can improve response rate. This is a lifelong task! Those responses are pure gold when it comes to incremental improvements.


I am in the process of revamping our CSAT so we capture relevant information at the right time. Funny enough, I recently found out there was a post-implementation CSAT here that no one shared with us. The email to the client even went out with my boss’s signature and he didn’t know about it.

 

I’d love to hear what others are asking during their CSAT/NPS surveys, what the response rate is, and how you follow up on them.


@rondeaul,

Crazy!

 

We’re using the GUIDEcx CSAT surveys that get sent to customers once specified milestones are completed. We also have another one that sends once the whole project is complete. It is a very simple “on a scale of 1-5, how was your experience?” survey.

 

We’re also building logic on the backend that will follow up with customers 1-2 months post-onboarding if they didn’t leave a project CSAT, and another email that sends when we get a score below 4. The email basically just thanks them for their feedback and provides a quick 15-minute Calendly link to schedule a time where we can chat and confirm what worked and what didn’t.
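The follow-up logic above boils down to two triggers. Here’s one way it could be sketched; the 45-day window, the threshold of 4, and the action names are assumptions for illustration, not the actual backend.

```python
from datetime import date, timedelta
from typing import Optional

FOLLOW_UP_WINDOW = timedelta(days=45)  # roughly 1-2 months post-onboarding
LOW_SCORE_THRESHOLD = 4


def follow_up_action(project_completed: date,
                     csat_score: Optional[int],
                     today: date) -> Optional[str]:
    """Decide which follow-up email (if any) a completed project should get."""
    if csat_score is None:
        # Never left a project CSAT: nudge once the window has passed
        if today - project_completed >= FOLLOW_UP_WINDOW:
            return "nudge"
        return None  # still inside the window; keep waiting
    if csat_score < LOW_SCORE_THRESHOLD:
        # Low score: thank them and offer a 15-minute scheduling link
        return "low_score_call"
    return None
```

A job like this would typically run daily over all completed projects, so each one gets at most one of the two follow-ups.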

 

It’s also interesting to report on milestone duration, then overlay that with CSAT by milestone to see if there are any trends.


One thing I’ve done in the past and am going to include in our CSAT revamp is the follow-up process. After every survey response I want one of our managers to have a live call with the client who responded to get any additional details they’re willing to share. I have found that these post-survey calls typically last 15-30 minutes and give incredible insight into our processes from the client’s POV. We’re able to dig into why they scored a specific question one way and get details they often don’t want to take the time to write down.

In practice, about ½ to ¾ of survey respondents will agree to a live call after the survey. It’s a huge value add.


The survey questions I’m planning on sending clients (barring last minute changes) are:

  1. How satisfied are you with the implementation of Spiff? (1-10)
  2. Did the implementation meet the expectations provided during the sales process? (Likert Scale)
  3. How would you rate your implementation team? (1-10) (will automate and populate team member names in the survey as a reminder)
  4. How can we improve our implementation process? (free text)

I’d love to hear how this compares to your questions.


@rondeaul that sounds like a world class CSAT experience, I’m excited to hear how the rollout goes, keep us posted!

 

I definitely think SaaS may have missed a human touch in 2023. This post from Jason Lemkin really resonated with me: https://www.linkedin.com/posts/jasonmlemkin_csin2023-activity-7068236076364009472-2mbi?utm_source=share&utm_medium=member_desktop

 

Taking customer feedback seriously is probably the best thing we can be doing in 2023. A post-survey call sounds like a perfect solution!


Love that post @ellibot. Thanks for sharing that.


Sharing this idea before I forget:

I had this idea once upon a time where we would hire a high school student or an intern and they would call every single newly onboarded client and conduct a 10 min “CSAT Call”. 

The caveat is that it can’t be an automated/recorded one like bank surveys. The idea is that people are much more open on a live conversation and are able to express emotion better. Imagine the higher quality of data you could collect by conducting these 10 minute calls!

Of course the expectation would be set by the onboarder that a call would be coming. That way the customer answers and feedback is captured.


Hi @rondeaul,

There is some leading methodology that is beginning to move away from the NPS question format for transactional experiences (e.g. services for purchase, onboarding, support inquiries). NPS is still heavily used for relationship experiences (e.g. customer life-cycle, annual/semi-annual distributions), usually led by Marketing or a proper VoC office.

For transactional surveys (e.g. customer implementations and onboarding), I tend to see a heavier focus on a 5-point or 7-point Likert scale that fits the transactional model. This can include an overall satisfaction question.

General best practices (emphasis on general):

  • Do not ask questions you already have data on (e.g. respondent persona information, geo information, demographics, etc.)
  • Transactional surveys should be trigger-based (after chat support, conclusion of a major milestone, conclusion of the onboarding project, conclusion of a training experience, etc.).
  • Ask your most important questions first. That way, if your respondents drop out, you still have the most critical question response recorded.
  • Make sure the respondent can complete this type of survey within 5 minutes (ideally less than 5 questions, no more than 7).
  • No more than 1 qualitative response box (open text box), and leave this for the end of the survey.
  • Matrix tables require a lot of effort to complete, and you will see drop-off rates increase with these.
  • Survey with the intent to be actionable.
  • Close the loop with your respondents.

These are just a few items. Happy surveying!

-Mark

p.s. In your 2nd question example, it is worded as a definitive Yes/No, but you are using a Likert scale. Consider changing the opening text to something like: “To what extent...”.

p.p.s. You can use a 10 point scale or simplify it to a 5 point scale and still present in percentages with some minor calculations.
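The “minor calculations” in that p.p.s. are usually just min-max normalization, so scores from a 1-10 and a 1-5 scale land on the same 0-100% axis. A sketch of one common approach (the function name is made up for illustration):

```python
def to_percent(score: float, scale_max: int, scale_min: int = 1) -> float:
    """Normalize a Likert/CSAT score onto a 0-100% scale via min-max scaling."""
    return (score - scale_min) / (scale_max - scale_min) * 100


# A 4 on a 1-5 scale and a 7.75 on a 1-10 scale both normalize to 75%,
# so results from differently scaled surveys can be trended together.
```

The main caveat is that normalizing doesn’t make the scales psychologically equivalent; it only makes them comparable on one chart.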


Great discussion everyone! Thank you for your thoughts! We just held our weekly livestream (thanks to @Davi1700 and @DBoyd03 for jumping on). Here’s the link if you wanna view it: 

 

