Defining the design for an AI-driven conversational sales enablement SaaS tool
Project Overview
Q for Sales is an AI conversational intelligence sales enablement tool created at the height of COVID to help sales reps review their past meetings and understand their clients’ perspectives and needs. It capitalized on the fact that many meetings had to take place over Zoom or other conferencing platforms, which made it harder for sales reps to “read” their clients on a sentiment level, such as gauging how customers reacted to a pitch.
Timeline: June 2021 - June 2023
My Role & Contributions
Founding designer from product inception to GA launch and all subsequent releases
Created the information architecture and navigation structure
Defined and researched all user workflows and interaction designs
Conducted competitive analysis of other sales tools to aid the product team with requirements
Hands-on low and high fidelity designer & prototyper
Collaborated with engineering teams to incorporate AI technology into the product
Creating Product Value using AI
Because there were several competitor products in the same space, we needed to figure out not only how to stand out from them, but also how to give more value to the user. We saw an opportunity to leverage AI to provide insights about meetings that would otherwise require users to thoroughly review each meeting, which could take several hours depending on how long those meetings were.
How?
We identified two areas that matched both what the sales reps we interviewed were looking for and what our technology could support:
Summary creation
Emotion lens
Summary Creation
Our data science team worked with several AI models, such as ChatGPT and LaMDA, to enable quick, easily digestible summaries of the overall meeting, key moments, and action items.
Summarization
This feature came from a customer request for an overall summary of what happened in a meeting, mainly so managers could get a quick understanding of their sales reps’ meetings. It was a highly sought-after feature and received positive feedback from customers on its accuracy.
The primary challenge with summary creation was length: sometimes the output was considered too short, and other times it felt too long to be a summary.
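One way to reason about this trade-off is to scale the requested summary length to the transcript length and clamp it to a readable range. The sketch below is illustrative only; the `target_summary_words` helper and its ratios are my own, not the team’s actual implementation:

```python
def target_summary_words(transcript_words: int,
                         ratio: float = 0.1,
                         min_words: int = 50,
                         max_words: int = 150) -> int:
    """Pick a word budget for the summary: a fixed fraction of the
    transcript length, clamped so short meetings still get a usable
    summary and long meetings don't produce a wall of text."""
    return max(min_words, min(max_words, round(transcript_words * ratio)))

# The budget can then be folded into the model prompt, e.g.
# f"Summarize this meeting in about {target_summary_words(n)} words."
```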
Key moments & Action items
Before we used AI to generate summaries, key moments and action items were pulled directly from the transcript. These transcript excerpts were often very long and difficult to read through, so we applied the summarization technology here as well.
Key moments were designed as cards to create visual separation between them and were limited to at most 250 characters so the card lengths would not vary dramatically and the layout would stay cohesive.
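A character limit like this can be enforced with a simple word-boundary clip. This is a sketch of the general technique; the helper name is mine, not from the product:

```python
def clip_moment(text: str, limit: int = 250) -> str:
    """Trim a key-moment summary to the card limit, cutting at the
    last word boundary and appending an ellipsis so cards stay tidy."""
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)  # last space before the limit
    if cut == -1:
        cut = limit - 1              # no space found: hard cut
    return text[:cut].rstrip() + "…"
```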
I also included the ability to play back the part of the meeting from which a key moment or action item was derived, so users could double-check whether our summaries were correct. This let us build trust with users and gave them a way to verify our reliability.
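Supporting that playback means each generated moment has to carry a pointer back to its source span in the recording. A minimal sketch of that data shape (the `KeyMoment` name and fields are hypothetical, for illustration only):

```python
from dataclasses import dataclass

@dataclass
class KeyMoment:
    """A generated key moment kept alongside the span of the recording
    it was derived from, so the user can replay and verify it."""
    summary: str
    start_s: int   # offset into the recording, in seconds
    end_s: int

    def playback_label(self) -> str:
        """Human-readable mm:ss range shown on the playback control."""
        fmt = lambda s: f"{s // 60:02d}:{s % 60:02d}"
        return f"{fmt(self.start_s)}–{fmt(self.end_s)}"
```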
This is where the overall meeting summary would live.
These would be where key moments that were pulled from the meeting would live.
These would be where the actions items that were derived from the meeting would live.
Emotion Lens
Providing the sentiment of the meeting was a vision from our SVP to help sales reps understand which parts of the meeting left customers with positive or negative feelings. The aim was to draw a correlation between how customers react emotionally in a meeting and the likelihood that a deal would close successfully. It was also meant to help sales reps better anticipate customer objections, identify them more accurately, and find solutions for them.
Our data science team created the technology to detect and analyze the sentiment and engagement of people in the meeting through computer vision, tonal, and NLP models during post-processing. However, our product team understood that the emotion lens first needed to be built on top of a feature base that customers expect from a conversational intelligence solution, such as a quick way to see when and where people spoke, a full transcript, and basic actions such as commenting, sharing, and creating clips of the meeting. Therefore, the emotion lens is a layer on top rather than its own standalone feature.
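Conceptually, the post-processing step combines the per-channel signals into one sentiment score. A minimal sketch, assuming each model emits a score in [-1, 1]; the weights here are illustrative, not the team’s actual values:

```python
def fuse_sentiment(visual: float, tonal: float, text: float,
                   weights: tuple = (0.3, 0.3, 0.4)) -> float:
    """Weighted blend of the three per-channel sentiment scores
    (computer vision, tone of voice, NLP on the transcript), each
    assumed to lie in [-1, 1]. Weights are illustrative only."""
    channels = (visual, tonal, text)
    total = sum(w * s for w, s in zip(weights, channels))
    return round(total / sum(weights), 3)
```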
The emotion lens applied to the following areas:
Meeting
Each meeting has a timeline where the viewer can see how sentiment changes throughout the meeting. Additional indicators on the timeline show when key moments were detected and when a screen was shared.
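Rendering a timeline like this typically means averaging sampled sentiment scores into fixed-width segments. A sketch under assumptions (the sampling shape and bucket count are mine, not the product’s):

```python
def timeline_buckets(samples, duration_s, n_buckets=20):
    """Average (timestamp_s, score) sentiment samples into n fixed-width
    timeline segments; segments with no samples become None so the UI
    can render a neutral gap there."""
    width = duration_s / n_buckets
    sums = [0.0] * n_buckets
    counts = [0] * n_buckets
    for t, score in samples:
        i = min(int(t / width), n_buckets - 1)  # clamp the final sample
        sums[i] += score
        counts[i] += 1
    return [sums[i] / counts[i] if counts[i] else None
            for i in range(n_buckets)]
```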
Speaker track
Each speaker has a sentiment bar below their track showing during which parts of the meeting positive and negative sentiment were detected. There is also an overall sentiment and engagement value per person.
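Those two per-person values can be rolled up from the same per-segment data that drives the sentiment bar. Both the segment shape and the formulas below are illustrative assumptions, not the product’s actual logic:

```python
def speaker_rollup(segments, meeting_duration_s):
    """Roll per-speaker (start_s, end_s, sentiment) segments up into the
    two per-person values shown in the UI: overall sentiment (a
    duration-weighted mean) and engagement (share of the meeting
    this person spent speaking)."""
    spoken = sum(end - start for start, end, _ in segments)
    if spoken == 0:
        return {"sentiment": 0.0, "engagement": 0.0}
    sentiment = sum((end - start) * s for start, end, s in segments) / spoken
    return {"sentiment": round(sentiment, 3),
            "engagement": round(spoken / meeting_duration_s, 3)}
```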
Key moments
Each key moment also has a sentiment value attached as a quick way for sales reps to see which ones they may want to review. When a moment carries negative sentiment, reps may want to figure out why, so we provided the option to play that moment back.
This image depicts when the emotion lens is turned off.
This image depicts when the emotion lens is turned on.
Example Design Deliverables
Below are examples of design deliverables that I created for this product.
Information architecture
Below is the FigJam of the information architecture behind the meeting page, which sales reps use to review past meetings with their clients. I worked with the product team to prioritize which types of actions and features were “must haves.”
Figma Components
Below is an example of the Figma components I created and organized for this product.
Prototype
Below is a video of the prototype I created in Figma to showcase the meeting replay page in Q for Sales and its different components.
Where is the Project Now?
This product has a small number of customers who are helping test it. It is currently at the stage of trying to find product-market fit.