Generative user research

Users told us we weren’t measuring the right things and they couldn’t understand our charts.

We interviewed seven people who matched our ideal customer profile to get their perspective on the data we were collecting and the way we were visualizing it on the current client dashboard. I put together an interview guide and interview questions, scheduled the interviews, and synthesized key themes.

Key findings

Our data wasn’t meaningful. We weren’t collecting the data that mattered most to clients.

Our charts were confusing. Users couldn’t easily interpret our data visualizations.

Market research

We needed to catch up to the competition and provide something better: programs aligned to client goals.

We evaluated our competitors’ platforms and read up on program evaluation models, which led us to recognize three problems to solve on our existing platform.

Key findings

We were behind the curve. Our platform was missing too many table-stakes features to be competitive.

Our reporting process was not scalable. Unlike our competitors, our reporting required time-consuming manual work.

Our programs lacked focus. Our platform didn’t support aligning programs to client goals, which could be a key differentiator.

Domain research

We found better ways of measuring ROI and designed a new program creation process to produce it.

We tried to learn everything we could about how to prove the ROI of coaching programs, which led us to the books The Kirkpatrick Model of Training Evaluation and Measurement Demystified. We also interviewed subject matter experts in soft skills proficiency measurement.

Key findings

Align programs to client goals. Our program creation process was ad hoc. We rethought the client journey and created a conversation framework that would help align programs to client goals.

Ask the right questions. Measuring soft skills is hard, but experts have been noodling on this for a long time, and there are some tried-and-tested survey questions for evaluating them as well as possible.

Usability testing

Users told us the new data we were collecting was valuable, our visualizations were getting better, and exporting data was a big value add.

Based on our findings, I introduced a new navigation structure, new types of ROI data collection, and new data visualizations to our prototype. Then I designed a user test (a first for Lingo Live) based on what we had learned L&D Admins needed to do on the platform. We brought it to five of the same interviewees to find out whether our new design made the value of our programs clearer.

Key findings

Our new data was more valuable. Interviewees rated most of the data 4 or 5 out of 5 in terms of value.

Our new charts and functionality were better. Some of our chart titles were still confusing, but overall, the visualizations were much clearer, and interviewees loved that they could filter and export the data.

User problems

It was hard for clients to know whether programs were producing value. Account managers were buried in manual work.

The biggest problem we identified was that the program design process was ad hoc and key client information was lost in the shuffle. As a result, programs were not systematically designed to align with clients’ business goals, which meant they weren’t set up for success and expansion. Fixing that would set us up to collect more meaningful data, prove value, and expand programs.

Client user problems

Unfocused programs. Our platform didn’t support customized programs aligned to client goals.

Unconvincing survey data. Surveys didn’t measure meaningful ROI data, and response rates were often too low to draw conclusions from.

Lack of engagement visibility. Clients couldn’t easily track session usage and survey completion.

Lack of content visibility. Clients couldn’t tell what coachees were working on and whether it was providing business value. The coaching felt like a “black box.”

Unclear and unconvincing ROI reporting. Clients couldn’t easily report on program ROI to their executive teams.

Internal team user problem

Unscalable manual reporting. Account managers were buried in time-consuming manual work to produce client reports.

Project goals

Improve the service design of program creation so programs produce more value. Measure that value well and make it easily visible.

Goals

Focused programs. Set clients up for success by ensuring programs are consistently structured to achieve client goals.

Convincing ROI data. Collect data that measures program value and ROI in a meaningful way.

Streamlined survey creation. Increase efficiency by automating some of the survey creation process.

Clear, compelling, and scalable reporting. Share data automatically in a way that allows admins to focus on what is relevant to them in the moment.

Self-service engagement tracking. No more tedious back-and-forth. One place to get immediate visibility into unused sessions and incomplete survey data.

Responsible content and progress sharing. Provide visibility into coachee work and progress without breaching the confidentiality of the coach/coachee relationship.