1 week to concept test a tool for physicians

Facilitating ongoing medical education

I was brought onto a 4-month project when there was just 1 week remaining to evaluate the desirability of a product that aims to streamline Continuing Medical Education (CME) accreditation for medical professionals.

Context

Continuing Medical Education (CME) credits are required for all medical professionals to maintain accreditation in their field, and are earned by completing educational opportunities applicable to their work. Our client had noticed a service gap that caused many medical professionals to have a less-than-ideal CME experience. A team of consultants had been working with them for weeks to flesh out a product concept that could address CME pain points. By the time I was added to the project, a lot of work had been done to research the current CME experience, but none of it had been validated with the primary users: medical professionals.
My role was to use the 1 week remaining to answer and report on the following question: how do medical professionals (more specifically, physicians) respond to the concept?

Part of understanding what and how to test is understanding the current pain points that exist in the CME landscape, and how the concept aims to address them.

Process

Laying the groundwork

On day 1, I spent time familiarizing myself with the work the team had done so far. I asked the team the following questions: 

- What is the CME experience? How do medical professionals seek out, complete, and report CME credit? What are their unmet needs and pain points?

- How does this concept aim to address medical professional needs and pain points? What is the primary value proposition?

- What big questions are still outstanding about the tool's value or how medical professionals might use a tool like this?

Answers to these questions were what I needed to write an effective field guide for interviews.

Learning objectives

Also on day 1, I worked with my new team to outline learning objectives for this single round of testing, which fell into 3 main categories:

1. Do physicians see the concept's offerings as valuable? How valuable? And, which offerings are most attractive?

2. What barriers might exist to physicians using something like this?

3. How do physicians react to a freemium model, which would require them to pay to use premium features?

At the end of my first day on the project, I started to reach out to potential test participants to schedule interviews for later in the week.

Designing the test

On the second day, I started to work through how the test would function and the types of questions I would ask. With tests like this, it's important to have some confidence that all participants are reacting to the same experience, and to minimize the degree to which they're reacting to me (and wanting to please me). For these reasons, I designed a very simple, low-fidelity landing page prototype to communicate the concept. Below are some elements from that prototype.

Prototype

When I need to build a stimulus to evaluate the desirability of a concept, I often gravitate toward a landing page prototype. It's quick and simple to build, can be effective even at low fidelity, and reliably gets the idea across to participants. I tend to follow a general structure: (1) start with a short, general tagline that immediately addresses the felt need of users; (2) add some text hinting at what the concept aims to do to address that need; (3) then flesh out how the concept will work, getting more specific as one moves down the screen.

Here are some elements for the prototype I created for this test:

"Say goodbye to CME audit anxiety. It's time that CME compliance entered the 21st century. Say hello to a new, simpler way of reporting CME."
This text was the first thing test participants saw when viewing the stimulus I created. The intention here was to introduce the concept bit by bit through the stimulus, starting with acknowledging the pain of CME audits and hinting at a better way.
Test participants saw this section right under the "fold." At this point of exploring the stimulus, participants were starting to understand a bit more about the concept, specifically how it hoped to fulfill its promise of a new, simpler way.
This part of the stimulus gets into the specifics of what the concept is - an app for physicians - and lists its key features.

Writing the field guide

Also on day 2, I wrote the field guide with all the prompts for the interviews, which I estimated would last about 45 minutes each. With desirability tests like this, I tend to start the interview by giving the participant some time to explore before I become more directive and start asking questions. I planned to follow this same pattern this time. Questions that I planned to ask include: 

- How would this concept change how you currently report CME compliance?

- What thoughts do you have about providers sharing your CME completion data directly with your board(s)?

- Go through this list of offerings and tell me your thoughts about each one.

- If you came across this product being offered later today, how likely would you be to sign up on a scale of 1 to 5?

- And then: What would need to change to make your rating a 5?

Synthesizing and presenting findings

On days 3 and 4, I conducted interviews with physicians, and then synthesized findings. (On day 3, I had quickly set up a note-taking template in Google Sheets, which let me jot down notes and verbatims while conducting interviews; it helps that I'm a fast typist.) On day 5, I shared my learnings with our internal team and created slides to be added to the team's final readout report. A few of the more interesting findings:

- Overall, test participants were most excited about the feature that brought all their CME data into one place, in what we were calling a transcript. They also valued the affordance to send that transcript to other parties, such as employers, a need that the consultant team had not uncovered in their prior research.
- The CME opportunity finder feature was seen as valuable, but participants said its value would be greatly increased if opportunities could be filtered by details such as CME topic, cost of the activity, amount of CME credit granted, activity schedule, location, and - as at least one participant confessed with a bit of embarrassment - climate (hint: it's nice to go to the tropics on a work trip).

- Another finding entirely new to this round of testing: participants made the connection between how the concept addressed pain points around medical board certification and the pain points they felt around maintaining certifications specific to their workplace. For this reason, some stated the concept would be even more valuable if it could also help them track workplace certification deadlines and requirements.
One participant expressed a little skepticism, but that's okay: it means that once the concept becomes a product, it will be a positive surprise.


Some final thoughts

I love the part of a project when a concept is put in front of users for the first time. Until that moment, the concept is just an idea that we have no evidence to believe in. But once users start reacting, it becomes crystal clear what is working and what is not. In my experience, next steps to improve the product's experience, desirability, and value form almost immediately after the test findings are understood.