On Call

In the world of insurance, the phones are constantly ringing. Phone reps handle claims, quoting, complaints, and anything else they can do to help the customer. But how do these reps seek help when they need it?

Completed: 2021 - 2023


“So where do you turn for help?”

“Usually my co-workers.”

“And if they’re not available?”

“Then my Outlook. Here, I’ll show you.”

With that, the phone rep opened her Outlook email account. At first glance, it looked perfectly ordinary - the newest email was about a flu shot clinic we were hosting. Then I saw it.

On the left side menu, there were over thirty categories for emails - all custom made - and within those, countless drop-down subcategories. This phone rep had spent weeks compiling all emails relevant to any issue that might be called in, making her own mega-database of information. “I made a lot of mistakes with organizing at first,” she told me, “and it really wasn’t easy.”

This is what happens when workers are made to solve complex problems without a digital help database. Without a solution handed to them, they glue together what they have and hope it’ll stick.

Project Initiation: 2021

Our internal Help Desk reached out to me about the volume of calls being escalated by phone reps. Escalation is an allowed, and often necessary, process, but many of the escalated calls were tech-related issues that reps could have fixed themselves if they had the knowledge to do so - think password resets and account lockouts. Without any central documentation, reps try their best to help the customer as quickly as possible. In our case, this ended up overloading the Help Desk with relatively simple tech fixes.

It was shortly after we began this project that we were introduced to a little-trafficked internal documentation page which, in theory, should have held all of the information needed to reduce calls to the Help Desk. What we found, however, was a poorly organized site with no search function and inconsistent formatting. The vast majority of reps we talked to had no idea the page existed, and the ones who had heard of it only used it to check coverage updates.

Both issues we were presented with - too many tech calls, and too little traffic to the documentation page - had the same root cause: accurate information was difficult for reps to access quickly.

At this point, I began talking directly to the phone reps.

Qualitative Research, Round 1

Products:
Hybrid Interview Scripts

8 Qualitative Interviews

After the initial briefings with stakeholders, I helped establish recurring observation sessions with phone reps in each of our departments. Our reps have vastly different tasks depending on where they work, so getting a broad sample was key.

I created a basic interview script to follow, combining closed and open-ended questions. The script focused on how reps currently sought help, what types of calls they escalated, and any knowledge they had of the internal help site.

In total, I led 8 one-on-one qualitative research sessions with phone reps.

Because I was new to the company at this point, much of the learning was on the job.

I realized that I needed to make it clear I was not conducting an audit of their work, and that their responses were anonymous. Adding this copy to the interview script proved remarkably helpful for getting more honest and detailed answers.

My findings, across departments, were remarkably similar. Phone reps sought help from their coworkers, either in person or digitally, or relied on documentation they had made themselves. In the most extreme cases, such as the Outlook organization described in the intro, reps were spending a significant amount of time curating their own libraries, and often even more time searching through them for the relevant information. Reps who didn’t have these large libraries usually escalated calls to the Help Desk. Both groups had very little, if any, knowledge of the internal help page.

An interesting finding was that while reps seemed excited to learn about the internal help page, it almost never matched the comprehensiveness and ease of use of their own libraries. Information was missing, there was no search function, and articles were rarely dated. These factors left reps generally unimpressed.

The next step was sitting down with the designer and business consultant on the project to determine how the current internal help site should be organized.

Quantitative Research, Card Sort

Product:

43 Phone Rep Card Sort Responses

Again, this proved to be a learning experience. At the time of launching the card sort, I had little real-world experience doing so.

While I put all of my undergraduate learning to the test in creating a simple card sort study, the omission of a few key questions early on hindered my results. Because I was used to working with users sourced from a panel, I had missed two crucial questions: which department each phone rep who took the card sort worked in, and how many years they had been at Amica.

As a result, we didn’t know whether we were looking at the data of a Claims worker with 20 years of experience or a First Notice of Loss worker who had just started. The results were all over the place, with little to no agreement. According to our card-sort analysis software, the ideal sorting for these cards would have produced 24 different main categories.

What we could see from the data was that similarity came in waves. This was a mystery to me until I realized how the card sort email actually reached the phone reps. We sent an email to department call center leads, who then forwarded it to their reps. The result was all Claims reps getting the card sort at 2:30, while all Life reps might have gotten it at 4:15. I couldn’t draw any conclusive findings from speculation like this, but I could see a general pattern: each department of phone reps organized similarly to their coworkers, and dissimilarly to other departments. If the card agreement is to be believed, there could never be one definitive organization of the database that worked for every phone rep. This was not the result many people were hoping for.

Break

With the results disappointing to stakeholders, research on this project took a pause. The designer was able to implement some of the more general patterns we noticed, such as adding groupings not represented in the main nav, but the result was very much a “band-aid” solution.

Qualitative Interviews, Part 2: 2022

Products:
Hybrid Interview Scripts

6 Qualitative Interviews: Sales Reps

After almost three months of complete inactivity on the project, I was CC’d back in. This time, we had new stakeholders with more access to reps. Additionally, the old site was now regarded as a relic, and a new site - already developed, but still in beta - was the future. This new site was the one I would be testing. Ideally, it would have been informed by research from the beginning; still, it seemed like a clean slate for the project.

Similar to before, I was able to establish times to sit with phone reps and conduct observation sessions, as well as qualitative interviews.

I used a similar hybrid script, but added situational questions designed to prompt reps to seek external help. I did this because many of our reps have been here for years, even decades, so there were few questions they would realistically have on their own.

The findings were extremely similar to the first round. Even with a prettier site - now with a search function and more information - reps had either developed their own libraries of documentation or simply didn’t need to reach out for help often.

With this in mind, our main audience would be new hires and niche cases (glass coverage in Florida, for example). New hires hadn’t yet developed their own libraries, and niche cases could utilize the AI-powered search function.

I was surprised, however, that even experienced reps found use in the site because of one feature: it reported on natural disasters that might be affecting our customers, and explained how reps were to respond.

The biggest takeaway from this second round of qualitative interviews was how similar the findings were to those for the previous site. While there had been significant improvements to the design and search features, phone reps who are no longer in training rarely get calls they don’t know the answer to. Focusing on the use cases where the site is truly needed - training and niche information - could make it more beneficial for its true users.

Further Research: 2023

The project team did not immediately request additional research. When they approached our research team later with a more developed site, I was already dedicated to another project, so my Senior Researcher took this one over.

She spoke with 10 phone reps working in a specific area of coverage in Florida to determine whether their unique needs were being met by the prototype. Unsurprisingly, her findings echoed what I had found at the very beginning: very few of these reps used the site post-training. They either knew the answer to a question or relied on their own documentation. While the reps noted the tool had potential, especially for new employees, it was not something they used often.

Reflection

This project is a unique one for me because of how early I was in my career when I began it, and how long it lasted. It was very much a learning opportunity. If I were presented with this project at this stage of my career, I might have been able to tell stakeholders early on that further testing should only be with the true audience - new hires. Instead, we saw staggered research over two years in which we repeatedly spoke to users who did not use the product we were testing.

It’s also a good reminder that correlation is not causation. While reps had very low usage of the initial help site, and only slightly higher usage of the new site, this was not because the site was poorly designed or had a lackluster search function. Truly, it was because reps had evolved to not need or rely on internal help sites or documents; after only a short while on the job, they needed little if any assistance. In my future projects, I hope to identify the needs of our users at an earlier stage.