Nivi Singh

CASE STUDIES

Note: The numbers and figures shown in these case studies are representative and not actual figures. 

CASE STUDY 1: CART ABANDONMENT

THE PROBLEM

Localytics data consistently showed a high rate of cart abandonment at the checkout step. Users were not completing purchase transactions even after spending time and effort creating a product. This was a serious issue because it was costing us customers, especially with the holiday season, the app's main business season, approaching.

CHALLENGE

Investigate, within a week, why users were abandoning their carts at the checkout steps, and propose a quick solution that did not involve major backend changes or coding effort, since development resources were gearing up for the holiday season.

MY APPROACH TO PROBLEM

1. CLICKSTREAM AUDIT: A quick audit of the clickstream for the existing checkout flow, identifying potential problems.
2. BENCHMARK COMPETITORS: Benchmarked how competitors and other shopping apps treated the checkout flow.
3. QUICK STUDY & SURVEY: Ran a fast-track study with friends and family (there was no time for formal recruitment) involving the creation and purchase of a product using Kodak Moments, followed by a survey.

WHAT I FOUND

  • The average time for a magnet purchase was ~4 minutes (3.8 min), about 1 minute longer than users expected for the task and longer than the benchmarked checkout flows.
  • Filling in shipping address details disrupted the purchase experience; it was not presented in the order users expected (the shipping address affected taxes and shipping cost). There were too many steps and busy screens to check out.
  • The cart screen had too much information and scrolling, and the total cost including taxes was unclear.
  • The checkout flow was slow, and some elements on the screen were not very responsive.

RESULTS

  • Simplified checkout process
  • Reduced number of screens and clicks
  • Changed order of delivery address details
  • Simplified and cleaner Cart screen
  • No major coding effort or backend changes
   

Cart abandonment rate came down by ~15

CASE STUDY 2: RESPONSE TIME

THE PROBLEM

A previous longitudinal study had revealed that users were unhappy about how long it took to perform simple image functions such as applying filters, swapping images, or other image edits.

CHALLENGE

Convince stakeholders that the app's responsiveness was a hindrance to the user experience and worth the coding effort to improve.
Image manipulations were done in the cloud; at average network speeds, uploading and downloading the image took most of the time, and the user would see the spinner for a long while.

MY APPROACH TO PROBLEM

1. CLICKSTREAM AUDIT: A quick audit of the existing workflows for different products on Kodak Moments, recording the response time.
2. BENCHMARK COMPETITORS: Benchmarked the response times of competitor apps for comparable workflows.
3. IN-PERSON STUDY: Ran an in-person study of a newly introduced feature, with time on task and response time as the metrics.

WHAT I FOUND


  • On average, task completion time was about 1.5 minutes higher than competitor apps for equivalent workflows, for multi-image flows and image editing features such as collages, cards, and photobooks.
  • Users were frustrated with the spinner and felt the app was very slow compared to other apps.
  • Data from the study indicated, at an average 50 Mbps connection (the arithmetic is restated in the sketch after this list):
      • Upload + download time for a 5 MB image = 2 sec
      • With an average of 2 manipulations per image = 2 × 2 sec per image
      • Average wait for a 4-image product = 4 × 4 sec vs. ~5 sec on a competitor app
  • The average user satisfaction rating for app responsiveness was 3 on a scale of 7.
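
The study data above reduces to simple arithmetic. The sketch below restates that calculation; it assumes, as the figures imply, that every manipulation triggers a full upload/download round trip of the 5 MB image over the ~50 Mbps connection.

```python
# Back-of-envelope restatement of the wait-time figures from the study.
# Assumption: every manipulation triggers a full upload + download round trip
# of the 5 MB image at ~50 Mbps (about 2 seconds per round trip).

ROUND_TRIP_SEC = 2            # upload + download of one 5 MB image
MANIPULATIONS_PER_IMAGE = 2   # average edits per image observed in the study
IMAGES_PER_PRODUCT = 4        # e.g. a four-image collage or card

wait_per_image = MANIPULATIONS_PER_IMAGE * ROUND_TRIP_SEC    # 2 x 2 = 4 sec
wait_per_product = IMAGES_PER_PRODUCT * wait_per_image       # 4 x 4 = 16 sec

print(f"Spinner time per image:   ~{wait_per_image} sec")
print(f"Spinner time per product: ~{wait_per_product} sec (vs ~5 sec on a competitor app)")
```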

RESULTS

  • Discussed the data findings from the study with the stakeholders.
  • Held discussions with the technical team on reducing spinner time for users and exploring cloud-side solutions.
  • Worked with the developers to modify the code so that the user works with a low-resolution image natively while the upload to the cloud happens in the background, invisible to the user, reducing spinner time (a sketch of the pattern follows this list).
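
The last bullet describes a preview-plus-background-upload pattern. The sketch below is a minimal illustration of that idea in Python rather than the app's native code; the function names, image sizes, and threading details are illustrative assumptions, not the actual implementation.

```python
import threading

def make_low_res_preview(image_path, max_px=800):
    # Placeholder: downscale the image locally so edits can apply instantly.
    return f"{image_path} (preview, {max_px}px)"

def upload_full_res(image_path):
    # Placeholder: push the original full-resolution image to the cloud.
    print(f"Background upload finished: {image_path}")

def open_for_editing(image_path):
    # Kick off the slow full-resolution upload on a background thread...
    threading.Thread(target=upload_full_res, args=(image_path,)).start()
    # ...and immediately hand the user a local low-res preview to edit,
    # so no spinner blocks the editing flow.
    return make_low_res_preview(image_path)

preview = open_for_editing("magnet_photo.jpg")
print(f"User edits: {preview}")
```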

Response time per image manipulation was reduced by ~1.5 sec

CASE STUDY 3: ICON AFFORDANCE

THE PROBLEM

With a new design head joining the team, a new visual language that was more modern and minimalistic was introduced in the app. However, monitoring Localytics data indicated marked drop-offs in the product creation flows, especially in editing routines.
   
The new visual design language had clearly introduced a problem into the user experience. My challenge was to pinpoint the exact problem with the new visual language and convince the design head to revisit it.

MY APPROACH TO PROBLEM


  • Referred to studies and blogs on icons and visual design usability.
  • Designed an in-person study with scenario-based tasks.
  

WHAT I FOUND

  • The new icons mapped neither to users' mental models nor to their familiarity with established icons.
  • Users were frustrated because the icons were not indicative of their purpose, and they felt they had to invest time to figure out what each icon did.
  • Users were giving up on creating products, especially during image editing routines.
  • Users wanted to see established, clearly indicative icons.
  • Users considered labels alongside icons to be table stakes, and their absence was a clear gap.
 

RESULTS

  • Based on the findings, came up with recommendations for improving icon usability in the app.
  • Presented the recommendations with data to back up my research findings.
  • The icon set was revisited with a better understanding of users' mental models; some of the icons were redesigned, and as a general rule text labels were added alongside the icons.
   

Text labels were added to the icons

CASE STUDY 4: TECHNOLOGY AND USER BEHAVIOR

THE PROBLEM

With the newly designed Mail For Me feature, the classic question arose: technology vs. behavior. The development and product teams wanted to use the device's built-in address book for the feature, on the premise that people would adapt and adopt. With no other way to mail cards from the app, this would have left a huge section of our customers struggling and possibly abandoning the feature completely.

The primary question for me was to understand how users actually behaved with the address book on their devices, and whether the proposed feature conformed to that behavior across a customer base that spanned baby boomers to millennials.

MY APPROACH TO PROBLEM

It was an important question from the product development perspective, involving decisions about functionality and, certainly, coding effort. The best way to answer it was to see what people were doing in real life with the address books on their devices, so I launched a survey with approximately 400 users from our database.

WHAT I FOUND

  • About 92% of respondents were not using the address book feature on their devices (smartphones or tablets).
  • Those who did use the address book saved only important addresses, such as immediate family or close friends. Their device address books were incomplete, and they were not confident they had up-to-date physical addresses for everyone they sent holiday cards to.
  • People still maintained paper address books and felt more confident updating them. Around the holidays, they created a list (paper or electronic) that they used primarily for sending holiday cards.

RESULTS

  • Based on the findings, I recommended that, in addition to accessing the device's built-in address book for our more advanced users, we must provide a manual way of adding addresses in the app.
  • Presented the recommendations to the stakeholders with data to back up my research findings.
  • The feature was designed keeping users and their needs in mind.

CASE STUDY 5: FLOOD CLAIM TOOL

THE PROBLEM

During Hurricanes Harvey and Irma, Amica wanted to reach out to customers and help them with an expedited claims process. A rapid solution was created for customers in hurricane-hit areas that enabled them to send in photos of their automobiles; a fast-track desk appraisal based on the photos was then done to release the claim for these cases. The solution was envisioned to save an estimated $250 per claim, reduce call volumes for the company, and deliver faster service to disaster-hit customers.

However, adoption was very low and abandonment very high. Of the customers who were sent the tool, only 43% opened it; 21% uploaded the first photo, and only 8% completed and submitted their claim through the tool (the drop-off is restated as a rough funnel in the sketch below).
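
Read as a funnel, these drop-off numbers can be restated step by step. The sketch below does that under one explicit assumption: that each percentage is measured against all customers who were sent the tool, which the figures above leave ambiguous.

```python
# Rough funnel restatement of the adoption numbers above.
# Assumption: every percentage is of ALL customers who were sent the tool
# (the source figures are ambiguous about the denominators).

funnel = [
    ("Sent the tool", 100),
    ("Opened the tool", 43),
    ("Uploaded 1st photo", 21),
    ("Submitted claim", 8),
]

for (step, pct), (_, prev_pct) in zip(funnel[1:], funnel):
    kept = pct / prev_pct * 100
    print(f"{step:<20} {pct:>3}% of recipients "
          f"({kept:.0f}% of the previous step, {100 - kept:.0f}% dropped off)")
```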

MY APPROACH TO PROBLEM

I had access to some analytics data showing how the numbers dropped off across the workflow. However, there was no indication of why people were not using the tool, or why those who started were abandoning it. I decided to take a deep dive into the problem with:
   
  • Analytics: I started with the analytics data I had access to. Understanding the drop-off rate at each step gave me an idea of where to start the research.
  • Workflow Audit: Identified probable issues in the workflow from the data and a preliminary usability study of the tool.
  • Customer Interviews: Validated findings with a follow-up study with customers who used or abandoned the tool.

WHAT I FOUND

1. From Analytics: the drop-off numbers above showed where in the workflow customers were leaving.
2. From Workflow Audit
The workflow audit pointed to a couple of possible hypotheses:

  • Too many steps
  • Complicated and long email
  • Complex instructions

3. From Customer Interviews

The Tool:
  • Users were required to submit all 6 photos to reach the Submit step. With flooded vehicles, it was often not possible for them to take all the required photos to submit the claim.
  • Certain photos (such as odometer readings) were difficult or impossible to take when vehicles were flooded and could not be started.
   
Environmental Factors:
  • Poor to no connectivity to upload photos.
  • Flooded garages and no power, discharged phones.
  • Emotional distress and feeling of loss.

Customer Expectation:
  • Insurance should help them get back to normal life as quickly and with as little resistance as possible.
  • An easier way to submit claims and faster settlement.
  • Provide simple and fast tools and processes to assist them at the time of need.

RESULTS


  • Given its perceived importance and the user needs it addresses, the tool has strategic implications beyond catastrophic situations.
  • The end-to-end customer experience was evaluated for a more strategic implementation in claims processing.
  • Worked with the designers on the identified problem areas to come up with a more streamlined, frictionless workflow for the tool.
  • The project took a more strategic place on the roadmap and was taken on as a discovery research project, with customer interviews and user research to expedite the claims process tool.
   
Design Decisions: 
  • Simplify and enhance photo capture and upload process.
  • Clean and simple email content.
  • Concise and clear confirmation message.
  • Provide alternate ways to complete the process.