
Core77 Design Awards 2020

Student Honoree & Community Choice Award

Conversational AI
for Culinary Experience
on Multiple Platforms

[Image: Chef on the Google Nest Hub Max]

Meet Chef, a conversational AI that brings you a personalized culinary experience across multiple platforms.

Time

Fall 2019
7 weeks

Carnegie Mellon University

Team

Anna Boyle
Amrita Khoshoo
Deepika Dixit
Yiwei Huang

Role

Design Research
Motion Design

Interaction Design
UI/UX Prototyping
(Desktop Device)

Tools

Sketching
Adobe After Effects
Adobe XD

Project Objectives

Our challenge was to design a virtual assistant for a company that has not launched one.

The use of AI continues to grow as it offers efficient solutions to problems facing people and businesses, as well as quick ways of accessing relevant information. In fact, research consultancy Gartner predicts that by 2020, customers will manage 85% of their relationship with an enterprise without interacting with a human.

The Outcome

Designed for the meal kit service Blue Apron, Chef is friendly, engaged, and whimsical.

[Chef expression animations: base, alert, modifying, celebration, thinking, no result, time's up, timer on, speaking, listening]

Chef is expressive, with states that resonate across a variety of situations.

Chef is an Intelligent Assistant in...


1. Healthier Meal Plan

" Hey Chef, Can you tell me what meals work for my diet?"


2. Remote Control

" Safty check done. Oven is preheating."

3. Real-Time Assistance in Collaborative Cooking Scenarios

" Hey Chef, Can you tell me what meals work for my diet?"

4. Breaking Down Culinary Literacy into Digestible Visualizations

" Hey Chef, can you show me how to properly dice scallion?"

5. Food Recognition, Logging, and Recipe Substitution

" Corn Tortilla from Trader Joes, is that correct?"

The Process

[Image: design process overview]

PART 1

01. Exploratory Research

A Virtual Assistant for Cooking

During our decision-making process, we asked ourselves what role the virtual assistant would play, how significant it would be within the service ecosystem, and how broad the brand's scope is in terms of the products and services it supplies. Through this evaluation, we identified an opportunity for a virtual assistant in the kitchen.

Cooking is task-based, usually requires both hands, and is often a moment where a piece of technology can help significantly. There's an opportunity for a virtual assistant to anticipate and aid with tasks and information, letting users focus on recipe-making. People could enjoy cooking guidance, advice, and tips without having to touch a device.

Blue Apron: Cooking for all

We chose the meal kit service Blue Apron because of its service, mission, and strong brand identity. Blue Apron is a recipe and fresh ingredient delivery service that sends subscribers pre-portioned meal kits. Launched in 2012, the company's mission is to make home cooking fun and accessible to all.


Understanding the Brand Identity

After choosing Blue Apron, we explored the brand in terms of visual brand language, the target audience, and the tone of the virtual assistant.

Visual Language

The Blue Apron brand exhibits a strong, bold and dynamic style. Its distinct illustration style is clean, friendly and personalized.

The Tone

We wanted the virtual assistant to provide helpful guidance throughout the cooking process and to exude a sense of calm that counters the user's cooking anxiety, acting as an upbeat, optimistic companion that reassures the user along the way.

Target Audience

Upon further analysis, we found that the largest proportion of Blue Apron's user base is American women between the ages of 25 and 34; 82% of account holders are female, often sharing meals with partners. To appeal to this specific demographic, we imagined our virtual assistant with a male voice.

02. Defining the Role

User Journey Mapping & Emotion Mapping

After our analysis of the brand, we outlined the journey of a Blue Apron user to understand the process of cooking with a Blue Apron meal kit. We identified tasks and grouped them by whether they were undertaken before, during, or after meal preparation.


Opportunity Defining & New Scenario Building

Through this process, we narrowed our focus to three new scenarios that would benefit most from a conversational assistant:

01. Onsite Recipe Substitution

Scenario: Emily is cooking beef and potato soup today and finds out that the potato in the meal kit has gone bad. How can Chef support her?

A virtual assistant can provide real-time help with ingredient substitutions during cooking. It can also keep the user informed of dietary restrictions according to their preferences.

02. Step-by-Step Cooking Guidance

Scenario: Emily is cooking for a family with two kids; the cooking duty can sometimes be cumbersome and hard to manage. What can Chef do?

A virtual assistant is perfect for guiding users step by step through meal prep and cooking without overloading them with information. By leveraging both voice and visual signals, cooking is made easy.

03. Multi-Tasking with Multiple Timers

Scenario: Emily needs to prep the ingredients for the next plate while also taking care of baking. How can Chef help?

While cooking, Chef can help the user navigate multiple tasks and workflows simultaneously; a sketch of how such concurrent timers might be tracked follows below.
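To make the multi-timer scenario concrete, here is a minimal TypeScript sketch of how Chef might keep track of concurrent timers. It is an illustrative assumption rather than the project's implementation; the task names, durations, and the TimerBoard class are hypothetical.

```typescript
// Hypothetical sketch of concurrent timer tracking for the multi-tasking scenario.
// Task names and durations are illustrative, not from actual Blue Apron recipes.

interface CookingTimer {
  task: string;   // e.g. "bake the pizza", "prep the next plate"
  endsAt: number; // epoch milliseconds when the timer completes
}

class TimerBoard {
  private timers: CookingTimer[] = [];

  // Start a named timer that Chef can announce when it completes.
  start(task: string, minutes: number): void {
    this.timers.push({ task, endsAt: Date.now() + minutes * 60_000 });
  }

  // The timer Chef should surface first: the one closest to completion.
  nextUp(): CookingTimer | undefined {
    return [...this.timers].sort((a, b) => a.endsAt - b.endsAt)[0];
  }

  // Timers that have finished and should trigger the "timer complete" state.
  completed(now: number = Date.now()): CookingTimer[] {
    return this.timers.filter((t) => t.endsAt <= now);
  }
}

// Usage: Emily runs the oven and the prep timer in parallel.
const board = new TimerBoard();
board.start("bake the pizza", 12);
board.start("prep the next plate", 5);
console.log(board.nextUp()?.task); // -> "prep the next plate"
```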


Scenario Storyboard Part 1

03. Assistant Persona Ideation

Design Principles

Drawing on our research and scenario building, we defined design principles for the ideal virtual assistant's personality, tone, and form. We wanted Chef to be:

Friendly

Interactive, upbeat, and optimistic

Engaged

Proactive and anticipatory

Encouraging

Stable, reassuring, and guiding

Expressional States

After iterating on the user journey, we explored the different responses, reactions, and expressional states Chef could take on at each interaction of the journey.

emotional-journey

Chef's expressional states mapped to three user scenarios

The mapping revealed nine key expressional states the virtual assistant needs in order to support a preferable user journey:

BASE/IDLE: How the assistant appears at rest or when active

THINKING: How the assistant appears while performing a task that requires processing

LISTENING: How the assistant listens to the user

SPEAKING: How the assistant speaks to the user

TIMER ON: How the assistant reacts when a timer is set

TIMER COMPLETE: How the assistant reacts when a timer for a specific cooking task finishes

CELEBRATION: How the assistant reacts after a meal has been completed

NO RESULTS: How the assistant responds when a task or search yields no results

ALERT/PROBLEM: How the assistant reacts when it cannot complete a task or faces an issue
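To ground these states during prototyping, the sketch below models them as a simple state machine in TypeScript. It is a hypothetical illustration, not code from the project; the event names and transitions are assumptions that mirror the list above.

```typescript
// Hypothetical sketch: Chef's expressional states as a simple state machine.
// State names mirror the nine states listed above; events are assumed examples.

type ChefState =
  | "base"          // idle / at rest
  | "listening"     // user said the wake word
  | "thinking"      // processing a request
  | "speaking"      // reading a step or answer aloud
  | "timerOn"       // a cooking timer is running
  | "timerComplete" // a timer has finished
  | "celebration"   // meal completed
  | "noResults"     // task or search yielded nothing
  | "alert";        // problem or failed task

type ChefEvent =
  | "wakeWordDetected"
  | "requestReceived"
  | "responseReady"
  | "speechFinished"
  | "timerStarted"
  | "timerFinished"
  | "mealCompleted"
  | "searchEmpty"
  | "taskFailed";

// Transition table: for each state, which events it responds to and where they lead.
const transitions: Record<ChefState, Partial<Record<ChefEvent, ChefState>>> = {
  base:          { wakeWordDetected: "listening", timerStarted: "timerOn", mealCompleted: "celebration" },
  listening:     { requestReceived: "thinking" },
  thinking:      { responseReady: "speaking", searchEmpty: "noResults", taskFailed: "alert" },
  speaking:      { speechFinished: "base", timerStarted: "timerOn" },
  timerOn:       { timerFinished: "timerComplete", wakeWordDetected: "listening" },
  timerComplete: { speechFinished: "base", mealCompleted: "celebration" },
  celebration:   { speechFinished: "base" },
  noResults:     { wakeWordDetected: "listening", speechFinished: "base" },
  alert:         { wakeWordDetected: "listening", speechFinished: "base" },
};

// Returns the next state, or stays put if the event is not handled in this state.
function nextState(current: ChefState, event: ChefEvent): ChefState {
  return transitions[current][event] ?? current;
}
```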

Exploring Form & Motion

We looked into existing virtual assistants, e.g. Siri, Google Assistant, and Cortana, as references to define Chef's visual form.

Apple Siri

Google Assistant

Microsoft Cortana

A key theme across many virtual assistants is an abstract form. To avoid the uncanny valley that comes with humanizing an AI, we decided to explore abstract representations. We used Blue Apron's illustrative style as a starting point, exploring both the form and motion of common cooking tools.


We landed on the visual form of a bowl. Our virtual assistant Chef is composed of three elements: a fill shape of a bowl, an outline shape of a bowl, and a circular shape representing an abstract ingredient. We chose the colors based on Blue Apron's primary and secondary color schemes.
The form proved simple and flexible enough to morph across all states of motion.

Animation Explorations

We conducted After Effects workshops among ourselves and drew on the 12 principles of animation, among other materials, to give Chef life.


Identifying the Expression Gap

After designing eight expression states, we mapped them onto the valence-arousal model to identify gaps and evaluate the coverage of Chef's expressions. As a result, we identified a missing state in the negative-formal quadrant, which I later filled with the Alert expression.

[Image: Chef's expressions mapped on the valence-arousal matrix]
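As a rough illustration of how such a coverage check can be expressed, the TypeScript sketch below tags each expression with valence and arousal scores and buckets them by quadrant. The coordinates and quadrant naming are illustrative assumptions, not values from our actual matrix.

```typescript
// Hypothetical sketch of the valence-arousal gap check.
// Coordinates are illustrative placements on a -1..1 scale, not measured data.

interface Expression {
  name: string;
  valence: number; // negative (-1) to positive (+1)
  arousal: number; // calm (-1) to excited (+1)
}

const expressions: Expression[] = [
  { name: "celebration", valence: 0.9, arousal: 0.8 },
  { name: "speaking", valence: 0.4, arousal: 0.3 },
  { name: "listening", valence: 0.3, arousal: 0.1 },
  { name: "thinking", valence: 0.1, arousal: -0.2 },
  { name: "base", valence: 0.2, arousal: -0.6 },
  { name: "timer on", valence: 0.3, arousal: 0.4 },
  { name: "timer complete", valence: 0.5, arousal: 0.6 },
  { name: "no results", valence: -0.4, arousal: 0.2 },
];

// Group expressions by quadrant to spot empty regions of the model.
function quadrant(e: Expression): string {
  const v = e.valence >= 0 ? "positive" : "negative";
  const a = e.arousal >= 0 ? "high-arousal" : "low-arousal";
  return `${v} / ${a}`;
}

const coverage = new Map<string, string[]>();
for (const e of expressions) {
  const q = quadrant(e);
  coverage.set(q, [...(coverage.get(q) ?? []), e.name]);
}

// An empty negative quadrant is the kind of gap the Alert expression later filled.
console.log(coverage);
```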

Voice & Conversation

Voice is an integral part of our experience. We explored various voices on a scale from human to robotic, and ended up choosing human female and male voices, as they provide a clearer, more reassuring, and less distracting way to communicate information. We also charted out a framework around conversations, including how users call up Chef ("Hey Chef") and how Chef might scaffold tasks during cooking guidance.
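The sketch below illustrates, in hypothetical TypeScript, the two pieces of that framework: wake-word detection ("Hey Chef") and step-by-step task scaffolding. The phrases, the ChefConversation class, and the intent matching are assumptions for illustration, not the production conversation engine.

```typescript
// Hypothetical sketch of the conversation framework: wake-word detection plus
// task scaffolding during cooking guidance. Phrases and step text are assumptions.

const WAKE_WORD = "hey chef";

interface RecipeStep {
  instruction: string;   // spoken aloud by Chef
  onScreenHint?: string; // abbreviated text shown alongside the voice prompt
}

class ChefConversation {
  private stepIndex = 0;

  constructor(private steps: RecipeStep[]) {}

  // Only respond when the utterance starts with the wake word.
  handleUtterance(utterance: string): string | null {
    const text = utterance.trim().toLowerCase();
    if (!text.startsWith(WAKE_WORD)) return null;

    const request = text.slice(WAKE_WORD.length).trim();
    if (request.includes("next step")) return this.nextStep();
    if (request.includes("repeat")) return this.currentStep();
    return "Sure, let me look into that."; // fallback before a real NLU pass
  }

  private currentStep(): string {
    return this.steps[this.stepIndex]?.instruction ?? "That was the last step.";
  }

  private nextStep(): string {
    this.stepIndex = Math.min(this.stepIndex + 1, this.steps.length);
    return this.currentStep();
  }
}

// Usage: scaffold two steps of a recipe.
const chef = new ChefConversation([
  { instruction: "Preheat the oven to 450 degrees.", onScreenHint: "Oven 450°F" },
  { instruction: "Slice the scallions thinly.", onScreenHint: "Slice scallions" },
]);
console.log(chef.handleUtterance("Hey Chef, what's the next step?"));
// -> "Slice the scallions thinly."
```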

04. Prototype in Mobile

Versions 1 & 2: Illustrating the Conversation Flow

The first version of the mobile interface focused mainly on making the conversation flow smoothly. In this phase we decided to place Chef in the bottom section, leaving more space for abbreviated text to accompany the verbal cooking instructions. We went through three key iterations: the first analog, the rest digital.


V2: high-fidelity mobile flow

PART 2

05. Test Kitchen


Testing the Usability of Conversation in the Kitchen

After developing our assistant for mobile, we ran an analog test to evaluate Chef's usability: Anna was the solo cook and Amrita acted as Chef, providing voice guidance. We gathered three major insights that informed our next iteration.


01.

Contextualized Conversation

During the simulated cooking experience, measuring ingredients became confusing with only numerical descriptions, and Anna ended up pouring too much olive oil and pasta sauce into the pan.

This fed into how we approached our conversation design: making the conversation informative and relatable.

02.

Enable Trust by Empowering User Autonomy

In the test kitchen, our oven preheated faster than expected, which ended up cutting cook time in half.

Considering safety issues as well as the user's autonomy in configuring a remote device, we designed the interface to run a safety check first and then hand control of the appliance back to the user.
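A minimal sketch of that gate, in hypothetical TypeScript, is shown below. The check names and the confirmation callback are assumptions; the actual appliance integration was only prototyped at the UI level.

```typescript
// Hypothetical sketch of the "safety check first, then user-controlled" gate for
// remote oven control. Check names and the confirm callback are assumptions.

interface SafetyCheck {
  name: string;
  passed: boolean;
}

async function preheatOven(
  targetTempF: number,
  runChecks: () => Promise<SafetyCheck[]>,
  confirmWithUser: (message: string) => Promise<boolean>,
): Promise<string> {
  // 1. Chef runs the safety checks before touching the appliance.
  const checks = await runChecks();
  const failed = checks.filter((c) => !c.passed);
  if (failed.length > 0) {
    return `I can't start the oven yet: ${failed.map((c) => c.name).join(", ")}.`;
  }

  // 2. The user keeps the final say: nothing starts without explicit confirmation.
  const approved = await confirmWithUser(
    `Safety check done. Preheat the oven to ${targetTempF}°F?`,
  );
  return approved ? "Oven is preheating." : "Okay, I won't start the oven.";
}
```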


03.

Visualizing Procedural Outcome

In the test kitchen, there was also confusion about what the pizza should look and taste like when cooked well.

This led us to redefine our original design to set expectations and provide more context, and it underscored the importance of showing what the food should look and taste like once it is cooked.

06. Conversation Design Iteration


The anatomy of constructing a conversation

Mobile UI Version 3: Merging into the Ecosystem

The third version of the mobile interface incorporates what we learned from the usability test in the test kitchen and the design principles we surfaced through competitive analysis. We also had many discussions on how to better design the transition from the mobile system to the home device through notifications, remote device control, third-party app integration, and context-aware features.


V3: final version of the high-fidelity mobile flow

07. Expanding to a New Platform

Moving to the smart home display to enable a collaborative cooking experience.

We expanded our ecosystem to support new contexts and multi-user cooking scenarios. Chef supports users and their cooking partners through a complete journey, from ordering meals to rating recipes after the meal.


Chef will provide meal recommendations based on personal taste and health data, work with smart kitchen appliances for streamlined prep time, delegate cooking tasks based on skill level, substitute recipe ingredients, and use feedback to improve preferences.

Smart Kitchen Device

We chose to incorporate the Google Nest Hub Max into our expanded ecosystem. As a platform, the Nest Hub Max has a prominent screen, is useful in communal spaces, and connects with multiple devices and third-party applications. Other key features include gesture control, facial and vocal recognition, and the subtle presence of Google Assistant.

Developing the Smart Display UI

Based on the UI we developed for the mobile platform, we incorporated a dynamic task timeline with both audio and visual signifiers to guide the user through the cooking experience. Text is prominent and kept to short phrases referencing the task at hand. Users can also ask Chef to surface reference videos or third-party media apps, using gesture controls to pause and play.

In this phase, we purchased a Google Nest Hub Max to ensure we were adopting the platform's existing macro- and micro-interaction patterns. Technology-wise, our multi-user cooking scenarios use voice recognition in the onboarding phase, and gesture controls and the proximity sensor to surface and change information. Chef keeps a subtle presence in order to bring more focus to the cooking tasks at hand.
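The sketch below illustrates, again in hypothetical TypeScript, how the smart display UI might react to the sensor inputs described above. The event and field names are assumptions and do not represent the Google Nest Hub Max SDK.

```typescript
// Hypothetical sketch of how the smart display UI might react to the sensor events
// described above (recognized voice, gesture, proximity). Event names are assumptions,
// not the Google Nest Hub Max SDK.

type DisplayEvent =
  | { kind: "voiceRecognized"; userName: string }  // onboarding: who is cooking
  | { kind: "gesture"; action: "pause" | "play" }  // control reference videos
  | { kind: "proximity"; distanceMeters: number }; // surface more or less detail

interface DisplayState {
  activeCooks: string[];
  videoPlaying: boolean;
  detailLevel: "glanceable" | "full";
}

function reduceDisplay(state: DisplayState, event: DisplayEvent): DisplayState {
  switch (event.kind) {
    case "voiceRecognized":
      // Add the recognized cook so tasks can be delegated by skill level.
      return { ...state, activeCooks: [...state.activeCooks, event.userName] };
    case "gesture":
      return { ...state, videoPlaying: event.action === "play" };
    case "proximity":
      // Close by: show the full step; far away: show only glanceable text.
      return { ...state, detailLevel: event.distanceMeters < 1.5 ? "full" : "glanceable" };
    default:
      return state; // unreachable for the union above, kept for safety
  }
}
```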


High-fidelity UI on the smart home display


08. Reflections

Understanding Human Values

When defining Chef's autonomy, we started thinking about human values and how they affect what we were creating. Along the way, we had many discussions around user consent, privacy, autonomy, and social needs.

Prototyping & Testing

Throughout the process, prototyping and testing proved fundamental. Unlike purely screen-based interaction, the test kitchen helped us build an understanding of both users' needs and their mental workload while cooking.

© 2020 Yiwei Huang. All rights reserved.