
Echo Show 10

Project Brief

Flagship product

Physical interaction

Video calls

My Role

Lead UX Designer

Design POC for 

Industrial Design

Product team

UX & Product Research Implementation

Alexa Communications

Computer Vision

Privacy & Legal

Echo Show 10 is part of the Echo multimodal family of products and was set to become the premium multimodal device for 2020. It is considered a “countertop robot” thanks to its Smart Motion capabilities.

Problem:

Current Echo Show users have a limited experience due to the positioning of the device in a space and the viewing angles during video calls. Users are forced to remain in a specific location to stay in view during a call or other long-running activities, and the same limitation applies to keeping an optimal view of the screen when interacting with the device. Although customers can mitigate this by moving the device manually, doing so is far from ideal and quickly becomes a stopgap.


Solution:

Smart Motion (which combines physical device rotation with digital video framing through pan, tilt, and zoom) untethers users, allowing them to move around while using Alexa and always see the screen. If the device is placed centrally in an open-concept home, Theia could be used from multiple spaces, making Alexa much more accessible.

CHALLENGES

1

Although there was initial buy-in to the product, there was no buy-in to the vision of the experience, its technology needs, or its feasibility. It was up to design to create a compelling experience that would align product, implementation, and design senior leadership.

2

As a V1 product, there were no known tools to design the experience with. Technological requirements were not yet understood, and the impact of the device's capabilities was unknown.

3

Testing had no established process for qualifying this type of interaction, and participants lacked the language to provide specific, actionable feedback. With a compressed runway, the amount of iteration possible was limited from the start of the program.

EXPLORATION

As there were no physical prototypes of Echo Show 10, I leveraged a virtual reality (VR) environment for testing with the help of a design technologist in the Devices and Services group.

 

I led Alexa Devices UX Research to conduct a study in collaboration with the Echo Show 10 Smart Motion team to explore how users responded to core aspects of Echo Show 10 hardware motion interaction patterns. Participants wore a VR headset that placed them in a virtual kitchen space where Echo Show 10 was located on a central kitchen island. In this virtual environment, they were able to walk around the kitchen and give a set of commands to Echo Show 10 to observe the device’s visual/voice responses, as well as the motion of the screen in response to their commands.

 

All interactions were conducted through a Wizard of Oz (WoZ) protocol, which allowed us to simulate device reactions to the participant behaviors. I was able to both rapidly iterate through different behaviors and customer experiences to prove/disprove initial thoughts on how the device could behave and react.

ITERATION

I built a prototype rig using a combination of a 360° camera, After Effects, 3D rendering, and compositing to mock up the initially identified use cases to a high level of confidence in lieu of the absent hardware. This gave design an additional four months of runway to explore and iterate on interactions and run internal testing before any hardware was available.

Although the hardware motion was tested in VR, the digital pan, tilt, and zoom (DPTZ) being designed for video calls was not possible to test through the WoZ protocol. After I presented the current design thinking and mocked-up examples to Amazon's senior leadership, there was full buy-in to the vision I showcased.


Part of the showcase demonstrated how a design system could be built to fulfill the UX requirements of the device. I showed examples of the flexibility we could gain if we followed a list of design-driven technical requirements I proposed for a tool to build and tune the experience. I prioritized the tooling requirements based on the anticipated impact each would have on the user experience of the final product.

I created a myriad of informational assets to explain to the different implementation teams how I envisioned the system working. These were presented to the Computer Vision, Alexa Communications, Smart Motion, and Alexa implementation teams.

LEADING & GUIDING

I was the main design point of contact and liaison between the design organization and partner organizations, which included the Industrial Design, Product, UX Research, Product Research, Computer Vision, Alexa Communications, Legal, and Privacy teams.

I led, guided, and reviewed the work of discipline-based teams in the design organization to communicate the design intent and how each team's effort influenced other teams' aspects of the experience.

I led other designers in solving edge cases and designing the onboarding experience and settings. I also created a system and guide that empowered other designers to mock up their designs using the techniques I had employed when presenting to leadership. This allowed me to scale as a design lead and maintain a bird's-eye view of the project while remaining attached to the design team's day-to-day deliverables.


01

I worked with the computer vision teams to tailor customer tracking for more consistent user data while staying within CPU load constraints.

02

I showed the tooling team how Bézier acceleration curves could improve the CX compared to the trapezoidal acceleration curves most motors use. I also suggested they build a predictive model to counteract the low-fidelity tracking the device was achieving due to limited CPU headroom.
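The two ideas can be sketched in a few lines of Python. This is an illustrative sketch only, not the production firmware: every function name, parameter, and number here is hypothetical, chosen to show why an eased velocity profile feels smoother than a trapezoidal one, and how a simple predictive filter can bridge the gaps between sparse tracking updates.

```python
def trapezoidal_velocity(t: float, t_total: float = 1.0, ramp: float = 0.2) -> float:
    """Classic motor profile: linear ramp up, constant cruise, linear ramp down.
    Acceleration jumps discontinuously at the ramp boundaries, which reads as
    mechanical 'jerk' when a screen rotates toward you."""
    ramp_t = t_total * ramp
    if t < ramp_t:
        return t / ramp_t
    if t > t_total - ramp_t:
        return (t_total - t) / ramp_t
    return 1.0

def eased_velocity(t: float, t_total: float = 1.0) -> float:
    """Velocity taken as the derivative of a cubic ease (smoothstep, 3u^2 - 2u^3),
    a simple stand-in for a Bezier easing curve: acceleration starts and ends
    at zero, so the motion feels organic rather than machine-like."""
    u = t / t_total
    return 6.0 * u * (1.0 - u)  # peaks at 1.5x cruise to cover the same angle

def alpha_beta_predict(observations, dt=1.0, alpha=0.85, beta=0.005):
    """Alpha-beta filter: estimate position AND velocity from sparse, noisy
    tracking samples, so motion can be extrapolated between camera updates."""
    x, v = observations[0], 0.0
    estimates = []
    for z in observations[1:]:
        x_pred = x + v * dt      # predict forward one step
        residual = z - x_pred    # how wrong the prediction was
        x = x_pred + alpha * residual
        v = v + (beta / dt) * residual
        estimates.append(x)
    return estimates
```

Sampling both velocity functions over a one-second move shows the trapezoid's abrupt slope changes versus the ease curve's smooth rise and fall; feeding a jittery track of a person walking across the room through the filter yields a smoothed, extrapolatable estimate.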

03

I worked with the UX writing and device settings design teams to create new settings that were easy for customers to understand and predictable in their effects on the customer experience.

04

I safeguarded customer trust by working with the Privacy team to create an out-of-the-box experience that clearly and easily explained how and why the feature works.

05

I worked with the Privacy and implementation teams to maintain design intent while keeping the implementation within legal bounds.

TESTING & UX RESEARCH

Echo Show 10 had expansive design research. I worked with the User Research team to run internal, external (Golden Eyes), and beta programs simultaneously. These programs leapfrogged each other, and the results from one would influence the design and what was tested in the other research programs. This allowed for weekly design iteration even though we could only run the beta on an every-other-week cadence.

Internal testing

Internal testing was the main way I worked through design hypotheses. It included participants from all disciplines related to the project, and we also recruited Amazonians from other programs to continuously get fresh-eyes impressions. This testing happened on a weekly basis in order to qualify design improvements internally before presenting them to the Golden Eyes cohort.


Golden Eyes Cohort

 

I proposed reviewing the design with professionals versed in cinematographic language in order to get clear, specific feedback. Leadership approved this, and a limited group of cinematographers and TV and film professionals was recruited from Amazon Studios to give design direct feedback. The UXR team and I used this to qualify the performance and feel of the framed experience during a video call. Multiple framing styles were tested with this methodology, including single- and multi-user testing, users coming in and out of frame, and distance-based framing, among other variables. We ran this testing every other week using an internal build.


BETA testing

 

Beta testing ran every other week, alternating with Golden Eyes testing. This provided me with data from both fresh-eyes participants and previous participants with prolonged exposure to the program. The prolonged-exposure data was a good indicator of how the design was progressing as we implemented technical solutions and updated the design. I used the fresh-eyes data to understand how new customers would qualify the experience at product launch. The combined data made it easier to work with the implementation team to prioritize development of the specific design aspects that would most effectively improve customer satisfaction scores.

OUTCOMES & LEARNINGS

01

Echo Show 10 released to enthusiastic reviews from critics and customers alike, with a customer rating of 4.5 stars on Amazon.com. 80% of customers mentioned that their purchase decision was based on the device's Smart Motion capabilities. Customers use the Smart Motion feature on a daily basis, impacting over 300,000 customers across multiple devices. 90% of Echo Show 10 customers (100,000+) keep the physical rotation of the device enabled.

02

Smart Motion has now been released on 3 other devices that feature in-call video framing. The system's flexibility has been crucial to the successful launch of new devices with Smart Motion. The system has reliably scaled to new devices and has maintained or improved its customer satisfaction score while accommodating different hardware, camera locations, and device orientations. Modifications to the attributes of the system are easy to implement and can be handled by more junior designers, reviewed by me, and tested by UXR.

03

The biggest learning from launching this product was that even earlier engagement with the product and ID teams could have greatly improved the device. Although many explorations were done before hardware was available, a stronger case could have been made for different choices in hardware capabilities and industrial design. Hardware limitations and design choices were speculated about but not explored fully enough to be communicated without counterarguments or speculation. This is reinforced by newer devices launching with different hardware and industrial design choices that have resulted in improved customer satisfaction scores.

04

The second biggest learning was that early prototyping and designing without hardware were crucial to the success of the device launch. The hardware and product teams now approach me directly very early in their process, and the trust earned created a valuable collaboration between the three teams. I am the main contact for UX design feedback throughout the industrial design ideation process for headless devices and multimodal devices with Smart Motion capabilities. I anticipated some negative reviews due to hardware and industrial design choices that I was unable to mitigate through UX design; most of the negative reviews regarding motion tracking were based on the risks I flagged during production.

05

Towards the end of the Echo Show 10 production cycle, I became the go-to designer to lead high-ambiguity V1 products and programs with emerging UX, high coordination needs between technical and design teams, and/or complex platform-level design systems. Some of the projects I have worked on since are AI conversational modalities, new unreleased flagship products, and organization-wide design systems.

Customer Quotes

"The Amazon team really did wonders creating this device. Both the camera and the AI in this thing are crazy good. Assuming you have motion enabled, as soon as you say the wake word, i.e., “Alexa” the device honed in on my direction, rotated towards me, and began listening."

"I called up my Grandma who has my old first gen echo show, and she didn’t even realize we were conversing as I was walking around my room. She thought I was just stationary the whole time. When I pointed it out and showed to her, her jaw dropped."

"What I found is that it rarely loses focus of the individual completely even though it will trail the motion."

© 2023 by David Jara. 
