Amy Roberts

Amy Roberts is a product designer currently based in Austin, TX.





Deliverables
Functional prototype, UI, and video


Timeline
2016 (3 weeks)


Context
Prototyping course, University of Washington


Disciplines
UX Design, UI Design, Identity


Team
Xiao Yan
Chris Chung
Catherine Jou
Amy Roberts


Patients can feel isolated from their loved ones while in the hospital. Nausea, pain, and reduced mobility can all impair their ability to use mobile technology. Their cell phones can harbor harmful bacteria and can spread contagious diseases.


Convey is a phone dock that helps patients communicate with their loved ones by providing a form that is easy to touch and a voice-controlled interface for making and receiving calls.


I worked as part of a four-person team to create this prototype. My role included designing the logo and user interface, investigating real-time digital prototyping methods, manipulating the interface wirelessly for user testing, and creating the poster and other visual artifacts.


Convey allows users of varying abilities to easily communicate through a simple user interface, touch input, and voice commands. The plastic enclosure of Convey keeps the phone isolated, preventing the spread of bacteria and contagious illnesses. When an incoming call is received, Convey pulsates softly with soothing ambient light.


Design Goals

We had three main design goals:


Accessible

People in the hospital have varying abilities, so we wanted to make the phone dock as accessible as possible. We would achieve this by using a simple user interface, touch input, and voice commands.

Emotionally Comforting

We wanted the phone dock to be comforting for patients who might otherwise feel very sick or weak. We would achieve this by using calming lighting effects that change with the patient’s interactions.


Hygienic

The plastic enclosure of Convey keeps the phone isolated, preventing the spread of bacteria and contagious illnesses.



Prototyping Process

Our work included interface design, physical fabrication, and Arduino programming. The prototyping methods we used were high-fidelity wireframing, laser cutting, 3D printing, and Arduino. We divided this work among team members at the start, then brought the pieces together.

Assembly involved fitting the Arduino components into the physical prototype, testing the lighting and sound, loading and syncing the interface on the iPhone, and placing the iPhone inside the enclosure. Once the prototype was assembled and working well, we moved on to shooting the concept video together, followed by final video editing and production.


3D Printing

We chose to focus on the look and feel of our prototype by creating a 3D model and using 3D printing so that users could touch our design. The concept was built in SolidWorks and printed on a MakerBot.

We created our model to be translucent to allow light to pass through. This was imperative to the overall finish of our prototype, as we had LED lighting inside the prototype to provide ambient visual feedback through the glow in the translucent plastic.



User Interface

We wanted to create a user interface that was easily visible and simple enough to use for someone in bed at a hospital, who might have an illness or motor impairment. In addition to this, we wanted to make something that would complement both voice and touch interactions, making things easier for patients of varying ability who might not be able to easily communicate.

We created a user interface to simulate an outgoing call and an incoming call. In our design, we used large text and contrast to provide clarity for the user. Simple diagrams aid the user in understanding how the touch interactions work and are large enough for them to view through the acrylic screen. During calls, a comforting image of their loved one is displayed.

We controlled the interface with an app called Skala. This used a wireless connection to Photoshop, allowing us to remotely turn layers on and off to simulate an interactive UI.


Arduino

Since the target users of our prototype were patients in the hospital, we wanted the prototype to provide clearly distinguishable states through audio and visual feedback. However, we did not want the feedback to be so strong that it distracted the user. We decided to go with LEDs for visual feedback, while using a buzzer to provide audio feedback.

We used a light sensor as a trigger to detect when a user had touched and activated the device. This sensor could detect a hand touching the prototype through the translucent material, since the hand blocks much of the light reaching the sensor. Because lighting conditions vary, we took several light readings each time the device was turned on and averaged them to create a baseline and threshold. Conditionals determined when to turn on the LEDs and buzzer, based on the change in light read by the sensor. For our lighting, we used an LED strip wrapped around the base of the prototype and a powerful BlinkM LED at the top.
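The calibration and trigger logic could be sketched roughly as below. This is a minimal illustration, not the actual firmware: the helper names `calibrateBaseline` and `handOnShell` are hypothetical, the 50% drop threshold is an assumed value, and in a real sketch the readings would come from Arduino's `analogRead()`.

```cpp
#include <numeric>
#include <vector>

// Average several ambient-light readings taken at power-on to form a
// baseline for the current lighting conditions. (Illustrative sketch;
// on the device these readings would come from analogRead().)
int calibrateBaseline(const std::vector<int>& readings) {
    long sum = std::accumulate(readings.begin(), readings.end(), 0L);
    return static_cast<int>(sum / static_cast<long>(readings.size()));
}

// A hand on the translucent shell blocks most light, so a reading well
// below the baseline is treated as a touch. The 50% threshold here is
// an assumption for illustration.
bool handOnShell(int reading, int baseline) {
    return reading < baseline / 2;
}
```

The loop would then compare each new sensor reading against the baseline and drive the LED strip and buzzer when `handOnShell` fires.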

Six stages of interaction

  1. Hand touches shell to wake up device
  2. Hand is removed and device is awake
  3. Hand touches shell to initiate call
  4. Hand removed from shell and call initiated
  5. Hand touches the shell to hang up
  6. Hand removed from shell and call terminated. Device goes back to sleep.
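The six stages above behave like a small state machine in which each touch event and each release event advances the device one step. A minimal sketch, assuming hypothetical state names (the real firmware would also drive the LEDs and buzzer on each transition):

```cpp
// Six interaction stages as a state machine. step() advances the state
// on a touch event (handOnShell == true) or a release event (false).
enum class State { Asleep, WakeTouch, Awake, CallTouch, InCall, HangUpTouch };

State step(State s, bool handOnShell) {
    switch (s) {
        case State::Asleep:      return handOnShell ? State::WakeTouch : s;   // 1. touch wakes device
        case State::WakeTouch:   return handOnShell ? s : State::Awake;       // 2. release; device awake
        case State::Awake:       return handOnShell ? State::CallTouch : s;   // 3. touch initiates call
        case State::CallTouch:   return handOnShell ? s : State::InCall;      // 4. release; call active
        case State::InCall:      return handOnShell ? State::HangUpTouch : s; // 5. touch hangs up
        case State::HangUpTouch: return handOnShell ? s : State::Asleep;      // 6. release; back to sleep
    }
    return s;
}
```

Three full touch-and-release cycles walk the device through a complete call and back to sleep.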



Concept Video

The storyline of our concept video is based on the storyboard we created. The story is divided into two parts: in the first, the patient initiates a phone call with his family; in the second, the patient picks up a call from his family.

We shot our video in a medical training room on campus in order to mimic a real hospital context. We shot three sets of footage of the whole story from three different angles, highlighting the overall medical setting, the patient’s movements and facial expressions, and interactions with the product. We designed our video to tell a story and introduce the product so the audience can easily understand its usage and functionality.


Our filming location


Logo

We wanted our logo to embody simplicity and clarity to express the accessibility of our prototype. We designed a simple, rounded image of a hand reaching out and rounded the ends of a sans-serif typeface. The blue color is soothing while still reflecting the hygienic nature of the medical field.


Rendering and Poster

We created a realistic rendering of the form in V-Ray and placed it in a hospital setting to create context. This served as the header of our poster, which we designed for our demo to help communicate what our prototype is and how it works.



User Testing

Throughout our prototyping process, we conducted informal tests with others to gauge the effectiveness of our prototype. The insights we gained by watching people interact with our prototype gave us some direction for our next iteration.

We evaluated our final prototype by conducting three user tests with strangers. After giving users a brief overview of how Convey works, we gave them instructions for making a call. As the user interacted with the phone dock, we had someone sitting behind a laptop changing the screens accordingly.

Make a call

  1. Touch Convey to turn on
  2. Give a voice command to “Call Catherine”
  3. Touch Convey to confirm call
  4. Talk to Catherine
  5. Touch Convey to hang up

We then asked each person a short series of questions to learn more about what their thoughts were on the prototype and what they liked and didn’t like about the experience. Because we were not able to test with actual patients, we opened with questions about their previous hospital experiences to better frame the interview.



Findings

In our user testing, we found that the form was ergonomic to touch, and the voice and touch input appeared intuitive for making and receiving calls. However, users didn’t always understand how to initially activate the device; once we explained it afterwards, they understood how to interact with the form. Some users questioned the need for “touch to confirm” when initiating a call, which seemed unnecessary for a voice-controlled device. Moving forward, we would need to simplify the experience or provide instructions for use.

We found that all users had varying levels of experience staying in a hospital, yet all shared a similar mindset. The users viewed the hospital experience as negative when forced to stay for an extended period of time. Pain points included isolation and the sterile nature of a hospital environment. Some users remarked that the curved aesthetic of the prototype would be comforting in this context, and things like light and sound feedback can make a hospital room feel more personal.

Future Work

While our prototype was effective in demonstrating the basic interactions of the phone dock, it would be necessary to make an additional iteration integrating real-time voice interactions before moving forward with testing our prototype with patients in the hospital.

Testing with users of varying abilities (motor impairments, weak grip, soft voice, blindness, or other disabilities) would help us account for a broader range of use cases. A diary study would allow us to track and gauge the emotional comfort the device provides over a period of time. In addition, we would need to work with medical professionals to ensure our device complies with hospital regulations.

While further testing would reveal clearer design directions, some ideas for future work include investigating whether UV light could be used to sanitize the phone dock, integrating a video chat option for a more realistic experience, and adding features for patient entertainment.