
Actionable Insights

Experts: a valuable resource and a bridge to users' mental models.

Universality

Inclusive design that accommodates color-blind users.

MVP

Prioritizing features and strategizing the core focus to implement in the "real world".

Business Mindset

Not every user can afford the product. Keep cost reductions in mind:
start expensive, then scale down.

My Takeaways

FireShield was my first all-online project. It was created during the first few months of the pandemic, with a lot of unknowns, which made the experience very distinctive. Collaborating asynchronously with a team across two different time zones was a learning curve. As a result, this project helped me look at research, communication, and the design process through a different lens.

My Role

I was the UX Research Lead and the Interface Design Lead. I also spearheaded user testing and contributed to other aspects such as product design.

Deliverables 🔗

Processbook
Vision Video
Digital Prototype (App)
Digital Prototype (Product)

Team

Varun Khatri 🔗
Sheryl Chan 🔗
Madeline Walz 🔗

Tools

Figma
Rhinoceros, Keyshot
Adobe Illustrator
Adobe After Effects

Showcase

Overview

A three-part system consisting of a face-tracking voice assistant that displays emotions on its screen, sensors that gather and record plant health data, and a companion mobile app.

Time Frame

Fall 2019

10 weeks

Full Process Coming soon

Click to view design process

Vision Film

The film showcases the system's features and its usability.

brighter Vision Film Credits: Sheryl Chan (Editor and Motion Graphics), Aparna Somvanshi (Product Renders, Videographer and Script), Varun Khatri (Voiceover and Actor), Ren Fairly (Voice Actor for "bud") and Madeline Walz

How does bud work?

Bud uses a series of invisible lasers to measure the distance between the sensor and an object, structure, or landscape. These distance readings are then combined into a 3D map of x, y, and z coordinates.
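The step from a distance reading to a point in that map can be illustrated with a short sketch. This is a hypothetical example rather than bud's actual firmware: it assumes each reading comes with the sensor's pan and tilt angles and simply converts them to Cartesian coordinates.

```python
import math

def reading_to_point(distance_m, pan_deg, tilt_deg):
    """Convert one laser distance reading, taken at a given pan/tilt
    angle, into (x, y, z) coordinates with the sensor at the origin."""
    pan = math.radians(pan_deg)    # rotation around the vertical axis
    tilt = math.radians(tilt_deg)  # elevation above the horizontal plane
    x = distance_m * math.cos(tilt) * math.cos(pan)
    y = distance_m * math.cos(tilt) * math.sin(pan)
    z = distance_m * math.sin(tilt)
    return (x, y, z)

# Sweeping the laser across many pan/tilt angles and collecting the
# resulting points produces the 3D map described above.
points = [reading_to_point(d, p, t)
          for d, p, t in [(1.2, 0, 0), (1.5, 30, 10), (0.9, -45, 5)]]
```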

Value Proposition

Bud is designed to help beginners grow thriving houseplants. The central device is the Bud voice assistant module, which follows the user using facial recognition technology. Based on plant health data, Bud's display emulates emotion through facial animations. The voice assistant module gathers its data from Seeds: sensors placed with each plant that measure soil moisture, light, and temperature. The data is synced with the mobile app for long-term tracking, phone notifications, and community features. This integrated system scales, since users can buy multiple Seeds, and lets data flow through the different touchpoints.
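As a rough illustration of that data flow (every name, threshold, and structure below is hypothetical, not the shipped implementation), a single Seed reading could drive both the emotion Bud animates and the payload synced to the app:

```python
from dataclasses import dataclass, asdict

@dataclass
class SeedReading:
    plant_id: str
    soil_moisture: float   # 0.0 (dry) to 1.0 (saturated)
    light_lux: float
    temperature_c: float

def bud_emotion(reading: SeedReading) -> str:
    """Map a Seed reading to the emotion Bud shows on its display."""
    if reading.soil_moisture < 0.2:
        return "thirsty"
    if reading.light_lux < 500:
        return "sleepy"
    return "happy"

def app_payload(reading: SeedReading, emotion: str) -> dict:
    """Package the reading for long-term tracking and phone notifications."""
    return {
        "plant_id": reading.plant_id,
        "reading": asdict(reading),
        "emotion": emotion,
        "notify": emotion != "happy",
    }

reading = SeedReading("monstera-01", soil_moisture=0.15,
                      light_lux=800, temperature_c=22.5)
payload = app_payload(reading, bud_emotion(reading))
```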

Bud

Voice interaction
Motion-sensor-based movement tracking
LCD panel face that changes expression

Seed

Soil moisture detection
Temperature and humidity
Light sensitivity

Application

View plant care info
Notifications and tasks
Installation guide
