In an age where notifications increasingly manipulate attention, contribute to anxiety, and disrupt focus, how might we design a voice-driven system that respects the user’s emotional state, supports goal achievement, and delivers content with empathy rather than addiction?
I led the research, design, and prototyping of Ellsi, an emotionally intelligent voice assistant designed to reframe how we engage with notifications through equity, empathy, and multi-modal interaction.
Ellsi is not just a voice assistant—it’s a counter to toxic digital norms.
User: “Ellsi, add turmeric, cardamom, and chili powder to my shopping list.”
Ellsi: “I’ve added those three for you.”
[Displays shopping list with three newest items on top]
Ellsi: “Would you like me to remind you when you’re at a particular store?”
User: “Yes, that would be so helpful!”
Ellsi: “Which store? Or search for it in the following card:”
[Displays map card]
User: “Fresh Thyme on Grandview Ave.”
Ellsi: “I’m setting the location reminder.”
Ellsi: “I found a closer location you’ve visited before. Would you like directions and a reminder there instead?”
User: “No, thank you. I’m heading to that part of town.”
Ellsi: “Sounds good!”
[Displays list of closest matching results]
[User selects location]
Ellsi: “I just set the reminder.”
This moment introduces a multi-modal system with hands-free, eyes-free support, extending beyond traditional voice UX. By blending visual and conversational interaction, Ellsi becomes more adaptive, meeting people in both comfort and uncertainty—empowering them to act without friction.
Because the user is signed in, Ellsi gathers input flexibly, completing tasks in context using Natural Language Understanding and past interactions. Rather than forcing structure, Ellsi fills in the blanks by bridging brand, product, and behavior across devices.
User: “Set a reminder for Karmen’s exam on Wednesday to send her the motivational message I wrote last week for my friends.”
Ellsi: “Please select which note you’d like me to remind you with.”
[Displays Notes card]
[User selects note]
Ellsi: “Alright, I’ll remind you.”
Here, Ellsi shows its adaptive capability by recognizing the user’s past behavior, accessing saved notes and sending them to frequent contacts. This seamless action reflects the strength of context-aware design in Voice UX.
By aligning with the user’s mental model in real time, Ellsi supports fluid interaction, solving spontaneous needs and reinforcing trust. Context isn’t just convenience—it’s the core of designing truly intelligent, responsive systems.
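To make that context-aware slot filling concrete, here is a minimal Python sketch. The names (`UserContext`, `fill_reminder_slots`) and the intent/slot dictionary are assumptions for illustration, not Ellsi’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class UserContext:
    """Signed-in context Ellsi can draw on: saved notes and frequent contacts."""
    saved_notes: dict = field(default_factory=dict)        # note title -> note id
    frequent_contacts: dict = field(default_factory=dict)  # contact name -> contact id

def fill_reminder_slots(parsed_intent: dict, ctx: UserContext) -> dict:
    """Fill missing slots from signed-in context instead of forcing the user to restate them.

    `parsed_intent` is whatever the NLU layer produced, e.g.
    {"intent": "set_reminder", "date": "Wednesday", "contact": "Karmen",
     "attachment_hint": "motivational message from last week"}.
    """
    slots = dict(parsed_intent)

    # Resolve the spoken contact name against frequent contacts (past behavior).
    contact = slots.get("contact")
    if contact in ctx.frequent_contacts:
        slots["contact_id"] = ctx.frequent_contacts[contact]

    # If the attachment is only hinted at, narrow it to candidate notes; when
    # more than one remains, hand off to a visual Notes card for disambiguation.
    hint = slots.get("attachment_hint", "")
    candidates = [title for title in ctx.saved_notes
                  if any(word in title.lower() for word in hint.lower().split())]
    if len(candidates) == 1:
        slots["note_id"] = ctx.saved_notes[candidates[0]]
    else:
        slots["disambiguation_candidates"] = candidates  # show a Notes card

    return slots
```

When more than one note matches, the sketch falls back to a visual card for disambiguation, mirroring the Notes-card hand-off shown in the dialog above.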
Ellsi isn’t just a voice assistant. These version 4 designs show how it adapts to movement; these final flows show how voice and ambient intelligence meet real life, when screens are out of reach but people still need to think, feel, and act.
Large text and focused actions help users stay on task while hands-free. The always-listening interface is easy to use and just as easy to dismiss—for safe, seamless interaction on the go.
When driving ends, Ellsi shifts modes to stop voice input and prompt a return to touch. A flexible layout adapts control placement based on vehicle side, offering quick access to resume or exit. Users can speak a hotword or tap to preview messages and re-engage the full hybrid interface.
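As a rough illustration of that mode hand-off, the sketch below assumes hypothetical signals (`vehicle_moving`, `hotword_heard`, `screen_tapped`); a real system would draw these from device sensors and the wake-word engine, so this is a pattern sketch rather than the production logic.

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVING = auto()  # voice-first: large text, focused actions, no touch prompts
    HYBRID = auto()   # full hybrid interface: voice plus touch and cards

def next_mode(current: Mode, vehicle_moving: bool,
              hotword_heard: bool, screen_tapped: bool) -> Mode:
    """Decide which interaction mode to present.

    While the vehicle moves, the interface stays voice-first; once driving
    ends, a hotword or a tap re-engages the full hybrid interface.
    """
    if vehicle_moving:
        return Mode.DRIVING
    if hotword_heard or screen_tapped:
        return Mode.HYBRID
    return current  # no re-engagement signal yet: keep the last mode
```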
User: “What did Mateo email me?”
[User is driving, Ellsi reads it aloud instead of showing a preview card]
Ellsi: “Mateo said, ‘Sprint 3 completed on time and we're on track to target! I'd like to know what you think before I share with any others.’ There is an attachment.”
User: “Thank you Ellsi!”
Ellsi: “You’re welcome! That sounds like good news. I’ll add a snapshot to your moments view the next time you're on your device.”
[Ellsi updates the moments view with a visual summary of the email and attachment]
These flows show Ellsi adjusting its tone: sometimes warm and conversational, other times clear and directive, based on the user’s approach. In Voice UX, personality isn’t static; it’s strategic.
Adaptation depends on:
Great voice design meets the user where they are, not just in context, but in character.
User: “Start a 10 minute timer called check live...”
[Ellsi waits briefly, detecting pause]
[Plays error chime]
Ellsi: “I don't think you completed your last question.”
User: “Oh sorry, I was distracted! Set the timer for ‘check the live blog.’”
Ellsi: “10 minutes and counting!”
Ellsi handles confusion with graceful recovery, using on-device memory to respect privacy while learning from past interactions. When a query is incomplete, Ellsi prompts for the missing detail rather than discarding the request. By staying aware of nearby context and typical patterns, Ellsi keeps the experience flowing through empathy, efficiency, and adaptation to distraction.
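A minimal sketch of that recovery logic, assuming the speech endpointer exposes a `trailing_pause` flag and the NLU layer returns partial slots; the names are illustrative only, not Ellsi’s real API.

```python
REQUIRED_SLOTS = ("duration_minutes", "label")

def recover_incomplete_request(slots: dict, trailing_pause: bool) -> str:
    """Graceful recovery: when the user trails off mid-request, prompt for the
    missing piece instead of discarding what was understood. `slots` is whatever
    the NLU layer extracted; `trailing_pause` comes from the speech endpointer.
    """
    missing = [s for s in REQUIRED_SLOTS if not slots.get(s)]
    if trailing_pause and missing:
        # Keep the partial request (e.g. the 10-minute duration) in short-term,
        # on-device memory so the follow-up utterance can complete it.
        return "I don't think you completed your last question."
    return f"{slots['duration_minutes']} minutes and counting!"

# Example mirroring the flow above: the user pauses after "called check live..."
print(recover_incomplete_request({"duration_minutes": 10}, trailing_pause=True))
```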
User: “What’s the weather like near Kiko?”
Ellsi: “I found a previously shared current location in Maine.”
Ellsi: “The weather there is chilly for this time of year, but it's expected to warm up later this week. I’m displaying the forecast:”
[Displays forecast card]
User: “Message her: ‘I heard it’s cold there! But not too cool to be cool. You got this [dash]stay warm!’”
Ellsi: “Message sent!”
Ellsi uses the customer’s login to pre-fill essential details, such as contact names, reducing friction when errors occur. While she functions without login, the experience is smarter and smoother when signed in.
Across these flows, Ellsi handles variable user responses with Natural Language Understanding, filling in blanks via statistical grammar that mirrors natural human conversation—flexible, familiar, and fluid.
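As one hedged example of that blank-filling, the sketch below resolves a pronoun like “her” against the signed-in contact list and the ongoing conversation topic; the function and parameter names are assumptions for illustration, not the actual system.

```python
def resolve_recipient(utterance: str, conversation_topic: str | None,
                      contacts: dict) -> str | None:
    """Resolve "Message her: ..." to a concrete contact.

    Prefers an explicitly spoken name; otherwise falls back to whoever the
    ongoing conversation is already about (e.g. Kiko in the weather flow).
    """
    pronouns = {"her", "him", "them"}
    words = {w.strip(".,:;'\"!?").lower() for w in utterance.split()}

    # An explicit name always wins over a pronoun.
    for name, contact_id in contacts.items():
        if name.lower() in words:
            return contact_id

    # Pronoun plus an active conversation topic: reuse that context.
    if words & pronouns and conversation_topic in contacts:
        return contacts[conversation_topic]

    return None  # unresolved: ask the user or show a contact card

# Example mirroring the flow above.
print(resolve_recipient("Message her: I heard it's cold there!", "Kiko",
                        {"Kiko": "contact-042", "Mateo": "contact-017"}))
```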
| Step | Preview | What | How |
|---|---|---|---|
| 1. Requirements Definition | A VUI bridges user comfort and business priorities through innovation. | Define the business motivation and align it with user and application goals. | |
| 2. High-Level Design | Set the stage for seamless interaction with a brand’s voice. | Establish dialog strategy, grammar type, and persona alignment. | |
| 3. Detailed Design | Focus on research-backed scenarios to ensure user satisfaction. | Craft precise dialogs and tailored prompts for all use cases. | |
| 4. Development | Blueprint to reality: detail where it matters, flexibility where it counts. | Integrate coding practices with front-end and back-end systems. | Build iteratively with stakeholders under a shared vision. |
| 5. Testing | Thorough testing ensures system reliability and trust. | Design the system to work reliably at scale, for everyone intended, in real-world scenarios. | |
| 6. Tuning | Continuously evolve the VUI with the user in mind. | Optimize grammar, accuracy, and all-around satisfaction through feedback. | Identify high-traffic patterns to inform strategy. |
Standard UI places CTAs in the bottom right, assuming right-handed efficiency. But our research revealed that this position strains the thumb’s natural arc. I designed a flexible, elliptical button that adapts to each user’s relaxed reach, left or right, making interaction feel intuitive, inclusive, and effortless.
The experience opens with a personalized “Moment” view—surface-level calm, deeply intentional. Based on user cues and contextual research, it highlights what matters most right now, offering three gentle prompts to encourage meaningful engagement.
At the core of the multimodal experience is conversation. We pair natural language understanding with best practices in conversational UX, card systems, and visual design—creating a seamless flow of dialogue, context, and action across both app and device.
In mobile-first contexts, we previewed a voice-only interface with a universal hotword designed for instant engagement without typing. The dialog flows illustrate how users can navigate key moments entirely hands-free.
From conversation mode, users can access email and other alerts through two distinct views. Smart chips filter by type, like payments or packages, while badges verify sources, flag priorities, and surface followed threads, reinforcing trust and clarity at a glance.
Expanding the introductory weather card opens the Today view, where users can scroll through the day, confirm or adjust preferences, and take action—all from a conversational, context-aware interface.
How do you build a high-traffic website for a regional airport that’s fast, on-brand, and fully inclusive while overcoming skepticism about accessibility’s perceived complexity, cost, and impact on visual design?
I led the full accessibility strategy, UX design, and front-end development for the Bishop International Airport website, ensuring an equitable, efficient experience for all travelers.
Accessibility isn’t a checkbox—it’s an invitation.
At a university with deeply entrenched legacy systems and resource constraints, how can accessibility become a shared practice and not just a checklist before launch—especially in emotionally and socially urgent spaces like Title IX?
As part of the Digital Content and Accessibility Team, I helped lead a cultural and systemic shift in how the university approaches inclusion, accessibility, and digital equity.
Accessibility isn’t about perfection. It’s about progress you can feel.
When I joined Consumers Energy/CMS Energy, Low and Moderate Income (LMI) customers faced barriers that went unseen, their experiences often misunderstood or neglected in product design. Internal teams operated in silos, each seeing only fragments of the full picture—blind to the complete lives our customers lived every day.
Without aligning stakeholders around an agreed-upon, human-centered vision, our products risked remaining disconnected, misaligned, and ultimately ineffective.
I decided we needed more than just data—we needed empathy. Real stories from real lives.
First, I led deeply empathetic interviews, immersing stakeholders in the practical and emotional realities of LMI customers across Michigan. We heard the fatigue of ALICE customers (Asset Limited, Income Constrained, Employed) juggling bills, felt the frustration of excessively complicated enrollment red tape, and recognized the quiet dignity of people striving to keep their families comfortable.
This storyboard transformed a stakeholder’s abstract vision into an actionable cross-functional plan. By grounding big ideas in the everyday, we bridged strategic aspiration with practical empathy, shaping ambitious features to meet human problems.
Designing with stakeholders isn’t about aligning disconnected plans; it’s about aligning hearts to create movement.
What surfaced through these stories reshaped the path ahead. They sparked a moment of clarity and momentum. I designed an interactive Journey Map Experience: compelling visuals interwoven with authentic customer voices, revealing how our marketing and outreach either eased burdens or unintentionally created them.
Across the organization, many believed that highlighting premium features, even those offered for free to qualifying customers, would overwhelm systems or set unsustainable expectations. Some product teams hesitated to promote them altogether.
But in this clip, a participant returns to the outreach and suddenly notices the mention of premium upgrades. Their reaction is immediate: they’re surprised, intrigued, and eager to learn more, even to enroll, not because they expect to receive the upgrade, as product owners perceived, but because the feature alone signals value.
The insight was clear: withholding key features out of fear limits potential impact. When customers discover what’s possible, it builds trust. Clarity builds curiosity. People often want to engage when they feel part of the picture, not left out of it.
Stakeholders assumed marketing imagery fell flat—that it reinforced perceptions of Consumers Energy as municipal, cold, and transactional. But in this clip, customers offer a different story. When asked about outreach materials, they shared how specific imagery in marketing emails shaped their mindset, making way for trust and prompting engagement, or doing the exact opposite.
This moment challenged prevailing internal beliefs. The insight reshaped MVP priorities and shifted focus toward outreach that feels timely, human, and aligned with how customers want to feel supported. It wasn’t just what the program offered. It was how it entered the customer’s life that mattered most.
Designed not just to inform but to emotionally anchor, the beginning and end of the Journey Map Experience invite stakeholders into a space of reflection, insight, and alignment. These moments use delight, pacing, and ambiance to soften fragmented resistance, open the door for empathy, and set the stage for insight, leaving a lasting impression after the stories end.
Full Video of Interactive Journey Map with Customer Feedback (~8 mins)
By layering customer audio clips directly into the visual journey, stakeholders could feel firsthand the gap between our intentions and customers’ actual experiences in a compelling narrative.
This moment was later highlighted in our internal culture publication as a turning point for breaking down silos and rebuilding shared vision:
This excerpt from a culture publication demonstrates how our audio-augmented journey map disrupted fragmented strategies and built confidence in changing the product and its website. Though text-heavy, it reflects the lasting organizational change sparked by storytelling, not just in deliverables but in mindset.
Engaging stakeholders through stories transformed internal perspectives. Silos began dissolving; teams started speaking a common language rooted in their own experience and the customers' as well. Stakeholders who once debated strategy in abstract terms now vividly saw and heard their impact on real people.
At the same time, we ran hands-on workshops where we used storytelling methods to turn abstract strategy into something people could feel and build around.
Built collaboratively in a live session, this full-scale journey map combined customer quotes (some real, some invented) with stakeholder expertise. By inviting teams to confront assumptions, we challenged bias, elevated unheard voices, and revealed gaps in collective understanding.
Captured during a board game activity, this close-up shows a moment where stakeholders stepped into the shoes of LMI customers and made real choices under imagined constraints. This play-based method fostered emotional connection and actionable insight beyond traditional research readouts.
Through these sessions, product owners, executives, and frontline teams collectively envisioned old and new products through the lens of customer lives, not just organizational metrics.
Stakeholders debated two competing visions for the LMI MVP.
Through guided facilitation and live synthesis, we aligned diverse perspectives, clarified assumptions, and surfaced shared priorities.
This workshop journey map detailed how LMI customers engaged with layered systems across multiple products. Though product-specific, it uncovered unmet needs and misaligned expectations that hindered broader service strategy.
Storytelling allowed me to establish trust and promote collaboration, culminating in an organizational shift toward proactive, empathetic product development designed for human outcomes first, business outcomes second.
How can smaller museums create meaningful, engaging experiences without major construction or funding, especially when competing with the expectations of modern audiences surrounded by the digital world?
We designed an overhaul of a hybrid physical-digital exhibit experience for the Hall of Evolution at MSU’s Museum, transforming a static display into an interactive, educational journey through space, time, and impact.
Designing for physical experience is about designing for memory.