Edge AI for Mission-Critical HMI

Extending the WING Ecosystem for Real-Time, Privacy-First Mobility

Role

UX/UI Designer

Duration

3 months

Industry

Health & Fitness

Overview

This project extends the WING HMI ecosystem, originally designed to centralize vehicle control into a streamlined, driver-first platform. With the rise of embedded AI in mobility systems, the next evolution, the WING Intelligence Platform, focuses on enabling real-time, privacy-respecting AI at the edge, right inside the vehicle.

Our goal: Eliminate cloud dependency for mission-critical decisions and give users full control over their data and AI behavior.

Problem

Smart vehicles are increasingly AI-powered, offering predictive navigation, automated alerts, and conversational assistance. However, these systems rely heavily on cloud computation, introducing latency into split-second decisions. Worse, users lack visibility into, and control over, how their driving data is collected and processed, eroding trust.

The challenge: How might we design a real-time HMI system powered by edge AI that prioritizes both performance and user data sovereignty?

Proposed Solution

Edge AI Mobile Companion

Training and Trust Hub for the entire HMI Ecosystem.

Research Approach: AI-Assisted Synthesis

As this was a conceptual project, we applied a synthetic research model using AI-assisted generative research, a method often referred to as computational ethnography.

Instead of conducting direct interviews, we used AI tools to extract and synthesize user sentiment and behavioral patterns from:

  • Automotive user forums (Tesla, Comma.ai, Rivian)

  • Product reviews of vehicle infotainment systems

  • Reddit and Twitter discussions on privacy and latency

  • Driver safety reports and UX studies in telematics

  • HMI blogs

From this synthesis, we identified three recurring drivers of user dissatisfaction: latency in critical moments, opacity around data use, and lack of control over AI behavior.

Design Principles

These insights shaped our core product principles used to guide both system architecture and interface behaviors:

  1. Transparency
    Clearly communicate how AI decisions are made and what data is used.

  2. Control
    Give users intuitive toggles to manage AI training, cloud syncing, and data usage.

  3. Efficiency
    Ensure interfaces reduce scan time and cognitive effort — especially while driving.
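The "Control" principle above can be made concrete with a settings model. The sketch below is illustrative only: the class and field names are our assumptions, not part of the actual WING platform, but they show how privacy-first defaults (everything off unless opted in) would be encoded.

```python
# Hypothetical settings model for the "Control" principle: toggles for
# AI training, cloud syncing, and data usage. All names are illustrative.
from dataclasses import dataclass

@dataclass
class PrivacyControls:
    allow_local_training: bool = True    # edge AI may learn from driving data
    allow_cloud_sync: bool = False       # opt-in; off by default
    share_usage_analytics: bool = False  # opt-in; off by default

# Privacy-first defaults: nothing leaves the vehicle unless toggled on.
controls = PrivacyControls()
```

Keeping cloud-facing toggles off by default means the interface only ever asks the user to grant access, never to revoke something already shared.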

Core Architectural Principle

The WING Intelligence Platform is designed as a layered HMI system that distinguishes between:

  • Sensitive, Locally Trained Data
    (e.g., driving behavior, routes, voice interactions)

  • Generalized, Cloud-Synced Data
    (e.g., firmware updates, public map data)

By embedding edge AI inference directly into the vehicle’s onboard system, the platform delivers real-time responsiveness while preserving privacy: all training, inference, and decision-making happen locally unless the user explicitly opts into cloud features.
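The layered data model above can be sketched as a routing decision. This is a minimal illustration under our own assumptions (the category names and the opt-in flag are hypothetical, not from the actual platform): sensitive streams stay on the edge unless the user has opted into cloud features, while generalized streams may sync by default.

```python
# Hypothetical routing logic for the layered edge/cloud data model.
# Stream names and the opt-in flag are illustrative assumptions.

SENSITIVE = {"driving_behavior", "route_history", "voice_interaction"}
GENERALIZED = {"firmware_update", "public_map_data"}

def route_stream(stream: str, cloud_opt_in: bool = False) -> str:
    """Return where a data stream is processed: 'edge' or 'cloud'."""
    if stream in SENSITIVE:
        # Sensitive data leaves the vehicle only with explicit opt-in.
        return "cloud" if cloud_opt_in else "edge"
    if stream in GENERALIZED:
        return "cloud"
    # Unknown streams default to local processing (privacy-first).
    return "edge"
```

Defaulting unrecognized streams to the edge mirrors the privacy-first stance: a classification gap should never silently send data to the cloud.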

The system is designed to sync with companion mobile and wearable apps for zero-latency alerts, haptic feedback, and data review outside the vehicle.

Core User Flows
