Wing HMI Concept
WING HMI Exploration is a concept for next-generation Human-Machine Interfaces in premium, intelligent mobility vehicles, aimed at younger segments of premium car buyers (ages 28–42). The project draws its conceptual inspiration from interplanetary mission control environments.
Project Overview
Market
Design
Phase 1: Ideation
Defined primary use cases (e.g., navigation, media, charging info, ADAS controls)
Mapped driving contexts to determine interaction zones (HUD, cluster, center display)
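To make the context-to-zone mapping concrete, the sketch below shows one way such a policy could be expressed in code. The context names, zone names, and widget lists are illustrative assumptions, not the project's actual specification.

```typescript
// Hypothetical sketch: mapping driving contexts to HMI interaction zones.
// Names (DrivingContext, InteractionZone, zonePolicy) are illustrative, not from the project.

type DrivingContext = "parked" | "city" | "highway" | "charging";
type InteractionZone = "hud" | "cluster" | "centerDisplay";

// Which use cases each zone may surface in a given driving context.
const zonePolicy: Record<DrivingContext, Record<InteractionZone, string[]>> = {
  parked:   { hud: [],                     cluster: ["chargingInfo"],   centerDisplay: ["media", "navigation", "chargingInfo"] },
  city:     { hud: ["navigation", "adas"], cluster: ["speed", "adas"],  centerDisplay: ["media"] },
  highway:  { hud: ["navigation", "adas"], cluster: ["speed", "adas"],  centerDisplay: ["media", "chargingInfo"] },
  charging: { hud: [],                     cluster: ["chargingInfo"],   centerDisplay: ["chargingInfo", "media"] },
};

// Example: what may appear on the HUD while driving on the highway.
console.log(zonePolicy.highway.hud); // ["navigation", "adas"]
```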
Phase 2: Prototyping
Created and tested wireframes and low-fidelity flows in simulated environments
Included variable states for light/dark modes, day/night driving, and system alerts
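As a rough illustration of how those variable states might be resolved at runtime, the sketch below derives a theme and contrast setting from ambient light and alert level. The threshold and field names are assumptions for the example, not values from the project.

```typescript
// Hypothetical sketch: resolving a display state from ambient conditions and alert level.
// The state names and the 50-lux night/day threshold are assumed, not measured project values.

type AlertLevel = "none" | "advisory" | "critical";

interface DisplayState {
  theme: "light" | "dark";
  contrastBoost: boolean; // raise contrast so a critical alert cuts through
}

function resolveDisplayState(ambientLux: number, alert: AlertLevel): DisplayState {
  const theme = ambientLux < 50 ? "dark" : "light"; // rough night/day split (assumed)
  return { theme, contrastBoost: alert === "critical" };
}

// Night driving with a critical system alert -> dark theme, boosted contrast.
console.log(resolveDisplayState(10, "critical")); // { theme: "dark", contrastBoost: true }
```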
Phase 3: Refinement
Visual hierarchy: clear typography, consistent spacing, and subtle motion
Feedback modeled using progressive disclosure and dynamic affordances
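A minimal sketch of progressive disclosure, assuming a single estimated driving-load score controls how much detail a widget may expose; the tiers and threshold values are illustrative only.

```typescript
// Hypothetical sketch: progressive disclosure driven by an estimated driving load (0..1).
// The tier names and thresholds are assumptions for illustration.

type DetailTier = "glance" | "summary" | "full";

// The busier the drive, the less detail a widget is allowed to expose.
function discloseTier(drivingLoad: number): DetailTier {
  if (drivingLoad > 0.7) return "glance";  // e.g. dense traffic: icon plus one value only
  if (drivingLoad > 0.3) return "summary"; // e.g. steady cruising: short text allowed
  return "full";                           // e.g. parked or stopped: full controls
}

console.log(discloseTier(0.8)); // "glance"
console.log(discloseTier(0.1)); // "full"
```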
Phase 1
Customer Segmentation
Younger, tech-savvy premium buyers
Value safety, personalization, UX
Digital-first, expect seamless CX
Product Segmentation
Panoramic and adaptive touch displays
Advanced voice assistants & AI layering
Brand Segmentation
Legacy OEMs focus on incremental change
New entrants lead with software agility
Most lack adaptive zoning, emotional UX
Phase 3


Evaluation was conducted in a high-fidelity vehicle simulator using a camera with a 20mm focal length, chosen to approximate the human eye's effective focal length (roughly 17–22mm) and thus a natural, driver-centric field of view. This setup provided an authentic perspective for assessing HMI elements (speed, navigation, and auxiliary widgets) against key criteria: visibility, clarity, and visual comfort.
The preliminary results support a user-centered layout: critical dashboard information sits within the driver's primary line of sight, and secondary peripheral controls are placed to minimize cognitive distraction. The central positioning of the primary display and physical controls maximizes reachability and keeps interaction intuitive. This simulation-based approach provides essential pre-road data for final HMI refinement.
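For reference, the field of view implied by a 20mm lens can be estimated from the standard pinhole relation FOV = 2·arctan(d / 2f). The sketch below assumes a full-frame (36 × 24 mm) sensor, which the setup description does not specify.

```typescript
// Back-of-the-envelope check of the capture geometry, not part of the original study.
// Assumes a full-frame sensor (36 mm x 24 mm); the text does not state the sensor size.

function fieldOfViewDeg(sensorSizeMm: number, focalLengthMm: number): number {
  return (2 * Math.atan(sensorSizeMm / (2 * focalLengthMm)) * 180) / Math.PI;
}

console.log(fieldOfViewDeg(36, 20).toFixed(1)); // ~84.0 degrees horizontal
console.log(fieldOfViewDeg(24, 20).toFixed(1)); // ~61.9 degrees vertical
```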
Phase 3
Phase Evaluation
92% task success rate in simulation
High user trust in adaptive UI
However, generative features remain surface-level: personalization lacks deep logic and multi-turn context.
Key Takeaways
Modular UI zoning enables safe expansion without redesigning core surfaces.
Users appreciate visual calmness + proactive utility, especially when driving attention is split.
Next Steps
Integrate OpenAI or fine-tuned models for contextual, multi-turn dialogue
Link voice commands to intent + logic maps
Build out testing for stress states, low-light edge cases, new drivers
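As a sketch of what linking voice commands to intent and logic maps could look like, the example below routes classified intents to concrete HMI actions while carrying multi-turn history. The intent names, the model interface, and the stub classifier are all assumptions; no specific vendor API is shown.

```typescript
// Hypothetical sketch of a voice-command -> intent -> logic-map flow with multi-turn context.
// Intent names, the IntentModel interface, and the stub classifier are illustrative only.

type Intent = "set_destination" | "adjust_climate" | "show_charging" | "unknown";

interface Turn { role: "driver" | "assistant"; text: string; }

// Abstract over the model provider (hosted, fine-tuned, or on-device) behind one interface.
interface IntentModel {
  classify(history: Turn[], utterance: string): Promise<{ intent: Intent; slots: Record<string, string> }>;
}

// Logic map: each intent resolves to a concrete HMI action or response.
const logicMap: Record<Intent, (slots: Record<string, string>) => string> = {
  set_destination: (s) => `Routing to ${s.place ?? "the last mentioned place"}.`,
  adjust_climate:  (s) => `Setting cabin temperature to ${s.temperature ?? "a comfortable level"}.`,
  show_charging:   ()  => "Showing nearby charging stations and current range.",
  unknown:         ()  => "Sorry, could you rephrase that?",
};

async function handleUtterance(model: IntentModel, history: Turn[], utterance: string): Promise<string> {
  const { intent, slots } = await model.classify(history, utterance);
  history.push({ role: "driver", text: utterance });
  const reply = logicMap[intent](slots);
  history.push({ role: "assistant", text: reply });
  return reply;
}

// Minimal keyword stub so the sketch runs without any external service.
const stubModel: IntentModel = {
  async classify(_history: Turn[], utterance: string) {
    const intent: Intent = /charg/i.test(utterance) ? "show_charging" : "unknown";
    return { intent, slots: {} };
  },
};

handleUtterance(stubModel, [], "How far to the next charger?").then(console.log);
```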