At the Consumer Electronics Show (CES) 2019, visitors got a chance to experience our mobility assistant platform during a virtual cruise down the streets of the Big Apple. In a 220° immersive theater, we brought “the city that never sleeps” to life and delivered a natural and intuitive experience of the digital car of the future. In addition to the emotional intelligence of the mobility assistant, visitors experienced a smart combination of voice recognition, eye and head tracking, and visual feedback on an innovative windshield technology.
Advances in artificial intelligence, ubiquitous connectivity, autonomous vehicles and shared mobility models are transforming the way people move through the world. A variety of new, more needs-based mobility concepts is emerging that will enable users to get all types of things done while on the go and that will be connected to the ecosystem of digital cars, digital homes and digital cities. As a result, future mobility assistants will serve users throughout their digital lives – whether in their own cars, in shared cars, or in semi-autonomous or autonomous vehicles.
CES 2019: Introducing the world’s first multi-sensorial mobility assistant
The Cerence Drive (then Nuance Dragon Drive) demo showcased at CES 2019 was the next important step in creating a more humanlike, conversational experience for drivers and passengers – an experience that is reliable, accurate, safe and intuitive. At the heart of Cerence’s CES presence was a 220° immersive theater experience that brought the city of New York to Las Vegas. Leveraging a combination of eye tracking and natural language understanding, users were able to interact with points of interest (POI) outside the car to get general information, opening hours, ratings and more. “Enhanced context capabilities and the ability to have a collaborative dialogue ensure users can further specify their request or ask additional questions about the POI,” explains Lior Ben-Gigi, Director of Product Management at Cerence. “This combination ensures a very intuitive interaction with the outside world that is becoming more like interacting with a human.” The results are highlighted in augmented reality displayed on a smart windshield developed by Saint-Gobain Sekurit.
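To make the gaze-plus-dialogue interaction concrete, here is a minimal sketch of how a gazed-at POI could anchor a collaborative dialogue, so that follow-up questions (“what are its ratings?”) resolve to the same place. All names (`Poi`, `DialogueContext`, `handle_utterance`) are hypothetical illustrations, not Cerence APIs, and the keyword matching stands in for real natural language understanding.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Poi:
    """A point of interest outside the car (illustrative fields only)."""
    name: str
    opening_hours: str
    rating: float

@dataclass
class DialogueContext:
    """Remembers the last-referenced POI so follow-up questions resolve to it."""
    current_poi: Optional[Poi] = None

def handle_utterance(utterance: str, gazed_poi: Optional[Poi],
                     ctx: DialogueContext) -> str:
    # A fresh gaze fixation re-anchors the dialogue on a new POI;
    # without one, the previous POI from context is reused.
    if gazed_poi is not None:
        ctx.current_poi = gazed_poi
    poi = ctx.current_poi
    if poi is None:
        return "Which place do you mean?"
    text = utterance.lower()
    if "hour" in text or "open" in text:
        return f"{poi.name} is open {poi.opening_hours}."
    if "rating" in text:
        return f"{poi.name} is rated {poi.rating} out of 5."
    return f"That is {poi.name}."
```

A short interaction might look like: the user glances at a cafe and asks “What is that place?”, then asks “What are its ratings?” without looking again; the second request is answered from the stored context.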
No knobs and buttons needed: The button-free car is becoming a reality
Operating the car without a need to push a button, use a rotary switch or even touch a screen is not only possible but significantly improves the user experience. Again, the smart windshield technology provided by Saint-Gobain Sekurit comes into play: Using voice commands in combination with eye tracking, users can intuitively interact with “widgets” displayed on the windshield to refine and select services and information – phone, contacts, weather, navigation, music – that would traditionally be shown on the console display. “Bringing these controls up to eye level on the windshield enhances convenience, comfort, productivity, and safety already today,” explains Ben-Gigi. “Using the windshield as an additional interface to provide users with information or access to services will become a key UX feature, especially in the context of the increasingly digital and autonomous car.”
In addition, Cerence Drive fuses head-tracking data with voice recognition to enable intuitive in-car controls, for example by looking at the passenger-side window and saying: “Open that window.”
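The fusion of a gaze or head-pose target with a spoken command can be sketched as resolving the deictic word “that” against what the user is looking at. This is a simplified illustration under assumed names (`GAZE_TARGETS`, `resolve_command`); the actual Cerence pipeline is not public, and real systems would work on continuous head-pose estimates and full speech intents rather than string matching.

```python
from typing import Optional

# Assumed mapping from a tracked gaze zone to a controllable car element.
GAZE_TARGETS = {
    "driver_window_zone": "driver_window",
    "passenger_window_zone": "passenger_window",
}

def resolve_command(utterance: str, gaze_zone: Optional[str]) -> Optional[str]:
    """Fuse the spoken intent with the current gaze target into an action.

    Returns an "action:target" string, or None when the deictic reference
    cannot be grounded (a real assistant would then ask a clarifying question).
    """
    text = utterance.lower()
    if "that window" in text and gaze_zone in GAZE_TARGETS:
        action = "open" if "open" in text else "close"
        return f"{action}:{GAZE_TARGETS[gaze_zone]}"
    return None
```

With this design, “Open that window” while looking toward the passenger side resolves to opening the passenger window, while the same words without a tracked gaze target produce no action.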
The mobility assistant that knows how you feel
In addition to understanding what users want to do, Cerence Drive now also senses how they feel. Using “emotion AI” powered by Affectiva, combined with cameras that analyze facial expressions and tone of voice, the mobility assistant understands drivers’ and passengers’ cognitive and emotional states. From there, the assistant adapts its behavior accordingly, changing both its response style and tone of voice to match the situation. Besides providing more “empathic” assistance, this technology could enhance safety on the road by preventing distracted, drowsy and impaired driving.
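Adapting response style to a detected emotional state can be sketched as a simple policy lookup over fused face and voice signals. The state labels and style parameters below are assumptions for illustration only; they are not the Affectiva API or Cerence’s actual policy.

```python
# Assumed response-style policy keyed by detected emotional state.
RESPONSE_STYLES = {
    "stressed": {"verbosity": "brief", "tone": "calm"},
    "drowsy":   {"verbosity": "brief", "tone": "alert"},
    "neutral":  {"verbosity": "normal", "tone": "friendly"},
    "happy":    {"verbosity": "chatty", "tone": "upbeat"},
}

def pick_style(face_emotion: str, voice_emotion: str) -> dict:
    """Choose a response style from the fused face and voice estimates."""
    # When the two channels disagree, prefer the safety-relevant state,
    # since drowsiness or stress matters more than a cheerful expression.
    for safety_state in ("drowsy", "stressed"):
        if safety_state in (face_emotion, voice_emotion):
            return RESPONSE_STYLES[safety_state]
    return RESPONSE_STYLES.get(face_emotion, RESPONSE_STYLES["neutral"])
```

The safety-first ordering reflects the article’s point that emotion sensing is not just about empathy: a drowsy signal from either channel overrides everything else so the assistant can respond in a more alerting manner.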