Nuance introduces emotion-AI-based humanised mobility assistant

Nuance Communications, Inc. has introduced new capabilities in its Dragon Drive platform that transform it into a conversational, humanised mobility assistant, positioned to be core to the digital, button-free car of the future.
With these advancements, Nuance has created a multi-sensorial, cognitive mobility assistant that is poised to change the way drivers and passengers interact with their vehicles, other forms of mobility and transportation, and smart cities.
Dragon Drive understands tone of voice, eye and head movements, and emotion, and integrates deeply with both in-car and exterior sensors, as well as with third-party bots and assistants through an open platform. It leverages these insights to provide a personalised, human-like user experience that keeps drivers and passengers entertained, productive, connected, and safe. That experience is foundational to the digital car of the future: it removes the need for the many buttons in today's vehicles, giving way to a humanised, multi-sensorial user experience powered by simultaneous voice, sight, gesture, and emotion interaction.
Users can experience this multi-sensorial interaction through:
– POI interaction enhanced by augmented reality – Eye tracking, combined with voice recognition, lets the driver interact with points of interest outside the car, asking for opening hours, ratings, and other details. The results are highlighted in augmented reality and displayed on a smart windshield using a transparent screen developed by Saint-Gobain Sekurit. (A sketch of one way gaze and voice could be fused appears after this list.)
– Advanced button-free car – Leveraging voice recognition and eye tracking, users can intuitively interact with widgets on this smart windshield to access and refine services and information – phone and contacts, weather, navigation, music – that would traditionally be shown on the console display. Bringing these controls up to eye level on the windshield enhances convenience, comfort, productivity, and safety, and visibility is never compromised because the driver can still see through the image, which is essential for the increasingly digital and autonomous car. In addition, head tracking combined with voice recognition enables in-car controls (e.g., the driver can look at the passenger-side window and say, “open that window halfway”), further reducing the need for button-based control in the car; a second sketch after this list illustrates this.
– Enhanced context and collaborative dialogue – These capabilities enable an even more human-like experience: drivers can converse with the mobility assistant just as they would with a person, referencing earlier parts of the conversation, asking follow-up questions, and giving follow-up commands (“open that window halfway,” followed by, “a little bit more”), all without the need for a wake-up word. A third sketch below models this dialogue context.
– Emotion and cognitive state analysis – Dragon Drive uses Affectiva’s Emotion AI, together with interior cameras, to analyse facial expressions and tone of voice, inferring drivers’ and passengers’ emotions and cognitive states such as drowsiness and distraction. The assistant then adjusts both its response and its tone of voice to match the situation with empathy and relevance. This enhances road safety by countering drowsy or distracted driving, and improves the in-cabin experience by adapting the environment to passenger moods and reactions; the final sketch below shows the adaptation step.
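
Nuance has not published how Dragon Drive fuses gaze with speech for the POI feature. The following Python sketch shows one plausible approach, assuming gaze direction and POI bearings are available as angles relative to the car's heading; the POI class, resolve_gaze_target function, and 10° tolerance are all illustrative inventions, not Nuance's API.

```python
from dataclasses import dataclass

@dataclass
class POI:
    name: str
    bearing_deg: float  # bearing relative to the car's heading, in degrees
    hours: str
    rating: float

def angular_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def resolve_gaze_target(gaze_deg, pois, tolerance_deg=10.0):
    """Return the POI closest to the gaze direction, if within tolerance."""
    best = min(pois, key=lambda p: angular_diff(gaze_deg, p.bearing_deg))
    if angular_diff(gaze_deg, best.bearing_deg) <= tolerance_deg:
        return best
    return None

pois = [POI("Cafe Luna", -8.0, "08:00-22:00", 4.5),
        POI("City Garage", 35.0, "24h", 3.9)]
target = resolve_gaze_target(-12.0, pois)  # driver looks slightly left and asks
if target:                                 # "what are their opening hours?"
    print(f"{target.name} is open {target.hours}, rated {target.rating}")
```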
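
The head-tracking example similarly implies resolving a deictic reference ("that window") before interpreting the command. A minimal sketch, assuming head yaw is reported in degrees with zero straight ahead and positive values toward the passenger side; the zone boundaries, Car class, and function names are hypothetical.

```python
# Hypothetical yaw zones (degrees) mapped to in-car control targets.
GAZE_ZONES = {
    "passenger_window": (40.0, 80.0),
    "driver_window": (-80.0, -40.0),
}

class Car:
    """Stand-in for the vehicle control bus; positions run 0.0 (closed) to 1.0."""
    def set_position(self, target: str, value: float) -> None:
        print(f"{target} set to {value:.0%} open")

def resolve_head_target(yaw_deg: float):
    """Map the driver's head pose to the control they are looking at, if any."""
    for target, (lo, hi) in GAZE_ZONES.items():
        if lo <= yaw_deg <= hi:
            return target
    return None

def handle_utterance(car: Car, yaw_deg: float, utterance: str) -> None:
    """Resolve the deictic 'that window' via head pose, then apply the command."""
    target = resolve_head_target(yaw_deg)
    if target and "open" in utterance and "halfway" in utterance:
        car.set_position(target, 0.5)

handle_utterance(Car(), 55.0, "open that window halfway")
# -> passenger_window set to 50% open
```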
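
The follow-up command ("a little bit more") in turn implies the assistant keeps short-term dialogue state: at minimum, the last target addressed and the value it was set to. One way to model that, reusing the hypothetical Car class from the sketch above:

```python
class DialogueContext:
    """Short-term memory so follow-up commands can refine the previous one."""
    def __init__(self, car):
        self.car = car
        self.last_target = None
        self.last_value = None

    def command(self, target: str, value: float) -> None:
        self.car.set_position(target, value)
        self.last_target, self.last_value = target, value

    def follow_up(self, delta: float) -> None:
        """Handle e.g. 'a little bit more' against the remembered command."""
        if self.last_target is None:
            return  # nothing to refine; a real assistant would ask back
        self.command(self.last_target, min(1.0, max(0.0, self.last_value + delta)))

ctx = DialogueContext(Car())          # Car as defined in the previous sketch
ctx.command("passenger_window", 0.5)  # "open that window halfway"
ctx.follow_up(0.2)                    # "a little bit more" -> 70% open
```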
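
Finally, the emotion and cognitive state analysis feeds a response-adaptation step. Affectiva's classifiers are not exposed in the announcement, so the sketch below covers only that last step, with invented state labels, thresholds, and response styles.

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    emotion: str       # hypothetical labels, e.g. "neutral", "frustrated"
    drowsiness: float  # 0.0 fully alert .. 1.0 asleep, from camera analysis

def adapt_response(text: str, state: CabinState) -> str:
    """Choose wording and tone for a reply based on the detected cabin state."""
    if state.drowsiness > 0.7:
        # Safety first: deliver an energising, alerting prompt.
        return f"Heads up! {text} Shall I find a place to stop and rest?"
    if state.emotion == "frustrated":
        return text  # keep it brief and neutral, skip the chatty extras
    return f"{text} Anything else I can help with?"

print(adapt_response("Traffic ahead adds ten minutes.",
                     CabinState(emotion="neutral", drowsiness=0.8)))
```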