{"id":752,"date":"2021-01-20T03:01:47","date_gmt":"2021-01-20T03:01:47","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2021\/01\/20\/a-trusted-companion-ai-software-keeps-drivers-safe-and-focused-on-the-road-ahead\/"},"modified":"2021-01-20T03:01:47","modified_gmt":"2021-01-20T03:01:47","slug":"a-trusted-companion-ai-software-keeps-drivers-safe-and-focused-on-the-road-ahead","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2021\/01\/20\/a-trusted-companion-ai-software-keeps-drivers-safe-and-focused-on-the-road-ahead\/","title":{"rendered":"A Trusted Companion: AI Software Keeps Drivers Safe and Focused on the Road Ahead"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2021\/01\/19\/drive-ix-ai-software-drivers-safe\/\" data-title=\"A Trusted Companion: AI Software Keeps Drivers Safe and Focused on the Road Ahead\">\n<p><i>Editor\u2019s note: This is the latest post in our <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/self-driving-cars\/drive-labs\/\"><i>NVIDIA DRIVE Labs series<\/i><\/a><i>, which takes an engineering-focused look at individual autonomous vehicle challenges and how NVIDIA DRIVE addresses them. Catch up on all of our automotive posts, <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/category\/auto\/\"><i>here<\/i><\/a><i>.<\/i><\/p>\n<p>Even with advanced driver assistance systems automating more driving functions, human drivers must maintain their attention at the wheel and build trust in the AI system.<\/p>\n<p>Traditional driver monitoring systems typically don\u2019t understand subtle cues such as a driver\u2019s cognitive state, behavior or other activity that indicates whether they\u2019re ready to take over the driving controls.<\/p>\n<p><a href=\"https:\/\/developer.nvidia.com\/drive\/drive-ix\">NVIDIA DRIVE IX<\/a> is an open, scalable cockpit software platform that provides AI functions to enable a full range of in-cabin experiences, including intelligent visualization with augmented reality and virtual reality, conversational AI and interior sensing.<\/p>\n<p>Driver perception is a key aspect of the platform that enables the AV system to ensure a driver is alert and paying attention to the road. It also enables the AI system to perform cockpit functions that are more intuitive and intelligent.<\/p>\n<p>In this DRIVE Labs episode, NVIDIA experts demonstrate how DRIVE IX perceives driver attention, activity, emotion, behaviour, posture, speech, gesture and mood with a variety of detection capabilities.<\/p>\n<\/p>\n<h2><b>A Multi-DNN Approach<\/b><\/h2>\n<p>Facial expressions are complex signals to interpret. A simple wrinkle of the brow or shift of the gaze can have a variety of meanings.<\/p>\n<p>DRIVE IX uses multiple DNNs to recognize faces and decipher the expressions of vehicle occupants. The first DNN detects the face itself, while a second identifies fiducial points, or reference markings \u2014 such as eye location, nose, etc.<\/p>\n<p>On top of these base networks, a variety of DNNs operate to determine whether a driver is paying attention or requires other actions from the AI system.<\/p>\n<p>The <b>GazeNet<\/b> DNN tracks gazes by detecting the vector of the driver\u2019s eyes and mapping it to the road to check if they\u2019re able to see obstacles ahead. <b>SleepNet<\/b> monitors drowsiness, classifying whether eyes are open or closed, running through a state machine to determine levels of exhaustion. 
Finally, **ActivityNet** tracks driver activity such as phone usage, hands on or off the wheel and driver attention to road events. DRIVE IX can also detect whether the driver is properly seated and positioned to respond to road events.

![](https://blogs.nvidia.com/wp-content/uploads/2021/01/DRIVE_IX_blog.png)

In addition to driver focus, a separate DNN can determine a driver's emotions, a key indicator of their ability to safely operate the vehicle. Taking in data from the base face-detection and fiducial-point networks, DRIVE IX can classify a driver's state as happy, surprised, neutral, disgusted or angry.

It can also tell whether the driver is squinting or screaming, indicating their level of visibility or alertness and their state of mind.

![](https://blogs.nvidia.com/wp-content/uploads/2021/01/DRIVE_Labs_IX.png)

## A Customizable Solution

Vehicle manufacturers can leverage the driver monitoring capabilities in DRIVE IX to develop advanced AI-based driver understanding features that personalize the car cockpit.

The car can be programmed to alert a driver whose attention drifts from the road, or the cabin can adjust settings to soothe occupants when tensions are high.

And these capabilities extend well beyond driver monitoring. The DNNs described above, together with a gesture DNN and speech capabilities, enable multi-modal conversational AI offerings such as automatic speech recognition, natural language processing and speech synthesis.

![](https://blogs.nvidia.com/wp-content/uploads/2021/01/DRIVE_IX_Blog_3.png)

These networks can be used for in-cabin personalization and virtual assistant applications; a sketch of how the base networks feed the task-specific heads follows below.
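To make the cascade concrete, here is a minimal, hypothetical sketch of how the two base networks could feed the task-specific heads. None of these function names come from DRIVE IX, whose APIs the post doesn't show; each stub stands in for a trained DNN and returns dummy values so the wiring runs end to end.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for trained DNNs. Names, signatures and return
# values are illustrative assumptions, not the DRIVE IX API.

def detect_face(frame):                 # base DNN 1: face bounding box
    return (120, 80, 220, 180)          # dummy (x1, y1, x2, y2)

def detect_fiducials(frame, face):      # base DNN 2: reference points
    return {"left_eye": (150, 110), "right_eye": (190, 110), "nose": (170, 140)}

def estimate_gaze(frame, landmarks):    # GazeNet-style gaze vector
    return (0.0, -0.1, 1.0)             # dummy viewing direction

def map_gaze_to_road(gaze):             # does the gaze ray reach the road ahead?
    return "on_road" if gaze[2] > 0.8 else "off_road"

def classify_activity(frame, face):     # ActivityNet-style activity label
    return "hands_on_wheel"             # vs. "phone_usage", ...

def classify_emotion(frame, face, landmarks):  # emotion head
    return "neutral"                    # vs. "happy", "angry", ...

@dataclass
class DriverState:
    attention: str
    activity: str
    emotion: str

def perceive_driver(frame):
    """Run the base networks first, then the task heads built on them."""
    face = detect_face(frame)
    if face is None:
        return None                     # no driver visible in this frame
    landmarks = detect_fiducials(frame, face)
    gaze = estimate_gaze(frame, landmarks)
    return DriverState(
        attention=map_gaze_to_road(gaze),
        activity=classify_activity(frame, face),
        emotion=classify_emotion(frame, face, landmarks),
    )

print(perceive_driver(frame=None))
# DriverState(attention='on_road', activity='hands_on_wheel', emotion='neutral')
```

The design point the post emphasizes is the shared base: every downstream head consumes the same face box and fiducial points, so the detection work happens once per frame.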
Additionally, the base facial recognition and facial key-point models can be used for AI-based video conferencing platforms.

The driver monitoring capabilities of DRIVE IX help build trust between occupants and the AI system as automated driving technology develops, creating a safer, more enjoyable intelligent vehicle experience.