{"id":2555,"date":"2022-09-22T14:41:12","date_gmt":"2022-09-22T14:41:12","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2022\/09\/22\/continental-and-aeye-join-nvidia-drive-sim-sensor-ecosystem-providing-rich-capabilities-for-av-development\/"},"modified":"2022-09-22T14:41:12","modified_gmt":"2022-09-22T14:41:12","slug":"continental-and-aeye-join-nvidia-drive-sim-sensor-ecosystem-providing-rich-capabilities-for-av-development","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2022\/09\/22\/continental-and-aeye-join-nvidia-drive-sim-sensor-ecosystem-providing-rich-capabilities-for-av-development\/","title":{"rendered":"Continental and AEye Join NVIDIA DRIVE Sim Sensor Ecosystem, Providing Rich Capabilities for AV Development"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2022\/09\/22\/continental-aeye-drive-sim-sensor-ecosystem\/\" data-title=\"Continental and AEye Join NVIDIA DRIVE Sim Sensor Ecosystem, Providing Rich Capabilities for AV Development\" data-hashtags=\"\">\n<p>Autonomous vehicle sensors require the same rigorous testing and validation as the car itself, and one simulation platform is up to the task.<\/p>\n<p>Global tier-1 supplier Continental and software-defined lidar maker AEye announced this week at <a href=\"https:\/\/www.nvidia.com\/gtc\/\">NVIDIA GTC<\/a> that they will migrate their intelligent lidar sensor model into <a href=\"https:\/\/www.nvidia.com\/en-us\/self-driving-cars\/simulation\/\">NVIDIA DRIVE Sim<\/a>. 
The companies are the latest to join the extensive ecosystem of sensor makers using NVIDIA\u2019s end-to-end, cloud-based simulation platform for technology development.<\/p>\n<p>Continental offers a full suite of cameras, radars and ultrasonic sensors, as well as its recently launched short-range flash lidar, some of which are incorporated into the <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/11\/09\/drive-hyperion-orin-production-ready-platform\/\">NVIDIA DRIVE Hyperion<\/a> autonomous-vehicle development platform.<\/p>\n<p>Last year, Continental and AEye <a href=\"https:\/\/www.continental.com\/en\/press\/press-releases\/20210714-lidar-aeye\/\">announced a collaboration<\/a> in which the tier-1 supplier would use the lidar maker\u2019s software-defined architecture to produce a long-range sensor. Now, the companies are contributing this sensor model to DRIVE Sim, helping to bring their vision to the industry.<\/p>\n<p>DRIVE Sim is built on the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\">NVIDIA Omniverse<\/a> platform for connecting and building custom 3D pipelines, providing physically based <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/solutions\/digital-twins\/\">digital twin<\/a> environments to develop and validate autonomous vehicles. 
DRIVE Sim is open and modular \u2014 users can create their own extensions or choose from a rich library of sensor plugins from ecosystem partners.<\/p>\n<p>In addition to providing sensor models, partners use the platform to validate their own sensor architectures.<\/p>\n<p>By joining this rich community of DRIVE Sim users, Continental and AEye can now rapidly simulate edge cases in varying environments to test and validate lidar performance.<\/p>\n<h2><b>A Lidar for All Seasons<\/b><\/h2>\n<p>AEye and Continental are creating the <a href=\"https:\/\/www.continental-automotive.com\/en-gl\/Passenger-Cars\/Autonomous-Mobility\/Enablers\/Lidars\/HRL131\">HRL131<\/a>, a high-performance, long-range lidar for both passenger cars and commercial vehicles that\u2019s software-configurable and can adapt to various driving environments.<\/p>\n<p>The lidar incorporates dynamic performance modes in which the laser scan pattern adapts to any automated driving application, from highway driving to dense urban environments, in all weather conditions, including direct sun, night, rain, snow, fog, dust and smoke. It features a range of more than 300 meters for detecting vehicles and 200 meters for detecting pedestrians, and is slated for mass production in 2024.<\/p>\n<figure id=\"attachment_59891\" aria-describedby=\"caption-attachment-59891\" class=\"wp-caption aligncenter\"><img decoding=\"async\" loading=\"lazy\" class=\"wp-image-59891 size-large\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2022\/09\/NVIDIA_Conti_AEye_blog_cam114_1280_680-672x357.png\" alt=\"The simulated Continental HRL131 long-range lidar sensor running in NVIDIA DRIVE Sim\" width=\"672\" height=\"357\"><figcaption id=\"caption-attachment-59891\" class=\"wp-caption-text\">The simulated Continental HRL131 long-range lidar sensor, built on AEye\u2019s 4Sight intelligent sensing platform, running in NVIDIA DRIVE Sim.<\/figcaption><\/figure>\n<p>With DRIVE Sim, developers can recreate obstacles with their exact physical properties and place them in complex highway environments. 
They can determine which lidar performance modes are suitable for the chosen application based on uncertainties experienced in a particular scenario.<\/p>\n<p>Once identified and tuned, performance modes can be activated on the fly using external cues such as speed, location or even vehicle pitch, which can change with loading conditions, tire-pressure variations and suspension modes.<\/p>\n<p>The ability to simulate performance characteristics of a software-defined lidar model adds even greater flexibility to DRIVE Sim, further accelerating robust autonomous vehicle development.<\/p>\n<p>\u201cWith the scalability and accuracy of NVIDIA DRIVE Sim, we\u2019re able to validate our long-range lidar technology efficiently,\u201d said Gunnar Juergens, head of product line, lidar, at Continental. \u201cIt\u2019s a robust tool for the industry to train, test and validate safe self-driving solutions.\u201d<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2022\/09\/22\/continental-aeye-drive-sim-sensor-ecosystem\/<\/p>\n","protected":false},"author":0,"featured_media":2556,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2555"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2555"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2555\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2556"}],"wp:attach
ment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2555"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2555"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2555"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}