{"id":2571,"date":"2022-10-06T17:47:08","date_gmt":"2022-10-06T17:47:08","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2022\/10\/06\/meet-the-omnivore-ph-d-student-lets-anyone-bring-simulated-bots-to-life-with-nvidia-omniverse-extension\/"},"modified":"2022-10-06T17:47:08","modified_gmt":"2022-10-06T17:47:08","slug":"meet-the-omnivore-ph-d-student-lets-anyone-bring-simulated-bots-to-life-with-nvidia-omniverse-extension","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2022\/10\/06\/meet-the-omnivore-ph-d-student-lets-anyone-bring-simulated-bots-to-life-with-nvidia-omniverse-extension\/","title":{"rendered":"Meet the Omnivore: Ph.D. Student Lets Anyone Bring Simulated Bots to Life With NVIDIA Omniverse Extension"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2022\/10\/05\/yizhou-zhao-omniverse-developer\/\" data-title=\"Meet the Omnivore: Ph.D. Student Lets Anyone Bring Simulated Bots to Life With NVIDIA Omniverse Extension\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is a part of our <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/meet-the-omnivore\/\" target=\"_blank\" rel=\"noopener\"><i>Meet the Omnivore<\/i><\/a><i> series, which features individual creators and developers who use <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\"><i>NVIDIA Omniverse<\/i><\/a><i> to accelerate their 3D workflows and create virtual worlds.<\/i><\/p>\n<figure id=\"attachment_60070\" aria-describedby=\"caption-attachment-60070\" class=\"wp-caption alignright\">\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2022\/10\/yizhou-zhao-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\"><figcaption id=\"caption-attachment-60070\" class=\"wp-caption-text\">Yizhou Zhao<\/figcaption><\/figure>\n<p>When not engrossed in his studies toward a Ph.D. 
in statistics, conducting data-driven research on AI and robotics, or enjoying his favorite hobby of sailing, Yizhou Zhao is winning contests for developers who use <a href=\"https:\/\/developer.nvidia.com\/nvidia-omniverse-platform\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse<\/a> \u2014 a platform for connecting and building custom 3D pipelines and metaverse applications.<\/p>\n<p>The fifth-year doctoral candidate at the University of California, Los Angeles recently received first place in the inaugural <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/code\/developer-contest\/\" target=\"_blank\" rel=\"noopener\">#ExtendOmniverse contest<\/a>, where developers were invited to create their own <a href=\"https:\/\/docs.omniverse.nvidia.com\/prod_extensions\/prod_extensions\/overview\" target=\"_blank\" rel=\"noopener\">Omniverse extension<\/a> for a chance to win an NVIDIA RTX GPU.<\/p>\n<p>Omniverse extensions are core building blocks that let anyone create and extend functions of <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/\" target=\"_blank\" rel=\"noopener\">Omniverse apps<\/a> using the popular Python programming language.<\/p>\n<p>Zhao\u2019s winning entry, called \u201cIndoorKit,\u201d allows users to easily load and record robotics simulation tasks in indoor scenes. It sets up robotics manipulation tasks by automatically populating scenes with the indoor environment, the bot and other objects with just a few clicks.<\/p>\n<p>\u201cTypically, it\u2019s hard to deploy a robotics task in simulation without a lot of skills in scene building, layout sampling and robot control,\u201d Zhao said. 
\u201cBy bringing assets into Omniverse\u2019s powerful user interface using the Universal Scene Description framework, my extension achieves instant scene setup and accurate control of the robot.\u201d<\/p>\n<p>Within \u201cIndoorKit,\u201d users can simply click \u201cadd object,\u201d \u201cadd house,\u201d \u201cload scene,\u201d \u201crecord scene\u201d and other buttons to manipulate aspects of the environment and dive right into robotics simulation.<\/p>\n<p>With <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/usd\" target=\"_blank\" rel=\"noopener\">Universal Scene Description (USD)<\/a>, an open-source, extensible file framework, Zhao seamlessly brought 3D models into his environments using <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/#apps-connectors\" target=\"_blank\" rel=\"noopener\">Omniverse Connectors<\/a> for Autodesk Maya and Blender software.<\/p>\n<p>The \u201cIndoorKit\u201d extension also relies on assets from the <a href=\"https:\/\/developer.nvidia.com\/isaac-sim\" target=\"_blank\" rel=\"noopener\">NVIDIA Isaac Sim<\/a> robotics simulation platform and Omniverse\u2019s built-in <a href=\"https:\/\/developer.nvidia.com\/physx-sdk\" target=\"_blank\" rel=\"noopener\">PhysX<\/a> capabilities for accurate, articulated manipulation of the bots.<\/p>\n<p>In addition, \u201cIndoorKit\u201d can randomize a scene\u2019s lighting, room materials and more. 
One scene Zhao built with the extension is highlighted in the feature video above.<\/p>\n<h2><b>Omniverse for Robotics<\/b><\/h2>\n<p>The \u201cIndoorKit\u201d extension bridges Omniverse and robotics research in simulation.<\/p>\n<figure id=\"attachment_60061\" aria-describedby=\"caption-attachment-60061\" class=\"wp-caption aligncenter\">\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2022\/10\/yizhou-zhao-extension-view.jpg\" alt=\"\" width=\"512\" height=\"272\"><figcaption id=\"caption-attachment-60061\" class=\"wp-caption-text\">A view of Zhao\u2019s \u201cIndoorKit\u201d extension<\/figcaption><\/figure>\n<p>\u201cI don\u2019t see how accurate robot control was performed prior to Omniverse,\u201d Zhao said. He gives four main reasons why Omniverse was the ideal platform on which to build this extension:<\/p>\n<p>First, Python\u2019s popularity means many developers can build extensions with it to unlock <a href=\"https:\/\/www.nvidia.com\/en-us\/deep-learning-ai\/solutions\/machine-learning\/\" target=\"_blank\" rel=\"noopener\">machine learning<\/a> and <a href=\"https:\/\/developer.nvidia.com\/deep-learning\" target=\"_blank\" rel=\"noopener\">deep learning<\/a> research for a broader audience, he said.<\/p>\n<p>Second, using <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/\" target=\"_blank\" rel=\"noopener\">NVIDIA RTX GPUs<\/a> with Omniverse greatly accelerates robot control and training.<\/p>\n<p>Third, Omniverse\u2019s ray-tracing technology enables real-time, photorealistic rendering of his scenes. 
This saves 90% of the time Zhao used to spend on experiment setup and simulation, he said.<\/p>\n<p>And fourth, Omniverse\u2019s real-time advanced physics simulation engine, <a href=\"https:\/\/developer.nvidia.com\/physx-sdk\" target=\"_blank\" rel=\"noopener\">PhysX<\/a>, supports an extensive range of features \u2014 including liquid, particle and soft-body simulation \u2014 which \u201cland on the frontier of robotics studies,\u201d according to Zhao.<\/p>\n<p>\u201cThe future of art, engineering and research is in the spirit of connecting everything: modeling, animation and simulation,\u201d he said. \u201cAnd Omniverse brings it all together.\u201d<\/p>\n<h2><b>Join In on the Creation<\/b><\/h2>\n<p>Creators and developers across the world can download <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse for free<\/a>, and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/enterprise\/\" target=\"_blank\" rel=\"noopener\">enterprise teams<\/a> can use the platform for their 3D projects.<\/p>\n<p>Discover <a href=\"https:\/\/www.youtube.com\/watch?v=eGxV_PGNpOg\" target=\"_blank\" rel=\"noopener\">how to build an Omniverse extension in less than 10 minutes<\/a>.<\/p>\n<p>For a deeper dive into developing on Omniverse, watch the on-demand NVIDIA GTC session, \u201c<a href=\"https:\/\/www.nvidia.com\/gtc\/session-catalog\/?tab.catalogallsessionstab=16566177511100015Kus&amp;search=A41167#\/\" target=\"_blank\" rel=\"noopener\">How to Build Extensions and Apps for Virtual Worlds With NVIDIA Omniverse<\/a>.\u201d<\/p>\n<p>Find additional documentation and tutorials in the <a href=\"https:\/\/developer.nvidia.com\/nvidia-omniverse-developer-resource-center\" target=\"_blank\" rel=\"noopener\">Omniverse Resource Center<\/a>, which details how developers like Zhao can build custom <a href=\"https:\/\/developer.nvidia.com\/usd\" target=\"_blank\" rel=\"noopener\">USD-based applications and extensions<\/a> for 
the platform.<\/p>\n<p>To discover more free tools, training and a community for developers, join the <a href=\"https:\/\/developer.nvidia.com\/developer-program\" target=\"_blank\" rel=\"noopener\">NVIDIA Developer Program<\/a>.<\/p>\n<p><i>Follow NVIDIA Omniverse on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\" target=\"_blank\" rel=\"noopener\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Medium<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\" target=\"_blank\" rel=\"noopener\"><i>YouTube<\/i><\/a><i> for additional resources and inspiration. Check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\" target=\"_blank\" rel=\"noopener\"><i>forums<\/i><\/a><i>, and join our <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\" target=\"_blank\" rel=\"noopener\"><i>Discord server<\/i><\/a> <i>and <\/i><a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Twitch <\/i><\/a><i>channel<\/i><i> to chat with the 
community.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2022\/10\/05\/yizhou-zhao-omniverse-developer\/<\/p>\n","protected":false},"author":0,"featured_media":2572,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2571"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2571"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2571\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2572"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2571"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2571"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2571"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}