{"id":2919,"date":"2023-03-21T18:10:59","date_gmt":"2023-03-21T18:10:59","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2023\/03\/21\/nvidia-studio-at-gtc-new-ai-powered-artistic-tools-feature-updates-nvidia-rtx-systems-for-creators\/"},"modified":"2023-03-21T18:10:59","modified_gmt":"2023-03-21T18:10:59","slug":"nvidia-studio-at-gtc-new-ai-powered-artistic-tools-feature-updates-nvidia-rtx-systems-for-creators","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2023\/03\/21\/nvidia-studio-at-gtc-new-ai-powered-artistic-tools-feature-updates-nvidia-rtx-systems-for-creators\/","title":{"rendered":"NVIDIA Studio at GTC: New AI-Powered Artistic Tools, Feature Updates, NVIDIA RTX Systems for Creators"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2023\/03\/21\/omniverse-generative-ai-unity-blender-connectors\/\" data-title=\"NVIDIA Studio at GTC: New AI-Powered Artistic Tools, Feature Updates, NVIDIA RTX Systems for Creators\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is part of our weekly <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. 
We\u2019re also deep diving on new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/\"><i>GeForce RTX 40 Series GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>Powerful AI technologies are revolutionizing 3D content creation \u2014 whether by enlivening realistic characters that show emotion or turning simple text into imagery.<\/p>\n<p>The brightest minds, artists and creators are gathering at <a href=\"https:\/\/www.nvidia.com\/gtc\/\">NVIDIA GTC<\/a>, a free, global conference on AI and the metaverse, taking place online through Thursday, March 23.<\/p>\n<p>NVIDIA founder and CEO Jensen Huang\u2019s <a href=\"https:\/\/www.nvidia.com\/gtc\/keynote\/\">GTC keynote<\/a> announced a slew of advancements set to ease creators\u2019 workflows, including using generative AI with the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/audio2face\/\">Omniverse Audio2Face<\/a> app.<\/p>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/creators\/\">NVIDIA Omniverse<\/a>, a platform for creating and operating <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/08\/10\/what-is-the-metaverse\/\">metaverse<\/a> applications, <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/03\/21\/omniverse-accelerates-game-dev\/\">further expands<\/a> with an updated Unreal Engine Connector, an open-beta Unity Connector and new SimReady 3D assets.<\/p>\n<p>New NVIDIA RTX GPUs, powered by the Ada Lovelace architecture, are fueling next-generation laptop and desktop workstations to meet the demands of AI, design and the industrial metaverse.<\/p>\n<p>The March NVIDIA Studio Driver adds support for the popular <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/02\/28\/rtx-video-super-resolution\/\">RTX Video Super Resolution<\/a> feature, now available for GeForce RTX 40 and 30 Series GPUs.<\/p>\n<p>And this week <i>In the NVIDIA Studio, <\/i>the Adobe Substance 3D art 
and development team explores the process of collaborating to create the animated short<i> End of Summer<\/i> using <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/create\/\">Omniverse USD Composer<\/a> (formerly known as Create).<\/p>\n<h2><b>Omniverse Overdrive<\/b><\/h2>\n<p>Specialized generative AI tools can boost creator productivity, even for users who don\u2019t have extensive technical skills. Generative AI brings creative ideas to life, producing high-quality, highly iterative experiences \u2014 all in a fraction of the time and cost of traditional asset development.<\/p>\n<p>The <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/audio2face\/\">Omniverse Audio2Face<\/a> AI-powered app allows 3D artists to efficiently animate secondary characters, generating realistic facial animations with just an audio file \u2014 replacing what is often a tedious, manual process.<\/p>\n<p>The latest release delivers significant upgrades in quality, usability and performance, including a new headless mode and a REST API \u2014 enabling game developers and other creators to run the app and process numerous audio files from multiple users in the data center.<\/p>\n<p>A new Omniverse Connector developed by NVIDIA for Unity workflows is available in open beta. Unity scenes can be added directly onto <a href=\"https:\/\/docs.omniverse.nvidia.com\/prod_nucleus\/prod_nucleus\/overview.html\">Omniverse Nucleus servers<\/a> with access to platform features: the <a href=\"https:\/\/docs.omniverse.nvidia.com\/prod_services\/prod_services\/services\/deepsearch\/overview.html\">DeepSearch<\/a> tool, thumbnails, bookmarks and more. Unidirectional live-sync workflows enable real-time updates.<\/p>\n<p>With the Unreal Engine Connector\u2019s latest release, Omniverse users can now use Unreal Engine\u2019s USD import utilities to add skeletal mesh blend shape importing, and Python USD bindings to access stages on Omniverse Nucleus. 
This release also delivers improvements in import, export and live workflows, as well as updated software development kits.<\/p>\n<p>In addition, over 1,000 <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/03\/21\/new-omniverse-connections-advance-3d-workflows\">new SimReady assets<\/a> are available for creators. SimReady assets are built to real-world scale with accurate mass, physical materials and center of gravity for use within Omniverse <a href=\"https:\/\/developer.nvidia.com\/physx-sdk\">PhysX <\/a>for the most photorealistic visuals and accurate movements.<\/p>\n<h2><b>March Studio Driver Brings Superfly Super Resolution<\/b><\/h2>\n<p>Over 90% of online videos consumed by NVIDIA RTX GPU owners are 1080p resolution or lower, often resulting in upscaling that further degrades the picture despite the hardware being able to handle more.<\/p>\n<p>The solution: RTX Video Super Resolution. The new feature, available on GeForce RTX 30 and 40 Series GPUs, uses AI to improve the quality of any video streamed through Google Chrome and Microsoft Edge browsers.<\/p>\n<figure id=\"attachment_63038\" aria-describedby=\"caption-attachment-63038\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-rtx-video-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"wp-image-63038 size-large\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-rtx-video-1280w-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-63038\" class=\"wp-caption-text\">Click the image to see the differences between bicubic upscaling (left) and RTX Video Super Resolution.<\/figcaption><\/figure>\n<p>This improves video sharpness and clarity. 
Users can watch online content in its native resolution, even on high-resolution displays.<\/p>\n<p>RTX Video Super Resolution is now available in the March Studio Driver, which can be downloaded today.<\/p>\n<h2><b>New NVIDIA RTX GPUs Power Professional Creators<\/b><\/h2>\n<p>Six new professional-grade <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/ada-lovelace-architecture\/\">NVIDIA RTX GPUs<\/a> \u2014 based on the Ada Lovelace architecture \u2014 enable creators to meet the demands of their most complex workloads using laptops and desktops.<\/p>\n<p>The NVIDIA RTX 5000, RTX 4000, RTX 3500, RTX 3000 and RTX 2000 Ada Generation <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/rtx-professional-laptops\/?ncid=ref-pr-584767#cid=_ref-pr_en-us\">laptop GPUs<\/a> deliver up to 2x the performance compared with the previous generation. The <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/rtx-4000-sff\">NVIDIA RTX 4000 Small Form Factor (SFF) Ada Generation<\/a> desktop GPU features new RT Cores, Tensor Cores and CUDA cores with up to 20GB of graphics memory.<\/p>\n<p>These include the latest NVIDIA <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/rtx-professional-laptops\/#max-q-feature-1\">Max-Q<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/technologies\/rtx\/\">RTX<\/a> technologies and are backed by the <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio<\/a> platform with RTX optimizations in over 110 creative apps, NVIDIA RTX Enterprise Drivers for the highest levels of stability and performance, and exclusive AI-powered NVIDIA tools: Omniverse, <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/canvas\/\">Canvas<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/broadcasting\/broadcast-app\/\">Broadcast<\/a>.<\/p>\n<p>Professionals using these laptop GPUs can run advanced technologies like <a 
href=\"https:\/\/developer.nvidia.com\/rtx\/dlss\">DLSS 3<\/a> to increase frame rates by up to 4x compared to the previous generation, and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/enterprise\/\">Omniverse Enterprise<\/a> for real-time collaboration and simulation.<\/p>\n<p>Next-generation mobile workstations featuring NVIDIA RTX GPUs will be available starting this month.<\/p>\n<h2><b>Creative Boosts at GTC<\/b><\/h2>\n<ul>\n<li>Experience GTC for more inspiring content, expert-led sessions and a <a href=\"https:\/\/www.nvidia.com\/gtc\/keynote\/\">must-see keynote<\/a> to accelerate your life\u2019s creative work.<\/li>\n<li>Catch these sessions on Omniverse, AI and 3D workflows \u2014 live or on demand:<\/li>\n<li>Fireside Chat With OpenAI Founder Ilya Sutskever and Jensen Huang: AI Today and Vision of the Future [<a href=\"https:\/\/www.nvidia.com\/auth\/gtc?scope=openid+email+profile&amp;client_id=FNwji43RhQ7B2YKM7B6rC6N7KA_Gu3-ohkaoljY9NJ8&amp;redirect_uri=https%3A%2F%2Fevents.rainfocus.com%2Foauth%2Fnvidia%2F1629402589906001W3aJ&amp;response_type=code&amp;state=98e854cab3ba4fbeb2583c8f1f10ecc7fa367c679780e462d67fc9dbff3cdbec1f4ca0dc59be4c09357d55a505ac3ebfaac80af4b5d2fc104d278fd1c3258500b21be48bf53bb2962d0123b680a47b0f0d4fbf7397adccddb986f2312218a40a394eb51dc3ce3fb289303d56\">S52092<\/a>]<\/li>\n<li>How Generative AI Is Transforming the Creative Process: Fireside Chat With Adobe\u2019s Scott Belsky and NVIDIA\u2019s Bryan Catanzaro [<a href=\"https:\/\/www.nvidia.com\/auth\/gtc?scope=openid+email+profile&amp;client_id=FNwji43RhQ7B2YKM7B6rC6N7KA_Gu3-ohkaoljY9NJ8&amp;redirect_uri=https%3A%2F%2Fevents.rainfocus.com%2Foauth%2Fnvidia%2F1629402589906001W3aJ&amp;response_type=code&amp;state=98e854cab3ba4fbeb2583c8f1f10ecc7fa367c679780e462d67fc9dbff3cdbec1f4ca0dc59be4c09357d55a505ac3ebfaac80af4b5d2fc104d278fd1c3258500b21be48bf53bb2962d0123b680a47b050147bd7194adccdd81f78c14963415920b55a63b312e5c6cd8956ff8\">S52090<\/a>]<\/li>\n<li>Generative AI 
Demystified [<a href=\"https:\/\/www.nvidia.com\/auth\/gtc?scope=openid+email+profile&amp;client_id=FNwji43RhQ7B2YKM7B6rC6N7KA_Gu3-ohkaoljY9NJ8&amp;redirect_uri=https%3A%2F%2Fevents.rainfocus.com%2Foauth%2Fnvidia%2F1629402589906001W3aJ&amp;response_type=code&amp;state=98e854cab3ba4fbeb2583c8f1f10ecc7fa367c679780e462d67fc9dbff3cdbec1f4ca0dc59be4c09357d55a505ac3ebfaac80af4b5d2fc104d278fd1c3258500b21be48bf53bb2962d0123b681a573020e48bf7b90adccdd89dad5172d9c12d1f55b23314f918ba8c3e487f7\">S52089<\/a>]<\/li>\n<li>3D by AI: How Generative AI Will Make Building Virtual Worlds Easier [<a href=\"https:\/\/www.nvidia.com\/gtc\/session-catalog\/?search=S52163&amp;tab.day=20230321#\/\">S52163<\/a>]<\/li>\n<li>Custom World Building With AI Avatars: The Little Martians Sci-Fi Project [<a href=\"https:\/\/register.nvidia.com\/flow\/nvidia\/gtcspring2023\/attendeeportal\/page\/sessioncatalog\/session\/1666381992944001RK2w\">S51360<\/a>]<\/li>\n<li>AI-Powered, Real-Time, Markerless: The New Era of Motion Capture [<a href=\"https:\/\/register.nvidia.com\/flow\/nvidia\/gtcspring2023\/attendeeportal\/page\/sessioncatalog\/session\/1666650017670001Ab8g\">S51845<\/a>]<\/li>\n<li>3D and Beyond: How 3D Artists Can Build a Side Hustle in the Metaverse [<a href=\"https:\/\/www.nvidia.com\/gtc\/session-catalog\/?tab.catalogallsessionstab=16566177511100015Kus&amp;search=SE52117#\/\">SE52117<\/a>]<\/li>\n<li>NVIDIA Omniverse User Group [<a href=\"https:\/\/www.nvidia.com\/gtc\/session-catalog\/?search=SE52047&amp;tab.catalogallsessionstab=16566177511100015Kus&amp;search=SE52047#\/session\/1668057901010001bA2k\">SE52047<\/a>]<\/li>\n<li>Accelerate the Virtual Production Pipeline to Produce an Award-Winning Sci-Fi Short Film [<a href=\"https:\/\/register.nvidia.com\/flow\/nvidia\/gtcspring2023\/attendeeportal\/page\/sessioncatalog\/session\/1666600419805001oxiw\">S51496<\/a>]<\/li>\n<\/ul>\n<p>As part of the <a 
href=\"https:\/\/80.lv\/articles\/attend-nvidia-gtc-for-a-chance-to-win-a-geforce-rtx-4080\/\">Watch \u2018n Learn Giveaway<\/a> with valued partner <a href=\"https:\/\/80.lv\/articles\/attend-nvidia-gtc-for-a-chance-to-win-a-geforce-rtx-4080\/\">80LV<\/a>, GTC attendees who register for any <a href=\"https:\/\/nvda.ws\/3l5Jl1F\">Omniverse for creators session<\/a> \u2014 or watch on-demand before March 30 \u2014 have a chance to win a powerful <a href=\"https:\/\/nvda.ws\/3yqL04L\">GeForce RTX 4080<\/a> GPU. Simply fill out this <a href=\"https:\/\/80level.typeform.com\/nvidia?utm_source=article\">form<\/a> and tag #GTC23 and @NVIDIAOmniverse with the name of the session.<\/p>\n<p>Search the <a href=\"https:\/\/www.nvidia.com\/gtc\/session-catalog\/?tab.catalogallsessionstab=16566177511100015Kus#\/\">GTC session catalog<\/a> and check out the \u201cMedia and Entertainment\u201d and \u201cOmniverse\u201d topics for additional creator-focused sessions.<\/p>\n<h2><b>A Father-Daughter Journey Back Home<\/b><\/h2>\n<p>The short animation <i>End of Summer<\/i>, created by the Substance 3D art and development team at Adobe, may evoke a surprising amount of heart. 
That was the team\u2019s intent.<\/p>\n<p>\u201cWe loved the idea of allowing the artwork to invoke an emotion in the viewer, letting them develop their own version of a story they felt was unfolding before their eyes,\u201d said team member Wes McDermott.<\/p>\n<figure id=\"attachment_63039\" aria-describedby=\"caption-attachment-63039\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-design-boards-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-63039\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-design-boards-1280w-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-63039\" class=\"wp-caption-text\">\u201cEnd of Summer\u201d design boards.<\/figcaption><\/figure>\n<p><i>End of Summer<\/i>, a nod to stop-motion animation studios such as Laika, began as an internal Adobe Substance 3D project aimed at accomplishing two goals.<\/p>\n<p>First, to encourage a relatively new group of artists to work together as a team by leaning into a creative endeavor. 
And second, to test their pipeline feature set for the potential of the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/usd\/\">Universal Scene Description (USD)<\/a> framework.<\/p>\n<figure id=\"attachment_63042\" aria-describedby=\"caption-attachment-63042\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-concept-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-63042\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-concept-1280w-672x357.jpg\" alt=\"\" width=\"672\" height=\"357\"><\/a><figcaption id=\"caption-attachment-63042\" class=\"wp-caption-text\">Early concept work for \u201cEnd of Summer.\u201d<\/figcaption><\/figure>\n<p>The group divided the task of creating assets across the most popular 3D apps, including Adobe Substance 3D Modeler, Autodesk 3ds Max, Autodesk Maya, Blender and Maxon\u2019s Cinema 4D. Their GeForce RTX GPUs unlocked AI denoising in the viewport for fast, interactive rendering and GPU-accelerated filters to speed up and simplify material creation.<\/p>\n<div class=\"simplePullQuote right\">\n<p>\u201cNVIDIA Omniverse is a great tool for laying out and setting up dressing scenes, as well as learning about USD workflows and collaboration. We used painting and NVIDIA PhysX collision tools to place assets.\u201d \u2014 Wes McDermott<\/p>\n<\/div>\n<p>\u201cWe quickly started to see the power of using USD as not only an export format but also a way to build assets,\u201d McDermott said. \u201cUSD enables artists on the team to use whatever 3D app they felt most comfortable with.\u201d<\/p>\n<p>The Adobe team relied heavily on the Substance 3D asset library of materials, models and lights to create their studio environment. 
All textures were applied in Substance 3D Painter, where RTX-accelerated light and ambient occlusion baking optimized assets in mere moments.<\/p>\n<p>Then, they imported all models into <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/create\/\">Omniverse USD Composer<\/a>, where the team simultaneously refined and assembled assets.<\/p>\n<p>\u201cThis was also during the pandemic, and we were all quarantined in our homes,\u201d McDermott said. \u201cHaving a project we could work on together as a team helped us to communicate and be creative.\u201d<\/p>\n<figure id=\"attachment_63049\" aria-describedby=\"caption-attachment-63049\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-ov-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-63049\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-ov-1280w-672x354.jpg\" alt=\"\" width=\"672\" height=\"354\"><\/a><figcaption id=\"caption-attachment-63049\" class=\"wp-caption-text\">Accelerate scene composition, and assemble, simulate and render 3D scenes in real time in Omniverse USD Composer.<\/figcaption><\/figure>\n<p>Lastly, the artists imported the scene into Unreal Engine as a stage for lighting and rendering.<\/p>\n<figure id=\"attachment_63052\" aria-describedby=\"caption-attachment-63052\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-ue-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-63052\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-ue-1280w-672x362.jpg\" alt=\"\" width=\"672\" height=\"362\"><\/a><figcaption id=\"caption-attachment-63052\" class=\"wp-caption-text\">Final scene edits in Unreal Engine.<\/figcaption><\/figure>\n<p>McDermott stressed the 
importance of hardware in his team\u2019s workflows. \u201cThe bakers in Substance Painter are GPU accelerated and benefit greatly from NVIDIA RTX GPUs,\u201d he said. \u201cWe were also heavily working on Unreal Engine and reliant on real-time rendering.\u201d<\/p>\n<p>For more on this workflow, check out the GTC session, <a href=\"https:\/\/register.nvidia.com\/flow\/nvidia\/gtcspring2023\/attendeeportal\/page\/sessioncatalog\/session\/1666266076380001YQCn\"><i>3D Art Goes Multiplayer: Behind the Scenes of Adobe Substance\u2019s \u2018End of Summer\u2019 Project With Omniverse<\/i><\/a>. <a href=\"https:\/\/register.nvidia.com\/flow\/nvidia\/gtcspring2023\/attendeeportal\/page\/sessioncatalog\/session\/1666266076380001YQCn\">Registration<\/a> is free.<\/p>\n<figure id=\"attachment_63055\" aria-describedby=\"caption-attachment-63055\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-featured-setup-1280w.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-63055\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/03\/studio-itns-wes-mcdermott-adobe-wk49-featured-setup-1280w-672x285.png\" alt=\"\" width=\"672\" height=\"285\"><\/a><figcaption id=\"caption-attachment-63055\" class=\"wp-caption-text\">Adobe Substance 3D team lead and artist Wes McDermott.<\/figcaption><\/figure>\n<p>Check out McDermott\u2019s portfolio on <a href=\"https:\/\/www.instagram.com\/wesrmcdermott\/\">Instagram<\/a>.<\/p>\n<p><i>Follow NVIDIA Studio on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\"><i>Facebook<\/i><\/a><i>. 
Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>. Learn more about Omniverse on<\/i> <a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\"><i>Medium<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\"><i>Twitter<\/i><\/a> <i>and <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\"><i>YouTube<\/i><\/a><i> for additional resources and inspiration. Check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\"><i>forums<\/i><\/a><i>, and join our <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\"><i>Discord server<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\"><i>Twitch<\/i> <\/a><i>channel to chat with the 
community.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2023\/03\/21\/omniverse-generative-ai-unity-blender-connectors\/<\/p>\n","protected":false},"author":0,"featured_media":2920,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2919"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2919"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2919\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2920"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2919"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2919"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2919"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}