{"id":2879,"date":"2023-02-15T16:58:43","date_gmt":"2023-02-15T16:58:43","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2023\/02\/15\/blender-alpha-release-comes-to-omniverse-introducing-scene-optimization-tools-improved-ai-powered-character-animation\/"},"modified":"2023-02-15T16:58:43","modified_gmt":"2023-02-15T16:58:43","slug":"blender-alpha-release-comes-to-omniverse-introducing-scene-optimization-tools-improved-ai-powered-character-animation","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2023\/02\/15\/blender-alpha-release-comes-to-omniverse-introducing-scene-optimization-tools-improved-ai-powered-character-animation\/","title":{"rendered":"Blender Alpha Release Comes to Omniverse, Introducing Scene Optimization Tools, Improved AI-Powered Character Animation"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2023\/02\/15\/blender-alpha-release-omniverse\/\" data-title=\"Blender Alpha Release Comes to Omniverse, Introducing Scene Optimization Tools, Improved AI-Powered Character Animation\" data-hashtags=\"\">\n<p>Whether creating realistic digital humans that can express emotion or building immersive virtual worlds, 3D artists can reach new heights with <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse<\/a>, a platform for creating and operating <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/08\/10\/what-is-the-metaverse\/\" target=\"_blank\" rel=\"noopener\">metaverse<\/a> applications.<\/p>\n<p>A new Blender alpha release, now available in the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/download\/\" target=\"_blank\" rel=\"noopener\">Omniverse Launcher<\/a>, lets users of the 3D graphics software optimize scenes and streamline workflows with AI-powered character animations.<\/p>\n<h2><b>Save Time, Effort With New Blender Add-Ons<\/b><\/h2>\n<p>The new scene optimization add-on in the Blender 
release enables creators to fix bad geometry and generate automatic <a href=\"https:\/\/docs.blender.org\/manual\/en\/latest\/editors\/uv\/introduction.html\" target=\"_blank\" rel=\"noopener\">UVs<\/a>, or 2D maps of 3D objects. It also reduces the number of polygons that need to be rendered, boosting the scene\u2019s overall performance and significantly cutting file size as well as CPU and GPU memory usage.<\/p>\n<p>Plus, with the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/audio2face\/\" target=\"_blank\" rel=\"noopener\">Audio2Face<\/a> add-on, anyone can now accomplish what used to require a technical rigger or animator.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-62401\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/Blender_A2F_addon.gif\" alt=\"\" width=\"1280\" height=\"720\"><\/p>\n<p>A panel in the add-on makes it easier to use Blender characters in Audio2Face, an AI-enabled tool that automatically generates realistic facial expressions from an audio file.<\/p>\n<p>This new functionality eases the process of bringing generated face shapes back onto rigs \u2014 that is, digital skeletons. Shapes exported through the Universal Scene Description (<a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/usd\/\" target=\"_blank\" rel=\"noopener\">USD<\/a>) framework can be applied onto a character even if it is fully rigged, meaning its whole body already has a working digital skeleton. 
The integration of the facial shapes doesn\u2019t alter the rigs, so Audio2Face shapes and animation can be applied to characters \u2014 whether for games, shows and films, or simulations \u2014 at any point in the artist\u2019s workflow.<\/p>\n<h2><b>Realistic Character Animation Made Easy<\/b><\/h2>\n<p>Audio2Face puts AI-powered facial animation in the hands of every Blender user who works with Omniverse.<\/p>\n<p>Using the new Blender <a href=\"https:\/\/docs.omniverse.nvidia.com\/con_connect\/con_connect\/blender\/audio2face.html\" target=\"_blank\" rel=\"noopener\">add-on for Audio2Face<\/a>, animator and popular YouTuber Marko Matosevic, aka <a href=\"https:\/\/blogs.nvidia.com\/blog\/2022\/07\/15\/marko-matosevic-omniverse-animator\/\" target=\"_blank\" rel=\"noopener\">Markom 3D<\/a>, rigged and animated a <i>Battletoads<\/i>-inspired character using just an audio file.<\/p>\n<p>Australia-based Matosevic joined Dave Tyner, a technical evangelist at NVIDIA, on a livestream to showcase their <a href=\"https:\/\/youtu.be\/Ir4inY0KveI\" target=\"_blank\" rel=\"noopener\">live collaboration<\/a> across time zones, connecting 3D applications in a real-time Omniverse jam session. 
The two used the new Blender alpha release with Omniverse to make progress on one of Matosevic\u2019s short animations.<\/p>\n<p>The new Blender release was also on display last month at CES in <i>The Artists\u2019 Metaverse<\/i>, a demo featuring seven artists across time zones who used <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/cloud\/\" target=\"_blank\" rel=\"noopener\">Omniverse Nucleus Cloud<\/a>, Autodesk, SideFX, Unreal Engine and more to create a short cinematic in real time.<\/p>\n<p>Creators can save time and simplify processes with the add-ons available in Omniverse\u2019s Blender build.<\/p>\n<p>NVIDIA principal artist Zhelong Xu, for example, used Blender and Omniverse to visualize an <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/01\/18\/in-the-nvidia-studio-january-18\/\" target=\"_blank\" rel=\"noopener\">NVIDIA-themed \u201cYear of the Rabbit\u201d<\/a> zodiac.<\/p>\n<p>\u201cI got the desired effect very quickly and tested a variety of lighting effects,\u201d said Xu, an award-winning 3D artist who\u2019s previously worked at top game studio Tencent and made key contributions to an animated show on Netflix.<\/p>\n<h2><b>Get Plugged Into the Omniverse<\/b><\/h2>\n<p>Learn more about Blender and Omniverse integrations by watching a community livestream on Wednesday, Feb. 15, at 11 a.m. PT via <a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\" target=\"_blank\" rel=\"noopener\">Twitch<\/a> and <a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\" target=\"_blank\" rel=\"noopener\">YouTube<\/a>.<\/p>\n<p>And the session catalog for <a href=\"https:\/\/www.nvidia.com\/gtc\/\" target=\"_blank\" rel=\"noopener\">NVIDIA GTC<\/a>, a global AI conference running online March 20-23, features hundreds of curated talks and workshops <a href=\"https:\/\/www.nvidia.com\/gtc\/sessions\/metaverse\/\" target=\"_blank\" rel=\"noopener\">for 3D creators and developers<\/a>. 
Register free to hear from NVIDIA experts and industry luminaries on the future of technology.<\/p>\n<p><i>Creators and developers can <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/download\/\" target=\"_blank\" rel=\"noopener\"><i>download NVIDIA Omniverse free<\/i><\/a><i>. Enterprises can try Omniverse Enterprise free on <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/launchpad\/3d-design\/omniverse-enterprise\/\" target=\"_blank\" rel=\"noopener\"><i>NVIDIA LaunchPad<\/i><\/a><i>. Follow NVIDIA Omniverse on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\" target=\"_blank\" rel=\"noopener\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Medium<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\" target=\"_blank\" rel=\"noopener\"><i>YouTube<\/i><\/a><i> for additional resources and inspiration. 
Check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\" target=\"_blank\" rel=\"noopener\"><i>forums<\/i><\/a><i>, and join our <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\" target=\"_blank\" rel=\"noopener\"><i>Discord server<\/i><\/a> <i>and <\/i><a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\" target=\"_blank\" rel=\"noopener\"><i>Twitch<\/i><\/a><i> channel<\/i><i> to chat with the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/community\/\" target=\"_blank\" rel=\"noopener\"><i>community<\/i><\/a><i>.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2023\/02\/15\/blender-alpha-release-omniverse\/<\/p>\n","protected":false},"author":0,"featured_media":2880,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2879"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2879"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2879\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2880"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2879"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2879"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/w
p-json\/wp\/v2\/tags?post=2879"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}