{"id":3611,"date":"2024-05-21T15:10:05","date_gmt":"2024-05-21T15:10:05","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2024\/05\/21\/a-superbloom-of-updates-in-the-may-studio-driver-gives-fresh-life-to-content-creation\/"},"modified":"2024-05-21T15:10:05","modified_gmt":"2024-05-21T15:10:05","slug":"a-superbloom-of-updates-in-the-may-studio-driver-gives-fresh-life-to-content-creation","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2024\/05\/21\/a-superbloom-of-updates-in-the-may-studio-driver-gives-fresh-life-to-content-creation\/","title":{"rendered":"A Superbloom of Updates in the May Studio Driver Gives Fresh Life to Content Creation"},"content":{"rendered":"<div>\n\t\t<span class=\"bsf-rt-reading-time\"><span class=\"bsf-rt-display-label\"><\/span> <span class=\"bsf-rt-display-time\"><\/span> <span class=\"bsf-rt-display-postfix\"><\/span><\/span><\/p>\n<p><i>Editor\u2019s note: This post is part of our <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. 
We\u2019re also deep diving on new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/rtx\/\"><i>GeForce RTX GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>A superbloom of creative app updates, included in the May Studio Driver, is ready for <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/drivers\/\">download<\/a> today.<\/p>\n<p>New GPU-accelerated and AI-powered apps and features are now available, backed by the <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio<\/a> platform.<\/p>\n<p>And this week\u2019s featured<i> In the NVIDIA Studio <\/i>artist, Yao Chan, created the whimsical, spring-inspired 3D scene <i>By the Window<\/i> using her <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/desktop-graphics\/\">NVIDIA RTX GPU<\/a>.<\/p>\n<h2><b>May\u2019s Creative App Rundown<\/b><\/h2>\n<p>RTX Video is a collection of AI enhancements that improves the quality of video played on apps like YouTube, Prime Video and Disney+. <a href=\"https:\/\/blogs.nvidia.com\/blog\/rtx-video-super-resolution\/\">RTX Video Super Resolution<\/a> (VSR) upscales video for cleaner, crisper imagery, while <a href=\"https:\/\/blogs.nvidia.com\/blog\/rtx-video-hdr-remix-studio-driver\/\">RTX Video HDR<\/a> transforms standard dynamic range video content to high-dynamic range (HDR10), improving its visibility, details and vibrancy.<\/p>\n<p>Mozilla Firefox, the third most popular PC browser, has added support for RTX VSR and HDR, including AI-enhanced upscaling, de-artifacting and HDR effects for most streamed videos.<\/p>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/rtx-remix\/\">NVIDIA RTX Remix<\/a> allows modders to easily capture game assets, automatically enhance materials with <a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/generative-ai\/\">generative AI<\/a> tools and create stunning RTX remasters with full ray tracing. 
RTX Remix recently added DLSS 3.5 support to the modding toolkit, featuring Ray Reconstruction, an AI model that creates higher-quality images for intensive ray-traced games and apps.<\/p>\n<p>Game developers interested in creating their own ray-traced mod for a classic game can <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/rtx-remix\/\">download the RTX Remix Beta<\/a> and watch <a href=\"https:\/\/www.youtube.com\/playlist?list=PL4w6jm6S2lzvgJ97T1_VbLGBR_l6zzOUm\">tutorial videos<\/a> to get a head start.<\/p>\n<p>Maxon\u2019s Cinema 4D modeling software empowers 3D video effects artists and motion designers to create complex scenes with ease. Version 2024.4\u2019s integration with C4D\u2019s Unified Simulation systems now enables more precise control of emission fields to modify simulation behaviors.<\/p>\n<p>This integration unlocks the ability to orchestrate object interactions with different simulation types, including Pyro, Cloth, soft bodies and rigid bodies. These simulations run considerably faster on RTX GPUs, with gains depending on the GPU in use.<\/p>\n<p>The <a href=\"https:\/\/www.nvidia.com\/en-us\/ai-data-science\/audio2face\/\">NVIDIA Omniverse Audio2Face<\/a> app for iClone 8 uses AI to produce expressive facial animations solely from audio input. In addition to generating natural lip-sync animations for dialogue, the latest standalone release supports multilingual lip-sync and singing animations, as well as full-spectrum editing with slider controls and a keyframe editor.<\/p>\n<p>Along with accurate lip-sync, facial animations are significantly enhanced by nuanced facial expressions. 
By pairing Audio2Face with the iClone AccuFACE plug-in, powered by NVIDIA Maxine, Reallusion provides a flexible and multifaceted approach to facial animation, laying the groundwork with audio tracks and adding subtle expressions with webcams.<\/p>\n<p>These latest AI-powered tools and creative app power-ups are available for NVIDIA RTX and GeForce RTX GPU owners.<\/p>\n<h2><b>All Things Small, Bright and Beautiful<\/b><\/h2>\n<p>China-based 3D visual effects artist Yao Chan finds inspiration and joy in the small things in life.<\/p>\n<p>\u201cAs the weather gradually warms up, everything is rejuvenating and flowers are blooming,\u201d said Chan. \u201cI want to create an illustration that captures the warm and bright atmosphere of spring.\u201d<\/p>\n<p>Her 3D scene <i>By the Window<\/i> closely resembles a corner of her home filled with various succulent plants, pots and neatly arranged gardening tools.<\/p>\n<p>\u201cI think everyone has a place or moment that warms their heart in one way or another, and that\u2019s an emotion I want to share with my audience,\u201d said the artist.<\/p>\n<p>Chan usually first sketches out her ideas in Adobe Photoshop, but with her real-life reference already set, she dove right into blocking out the scene in Blender.<\/p>\n<p>Since she wanted to use a hand-painted texture style for modeling the vases and pots, Chan added Blender\u2019s displace modifier and used a Voronoi texture to give the shapes a handcrafted effect.<\/p>\n<figure id=\"attachment_71710\" aria-describedby=\"caption-attachment-71710\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-particles-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71710\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-particles-1280w-672x271.jpg\" alt=\"\" width=\"672\" height=\"271\"><\/a><figcaption id=\"caption-attachment-71710\" 
class=\"wp-caption-text\">Chan used hair from the particle system and played with roughness, kink and hair shape effects to accurately model fluffy plants like Kochia scoparia and moss.<\/figcaption><\/figure>\n<p>Blender Cycles\u2019 RTX-accelerated OptiX ray tracing in the viewport, unlocked by Chan\u2019s GeForce RTX GPU, ensured smooth, interactive modeling throughout her creative workflow.<\/p>\n<figure id=\"attachment_71713\" aria-describedby=\"caption-attachment-71713\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-clay-wire-render-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71713\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-clay-wire-render-1280w-366x500.jpg\" alt=\"\" width=\"366\" height=\"500\"><\/a><figcaption id=\"caption-attachment-71713\" class=\"wp-caption-text\">Modeling and mesh work \u2014 complete.<\/figcaption><\/figure>\n<p>For texturing, Chan referred to former <i>In the NVIDIA Studio<\/i> featured artist <a href=\"https:\/\/www.youtube.com\/watch?v=UfSw6428bcc&amp;t=174s\">SouthernShotty\u2019s tutorial<\/a>, using the precision of geometry nodes to highlight the structure of objects and gradient nodes to control the color and transparency of plants.<\/p>\n<figure id=\"attachment_71716\" aria-describedby=\"caption-attachment-71716\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-node-zone-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71716\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-node-zone-1280w-672x361.jpg\" alt=\"\" width=\"672\" height=\"361\"><\/a><figcaption id=\"caption-attachment-71716\" class=\"wp-caption-text\">Chan entered the node zone in Blender.<\/figcaption><\/figure>\n<p>Chan then used the 
\u201cpointiness\u201d node to simulate the material of ceramic flower pots.<\/p>\n<figure id=\"attachment_71719\" aria-describedby=\"caption-attachment-71719\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-pointiness-node-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71719\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-pointiness-node-1280w-672x340.jpg\" alt=\"\" width=\"672\" height=\"340\"><\/a><figcaption id=\"caption-attachment-71719\" class=\"wp-caption-text\">The \u201cpointiness\u201d node helped simulate materials.<\/figcaption><\/figure>\n<p>Lighting was fairly straightforward, consisting of sunlight, a warm-toned key light, a cool-toned fill light and a small light source to illuminate the area beneath the table.<\/p>\n<figure id=\"attachment_71722\" aria-describedby=\"caption-attachment-71722\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-lights-1280w.jpg-scaled.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71722\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-lights-1280w.jpg-672x357.jpg\" alt=\"\" width=\"672\" height=\"357\"><\/a><figcaption id=\"caption-attachment-71722\" class=\"wp-caption-text\">Several lights added brightness to the scene.<\/figcaption><\/figure>\n<p>Chan also added a few volume lights in front of the camera.<\/p>\n<figure id=\"attachment_71725\" aria-describedby=\"caption-attachment-71725\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-side-lights-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71725\" 
src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-side-lights-1280w-672x425.jpg\" alt=\"\" width=\"672\" height=\"425\"><\/a><figcaption id=\"caption-attachment-71725\" class=\"wp-caption-text\">Lighting from the side.<\/figcaption><\/figure>\n<p>Finally, to give the image a more vintage look, Chan added noise to the final rendered image in compositing.<\/p>\n<figure id=\"attachment_71728\" aria-describedby=\"caption-attachment-71728\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-compositing-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71728\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-compositing-1280w-672x478.jpg\" alt=\"\" width=\"672\" height=\"478\"><\/a><figcaption id=\"caption-attachment-71728\" class=\"wp-caption-text\">Final compositing work.<\/figcaption><\/figure>\n<p>Chan\u2019s AI-powered simulations and viewport renderings were accelerated by her RTX GPU.<\/p>\n<p>\u201cRTX GPUs accelerate workflows and ensure fluent video editing,\u201d she said.<\/p>\n<p>Check out Chan\u2019s latest work on <a href=\"https:\/\/www.instagram.com\/magicwandyc\/\">Instagram<\/a>.<\/p>\n<figure id=\"attachment_71731\" aria-describedby=\"caption-attachment-71731\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-featured-setup-1280w.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-71731\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/05\/studio-itns-yao-chan-wk110-featured-setup-1280w-672x183.png\" alt=\"\" width=\"672\" height=\"183\"><\/a><figcaption id=\"caption-attachment-71731\" class=\"wp-caption-text\">3D artist Yao Chan.<\/figcaption><\/figure>\n<p><i>Follow NVIDIA Studio on <\/i><a 
href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\"><i>X<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\"><i>Facebook<\/i><\/a><i>. Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>.\u00a0<\/i><\/p>\n<p>\t\t<!-- .entry-footer --><\/p><\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/studio-may-driver-rtx-video-remix\/<\/p>\n","protected":false},"author":0,"featured_media":3612,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3611"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=3611"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3611\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/3612"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=3611"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=3611"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/ma
chine-learning\/wp-json\/wp\/v2\/tags?post=3611"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}