{"id":3357,"date":"2024-02-22T15:00:56","date_gmt":"2024-02-22T15:00:56","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2024\/02\/22\/add-it-to-the-toolkit-february-studio-driver-and-nvidia-app-beta-now-available\/"},"modified":"2024-02-22T15:00:56","modified_gmt":"2024-02-22T15:00:56","slug":"add-it-to-the-toolkit-february-studio-driver-and-nvidia-app-beta-now-available","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2024\/02\/22\/add-it-to-the-toolkit-february-studio-driver-and-nvidia-app-beta-now-available\/","title":{"rendered":"Add It to the Toolkit: February Studio Driver and NVIDIA App Beta Now Available"},"content":{"rendered":"<div id=\"bsf_rt_marker\">\n<p><i>Editor\u2019s note: This post is part of our weekly <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. We\u2019re also deep diving on new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/\"><i>GeForce RTX 40 Series GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>The February NVIDIA Studio Driver, designed specifically to optimize creative apps, is now available for download. Developed in collaboration with app developers, Studio Drivers undergo extensive testing to ensure seamless compatibility with creative apps while enhancing features, automating processes and speeding workflows.<\/p>\n<p>Creators can download the latest driver on the public beta of the <a href=\"https:\/\/www.nvidia.com\/en-us\/software\/nvidia-app\/\">new NVIDIA app<\/a>, the essential companion for creators and gamers with NVIDIA GPUs in their PCs and laptops. 
The NVIDIA app beta is a first step to modernize and unify the NVIDIA Control Panel, GeForce Experience and RTX Experience apps.<\/p>\n<figure id=\"attachment_69910\" aria-describedby=\"caption-attachment-69910\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-james-matthews-wk97-image5-1280w.png\"><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-james-matthews-wk97-image5-1280w-672x377.png\" alt=\"\" width=\"672\" height=\"377\"><\/p>\n<p><\/a><figcaption id=\"caption-attachment-69910\" class=\"wp-caption-text\">The NVIDIA App offers easy access to the latest Studio Drivers, a suite of AI-powered Studio apps, games and more.<\/figcaption><\/figure>\n<p>The NVIDIA app simplifies the process of keeping PCs updated with the latest NVIDIA drivers, enables quick discovery and installation of NVIDIA apps like NVIDIA Broadcast and NVIDIA Omniverse, unifies the GPU control center, and introduces a redesigned in-app overlay for convenient access to powerful recording tools. Download the <a href=\"https:\/\/www.nvidia.com\/en-us\/software\/nvidia-app\/\">NVIDIA app beta<\/a> today.<\/p>\n<p>Adobe Premiere Pro\u2019s AI-powered <a href=\"https:\/\/blogs.nvidia.com\/blog\/studio-rtx-ai-adobe-premiere-pro-photoshop-lightroom\/\">Enhance Speech tool<\/a> is <a href=\"https:\/\/blog.adobe.com\/en\/publish\/2024\/02\/22\/enhance-your-video-editing-workflows-power-adobe-video-ecosystem\">now available in general release<\/a>. Accelerated by <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/technologies\/rtx\/\">NVIDIA RTX<\/a>, the new feature removes unwanted noise and improves the quality of dialogue clips so they sound professionally recorded. 
It\u2019s 75% faster on a <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/rtx-4090\/\">GeForce RTX 4090<\/a> laptop GPU compared with an RTX 3080 Ti.<\/p>\n<figure id=\"attachment_69913\" aria-describedby=\"caption-attachment-69913\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/whats-new-pr-enhance-speech-feb-2024.jpg.img-1.jpg\"><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/whats-new-pr-enhance-speech-feb-2024.jpg.img-1-672x422.jpg\" alt=\"\" width=\"672\" height=\"422\"><\/p>\n<p><\/a><figcaption id=\"caption-attachment-69913\" class=\"wp-caption-text\">Adobe Premiere Pro\u2019s AI-powered Enhance Speech tool removes unwanted noise and improves dialogue quality.<\/figcaption><\/figure>\n<p>Have a <a href=\"https:\/\/www.nvidia.com\/en-us\/ai-on-rtx\/chat-with-rtx-generative-ai\/\">Chat with RTX<\/a>, the tech demo app that lets <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/rtx\/\">GeForce RTX<\/a> owners personalize a <a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/large-language-models\/#:~:text=Large%20language%20models%20(LLMs)%20are,content%20using%20very%20large%20datasets.\">large language model<\/a> connected to their own content. Results are fast and secure since it runs locally on a Windows RTX PC or workstation. 
<a href=\"https:\/\/www.nvidia.com\/en-us\/ai-on-rtx\/chat-with-rtx-generative-ai\/\">Download<\/a> Chat with RTX today.<\/p>\n<\/p>\n<p>And this week <i>In the NVIDIA Studio<\/i>, filmmaker James Matthews shares his short film, <i>Dive<\/i>, which was created with an Adobe Premiere Pro-powered workflow supercharged by his ASUS ZenBook Pro <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/laptops-desktops\/\">NVIDIA Studio<\/a> laptop with a GeForce RTX 4070 graphics card.<\/p>\n<h2><b>Going With the Flow<\/b><\/h2>\n<p>Matthews\u2019 goal with <i>Dive<\/i> was to create a visual and auditory representation of what it feels like to get swallowed up in the creative editing process.<\/p>\n<figure id=\"attachment_69919\" aria-describedby=\"caption-attachment-69919\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-james-matthews-wk97-nice-view-1280w.png\"><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-james-matthews-wk97-nice-view-1280w-672x250.png\" alt=\"\" width=\"672\" height=\"250\"><\/p>\n<p><\/a><figcaption id=\"caption-attachment-69919\" class=\"wp-caption-text\">Talk about a dream content creation location.<\/figcaption><\/figure>\n<p>\u201cWhen I\u2019m really deep into an edit, I sometimes feel like I\u2019m fully immersed into the film and the editing process itself,\u201d he said. \u201cIt\u2019s almost like a flow state, where time stands still and you are one with your own creativity.\u201d<\/p>\n<p>To capture and visualize that feeling, Matthews used the power of his ASUS ZenBook Pro NVIDIA Studio laptop equipped with a GeForce RTX 4070 graphics card.<\/p>\n<p>He started by brainstorming \u2014 listening to music and sketching conceptual images with pencil and paper. 
Then, Matthews added a song to his Adobe Premiere Pro timeline and created a shot list, complete with cuts and descriptions of focal range, speed, camera movement, lighting and other details.<\/p>\n<p>Next, he planned location and shooting times, paying special attention to lighting conditions.<\/p>\n<p>\u201cI always have my Premiere Pro timeline up so I can really see and feel what I need to create from the images I originally drew while building the concept in my head,\u201d Matthews said. \u201cThis helps get the pacing of each shot right, by watching it back and possibly adding it into the timeline for a test.\u201d<\/p>\n<p>Then, Matthews started editing the footage in Premiere Pro, aided by his Studio laptop. Its dedicated GPU-based NVIDIA video encoder (<a href=\"https:\/\/developer.nvidia.com\/video-codec-sdk\">NVENC<\/a>) enabled buttery-smooth playback and scrubbing of his high-resolution and multi-stream footage, saving countless hours.<\/p>\n<p>Matthews\u2019 RTX GPU accelerated a variety of <a href=\"https:\/\/www.adobe.com\/products\/premiere\/ai-video-editing.html\">AI-powered Adobe video editing tools<\/a>, such as Enhance Speech, Scene Edit Detection and Auto Color, which applies color corrections with just a few clicks.<\/p>\n<p>Finally, Matthews added sound design before exporting the final files twice as fast thanks to NVENC\u2019s dual AV1 encoders.<\/p>\n<p>\u201cThe entire edit used GPU acceleration,\u201d he shared. 
\u201cEffects in Premiere Pro, along with the NVENC video encoders on the GPU, unlocked a seamless workflow and essentially allowed me to get into my flow state faster.\u201d<\/p>\n<figure id=\"attachment_69922\" aria-describedby=\"caption-attachment-69922\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-itns-james-matthews-wk97-artist-feature-1280w.png\"><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2024\/02\/studio-itns-james-matthews-wk97-artist-feature-1280w-672x218.png\" alt=\"\" width=\"672\" height=\"218\"><\/p>\n<p><\/a><figcaption id=\"caption-attachment-69922\" class=\"wp-caption-text\">Filmmaker James Matthews.<\/figcaption><\/figure>\n<p>Watch Matthews\u2019 content on <a href=\"http:\/\/youtube.com\/JamesMatthews\">YouTube<\/a>.<\/p>\n<p><i>Follow NVIDIA Studio on <a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\">Facebook<\/a>, <\/i><a href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i> and <\/i><i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\">X<\/a><\/i><i>. 
Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>.<\/i><b>\u00a0<\/b><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/studio-driver-app-rtx-ai-adobe-premiere-pro\/<\/p>\n","protected":false},"author":0,"featured_media":3358,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3357"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=3357"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3357\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/3358"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=3357"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=3357"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=3357"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}