{"id":3015,"date":"2023-05-30T13:51:58","date_gmt":"2023-05-30T13:51:58","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2023\/05\/30\/nvidia-rtx-transforming-14-inch-laptops-plus-simultaneous-screen-encoding-and-may-studio-driver-available-today\/"},"modified":"2023-05-30T13:51:58","modified_gmt":"2023-05-30T13:51:58","slug":"nvidia-rtx-transforming-14-inch-laptops-plus-simultaneous-screen-encoding-and-may-studio-driver-available-today","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2023\/05\/30\/nvidia-rtx-transforming-14-inch-laptops-plus-simultaneous-screen-encoding-and-may-studio-driver-available-today\/","title":{"rendered":"NVIDIA RTX Transforming 14-Inch Laptops, Plus Simultaneous Screen Encoding and May Studio Driver Available Today"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2023\/05\/30\/computex-studio-laptops-encoding-capcut\/\" data-title=\"NVIDIA RTX Transforming 14-Inch Laptops, Plus Simultaneous Screen Encoding and May Studio Driver Available Today\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is part of our weekly <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. 
We\u2019re also deep diving on new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/\"><i>GeForce RTX 40 Series GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>New 14-inch <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/laptops-desktops\/\">NVIDIA Studio laptops<\/a>, equipped with <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/laptops\/\">GeForce RTX 40 Series Laptop GPUs<\/a>, give creators peak portability with a significant increase in performance over the last generation. AI-dedicated hardware called Tensor Cores power time-saving tasks in popular apps like DaVinci Resolve. Ray Tracing Cores together with our neural rendering technology, DLSS 3, boost performance in real-time 3D rendering applications like D5 Render and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse<\/a>.<\/p>\n<p>NVIDIA also introduced a new method for accelerating video encoding. Simultaneous Scene Encoding sends independent groups of frames, or scenes, to each NVIDIA Encoder (NVENC). With multiple NVENCs fully utilized, video export times can be reduced significantly, without affecting image quality. The first software to integrate the technology is the popular video editing app <a href=\"https:\/\/www.capcut.com\/tools\/desktop-video-editor?utm_source=nvidia&amp;utm_medium=referral\">CapCut<\/a>.<\/p>\n<p>The May Studio Driver is ready for download now. 
This month\u2019s release includes support for updates to MAGIX VEGAS Pro, D5 Render and VLC Media Player \u2014 in addition to CapCut \u2014 plus AI model optimizations for popular apps.<\/p>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/events\/computex\/\">COMPUTEX<\/a>, Asia\u2019s biggest annual tech trade show, kicks off a flurry of updates, bringing creators new tools and performance from the NVIDIA Studio platform \u2014 and plenty of AI power.<\/p>\n<p>During his <a href=\"https:\/\/www.nvidia.com\/en-us\/events\/computex\/\">keynote address<\/a> at COMPUTEX, NVIDIA founder and CEO Jensen Huang introduced a new generative AI to support game development, <a href=\"https:\/\/developer.nvidia.com\/omniverse\/ace\">NVIDIA Avatar Cloud Engine (ACE) for Games<\/a>. The platform adds intelligence to non-playable characters (NPCs) in gaming, with AI-powered natural language interactions.<\/p>\n<p>The <i>Kairos <\/i>demo \u2014 a joint venture with Convai led by NVIDIA Creative Director Gabriele Leone \u2014 demonstrates how a single model can transform into a living, breathing, lifelike character this week<i> In the NVIDIA Studio<\/i>.<\/p>\n<h2><b>Ultraportable, Ultimate Performance<\/b><\/h2>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/laptops-desktops\/\">NVIDIA Studio laptops<\/a>, powered by the NVIDIA Ada Lovelace architecture, are the world\u2019s fastest laptops for creating and gaming.<\/p>\n<p>For the first time, GeForce RTX performance comes to 14-inch devices. 
In the process, it\u2019s transforming the ultraportable market, delivering the ultimate combination of performance and portability.<\/p>\n<figure id=\"attachment_64442\" aria-describedby=\"caption-attachment-64442\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ASUS-Zenbook-Pro-14.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64442\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ASUS-Zenbook-Pro-14-672x299.png\" alt=\"\" width=\"672\" height=\"299\"><\/a><figcaption id=\"caption-attachment-64442\" class=\"wp-caption-text\">ASUS Zenbook Pro 14 comes with up to a GeForce RTX 4070 Laptop GPU.<\/figcaption><\/figure>\n<p>These purpose-built creative powerhouses do it all. They\u2019re backed by the <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio<\/a> platform, which supercharges over 110 creative apps, provides lasting stability with NVIDIA Studio Drivers and includes a powerful suite of AI-powered Studio software, such as <a href=\"https:\/\/www.nvidia.com\/en-in\/omniverse\/creators\/\">NVIDIA Omniverse<\/a>, <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/canvas\/\">Canvas<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/broadcasting\/broadcast-app\/\">Broadcast<\/a>.<\/p>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/gaming-laptops\/max-q-technologies\/\">Fifth-generation Max-Q technologies<\/a> use an advanced suite of AI-powered features to optimize laptop performance, power and acoustics for peak efficiency. Battery life improves by up to 70%. 
And DLSS is now optimized for laptops, giving creators incredible 3D rendering performance with <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/technologies\/dlss\/\">DLSS 3 optical multi-frame generation and super resolution<\/a> in Omniverse and D5 Render, and in hit games like <i>Cyberpunk 2077<\/i>.<\/p>\n<p>As the ultraportable market heats up, PC laptop makers are giving creators more options than ever. Recently announced models, with more on the way, include the <a href=\"https:\/\/www.acer.com\/us-en\/laptops\/swift\/swift-x-14\">Acer Swift X 14<\/a>, <a href=\"https:\/\/www.asus.com\/laptops\/for-creators\/zenbook\/zenbook-pro-14-oled-ux6404\/\">ASUS Zenbook Pro 14<\/a>, <a href=\"https:\/\/www.gigabyte.com\/Laptop\/AERO-14-OLED--2023#kf\">GIGABYTE Aero 14<\/a>, Lenovo\u2019s Slim Pro 9i 14 and <a href=\"https:\/\/us.msi.com\/Laptop\/Stealth-14-Studio-A13VX\">MSI Stealth 14<\/a>.<\/p>\n<p>Visit the <a href=\"https:\/\/store.nvidia.com\/en-us\/studio\/store\/?page=1&amp;limit=9&amp;locale=en-us\">Studio Shop<\/a> for the latest GeForce RTX-powered NVIDIA Studio systems and explore the range of high-performance Studio products.<\/p>\n<h2><b>Simultaneous Scene Encoding<\/b><\/h2>\n<p>The recent release of Video Codec SDK 12.1 added multi-encoder support, which can cut export times in half. Our previously announced split encoding method \u2014 which splits a frame and sends each section to an encoder \u2014 now has an API that app developers can expose to their end users. Previously, split encoding would be engaged automatically for 4K or higher video and the faster export presets. With this update, developers can simply allow users to toggle on this option.<\/p>\n<p>Video Codec SDK 12.1 also introduces a new encoding method: simultaneous scene encoding. Video apps can split groups of pictures or scenes as they\u2019re sent into the rendering pipeline. 
Each group can then be rendered independently and ordered properly in the final output.<\/p>\n<p>The result is a significant increase in encoding speed \u2014 approximately 80% for dual encoders, and further increases when more than two NVENCs are present, like in the NVIDIA RTX 6000 Ada Generation professional GPU. Image quality is also improved compared to current split encoding methods, where individual frames are sent to each encoder and then stitched back together in the final output.<\/p>\n<p>CapCut users will be the first to experience this benefit on RTX GPUs with two or more encoders, starting with the software\u2019s current release, available today.<\/p>\n<h2><b>Massive May Studio Driver Drops<\/b><\/h2>\n<p>The May Studio Driver features significant upgrades and optimizations.<\/p>\n<p>MAGIX partnered with NVIDIA to move its line of VEGAS Pro AI models to WinML, enabling video editors to apply AI effects much faster.<\/p>\n<p>The driver also optimizes AI features for applications running on WinML, including Adobe Photoshop, Lightroom, MAGIX VEGAS Pro, ON1 and DxO, among many others.<\/p>\n<p>The real-time ray tracing renderer D5 Render also added <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/technologies\/dlss\/\">NVIDIA DLSS 3<\/a>, delivering a smoother viewport experience with super fluid motion when navigating scenes, massively benefiting architects, designers, interior designers and all professional 3D artists.<\/p>\n<figure id=\"attachment_64445\" aria-describedby=\"caption-attachment-64445\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/D5-Render-v2.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64445\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/D5-Render-v2-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-64445\" class=\"wp-caption-text\">D5 Render and DLSS 3 work brilliantly to create 
photorealistic imagery.<\/figcaption><\/figure>\n<p><a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/02\/28\/rtx-video-super-resolution\/\">NVIDIA RTX Video Super Resolution<\/a> \u2014 upscaling technology that uses AI and RTX Tensor Cores to improve video quality \u2014 is now fully integrated into VLC Media Player, no longer requiring a separate download. <a href=\"https:\/\/nvidia.custhelp.com\/app\/answers\/detail\/a_id\/5448\/~\/rtx-video-super-resolution-faq\">Learn more<\/a>.<\/p>\n<p>Download <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/geforce-experience\/\">GeForce Experience<\/a> or <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/software\/rtx-experience\/\">NVIDIA RTX Experience<\/a> for the easiest way to upgrade and to be notified of the latest driver releases.<\/p>\n<h2><b>Gaming\u2019s ACE in the Hole<\/b><\/h2>\n<p>During his <a href=\"https:\/\/www.nvidia.com\/en-us\/events\/computex\/\">keynote address<\/a> at COMPUTEX, NVIDIA founder and CEO Jensen Huang <a href=\"https:\/\/developer.nvidia.com\/blog\/announcing-nvidia-ace-for-games-generative-ai-sparks-life-into-virtual-characters\/\">introduced NVIDIA ACE for Games<\/a>, a new AI model foundry that adds intelligence to NPCs in gaming with AI-powered natural language interactions.<\/p>\n<p>Game developers and studios can use ACE for Games to build and deploy customized speech, conversation and animation AI models in their software and games. 
The AI technology can transform entire worlds, breathing new life into individuals, groups or an entire town\u2019s worth of characters \u2014 the sky\u2019s the limit.<\/p>\n<p>ACE for Games builds on technology inside <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\">NVIDIA Omniverse<\/a>, an open development platform for building and operating metaverse applications, including optimized AI foundation models for speech, conversation and character animation.<\/p>\n<p>This includes <a href=\"https:\/\/developer.nvidia.com\/nemo\">NVIDIA NeMo<\/a> for conversational AI fine-tuned for game characters, <a href=\"https:\/\/developer.nvidia.com\/riva\">NVIDIA Riva<\/a> for automatic speech recognition and text-to-speech, and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/audio2face\/\">Omniverse Audio2Face<\/a> for instantly creating expressive facial animation of game characters to match any speech track. Audio2Face features Omniverse connectors for Unreal Engine 5, so developers can add facial animation directly to MetaHuman characters.<\/p>\n<h2><b>Seeing Is Believing: Kairos Demo<\/b><\/h2>\n<p>Huang debuted ACE for Games for COMPUTEX attendees \u2014 and provided a sneak peek at the future of gaming \u2014 in a demo dubbed <i>Kairos<\/i>.<\/p>\n<p><a href=\"https:\/\/www.convai.com\/\">Convai<\/a>, an <a href=\"https:\/\/www.nvidia.com\/en-us\/startups\/\">NVIDIA Inception<\/a> startup, specializes in cutting-edge conversational AI for virtual game worlds. NVIDIA Lightspeed Studios, led by Creative Director and 3D artist Gabriele Leone, built the remarkably realistic scene and demo. Together, they\u2019ve showcased the opportunity developers have to use NVIDIA ACE for Games to build NPCs.<\/p>\n<p>In the demo, players interact with Jin, the proprietor of a ramen shop. 
The photorealistic shop was modeled after the <a href=\"https:\/\/blogs.nvidia.com\/blog\/2022\/04\/25\/making-of-omniverse-ramen-shop\/\">virtual ramen shop<\/a> built in NVIDIA Omniverse.<\/p>\n<p>For this, an NVIDIA artist traveled to a <i>real <\/i>ramen restaurant in Tokyo and collected over 2,000 high-resolution reference images and videos. Each captured aspects of the kitchen\u2019s distinct areas for cooking, cleaning, food preparation and storage. \u201cWe probably used 70% of the existing models, 30% new and 80% retextures,\u201d said Leone.<\/p>\n<figure id=\"attachment_64448\" aria-describedby=\"caption-attachment-64448\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ace-for-games-4k-partner-kairos-06-3840x2160-1-scaled.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64448\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ace-for-games-4k-partner-kairos-06-3840x2160-1-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-64448\" class=\"wp-caption-text\">Kairos: Beautifully rendered in Autodesk Maya, Blender, Unreal Engine 5 and NVIDIA Omniverse.<\/figcaption><\/figure>\n<p>In the digital ramen shop, objects were modeled in Autodesk 3ds Max, with RTX-accelerated AI denoising, and in Blender, which benefits from RTX-accelerated OptiX ray tracing for smooth, interactive movement in the viewport \u2014 all powered by the team\u2019s arsenal of GeForce RTX 40 Series GPUs.<\/p>\n<div class=\"simplePullQuote right\">\n<p>\u201cIt\u2019s fair to say that without GeForce RTX GPUs and Omniverse, this project would\u2019ve been impossible to complete without adding considerable time\u201d \u2014 Gabriele Leone<\/p>\n<\/div>\n<p>The texture phase in Adobe Substance 3D Painter used NVIDIA Iray rendering technology with RTX-accelerated light and ambient occlusion, baking large assets in mere moments.<\/p>\n<p>Next, Omniverse and the <a 
href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/audio2face\/\">Audio2Face app<\/a>, via the Unreal Engine 5 Connector, allowed the team to add facial animation and audio directly to the ramen shop NPC.<\/p>\n<p>Although he is an NPC, Jin replies to natural language realistically and consistent with the narrative backstory \u2014 all with the help of <a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/data-science\/generative-ai\/\">generative AI<\/a>.<\/p>\n<p>Lighting and animation work was done in Unreal Engine 5 aided by NVIDIA DLSS using AI to upscale frames rendered at lower resolution while still retaining high-fidelity detail, again increasing interactivity in the viewport for the team.<\/p>\n<figure id=\"attachment_64451\" aria-describedby=\"caption-attachment-64451\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ace-for-games-4k-partner-kairos-17-3840x2160-1-scaled.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64451\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ace-for-games-4k-partner-kairos-17-3840x2160-1-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-64451\" class=\"wp-caption-text\">Direct your ramen order to the NPC, ahem, interactive, conversational character.<\/figcaption><\/figure>\n<p>Suddenly, NPCs just got a whole lot more engaging. And they\u2019ve never looked this good.<\/p>\n<p><i>Follow NVIDIA Studio on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\"><i>Facebook<\/i><\/a><i>. 
Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>.<\/i><b>\u00a0<\/b><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2023\/05\/30\/computex-studio-laptops-encoding-capcut\/<\/p>\n","protected":false},"author":0,"featured_media":3016,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3015"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=3015"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/3015\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/3016"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=3015"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=3015"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=3015"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}