{"id":2987,"date":"2023-05-18T14:10:37","date_gmt":"2023-05-18T14:10:37","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2023\/05\/18\/beyond-fast-geforce-rtx-4060-gpu-family-gives-creators-more-options-to-accelerate-workflows-starting-at-299\/"},"modified":"2023-05-18T14:10:37","modified_gmt":"2023-05-18T14:10:37","slug":"beyond-fast-geforce-rtx-4060-gpu-family-gives-creators-more-options-to-accelerate-workflows-starting-at-299","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2023\/05\/18\/beyond-fast-geforce-rtx-4060-gpu-family-gives-creators-more-options-to-accelerate-workflows-starting-at-299\/","title":{"rendered":"Beyond Fast: GeForce RTX 4060 GPU Family Gives Creators More Options to Accelerate Workflows, Starting at $299"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2023\/05\/18\/geforce-rtx-4060-ti\/\" data-title=\"Beyond Fast: GeForce RTX 4060 GPU Family Gives Creators More Options to Accelerate Workflows, Starting at $299\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is part of our weekly <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. 
We\u2019re also diving deep into new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/\"><i>GeForce RTX 40 Series GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>The GeForce RTX 4060 family will be available starting next week, bringing massive creator benefits to the popular 60-class GPUs.<\/p>\n<p>The latest GPUs in the 40 Series come backed by <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio<\/a> technologies, including hardware acceleration for 3D, video and AI workflows; optimizations for <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/rtx\/\">RTX<\/a> hardware in over 110 of the most popular creative apps; and exclusive <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/resources\/\">Studio apps<\/a> like <a href=\"https:\/\/resources.nvidia.com\/en-us-omniverse-creators\/\">Omniverse<\/a>, <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/broadcasting\/broadcast-app\/\">Broadcast<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/canvas\/\">Canvas<\/a>.<\/p>\n<p>Real-time ray-tracing renderer D5 Render introduced support for <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/technologies\/dlss\/\">NVIDIA DLSS 3<\/a> technology, enabling super-smooth real-time rendering experiences, so creators can work with larger scenes without sacrificing speed or interactivity.<\/p>\n<p>Plus, the new <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/05\/10\/elara-systems-omniverse-creator\/\"><i>Into the Omniverse<\/i><\/a> series highlights the latest advancements to <a href=\"https:\/\/resources.nvidia.com\/en-us-omniverse-creators\/\">NVIDIA Omniverse<\/a>, a platform furthering the evolution of the <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/08\/10\/what-is-the-metaverse\/\">metaverse<\/a> with the OpenUSD framework. 
The series showcases how artists, developers and enterprises can use the open development platform to transform their 3D workflows. The first installment highlights an update coming soon to the Adobe Substance 3D Painter Connector.<\/p>\n<p>In addition, NVIDIA 3D artist Daniel Barnes <a href=\"https:\/\/blogs.nvidia.com\/blog\/2022\/06\/29\/in-the-nvidia-studio-june-29\/\">returns<\/a> this week <i>In the NVIDIA Studio<\/i> to share his mesmerizing, whimsical animation, <i>Wormhole 00527<\/i>.<\/p>\n<h2><b>Beyond Fast<\/b><\/h2>\n<p>The GeForce RTX 4060 family is powered by the <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/news\/rtx-40-series-graphics-cards-announcements\/#marvels-of-ada-lovelace-architecture\">ultra-efficient NVIDIA Ada Lovelace architecture<\/a> with fourth-generation Tensor Cores for AI content creation, third-generation RT Cores and compatibility with DLSS 3 for ultra-fast 3D rendering, as well as the eighth-generation NVIDIA encoder (NVENC), now with support for AV1.<\/p>\n<figure id=\"attachment_64128\" aria-describedby=\"caption-attachment-64128\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-rtx-4060ti-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64128\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-rtx-4060ti-1280w-672x293.jpg\" alt=\"\" width=\"672\" height=\"293\"><\/a><figcaption id=\"caption-attachment-64128\" class=\"wp-caption-text\">The GeForce RTX 4060 Ti GPU.<\/figcaption><\/figure>\n<p>3D modelers can build and edit realistic 3D models in real time, up to 45% faster than the previous generation, thanks to third-generation RT Cores, DLSS 3 and the NVIDIA Omniverse platform.<\/p>\n<figure id=\"attachment_64131\" aria-describedby=\"caption-attachment-64131\" class=\"wp-caption aligncenter\"><a 
href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/nvidia-studio-itns-wk57-rtx-4060-family-perf-chart-5k-dark.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64131\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/nvidia-studio-itns-wk57-rtx-4060-family-perf-chart-5k-dark-672x432.png\" alt=\"\" width=\"672\" height=\"432\"><\/a><figcaption id=\"caption-attachment-64131\" class=\"wp-caption-text\">Tested on GeForce RTX 4060 and 3060 GPUs. Maya with Arnold 2022 (7.1.1) measures render time of NVIDIA SOL 3D model. DaVinci Resolve measures FPS applying Magic Mask effect \u201cFaster\u201d quality setting to 4K resolution. ON1 Resize AI measures time required to apply effect to batch of 10 photos. Time measurement is normalized for easier comparison across tests.<\/figcaption><\/figure>\n<p>Video editors specializing in Adobe Premiere Pro, Blackmagic Design\u2019s DaVinci Resolve and more have at their disposal a variety of AI-powered effects, such as auto-reframe, magic mask and depth estimation. Fourth-generation Tensor Cores seamlessly hyper-accelerate these effects, so creators can stay in their flow states.<\/p>\n<p>Broadcasters can jump into next-generation livestreaming with the eighth-generation NVENC with support for AV1. 
The new encoder is 40% more efficient, making livestreams appear as if there were a 40% increase in bitrate \u2014 a big boost in image quality that enables 4K streaming on apps like OBS Studio and platforms such as YouTube and Discord.<\/p>\n<figure id=\"attachment_64134\" aria-describedby=\"caption-attachment-64134\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-gpu-comparison-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64134\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-gpu-comparison-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64134\" class=\"wp-caption-text\">10 Mbps with default OBS streaming settings.<\/figcaption><\/figure>\n<p>NVENC boasts the most efficient hardware encoding available, providing significantly better quality than other GPUs. At the same bitrate, images will look better and sharper, with fewer artifacts, as in the example above.<\/p>\n<figure id=\"attachment_64165\" aria-describedby=\"caption-attachment-64165\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ITNS-AV1.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64165\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/ITNS-AV1-672x378.png\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-64165\" class=\"wp-caption-text\">Encode quality comparison, measured with BD-BR.<\/figcaption><\/figure>\n<p>Creators are embracing AI en masse. DLSS 3 multiplies frame rates in popular 3D apps. ON1 Resize AI, software that enables high-quality photo enlargement, is sped up 24% compared with last-generation hardware. 
DaVinci Resolve\u2019s AI Magic Mask feature saves video editors considerable time by automating the highly manual process of rotoscoping, which now runs 20% faster than on the previous generation.<\/p>\n<p>The GeForce RTX 4060 Ti (8GB) will be available starting Wednesday, May 24, at $399. The GeForce RTX 4060 Ti (16GB) will be available in July, starting at $499. The GeForce RTX 4060 will also be available in July, starting at $299.<\/p>\n<p>Visit the <a href=\"https:\/\/store.nvidia.com\/en-us\/studio\/store\/?page=1&amp;limit=9&amp;locale=en-us\">Studio Shop<\/a> for GeForce RTX 4060-powered <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/laptops-desktops\/\">NVIDIA Studio systems<\/a> when available, and explore the range of high-performance Studio products.<\/p>\n<h2><b>D5 Render, DLSS 3 Combine to Beautiful Effect<\/b><\/h2>\n<p>D5 Render adds support for NVIDIA DLSS 3, bringing a vastly improved real-time experience to architects, designers, interior designers and 3D artists.<\/p>\n<p>Such professionals want to navigate scenes smoothly while editing, and demonstrate their creations to clients at the highest quality. 
Scenes can be incredibly detailed and complex, making it difficult to maintain high real-time viewport frame rates and to present work at original quality.<\/p>\n<p>D5 is prized by many artists for its <a href=\"https:\/\/www.d5render.com\/post\/d5-render-global-illumination\">global illumination<\/a> technology, called D5 GI, which delivers high-quality lighting and shading effects in real time, without sacrificing workflow efficiency.<\/p>\n<figure id=\"attachment_64137\" aria-describedby=\"caption-attachment-64137\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-d5render-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64137\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-d5render-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64137\" class=\"wp-caption-text\">D5 Render and DLSS 3 work brilliantly to create photorealistic imagery.<\/figcaption><\/figure>\n<p>By integrating DLSS 3, which combines AI-powered <a href=\"https:\/\/nvidianews.nvidia.com\/news\/nvidia-introduces-dlss-3-with-breakthrough-ai-powered-frame-generation-for-up-to-4x-performance\">DLSS Frame Generation<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/technologies\/dlss\/\">Super Resolution<\/a> technologies, real-time viewport frame rates increase up to 3x, making creator experiences buttery smooth. 
This allows designers to work with larger scenes, higher-quality models and textures \u2014 all in real time \u2014 while maintaining a smooth, interactive viewport.<\/p>\n<p>Learn more about the <a href=\"http:\/\/www.d5render.com\/posts\/d5-render-nvidia-dlss-3-real-time-interactive-3d-rendering\">update<\/a>.<\/p>\n<h2><b>Venture \u2018Into the Omniverse\u2019<\/b><\/h2>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/creators\/\">NVIDIA Omniverse<\/a> is a key component of the <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio<\/a> platform and the future of collaborative 3D content creation.<\/p>\n<p>A new monthly blog series, <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/05\/17\/adobe-substance-3d-omniverse-enhance-creative-freedom\/\"><i>Into the Omniverse<\/i><\/a>, showcases how artists, developers and enterprises can transform their creative workflows using the latest Omniverse advancements.<\/p>\n<p>This month, 3D creators across industries are set to benefit from the pairing of Omniverse and the Adobe Substance 3D suite of creative tools.<\/p>\n<figure id=\"attachment_64140\" aria-describedby=\"caption-attachment-64140\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/into-the-ov.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64140\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/into-the-ov-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-64140\" class=\"wp-caption-text\">\u201cEnd of Summer,\u201d created by the Adobe Substance 3D art and development team, built in Omniverse.<\/figcaption><\/figure>\n<p>An upcoming update to the Omniverse Connector for Adobe Substance 3D Painter will dramatically increase flexibility for users, with new capabilities including an export feature using <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/usd\/\">Universal Scene Description 
(OpenUSD)<\/a>, an open, extensible file framework enabling non-destructive workflows and collaboration in scene creation.<\/p>\n<p>Find details in the <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/05\/17\/adobe-substance-3d-omniverse-enhance-creative-freedom\/\">blog<\/a> and check in every month for more <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/news\/\">Omniverse news<\/a>.<\/p>\n<h2><b>Your Last Worm-ing<\/b><\/h2>\n<p>NVIDIA 3D artist Daniel Barnes has a simple initial approach to his work: sketch until something seems cool enough to act on. While his piece <i>Wormhole 00527<\/i> was no exception to this usual process, an emotional component made a significant impact on it.<\/p>\n<p>\u201cAfter the pandemic and various global events, I took even more interest in spaceships and escape pods,\u201d said Barnes. \u201cIt was just an abstract form of escapism that really played on the idea of \u2018get me out of here,\u2019 which I think we all experienced at one point, being inside so much.\u201d<\/p>\n<p>Barnes imagined <i>Wormhole 00527<\/i> to comprise each blur one might pass by as an alternate star system \u2014 a place on the other side of the galaxy where things are really similar but more peaceful, he said. \u201cAn alternate Earth of sorts,\u201d the artist added.<\/p>\n<p>Sculpting on his tablet one night in the Nomad app, Barnes imported a primitive model into Autodesk Maya for further refinement. 
He retopologized the scene, converting high-resolution models into much smaller files that can be used for animation.<\/p>\n<figure id=\"attachment_64158\" aria-describedby=\"caption-attachment-64158\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wireframe-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64158\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wireframe-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64158\" class=\"wp-caption-text\">Modeling in Autodesk Maya.<\/figcaption><\/figure>\n<div class=\"simplePullQuote right\">\n<p>\u201cI\u2019ve been creating in 3D for over a decade now, and GeForce RTX graphics cards have been able to power multiple displays smoothly and run my 3D software viewports at great speeds. Plus, rendering in real time on some projects is great for fast development.\u201d \u2014 Daniel Barnes<\/p>\n<\/div>\n<p>Barnes then took a screenshot, further sketched out his modeling edits and made lighting decisions in Adobe Photoshop.<\/p>\n<p>His GeForce RTX 4090 GPU gives him access to over 30 GPU-accelerated features for modifying and adjusting images quickly and smoothly. 
These features include blur gallery, object selection and perspective warp.<\/p>\n<p>Back in Autodesk Maya, Barnes used the quad-draw tool \u2014 a streamlined, one-tool workflow for retopologizing meshes \u2014 to create geometry, adding break-in panels that would be advantageous for animating.<\/p>\n<figure id=\"attachment_64152\" aria-describedby=\"caption-attachment-64152\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wormhole-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64152\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wormhole-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64152\" class=\"wp-caption-text\">So this is what a wormhole looks like.<\/figcaption><\/figure>\n<p>Barnes used Chaos V-Ray with Autodesk Maya\u2019s Z-depth feature, which provides information about each object\u2019s distance from the camera in its current view. Each pixel representing the object is evaluated for distance individually \u2014 meaning different pixels for the same object can have varying grayscale values. This made it far easier for Barnes to tweak depth of field and add motion-blur effects.<\/p>\n<figure id=\"attachment_64146\" aria-describedby=\"caption-attachment-64146\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-vray-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64146\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-vray-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64146\" class=\"wp-caption-text\">Example of Z-depth. 
Image courtesy of Chaos V-Ray with Autodesk Maya.<\/figcaption><\/figure>\n<p>He also added a combination of lights and applied materials with ease. Deploying RTX-accelerated ray tracing and AI denoising with the default Autodesk Arnold renderer enabled smooth movement in the viewport, resulting in beautifully photorealistic renders.<\/p>\n<figure id=\"attachment_64155\" aria-describedby=\"caption-attachment-64155\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wormhole-blur-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64155\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-wormhole-blur-1280w-672x375.jpg\" alt=\"\" width=\"672\" height=\"375\"><\/a><figcaption id=\"caption-attachment-64155\" class=\"wp-caption-text\">The Z-depth feature made it easier to apply motion-blur effects.<\/figcaption><\/figure>\n<p>He finished the project by compositing in Adobe After Effects, using GPU-accelerated features for faster rendering with NVIDIA CUDA technology.<\/p>\n<figure id=\"attachment_64162\" aria-describedby=\"caption-attachment-64162\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-featured-setup-1280w.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-64162\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/05\/studio-itns-daniel-barnes-wk57-featured-setup-1280w-672x221.png\" alt=\"\" width=\"672\" height=\"221\"><\/a><figcaption id=\"caption-attachment-64162\" class=\"wp-caption-text\">3D artist Daniel Barnes.<\/figcaption><\/figure>\n<p>When asked what his favorite creative tools are, Barnes didn\u2019t hesitate. 
\u201cDefinitely my RTX cards and nice large displays!\u201d he said.<\/p>\n<p>Check out Barnes\u2019 portfolio on <a href=\"https:\/\/www.instagram.com\/danielin3d\/?hl=en\">Instagram<\/a>.<\/p>\n<p><i>Follow NVIDIA Studio on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\"><i>Facebook<\/i><\/a><i>. Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>.<\/i><\/p>\n<p><i>Get started with NVIDIA Omniverse by downloading the standard license <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/download\/\"><i>free<\/i><\/a><i>, or learn how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/enterprise\/\"><i>Omniverse Enterprise<\/i><i> can connect your team<\/i><\/a><i>. Developers can <\/i><a href=\"https:\/\/developer.nvidia.com\/omniverse\/get-started\/\"><i>get started with Omniverse<\/i><\/a><i> resources. 
Stay up to date on the platform by subscribing to the <\/i><a href=\"https:\/\/nvda.ws\/3u5KPv1\"><i>newsletter<\/i><\/a><i>, and follow NVIDIA Omniverse on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\"><i>Medium<\/i><\/a><i> and <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\"><i>Twitter<\/i><\/a><i>.\u00a0<\/i><\/p>\n<p><i>For more, join the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/community\/\"><i>Omniverse community<\/i><\/a><i> and check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\"><i>forums<\/i><\/a><i>, <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\"><i>Discord server<\/i><\/a><i>, <\/i><a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\"><i>Twitch<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\"><i>YouTube<\/i><\/a><i> channels.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2023\/05\/18\/geforce-rtx-4060-ti\/<\/p>\n","protected":false},"author":0,"featured_media":2988,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2987"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2987"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2987\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/med
ia\/2988"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2987"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2987"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2987"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}