{"id":2868,"date":"2023-02-08T18:28:03","date_gmt":"2023-02-08T18:28:03","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2023\/02\/08\/new-nvidia-studio-laptops-powered-by-geforce-rtx-4090-4080-laptop-gpus-unleash-creativity\/"},"modified":"2023-02-08T18:28:03","modified_gmt":"2023-02-08T18:28:03","slug":"new-nvidia-studio-laptops-powered-by-geforce-rtx-4090-4080-laptop-gpus-unleash-creativity","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2023\/02\/08\/new-nvidia-studio-laptops-powered-by-geforce-rtx-4090-4080-laptop-gpus-unleash-creativity\/","title":{"rendered":"New NVIDIA Studio Laptops Powered by GeForce RTX 4090, 4080 Laptop GPUs Unleash Creativity"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2023\/02\/08\/in-the-nvidia-studio-february-08\/\" data-title=\"New NVIDIA Studio Laptops Powered by GeForce RTX 4090, 4080 Laptop GPUs Unleash Creativity\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is part of our weekly <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/in-the-nvidia-studio\/\"><i>In the NVIDIA Studio<\/i><\/a><i> series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\"><i>NVIDIA Studio<\/i><\/a><i> technology improves creative workflows. 
We\u2019re also deep diving into new <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/graphics-cards\/40-series\/\"><i>GeForce RTX 40 Series GPU<\/i><\/a><i> features, technologies and resources, and how they dramatically accelerate content creation.<\/i><\/p>\n<p>The first <a href=\"https:\/\/blogs.nvidia.com\/blog\/2023\/01\/03\/studio-laptops-omniverse-ces\/\">NVIDIA Studio laptops powered by GeForce RTX 40 Series Laptop GPUs<\/a> are now available, starting with systems from MSI and Razer \u2014 with many more to come.<\/p>\n<p>Featuring GeForce RTX 4090 and 4080 Laptop GPUs, the new Studio laptops use the NVIDIA Ada Lovelace architecture and fifth-generation Max-Q technologies for maximum performance and efficiency. They\u2019re fueled by powerful <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/technologies\/rtx\/\">NVIDIA RTX technology<\/a> like <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/technologies\/dlss\/\">DLSS 3<\/a>, which routinely increases frame rates by 2x or more.<\/p>\n<p>Backed by the <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\">NVIDIA Studio platform<\/a>, these laptops give creators exclusive access to tools and apps \u2014 including <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/creators\/\">NVIDIA Omniverse<\/a>, <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/canvas\/\">Canvas<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/broadcasting\/broadcast-app\/\">Broadcast<\/a> \u2014 and deliver breathtaking visuals with full ray tracing and time-saving AI features.<\/p>\n<p>They come preinstalled with regularly updated NVIDIA Studio Drivers. 
This month\u2019s driver is available for download starting today.<\/p>\n<p>And when creating turns to gaming, the laptops enable playing at previously impossible levels of detail and speed.<\/p>\n<p>Plus, <i>In the NVIDIA Studio<\/i> this week highlights the making of <a href=\"https:\/\/www.youtube.com\/watch?v=EKJXI1xW4gw\"><i>The Artists\u2019 Metaverse<\/i><\/a>, a video showcasing the journey of 3D collaboration between seven creators, across several time zones, using multiple creative apps simultaneously \u2014 all powered by NVIDIA Omniverse.<\/p>\n<h2><b>The Future of Content Creation, Anywhere<\/b><\/h2>\n<p>NVIDIA Studio laptops, powered by new GeForce RTX 40 Series Laptop GPUs, deliver the largest-ever generational leap in portable performance and are the world\u2019s fastest laptops for creating and gaming.<\/p>\n<p>These creative powerhouses run up to 3x more efficiently than the previous generation, enabling users to power through creative workloads in a fraction of the time, all using thin, light laptops \u2014 with 14-inch designs coming soon for the first time.<\/p>\n<figure id=\"attachment_62300\" aria-describedby=\"caption-attachment-62300\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-jan-driver-wk38-msistealth17-1280w-2.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62300\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-jan-driver-wk38-msistealth17-1280w-2-672x378.png\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-62300\" class=\"wp-caption-text\">MSI\u2019s Stealth 17 Studio comes with up to a GeForce RTX 4090 Laptop GPU.<\/figcaption><\/figure>\n<p>MSI\u2019s Stealth 17 Studio comes with up to a GeForce RTX 4090 Laptop GPU and an optional 17-inch, Mini LED 4K, 144Hz, 1000 Nits, DisplayHDR 1000 display \u2014 perfect for creators of all types. 
It\u2019s available in various configurations at <a href=\"https:\/\/www.amazon.com\/MSI-Stealth-Studio-Gaming-Laptop\/dp\/B0BT3DFR8W\/\">Amazon<\/a>, <a href=\"https:\/\/www.bestbuy.com\/site\/msi-stealth-17-3-240hz-qhd-gaming-laptop-intel-core-i9-13900h-nvidia-geforce-rtx-4090-2tb-ssd-64gb-memory-black\/6532165.p\">Best Buy<\/a>, <a href=\"https:\/\/www.bhphotovideo.com\/c\/product\/1746171-REG\/msi_stealth_17studio_a13vi_017us_17_3_stealth_17_studio.html\">B&amp;H<\/a> and <a href=\"https:\/\/www.newegg.com\/p\/N82E16834156423?Item=N82E16834156423\">Newegg<\/a>.<\/p>\n<figure id=\"attachment_62297\" aria-describedby=\"caption-attachment-62297\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-jan-driver-wk38-razerblade18-1280w.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62297\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-jan-driver-wk38-razerblade18-1280w-672x378.png\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-62297\" class=\"wp-caption-text\">New Razer Blade Studio Laptops come preinstalled with NVIDIA Broadcast.<\/figcaption><\/figure>\n<p>Razer is upgrading its Blade laptops with up to a GeForce RTX 4090 Laptop GPU. Available with a 16- or 18-inch HDR-capable, dual-mode, mini-LED display, they feature a Creator mode that enables sharp, ultra-high-definition+ native resolution at 120Hz. 
It\u2019s available at <a href=\"https:\/\/www.razer.com\/gaming-laptops\/Razer-Blade-18\/RZ09-0484TEH3-R3U1\">Razer<\/a>, <a href=\"https:\/\/www.amazon.com\/Razer-Blade-18-Gaming-Laptop\/dp\/B0BQNVV5ZX\/ref=sr_1_2?crid=TPCZDTSO9KUX&amp;keywords=razer+blade+18&amp;qid=1675735258&amp;s=electronics&amp;sprefix=razer+blade+18%2Celectronics%2C169&amp;sr=1-2\">Amazon<\/a>, <a href=\"https:\/\/www.bestbuy.com\/site\/razer-blade-18-18-gaming-laptop-qhd-240-hz-intel-24-core-i9-13950hx-nvidia-geforce-rtx-4090-32gb-ram-2tb-ssd-black\/6533840.p?skuId=6533840\">Best Buy<\/a>, <a href=\"https:\/\/www.bhphotovideo.com\/c\/product\/1742204-REG\/razer_rz09_0484teh3_r3u1_18_razer_blade_18.html\/qa?ap=y&amp;smp=y&amp;lsft=BI%3A514&amp;gclid=Cj0KCQiA54KfBhCKARIsAJzSrdredlA-IopRBIM-7pvyCcrNrHb7hlP2evsWRrGX3t1h2M7hcCJGPXEaAm-gEALw_wcB\">B&amp;H<\/a> and <a href=\"https:\/\/www.newegg.com\/black-razer-blade-18-rz09-0484teh3-r3u1-gaming\/p\/N82E16834326091?item=N82E16834326091&amp;nm_mc=knc-googleadwords&amp;cm_mmc=knc-googleadwords-_-gaming+laptops-_-razer-_-34326091&amp;source=region&amp;com_cvv=8532246be4358ef3cceffa856b5341c9070d6417214ffe7bca7b475c5b96189d\">Newegg<\/a>.<\/p>\n<p>The MSI Stealth 17 Studio and Razer Blade 16 and 18 come preinstalled with NVIDIA Broadcast. The app\u2019s recent update to version 1.4 added an <a href=\"https:\/\/www.nvidia.com\/en-us\/geforce\/news\/jan-2023-nvidia-broadcast-update\/\">Eye Contact feature<\/a>, ideal for content creators who want to record themselves while reading notes or avoid having to stare directly at the camera. 
The feature also lets video conference presenters appear as if they\u2019re looking at their audience, improving engagement.<\/p>\n<p>Designed for gamers, new units from ASUS, GIGABYTE and Lenovo are also available today and deliver great performance in creator applications with access to NVIDIA Studio benefits.<\/p>\n<h2><b>Groundbreaking Performance<\/b><\/h2>\n<p>The new Studio laptops have been put through rigorous testing, and many reviewers are detailing the new levels of performance and AI-powered creativity that GeForce RTX 4090 and 4080 Laptop GPUs make possible. Here\u2019s what some are saying:<\/p>\n<p><em>\u201cNVIDIA\u2019s GeForce RTX 4090 pushes laptops to blistering new frontiers: Yes, it\u2019s fast, but also much more.\u201d \u2014 <a href=\"https:\/\/www.pcworld.com\/article\/1504852\/tktktk.html\">PC World<\/a><\/em><\/p>\n<p><em>\u201cGeForce RTX 4090 Laptops can also find the favor of content creators thanks to NVIDIA Studio as well as AV1 support and the double NVENC encoder.\u201d \u2014 <a href=\"https:\/\/www.hdblog.it\/portatili\/recensioni\/n566082\/gpu-nvidia-rtx-4090-laptop-mobile-test-prova\/\">HDBLOG.IT<\/a><\/em><\/p>\n<p><em>\u201cWith its GeForce RTX 4090\u2026 and bright, beautiful dual-mode display, the Razer Blade 16 can rip through games with aplomb, while being equally adept at taxing, content creations workloads.\u201d \u2014 <a href=\"https:\/\/hothardware.com\/reviews\/razer-blade-16-with-rtx-4090-and-13th-gen-review?page=3\">Hot Hardware<\/a><\/em><\/p>\n<p><em>\u201cThe Nvidia GeForce RTX 4090 mobile GPU is a step up in performance, as we\u2019d expect from the hottest graphics chip.\u201d \u2014 <a href=\"https:\/\/www.pcmag.com\/news\/first-tests-the-geforce-rtx-4090-laptop-gpu-is-a-scorcher-but-dlss-helps\">PC Magazine<\/a><\/em><\/p>\n<p><em>\u201cAnother important point \u2013 particularly in the laptop domain \u2013 is the presence of enhanced AV1 support and dual hardware encoders. 
That\u2019s really useful for streamers or video editors using a machine like this.\u201d \u2013 <a href=\"https:\/\/www.kitguru.net\/lifestyle\/mobile\/laptops\/luke-hill\/gigabyte-aorus-17h-2023-laptop-review-i7-13700h-rtx-4080\/2\/\">KitGuru<\/a><\/em><\/p>\n<p>Pick up the latest Studio systems or configure a custom system today.<\/p>\n<h2><b>Revisiting \u2018The Artists\u2019 Metaverse\u2019<\/b><\/h2>\n<p>Seven talented artists join us <i>In the NVIDIA Studio<\/i> this week to discuss building <i>The Artists\u2019 Metaverse<\/i> \u2014 a spotlight demo from last month\u2019s CES. The group reflected on how easy it was to collaborate in real time from different parts of the world.<\/p>\n<p>It started in NVIDIA Omniverse, a hub that interconnects 3D workflows, replacing linear pipelines with live-sync creation. The artists connected to the platform via <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/cloud\/\">Omniverse Cloud<\/a>.<\/p>\n<p>\u201cSetting up the Omniverse Cloud collaboration demo was a super easy process,\u201d said award-winning 3D creator <a href=\"http:\/\/www.tibbiestories.com\/\">Rafi Nizam<\/a>. 
\u201cIt was cool to see avatars appearing as people popped in, and the user interface makes it really clear when you\u2019re working in a <i>live <\/i>state.\u201d<\/p>\n<figure id=\"attachment_62294\" aria-describedby=\"caption-attachment-62294\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-satellite-render-1280w_.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62294\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-satellite-render-1280w_-672x320.jpg\" alt=\"\" width=\"672\" height=\"320\"><\/a><figcaption id=\"caption-attachment-62294\" class=\"wp-caption-text\">Assets were exported into Omniverse with ease, thanks to the Universal Scene Description format.<\/figcaption><\/figure>\n<p>Filmmaker <a href=\"https:\/\/www.instagram.com\/jsfilmz0412\/?hl=en\">Jae Solina<\/a>, aka JSFILMZ, animated characters in Omniverse using Xsens and Unreal Engine.<\/p>\n<p>\u201cPrior to Omniverse, creating animations was such a hassle, let alone getting photorealistic animations,\u201d Solina said. 
\u201cInstead of having to reformat and upload files individually, everything is done in Omniverse in real time, leading to serious time saved.\u201d<\/p>\n<p><a href=\"https:\/\/www.instagram.com\/jeremylightcap\">Jeremy Lightcap<\/a> reflected on the incredible visual quality of the virtual scene, highlighting the seamless movement within the viewport.<\/p>\n<figure id=\"attachment_62288\" aria-describedby=\"caption-attachment-62288\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-vr-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62288\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-vr-1280w-672x366.jpg\" alt=\"\" width=\"672\" height=\"366\"><\/a><figcaption id=\"caption-attachment-62288\" class=\"wp-caption-text\">The balloon 3D model was sculpted by hand in Gravity Sketch and imported into Omniverse.<\/figcaption><\/figure>\n<p>\u201cWe had three Houdini simulations, a volume database file storm cloud, three different characters with motion capture and a very dense Western town set with about 100 materials,\u201d Lightcap said. \u201cI\u2019m not sure how many other programs could handle that and still give you instant, <a href=\"https:\/\/blogs.nvidia.com\/blog\/2022\/03\/23\/what-is-path-tracing\/\">path-traced<\/a> lighting results.\u201d<\/p>\n<p>For <a href=\"https:\/\/www.instagram.com\/ashdotpy\/\">Ashley Goldstein<\/a>, an NVIDIA 3D artist and tutorialist, the demo highlighted the versatility of Omniverse. \u201cI could update the scene and save it as a new USD layer, so when someone else opened it up, they had all of my updates immediately,\u201d she said. 
\u201cOr, if they were working on the scene at the same time, they\u2019d be instantly notified of the updates and could fetch new content.\u201d<\/p>\n<figure id=\"attachment_62282\" aria-describedby=\"caption-attachment-62282\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-ballon-render2-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62282\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-ballon-render2-1280w-672x274.jpg\" alt=\"\" width=\"672\" height=\"274\"><\/a><figcaption id=\"caption-attachment-62282\" class=\"wp-caption-text\">Applying colors and textures to the balloon in Adobe Substance 3D Painter.<\/figcaption><\/figure>\n<p><a href=\"https:\/\/www.youtube.com\/edstudios\">Edward McEvenue<\/a>, aka edstudios, reflected on the immense value Omniverse on RTX hardware provides, displaying fully ray-traced graphics with instant feedback. \u201c3D production is a very iterative process, where you have to make hundreds if not thousands of small decisions along the way before finalizing a scene,\u201d he said. 
\u201cUsing GPU acceleration with RTX path tracing in the viewport makes that process so much easier, as you get near-instant feedback on the changes you\u2019re making, with all of the full-quality lighting, shadows, reflections, materials and post-production effects directly in the working viewport.\u201d<\/p>\n<figure id=\"attachment_62279\" aria-describedby=\"caption-attachment-62279\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-ballon-render-1280w.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62279\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-ada-laptops-ov-nucleus-wk43-ballon-render-1280w-672x378.jpg\" alt=\"\" width=\"672\" height=\"378\"><\/a><figcaption id=\"caption-attachment-62279\" class=\"wp-caption-text\">Edits to the 3D model in Blender are reflected in real time with photorealistic detail in Omniverse.<\/figcaption><\/figure>\n<p>3D artist <a href=\"https:\/\/www.artstation.com\/shuiguoss\">Shangyu Wang<\/a> noted Omniverse is his preferred 3D collaborative content-creation platform. \u201cAutodesk\u2019s Unreal Live Link for Maya gave me a ray-traced, photorealistic preview of the scene in real time, no waiting to see the final render result,\u201d he said.<\/p>\n<p>Fellow 3D artist <a href=\"https:\/\/www.linkedin.com\/in\/pekka-varis-00933645\/\">Pekka Varis<\/a> mentioned Omniverse\u2019s positive trajectory. \u201cNew features are coming in faster than I can keep up!\u201d he said. 
\u201cIt can become the main standard of the <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/08\/10\/what-is-the-metaverse\/\">metaverse<\/a>.\u201d<\/p>\n<figure id=\"attachment_62276\" aria-describedby=\"caption-attachment-62276\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-artist-group-wk43-featured-setup-1280w.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-large wp-image-62276\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2023\/02\/studio-itns-artist-group-wk43-featured-setup-1280w-672x348.png\" alt=\"\" width=\"672\" height=\"348\"><\/a><figcaption id=\"caption-attachment-62276\" class=\"wp-caption-text\">Omniverse transcends location, time and apps, where collaboration, communication and creativity reign supreme.<\/figcaption><\/figure>\n<p><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/creators\/\">Download<\/a> Omniverse today, free for all NVIDIA and GeForce RTX GPU owners \u2014 including those with new GeForce RTX 40 Series laptops.<\/p>\n<p><i>Follow NVIDIA Studio on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiastudio\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/NVIDIAStudio\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.facebook.com\/NVIDIAStudio\/\"><i>Facebook<\/i><\/a><i>. Access tutorials on the <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCDeQdW6Lt6nhq3mLM4oLGWw\"><i>Studio YouTube channel<\/i><\/a><i> and get updates directly in your inbox by subscribing to the <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/?nvmid=subscribe-creators-mail-icon\"><i>Studio newsletter<\/i><\/a><i>. 
<\/i><i>Learn more about Omniverse on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\"><i>Medium<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\"><i>Twitter<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\"><i>YouTube<\/i><\/a><i> for additional resources and inspiration. Check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\"><i>forums<\/i><\/a><i>, and join our <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\"><i>Discord server<\/i><\/a><i> and <\/i><a href=\"https:\/\/www.twitch.tv\/nvidiaomniverse\"><i>Twitch <\/i><\/a><i>channel to chat with the community.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2023\/02\/08\/in-the-nvidia-studio-february-08\/<\/p>\n","protected":false},"author":0,"featured_media":2869,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2868"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2868"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2868\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2869"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2868"}],"wp:term":[{"taxonomy":"category","embeddable":true,"
href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2868"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2868"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}