{"id":1103,"date":"2021-10-29T08:40:12","date_gmt":"2021-10-29T08:40:12","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2021\/10\/29\/a-stream-come-true-att-ericsson-and-wevr-deliver-first-location-based-vr-experience-on-5g\/"},"modified":"2021-10-29T08:40:12","modified_gmt":"2021-10-29T08:40:12","slug":"a-stream-come-true-att-ericsson-and-wevr-deliver-first-location-based-vr-experience-on-5g","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2021\/10\/29\/a-stream-come-true-att-ericsson-and-wevr-deliver-first-location-based-vr-experience-on-5g\/","title":{"rendered":"A Stream Come True: AT&amp;T, Ericsson and Wevr Deliver First Location-Based VR Experience on 5G"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2021\/10\/28\/cloudxr-location-based-vr-5g\/\" data-title=\"A Stream Come True: AT&amp;T, Ericsson and Wevr Deliver First Location-Based VR Experience on 5G\">\n<p>5G and NVIDIA CloudXR are bringing magic to the future of immersive entertainment.<\/p>\n<p>AT&amp;T, Ericsson and Wevr have collaborated to create the first multi-user, location-based, virtual reality experience using 5G.\u00a0Creative teams designed a VR experience set in a fictional world, where participants can wield magical powers and freely explore inside a high-fidelity, 3D, photorealistic immersive environment. 
The designers built the experience on Dreamscape Immersive\u2019s VR platform and used technologies from Dell, Qualcomm and VMware.<\/p>\n<p>When <a href=\"https:\/\/developer.nvidia.com\/nvidia-cloudxr-sdk\">NVIDIA CloudXR<\/a> and <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/technologies\/rtx\/\">RTX technology<\/a> are combined with the power of AT&amp;T 5G, creatives and professionals can deliver a more vivid, realistic and dynamic experience than ever.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2021\/10\/att-2-Copy-1.jpg\" alt=\"\" width=\"1280\" height=\"680\"><\/p>\n<h2><b>Bringing VR to the Next Level<\/b><\/h2>\n<p>VR brings many benefits to experiences across industries, from <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/06\/29\/cloudxr-aec\/\">design reviews<\/a> in architecture to <a href=\"https:\/\/blogs.nvidia.com\/blog\/2021\/08\/11\/cloudxr-google-cloud-demo\/\">content creation<\/a> in media and entertainment. However, a big challenge for VR adoption is the need to keep VR devices tethered to high-powered computing resources.<\/p>\n<p>With their proof of concept, AT&amp;T, Ericsson and Wevr have pushed the boundaries of VR streaming and demonstrated how 5G addresses tethering issues.<\/p>\n<p>To give users a realistic interactive experience, the teams did away with the corded headset and used AT&amp;T 5G to stream the immersive experience to Qualcomm XR2 VR headsets. This lets players use haptic devices to wield magical powers and move freely throughout the virtual world.<\/p>\n<p>The Qualcomm XR2 VR headset, powered by the company\u2019s latest <a href=\"https:\/\/www.qualcomm.com\/snapdragon\">Snapdragon technology<\/a>, enhances power efficiency, improves latency for head and controller tracking, and includes 5G optimizations for latency and coverage. 
The virtual experience allowed up to six users to participate simultaneously in a single 5G cell, with the low latency and high bandwidth of AT&amp;T 5G enabling streaming directly to each Qualcomm XR2 VR headset.<\/p>\n<p>A private 5G network from Ericsson D-15 powered the application validation. \u201c5G offers the high peak rates required to secure photorealism for XR, and a consistent low latency is key to avoid the risk of users getting dizzy,\u201d said Peter Linder, head of 5G marketing in North America at Ericsson.<\/p>\n<p>The virtual experience, built by Wevr and Dreamscape, ran on a virtualized cluster of NVIDIA CloudXR-enabled Dell EMC PowerEdge servers, powered by <a href=\"https:\/\/www.nvidia.com\/en-us\/design-visualization\/quadro\/rtx-8000\/\">NVIDIA Quadro RTX 8000<\/a> GPUs. This allowed the creative teams to design photorealistic details and full-fidelity models for the VR experience. And with the power of NVIDIA RTX Virtual Workstation and CloudXR, the teams distributed the VR experience from a centralized system.<\/p>\n<p>\u201cWhen the user\u2019s headset isn\u2019t required to do the heavy lifting, it creates a lighter, more comfortable experience all around,\u201d said Jay Cary, vice president of 5G Ecosystems and Partnerships at AT&amp;T. \u201cBy using the high throughput and low latency of 5G paired with edge cloud, we can make experiences more comfortable for fans and more productive for venue operators, and allow more freedom for creators.\u201d<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2021\/10\/att-3-Copy.jpg\" alt=\"\" width=\"1280\" height=\"680\"><\/p>\n<p>\u201cThis groundbreaking experience came together with the use of NVIDIA CloudXR and NVIDIA RTX graphics technology,\u201d said Anthony Batt, executive vice president and co-founder of Wevr. \u201cCloudXR dynamically adjusts to network conditions and maximizes image quality while minimizing effective latency. 
By moving the RTX graphics processing to the edge, our developers can deliver a new level of VR experience to users.\u201d<\/p>\n<h2><b>Dive Into XR Streaming at GTC<\/b><\/h2>\n<p>Learn how creators can use 5G and CloudXR to deliver stunning immersive experiences, at <a href=\"https:\/\/www.nvidia.com\/gtc\/\">NVIDIA GTC<\/a>, which starts on Nov. 8.<\/p>\n<p>Register now for free, and hear from <a href=\"https:\/\/events.rainfocus.com\/widget\/nvidia\/nvidiagtc\/sessioncatalog?search=A31631\">AT&amp;T, Ericsson and Wevr as they present<\/a> on how they created the first end-to-end, 5G, multi-user, location-based VR experience.<\/p>\n<p>And don\u2019t miss the special keynote address by <a href=\"https:\/\/www.nvidia.com\/gtc\/keynote\/\">NVIDIA CEO Jensen Huang<\/a> on Nov. 9 at 9 a.m. CET, with a rebroadcast at 8 a.m. PST.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>http:\/\/feedproxy.google.com\/~r\/nvidiablog\/~3\/FMuGQ5dxCgE\/<\/p>\n","protected":false},"author":0,"featured_media":1104,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/1103"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=1103"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/1103\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/1104"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=1103"}],"wp:term":[{"taxonomy
":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=1103"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=1103"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}