{"id":2173,"date":"2022-06-23T00:42:07","date_gmt":"2022-06-23T00:42:07","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2022\/06\/23\/meet-the-omnivore-director-of-photography-revs-up-nvidia-omniverse-to-create-sleek-car-demo\/"},"modified":"2022-06-23T00:42:07","modified_gmt":"2022-06-23T00:42:07","slug":"meet-the-omnivore-director-of-photography-revs-up-nvidia-omniverse-to-create-sleek-car-demo","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2022\/06\/23\/meet-the-omnivore-director-of-photography-revs-up-nvidia-omniverse-to-create-sleek-car-demo\/","title":{"rendered":"Meet the Omnivore: Director of Photography Revs Up NVIDIA Omniverse to Create Sleek Car Demo"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2022\/06\/22\/brett-danton-omniverse-creator\/\" data-title=\"Meet the Omnivore: Director of Photography Revs Up NVIDIA Omniverse to Create Sleek Car Demo\" data-hashtags=\"\">\n<p><i>Editor\u2019s note: This post is a part of our <\/i><a href=\"https:\/\/blogs.nvidia.com\/blog\/tag\/meet-the-omnivore\/\" target=\"_blank\" rel=\"noopener\"><i>Meet the Omnivore<\/i><\/a><i> series, which features individual creators and developers who use <\/i><a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\"><i>NVIDIA Omniverse<\/i><\/a><i> to <\/i><i>accelerate their 3D workflows and create virtual worlds.<\/i><\/p>\n<figure id=\"attachment_57857\" aria-describedby=\"caption-attachment-57857\" class=\"wp-caption alignright\">\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2022\/06\/Brett-Danton-Headshot-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\"><figcaption id=\"caption-attachment-57857\" class=\"wp-caption-text\">Brett Danton<\/figcaption><\/figure>\n<p>A camera begins in the sky, flies through some trees and smoothly exits the forest, all while precisely tracking a car 
driving down a dirt path. This would be all but impossible in the real world, according to film and photography director Brett Danton.<\/p>\n<p>But Danton made what he calls this \u201cimpossible camera move\u201d possible for an automotive commercial \u2014 at home, with cinematic quality and physical accuracy.<\/p>\n<p>He pulled off the feat using <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse<\/a>, a 3D design collaboration and world simulation platform that enhanced his typical creative workflow and connected various apps he uses, including Autodesk Maya, Epic Games Unreal Engine and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/apps\/create\/\" target=\"_blank\" rel=\"noopener\">Omniverse Create<\/a>.<\/p>\n<p>With 30+ years of experience in the digital imagery industry, U.K.-based Danton <a href=\"https:\/\/www.brettdanton.tv\/\" target=\"_blank\" rel=\"nofollow noopener\">creates advertisements<\/a> for international clients, showcasing products ranging from cosmetics to cars.<\/p>\n<p>His latest projects, like the above using a Volvo car, demonstrate how a physical location can be recreated for a virtual shoot, delivering photorealistic rendered sequences that match cinematic real-world footage.<\/p>\n<p>\u201cThis breaks from traditional imagery and shifts the gears of what\u2019s possible in the digital arts, allowing multiple deliverables inside the one asset,\u201d Danton said.<\/p>\n<p>The physically accurate simulation capabilities of Omniverse took Danton\u2019s project the extra mile, animating a photorealistic car that reacts to the dirt road\u2019s uneven surface as it would in real life.<\/p>\n<p>And by working with <a href=\"https:\/\/developer.nvidia.com\/usd\" target=\"_blank\" rel=\"noopener\">Universal Scene Description (USD)<\/a>-based assets from connected digital content creation tools like Autodesk Maya and Unreal Engine in Omniverse, Danton collaborated with 
other art departments from his home, just outside of London.<\/p>\n<p>\u201cOmniverse gives me an entire studio on my desktop,\u201d Danton said. \u201cIt\u2019s impossible to tell the difference between the real location and what\u2019s been created in Omniverse, and I know that because I went and stood in the real location to create the virtual set.\u201d<\/p>\n<h2><b>Real-Time Collaboration for Multi-App Workflows<\/b><\/h2>\n<p>To create the forest featured in the car commercial, Danton collaborated with award-winning design studio Ars Thanea. The team shot countless 100-megapixel images to use as references, resulting in a point cloud \u2014 or set of data points representing 3D shapes in space \u2014 that totaled 250 gigabytes.<\/p>\n<p>The team then used Omniverse as the central hub for all of the data exchange, accelerated by NVIDIA RTX GPUs. Autodesk Maya served as the entry point for camera animation and initial lighting before the project\u2019s data was brought into Omniverse with an <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/#apps-connectors\" target=\"_blank\" rel=\"noopener\">Omniverse Connector<\/a>.<\/p>\n<p>And with the Omniverse Create app, the artists placed trees by hand, created tree patches and tweaked them to fit the forest floor. Omniverse-based real-time collaboration was key for enabling high-profile visual effects artists to work together remotely and on site, Danton said.<\/p>\n<p>Omniverse Create uses Pixar\u2019s USD format to accelerate advanced scene composition and assemble, light, simulate and render 3D scenes in real time.<\/p>\n<h2><b>Photorealistic Lighting With Path Tracing<\/b><\/h2>\n<p>When directing projects in physical production sites and studios, Danton said he was limited in what he could achieve with lighting \u2014 depending on resources, time of day and many other factors. 
Omniverse removes such creative limitations.<\/p>\n<p>\u201cI can now pre-visualize any of the shots I want to take, and on top of that, I can light them in Omniverse in a photorealistic way,\u201d Danton said.<\/p>\n<p>When he moves a light in Omniverse, the scene reacts exactly the way it would in the real world.<\/p>\n<p>This ability, enabled by Omniverse\u2019s RTX-powered real-time ray tracing and <a href=\"https:\/\/blogs.nvidia.com\/blog\/2022\/03\/23\/what-is-path-tracing\/\" target=\"_blank\" rel=\"noopener\">path tracing<\/a>, is Danton\u2019s favorite aspect of the platform. It lets him create photorealistic, cinematic sequences with a \u201ctrue feel of light,\u201d which wasn\u2019t possible before, he said.<\/p>\n<p>In the Volvo car clip above, for example, the Omniverse lighting interacts with the car as it would in the forest, with physically accurate reflections and light bouncing off the windows.<\/p>\n<p>\u201cI\u2019ve tried other software before, and Omniverse is far superior to anything else I have seen because of its real-time rendering and collaborative workflow capabilities,\u201d Danton said.<\/p>\n<h2><b>Join in on the Creation<\/b><\/h2>\n<p>Creators across the world can experience <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/creators\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Omniverse for free<\/a>, and <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/enterprise\/\" target=\"_blank\" rel=\"noopener\">enterprise teams<\/a> can use the platform for their projects.<\/p>\n<p>Plus, join the <a href=\"https:\/\/www.nvidia.com\/en-us\/omniverse\/machinima-contest\/\" target=\"_blank\" rel=\"noopener\">#MadeInMachinima contest<\/a>, running through June 27, for a chance to win the latest <a href=\"https:\/\/www.nvidia.com\/en-us\/studio\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Studio<\/a> laptop.<\/p>\n<p>Learn more about Omniverse by watching <a 
href=\"https:\/\/events.rainfocus.com\/widget\/nvidia\/gtcspring2022\/1644002056021001YfmV\" target=\"_blank\" rel=\"nofollow noopener\">GTC sessions on demand<\/a> \u2014 featuring visionaries from the Omniverse team, Adobe, Autodesk, Epic Games, Pixar, Unity and Walt Disney Studios.<\/p>\n<p><i>Follow<\/i><i> Omniverse on <\/i><a href=\"https:\/\/www.instagram.com\/nvidiaomniverse\/\" target=\"_blank\" rel=\"nofollow noopener\"><i>Instagram<\/i><\/a><i>, <\/i><a href=\"https:\/\/twitter.com\/nvidiaomniverse\" target=\"_blank\" rel=\"nofollow noopener\"><i>Twitter<\/i><\/a><i>, <\/i><a href=\"https:\/\/www.youtube.com\/channel\/UCSKUoczbGAcMld7HjpCR8OA\" target=\"_blank\" rel=\"nofollow noopener\"><i>YouTube<\/i><\/a><i> and <\/i><a href=\"https:\/\/medium.com\/@nvidiaomniverse\" target=\"_blank\" rel=\"nofollow noopener\"><i>Medium<\/i><\/a><i> for additional resources and inspiration. Check out the Omniverse <\/i><a href=\"https:\/\/forums.developer.nvidia.com\/c\/omniverse\/300\" target=\"_blank\" rel=\"nofollow noopener\"><i>forums<\/i><\/a><i> and join our <\/i><a href=\"https:\/\/discord.com\/invite\/XWQNJDNuaC\" target=\"_blank\" rel=\"nofollow noopener\"><i>Discord Server<\/i><\/a><i> to chat with the 
community.<\/i><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/blogs.nvidia.com\/blog\/2022\/06\/22\/brett-danton-omniverse-creator\/<\/p>\n","protected":false},"author":0,"featured_media":2174,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2173"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=2173"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/2173\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/2174"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=2173"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=2173"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=2173"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}