{"id":189,"date":"2020-09-08T20:28:41","date_gmt":"2020-09-08T20:28:41","guid":{"rendered":"https:\/\/machine-learning.webcloning.com\/2020\/09\/08\/office-ready-jetson-driven-double-robot-supports-remote-working\/"},"modified":"2020-09-08T20:28:41","modified_gmt":"2020-09-08T20:28:41","slug":"office-ready-jetson-driven-double-robot-supports-remote-working","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2020\/09\/08\/office-ready-jetson-driven-double-robot-supports-remote-working\/","title":{"rendered":"Office Ready? Jetson-Driven \u2018Double Robot\u2019 Supports Remote Working"},"content":{"rendered":"<div data-url=\"https:\/\/blogs.nvidia.com\/blog\/2020\/09\/08\/jetson-driven-double-robot-supports-remote-working\/\" data-title=\"Office Ready? Jetson-Driven \u2018Double Robot\u2019 Supports Remote Working\">\n<p>Apple\u2019s iPad 2 launch in 2011 ignited a touch tablet craze, but when David Cann and Marc DeVidts got their hands on one they saw something different: They rigged it to a remote-controlled golf caddy and posted a video of it in action on YouTube.<\/p>\n<p>Next came phone calls from people interested in buying such a telepresence robot.<\/p>\n<p>Hacks like this were second nature for the friends, who met in 2002 while working on the set of the BattleBots TV series, featuring team-built robots battling before live audiences.<\/p>\n<p>That\u2019s how Double Robotics began in 2012. The startup went on to attend Y Combinator\u2019s accelerator, and it has sold more than 12,000 units. That cash flow has allowed the small team, with just $1.8 million in seed funding, to carry on without raising capital, a rarity in hardware.<\/p>\n<p>Much has changed since they began. 
Double Robotics, based in Burlingame, Calif., today launched its third-generation model, the Double 3, sporting an <a href=\"https:\/\/www.nvidia.com\/en-us\/autonomous-machines\/embedded-systems\/\">NVIDIA Jetson TX2<\/a> for AI workloads.<\/p>\n<p>\u201cWe did a bunch of custom <a href=\"https:\/\/developer.nvidia.com\/about-cuda\">CUDA<\/a> code to be able to process all of the depth data in real time, so it\u2019s much faster than before, and it\u2019s highly tailored to the Jetson TX2 now,\u201d said Cann.<\/p>\n<h2><b>Remote Worker Presence<\/b><\/h2>\n<figure id=\"attachment_46512\" aria-describedby=\"caption-attachment-46512\" class=\"wp-caption alignright\"><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/blogs.nvidia.com\/wp-content\/uploads\/2020\/08\/Double3-263x400.jpg\" alt=\"\" width=\"263\" height=\"400\"><figcaption id=\"caption-attachment-46512\" class=\"wp-caption-text\">The Double helped engineers inspect Selene while it was under construction.<\/figcaption><\/figure>\n<p>The Double device, as it\u2019s known, was designed to let remote workers visit offices in the form of the robot and join their co-workers in meetings. An internet video call lets colleagues see and hear the remote worker on the device\u2019s tablet screen.<\/p>\n<p>The Double was a popular fixture at tech companies on the East and West Coasts in the five years before the pandemic, and interest remains strong, though in different use cases, according to the company. 
It has also proven useful in rural communities across the country, where people travel long distances to get anywhere, the company said.<\/p>\n<p>NVIDIA purchased a telepresence robot from Double Robotics so that non-essential designers sheltering at home could maintain daily contact with work on <a href=\"https:\/\/blogs.nvidia.com\/blog\/2020\/08\/14\/making-selene-pandemic-ai\/\">Selene, the world\u2019s seventh-fastest computer<\/a>.<\/p>\n<p>Some customers say the robot\u2019s physical presence breaks down communication barriers for remote workers, enabling interactions that video conferencing platforms can\u2019t match.<\/p>\n<p>COVID-19 has also spurred interest in contact-free work using the Double. Pharmaceutical companies have contacted Double Robotics asking how the robot might aid in international development efforts, according to Cann. The biggest use case amid the pandemic is replacing international business travel, he said. Instead of flying in to visit a company office, would-be travelers could be offered a Double at the destination.<\/p>\n<h2><b>Double 3 Jetson Advances<\/b><\/h2>\n<p>Now shipping, the Double 3 features wide-angle and zoom cameras and supports night vision. 
It also uses two stereovision sensors for depth vision, five ultrasonic range finders, two wheel encoders and an inertial measurement unit.<\/p>\n<p>Double Robotics will sell the head of the new Double 3 \u2014 which includes the Jetson TX2 \u2014 to existing customers seeking to upgrade their robots\u2019 brains for access to increasing levels of autonomy.<\/p>\n<p>To enable the autonomous capabilities, Double Robotics relied on the NVIDIA Jetson TX2 to process all of the camera and sensor data in real time, using its CUDA-enabled GPU and accelerated multimedia and image processors.<\/p>\n<p>The company is working on autonomous features for improved self-navigation, safety features for obstacle avoidance, and other capabilities such as improved auto-docking for recharging and autopilot all the way into offices.<\/p>\n<p>Right now, the Double offers assisted driving that helps pilots avoid hitting walls. The company next aims for full office autonomy and ways to help the robot get through closed doors.<\/p>\n<p>\u201cOne of the reasons we chose the NVIDIA Jetson TX2 is that it comes with the <a href=\"https:\/\/developer.nvidia.com\/embedded\/jetpack\">JetPack SDK<\/a> that makes it easy to get started and there\u2019s a lot that\u2019s already done for you \u2014 it\u2019s certainly a huge help to us,\u201d said 
Cann.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>http:\/\/feedproxy.google.com\/~r\/nvidiablog\/~3\/WeUaYw_zwQs\/<\/p>\n","protected":false},"author":0,"featured_media":190,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/189"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=189"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/189\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/190"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=189"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=189"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=189"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}