NVIDIA has introduced a new datacentre solution for testing autonomous vehicles via virtual reality, creating a safer and more scalable way to bring self-driving cars to the roads.
The new capability, Drive Constellation, is a cloud-based photorealistic simulation platform built on two different servers. NVIDIA CEO Jensen Huang announced the product at the GPU Technology Conference 2018 in San Jose.
Drive Constellation’s first server will run the NVIDIA Drive Sim software, which simulates an autonomous vehicle’s sensors, such as cameras and radars. The second server will contain NVIDIA Drive Pegasus, an AI car computer that will run the autonomous vehicle software stack and process data as if it were coming from the sensors of a real car on the road.
NVIDIA GPUs will power the simulation server, which will generate simulated sensor data and feed it into Drive Pegasus for processing. The aim of the new capability is to ensure autonomous vehicles are efficiently tested with an end-to-end solution, ensuring they are safe and secure.
“Deploying production self-driving cars requires a solution for testing and validating on billions of driving miles to achieve the safety and reliability needed for customers,” Rob Csongor, VP and GM of Automotive at NVIDIA said. “With Drive Constellation, we’ve accomplished that by combining our expertise in visual computing and data centres. With virtual simulation, we can increase the robustness of our algorithms by testing on billions of miles of custom scenarios and rare corner cases, all in a fraction of the time and cost it would take to do so on physical roads.”
Following incidents in the US involving Uber’s autonomous vehicles, the testing and safety of such vehicles is of the utmost importance, and NVIDIA’s solution aims to address exactly that. Despite the announcement, the company has suspended its self-driving vehicle tests on public roads for the foreseeable future following Uber’s autonomous incident. The company provides technology to Uber for its self-driving vehicles but said it will continue to gather self-driving vehicle data.
Any driving commands from Drive Pegasus will be fed back to the simulator, completing a ‘hardware-in-the-loop’ cycle, which occurs 30 times a second. This cycle will be used to validate that algorithms and software running on Pegasus are operating the simulated vehicle correctly.
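The 30 Hz hardware-in-the-loop cycle described above can be illustrated with a minimal sketch. This is not NVIDIA's API: the function names (`simulate_sensors`, `compute_commands`, `apply_commands`) and the toy vehicle state are hypothetical stand-ins for the two servers' roles, showing only the shape of the closed loop.

```python
TICK_HZ = 30            # the loop rate described in the article
TICK_S = 1.0 / TICK_HZ  # simulated time step per cycle

def simulate_sensors(state):
    # Hypothetical stand-in for the simulation server (Drive Sim role):
    # renders camera/radar data from the current simulated world state.
    return {"camera": state["position"], "radar": state["position"]}

def compute_commands(sensors):
    # Hypothetical stand-in for the AI car computer (Drive Pegasus role):
    # consumes sensor data and emits driving commands.
    return {"steering": 0.0, "throttle": 0.1}

def apply_commands(state, commands):
    # Commands are fed back to the simulator, closing the loop.
    state["position"] += commands["throttle"]
    return state

def run_hil_loop(ticks):
    state = {"position": 0.0}
    for _ in range(ticks):
        sensors = simulate_sensors(state)     # server 1 output
        commands = compute_commands(sensors)  # server 2 output
        state = apply_commands(state, commands)
    return state

# One simulated second corresponds to 30 iterations of the cycle.
final_state = run_hil_loop(TICK_HZ)
```

In the real system each stage runs on dedicated hardware, and validation consists of checking that the commands Pegasus produces keep the simulated vehicle behaving correctly across many such cycles.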
The Drive Sim software also generates photoreal data streams to create different testing environments, including rainstorms and snowstorms, blinding glare, limited vision at night, and different types of road surfaces and terrain. Additionally, dangerous situations can be scripted to test the car’s ability to react, without endangering anyone. In doing so, the platform prepares vehicles for any weather condition, helping to make them as safe as a car with a human driver.
“Autonomous vehicles need to be developed with a system that covers training to testing to driving,” said Luca De Ambroggi, research and analyst director at IHS Markit. “NVIDIA’s end-to-end platform is the right approach. DRIVE Constellation for virtually testing and validating will bring us a step closer to the production of self-driving cars.”
The Drive Constellation solution will be available in the third quarter of 2018.