November 3, 2020
Author:
Mike Taylor

Use Case: Drone Inspection


Background

In late 2019, Duality was approached by an industrial customer with an acute need to accelerate their virtual product development and test process. This customer had developed a highly successful drone-based system for inspecting a wide array of infrastructure but was struggling under the weight of their virtual development pipeline and the field testing required to compensate for deficiencies in their simulation system. With a simulation team approaching a dozen people and progress stalling, the customer needed help to complete their vision of true end-to-end virtual development and to lay the groundwork for advanced development efforts in 2020.

Architecting a Solution

Addressing the customer’s needs went well beyond delivering a technology solution.

Our customer needed help in a number of areas:

  • Developing a solution capable of project-specific, photorealistic synthetic imagery.
  • Unblocking their efforts to produce an end-to-end virtual development pipeline.
  • Moving their testing from real-world flight testing to virtual full-system tests.
  • Expanding a limited set of real-world examples into thousands of training images.
  • Developing the infrastructure to train fresh algorithms on synthetic data.
  • Integrating advanced physics to enable realistic flight dynamics.
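The expansion step above can be thought of as parameter-space sampling: a few real captures seed many randomized rendering configurations. The sketch below is purely illustrative; the parameter names, ranges, and helper functions are assumptions for this example, not the customer's actual pipeline or Duality's API.

```python
import random

# Illustrative sketch: expand a handful of real captures into a large
# synthetic training set by sampling scene and camera variations.
# All names and ranges here are assumptions, not the real pipeline.

BASE_CAPTURES = ["capture_01", "capture_02", "capture_03"]

def sample_variation(rng):
    """Draw one randomized rendering configuration."""
    return {
        "sun_elevation_deg": rng.uniform(10, 80),   # time-of-day lighting
        "haze_density": rng.uniform(0.0, 0.4),      # atmospheric scattering
        "camera_yaw_deg": rng.uniform(-15, 15),     # camera-mount jitter
        "vegetation_seed": rng.randrange(10_000),   # re-scatter plants
    }

def expand(base_captures, variants_per_capture, seed=0):
    """Turn a few real examples into many synthetic training configs."""
    rng = random.Random(seed)
    dataset = []
    for capture in base_captures:
        for _ in range(variants_per_capture):
            dataset.append({"base": capture, **sample_variation(rng)})
    return dataset

configs = expand(BASE_CAPTURES, variants_per_capture=1000)
print(len(configs))  # 3 base captures x 1000 variants = 3000 configurations
```

A seeded generator keeps each expanded dataset reproducible, which matters when comparing detector performance across training runs.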

Based on these needs, Duality recognized that this customer needed both our tech stack and our support services to move the project forward.

The Duality team worked with the customer’s leadership and engineering team to design a custom solution tailored to integrate with their existing development pipeline.  The solution leveraged Epic’s Unreal Engine to deliver a hyper-realistic, highly flexible simulation.

In a short 10-week project, a photorealistic environment was developed, delivered, and integrated with the customer’s system.  In February 2020, the system went live.

Accuracy At All Scales

The customer’s system uses multiple cameras to collect imagery of the target environment.  These images are then analyzed by a collection of DNN-based modules to detect features of interest that indicate a need to repair or replace components.

Drawing on long experience in robotics and simulation, the Duality team understands that the accuracy of a synthetic environment is judged at many different scales.  Beyond accurately modeled synthetic features, proper camera simulation, realistic lighting, and appropriate atmospherics are all crucial to ensuring that the customer’s system correctly detects the presence of structural flaws. Further, the accuracy and variability of the surrounding environment are crucial for testing the real-world performance of these algorithms.

To meet these accuracy needs, Duality brought their world-generation toolkit to bear to create a replica of a particular location in the United States.  The final environment matched the real-world terrain, vegetation patterns, vegetation species, and sun patterns.  The vegetation included appropriate distributions of the plants found in this region, including variation in age, health, and coloration.

Falcon makes it easy to generate and customize scenes that reflect the real world.


Falcon's API allows users to quickly and easily customize the scene and add variation.


Crucially, this environment was created without direct observation of field data from the site in question.  Instead, only satellite and GIS data were needed, resulting in a solution the customer can use to mimic a broad swath of the area of interest.

The Duality team also understands that the synthetic features in these sorts of use cases have to look good to the human eye as well as to the AI system.  To that end, the project was designed around frequent, rapid iteration.  The synthetic detectable features were first created based on example imagery provided by the customer.  The team then iterated on these features by feeding synthetic images through the customer’s detection system and making adjustments based on the system’s performance.
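That feedback loop can be sketched in a few lines: render a synthetic feature, score it with the detector, and nudge the feature parameters until the detector responds as it would to real imagery. The renderer and detector below are stand-in stubs with assumed names (`render_feature`, `detector_recall`); they model the shape of the loop, not the actual systems involved.

```python
# Minimal sketch of the iterate-and-adjust loop, under assumed stubs.

def render_feature(crack_width):
    """Stand-in for the renderer: returns a toy 'image' descriptor."""
    return {"crack_width": crack_width}

def detector_recall(image):
    """Stand-in for the detector: recall peaks when width is near 1.0."""
    return max(0.0, 1.0 - abs(image["crack_width"] - 1.0))

def iterate(width, target=0.95, step=0.1, max_rounds=50):
    """Adjust the synthetic feature until the detector's recall hits target."""
    for _ in range(max_rounds):
        recall = detector_recall(render_feature(width))
        if recall >= target:
            break
        # Nudge the parameter toward what the detector responds to.
        width += step if width < 1.0 else -step
    return width, recall

final_width, final_recall = iterate(width=0.3)
print(round(final_width, 2), round(final_recall, 2))
```

In practice each "round" of such a loop involves a human in the loop and a full detection run rather than a scalar score, but the structure — render, evaluate, adjust — is the same.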

Results

The completed environment was delivered to the customer ahead of schedule, with data production beginning the very next day.  This 10-week project delivered a system that eclipsed an internal effort two years in the making, representing a 90% reduction in development timeline.

Further, the system delivered the first photoreal synthetic imagery usable by this team.  This project enabled the team to finally test their full perception system in a virtual environment.

In addition to a massive increase in the quality of the imagery, the Duality solution also operated at a much higher data rate than the customer’s internally developed solution.  The system was capable of rendering 42 MP photoreal images at a rate four times faster than the customer’s original solution.

The flexibility of the environment has since enabled the customer to repurpose it for a new application: using the Duality system to train an AI in advanced navigation approaches on a new flight vehicle and in a new operational domain.

The whole Duality team looks forward to traveling many more roads with this customer.