Full autonomy stack integration and photo-realistic environments for developing resilient unmanned systems

Simulate events that are too dangerous or logistically infeasible to stage, or for which real-world data is limited
Accurate, deterministic simulation yields high-quality synthetic data and reduces the need for real-world data collection.
Expansive, photo-realistic environments that match your operating domain
Proven to surpass photogrammetry and yield field-relevant results.
Accurate, validated virtual sensors stream data directly to your autonomy stack
LIDAR, RGB, Depth, Distance, IMU, GPS, Velocity, Probe, and more — immediately available for simulation.
Verifiably relevant and valuable synthetic data
Clear metrics on simulation performance and synthetic data output.
Expand the utility and value of field-test scenarios and field-collected data
Re-run and iterate on field-test events for more comprehensive testing and training.
Designed by and for robotics engineers, not just simulation experts
Made for integration with real engineering workflows in your development pipeline.

As a contractor for diverse government groups, we have the tools, resources, and experience to support your most complex projects from design to deployment.

Use Cases + Applications

Unmanned Systems Validation
User Need: Government contractor needs to validate the autonomy stack of their UGV in diverse off-road conditions.

Falcon Workflow: With our Peregrine API, customer connects their actual autonomy stack to Falcon where it controls a physically accurate model of the UGV in large, tunable, run-time customizable environments. Using Falcon's included suite of validated virtual sensors and out-of-the-box protocol integrations, synthetic perception data streams over ROS, enabling the UGV to interact with multitudes of relevant navigational challenges in real-time and developers to execute full regression testing.