News & Notices

ROSgeoregistration: Aerial Multi-spectral Image Simulator for the Robot Operating System

This article describes a software package called ROSgeoregistration intended for use with the Robot Operating System (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, testing, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, merges these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin is provided which uses the real-time sensor pose and an image formation model to generate simulated imagery from the specified reference image, along with related plugins that publish UAV-relevant data. The novelty of this work is threefold: (1) it is the first system to link the massive multi-spectral imaging database of Google's Earth Engine to the Gazebo simulator, (2) it is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) its integration with other UAS tools creates a new holistic UAS simulation environment supporting UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground-truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and wind speed sensor data. To highlight this functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and present an example use case for developing and evaluating image-based UAS position feedback, i.e., pose estimates for image-based Guidance, Navigation, and Control (GNC) applications.
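
To make the client-side data flow described above concrete, the following minimal sketch shows a ROS 1 (rospy) node that receives simulated imagery synchronously with telemetry. The topic names (/uav/camera/image_raw, /uav/imu, /uav/gps/fix) and the synchronization settings are illustrative placeholders, not the actual interface published by ROSgeoregistration.

import rospy
import message_filters
from sensor_msgs.msg import Image, Imu, NavSatFix

def callback(image_msg, imu_msg, fix_msg):
    # Imagery and telemetry arrive as a time-aligned tuple; a georegistration
    # algorithm would be invoked here with the image and the vehicle state.
    rospy.loginfo("image %dx%d  stamp %.3f  lat %.6f lon %.6f",
                  image_msg.width, image_msg.height,
                  image_msg.header.stamp.to_sec(),
                  fix_msg.latitude, fix_msg.longitude)

if __name__ == "__main__":
    rospy.init_node("georegistration_client")

    # Subscribe to the simulated sensor image and two telemetry streams.
    image_sub = message_filters.Subscriber("/uav/camera/image_raw", Image)
    imu_sub = message_filters.Subscriber("/uav/imu", Imu)
    fix_sub = message_filters.Subscriber("/uav/gps/fix", NavSatFix)

    # Approximate-time synchronization delivers messages whose stamps fall within
    # `slop` seconds of each other as a single callback, matching the synchronous
    # imagery-plus-telemetry delivery described in the abstract.
    sync = message_filters.ApproximateTimeSynchronizer(
        [image_sub, imu_sub, fix_sub], queue_size=10, slop=0.05)
    sync.registerCallback(callback)

    rospy.spin()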

Hardware-Accelerated SAR Simulation with NVIDIA-RTX Technology

Article published by SPIE:

Title:

Hardware-Accelerated SAR Simulation with NVIDIA-RTX Technology

Abstract:

Synthetic Aperture Radar (SAR) is a critical sensing technology that is notably independent of the sensor-to-target distance and has numerous cross-cutting applications, e.g., target recognition, mapping, surveillance, oceanography, geology, forestry (biomass, deforestation), disaster monitoring (volcano eruptions, oil spills, flooding), and infrastructure tracking (urban growth, structure mapping). SAR uses a high-power antenna to illuminate target locations with electromagnetic radiation, e.g., 10 GHz radio waves, and the backscatter from the illuminated surface is sensed by the antenna and then used to generate images of structures. Real SAR data is difficult and costly to produce and, for research, lacks a reliable source of ground truth. Few SAR software simulators are available, and even fewer are open source and can be validated. This article proposes an open-source SAR simulator that computes phase histories for arbitrary 3D scenes using newly available ray-tracing hardware made commercially available through NVIDIA's RTX graphics card series. The OptiX GPU ray tracing library for NVIDIA GPUs is used to calculate SAR phase histories at unprecedented computational speeds. The simulation results are validated against existing SAR simulation code for spotlight SAR illumination of point targets. The computational performance of this approach provides orders-of-magnitude speed increases over CPU simulation, with an additional order of magnitude of acceleration when simulations are run on RTX GPUs, which include hardware specifically designed to accelerate OptiX ray tracing. The article describes the OptiX simulator structure, processing framework, and calculations that afford execution on a massively parallel GPU computation device. The shortcoming of the OptiX library's restriction to single-precision floating-point representation is discussed, and modifications of sensitive calculations are proposed to reduce truncation error, thereby increasing simulation accuracy under this constraint.
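
The single-precision limitation noted in the abstract is easy to reproduce. The sketch below (plain NumPy, with made-up geometry: a 10 GHz carrier and a roughly 100 km standoff range) shows how storing the full sensor-to-target range in float32 corrupts the two-way phase 4*pi*R/lambda, and how a split-reference formulation, carrying the large scene-center range in double precision and truncating only the small residual, restores the accuracy. This split is a common mitigation for this class of truncation error and is shown only for illustration; it is not necessarily the specific modification proposed in the article.

import numpy as np

# Illustrative values only: 10 GHz carrier, ~100 km standoff, target 4 mm from
# the scene-center reference point.
wavelength = 3e8 / 10e9          # ~0.03 m
r_ref = 100_000.0                # scene-center reference range [m], kept in float64
r_exact = r_ref + 0.004          # true sensor-to-target range [m]

# Naive: store the full range in float32, as a single-precision ray tracer would.
# Adjacent float32 values near 100 km are ~8 mm apart, swamping the 4 mm offset.
r_f32 = np.float32(r_exact)
phase_naive = -4.0 * np.pi * float(r_f32) / wavelength

# Split-reference mitigation: only the small residual range is truncated to
# float32; the large reference term keeps full precision.
dr_f32 = np.float32(r_exact - r_ref)
phase_split = -4.0 * np.pi * r_ref / wavelength - 4.0 * np.pi * float(dr_f32) / wavelength

phase_true = -4.0 * np.pi * r_exact / wavelength
print(f"full-range float32 phase error: {abs(phase_naive - phase_true):.3e} rad")
print(f"split-reference phase error:    {abs(phase_split - phase_true):.3e} rad")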

Link to the draft article on arXiv: https://arxiv.org/pdf/2005.09736.pdf
