Aggressive trajectories involve agile flight at high speeds. Accurately tracking such trajectories is a challenging robotics and control problem relevant to many applications, such as collision-avoidance systems and autonomous search-and-rescue drones.
FlightGoggles is a development environment for the design, implementation, testing, and validation of autonomous vehicles. It provides photorealistic exteroceptive sensor simulation based on the Unity3D engine, generated using photogrammetric scans of real environments, together with vehicle dynamics and inertial sensor simulation. It supports traditional simulation, hardware-in-the-loop simulation, virtual-reality simulation, and augmented-reality simulation.
Most autonomous vehicles exhibit complex dynamics at high speeds. A quadcopter aerial vehicle, for example, is subject to complex aerodynamics, battery electrochemistry, and actuator dynamics. This project uses data-driven approaches to design very fast trajectories that account for these effects, identifying them from a set of carefully selected experiments.
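A minimal sketch of the data-driven idea, under illustrative assumptions (the quadratic drag model, the thrust margin, and the simulated measurements are hypothetical, not the project's actual models): fit an unknown drag coefficient from flight-experiment data, then use the fitted model to bound feasible speed for a trajectory segment.

```python
import numpy as np

# Simulated experiment data: airspeed (m/s) and measured drag force (N).
rng = np.random.default_rng(0)
v = np.linspace(1.0, 15.0, 30)
k_true = 0.12                                 # ground-truth coefficient, unknown to the fit
f_drag = k_true * v**2 + rng.normal(0.0, 0.05, v.size)

# Least-squares fit of f = k * v^2 (the model is linear in k).
k_hat = np.sum(f_drag * v**2) / np.sum(v**4)

# Largest speed for which predicted drag stays below a 20 N thrust margin.
f_max = 20.0
v_max = np.sqrt(f_max / k_hat)

print(f"fitted k = {k_hat:.3f}, max feasible speed = {v_max:.1f} m/s")
```

In practice the identified model would feed a trajectory optimizer rather than a single speed bound, but the fit-then-plan structure is the same.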
Most mobile robotics tasks involve a strong element of perception. Motion planning that is tightly integrated with perception, or even motion executed for the sake of better perception, is critical in high-performance robotics tasks. We study the foundations of perception-aware planning, developing decision-making algorithms that optimize perception objectives. The video shows the yaw angle being modified to optimize visual-inertial state-estimation performance.
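One simple way to make the yaw choice concrete is sketched below, under illustrative assumptions (the landmark positions, the 90-degree field of view, and the visible-feature-count objective are hypothetical stand-ins for a real visual-inertial estimation metric): at a trajectory waypoint, pick the yaw angle that keeps the most landmarks inside the camera's horizontal field of view.

```python
import numpy as np

def best_yaw(position, landmarks, fov=np.deg2rad(90.0), candidates=360):
    """Return the yaw (rad) maximizing the count of landmarks in view."""
    # Bearing from the robot to each 2-D landmark.
    bearings = np.arctan2(landmarks[:, 1] - position[1],
                          landmarks[:, 0] - position[0])
    yaws = np.linspace(-np.pi, np.pi, candidates, endpoint=False)
    # Angular offset of each landmark from each candidate yaw, wrapped to [-pi, pi].
    diff = np.angle(np.exp(1j * (bearings[None, :] - yaws[:, None])))
    visible = np.abs(diff) <= fov / 2.0
    return yaws[np.argmax(visible.sum(axis=1))]

# Three landmarks clustered ahead of the robot, one behind and to the left.
landmarks = np.array([[5.0, 0.0], [5.0, 1.0], [4.0, -1.0], [-3.0, 2.0]])
yaw = best_yaw(np.zeros(2), landmarks)
```

A full perception-aware planner would score an estimation-quality objective (e.g. expected feature-track length or information gain) along the whole trajectory rather than a per-waypoint feature count, but the structure of the yaw search is similar.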