Advanced Autonomous Systems
Cutting-Edge Technology for Autonomous Flight
Aurora is advancing autonomous decision-making capabilities and human-machine interaction across commercial and defense applications. We develop new technology for decision making in complex, multi-vehicle scenarios and apply our expertise in AI/ML, GNC, and robotics to rapidly prototype and iterate on real air vehicles.
We put experienced aviators in teams with unmanned systems to improve safety, enable new capabilities, and increase trust in autonomy. This work combines new human-autonomy interaction methods, advancements in autonomy, and rigorous study of technology’s role in mission success.
Our Advanced Teaming Integration Lab (ATIL) enables user-centric, agile development and test of multi-vehicle autonomous systems, filling the gap between pure simulation and flight test.
Using ATIL, we are inventing and validating new mission capabilities. For example, our MUM-T simulation for MQ-25™ showed how an open behavioral software framework can be used to aggregate traditional unmanned system commands into an overall autonomous mission behavior.
Using a low-fidelity flight simulator and noninvasive sensors, such as eye trackers and heart rate monitors, Aurora’s humans and autonomy team can estimate a pilot’s workload and situational awareness to enhance safety.
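Aurora's actual workload model is not described here, but the idea of fusing noninvasive physiological signals into a single estimate can be sketched as follows. The window normalization ranges, feature choices, and weights below are illustrative assumptions, not the team's real model.

```python
import statistics

def workload_index(pupil_mm, rr_intervals_ms, w_pupil=0.6, w_hrv=0.4):
    """Fuse pupil diameter and heart-rate variability into a 0-1 workload index.

    Illustrative assumptions: pupil diameter rises with workload, and RMSSD
    (a common heart-rate-variability measure) falls with workload.
    """
    # Normalize mean pupil diameter against a typical 2-8 mm physiological range.
    pupil_score = min(max((statistics.mean(pupil_mm) - 2.0) / 6.0, 0.0), 1.0)

    # RMSSD: root mean square of successive differences between heartbeats.
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    # Map RMSSD (roughly 10-100 ms at rest) to a stress score: low HRV -> high score.
    hrv_score = min(max(1.0 - (rmssd - 10.0) / 90.0, 0.0), 1.0)

    return w_pupil * pupil_score + w_hrv * hrv_score

# A dilated pupil with low beat-to-beat variability suggests higher workload
# than a small pupil with high variability.
high = workload_index([6.5, 6.8, 6.6], [800, 805, 798, 802])
low = workload_index([3.0, 3.1, 2.9], [800, 880, 760, 850])
```

A production system would estimate such an index over sliding time windows and calibrate it per pilot; this sketch shows only the signal-fusion step.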
Our advanced perception technologies, including detect and avoid and computer vision, enable autonomous flight across platforms from small UAS to urban air mobility.
Aurora develops conflict detection and resolution technology for intruder detection, remaining well clear, and collision avoidance. This capability is critical to integrating unmanned aircraft into the National Airspace System (NAS).
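A common building block in such detect-and-avoid logic is a constant-velocity closest-point-of-approach (CPA) check on the relative state of ownship and intruder. The separation threshold and look-ahead horizon below are illustrative placeholders, not the formal well-clear definition.

```python
def cpa_conflict(rel_pos, rel_vel, sep_threshold_m=1500.0, horizon_s=60.0):
    """Predict a loss of separation using a constant-velocity CPA check.

    rel_pos: intruder position minus ownship position (m), horizontal plane.
    rel_vel: intruder velocity minus ownship velocity (m/s).
    The threshold and horizon are illustrative placeholders.
    Returns (conflict, time_of_cpa_s, distance_at_cpa_m).
    """
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # Time of closest approach; if diverging (or stationary), closest point is now.
    t_cpa = max(0.0, -(px * vx + py * vy) / v2) if v2 > 0 else 0.0
    dx, dy = px + vx * t_cpa, py + vy * t_cpa
    d_cpa = (dx * dx + dy * dy) ** 0.5
    conflict = d_cpa < sep_threshold_m and t_cpa <= horizon_s
    return conflict, t_cpa, d_cpa

# Head-on geometry: intruder 5 km ahead, closing at 100 m/s -> conflict in 50 s.
hit, t, d = cpa_conflict((5000.0, 0.0), (-100.0, 0.0))
# Crossing geometry that misses laterally by 3 km -> no conflict.
miss, _, _ = cpa_conflict((5000.0, 3000.0), (-100.0, 0.0))
```

Real systems layer sensor uncertainty, maneuvering intruders, and alerting logic on top of this geometric core.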
Aurora’s MIDAS counter-UAS uses deep learning computer vision algorithms to detect, localize, and track adversary sUAS targets.
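MIDAS's internals are not detailed here, but the track stage that follows per-frame detection can be illustrated generically with nearest-neighbor data association. The gate size and pixel coordinates below are illustrative, not drawn from MIDAS.

```python
import itertools

def associate(tracks, detections, gate=50.0):
    """Greedy nearest-neighbor association of detections to existing tracks.

    tracks: dict of track_id -> (x, y) last known position (pixels).
    detections: list of (x, y) detector outputs for the current frame.
    Detections farther than `gate` from every track spawn new tracks.
    The gate value is an illustrative placeholder.
    """
    next_id = itertools.count(max(tracks, default=-1) + 1)
    unmatched = list(detections)
    for tid, (tx, ty) in list(tracks.items()):
        if not unmatched:
            break
        # Pick the closest remaining detection for this track.
        best = min(unmatched, key=lambda d: (d[0] - tx) ** 2 + (d[1] - ty) ** 2)
        if (best[0] - tx) ** 2 + (best[1] - ty) ** 2 <= gate ** 2:
            tracks[tid] = best
            unmatched.remove(best)
    for det in unmatched:  # anything left becomes a new track
        tracks[next(next_id)] = det
    return tracks

# Two targets move slightly between frames; their track IDs stay stable.
tracks = {}
tracks = associate(tracks, [(100.0, 100.0), (400.0, 300.0)])
tracks = associate(tracks, [(105.0, 102.0), (398.0, 310.0)])
```

Fielded trackers typically replace the greedy matcher with globally optimal assignment and a motion model (e.g., a Kalman filter) per track; the sketch shows only the association idea.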
Following our work with the Office of Naval Research on the Autonomous Aerial Cargo Utility System (AACUS) program, we continue to mature technology that enables rotary-wing aircraft to operate autonomously, including the capability to identify, map, and select landing sites in unknown and hazardous environments.
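The landing-site selection step can be illustrated with a simple heuristic: scan an elevation map for the flattest patch large enough to land on. The window size, roughness metric, and threshold below are illustrative assumptions, not the AACUS algorithm.

```python
def best_landing_cell(elev, size=3, max_rough_m=1.5):
    """Pick the flattest size x size window in an elevation grid (meters).

    Roughness = max - min elevation inside the window; windows rougher than
    max_rough_m are rejected. Thresholds are illustrative placeholders.
    Returns the (row, col) of the best window's top-left corner, or None.
    """
    best, best_rough = None, float("inf")
    rows, cols = len(elev), len(elev[0])
    for i in range(rows - size + 1):
        for j in range(cols - size + 1):
            window = [elev[i + di][j + dj]
                      for di in range(size) for dj in range(size)]
            rough = max(window) - min(window)
            if rough < best_rough and rough <= max_rough_m:
                best, best_rough = (i, j), rough
    return best

# A mostly rugged terrain grid with one flat 3x3 patch in the lower-right corner.
grid = [
    [5.0, 9.0, 3.0, 7.0, 8.0],
    [2.0, 8.0, 4.0, 6.0, 9.0],
    [7.0, 3.0, 1.0, 1.1, 1.0],
    [6.0, 4.0, 1.1, 1.0, 1.2],
    [8.0, 5.0, 1.0, 1.2, 1.1],
]
best = best_landing_cell(grid)
```

A real system would fuse lidar returns into such a map in flight and also weigh slope, obstacles, and approach geometry.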
Guidance, Navigation and Control
Guidance, navigation, and control (GNC) is core to the operation and integration of autonomous air vehicles. Aurora’s GNC team has a broad reach across internal and customer vehicle programs in the commercial and defense industries.
Route and trajectory planning enables fast and safe decisions about the path ahead, taking into account obstacles, no-fly zones, weather, ATC clearances, and other considerations.
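On a discretized map, the route-planning problem above can be sketched with textbook A* search, where blocked cells stand in for obstacles or no-fly zones. This is a standard planner for illustration, not Aurora's production algorithm.

```python
import heapq

def astar(start, goal, blocked, width, height):
    """A* on a 4-connected grid; `blocked` is a set of (x, y) no-go cells."""
    def h(cell):  # Manhattan distance, admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height) or nxt in blocked:
                continue
            new_cost = cost[cur] + 1
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = cur
                heapq.heappush(frontier, (new_cost + h(nxt), nxt))
    return None  # no feasible route

# Route around a vertical "no-fly" wall with a single gap at the top.
wall = {(2, y) for y in range(5) if y != 4}
path = astar((0, 0), (4, 0), wall, 5, 5)
```

Real planners work in continuous space-time with weather, ATC clearances, and vehicle dynamics as additional constraints, but the search structure is analogous.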
Contingency management and multi-vehicle route and schedule management will enable urban air mobility solutions at scale.
Our GNC engineers deploy flight controls onto real aircraft using the state-of-the-art in robust and adaptive control. Efforts include supporting Wisk as they design, certify, and bring a self-flying eVTOL aircraft to market.
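Robust and adaptive control spans many techniques; one classical example is model-reference adaptive control (MRAC), where the controller gains adapt so an uncertain plant tracks a reference model. The scalar plant, parameter values, and gains below are purely illustrative and are not drawn from any Aurora or Wisk design.

```python
def simulate_mrac(a=-1.0, b=2.0, a_m=2.0, gamma=2.0, r=1.0, dt=0.001, t_end=20.0):
    """Model-reference adaptive control of a scalar plant dx/dt = a*x + b*u.

    The true (a, b) are unknown to the controller; it only assumes b > 0.
    Reference model: dx_m/dt = -a_m*x_m + a_m*r (tracks the command r).
    Control law: u = k_r*r - k_x*x with Lyapunov-based adaptation
      dk_r/dt = -gamma*e*r,  dk_x/dt = gamma*e*x,  where e = x - x_m,
    which gives dV/dt = -a_m*e**2 for V = e**2/2 + (b/(2*gamma))*(error norms).
    All numeric values here are illustrative.
    """
    x = x_m = 0.0
    k_r = k_x = 0.0
    for _ in range(int(t_end / dt)):
        e = x - x_m
        u = k_r * r - k_x * x
        # Compute all derivatives from the pre-update state, then Euler-step.
        dx = a * x + b * u
        dx_m = -a_m * x_m + a_m * r
        dk_r = -gamma * e * r
        dk_x = gamma * e * x
        x += dt * dx
        x_m += dt * dx_m
        k_r += dt * dk_r
        k_x += dt * dk_x
    return x, x_m, k_r, k_x

x, x_m, k_r, k_x = simulate_mrac()
```

After the adaptation transient, the plant state tracks the reference model even though the controller never learns the true plant parameters explicitly.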
Early-Stage Technology Development
Aurora partners with well-respected university research programs and government labs to develop technology at the cutting edge of AI and machine learning. Example programs include:
DARPA’s Intelligent Auto-Generation and Composition of Surrogate Models project, known as Ditto, where we focus on improving computer-generated design models by integrating artificial intelligence and machine learning to speed up simulations of future military equipment.
DARPA’s Shared-Experience Lifelong Learning (ShELL) program to develop AI algorithms that achieve lifelong learning for agents that learn new tasks and share their experiences with each other, while accounting for communications and hardware constraints.
DARPA’s Enabling Confidence program, where our teams aim to enhance machine learning (ML) systems by developing scalable methods to generate and maintain accurate statistical models in state-of-the-art deep neural networks for object detection and tracking.