The sophistication of autonomous systems being developed across many domains and industries has increased markedly in recent years, due in large part to advances in computing, modeling, sensing, and other technologies. While much of the technology enabling this revolution has advanced rapidly, formal safety assurances for these systems still lag behind. This gap stems largely from their reliance on data-driven machine learning (ML) components, whose behavior is difficult to predict and which lack the mathematical frameworks needed to provide guarantees of correctness. Without such assurances, trust in a learning-enabled cyber-physical system's (LE-CPS's) safety and correct operation is limited, impeding broad deployment and adoption for critical defense capabilities.
To address this challenge, DARPA's Assured Autonomy (AA) program is working to provide continual assurance of an LE-CPS's safety and functional correctness, both at design time and during operation. The program is developing mathematically verifiable approaches and tools that can be applied to different types and applications of data-driven ML algorithms in these systems to enhance their autonomy and assure that they achieve an acceptable level of safety. To ground the research objectives, the program prioritizes challenge problems in the defense-relevant autonomous vehicle space, specifically air, land, and underwater platforms. The technology resulting from the program takes the form of publicly available tools integrated into LE-CPS design toolchains, which are being made widely available for use in the commercial and defense sectors. These tools address the assurance challenge in three ways. At design time, they analyze system behavior using a combination of formal verification and simulation-based approaches. At runtime, a collection of runtime assurance technologies ensures that the system operates within, or close to, its analyzed behavior by detecting and recovering from anomalies. Finally, assurance case tools are used to construct evidence-based arguments that the principal hazards have been identified and mitigated.
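To illustrate the runtime assurance idea described above, the sketch below shows a minimal Simplex-style wrapper: an unverified, learning-enabled controller is allowed to act only while its predicted next state stays inside a safe envelope established at design time; otherwise a verified fallback controller takes over. All names and dynamics here (`learned_controller`, `safe_controller`, the envelope bounds, the double-integrator model) are illustrative assumptions, not the AA program's actual tools.

```python
# Hedged sketch of Simplex-style runtime assurance (illustrative only;
# controller names, dynamics, and bounds are assumptions, not AA tools).

def learned_controller(position, velocity):
    """Stand-in for an unverified, learning-enabled controller."""
    return 2.0  # always commands aggressive acceleration


def safe_controller(position, velocity):
    """Verified fallback: brake toward zero velocity."""
    return -velocity


def within_envelope(position, velocity, pos_limit=10.0, vel_limit=1.5):
    """Safe operating region assumed to come from design-time analysis."""
    return abs(position) <= pos_limit and abs(velocity) <= vel_limit


def assured_step(position, velocity, dt=0.1):
    """Run the learned controller, but switch to the verified fallback
    if its action would push the predicted next state out of the envelope."""
    accel = learned_controller(position, velocity)
    next_vel = velocity + accel * dt
    next_pos = position + next_vel * dt
    if not within_envelope(next_pos, next_vel):
        # Anomaly detected: recover using the verified controller.
        accel = safe_controller(position, velocity)
        next_vel = velocity + accel * dt
        next_pos = position + next_vel * dt
    return next_pos, next_vel


# Simulate: the learned controller keeps accelerating, and the runtime
# monitor repeatedly intervenes to keep velocity inside the envelope.
pos, vel = 0.0, 0.0
for _ in range(50):
    pos, vel = assured_step(pos, vel)
print(abs(vel) <= 1.5)  # → True
```

The key design point is that the safety argument rests only on the monitor and fallback, which are simple enough to verify formally, so the learned controller itself need not be verified.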
The AA Virtual Organization allows users to interact with and explore Assured Autonomy program technologies via an access-managed web interface. The AA tools are deployed on a suite of standard CPS-VO services supporting authentication, graphical modeling, version control, dependency management, data access, visualization, execution, and report generation. Each design studio is complemented by documentation, a video demonstration, and a suite of examples. The publication listing showcases the published research artifacts of the program's performers. The tool listing portal presents the simulators, verification tools, and tool suites that provide foundations for building assured autonomous systems; the tools are categorized according to a taxonomy, allowing interested users to quickly find relevant tools. The news and events listing provides current news and events associated with the AA program.
DARPA Assured Autonomy (AA) program site: https://www.darpa.mil/program/assured-autonomy