Bringing the Sully-Factor to UAS Operations

Leveraging Artificial Intelligence and Vision Systems, Black Swift Technologies Brings the Sully-Factor to UAS Operations

Artificial intelligence (AI) can bring a predictive measure to unmanned aircraft systems (UAS): essentially failure prediction, coupled with a vision system that can recognize objects on the ground that must be avoided in the event of a forced landing due to a system or flight failure.

Using AI neural networks, Black Swift Technologies (BST) has been developing technology that enables a UAS to autonomously locate a safe landing site, avoiding people, vehicles, structures, and terrain obstacles in hierarchical order. The technology, named SwiftSTL™ (Swift Safe To Land), essentially uses AI to act as the "Sully factor": in a flight emergency with engine failure, determine the safest place to land the aircraft. (In the real-world event, Capt. Sully put his aircraft down in the Hudson River because there were no other viable alternatives.)

In this case the AI makes predictions based on the aircraft's current position, energy state, altitude, wind speed, and similar factors, then invokes an algorithm to steer the plane to a safe landing site. The AI can even predict failures before they happen, so that maintenance can be performed to avoid the potential failure and subsequent crash.
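To make the role of the energy state concrete, a rough first-order estimate of the reachable footprint in a glide is altitude multiplied by the aircraft's lift-to-drag ratio, stretched or shortened by wind. The sketch below is our own illustration of that textbook relationship (ignoring turns and wind shear), not BST's algorithm:

```python
def glide_range_m(altitude_m, lift_to_drag, airspeed_ms, wind_ms):
    """First-order reachable glide distance in meters: altitude * L/D,
    scaled by the ratio of ground speed to airspeed (headwind negative)."""
    still_air = altitude_m * lift_to_drag
    return still_air * (airspeed_ms + wind_ms) / airspeed_ms

# Example: 400 m up, L/D of 10, 15 m/s airspeed, 3 m/s tailwind.
print(glide_range_m(400, 10, 15, 3))   # → 4800.0 m
# Same situation with a 3 m/s headwind.
print(glide_range_m(400, 10, 15, -3))  # → 3200.0 m
```

Any landing site outside this radius is simply unreachable, which is why the energy state comes before the vision problem.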


Algorithms and Heuristics

Today, almost everything is a sensor input, and those sensors can capture real-time video or still images. From this information, the AI algorithm detects and classifies the imagery. SwiftSTL then applies supplied heuristics defining what to avoid (people, vehicles, building structures, and terrain obstacles) while determining where to land safely. In addition to identifying this autonomous emergency-landing site, BST's engineers added object detection, which is crucial: once the aircraft identifies a landing area, it can look in near real time for objects such as people or vehicles in and around the site that it could not see from higher altitude.

How it Works

SwiftSTL technology integrates state-of-the-art machine learning algorithms, artificial intelligence, and cutting-edge onboard processors to capture and segment images at altitude, enabling a UAS to autonomously identify a safe landing area in the event of a catastrophe.

Black Swift's technology processes high-resolution images quickly and efficiently onboard the UAS to identify objects and terrain that must be avoided during a safe emergency landing. It uses a machine vision technique known as semantic segmentation, which classifies objects at the pixel level by assigning each pixel a class label. Heuristics ensure that the aircraft does not hit people, vehicles, buildings, or other structures. Once it finds the ideal landing location, it relies on object detection to finish the landing.
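The pixel-level class map produced by semantic segmentation can feed a landing-site scorer directly. The sketch below is our own toy illustration of the idea, not BST's implementation: it assumes the segmentation output is a 2-D array of integer class IDs, assigns each class an invented penalty reflecting the avoidance hierarchy (people worst, rough terrain least bad), and picks the image tile with the lowest total penalty.

```python
import numpy as np

# Hypothetical class IDs and avoidance penalties (not BST's actual values);
# a higher penalty means the class is more important to avoid.
PENALTY = {0: 0.0,    # clear ground
           1: 1e6,    # person
           2: 1e4,    # vehicle
           3: 1e3,    # tree
           4: 1e2,    # building / structure
           5: 1.0}    # rough terrain

def score_windows(seg_map, win=32):
    """Tile the class map into win x win windows and return the top-left
    corner of the lowest-penalty (safest) window plus its penalty."""
    penalties = np.vectorize(PENALTY.get)(seg_map).astype(float)
    h, w = penalties.shape
    best, best_rc = float("inf"), None
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            total = penalties[r:r + win, c:c + win].sum()
            if total < best:
                best, best_rc = total, (r, c)
    return best_rc, best

# Toy 64x64 scene: mostly clear ground, a person in the top-left quadrant.
scene = np.zeros((64, 64), dtype=int)
scene[5:8, 5:8] = 1
corner, penalty = score_windows(scene)
print(corner, penalty)  # the chosen tile avoids the person entirely
```

A real system would of course weight distance, glide reachability, and surface slope as well; the point here is only how pixel-level class labels translate into a ranked choice of landing areas.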


SwiftSTL has demonstrated that it can accurately identify people, cars, trees, buildings, water, etc. These classifications are visible in the color-coded semantic segmentation map the system generates (Figure 1).


All of the classifications identified in the segmentation legend on the right side of Figure 1 are already mapped into the landing heuristics: avoid people first, then cars, then trees, buildings, and finally rough terrain. Each classification has a specific color coding that is built into the heuristic algorithm and mapped into the training of the AI.

Prevent the Accident Before it Happens

Machine learning is a very powerful tool with a variety of applications. BST has focused its development on two distinct areas: machine vision for safe-landing solutions, and classifiers for preventive maintenance. Black Swift developed its machine learning algorithms using AdaBoost, which combines many weak classifiers and boosts them into a single stronger classifier. With machine learning, SwiftSTL can try to predict whether the aircraft is going to fail: will the servos fail, the propulsion system, the batteries, or will the aircraft lose communications or GPS? These prediction algorithms, using the AdaBoost classifier, provide that information before the flight. The goal is to prevent the problem from occurring rather than to solve it afterward.
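The core AdaBoost idea, many weak classifiers voting with learned weights, fits in a few dozen lines. The sketch below is a minimal from-scratch illustration on synthetic pre-flight telemetry; the feature names, failure rule, and thresholds are all invented for the example and have nothing to do with BST's actual data or models.

```python
import numpy as np

def train_stump(X, y, w):
    """Best single-feature threshold classifier ("stump") under sample
    weights w, with labels y in {-1, +1}. A stump is the classic weak
    learner: barely better than chance on its own."""
    best = (0, 0.0, 1, 1.0)  # (feature, threshold, polarity, weighted error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, rounds=40):
    """Boost weak stumps into a strong ensemble (classic AdaBoost):
    each round upweights the examples the previous stumps got wrong."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = train_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this stump
        pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)          # reweight the examples
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)

rng = np.random.default_rng(0)
# Invented telemetry features: [servo_current_A, battery_sag_V, rssi_dB].
X = rng.normal([1.0, 0.2, -60.0], [0.2, 0.05, 5.0], size=(200, 3))
# Invented failure rule: high servo current AND deep battery sag.
y = np.where((X[:, 0] > 1.15) & (X[:, 1] > 0.21), 1, -1)

model = adaboost(X, y)
acc = (predict(model, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

No single stump can learn the two-threshold failure rule, but the weighted vote of many stumps can, which is the "weak classifiers boosted into a strong one" behavior described above.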


This enables the system to detect whether the electric motor or the propeller is going to fail, or whether communications with the drone will be lost. These AdaBoost algorithms use machine learning to predict when the electric motor or other components on the aircraft need to be replaced. This preventive maintenance helps ensure that the drone doesn't get into an emergency state as a result of pre-existing anomalies. If the drone does get into an emergency state, the autonomous landing and site-selection algorithms, which also run on the vehicle, are used as the last attempt to save it. Ultimately, AI, with the vision camera system, is used to actually land the vehicle safely.

"This technology is of value to all commercial drone operators, whether they are delivering packages or medical supplies," says Jack Elston, Ph.D., CEO of Black Swift Technologies. "Regardless, the drone needs to identify where it is at all times so that when it makes its delivery it avoids people, vehicles, or structures. The same is true if the UAS needs to make a safe emergency landing."

Elston notes that the same concepts would be applicable in planetary exploration, such as future missions to Mars. This machine vision technology would enable operators to know where the device landed, how it landed, and the status of the vehicle after it landed. Some might say this is "out of this world" technology. Elston refers to it as BST ingenuity.


About Black Swift Technologies

2840 Wilderness Place, Suite D, Boulder, Colorado 80301

Founded in 2011, Black Swift Technologies LLC develops custom, purpose-built unmanned aircraft systems (UAS) leveraging the company's advanced SwiftCore™ Flight Management System (FMS), consisting of the SwiftPilot™ autopilot system, the SwiftTab™ tablet-based user interface, the SwiftStation™ ground station, and application-specific sensor integrations. The SwiftCore™ FMS is designed to be modular, robust, and simple to operate, allowing users to focus on data products.

In addition to the SwiftCore™ FMS, Black Swift Technologies has unique capabilities to develop and deploy advanced small unmanned aircraft systems thanks to the team's combined expertise in the design, implementation, and analysis of advanced "smart" control systems; expertise in legal and safe flight operations in United States airspace; and practical experience from thousands of hours of UAS flight operations in demanding conditions. The SwiftCore™ FMS enables advanced control systems. These "smart" control systems provide industry-leading sensor-based control of the UAS that minimizes operator workload while improving the quality of the observed data by autonomously modifying the flight path based on sensor inputs.
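To illustrate what "autonomously modifying the flight path based on sensor inputs" can mean in the simplest case, here is a toy waypoint-update rule of our own invention (it is not the SwiftCore FMS API): keep heading while an onboard sensor reading is improving, and fall back toward the best reading seen so far when it fades.

```python
def next_waypoint(current_wp, best_wp, reading, best_reading, step=50.0):
    """Return the next (x, y) waypoint in meters for a toy
    gradient-chasing survey pattern."""
    x, y = current_wp
    if reading >= best_reading:
        return (x + step, y)              # signal improving: hold heading
    bx, by = best_wp
    return ((x + bx) / 2, (y + by) / 2)   # signal fading: close on best point

# Reading dropped from 0.8 to 0.3, so the aircraft turns back halfway
# toward the point where the strongest reading was observed.
print(next_waypoint((100.0, 0.0), (50.0, 0.0), reading=0.3, best_reading=0.8))
```

A production system would blend this with airspace constraints and the vehicle's dynamics; the sketch only shows the sensor-in-the-loop structure of such a controller.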

More information on Black Swift Technologies and their suite of UAS solutions can be found at:

Source: Press Release



The post Bringing the Sully-Factor to UAS Operations appeared first on Drone Magazine.
