Machine Learning on Small UAVs
Format: Conference Proceeding
Language: English
Summary: Commonly, machine learning (ML) workflows for training and inference occur in resource-rich environments. Draper Laboratory is pushing ML to the edge. This paper presents the concept of operations (CONOPS), design parameters, and constraints the team faced for edge implementation. The overarching requirement is to fully integrate the machine learning element into the small unmanned aerial vehicle (UAV), or drone. Given the limited payload capacity and power available on small UAVs, integrating computing resources sufficient to host both ML and autonomy functions is a challenge. Past efforts have relied on an Intel NUC as the primary processing unit. However, recent advances in GPUs provide greater computational power at low SWaP, compatibility with ML algorithms, and sufficient CPU resources to host the UAV's autonomy element. More recently developed processing units, designed specifically for ML applications at the edge, enable scaled-down variants of the algorithms for integration onto significantly smaller platforms. In this paper, we identify a common software architecting strategy that supports both a micro UAV (~150 grams) running on a traditional CPU and a small UAV (3 kg) configured with a GPU. Draper's automation strategy leverages the open-source Robot Operating System (ROS). The ML models were built using open-source Python PyTorch libraries. We provide flight test results for a vehicle detection algorithm. Future applications will include visual navigation and tracking.
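The abstract describes a PyTorch-based vehicle detector running on edge hardware. The paper itself does not include code, but one standard post-processing step such detectors rely on is greedy non-maximum suppression (NMS), which prunes overlapping candidate boxes before results are published (e.g., over a ROS topic). Below is a minimal, dependency-free sketch of that step; the box format, thresholds, and function names are illustrative assumptions, not details from the paper.

```python
# Illustrative post-processing for an edge object detector (not from the paper).
# Boxes use (x1, y1, x2, y2) corner format; thresholds are assumed values.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5, score_thresh=0.3):
    """Greedy NMS: keep high-scoring boxes, drop overlaps above iou_thresh.

    Returns the indices of the kept boxes, in descending score order.
    """
    # Consider only boxes above the score threshold, best first.
    order = sorted((i for i, s in enumerate(scores) if s >= score_thresh),
                   key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in keep):
            keep.append(i)
    return keep

# Example: two heavily overlapping detections and one separate one.
boxes = [(0, 0, 10, 10), (1, 1, 10, 10), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # → [0, 2]
```

On a SWaP-constrained platform, doing this pruning on-board keeps the downstream autonomy element from processing redundant detections; in practice a deployed detector would use a library implementation such as `torchvision.ops.nms`.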
ISSN: 2332-5615
DOI: 10.1109/AIPR50011.2020.9425090