Generating survey area coverage routes

Update, 29 November 2017:  The work described below has been connected up to the on-board autopilot and tested in simulation.  Today I am planning to go out and test with an actual survey aircraft in flight.  I can draw and save any number of areas together as a single project (and create and save any number of projects.)  Once in flight, I can call up a project, select an area, and send it up to the aircraft.  The aircraft itself generates an optimized route based on planned survey altitude, wind direction, camera field of view, and desired picture overlap.  The result (with zero wind) looks like the following picture.  Five areas have been sketched, one area has been submitted to the aircraft to survey, the aircraft has computed its route, and the route is trickled back to the ground station and drawn on the map.

Original Post

Today I am working on auto-generating routes that cover arbitrary (closed, non-self-intersecting) polygon areas.  The operator can draw a collection of polygons on the ground station, save them by project name, and then during flight call up the project/area they wish to survey and send just the area perimeter to the aircraft, which generates the coverage route automatically on the fly.

The main benefit is that the ground station doesn’t need to upload a 1000-waypoint route, only the area perimeter.  The aircraft already knows the camera field of view and mounting orientation, as well as the target altitude and the wind direction and speed.  The operator can include additional parameters such as endlap and sidelap percentages.
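
To make the geometry concrete, here is a minimal sketch of how transect spacing and photo interval could fall out of altitude, field of view, and overlap.  It assumes a nadir-pointing camera over flat terrain, and the function and parameter names are illustrative, not AuraUAS’s actual API:

    import math

    def transect_spacing(alt_m, fov_cross_deg, fov_along_deg, sidelap, endlap):
        """Derive transect spacing and photo interval from camera geometry.

        alt_m: planned altitude above ground (meters)
        fov_cross_deg / fov_along_deg: camera field of view across / along track
        sidelap, endlap: desired overlap fractions (e.g. 0.3 for 30%)
        """
        # Ground footprint of one image (nadir camera, flat terrain assumed).
        footprint_cross = 2.0 * alt_m * math.tan(math.radians(fov_cross_deg) / 2.0)
        footprint_along = 2.0 * alt_m * math.tan(math.radians(fov_along_deg) / 2.0)
        # Shift each transect (and photo) by the un-overlapped part of the footprint.
        spacing = footprint_cross * (1.0 - sidelap)
        photo_interval = footprint_along * (1.0 - endlap)
        return spacing, photo_interval

    # e.g. 100 m altitude, 60 x 45 degree FOV, 30% sidelap, 60% endlap
    print(transect_spacing(100.0, 60.0, 45.0, 0.30, 0.60))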

The end goal is a smart, streamlined, easy to use (fixed wing) survey and mapping system.

There are additional issues to consider, such as aircraft turn radius, turn strategies (dog-bone turns versus naive turns), and possibly interleaving transects (a bit like a Zamboni covers a hockey rink.)
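
As a toy illustration of the Zamboni-style interleaving idea (not the aircraft’s actual planner), the transects can simply be reordered so that consecutive legs sit several swath widths apart, which keeps every turn wide:

    def interleave(transects, skip=3):
        """Reorder transects Zamboni-style so each turn spans `skip`
        swath widths instead of one, easing the turn radius requirement."""
        order, visited = [], set()
        i = 0
        while len(order) < len(transects):
            if i >= len(transects) or i in visited:
                # wrap back to the earliest transect not yet flown
                i = min(set(range(len(transects))) - visited)
            order.append(i)
            visited.add(i)
            i += skip
        return [transects[k] for k in order]

    print(interleave(list(range(8)), skip=3))  # [0, 3, 6, 1, 4, 7, 2, 5]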

Celebrating the 4000th git commit!

Few people really know much about the AuraUAS autopilot system, but this post celebrates the 4000th git commit to the main code repository!

The entire AuraUAS system is hosted on github and can be browsed here:

https://github.com/AuraUAS

AuraUAS traces its roots back to a simple open-source autopilot developed by Jung Soon Jang to run on the XBOW MNAV/Stargate hardware back in the 2005-2006 time frame.  I worked at the University of Minnesota at that time, and we modified the code to run on an original 400 MHz Gumstix Linux computer which talked to the MNAV sensor head via a serial/UART connection.

From the mid-2000s through the 2010s I have been advancing this code in support of a variety of fixed-wing UAS projects.  Initially I called the system “MicroGear” or “ugear,” which was a nod of the head to my other long-term open-source project: FlightGear.  Along the way I aligned myself with a small Alaska-based aerospace company called “Airborne Technologies,” or ATI for short.  We branched a version of the code specifically for projects developed under NOAA funding as well as for various internal R&D.  However, throughout the development the code always stayed true to its open-source heritage.

In the summer of 2015 I took a full-time position in the UAS lab of the Aerospace Engineering Department at the University of Minnesota.  Here I have been involved in a variety of UAS-related research projects and have assumed the role of chief test pilot for the lab.  AuraUAS has been central to several projects at the UAS lab, including a spin-testing project, a PhD project to develop single-surface fault detection and a single-surface flight controller on a flying wing, and several aerial survey projects.  I continue to develop AuraUAS in support of ongoing research projects.

Design choices

What makes AuraUAS different?  What makes AuraUAS interesting?  Why is it important to me?

Big processor / little processor architecture

From the start, AuraUAS has been designed with the expectation of a small embedded (Arduino-style) processor handling all the sensor inputs as well as the actuator outputs.  A “big” processor (e.g. a Raspberry Pi, BeagleBone, Gumstix, Edison, etc.) is used for all the higher-level functionality such as the EKF (attitude determination), flight control, mission management, communication, and logging.  The advantage is that the system can be built from two smaller and simpler programs.  The “little” processor handles all the hard real-time tasks, which frees up the “big” processor to run a standard Linux distribution along with all its available libraries and fanciness.  So AuraUAS is built around two simpler programs, versus the one really complicated program architecture that most other autopilot systems use.
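
As a rough sketch of the little-to-big processor link (the frame layout, port, and baud rate here are invented for illustration, not AuraUAS’s actual protocol), the big processor simply de-frames sensor packets arriving over the UART:

    import serial  # pyserial

    START0, START1 = 0x93, 0x42   # hypothetical frame-start bytes

    def read_packet(ser):
        """Read one framed sensor packet from the 'little' processor.
        Illustrative frame: start bytes, id, length, payload, checksum."""
        while True:
            if ser.read(1) != bytes([START0]) or ser.read(1) != bytes([START1]):
                continue                       # hunt for the frame start
            header = ser.read(2)               # packet id + payload length
            if len(header) < 2:
                continue
            pkt_id, length = header[0], header[1]
            payload = ser.read(length)
            cksum = ser.read(1)
            if cksum and sum(payload) & 0xFF == cksum[0]:  # toy checksum
                return pkt_id, payload

    ser = serial.Serial("/dev/ttyS1", 115200, timeout=1.0)  # example port/baud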

Single thread architecture

Because of the big/little processor architecture, the big processor doesn’t need to do hard real-time tasks and thus can be written with a single-threaded architecture.  This leads to code that is far simpler and far easier to understand, manage, and maintain.  Anyone who has tried to understand and modify someone else’s threaded code might have a small inkling of why this could be important.  How many large applications suffer through a variety of obscure, rare, and nearly impossible-to-find bugs that maybe trace to the threading system, but no one knows for sure?

Python

The “big” processor in the AuraUAS system runs Linux, so we can easily incorporate Python in the mission-critical main loop of the primary flight computer.  This further simplifies coding tasks and shortens the edit/compile/test development loop, because there is often no need to compile and reflash code changes between test runs.  You can even do development work remotely, right on the flight computer.  For those who are skeptical of Python in the critical main loop: I have successfully flown this system for two full flight seasons, all of 2016 and all of 2017.  The main flight computer is hitting its 100 Hz performance target and the memory usage is stable.  Speaking personally, my Python code is almost always less buggy than my C code.
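
The shape of such a single-threaded 100 Hz main loop is easy to sketch.  The subsystem functions below are placeholders standing in for the real modules, not AuraUAS code:

    import time

    # Placeholder subsystems -- stand-ins for the real sensor, EKF, control,
    # mission, and logging modules (illustrative only).
    def poll_sensors(): return {}
    def ekf_update(sensors): return {}
    def control_update(state): return {}
    def mission_update(state): pass
    def send_actuators(cmds): pass
    def log_frame(sensors, state, cmds): pass

    DT = 0.01                                  # 100 Hz frame period

    def main_loop(frames):
        next_frame = time.time()
        for _ in range(frames):                # one pass per frame, single thread
            sensors = poll_sensors()           # read from the little processor
            state = ekf_update(sensors)        # attitude/position estimate
            cmds = control_update(state)       # flight control laws
            mission_update(state)              # mission manager tasks
            send_actuators(cmds)               # commands back down
            log_frame(sensors, state, cmds)    # on-board logging
            next_frame += DT                   # sleep off the rest of the 10 ms frame
            time.sleep(max(0.0, next_frame - time.time()))

    main_loop(frames=100)                      # run one second's worth of frames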

In my experience, porting C/C++ code to Python yields a 50, 60, even 70% reduction in the number of lines of code.  I believe that fewer lines of code means more readable code, on average.  More readable code means fewer bugs, and the bugs that remain are often found more quickly.

Python/C++ hybrid

AuraUAS is a hybrid C++/Python application.  Some modules are written in C++ and some modules are written in Python.  My choice of language for a particular module tends to center on performance versus logic.  Our 15-state EKF (also developed in-house at the U of MN) remains C++ code for performance reasons.  The mission manager and individual tasks are all written in Python.  Python scripts greatly accelerate coding of higher-level logic tasks; these are typically not performance critical, so it’s a great fit.

What makes this hybrid-language system possible is a central property tree structure that is shared between the C++ and Python modules within the same application.  Imagine something like an object in JavaScript or a dict() structure in Python (or imagine a JSON structure, or even an XML structure.)  We build an in-memory tree structure that contains all the important shared data within the application, and then modules can read and write the structure as they wish in a collaborative way.  This property tree fills much the same place as “uORB” in the PX4 software stack: it glues all the various modules together and provides a structured way for them to communicate.
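
A toy version of the idea (the real AuraUAS property tree API differs in its details) is just a shared in-memory hierarchy that any module can navigate by path:

    class PropertyNode:
        """A toy property tree: one shared hierarchy of named values that
        every module (C++ or Python in the real system) reads and writes."""
        def __init__(self):
            self.children = {}
            self.value = None

        def get_node(self, path, create=False):
            node = self
            for name in path.strip("/").split("/"):
                if name not in node.children:
                    if not create:
                        return None
                    node.children[name] = PropertyNode()
                node = node.children[name]
            return node

    root = PropertyNode()
    root.get_node("/sensors/imu/pitch_deg", create=True).value = 1.5
    # elsewhere, another module reads the same shared value:
    print(root.get_node("/sensors/imu/pitch_deg").value)   # 1.5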

Simplicity and robustness

When you hand over control of an airplane to an autopilot (even a small model airplane), you are putting an immense amount of trust in that hardware, firmware, and software.  Software bugs can crash your aircraft.  It’s important for an autopilot to be immensely robust, for the code base to be stable and change slowly, and for new changes to be extensively tested.  The more complicated a system becomes, the harder it is to ensure robust, bug-free operation.

Throughout the development of the AuraUAS project, the emphasis has been on keeping the code and structure simple and robust.  The overall goal is to do a few simple core things very, very well.  There are other autopilot systems that have every feature anyone has ever suggested or wanted to work on; they support every sensor, run on every possible embedded computer board, and can fly every possible rotor and fixed-wing airframe configuration.  I think it’s great that PX4 and ArduPilot cover this ground and provide a big tent that welcomes everyone.  But I think they do pay a price in terms of code complexity, which in turn has implications for introducing, collecting, and hiding bugs.

The first commit in the AuraUAS repository traces back to about 2006, so here’s to nearly 12 years of successful development!


Aerial Survey Flight (with Augmented Reality)

Basic Details

  • Date: October 11, 2017
  • Location: South Central Ag Lab (near Clay Center, NE)
  • Aircraft: Skywalker 1900 with AuraUAS autopilot system.
  • Wing camera with augmented reality elements added (flight track, astronomy, horizon.)
  • Wind: (from) 160 @ 16 kts (18.5 mph) and very turbulent.
  • Temperature: 65 F (18 C)
  • Target Cruise: 25 kts (~29 mph)

The Full Video

Notes

Here are a couple of comments about the flight.

The conditions were very windy and turbulent, but it was a long drive to the location so we decided the risk of airframe damage was acceptable if we could get good data.

The wing view was chosen so I could observe one aileron control surface in flight.  You might notice that the aileron ‘trim’ location puts the right aileron up significantly from the center point.  A 1.3-pound camera is hanging off the right wing, and its weight has twisted the wing a bit, putting the aircraft significantly out of roll trim.  The autopilot automatically compensates for the slightly warped wing by finding the proper aileron position to maintain level flight.

Throughout the flight you can see the significant crab angle, short turns upwind, and really wide/long turns downwind.

Because of the winds, the field layout, obstacles, etc., I was forced to spot the airplane landing in a very, very tight area.  I mostly managed to do that, and the result was a safe landing with no damage.

Despite the high winds and turbulence, the aircraft and autopilot handled themselves remarkably well.  The HUD overlay uses simulated RC pilot sticks to show the autopilot control commands.

The augmented-reality graphics are added after the flight in post-processing, using a combination of Python and OpenCV.  The code is open-source and has some support for PX4 data logs if anyone is interested in augmenting their own flight videos.  I find it a very valuable tool for reviewing the performance of the EKF, the autopilot gains, and the aircraft itself.  Even the smallest EKF inaccuracies or tuning inefficiencies show up clearly in the video.
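
As a flavor of what the post-processing looks like (a bare-bones sketch, not the actual open-source tool; the pixel scaling and sign conventions here are invented), OpenCV makes it straightforward to draw an attitude-driven overlay on each frame:

    import cv2
    import numpy as np

    def draw_horizon(frame, roll_deg, pitch_deg, px_per_deg=12.0):
        """Overlay a simple artificial-horizon line driven by EKF roll/pitch.
        A real overlay would also use the camera calibration and heading."""
        h, w = frame.shape[:2]
        cx, cy = w // 2, h // 2
        offset = pitch_deg * px_per_deg          # pitch up -> horizon moves down
        roll = np.radians(roll_deg)
        dx, dy = np.cos(roll), np.sin(roll)
        p1 = (int(cx - w * dx), int(cy + offset - w * dy))
        p2 = (int(cx + w * dx), int(cy + offset + w * dy))
        cv2.line(frame, p1, p2, (0, 255, 0), 2)  # green horizon line
        return frame

    cap = cv2.VideoCapture("flight.mov")         # example input file name
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("overlay.png", draw_horizon(frame, roll_deg=5.0, pitch_deg=2.0))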

I find it fascinating to just watch the video and see how the autopilot is continually working to keep the aircraft on condition.  If you would like to see how the Skywalker + AuraUAS autopilot performs in smoother air, take a look at Flight #71 at the end of this post: http://gallinazo.flightgear.org/uas/drosophila-nator-prototype/