Generating survey area coverage routes

Update, 29 November 2017:  The work described below has been connected up to the on-board autopilot and tested in simulation.  Today I am planning to go out and test with an actual survey aircraft in flight.  I can draw and save any number of areas together as a single project (and create and save any number of projects.)  Once in flight, I can call up a project, select an area, and send it up to the aircraft.  The aircraft itself will generate an optimized route based on planned survey altitude, wind direction, camera field of view, and desired picture overlap.  The result (with zero wind) looks like the following picture: five areas have been sketched, one area has been submitted to the aircraft to survey, the aircraft has computed its route, and the route is trickled back to the ground station and drawn on the map.

Original Post

Today I am working on auto-generating routes that cover arbitrary (closed, non-self-intersecting) polygon areas.  The operator is able to draw a collection of polygons on the ground station, save them by project name, and then during flight call up the project/area they wish to survey and send just the area perimeter to the aircraft, which will generate the coverage route automatically on the fly.

The main benefit is that the ground station doesn’t need to upload a 1000 waypoint route, only the area perimeter.  The aircraft will already know the camera field of view and mounting orientation.  It will know target altitude, wind direction and speed.  The operator can include additional parameters like endlap and sidelap percentages.
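
The core geometry is simple enough to sketch.  The snippet below is only an illustration of the idea, not the on-board code; the function names, the example field of view, overlap percentages, and ground speed are all made up for the example.

```python
import math

def transect_spacing(agl_m, hfov_deg, sidelap_pct):
    """Spacing between adjacent survey lines for a given altitude,
    camera horizontal field of view, and desired sidelap."""
    # Ground footprint width of a single image at the planned altitude.
    footprint = 2.0 * agl_m * math.tan(math.radians(hfov_deg) / 2.0)
    # Each new line shifts over by the un-overlapped fraction of the footprint.
    return footprint * (1.0 - sidelap_pct / 100.0)

def trigger_interval(agl_m, vfov_deg, endlap_pct, groundspeed_mps):
    """Seconds between photos along a transect for the desired endlap."""
    footprint = 2.0 * agl_m * math.tan(math.radians(vfov_deg) / 2.0)
    advance = footprint * (1.0 - endlap_pct / 100.0)
    return advance / groundspeed_mps

if __name__ == "__main__":
    # Example numbers: 75 m AGL, 60/47 deg FOV, 40% sidelap, 70% endlap, 12 m/s.
    print(transect_spacing(75.0, 60.0, 40.0))        # ~52 m between lines
    print(trigger_interval(75.0, 47.0, 70.0, 12.0))  # ~1.6 s between photos
```

Given a spacing, the rest of the problem is laying parallel transects across the polygon and ordering them sensibly, which is where wind direction and turn radius come into play.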

The end goal is a smart, streamlined, easy to use (fixed wing) survey and mapping system.

There are additional issues to consider, such as aircraft turn radius, turn strategies (dog-bone turns versus naive turns), and possibly interleaving transects (a bit like a Zamboni covers a hockey rink.)

Celebrating the 4000th git commit!

Few people really know much about the AuraUAS autopilot system, but this post celebrates the 4000th git commit to the main code repository!

The entire AuraUAS system is hosted on github and can be browsed here:

https://github.com/AuraUAS

AuraUAS traces its roots back to a simple open-source autopilot developed by Jung Soon Jang to run on the XBOW MNAV/Stargate hardware back in the 2005-2006 time frame.  I worked at the University of Minnesota at that time and we modified the code to run on an original 400 MHz Gumstix Linux computer which talked to the MNAV sensor head via a serial/uart connection.

From the mid-2000s through the 2010s I have been advancing this code in support of a variety of fixed-wing UAS projects.  Initially I called the system “MicroGear” or “ugear”, which was a nod of the head to my other long-term open-source project: FlightGear.  Along the way I aligned myself with a small Alaska-based aerospace company called “Airborne Technologies”, or ATI for short.  We branched a version of the code specifically for projects developed under NOAA funding as well as for various internal R&D.  However, throughout the development the code always stayed true to its open-source heritage.

In the summer of 2015 I took a full-time position in the UAS lab at the Aerospace Engineering Department of the University of Minnesota.  Here I have been involved in a variety of UAS-related research projects and have assumed the role of chief test pilot for the lab.  AuraUAS has been central to several projects at the UAS lab, including a spin-testing project, a PhD project to develop single-surface fault detection and a single-surface flight controller on a flying wing, and several aerial survey projects.  I continue to develop AuraUAS in support of ongoing research projects.

Design choices

What makes AuraUAS different?  What makes AuraUAS interesting?  Why is it important to me?

Big processor / little processor architecture

From the start AuraUAS has been designed with the expectation of a small embedded (Arduino-style) processor to handle all the sensor inputs as well as the actuator outputs.  A “big processor” (e.g. a Raspberry Pi, BeagleBone, Gumstix, Edison, etc.) is used for all the higher-level functionality such as the EKF (attitude determination), flight control, mission management, communication, and logging.  The advantage is that the system can be built from two smaller and simpler programs.  The “little” processor handles all the hard real-time tasks.  This frees up the “big” processor to run a standard Linux distribution along with all its available libraries and fanciness.  So AuraUAS is built around two simpler programs, versus the single really complicated program architecture that most other autopilot systems use.

Single thread architecture

Because of the big/little processor architecture, the big processor doesn’t need to do hard real-time tasks, and thus can be written using a single-threaded architecture.  This leads to code that is far simpler and far easier to understand, manage, and maintain.  Anyone who has tried to understand and modify someone else’s threaded code might have a small inkling of why this could be important.  How many large applications suffer through a variety of obscure, rare, and nearly impossible to find bugs that maybe trace to the threading system, but no one knows for sure?
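
To make the idea concrete, the whole big-processor side boils down to one loop that calls each subsystem in turn at a fixed rate.  This is not the actual AuraUAS main loop, just a toy sketch of the structure; the module names and the stub class are invented for the example.

```python
import time

class Stub:
    """Stand-in for a real module (sensor I/O, EKF, control, mission, ...)."""
    def __init__(self, name):
        self.name = name
    def update(self, dt):
        pass   # a real module does its work here, quickly and without blocking

def main_loop(modules, hz=100, run_seconds=1.0):
    dt = 1.0 / hz
    next_frame = time.time()
    stop = time.time() + run_seconds
    while time.time() < stop:
        for m in modules:
            m.update(dt)              # one short, non-blocking call per subsystem
        next_frame += dt
        delay = next_frame - time.time()
        if delay > 0:
            time.sleep(delay)         # finished early; wait for the next frame

if __name__ == "__main__":
    # "sensors" would drain packets from the little processor; everything else
    # just works from the data it leaves behind for the other modules.
    main_loop([Stub(n) for n in ("sensors", "ekf", "control", "mission", "comms", "logger")])
```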

Python

The “big” processor in the AuraUAS system runs Linux, so we can easily incorporate Python in the mission-critical main loop of the primary flight computer.  This has the advantage of further simplifying coding tasks and shortening the edit/compile/test development loop, because there is often no need to compile and reflash code changes between test runs.  You can even do development work remotely, right on the flight computer.  For those who are skeptical of Python in the critical main loop: I have successfully flown this system for 2 full flight seasons … all of 2016 and all of 2017.  The main flight computer is hitting its 100 Hz performance target and the memory usage is stable.  Speaking for myself personally, my Python code is almost always less buggy than my C code.

In my experience, when porting C/C++ code to Python, the result is a 50, 60, even 70% reduction in the number of lines of code.  I believe that fewer lines of code == more readable code, on average.  More readable code means fewer bugs, and the bugs that do exist are often found more quickly.

Python/C++ hybrid

AuraUAS is a hybrid C++/Python application.  Some modules are written in C++ and some modules are written in Python.  My choice of language for a particular module tends to center around performance versus logic.  Our 15-state EKF (also developed in-house at the U of MN) remains C++ code for performance reasons.  The mission manager and individual tasks are all written in Python.  Python scripts greatly accelerate coding of higher-level logic tasks; these typically are not performance critical, so it’s a great fit.

What makes this hybrid language system possible is a central property tree structure that is shared between the C++ and Python modules within the same application.  Imagine something like an object in JavaScript or a dict() in Python (or imagine a JSON structure, or even an XML structure.)  We build an in-memory tree structure that contains all the important shared data within the application, and then modules can read/write to the structure as they wish in a collaborative way.  This property tree fills much of the same role as “uORB” in the PX4 software stack.  It glues all the various modules together and provides a structured way for them to communicate.
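
Here is a toy illustration of the concept.  The real property tree API is richer than this (and bridges C++ and Python), so treat the class and method names below purely as an example of the pattern, not as the actual interface:

```python
class PropertyNode:
    """Toy version of a shared property tree: a hierarchy of named nodes
    that any module can read or write without knowing about the others."""
    def __init__(self):
        self.children = {}
        self.values = {}

    def get_node(self, path, create=False):
        node = self
        for name in path.strip("/").split("/"):
            if name not in node.children:
                if not create:
                    return None
                node.children[name] = PropertyNode()
            node = node.children[name]
        return node

    def set_float(self, name, val):
        self.values[name] = float(val)

    def get_float(self, name):
        return float(self.values.get(name, 0.0))

# One module writes the attitude solution ...
root = PropertyNode()
att = root.get_node("/filters/filter/attitude", create=True)
att.set_float("roll_deg", 2.3)

# ... and another module reads it later in the same frame.
print(root.get_node("/filters/filter/attitude").get_float("roll_deg"))
```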

Simplicity and robustness

When you hand over control of an airplane to an autopilot (even a small model airplane) you are putting an immense amount of trust in that hardware, firmware, and software.  Software bugs can crash your aircraft.  It’s important for an autopilot to be immensely robust, for the code base to be stable and change slowly, and for new changes to be extensively tested.  The more complicated a system becomes, the harder it is to ensure robust, bug-free operation.

Throughout the development of the AuraUAS project, the emphasis has been on keeping the code and structure simple and robust.  The overall goal is to do a few simple core things very, very well.  There are other autopilot systems that have every feature that anyone has ever suggested or wanted to work on; they support every sensor, run on every possible embedded computer board, and can fly every possible rotor and fixed-wing airframe configuration.  I think it’s great that PX4 and ArduPilot cover this ground and provide a big tent that welcomes everyone.  But I think they do pay a price in terms of code complexity, which in turn has implications for introducing, collecting, and hiding bugs.

The first commit in the AuraUAS repository traces back to about 2006.  So here’s to nearly 12 years of successful development!

 

Aerial Survey Flight (with Augmented Reality)

Basic Details

  • Date: October 11, 2017
  • Location: South Central Ag Lab (near Clay Center, NE)
  • Aircraft: Skywalker 1900 with AuraUAS autopilot system.
  • Wing camera with augmented reality elements added (flight track, astronomy, horizon.)
  • Wind: (from) 160 @ 16 kts (18.5 mph) and very turbulent.
  • Temperature: 65 F (18 C)
  • Target Cruise: 25 kts (~29 mph)

The Full Video

Notes

Here are a couple of comments about the flight.

The conditions were very windy and turbulent, but it was a long drive to the location so we decided the risk of airframe damage was acceptable if we could get good data.

The wing view was chosen so I could observe one aileron control surface in flight.  You might notice that the aileron ‘trim’ location puts the right aileron up significantly from the center point.  A 1.3-pound camera is hanging off the right wing; its weight has twisted the wing a bit and put the aircraft significantly out of roll trim.  The autopilot automatically compensates for the slightly warped wing by finding the proper aileron position to maintain level flight.
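
For anyone curious how that compensation happens: it falls out of integral action in the roll loop.  The snippet below is not the AuraUAS controller, just a generic PI sketch with made-up gains, showing how the integrator naturally settles on whatever steady aileron offset the airframe needs.

```python
class RollPI:
    """Minimal PI roll loop: the integral term slowly converges to the
    steady aileron offset needed to hold the commanded bank angle."""
    def __init__(self, kp=0.02, ki=0.005):
        self.kp = kp
        self.ki = ki
        self.integrator = 0.0

    def update(self, roll_cmd_deg, roll_deg, dt):
        error = roll_cmd_deg - roll_deg
        self.integrator += self.ki * error * dt
        # clamp the integrator so a bad sensor can't drive the surface to its stop
        self.integrator = max(-0.25, min(0.25, self.integrator))
        return self.kp * error + self.integrator   # normalized aileron command
```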

Throughout the flight you can see the significant crab angle, short turns upwind, and really wide/long turns downwind.

Because of the winds, the field layout, obstacles, etc. I was forced to spot the airplane landing in a very very tight area.  I mostly managed to do that and the result was a safe landing with no damage.

Despite the high winds and turbulence, the aircraft and autopilot handled itself remarkably well.  The HUD overlay uses simulated RC pilot sticks to show the autopilot control commands.

The augmented reality graphics are added after the flight in post-processing using a combination of Python and OpenCV.  The code is open-source and has some support for PX4 data logs if anyone is interested in augmenting their own flight videos.  I find it a very valuable tool for reviewing the performance of the EKF, the autopilot gains, and the aircraft itself.  Even the smallest EKF inaccuracies or tuning inefficiencies can show up clearly in the video.
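
The heart of that overlay is just a camera projection: take the logged flight-track points, project them through the camera model for each frame, and draw.  The fragment below is only a sketch of that one step (the intrinsics, the track points, and the zero camera pose are all placeholders), not the actual post-processing tool.

```python
import numpy as np
import cv2

# Placeholder camera intrinsics (focal length and principal point, in pixels).
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)   # assume no lens distortion for the sketch

def draw_track(frame, track_pts, rvec, tvec):
    """Project world-frame flight track points into the image using the
    camera pose (rvec/tvec = world-to-camera rotation and translation)
    and connect them with line segments.  Ignores points behind the camera."""
    pts = np.asarray(track_pts, dtype=np.float64).reshape(-1, 1, 3)
    img_pts, _ = cv2.projectPoints(pts, rvec, tvec, K, dist)
    img_pts = img_pts.reshape(-1, 2).astype(int)
    for p0, p1 in zip(img_pts[:-1], img_pts[1:]):
        cv2.line(frame, (int(p0[0]), int(p0[1])), (int(p1[0]), int(p1[1])),
                 (0, 255, 0), 2)
    return frame

if __name__ == "__main__":
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
    track = [[i * 2.0, 5.0, 20.0 + i] for i in range(10)]  # made-up track points
    draw_track(frame, track, np.zeros(3), np.zeros(3))
```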

I find it fascinating to just watch the video and see how the autopilot is continually working to keep the aircraft on condition.  If you would like to see how the Skywalker + AuraUAS autopilot perform in smoother air, take a look at Flight #71 at the end of this post: http://gallinazo.flightgear.org/uas/drosophila-nator-prototype/

Spin Testing

Wikipedia, on spins: In aviation’s early days, spins were poorly understood and often fatal. Proper recovery procedures were unknown, and a pilot’s instinct to pull back on the stick served only to make a spin worse. Because of this, the spin earned a reputation as an unpredictable danger that might snatch an aviator’s life at any time, and against which there was no defense.

Even in today’s modern world, spins are disorienting and still can be fatal.  This project aims to study spins with a highly instrumented aircraft in order to better understand them, model them, and ultimately create cockpit instrumentation to help a pilot safely recover from a spin.

The test aircraft is an Ultrastick 120 operated by the University of Minnesota UAS Research Labs.  It is outfitted with two NASA-designed air data booms, one at each wing tip, along with a traditional IMU, GPS, pitot probe, and control surface position sensors.  The pilot is safely on the ground throughout the entire test flight (as well as before and after.)

These are in-flight videos of two test flights.  Flight #14 is the ‘nominal’ configuration.  In flight #15 the CG is moved to the aft limit and the plane is repeatedly put into an aggressive spin.

In both videos, the onboard attitude estimate and other sensor data are drawn as a conformal overlay.  The pilot stick inputs are shown in the lower left and right corners of the display.  This aircraft is equipped with alpha/beta probes, so the data from those sensors is used to draw a ‘flight path marker’ that shows angle of attack and sideslip.  Airspeed units are m/s and altitude units are meters.  The last 120 seconds of the flight track are also drawn into the video to help with visualizing the position of the aircraft in the air.
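
Drawing that marker amounts to offsetting it from the aircraft reference symbol by the measured angles, scaled by the display’s pixels-per-degree.  A tiny sketch of the idea (sign conventions and scaling are illustrative only; the real overlay handles the camera geometry properly):

```python
def flight_path_marker(center_px, alpha_deg, beta_deg, px_per_deg):
    """Place the flight path marker relative to the boresight symbol:
    angle of attack pushes it down the screen, sideslip pushes it sideways."""
    cx, cy = center_px
    x = cx + beta_deg * px_per_deg    # sideslip: lateral offset
    y = cy + alpha_deg * px_per_deg   # angle of attack: vertical offset (image y grows downward)
    return int(round(x)), int(round(y))

# Example: 8 deg alpha, -2 deg beta, 12 px/deg on a 1920x1080 frame.
print(flight_path_marker((960, 540), 8.0, -2.0, 12.0))
```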

These flight tests are conducted by the University of Minnesota UAS Research Labs.

 

Sunset Flight

This is Skywalker Flight #74, flown on Sept. 7, 2017.  It ended up being a 38.5-minute flight, scheduled to land right at sunset.  The purpose of the flight was to carry insect traps at 300′ AGL and collect samples of what might be flying up at that altitude.

What I like about this flight is that the stable sunset air leads to very consistent autopilot performance.  The circle hold is near perfect.  The altitude hold is +/- 2 meters (usually much better), despite the continually varying bank angles required to hold a perfect circle shape in 10 kt winds.

The landing at the end is 100% autonomous and I trusted it all the way down, even as it dropped in between a tree line and a row of turkey barns.  The whole flight is presented here for completeness, but feel free to skip to part 3 if you are interested in seeing the autonomous landing.

As an added bonus, stick around after I pick up the aircraft and walk it back.  I pan the aircraft around the sky and you can clearly see the perfect circle hold as well as the landing approach.  I use augmented reality techniques to overlay the flight track history right into the video; I think it’s kind of a “cool tool” for analyzing your autopilot and EKF performance.

Field Comparison of MPU6000 vs VN100T

The U of MN UAV Lab has flown a variety of sensors in its aircraft, ranging from the lowly MPU-6000 (such as is found on an Atmel-based APM2 board) all the way up to an expensive temperature-calibrated VectorNav VN-100T.  I wish to present a quick field comparison of these two sensors.

Hobby King Skywalker with MPU-6000.
Sentera Vireo with VectorNav VN-100T onboard.

[Disclaimers: there are many dimensions to any comparison, there are many individual use cases, the VN-100 has many features not found on a cheap MPU-6000, and the conditions of this test are not perfectly comparable: two different aircraft flown on two different days.  These tests are performed with a specific MPU-6000 and a specific VN-100T, so I can’t say these results predict the performance of any other specific IMU.  Both sensors are being sampled at 100 Hz externally.  Internally the MPU-6000 is being sampled at 500 Hz and filtered.  I suspect the VN-100T is outputting unfiltered values, but that is just a guess from the plot results.]

The point of this post is not to pick on the expensive solution; it is difficult to arrange a perfectly level playing field for a comparison.  But hopefully I can show that in many ways the less expensive solution may not be as bad as you thought, especially with a little calibration help.

Exhibit A: Raw gyro data in flight

I will mostly let the plots speak for themselves; they share the same vertical scale and cover about the same time span.  The less expensive sensor is clearly less noisy.  This trend holds up when the two sensors are motionless on the ground.

MPU-6000 gyros (100 seconds of flight)
VN-100T gyros (100 seconds of flight)

Exhibit B: Raw accelerometer data in flight

Again, the plots speak for themselves.  Given the same vertical and horizontal scales, the less expensive sensor is by far the less noisy of the two.

MPU-6000 accelerometers (100 seconds of flight)
VN-100T accelerometers (100 seconds of flight)
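
If you want to put a number on “less noisy” instead of eyeballing the plots, a quick script over the raw logs does it.  This is only a sketch; the file names, column layout, and the moving-average detrend are all assumptions for the example.

```python
import numpy as np

# Assumed format: CSV with a header row and columns [time, p, q, r] at 100 Hz.
for name in ("mpu6000_gyros.csv", "vn100t_gyros.csv"):
    data = np.loadtxt(name, delimiter=",", skiprows=1)
    kernel = np.ones(25) / 25.0   # ~0.25 s moving average
    # Remove the slow (real) aircraft motion, leaving mostly sensor noise.
    noise = [chan - np.convolve(chan, kernel, mode="same")
             for chan in data[:, 1:4].T]
    print(name, ["%.4f" % np.std(n) for n in noise])
```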

Exhibit C: Sensor bias estimates

The UMN UAV lab flies its aircraft with our own high-fidelity EKF library.  The EKF code is the fancy mathematical code that estimates the aircraft state (roll, pitch, yaw, position, velocity, etc.) given raw IMU and GPS inputs.

One of the arguments against a cheap sensor versus an expensive temperature-calibrated sensor is that you are paying for high-quality temperature calibration in the expensive model.  This is a big deal with MEMS IMU sensors, because temperature can have a big effect on the bias and accuracy of the sensor.

This next pair of plots requires a small bit of explanation (please read this first!)

  • For every time step, the UMN EKF library estimates the bias (or error) of each individual gyro and accelerometer (along with estimating aircraft attitude, position, and velocity.)
  • We can plot this bias estimate over time and compare them.
  • The bias estimates are just estimates and other errors in the system (like gps noise) can make the bias estimates jump around.
  • I have developed a temperature calibration process for the inexpensive IMUs.  This process is being used to pre-correct the MPU-6000 sensor values in flight.  It uses past flight data to develop a temperature calibration fit; the more you fly, and the bigger the range of temperatures you fly in, the better the calibration becomes.  (A minimal sketch of the fitting idea follows this list.)
  • Just to be clear: for these final plots, the MPU-6000 is using our external temp calibration process, derived entirely from past flight data.  The VN-100T is running its own internal temp calibration.
  • These bias estimates are not perfect, but they give a suggestion of how well the IMU is calibrated.  Higher bias values suggest a larger calibration error.

MPU-6000 sensor bias estimates from UMN EKF library.

VN-100T sensor bias estimates from UMN EKF library.

What can we learn from these plots?

  • The MPU-6000 gyro biases (estimated) are approximately 0.05 deg/sec.
  • The VN-100T gyro biases (estimated) are as high as -1.0 deg/sec in roll and 0.35 deg/sec in yaw.
  • The MPU-6000 accel biases (estimated) are in the 0.1 m/s^2 range.
  • The VN-100T accel biases (estimated) are also in the 0.1 m/s^2 range.

In some cases the MPU-6000 with external temperature calibration appears to be more accurate than the VN-100T, and in some cases the VN-100T does better.

Summary

By leveraging a high-quality EKF library and a bit of clever temperature calibration work, an inexpensive MPU-6000 seems to be able to hold its own quite well against an expensive temperature-calibrated MEMS IMU.

Drosophila-nator (Prototype)

This is a joint Entomology / Aerospace project to look for evidence that Spotted Wing Drosophila (an invasive species to North America) may be migrating at higher altitudes where wind currents can carry them further and faster than otherwise expected.

Skywalker 1900 outfitted with 2 petri-dish sized insect traps.

Skywalker Flight #69

Altitude: 200′ AGL
Airspeed: 20 kts
Weather:  10 kts wind, 22C
Mission: Circle fruit fields with insect traps.

Skywalker Flight #70

Altitude: 300′ AGL
Airspeed: 20 kts
Weather:  12 kts wind, 20C
Mission: Circle fruit fields with insect traps.

Skywalker Flight #71

Altitude: 400′ AGL
Airspeed: 20 kts
Weather:  13-14 kts wind, 20C
Mission: Circle fruit fields with insect traps.

Flying on the Edge of a Storm

This is a follow up to my eclipse post.  I was forced to end my eclipse flight 10 minutes before the peak because a line of rain was just starting to roll over the top of me.  I waited about 20-30 minutes for the rain to clear and launched a post-eclipse flight that lasted just over an hour of flight time.

Here are some interesting things in this set of flight videos:

  • You will see the same augmented reality heads up display and flight track rendering.  This shows every little blemish in the sensors, EKF, flight control system, and airplane!  It’s a great testing and debugging tool if you really hope to polish your aircraft’s tuning and flight performance.
  • IT IS WINDY!!!!  The Skywalker cruises at about 20 kts indicated airspeed.  Winds aloft were pushing 16 … 17 … 18 kts sustained.  At one point in the flight I record 19.5 kt winds.
  • At t=2517 (there is a timer in seconds in the lower left corner of the HUD) we actually get pushed backwards for a few seconds.  How does your autopilot navigation work when you are getting pushed backwards by the wind?!?  You can find this about 20 seconds into Part #3.  Check it out. 🙂
  • In the 2nd half of the flight the winds transition from 16-17 kts and fairly smooth, to 18-19 kts and violent.  The poor little Skywalker is getting severely thrashed around the sky in places.  Still, the airframe and the autopilot seem to handle it pretty well.  I was a bit white-knuckled watching the flight unfold from the ground, but the on-board HUD shows the autopilot was pretty relaxed and handled the conditions without really breaking much of a sweat.
  • When the winds really crank up, you will see the augmented flight track pass by sideways just after we pass the point in the circle where we are flying directly into the wind  … literally the airplane is flying sideways relative to the ground when you see this.
  • Does this bumpy turbulent video give you a headache?  Stay tuned for an upcoming video in super smooth air, with a Butterworth filter on my airspeed sensor.  (A minimal sketch of such a filter follows this list.)
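
For the curious, a low-pass Butterworth filter on airspeed is a couple of lines with SciPy.  The order, cutoff, and fake data below are placeholders, and the zero-phase filtfilt call is the offline/post-processing flavor; a causal filter (scipy.signal.lfilter or a small hand-rolled IIR) is what you would run on board.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0       # airspeed sample rate (Hz), matching the 100 Hz main loop
cutoff = 2.0     # Hz; low enough to knock down turbulence-induced jitter

b, a = butter(2, cutoff / (fs / 2.0))   # 2nd-order low-pass

# Pretend airspeed trace: 20 kt cruise with noisy gusts on top.
t = np.arange(0.0, 10.0, 1.0 / fs)
raw = 20.0 + 1.5 * np.random.randn(t.size)

smooth = filtfilt(b, a, raw)            # zero-phase filtering (offline use)
print(raw.std(), smooth.std())
```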

Note: the HobbyKing Skywalker (1900mm) aircraft flown in this video has logged 71 flights, 31.44 hours in the air (1886 minutes), and covered 614 nautical miles (1137 km) across the ground.

Flying on the Edge of an Eclipse (2017)

On August 21, 2017 a full solar eclipse sliced a shadowy swath across the entire continental USA.  The totality area missed Minnesota by a few hundred miles so we only saw about 85% obscuration at our peak.

I thought it could be interesting to put a UAV in the sky during our partial eclipse and record the flight.  I didn’t expect too much, but you never know.  In the end we had a line of rain move through a few minutes before the peak, and it was really hard to say whether the temperature drop and reduced light were due to the wave of rain or to the eclipse.

Still, I needed to test some changes in the AuraUAS flight controller and was curious to see how the TECS system would fly with a completely unfiltered/raw/noisy airspeed input.  Why not roll all that together and go test fly!

Here is the full video of the 37 minute flight.  Even though this is slightly boring flight test video, you might find a few interesting things if you skip around.

  • I talk at the start and the end of the flight.  I can’t remember what I said, but I’m sure it is important and insightful.
  • I rendered the whole live flight track in the video using augmented reality techniques.  I think that’s pretty cool.
  • If you skip to the end, I pick up the plane and walk back to the runway.  I think that is the funnest part.  There I pan the airplane around the sky and show my flight path and approach drawn right into the real video using the same augmented reality techniques.
  • We had 100% cloud cover and zero view of the sun/moon.  But that doesn’t stop me from drawing the sun and moon in the view where it actually is.  Not surprisingly, they are sitting almost exactly on top of each other.  You can see this at the end of Part 3.
  • I flew a fully autonomous landing on this flight.  It worked out pretty well and shows up nicely at the end of Part 3.  If anyone is interested, the auto-land task is written as an embedded Python script and runs right on-board in the main flight controller.  I think that might be pretty cool for people who are Python fans.  If you want to geek out on the details you can see the whole landing script here: https://github.com/AuraUAS/aura-core/blob/master/src/mission/task/land2.py  (Then go watch it in action at the end of Part #3.)  A rough sketch of the general shape of such a task follows this list.
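
To give a flavor of what an on-board mission task looks like without pasting the real land2.py, here is an invented example in the same spirit: a small Python class the mission manager activates and then calls every frame.  The class name, methods, and logic are all illustrative, not the actual AuraUAS task API.

```python
class CircleDescentTask:
    """Toy mission task: spiral down over a point by stepping the altitude
    target toward the ground a little each frame.  The core altitude and
    speed loops do the actual flying."""
    def __init__(self, target_agl_m=0.0, descent_rate_mps=1.5):
        self.target_agl_m = target_agl_m
        self.descent_rate_mps = descent_rate_mps
        self.agl_cmd = None

    def activate(self, current_agl_m):
        self.agl_cmd = current_agl_m      # start the descent from wherever we are

    def update(self, dt):
        if self.agl_cmd is None:
            return
        self.agl_cmd = max(self.target_agl_m,
                           self.agl_cmd - self.descent_rate_mps * dt)

    def is_complete(self):
        return self.agl_cmd is not None and self.agl_cmd <= self.target_agl_m
```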

Aerial Deer

Last Friday I flew an aerial photography test flight using a Skywalker 1900 and a Sony A6000 camera (with 20mm lens.)  On final approach we noticed a pair of deer crossing under the airplane.  I went back through the image set to see if I could spot the deer in any of the pictures.  I found at least one deer in 5 different shots.  Here are the zoom/crops: