Mistakes!


I make thousands of mistakes a day: mistakes typing, mistakes coding software, mistakes driving, mistakes walking, forgetting to order my sandwich without mayo, etc.  Most of the time they are immediately obvious — a red squiggly line under a word I mistyped, a compiler spewing an error message on line #42, a stubbed toe, my GPS suggesting a U-turn at the next intersection, etc.


But what happens when the mistake isn’t obvious, isn’t noticed immediately, and doesn’t cause everything around me to immediately fail?  Often these mistakes can have a long lifespan.  Often we discover them when we are looking for something else.

Mistakes from the Trenches.

I wanted to write about a few subtle unnoticed mistakes that lurked in the AuraUAS code for quite some time.

Temperature Calibration #Fail

AuraUAS has a really cool capability where it can estimate the bias (error) of the accelerometers during flight.  The 15-state EKF does this as part of its larger task of estimating the aircraft’s attitude, location, and velocity.  These bias estimates, along with the corresponding IMU temperature, can be used to build up a temperature calibration fit for each specific IMU based on flight data over time.  The more you fly in different temperature conditions, the better your temperature calibration becomes.  Sweet!  Calibrated accelerometers are important because accel calibration errors directly translate to errors in the initial roll and pitch estimates (such as during launch or takeoff, where these values can be critical).  Ok, the EKF will sort them out once in the air, because that is a cool feature of the EKF, but it can’t work out the errors until after flying a bit.

The bias estimates and temperature calibration fit are handled by post-flight python scripts that work with the logged flight data.  Question: should I log the raw accel values, or should I log the calibrated accel values?  I decided I should log the calibrated values and then use the inverse calibration fit function to derive the original raw values after the flight.  Then I use these raw values to estimate the bias (errors), add the new data to the total collection of data for this particular IMU, and revise the calibration fit.  The most straightforward path is to log calibrated values on board during flight (in real time) and push the complicated stuff off into post processing.
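To make that round trip concrete, here is a minimal sketch (not the actual AuraUAS scripts; the polynomial order and the numbers are invented) of fitting accel bias against temperature, applying the correction on board, and inverting it in post-processing:

    import numpy as np

    # Invented example data: accel bias (m/s^2) observed at various IMU temps (C).
    temps = np.array([5.0, 12.0, 20.0, 27.0, 35.0])
    biases = np.array([0.30, 0.22, 0.12, 0.05, -0.04])
    coeffs = np.polyfit(temps, biases, 2)      # quadratic temperature fit

    def calibrate(raw, temp_C):
        """Apply the temperature calibration (the on-board direction)."""
        return raw - np.polyval(coeffs, temp_C)

    def uncalibrate(cal, temp_C):
        """Invert the calibration (the post-flight direction)."""
        return cal + np.polyval(coeffs, temp_C)

    # The round trip only recovers the raw value if both directions use the
    # *same* temperature, which is exactly what the typo described below broke.
    raw = 9.75
    assert abs(uncalibrate(calibrate(raw, 30.0), 30.0) - raw) < 1e-9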

However, I made a slight typo in the property name of the temperature range limits for the fit (we only fit within the range of temperatures we have flight data for).  This means the on-board accel correction was forcing the temperature to 27C (ignoring the actual IMU temperature).  However, when backing out the raw values in post processing, I was using the correct IMU temperature and thus arriving at a wrong raw value.  What a mess.  That means a year of calibration flight data is basically useless and I have to start all my IMU calibration learning over from scratch.  So I fixed the problem, and we go forward from here with future flights producing a correct calibration.

Integer Mapping #Fail

This one is subtle.  It didn’t produce incorrect values; it simply reduced the resolution of the IMU gyros by a factor of 4 and the accels by a factor of 2.

Years ago when I first created the apm2-sensor firmware — which converts a stock APM2 (atmega2560) board into a pure sensor head — I decided to change the configured range of the gyros and accels.  Instead of +/-2000 degrees per second, I set the gyros for +/-500 degrees per second.  Instead of +/-8 g’s on the accels, I set them for +/-4 g’s.  The sensed values get mapped to a 16 bit integer, so using a smaller range results in more resolution.

The APM2 reads the raw 16 bit integer values from the IMU and converts them to radians per second.  However, when the APM2 sends these values to the host, it re-encodes them from a 4-byte float to a 2-byte (16-bit) integer to conserve bandwidth.  Essentially this undoes the original decoding operation to efficiently transmit the values to the host system.  The host reads the encoded integer value and reconverts it into radians per second for the gyros (or m/s^2 for the accels).
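A toy version of that scaling (not the real apm2-sensors code) shows why the encode/decode factor has to match the configured full-scale range, and where the resolution numbers quoted below come from:

    import math

    INT16_MAX = 32767

    def make_codec(full_scale_rad):
        """Build encode/decode functions for a symmetric +/- full-scale range."""
        scale = INT16_MAX / full_scale_rad
        def encode(value):
            return int(round(max(-INT16_MAX, min(INT16_MAX, value * scale))))
        def decode(packed):
            return packed / scale
        return encode, decode

    # Resolution (one integer step), converted back to degrees per second:
    enc500, dec500 = make_codec(math.radians(500.0))      # configured gyro range
    enc2000, dec2000 = make_codec(math.radians(2000.0))   # range used by mistake
    print(math.degrees(dec500(1)))    # ~0.015 deg/sec
    print(math.degrees(dec2000(1)))   # ~0.061 deg/sec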

The problem was that for encoding and decoding between the APM2 and the host, I used the original scaling factor for +/-2000 dps and +/-8g, not the correct scaling factor for the new range I had configured.  This mistake caused me to lose all the resolution I intended to gain.  Because the system produced the correct values on the other end, I didn’t notice this problem until someone asked me exactly what resolution the system produced, which sent me digging under the hood to refresh my memory.

This is now fixed in apm2-sensors v2.52, but it requires a change to the host software as well so the encoding and decoding math agrees.  Now the IMU reports the gyro rates with a resolution of 0.015 degrees per second, whereas previously the resolution was 0.061 degrees per second.  Both are actually pretty good, but it pained me to discover I was throwing away resolution needlessly.

Timing #Fail

This one is also very subtle; timing issues often are.  In the architecture of the AuraUAS flight controller there is an APM2 spitting out new sensor data at precisely 100 hz.  The host is a beaglebone (or any linux computer) running its own precise 100 hz main loop.  The whole system runs at 100 hz throughput and life is great — or so I thought.

I had been logging flight data at 25hz which has always been fine for my own needs.  But recently I had a request to log the flight data at the full 100 hz rate.  Could the beaglebone handle this?  The answer is yes, of course, and without any trouble at all.

A question came up about logging high rate data on the well known PX4, so we had a student configure the PX4 for different rates and then plot out the time slice for each sample.  We were surprised at the huge variations in the data intervals, ranging from way too fast, to way too slow, and rarely exactly what we asked for.

I know that the AuraUAS system runs at exactly 100hz because I’ve been very careful to design it that way.  Somewhat smugly I pulled up a 100hz data set and plotted out the time intervals for each IMU record.  The plot surprised me — my timings were all over the map and not much better than the PX4.  What was going on?

I took a closer look at the IMU records and noticed something interesting.  Even though my main loop was running precisely and consistently at 100 hz, it appeared that my system was often skipping every other IMU record.  AuraUAS is designed to read whatever sensor data is available at the start of each main loop iteration and then jump into the remaining processing steps.  Because the APM2 runs its own loop timing separate from the host linux system, the timing between sending and receiving (and uart transferring) can be misaligned so that when the host is ready to read sensor data, there might not be any yet, and next time there may be 2 records waiting.  It is subtle, but communication between two free-running processor loops can lead to issues like this.  The end result is usually still ok: the EKF handles variable dt just fine, the average processing rate maybe drops to 50hz, and that’s still just fine for flying an airplane around the sky … no big deal right?  And it’s really not that big of a deal for getting the airplane from point A to point B, but if you want to do some analysis of the flight data and want high resolution, then you do have a big problem.

What is the fix?  There are many ways to handle timing issues in threaded and distributed systems, but you have to be very careful: often what you get out of your system is not what you expected or intended.  In this case I have amended my host system’s main loop structure to throw away its own free-running main loop.  I have modified the APM2 data output routine to send the IMU packet at the end of each frame’s output to mark the end of data.  Now the main loop on the host system reads sensor data until it receives an IMU packet.  Then and only then does it drop through to the remaining processing steps.  This way the timing of the system is controlled precisely by the APM2, the host system’s main loop logic is greatly simplified, and the per frame timing is far more consistent … but not consistent enough.

The second thing I did was to include the APM2 timestamp with each IMU record.  This is a very stable, consistent, and accurate timestamp, but it counts up from a different starting point than the host.  On the host side I can measure the difference between host clock and APM2 clock, low pass filter the difference and add this filtered difference back to the APM2 timestamp.  The result is a pretty consistent value in the host’s frame of (time) reference.
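In code form, the idea is roughly this (a sketch, not the actual AuraUAS implementation):

    class ClockSync:
        """Shift APM2 timestamps into the host's time frame (illustrative)."""

        def __init__(self, time_constant=10.0, dt=0.01):
            self.alpha = dt / (time_constant + dt)   # first-order low-pass weight
            self.offset = None                       # filtered (host - apm2) difference

        def update(self, host_time, apm2_time):
            diff = host_time - apm2_time
            if self.offset is None:
                self.offset = diff                   # initialize on the first sample
            else:
                self.offset += self.alpha * (diff - self.offset)
            return apm2_time + self.offset           # stable stamp in host time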

Here is a before and after plot. The before plot is terrible! (But flies fine.)  The after plot isn’t perfect, but might be about as good as it gets on a linux system.  Notice the difference in Y-scale between the two plots.  If you think your system is better than mine, log your IMU data at 100hz and plot the dt between samples and see for yourself.  In the following plots, the Y axis is dt time in seconds.  The X axis is elapsed run time in seconds.

Before: dt using the host’s timestamp when the IMU packet is received.

After: dt using a hybrid of host and APM2 timestamps.

Even with this fix, I see the host system’s main loop timing vary between 0.008 and 0.012 seconds per frame, occasionally even worse (100hz should ideally equal exactly 0.010 seconds.)  This is now far better than the system was doing previously, and far, far better than the PX4 does … but still not perfect.  There is always more work to do!

Conclusions

These mistakes (when finally discovered) all led to important improvements to the AuraUAS system: better accelerometer calibration, better gyro resolution, better time step consistency with no dropped frames.  Will it help airplanes get from point A to point B more smoothly and more precisely?  Probably not in any externally visible way.  Mistakes?  I still make them thousands of times a day.  Lurking hidden mistakes?  Yes, those too.  My hope is that no matter what stage of life I find myself in, I’m always working for improvements, always vigilant to spot issues, and always focused on addressing issues when they are discovered.

Automated Movie Frame Extracting and Geotagging

This is a short tutorial on an automated method to extract and geotag movie frames.  One specific use case is that you have just flown a survey with your quad copter using a 2-axis gimbal pointing straight down, and a gopro action cam in movie mode.  Now you’d like to create a stitched map from your data using tools like pix4d or agisoft.

The most interesting part of this article is the method I have developed to correlate the frame timing of a movie with the aircraft’s flight data log.  This correlation process yields a result such that for any and every frame of the movie, I can find the exact corresponding time in the flight log, and for any time in the flight log, I can find the corresponding video frame.  Once this relationship is established, it is a simple matter to walk through the flight log and pull frames based on the desired conditions (for example, grab a frame at some time interval, while above some altitude AGL, and only when oriented +/- 10 degrees from north or south.)

Video Analysis

The first step of the process is to analyze the frame to frame motion in the video.


Example of feature detection in a single frame.

  1. For each video frame we run a feature detection algorithm (such as SIFT, SURF, ORB, etc.) and then compute the descriptors for each found feature.
  2. Find all the feature matches between frame “n-1” and frame “n”.  This is done using standard FLANN matching, followed by a RANSAC based homography matrix solver, and then discarding outliers.  This approach has a natural advantage of being able to ignore extraneous features from the prop or the nose of the aircraft because those features don’t fit into the overall consensus of the homography matrix.
  3. Given the set of matching features between frame “n-1” and frame “n”, I then compute a best fit rigid affine matrix transformation from the feature locations (u, v) in one frame to the next.  The affine transformation can be decomposed into a rotational component, a translation (x, y) component, and a scale component.
  4. Finally I log the frame number, frame time (starting at t=0.0 for the first frame), the rotation (deg), the x translation (pixels), and the y translation (pixels).  A compressed code sketch of these steps follows below.
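Here is that sketch using OpenCV.  ORB plus a brute force matcher stand in for the SIFT/SURF and FLANN combination described above, and the function names assume OpenCV 3.x or newer:

    import cv2
    import math
    import numpy as np

    detector = cv2.ORB_create(nfeatures=2000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def frame_motion(prev_gray, gray):
        """Return (rotation deg, x translation px, y translation px) between frames."""
        kp1, des1 = detector.detectAndCompute(prev_gray, None)
        kp2, des2 = detector.detectAndCompute(gray, None)
        matches = matcher.match(des1, des2)
        src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # RANSAC homography: outliers (prop, nose of the aircraft) get discarded.
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        src_in = src[mask.ravel() == 1]
        dst_in = dst[mask.ravel() == 1]

        # Best fit rigid (rotation + translation + scale) transform on the inliers.
        A, _ = cv2.estimateAffinePartial2D(src_in, dst_in)
        rot_deg = math.degrees(math.atan2(A[1, 0], A[0, 0]))
        return rot_deg, A[0, 2], A[1, 2]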

The cool, tricky observation

I haven’t seen anyone else do anything like this before, so I’ll pretend I’ve come up with something new and cool.  I know there is never anything new under the sun, but it’s fun to rediscover things for oneself.

Use case #1: Iris quad copter with a two axis Tarot gimbal, and a go-pro hero 3 pointing straight down.  Because the gimbal is only 2-axis, the camera tracks the yaw motion of the quad copter exactly.  The camera is looking straight down, so camera roll rate should exactly match the quad copter’s yaw rate.  I have shown I can compute the frame to frame roll rate using computer vision techniques, and we can save the iris flight log.  If these two signal channels aren’t too noisy or biased relative to each other, perhaps we can find a way to correlate them and figure out the time offset.


3DR Iris + Tarot 2-axis gimbal

Use case #2:  A Senior Telemaster fixed wing aircraft with a mobius action cam fixed to the airframe looking straight forward.  In this example, camera roll should exactly correlate to aircraft roll.  Camera x translation should map to aircraft yaw, and camera y translation should map to aircraft pitch.


Senior Telemaster with forward looking camera.

In all cases this method requires that at least one of the camera axes is fixed relative to at least one of the aircraft axes.  If you are running a 3 axis gimbal you are out of luck … but perhaps with this method in mind and a bit of ingenuity alternative methods could be devised to find matching points in the video versus the flight log.

Flight data correlation

This is the easy part.  After processing the movie file, we  now have a log of the frame to frame motion.  We also have the flight log from the aircraft.  Here are the steps to correlate the two data logs.


Correlated sensor data streams.

  1. Load both data logs (movie log and flight log) into a big array and resample the data at a consistent interval.  I have found that resampling at 30hz seems to work well enough.  I have experimented with fitting a spline curve through lower rate data to smooth it out.  It makes the plots look prettier, but I’m sure it does not improve the accuracy of the correlation.
  2. I coded this process up in python.  Luckily python (numpy) has a function that takes two time sequences as input and does brute force correlation.  It slides one data stream forward against the other data stream and computes a correlation value for every possible overlap.  This is why it is important to resample both data streams at the same fixed sample rate.
    ycorr = np.correlate(movie_interp[:,1], flight_interp[:,1], mode='full')
  3. When you plot out “ycorr”, you will hopefully see a spike in the plot, and that should correspond to the best fit of the two data streams.  (A self-contained toy example of these steps follows this list.)
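Here is that toy version of steps 1-3.  The signals and the offset are synthetic, purely to show the mechanics of resampling, correlating, and recovering the time shift:

    import numpy as np

    hz = 30.0
    true_offset = 12.3   # seconds; this is what we hope to recover

    # Synthetic "flight log" signal at 50 hz (smoothed random roll rate), plus a
    # shorter "movie log" that sees the same signal shifted by true_offset.
    rng = np.random.default_rng(1)
    t_flight = np.arange(0.0, 120.0, 0.02)
    flight_signal = np.convolve(rng.standard_normal(t_flight.size),
                                np.ones(50) / 50, mode='same')
    t_movie = np.arange(0.0, 60.0, 1.0 / 29.97)
    movie_signal = np.interp(t_movie + true_offset, t_flight, flight_signal)

    # Step 1: resample both streams at the same fixed rate.
    t = np.arange(0.0, 120.0, 1.0 / hz)
    flight_interp = np.interp(t, t_flight, flight_signal)
    movie_interp = np.interp(t, t_movie, movie_signal, left=0.0, right=0.0)

    # Step 2: brute force correlation (the np.correlate call shown above).
    ycorr = np.correlate(movie_interp, flight_interp, mode='full')

    # Step 3: the peak location gives the shift between the two streams.
    shift = np.argmax(ycorr) - (flight_interp.size - 1)
    print("estimated offset (sec):", -shift / hz)   # should be close to 12.3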


Plot of data overlay position vs. correlation.

Geotagging movie frames

 


Raw Go-pro frame grab showing significant lens distortion.  Latitude = 44.69231071, Longitude = -93.06131655, Altitude = 322.1578

The important result of the correlation step is that we have now determined the exact offset in seconds between the movie log and the flight log.  We can use this to easily map a point in one data file to a point in the other data file.

Movie encoding formats are sequential and the compression algorithms require previous frames to generate the next frame.  Thus the geotagging script steps through the movie file frame by frame and finds the point in the flight log data file that matches.

For each frame that matches the extraction conditions, it is a simple matter to look up the corresponding longitude, latitude, and altitude from the flight log.  My script provides an example of selecting movie frames based on conditions in the flight log.  I know that the flight was planned so the transects were flown North/South and the target altitude was about 40m AGL.  I specifically coded the script to extract movie frames at a specified interval in seconds, but only consider frames taken when the quad copter was above 35m AGL and oriented within +/- 10 degrees of either North or South.  The script is written in python so it could easily be adjusted for other constraints.

The script writes each selected frame to disk using the opencv imwrite() function, and then uses the python “pyexiv2” module to write the geotag information into the exif header for that image.
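The core loop of that process looks roughly like the following sketch.  This is illustrative rather than the actual script: the `flight` array is assumed to hold rows of (time, latitude, longitude, altitude AGL, yaw in degrees), and the pyexiv2 EXIF write is replaced by a print:

    import cv2
    import numpy as np

    def extract_frames(movie_path, flight, time_offset, interval=1.0,
                       min_agl=35.0, heading_tol=10.0):
        """flight: array of rows (time, lat, lon, alt_agl, yaw_deg)."""
        cap = cv2.VideoCapture(movie_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        frame_num = 0
        last_saved = -1.0e9
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            movie_time = frame_num / fps
            flight_time = movie_time + time_offset
            i = np.argmin(np.abs(flight[:, 0] - flight_time))   # nearest record
            t, lat, lon, agl, yaw = flight[i]
            y = yaw % 360.0
            heading_ok = min(y, 360.0 - y, abs(y - 180.0)) <= heading_tol
            if movie_time - last_saved >= interval and agl >= min_agl and heading_ok:
                name = "frame-%06d.jpg" % frame_num
                cv2.imwrite(name, frame)
                print(name, lat, lon, agl)   # the real script writes these into EXIF
                last_saved = movie_time
            frame_num += 1
        cap.release()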


A screen grab from Pix4d showing the physical location of all the captured Go-pro movie frames.

Applications?

Aerial surveying and mapping

The initial use case for this code was to automate the process of extracting frames from a go-pro movie and geotagging them in preparation for handing the image set over to pix4d for stitching and mapping.


Final stitch result from 120 geotagged gopro movie frames.

Using video as a truth reference to analyze sensor quality

It is interesting to see how accurately the video roll rate corresponds to the IMU gyro roll rate (assume a forward looking camera now).  It is also interesting to see in the plots how the two data streams track exactly for some periods of time, but diverge by some slowly varying bias for other periods of time.  I believe this shows the variable bias of MEMS gyro sensors.  It would be interesting to run down this path a little further and see if the bias correlates to g force in a coupled axis.

Visual odometry and real time mapping

Given feature detection and matching from one frame to the next, knowledge of the camera pose at each frame, opencv pnp() and triangulation() functions, and a working bundle adjuster … what could be done to map the surface or compute visual odometry during a gps outage?

Source Code

The source code for all my image analysis experimentation can be found at the University of Minnesota UAV Lab github page.  It is distributed under the MIT open-source license:

https://github.com/UASLab/ImageAnalysis

Comments or questions?

I’d love to see your comments or questions in the comments section at the end of this page!

Image Stitching Tutorial Part #2: Direct Georeferencing

What is Direct Georeferencing?


The process of mapping a pixel in an image to a real world latitude, longitude, and altitude is called georeferencing.  When UAS flight data is tightly integrated with the camera and imagery data, the aerial imagery can be directly placed and oriented on a map.   Any image feature can be directly located.  All of this can be done without needing to feature detect, feature match, and image stitch.  The promise of “direct georeferencing” is the ability to provide useful maps and actionable data immediately after the conclusion of a flight.

Typically a collection of overlapping images (possibly tagged with the location of the camera when the image was snapped) are stitched together to form a mosaic.  Ground control points can be surveyed and located in the mosaic to georectify the entire image set.   However the process of stitching imagery together can be very time consuming and specific images sets can fail to stitch well for a variety of reasons.

If a good estimate for the location and orientation of the camera is available for each picture, and if a good estimate of the ground surface is available, the tedious process of image stitching can be skipped and image features can be georeferenced directly.
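For a pinhole camera model, the heart of that step is just a ray-to-ground intersection.  A minimal sketch, assuming a flat ground plane, NED coordinates, and a known camera-to-NED rotation matrix:

    import numpy as np

    def georef_pixel(u, v, K, R_cam2ned, cam_pos_ned, ground_down):
        """Project pixel (u, v) onto a flat ground plane; returns an NED point."""
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray through the pixel
        ray_ned = R_cam2ned @ ray_cam                        # rotate into NED frame
        # Scale the ray so its "down" component reaches the ground plane.
        s = (ground_down - cam_pos_ned[2]) / ray_ned[2]
        return cam_pos_ned + s * ray_ned

In practice the lens distortion has to be removed from (u, v) first, and a terrain model can replace the flat plane, but the idea is the same.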

Where is Direct Georeferencing Suitable for Use?


There are a number of reasons to skip the traditional image stitching pipeline and use direct georeferencing.  I list a few here, but if the readers can think of other use cases, I would love to hear your thoughts and expand this list!

  • Any situation where we are willing to trade a seamless ‘nice’ looking mosaic for fast results.  For example, a farmer in a field who would like to make a same-day decision rather than wait a day or two for image stitching results to be computed.
  • Surveys or counting.  One use case I have worked on is marine surveys.  In this use case the imagery is collected over open ocean with no stable features to stitch.  Instead we were more interested in finding things that were not water and getting a quick but accurate location estimate for them.  Farmers might want to count livestock, land managers might want to locate dead trees, researchers might want to count bird populations on a remote island.



How can Direct Georeferencing Improve the Traditional Image Stitching Pipeline?

There are some existing commercial image stitching applications that are very good at what they do.  However, they are closed-source and don’t give up their secrets easily.  Thus it is hard to do an apples-to-apples comparison with commercial tools to evaluate how (and how much) direct georeferencing can improve the traditional image stitching pipeline.  With that in mind I will forge ahead and suggest several ways I believe direct georeferencing can improve the traditional methods:

  • Direct georeferencing provides an initial 3d world coordinate estimate for every detected feature before any matching or stitching work is performed.
  • The 3d world coordinates of features can be used to pre-filter match sets between images.  When doing an n vs. n image compare to find matching image pairs, we can compare only images with feature sets that overlap in world coordinates, and then only compare the overlapping subset of features.  This speeds up the feature matching process by reducing the number of individual feature comparisons.  This increases the robustness of the feature matching process by reducing the number of potential similar features in the comparison set.  And this helps find potential match pairs that other applications might miss.  (A small sketch of this pre-filter follows this list.)
  • After an initial set of potential feature matches is found between a pair of images, these features must be further evaluated and filtered to remove false matches.  There are a number of approaches for filtering out incorrect matches, but with direct georeferencing we can add real world proximity to our set of strategies for eliminating bad matches.
  • Once the entire match set is computed for the entire image set, the 3d world coordinate for each matched feature can be further refined by averaging the estimate from each matching image together.
  • When submitting point and camera data to a bundle adjustment algorithm, we can provide our positions already in 3d world coordinates.  We don’t need to build up an initial estimate in some arbitrary camera coordinate system where each image’s features are positioned relative to neighbors.  Instead we can start with a good initial guess for the location of all our features.
  • When the bundle adjustment algorithm finishes, we can compare the new point location estimates against the original estimates and look for features that have moved implausibly far.  This could be evidence of remaining outliers.
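As a sketch of that pre-filtering idea (illustrative only): compute a ground footprint for each image from its direct-georeferenced features, and only run the expensive descriptor matching on image pairs whose footprints overlap:

    import numpy as np

    def bbox(world_pts):
        """Axis-aligned bounding box of an (N, 2) array of ground x/y coordinates."""
        return (world_pts[:, 0].min(), world_pts[:, 1].min(),
                world_pts[:, 0].max(), world_pts[:, 1].max())

    def overlaps(a, b):
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    def candidate_pairs(world_pts_by_image):
        boxes = [bbox(p) for p in world_pts_by_image]
        return [(i, j) for i in range(len(boxes))
                       for j in range(i + 1, len(boxes))
                       if overlaps(boxes[i], boxes[j])]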

Throughout the image stitching pipeline, it is critical to create a set of image feature matches that link all the images together and cover the overlapping portions of each image pair as fully as possible.  False matches can cause ugly imperfections in the final stitched result and they can cause the bundle adjustment algorithm to fail so it is critical to find a good set of feature matches.  Direct georeferencing can improve the critical feature matching process.

What Information is Required for Direct Georeferencing?

  • An accurate camera position and orientation for each image.  This may require the flight data log from your autopilot and an accurate (sub second) time stamp when the image was snapped.
  • An estimate of the ground elevation or terrain model.
  • Knowledge of your camera’s lens calibration (“K” matrix.)  This encompasses the field of view of your lens (sensor dimensions and focal length) as well as the sensor width and height in pixels.
  • Knowledge of your camera’s lens distortion parameters.  Action cameras like a gopro or mobius have significant lens distortion that must be accounted for.  (A short code example of these camera parameters follows this list.)
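For reference, the camera matrix and distortion coefficients look like this in code.  The numbers here are made up; real values come from a calibration step such as OpenCV’s cv2.calibrateCamera:

    import numpy as np
    import cv2

    w, h = 1920, 1080                        # sensor size in pixels
    fx = fy = 1100.0                         # focal length in pixels (made up)
    K = np.array([[fx, 0.0, w / 2.0],
                  [0.0, fy, h / 2.0],
                  [0.0, 0.0, 1.0]])
    dist = np.array([-0.28, 0.10, 0.0, 0.0, -0.02])   # k1 k2 p1 p2 k3 (made up)

    # Map distorted pixel coordinates to ideal (undistorted) pixel coordinates.
    pts = np.array([[[100.0, 200.0]]], dtype=np.float64)
    ideal = cv2.undistortPoints(pts, K, dist, P=K)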

Equipment to Avoid

I don’t intend this section to be negative, every tool has strengths and times when it is a good choice.  However, it is important to understand what equipment choices works better and what choices may present challenges.

  • Any camera that makes it difficult to snap a picture precisely (< 0.1 seconds) from the trigger request.  Autofocus commercial cameras can introduce random latency between the camera trigger and the actual photo.
  • Rolling shutter cameras.  Sadly, this is just about every commercial off-the-shelf camera, but rolling shutter introduces warping into the image which can add uncertainty to the results.  This can be partially mitigated by setting your camera to a very high shutter speed (i.e. 1/800 or 1/1000th of a second.)
  • Cameras with slow shutter speeds or cameras that do not allow you to set your shutter speed or other parameters.
  • Any camera mounted to an independent gimbal.  A gimbal works nicely for stable video, but if it separates the camera orientation from the aircraft orientation, then we can no longer use aircraft orientation to compute camera orientation.
  • Any flight computer that doesn’t let you download a complete flight log that includes real world time stamp, latitude, longitude, altitude, roll, pitch, and yaw.

The important point I am attempting to communicate is that tight integration between the camera and the flight computer is an important aspect for direct georeferencing.  Strapping a gimbaled action cam to a commercial quad-copter very likely will not allow you to extract all the information required for direct georeferencing.

 

Image Stitching Tutorial Part #1: Introduction


Background

During the summer of 2014 I began investigating image stitching techniques and technologies for a NOAA sponsored UAS marine survey project.  In the summer of 2015 I was hired by the University of Minnesota Department of Aerospace Engineering and Mechanics to work on a Precision Agriculture project that also involves UAS’s and aerial image stitching.

Over the past few months I have developed a functional open-source image stitching pipeline written in python and opencv.  It is my intention with this series of blog postings to introduce this work and further explain our approach to aerial image processing and stitching.

Any software development project is a journey of discovery and education, so I would love to hear your thoughts, feedback, and questions in the comments area of any of these posts.  The python code described here will be released under the MIT open-source software license (one of my to-do list items is to publish this project code, so that will happen “soon.”)

Why?

The world already has several high quality commercial image stitching tools as well as several cloud based systems that are free to use.  Why develop yet another image stitching pipeline?  There are several reasons we began putting this software tool chain together.

  • We are looking at the usefulness and benefits of ‘direct georeferencing.’  If we have accurate camera pose information (time, location, and camera orientation of each image), then how can this improve the image analysis, feature detection, feature matching, stitching, and rendering process?
  • One of the core strengths of the UMN Aerospace Engineering Department is a high quality 15-state Kalman filter attitude determination system.  This system uses inertial sensors (gyros and accelerometers) in combination with a GPS to accurately estimate an aircraft’s ‘true’ orientation and position.  Our department is uniquely positioned to provide a high quality camera pose estimate and thus examine ‘direct georeferencing’ based image processing.
  • Commercial software and closed source cloud solutions do not enable the research community to easily ask questions and test ideas and theories.
  • We hope to quantify the sensor quality required to perform useful direct georeferencing as well as the various sources of uncertainty that can influence the results.
  • We would love to involve the larger community in our work, and we anticipate there will be some wider interest in free/open image processing and stitching tools that anyone can modify and run on their own computer.

Outline

I will be posting new tutorials in this series as they are written.  Here is a quick look ahead at what topics I plan to cover:

  • Direct Georeferencing
  • Image stitching basics
  • Introduction to the open-source software tool chain
  • Aircraft vs. camera poses and directly visualizing your georeferenced data set
  • Feature detection
  • Feature matching
  • Sparse Bundle Adjustment
  • Seamless 3D and 2D image mosaics, DEM’s, Triangle meshes, etc.

Throughout the image collection and image stitching process there is art, science, engineering, math, software, hardware, aircraft, skill, and a maybe bit of luck once in a while (!) that all come together in order to produce a successful aerial imaging result.

Software Download

The software referenced in this tutorial series is licensed with the MIT license and available on the University of Minnesota UAV Lab public github page under the ImageAnalysis repository.

Credits

Flight Milestones

Congratulations!

Congrats ATI Resolution 3, Hobby Lobby Senior Telemaster, Hobbyking Skywalker, and Avior-lite autopilot on your recent milestones!


Avior-lite (beaglebone + apm2 hybrid) autopilot:

  • 300th logged flight
  • 7000+ logged flight minutes (117.8 hours)
  • 6400+ fully autonomous flight minutes (107.2 hours)
  • 2895 nautical miles flown (3332 miles, 5362 km)

Hobby Lobby Senior Telemaster (8′ wing span)

  • Actively flight testing autopilot hardware and software changes since 2007!
  • 200th logged flight.
  • 5013 logged flight minutes (83.5 hours)
  • 4724 fully autonomous flight minutes (78.7 hours)
  • 2015 nautical miles flown (2319 miles, 3733 km)

Today (October 7, 2015) I logged the 300th avior-lite flight and simultaneously logged the 200th flight on my venerable Senior Telemaster.  I realize these are just numbers, and they wouldn’t be big numbers for a full scale aircraft operation or even a production uav operation, but it represents my personal effort in the UAV field.

I’m proud of a mishap-free 2015 flying season so far!  (Ok, err, well one mishap on the very first launch of the skywalker … grrr … pilot error … and fixable thankfully.)

Enjoy the fall colors and keep flying!


APM2 Sensor Head


The ardupilot mega is a fairly capable complete autopilot from both the hardware and the software perspective.  But what if your project needs all the sensors but not the full APM2 autopilot code?

Overview

The apm2-sensorhead project provides a quick, robust, and inexpensive way to add a full suite of inertial and position sensors to your larger robotics project.  This project is a replacement firmware for the ardupilot-mega hardware.  The stock arduplane firmware has been stripped down to just include the library code that interrogates the connected sensors, and also maintains the code that can read your RC transmitter stick positions through a stock RC receiver as well as drive up to 8 servo outputs.  It also includes manual override (safety pilot) code and code to communicate with a host computer.  The next upcoming feature will be onboard mixing for vtail, elevon, and flaperon aircraft configurations.  This allows you the option to fly complicated airframes (and have an autopilot fly complicated airframes) without needing a complicated transmitter or adding complicated mixing code to your autopilot application.
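For what it’s worth, the mixing math itself is simple.  An illustrative version (not the firmware code) with inputs and outputs normalized to the range [-1, 1]:

    def elevon_mix(aileron, elevator, gain=0.5):
        """Classic elevon mix: each surface gets a share of both commands."""
        left = gain * (elevator + aileron)
        right = gain * (elevator - aileron)
        return left, right

    def vtail_mix(elevator, rudder, gain=0.5):
        """Classic v-tail mix."""
        left = gain * (elevator + rudder)
        right = gain * (elevator - rudder)
        return left, right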

Why?

Speaking a bit defensively: I want to address the “why?” question.  First of all, I needed something like this for one of my own autopilot projects.  That’s really the primary motivation right there and I could just skip to the next section.  If you can answer this question for yourself, then you are my target audience!  Otherwise, if your imagination is not already running off on its own, why should you or anyone else possibly be interested?

  • You are working in the context of a larger project and need to incorporate an IMU and GPS and possibly other sensors.  Do you design your own board?  Do you shoehorn your code onto the ardupilot?  Do you look at some of the other emerging boards (pixhawk, etc.?)  What if you could integrate the sensors you need quickly and easily?
  • The ardupilot mega is a relatively inexpensive board.  There are some APM clones available that are even less expensive.  It would be hard to put together the same collection of sensors for a lower price by any other means.
  • The ardupilot mega is proven and popular and has seen very wide scale testing.  It is hard to find any other similar device on the market that has been tested as thoroughly and under as wide a variety of applications and environments than the ardupilot.
  • The ardupilot mega code (especially the sensor I/O and RC code) is also tremendously well tested and ironed out.
  • By stripping all the extra functionality out of the firmware and concentrating simply on sensor IO and communication with a host computer, the new firmware is especially lean, mean, fast, and simple.  Whereas the full arduplane code is bursting the atmega2560 cpu at the seams with no more room to add anything, compiling the apm2-sensorhead code reports: “Binary sketch size: 36,132 bytes (of a 258,048 byte maximum)”
  • Along with plenty of space for code, removing all the extraneous code allows the CPU to run fast and service all its required work without missing interrupts and without dropping frames.
  • There is a design philosophy that prefers splitting the hard real-time work of low level sensor interrogation from the higher level intelligence of the application.  This can lead to two simpler applications that each do their own tasks efficiently and well, versus a single monolithic conglomeration of everything which can grow to be quite complex.  With all the hard real time work taken care of by the apm2, the host computer application has far less need for a complicated and hard-to-debug thread based architecture.
  • The APM2 board can connect to a host computer trivially via a standard USB cable.  This provides power and a UART-based communication channel.  That’s really all you need to graft a full suite of sensors to your existing computer and existing application.  Some people like to solder chips onto boards, and some people don’t.  Some people like to write SPI and I2C drivers, and some people don’t.  Personally, I don’t mind if once in a while someone hands me an “easy” button. 🙂

Some Technical Details

  • The apm2-sensorhead firmware reports the internal sensor data @ 100hz over a 115,200 baud uart.
  • The apm2-sensorhead firmware can be loaded on any version of the ardupilot-mega from 2.0 to 2.5, 2.6, and even 2.7.2 from hobbyking.
  • The UART on the APM2 side is 5V TTL which you don’t have to worry about if you connect up with a USB cable.
  • This firmware has flown extensively for 3 flying seasons at the time of this writing (2012, 2013, 2014) with no mishap attributable to a firmware problem.
  • The firmware communicates bi-directionally with a host computer using a simple binary 16-bit checksum protocol.  Sensor data is sent to the host computer and servo commands are read back from the host computer.  (An illustrative framing example follows this list.)
  • Released under the GPLv3 license (the same as the arduplane library code.)
  • Code available at the project site: https://github.com/clolsonus/apm2-sensorhead  This code requires a minor ‘fork’ of the APM2 libraries available here: https://github.com/clolsonus/libraries
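As an illustration of that style of protocol (the actual apm2-sensorhead byte layout and checksum may differ), here is a simple framed packet with a two-byte running checksum:

    def checksum16(data):
        """Simple 16-bit checksum: two running sums over the bytes."""
        sum1 = sum2 = 0
        for b in data:
            sum1 = (sum1 + b) % 256
            sum2 = (sum2 + sum1) % 256
        return sum1, sum2

    def make_packet(start, packet_id, payload):
        """Frame a payload: start bytes, id, size, payload, checksum (assumed layout)."""
        header = bytes(start) + bytes([packet_id, len(payload)])
        ck1, ck2 = checksum16(header[len(start):] + payload)
        return header + payload + bytes([ck1, ck2])

    pkt = make_packet([0x93, 0xE0], 0x01, b'\x10\x20\x30')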

Questions?

I am writing this article to coincide with the public release of the apm2-sensorhead firmware under the LGPL license.  I suspect there will not be wide interest, but if you stumble upon this page and have a question or see something important I have not addressed, please leave a comment.  I continue to actively develop and refine and fly (fixed wing aircraft) with this code as part of my larger system.

 

PBY Catalina

January 27, 2015

All shiny again!


January 27, 2015

Today my replacement plastic parts arrived … a new canopy and a new hull shield.  It wasn’t too hard to pry the old shield off, and the new part is ready to slap on.


January 26, 2015


Here is a quick snapshot showing the new orange color on the engine cowls and the upper side wing tips, and also notice the tail stripes have been repainted orange (originally red).  I’m just waiting on a replacement canopy which will hopefully arrive tomorrow, and then I should be able to completely reassemble the model and re-maiden it after the crash.  Oh, I also have soldered up the LED landing lights and will have those illuminated now just for fun.

January 23, 2015

Repair status update:  The crash repairs continue at a slow but steady rate; I am hoping the result will be better than ever!

Here is a picture showing the motor nacelle damage.  This has now been reglued and is as strong as ever.


The next picture shows the repair process of the wing center pylon.  This got shredded when the wing ripped off.  Two pieces have already been attached in this picture (with a battery on top to weight it down) and I am holding the final piece in its approximate location.


My repair plan includes adding some orange color to the engine nacelles, the top of the wing, and the tail for enhanced visibility.  In the next picture I have removed the cowls before painting them.


Now we see the cowls painted orange and reattached.  I think the final result will look nice if I can get some orange stripes on the wing tops and tail and do it cleanly.  In this next picture you can also see a view of the repaired wing center pylon (with some white filler that still needs to be painted.)  I found some flat gray spray primer at the hardware store that is a pretty close match to the factory paint, but it is cheap paint and covers poorly and requires a couple coats.


I have ordered a replacement canopy and replacement forward hull shield because these thin plastic parts were substantially damaged in the crash.  Grayson Hobby had these parts in stock for $9.99.  If you are frustrated because nitroplanes.com is always out of stock of the airplane or part you want, check out Grayson Hobbies online.  They have a wide range of Dynam stuff.

January 4, 2015


More detailed damage assessment:

  • The fuselage center wing mounting pylon is pretty shredded.  I’m hoping I can glue the parts back together to get the shape, and then do some reinforcement to get the necessary structural strength.
  • The right wing tip float broke off.  I’m strongly considering a mod to convert this model to retractable wing tip floats.  There are a few people on youtube that have done this and it turns out pretty cool.
  • The canopy got shattered into dozens of bits.  I need to either buy replacement plastic parts or carve a replacement from something.
  • The left engine nacelle got completely ripped off.  I think this will go back together again ok with enough surface area to be structurally sound.
  • The right front bottom of the nose took some damage I just now noticed.  That’s probably something I can just glue back together and it will be ok.
  • I am thinking about getting out my can of orange spray paint and doing an outboard section of the top of the wing in high visibility orange (and maybe a couple other bits while I’m at it.)  Part of the reason for the crash was that I completely lost orientation on the model, and part of the reason for that (I think) was flying an all gray model on an overcast day.

January 3, 2015


I went out to fly this off the lake this afternoon (frozen this time of year.)  I did one good flight, switched batteries and midway through the 2nd flight some winds started to come up a bit. Then I did the unthinkable … I lost my perspective/orientation with the model, executed one of those dumb thumb moves, and corkscrewed it right in.  It’s been a long time since I’ve had a crash I couldn’t blame on mechanical, electrical, or weather factors.  This one was just dumb thumbs and it was over before I could figure it out.  Now I’m really bummed.  Here is a picture of the wreckage I carted home.  Major damage includes the center wing pylon sheared off (foam/structural), the canopy shattered to bits, one of the wing tip floats broke off (plastic part broke), one of the engine nacelles broke off, and probably a few other odds and ends I’ll find if I dig in further.

I think the engine nacelle and center wing pylon will be repairable.  I don’t know what I will do about the canopy … maybe I can buy a replacement?  Same with the wing tip float mount?  I suppose I’ll set it to the side for now.  Usually after a few days I start to get a bit curious and dig in and often a lot of the damage is quickly repairable.  Some of it might be a little harder though … guess I’ll post a repair log if I think I’ll be able to fix it up and fly it again.

June 21, 2013

This is the PBY from nitroplanes.com.  It is a relatively big airplane compared to most ‘foamies’ these days with a 57-7/8″ wing span.  It is also a twin and a seaplane.  I have nothing but good things to say about it.  For the cost and the effort to assemble it, it looks really great!  On the ground, in the water, in the air, sitting still, water taxiing, flying … it just looks great.


I purchased the PBY in January of 2012, so my first flights were off a snow covered lake on a bitter cold January afternoon.  I fly it with a 2200 mah 3 cell battery and that provides tons of flight time … probably pushing 20 minutes or more of relaxed flying.  Relaxed is the key word. There is nothing white knuckle about flying this airplane.


One of the fun things I did this day was to snow taxi downwind a bit, turn back around into the wind and give it about 3 notches of throttle.  It began to slide across the slick snow and pick up speed slowly.  Little by little it accelerated and before I thought it would be ready to fly, it slowly lifted off on its own.  It passed in front of me about eye level, still climbing out at a very slow, very scale looking flight speed.  There are moments that are so perfect you always remember them, and in the context of flying RC airplanes, this was one of them.

She handles water operations just as effortlessly as flying off snow.   There is a little park with a dock at one corner of this lake in walking distance from my house.  It’s a perfect place to go fly a park-flyer seaplane on a nice calm summer evening.  The twin engines, the big wing with slow flight characteristics, smooth water handling … it just makes for a calm, relaxing fun evening of flying.  At the park, the PBY always seems to attract an audience too.

If you are into aerobatics, the catalina can do all the basic loops, rolls, inverted flight, wing overs, etc.  It can, but for some reason, I’d rather watch it do slow, scale, near perfect fly-by’s all evening long, mix in some touch and goes off the water … it’s really pretty!

For whatever it’s worth, the Catalina has plenty of power to do dry grass takeoffs from the RC club field.  I try to be extra careful keeping the wings level on landing so I don’t catch one of those wing tip floats.  The ground is a little less forgiving than water.

Just to summarize.  This is a very scale looking airplane.  It has a big fat wing and is lightly wing loaded (just like the real thing.)   It can fly amazingly slow in the air.  If you just want to enjoy a beautiful scale looking/flying aircraft, this one is hard to beat!

Sonic 64

Hobbyking Sonic 64

This is a 1230mm (48.4″) wing span flying wing.  It is powered by a 64mm electric ducted fan.  It is a simple build and flies great!

Here are a couple pictures before the maiden flight (temperature was about +18F, winds were calm, skies had a medium thin overcast.)


My one gripe about the Sonic 64

Of course we buy airplanes like the Sonic 64 because they are inexpensive, quick to assemble, and fast to get out to the field and up in the air.  It doesn’t hurt that they look great and are a lot of fun to fly!  But in the case of the Sonic 64, perhaps it is a little too inexpensive.  The servos that came with the kit were utter garbage.  One of them was completely useless and unsafe to fly.  It couldn’t hold a position, and it had a lot of trouble with its potentiometer.  The other servo probably could have been able to fly safely, but not fly well.  I submitted an RMA to Hobbyking where I purchased this ARF, and to-date they have not responded with an “initial evaluation.”  Typically the stuff I get from Hobbyking works and meets expectations, but this is pretty poor customer service to not even acknowledge my RMA request after a week and a half of waiting.  So I gave up, dug out the original servos, and replaced them with Hitec HS-55 servos.  These are $10 servos and far better than the $0.50 servos that came installed with the kit.  Unfortunately the servo leads on the Hitec servos were shorter, so add two $3 servo extensions.  And finally add the cost of some glue to install the servos properly … altogether that added about $31 to the cost of the airplane … and Hobbyking still has yet to respond to my RMA.

Servo Installation in foam wings

My new favorite glue for installing servos into foam wings is: E6000 (available at Michaels and probably lots of other places.)  It is not a quick dry cement (let it dry over night) but the result is solid, it seems to be foam safe, it dries clear, and according to the video it’s pretty easy to cut through with a knife to later separate the glue joint.
I ran up to Michaels last night and got a small tube to try on my Sonic 64 and I’m really pleased with the results this morning.  I’ve tried different techniques over the years, but I’ve had hot-glue and double sided tape be wiggly or just pop out.  Epoxy is a bit too permanent when you are talking about cheap servos … E6000 seems to be a really good balance between creating a solid glue joint between a servo and foam, but then being able to separate it later if needed.  Here is an in depth how-to video (not mine) with a bunch more explanation.


 

Variants

This same design is sold under the name “Neptune” by Nitroplanes.com (but was out of stock when I was interested in ordering it.)  The manual still has at least one reference to Neptune that hadn’t been updated. 🙂

Launching

The Sonic 64 is hand launched.  This can be a little tricky if you are flying by yourself or need to get help from someone who doesn’t have a lot of airplane hand launching experience … so be cautious, things can go wrong quickly.

The manual suggests running the throttle up to about 50% (that feels about right) and then running 2-3 steps and giving it a good firm level throw (wings level, nose level, throw direction straight and level).  I had one good launch, one bad/failed launch (no damage, whew!), and a 2nd good launch on my 3rd try.  I noticed the left wing would really drop quickly.  I made every effort to make a straight level throw, but maybe my technique is flawed?  Maybe there is some torque or spiral thrust issue going on?  Hopefully I can learn to do this more reliably on my own.  Often I’m out flying by myself, and then no offense to any of my fellow RC club members, but many of us don’t have much hand launch experience.

Next time out I think I may try to run the throttle up a little higher (maybe 2/3rds) and try to release it on an upwards trajectory … maybe 15-20 degrees up?

Flying

My maiden flight was on a dead calm afternoon and the Sonic 64 flies beautifully.  It is stable, tracks nicely, and responds well to control inputs.  It has no rudder, but it can do all the bank and yank type aerobatic maneuvers you can think of.  I attempted a high altitude stall to see what kind of craziness would happen and it was boring … it just dropped the nose and kept flying.  I’m sure I could get it to spin if I stalled it out in a steep turn, but I didn’t want to push it too hard on my first time out.  One thing I did manage to do was a sequence of really graceful wingovers.  I could fly a fast eye level pass, climb out steeply, and then as speed bleeds off, roll into a bank (maybe 10-20 degrees) and hold that as you run out of airspeed.  The airplane naturally does a wing over and comes back flying right past you the other direction.  It’s really graceful.  If you hold the throttle fixed through the maneuver, it tends to kick you over just a bit faster when you are at the apex and have minimal aerodynamic control and the thrust starts to dominate.  If you pull the throttle back at the apex, then it’s a bit more lazy and graceful.

I was flying a cold day (+18F) so the air was relatively dense.  As a result, top speed didn’t seem too fast, even though the motor was really wound up.  I expect on a warm summer evening it will zip by much faster in hotter, less dense air.

Landing

The Sonic 64 is a flying wing and behaves exactly how you would expect.  It has a relatively long, flat glide, and if you carry any speed over the threshold, it will glide forever in ground effect.  So there really isn’t anything difficult or unexpected, just be ready for a fairly long, flat approach and be ready for it to carry during the flare as it wants to just keep flying forever in ground effect.

Conclusions

This is a great flying, inexpensive airplane that goes together quickly (probably an hour or two maximum.)  It is appropriate for a moderately experienced sport flyer.  It’s also a big bundle of fun and looks really cool in the air.  However, I am now aware that if I do have any problem with something ordered from Hobbyking, I shouldn’t have high hopes for their customer service and for their response times.  That’s unfortunate because I actually like Hobbyking otherwise and everything else I’ve bought from them has met expectations.  In this case the problem was easy to fix (yeah model making skills) but that was an extra chunk of money out of my own pocket.


Polaris Ultra

Manufacturer: Model Aero

Design: Polaris Ultra

Web site: http://modelaero.com/product/POL-ULT.html

Summary: Pretty awesome airplane

Model Aviation also has a nice review of the Polaris: http://modelaviation.com/modelaeropolarisultra

My story is that I missed one of the model aero sales by a day or two so I emailed the owner to see if I could sneak in my order late.  He replied that he had a couple unpainted polaris arf’s and he’d be willing to give me a package deal on one of those.  So I figured it couldn’t be that hard to paint a foam model and off I went.

The aircraft arrived completely unpainted. The foam was white, and even the plastic parts were just clear.  I test fit a few of the pieces and it sure was looking pretty cool even without paint:


The assembly process is pretty straight forward and the instructions are good.  It’s perhaps a bit more work than some of the newer foam arfs.  It took me a couple evenings (including the extra work of painting.)  But, there was nothing too difficult.  I chose to mount my elevator servo externally to simplify the install.

Because I was painting the model from scratch, I got to make all the choices.  That is good and bad.  I decided to try to keep things simple and roughly follow the original scheme.  I happened to have flat gray primer in a can, and then I ran to the local hardware store and picked up a can of bright red.  So those were my two colors.  The cockpit and dorsal vent got painted flat gray.  The cockpit piece is removable for swapping flight batteries so it was easy to get a nice clean edge there.


Painting the color was a bit more work.  I’d never done this before, not exactly, so I had to make things up a bit as I went along.  It turns out that if I hit the foam surface hard with the spray paint, the propellant would melt the foam.  That wouldn’t be a good thing.  Same problem with the primer.  I did some googling and water based spray paint exists, but I didn’t have it in hand.  I found some test foam and discovered that the trick is to start by lightly dusting the surface from a good distance (maybe 18+ inches).  This lets most of the propellant evaporate off before it hits the foam, and as the dust accumulates the surface is more and more protected.  So I did this trick with the primer and then later could paint normally over that with the color layer.

I decided to use electrical tape to mask the paint areas.  This was good in the sense that it didn’t seem to bleed too much, but bad in the sense that it was too sticky and pulled some of the shiny foam finish off when I removed it.  That’s hard to notice though, so I guess it wasn’t a complete disaster.  I laid down my exact lines and then masked off with newspaper after that.  One thing I discovered though is that the spray goes everywhere.  It managed to sneak through some cracks I thought were sealed.  Again, not perfect, but not too noticeable on the final result, so I guess it’s not the end of the world.


After painting the color areas, I went back with 1/2″ electrical tape and used that as edge trim to hide paint edge and tidy things up.  Again, not perfect and kind of crude, but I think it turned out pretty sharp in the end.


I painted the bottom of the wings flat gray to differentiate them from the top, and with the 1/2″ black tape on the edges I think it set it off and made it look pretty sharp.  I’m happy with the result.


Here is the end result.  The canopy is glued on, the cowl is painted red and attached.  3-bladed prop looks great.  The decals are applied in the final shot.


I don’t have any flight videos or pictures yet, but I did do the maiden flight. I waited and waited (almost a month) for decent weather and finally got a day with almost dead calm winds here in MN.  The Saturday after Thanksgiving was the day.  There is a small lake across the street from my house and this time of year it is frozen.  Fortunately due to early extra cold weather, the ice on the lake was thick enough to walk on safely.  So I put on my boots and headed out to find sunshine, upper 30’s, and dead calm winds.  It was later in the afternoon, but the sun was still bright.  The first flight went perfect.  It needed a bunch of clicks of right trim and a couple clicks of up, but once I got her dialed in, then WOW!  She really flies spectacularly.  I love how you can pull high alpha in the turns to make them extra tight.  She tracks and handles so nicely, like she is on rails.  She looks so sweet in the air!  Take offs from the snow were amazingly smooth.  I was flying on some tired old 3S batteries, but honestly that was plenty of power and plenty fast for the area I was flying.  It is a relatively long model so it is extremely stable in pitch.  It is all around really wonderful … looks and flies spectacularly!

 

 

Coding and Complexity

Norris Numbers

I recently stumbled on the following article about “Norris numbers”

http://www.teamten.com/lawrence/writings/norris-numbers.html

The quick summary is that for an untrained programmer, 1500 lines of code is about where they hit the wall before they succumb to complexity and organizational issues.  The next big barrier is 20,000 lines of code for those that have some training and a bit of experience.  The next big barriers are at 200,000 lines, 2 million lines, etc.  At each level, a programmer or team has to develop new techniques and strategies to overcome the inherent cognitive limits of our human brains.

SLOCCOUNT

Linux offers a neat little tool called “sloccount”.  You can run it on just about any source tree, and it will count up the lines of actual code and then estimate how long it might have taken to develop that code and what the expected costs might be.

For example, my core UAS autopilot project is about 41,000 lines of code currently.  Sloccount estimates this represents nearly 10 years of effort (I started the work in 1995) and at a modest yearly salary estimates the cost to develop this code at about $1.3 million dollars.

My first reaction is this is crazy talk, sloccount is just pulling numbers out of its rear aperture.  But after thinking about it, maybe these numbers aren’t so far off.  If you also consider the need for a project to mature over a period of time and consider bugs, testing, etc., then it’s not so much about how many lines of code you can crank out per hour, but more about how much effort is required to create a body of code that is functional, robust, and mature.  If you take a mature software package and work backwards, I suspect sloccount’s numbers would start to look much more reasonable.

Workload and Expectations?

I’ve spent my career in the trenches.  I never really was interested in the management track.  This clearly colors my perspective.  When I read generalized comments (like the article linked above) my first thought is to apply them to myself.

After reading the article about code complexity barriers, I immediately went out and evaluated several of my projects.  FlightGear = 264,000 lines of code.  My UAS autopilot is 41,000 lines of code.  My little summer side project is 6,000 lines of code (sloccount estimates that is 15.76 months worth of effort packed into about 2 months of time.)  My model airplane designer project that I’ve worked on during Christmas break the past 2 years is 5,600 lines of code.

I still don’t think we can put too much faith in sloccount’s exact numbers, but when I wonder about why I’m so overwhelmed and feel buried in complexity and deadlines and pressure, maybe this offers a little perspective, and maybe I’m doing ok, even if I feel like I’m coming up short of everyone’s expectations.  For the managers and entrepreneurs out there, maybe this can guide your expectations a bit more accurately.