Sunset Flight

This is Skywalker Flight #74, flown on Sept. 7, 2017. It ended up being a 38.5 minute flight, scheduled to land right at sunset. The purpose of the flight was to carry insect traps at 300′ AGL and collect samples of whatever might be flying up at that altitude.

What I like about this flight is that the stable sunset air leads to very consistent autopilot performance. The circle hold is near perfect. The altitude hold stays within +/- 2 meters (usually much better), despite the continually varying bank angles required to hold a perfect circle shape in 10 kt winds.

The landing at the end is 100% autonomous and I trusted it all the way down, even as it dropped in between a tree line and a row of turkey barns. The whole flight is presented here for completeness, but feel free to skip to Part 3 if you are interested in seeing the autonomous landing.

As an added bonus, stick around after I pick up the aircraft and walk it back. I pan the aircraft around the sky and you can clearly see the perfect circle hold as well as the landing approach. I use augmented reality techniques to overlay the flight track history right into the video. I think it’s kind of a “cool tool” for analyzing your autopilot and EKF performance.

Field Comparison of MPU-6000 vs. VN-100T

The U of MN UAV Lab has flown a variety of sensors in aircraft, ranging from the lowly MPU-6000 (such as is found on an Atmel-based APM2 board) all the way up to an expensive temperature-calibrated VectorNav VN-100T. I wish to present a quick field comparison of these two sensors.

Hobby King Skywalker with MPU-6000.
Sentera Vireo with VectorNav VN-100T onboard.

[Disclaimers: there are many dimensions to any comparison and many individual use cases; the VN-100T has many features not found on a cheap MPU-6000; and the conditions of this test are not perfectly comparable: two different aircraft flown on two different days. These tests were performed with a specific MPU-6000 and a specific VN-100T, so I can’t say they predict the performance of any other individual IMU. Both sensors are sampled externally at 100 Hz. Internally the MPU-6000 is sampled at 500 Hz and filtered. I suspect the VN-100T is outputting unfiltered values, but that is just a guess from the plot results.]

The point of this post is not to pick on the expensive solution; a perfectly level playing field is hard to arrange for any comparison. But hopefully I can show that in many ways the less expensive solution may not be as bad as you thought, especially with a little calibration help.

Exhibit A: Raw gyro data in flight

I will mostly let the plots speak for themselves. They share the same vertical scale and cover about the same time span. The less expensive sensor is clearly less noisy. This trend holds up when the two sensors are motionless on the ground.

MPU-6000 gyros (100 seconds of flight)
VN-100T gyros (100 seconds of flight)
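If you want to put a number on “less noisy” instead of eyeballing the plots, here is a minimal python sketch of one way to do it. The CSV file and the p/q/r column names are hypothetical, so adjust them to your own logger; a moving average is subtracted first so we measure noise rather than real aircraft motion:

import numpy as np
import pandas as pd

log = pd.read_csv('flight.csv')     # hypothetical flight log
for axis in ['p', 'q', 'r']:
    gyro = log[axis].to_numpy()
    # subtract a 0.5 sec moving average (50 samples at 100 Hz) so the
    # slowly varying real motion drops out and mostly noise remains
    smooth = np.convolve(gyro, np.ones(50) / 50, mode='same')
    print(axis, 'noise std: %.4f rad/sec' % np.std(gyro - smooth))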

Exhibit B: Raw accelerometer data in flight

Again, the plots speak for themselves. Given the same vertical and horizontal scales, the less expensive sensor is far less noisy.

MPU-6000 accelerometers (100 seconds of flight)
VN-100T accelerometers (100 seconds of flight)

Exhibit C: Sensor bias estimates

The UMN UAV lab flies its aircraft with our own high-fidelity EKF library. The EKF is the fancy mathematical code that estimates the aircraft state (roll, pitch, yaw, position, velocity, etc.) given raw IMU and GPS inputs.

One of the arguments for an expensive temperature-calibrated sensor over a cheap one is that you are paying for high-quality temperature calibration in the expensive model. This is a big deal with MEMS IMU sensors, because temperature can have a big effect on the bias and accuracy of the sensor.

This next pair of plots requires a bit of explanation (please read this first!):

  • For every time step, the UMN EKF library estimates the bias (or error) of each individual gyro and accelerometer (along with the aircraft attitude, position, and velocity).
  • We can plot these bias estimates over time and compare them.
  • The bias estimates are just estimates, and other errors in the system (like GPS noise) can make them jump around.
  • I have developed a temperature calibration process for the inexpensive IMUs. This process is being used to pre-correct the MPU-6000 sensor values in flight. The correction process uses past flight data to develop a temperature calibration fit; the more you fly, and the bigger the range of temperatures you fly in, the better the calibration becomes. (See the sketch just after this list.)
  • Just to be clear: for these final plots, the MPU-6000 is using our external temperature calibration process, derived entirely from past flight data. The VN-100T is running its own internal temperature calibration.
  • These bias estimates are not perfect, but they give a suggestion of how well the IMU is calibrated. Higher bias values suggest a larger calibration error.
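To make the calibration idea concrete, here is a minimal python sketch of the kind of temperature fit described above. The data values and the choice of a quadratic are my own illustrative assumptions, not the exact UMN process:

import numpy as np

# accumulated from past flights: IMU temperature (C) paired with the
# EKF's gyro bias estimate (rad/sec) at that temperature (made-up data)
temps  = np.array([ 5.0,  12.0,  18.0,  25.0,   31.0,   38.0])
biases = np.array([0.021, 0.014, 0.009, 0.002, -0.004, -0.012])

# fit a low-order polynomial of bias vs. temperature
coeffs = np.polyfit(temps, biases, deg=2)

def correct(raw_gyro, temp_C):
    """Pre-correct a raw gyro reading using the temperature fit."""
    return raw_gyro - np.polyval(coeffs, temp_C)

print(correct(0.015, 20.0))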

MPU-6000 sensor bias estimates from UMN EKF library.

VN-100T sensor bias estimates from UMN EKF library.

What can we learn from these plots?

  • The MPU-6000 gyro biases (estimated) are approximately 0.05 deg/sec.
  • The VN-100T gyro biases (estimated) are as high as -1.0 deg/sec in roll and 0.35 deg/sec in yaw.
  • The MPU-6000 accel biases (estimated) are in the 0.1 m/s^2 range.
  • The VN-100T accel biases (estimated) are also in the 0.1 m/s^2 range.

In some cases the MPU-6000 with external temperature calibration appears to be more accurate than the VN-100T, and in some cases the VN-100T does better.

Summary

By leveraging a high-quality EKF library and a bit of clever temperature calibration work, an inexpensive MPU-6000 seems to be able to hold its own quite well against an expensive temperature-calibrated MEMS IMU.

Drosophila-nator (Prototype)

This is a joint Entomology / Aerospace project to look for evidence that the Spotted Wing Drosophila (an invasive species in North America) may be migrating at higher altitudes, where wind currents can carry them further and faster than otherwise expected.

Skywalker 1900 outfitted with 2 petri-dish sized insect traps.

Skywalker Flight #69

Altitude: 200′ AGL
Airspeed: 20 kts
Weather:  10 kts wind, 22C
Mission: Circle fruit fields with insect traps.

Skywalker Flight #70

Altitude: 300′ AGL
Airspeed: 20 kts
Weather:  12 kts wind, 20C
Mission: Circle fruit fields with insect traps.

Skywalker Flight #71

Altitude: 400′ AGL
Airspeed: 20 kts
Weather:  13-14 kts wind, 20C
Mission: Circle fruit fields with insect traps.

Flying on the Edge of a Storm

This is a follow-up to my eclipse post. I was forced to end my eclipse flight 10 minutes before the peak because a line of rain was just starting to roll over the top of me. I waited 20-30 minutes for the rain to clear and launched a post-eclipse flight that lasted just over an hour.

Here are some interesting things in this set of flight videos:

  • You will see the same augmented reality heads-up display and flight track rendering. This shows every little blemish in the sensors, EKF, flight control system, and airplane! It’s a great testing and debugging tool if you really hope to polish your aircraft’s tuning and flight performance.
  • IT IS WINDY!!!! The Skywalker cruises at about 20 kts indicated airspeed. Winds aloft were pushing 16 … 17 … 18 kts sustained. At one point in the flight I recorded 19.5 kt winds.
  • At t=2517 (there is a timer in seconds in the lower left corner of the HUD) we actually get pushed backwards for a few seconds.  How does your autopilot navigation work when you are getting pushed backwards by the wind?!?  You can find this about 20 seconds into Part #3.  Check it out. 🙂
  • In the 2nd half of the flight the winds transition from 16-17 kts and fairly smooth to 18-19 kts and violent. The poor little Skywalker gets severely thrashed around the sky in places. Still, it and the autopilot seem to handle it pretty well. I was a bit white-knuckled watching the flight unfold from the ground, but the onboard HUD shows the autopilot was pretty relaxed and handled the conditions without really breaking a sweat.
  • When the winds really crank up, you will see the augmented flight track pass by sideways just after we pass the point in the circle where we are flying directly into the wind … the airplane is literally flying sideways relative to the ground when you see this.
  • Does this bumpy, turbulent video give you a headache? Stay tuned for an upcoming video in super smooth air with a Butterworth filter on my airspeed sensor. (A minimal filtering sketch follows this list.)
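For the curious, here is a minimal python sketch of what that airspeed filtering might look like with SciPy. The sample rate and cutoff frequency are illustrative guesses, not the values I actually fly:

import numpy as np
from scipy import signal

fs = 100.0      # airspeed sample rate, Hz (illustrative)
cutoff = 2.0    # low-pass cutoff, Hz (illustrative)

# 2nd order low-pass Butterworth filter
b, a = signal.butter(2, cutoff, btype='low', fs=fs)

airspeed_raw = np.loadtxt('airspeed.txt')    # hypothetical logged data
# filtfilt is zero-phase for offline plots; use signal.lfilter on board
airspeed_smooth = signal.filtfilt(b, a, airspeed_raw)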

Note: the Hobby King Skywalker (1900mm) aircraft flown in this video has logged 71 flights, 31.44 hours in the air (1886 minutes), and covered 614 nautical miles (1137 km) across the ground.

Flying on the Edge of an Eclipse (2017)

On August 21, 2017 a total solar eclipse sliced a shadowy swath across the entire continental USA. The totality area missed Minnesota by a few hundred miles, so we only saw about 85% obscuration at our peak.

I thought it could be interesting to put a UAV in the sky during our partial eclipse and record the flight. I didn’t expect too much, but you never know. In the end we had a line of rain move through a few minutes before the peak, and it was really hard to say whether the temperature drop and dimming light were due to the rain or the eclipse.

Still, I needed to test some changes in the AuraUAS flight controller and was curious to see how the TECS system would fly with a completely unfiltered/raw/noisy airspeed input.  Why not roll all that together and go test fly!

Here is the full video of the 37 minute flight.  Even though this is slightly boring flight test video, you might find a few interesting things if you skip around.

  • I talk at the start and the end of the flight.  I can’t remember what I said, but I’m sure it is important and insightful.
  • I rendered the whole live flight track in the video using augmented reality techniques.  I think that’s pretty cool.
  • If you skip to the end, I pick up the plane and walk back to the runway. I think that is the most fun part. There I pan the airplane around the sky and show my flight path and approach drawn right into the real video using the same augmented reality techniques.
  • We had 100% cloud cover and zero view of the sun/moon. But that doesn’t stop me from drawing the sun and moon in the view where they actually are. Not surprisingly, they are sitting almost exactly on top of each other. You can see this at the end of Part 3.
  • I flew a fully autonomous landing on this flight. It worked out pretty well and shows up nicely at the end of Part 3. If anyone is interested, the auto-land task is written as an embedded python script and runs right on board the main flight controller. I think that might be pretty cool for people who are python fans. If you want to geek out on the details you can see the whole landing script here: https://github.com/AuraUAS/aura-core/blob/master/src/mission/task/land2.py  (Then go watch it in action at the end of Part #3.)

Aerial Deer

Last Friday I flew an aerial photography test flight using a Skywalker 1900 and a Sony A6000 camera (with a 20mm lens). On final approach we noticed a pair of deer crossing under the airplane. I went back through the image set to see if I could spot the deer in any of the pictures. I found at least one deer in 5 different shots. Here are the zoom/crops:

Continuously Self Calibrating UAV Compass

Manual UAV sensor calibration is dead!

I know the above statement isn’t exactly true, but it could be true if everyone who develops UAVs would read this article. 🙂

In this article I propose a system that continuously and dynamically self-calibrates the magnetometers on a flying UAV, so that manual calibration is never needed again.

With traditional UAVs, one of the most important steps before launching is calibrating the magnetometers. However, magnetometers are also one of the most unpredictable and troublesome sensors on your UAV. Electric motors, environmental factors, and many other things can significantly interfere with the accuracy and consistency of the magnetometer readings. Your UAV uses the magnetometer (electronic compass) to compute its heading and thus navigate correctly through the sky.

We can accurately estimate heading without a compass

Most UAV autopilots do depend on the magnetometer to determine heading. However, there is a class of “attitude estimators” that run entirely on gyros, accelerometers, and GPS. These estimators can accurately compute roll, pitch, and yaw by fitting the predicted position and velocity to the actual position and velocity each time a new GPS measurement is received. In order for these estimators to work, they require some amount of variation in GPS velocity. In other words, they don’t estimate heading very well when you are sitting motionless on the ground, hovering, or flying exactly straight and level for a long time.
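To illustrate why motion makes heading observable, here is a toy numpy sketch (a simplified illustration, not the UMN filter’s actual update). If the yaw estimate is off, the acceleration the filter integrates in the navigation frame is rotated by that error, so the predicted velocity diverges from the GPS velocity by an amount proportional to the yaw error:

import numpy as np

def rot_z(yaw):
    """Body -> navigation frame rotation for a yaw-only attitude (2D)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s], [s, c]])

true_yaw = np.radians(30.0)
est_yaw  = np.radians(35.0)        # the filter is 5 degrees off

accel_body = np.array([2.0, 0.0])  # aircraft accelerates straight ahead
dt = 1.0

dv_true = rot_z(true_yaw) @ accel_body * dt   # what GPS will observe
dv_pred = rot_z(est_yaw)  @ accel_body * dt   # what the filter predicts

print('velocity residual (m/s):', dv_true - dv_pred)
# The residual is nonzero only while acceleration is nonzero, which is
# why these estimators need some variation in GPS velocity to find yaw.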

A compass is still helpful!

For fixed-wing UAVs it is possible to fly entirely without a magnetometer and still estimate the aircraft attitude accurately throughout the flight. However, this requires some carefully planned motion before launch to help the attitude estimate converge. A compass is obviously still helpful to improve the attitude estimate before launch and in the initial moments of the launch, before the inertial-only system sees enough change in velocity to converge well on its own. A compass is also important for low-dynamic vehicles like a hovering aircraft or a surface vehicle.

Maths and stuff…

The inertial-only (no compass) attitude estimator I fly was developed at the University of Minnesota UAV lab. It is a 15-state Kalman filter and has been at the core of our research here for over a decade. A Kalman filter is a bit like a fancy on-the-fly least-squares fit of a bunch of complicated, interrelated parameters. It is based on minimizing statistical measures and is pretty amazing when you see it in action. However, no amount of math can overcome bad or poorly calibrated sensor data.

So Here We Go…

First of all, we assume we have a Kalman filter (attitude and location estimator) that works really well when the aircraft is moving, but not so well when the aircraft is stationary or hovering. (And we have that.)

We know our location and time from the GPS, so we can use the World Magnetic Model 2015 to compute the expected 3d vector that points at the magnetic north pole. We only have to do this once at the start of the flight, because for a line-of-sight UAV, position and time don’t change all that much (relative to the magnetic pole) during a single flight.

During the flight we know our yaw, pitch, and roll. We know the expected direction of the magnetic north pole, so we can combine all of this to compute what our magnetometer reading should be. This predicted magnetometer measurement changes as the aircraft orientation changes.
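Here is a minimal numpy sketch of that prediction step. The magnetic field vector would come from a World Magnetic Model lookup for your location and date (hard-coded here as an illustrative mid-latitude value); the rotation is the standard aerospace yaw-pitch-roll sequence bringing the navigation-frame (NED) vector into the body frame:

import numpy as np

def euler_to_dcm(yaw, pitch, roll):
    """Direction cosine matrix: navigation (NED) frame -> body frame,
    standard aerospace Z-Y-X rotation sequence, angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
    return Rx @ Ry @ Rz

# Unit vector toward magnetic north in NED coordinates. In practice this
# comes from a WMM lookup once at the start of the flight; this value is
# just illustrative of a northern mid-latitude location.
world_mag_ned = np.array([0.47, 0.02, 0.88])
world_mag_ned /= np.linalg.norm(world_mag_ned)

# current EKF attitude (radians)
yaw, pitch, roll = np.radians([90.0, 5.0, -10.0])

# what a perfectly calibrated magnetometer should read right now
mag_predicted = euler_to_dcm(yaw, pitch, roll) @ world_mag_ned
print(mag_predicted)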

The predicted magnetometer measurement and the actual magnetometer measurement are both 3d vectors. We can separate these into their individual X, Y, and Z components and build up a linear fit in each axis separately. This fit can be efficiently refined during flight in a way that expires older data and factors in new data. When something changes (like traveling to a new operating location, or rearranging something on our UAV) the calibration will always be fixing and improving itself as soon as we launch our next flight.

Once we have created the linear fit between the measured and expected magnetometer values in all 3 axes, we can use it to convert our raw magnetometer measurements into calibrated magnetometer measurements. We can feed the calibrated measurements back into our EKF to improve our heading estimate when the UAV is on the ground or hovering.
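Here is a minimal sketch of one way to maintain that per-axis fit in flight: a recursive least-squares update with an exponential forgetting factor, so old data gradually expires as new data arrives. The structure and the forgetting factor are my own illustrative choices, not necessarily what the AuraUAS code does internally:

import numpy as np

class AxisFit:
    """Running linear fit, predicted ~ a * measured + b, with forgetting."""
    def __init__(self, forget=0.999):
        self.forget = forget              # closer to 1.0 = longer memory
        self.P = np.eye(2) * 1000.0       # parameter covariance
        self.w = np.array([1.0, 0.0])     # [a, b], start as identity

    def update(self, measured, predicted):
        h = np.array([measured, 1.0])
        K = self.P @ h / (self.forget + h @ self.P @ h)
        self.w = self.w + K * (predicted - h @ self.w)
        self.P = (self.P - np.outer(K, h @ self.P)) / self.forget

    def apply(self, measured):
        return self.w[0] * measured + self.w[1]

fits = [AxisFit(), AxisFit(), AxisFit()]         # X, Y, Z axes

# each time we have a raw reading and a prediction (previous sketch):
mag_raw = np.array([0.43, -0.11, 0.90])          # example raw reading
mag_predicted = np.array([0.47, -0.05, 0.88])    # from the EKF attitude
for i in range(3):
    fits[i].update(mag_raw[i], mag_predicted[i])
mag_calibrated = np.array([fits[i].apply(mag_raw[i]) for i in range(3)])
print(mag_calibrated)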

Does this actually work?

Yes, it does. In my own tests, I have found that my on-the-fly calibration produces a calibrated magnetometer vector that is usually within 5 degrees of the predicted magnetometer vector. This error is on par with the typical cumulative attitude errors of a small UAS Kalman filter, and on par with a typical hand calibration performed the traditional way.

Source Code

The source code for the University of Minnesota 15-state Kalman filter, along with a prototype self-calibration system and much more, can be found at my AuraUAS github page here: https://github.com/AuraUAS/navigation

This article is fairly rough and I’ve done quite a bit of hand waving throughout.  If you have questions or comments I would love to hear them and use that as motivation to improve this article.  Thanks for reading!

Autopilot Visualization: Flight Track


Circling back around to see our pre-flight, launch, and climb-out trajectory.

Augmented reality

Everything in this post shows real imagery taken from a real camera on a real UAV that is really in flight. Hopefully that is obvious, but I just want to point out that I’m not cheating. However, with a bit of math, a bit of camera calibration work, and a fairly accurate EKF, we can start drawing the locations of things on top of our real camera view. These artificial objects appear to stay attached to the real world as we fly around and through them. This process isn’t perfected, but it is fun to share what I’ve been able to do so far.
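Under the hood, the overlay is ordinary camera projection. Here is a minimal OpenCV sketch of drawing one flight-track point into a video frame; the calibration matrix, distortion, and camera pose values are placeholders (in practice the pose comes from the EKF attitude/position plus the camera mounting offset):

import numpy as np
import cv2

# camera calibration: intrinsic matrix and distortion (placeholders)
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 720.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)

# camera pose for this frame: world -> camera rotation and translation,
# derived from the EKF solution and the camera mount (placeholders)
rvec = np.zeros(3)
tvec = np.zeros(3)

# one point of the recorded flight track, in world coordinates (meters)
pt_world = np.array([[25.0, 4.0, 60.0]])

img_pts, _ = cv2.projectPoints(pt_world, rvec, tvec, K, dist)
u, v = img_pts[0, 0]

frame = np.zeros((1440, 1920, 3), np.uint8)   # stand-in for a video frame
cv2.circle(frame, (int(u), int(v)), 3, (0, 255, 0), -1)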

Landing approach to touchdown with circle hold tracks in the background. Notice the sun location is correctly computed for location and time of day.

For the impatient

In this post I share 2 long videos.  These are complete flights from start to finish.  In them you can see the entire previous flight track whenever it comes into view.  I don’t know how best to explain this, but watch the video and feel free to jump ahead into the middle of the flight.  Hopefully you can see intuitively exactly what is going on.

Before you get totally bored, make sure to jump to the end of each video.  After landing I pick up the aircraft and point the camera to the sky where I have been flying. There you can see my circles and landing approach drawn into the sky.

I think it’s pretty cool and it’s a pretty useful tool for seeing the accuracy and repeatability of a flight controller.  I definitely have some gain tuning updates ready for my next time out flying based on what I observed in these videos.

The two videos

Additional notes and comments

  • The autopilot flight controller shown here is built from a Beaglebone Black + MPU-6000 + uBlox 8 + ATmega2560.
  • The autopilot is running the AuraUAS software (not one of the more popular and well known open-source autopilots.)
  • The actual camera flown is a Runcam HD 2 in 1920×1440 @ 30 fps mode.
  • The UAV is a Hobby King Skywalker.
  • The software to post process the videos is all written in python + opencv and licensed under the MIT license.  All you need is a video and a flight log and you can make these videos with your own flights.
  • Aligning the flight data with the video is fully automatic (described in earlier posts here). To summarize, I compute the frame-to-frame motion in the video and automatically correlate that with the flight data log to find the exact alignment between the two (see the sketch after this list).
  • The video/data correlation process can also be used to geotag video frames … automatically … I don’t know … maybe to send them to some image stitching software.
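Here is a minimal numpy sketch of that correlation idea, assuming you have already reduced the video to a rotation-rate signal (from frame-to-frame feature motion) and resampled it and the logged gyro onto a common time base; the peak of the cross-correlation gives the time offset:

import numpy as np

fs = 30.0    # common resample rate, Hz (illustrative)

# rotation rate as seen by the video and as logged by the gyro,
# both resampled to fs (hypothetical precomputed arrays)
video_rate = np.load('video_rate.npy')
log_rate = np.load('log_rate.npy')

# normalize, then cross-correlate
v = (video_rate - video_rate.mean()) / video_rate.std()
g = (log_rate - log_rate.mean()) / log_rate.std()
corr = np.correlate(v, g, mode='full')

# best-fit lag in samples, converted to a time offset in seconds
lag = np.argmax(corr) - (len(g) - 1)
print('video leads log by %.3f seconds' % (lag / fs))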

If you have any questions or comments, I’d love to hear from you!

Zombie Door

Run!!!

Zombies are pretty cool.  This post describes something a little less cool, but uses zombies to explain the concept (in a shallow, transparent attempt to capture your attention!)

Zombie Door Method

Imagine we want to generate a uniformly distributed random sampling in some complex space that our random number generator does not directly support.

Let me start with a simple example. Imagine we have a random number generator that produces a random integer between 1 and 100. However, we actually want to generate random numbers between 41 and 50. (I know there are better ways to do this, but stick with me for this example.) Let’s solve this with the zombie door method.

  • Build a wall and label it 1-100 where 1 is to the far left, and 100 is to the far right.
  • Cut a door in your wall from 41 to 50.
  • Now create random zombies starting anywhere from 1 to 100 and let them walk straight towards the wall.
  • The zombies will lurch across the room and eventually hit the wall, explode, and dissolve in a fizzing puddle of bloody goo … or whatever it is that zombies do when they die.
  • The zombies that are lucky enough to walk through the door survive!

For this example it would be easier to simply scale and offset the usual random number generator that produces a floating point value between 0 and 1, but it illustrates the basic approach of the ‘zombie door’ method.
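In python, the zombie door version of this simple example is just a reject-and-retry loop:

import random

def zombie_door_int(lo=41, hi=50):
    """Release zombies on 1..100; only the ones hitting the door survive."""
    while True:
        z = random.randint(1, 100)   # a zombie lurches toward the wall
        if lo <= z <= hi:            # it found the door
            return z

print([zombie_door_int() for _ in range(10)])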

A More Interesting Example

Imagine we have an arbitrary polygon outline and we want to splatter it with random circles. However, we only want circles that are completely within the polygon. (And we want our random circles to be a true unbiased, uniformly distributed random sample.) This example is just like the simple ‘wall’ example except now we have gone ‘2D’.

Imagine an arbitrary polygon shape:

We would like to fill this shape with 200 random circles, making sure none of our circles straddle the boundary or lie outside the polygon.  We want an even distribution over the interior area of our polygon.

We can do this by generating random circles within the min/max range of our shape, and then testing if they lie completely inside the polygon.  We reject any outliers and keep the inliers.  It’s very simple, but very cool because … you know … zombies!  Here is a picture of how our circles turned out:

If you are at all curious as to what became of our dead zombies (all the ones that splatted against the wall) here is the picture of that:

If you, the reader, are concerned that I am making light of a very real pending zombie apocalypse issue, then here is my best tip for protecting your residence:

Finally, here is the python script I used to generate the pictures in this posting.  I leverage the very, very, very cool GPC polygon clipping library to test if the outline polygon ‘covers’ the circle.

#!/usr/bin/env python3

import random

# polygon (GPC) python package: https://pypi.python.org/pypi/Polygon2
# (the Polygon3 package provides the same API on python 3)
import Polygon
import Polygon.IO
import Polygon.Shapes

# this will seed the random number generator with current time and
# force different results on each run.
random.seed()

# this is the bounds of the shape that will contain our splatters
outline = [ [0.5, -0.01], [1.01, -0.01], [0.5, 1.01], [-0.01, 1.01] ]

# define circle properties
num_circles = 200
min_r = 0.002
max_r = 0.02

# no closer than this to the boundary
margin = 0
#margin = 0.002

# create the polygon template (outline)
template = Polygon.Polygon(outline)

# make the shape a bit more interesting
c = Polygon.Shapes.Circle(radius=0.4, center=(1, 0.5), points=32)
template = template - c
c = Polygon.Shapes.Circle(radius=0.3, center=(0, 1), points=32)
template = template - c

# determine max/min of template
min_x = max_x = outline[0][0]
min_y = max_y = outline[0][1]
for p in outline:
    if p[0] < min_x: min_x = p[0]
    if p[0] > max_x: max_x = p[0]
    if p[1] < min_y: min_y = p[1]
    if p[1] > max_y: max_y = p[1]

print('template bounds:', min_x, min_y, 'to', max_x, max_y)
print('radius range:', min_r, max_r)
print('margin:', margin)
print('num circles:', num_circles)

# generate splats using zombie door method
circles = []
discarded = []
while len(circles) < num_circles:
    x = random.uniform(min_x, max_x)
    y = random.uniform(min_y, max_y)
    r = random.uniform(min_r, max_r)

    # make the circle
    c = Polygon.Shapes.Circle(radius=r, center=(x, y), points=32)

    # make the circle padded with extra margin
    cm = Polygon.Shapes.Circle(radius=(r+margin), center=(x, y), points=32)

    if template.covers(cm):
        # circle + margin fully contained inside the template
        circles.append(c)
    else:
        discarded.append(c)

# assemble final polygons and write output
final = Polygon.Polygon()
for c in circles:
    final += c
Polygon.IO.writeGnuplot('in.plt', [template, final])
Polygon.IO.writeSVG('in.svg', [final], fill_color=(0,0,0))

reject = Polygon.Polygon()
for c in discarded:
    reject += c
Polygon.IO.writeGnuplot('out.plt', [template, reject])
Polygon.IO.writeSVG('out.svg', [reject], fill_color=(0,0,0))

Failure is not fatal

This post is penned during a moment of extreme frustration, beware!

Kobayashi Maru

https://en.wikipedia.org/wiki/Kobayashi_Maru

One of the reasons I loved the original Star Trek series is because no matter what the odds, no matter how hopeless the circumstances, no matter how impossible the foe, Captain Kirk always found a way to think his way out of the mess.  He never ultimately failed or lost to an opponent, not once, not ever.  That makes a great hero and fun TV!  Fictional super heroes do things that normal human beings could never possibly do … like fly, or be stronger than steel, or always win.

Stress and the Brain

I don’t have time to read stuff like the following link, especially when I’m coming up short of a promised deadline.  Maybe you do?  http://www.health.harvard.edu/staying-healthy/understanding-the-stress-response

I’m told that when we begin to get stressed, the front area of the brain that is responsible for logic and reason starts to shut down, and command functions begin to be transferred back to the “fight or flight” portion of the brain. I think about standing up in front of a group and speaking, then sitting down and wondering what I even said. I think about arguments that got out of hand. Where was the front part of my brain in all of that? I think about looming deadlines and mounting stress … and … and … and mounting stress!

Recursive Stress

My job largely amounts to puzzle solving. I love the process and I love finding clever solutions. But if I ask you a riddle or give you a logic problem, can you give me a specific estimate of how much time it will take you to solve it? That’s not how puzzle solving works; it’s not a step-by-step recipe that leads to a solution in a known time. Failing to solve the problem in time stresses me out! What is needed in these situations is clear, logical, and calm thinking. But that is the first part of the brain to turn off in stressful situations! It’s exactly the part of the brain we desperately need the most. I know all this, and I watch helplessly as it happens. What does that create? More stress, of course, which accelerates the process of losing the most important part of my brain!

What is the solution?

No, seriously, what is the solution???

People often say they do their best work under pressure.  I know for myself, I do my worst work under pressure.  I strive whenever possible to get a long head start on a complex and difficult task.  I strive whenever possible to identify and solve the hardest parts of the task first.  But that isn’t always possible.

So instead I sometimes see failure coming weeks away, maybe like an asteroid on a collision course with earth.  I’m very serious about the task, I do everything I possibly can, I pour in all my energy and expertise, but it’s not always enough.  Things I thought would be easy turn out to be 10x more difficult than imagined.  Things that were working break for unexpected reasons.  Things that shouldn’t take time, take way too much precious time.

Captain Kirk to the rescue? Sadly, no … he is a fictional character. In the real world the asteroid looms bigger and bigger; its trajectory is a mathematical certainty. The physics of the impact can largely be predicted. At some point it becomes clear my efforts will fall short and there’s nothing left to do but watch the asteroid in its last few hours of flight. Then <boom>.

Is it just me that fails colossally?

It usually seems like I’m the only one who makes a miserable mess of the things I try to do, the things I’ve promised I could do, the things I’ve been paid to do. Everyone else is posting about their giant successes on Facebook. Everyone else’s resume is a spotless collection of triumphs. But not me. Maybe once or twice I got lucky, and the rest of the time is a huge struggle! Honestly though, the only reason I’m posting this is because I know it’s not just me. Any sports fans out there? How many teams and players succeed in winning the championship every season? What percentage of players ever win a championship in their whole career? Political success and failure? How many new businesses succeed versus fail?

High Profile Failures

By mentioning specific companies, I don’t mean to single out specific people or imply anything negative. My intent here is to show that we are all in this together and that we all, even the best and most successful of us, suffer setbacks in our work. I live and work in the world of drones (or small unmanned aerial systems, aka UAS’s). This is a tough business. For all the hype and excitement, even big companies can struggle. GoPro recently did a full recall of their new drone product. Hopefully they’ll try again in 2017, and hopefully the process will go better for them. Recently 3DR, the king of DIY drones, announced they were cancelling all their hardware efforts to focus on a software package for managing data collected by drones. Parrot (another big name in the small drone market) just announced layoffs. Edit: 12 Jan, Lily just announced it is dropping out of the drone race and shutting down. Edit: Facebook’s Aquila crashed on its first test flight. Edit: Titan (Google’s own high-altitude solar-powered effort) is shut down. It’s tough, even for the big guys with enough money to hire the best engineers and managers and do professional marketing.

There are even higher profile failures than these … Chernobyl, Tiger Woods, the Titanic, the Exxon Valdez, the Challenger, and most Sylvester Stallone movies except for Rocky I.

The Aftermath

So the asteroid hit. In the last moments we just threw up our hands, gave up, and watched it come in and do its destruction. The dust is settling; what happens next? Maybe the asteroid wasn’t as big as we imagined? Maybe the damage was not as severe? Maybe life goes on. In a work context, maybe you take a big professional hit to your reputation? Maybe you don’t? Maybe it’s about survival and living to fight another day?

Failures suck.  Failures happen to everyone.  Failures are [usually] not fatal.  The sun will still rise tomorrow for most of us.

Survival – Yes?!?

If you are reading this, you are still here and still surviving. That’s great news! Hopefully I’m here too! Let’s all live to fight another day. Let’s all help each other out when we can! There is a word called “grace” which is worth looking up if you don’t know what it means. It’s a quantity that we all need more of, and we all need to give it to each other in big healthy doses!

“Success is not final; failure is not fatal. It is the courage to continue that counts.” — Budweiser ad, 1938.  (Not Winston Churchill)