Georeferencing is the process of mapping a pixel in an image to a real-world latitude, longitude, and altitude. When UAS flight data is tightly integrated with the camera and imagery data, aerial images can be placed and oriented directly on a map, and any image feature can be located directly, without the need for feature detection, feature matching, or image stitching. The promise of "direct georeferencing" is the ability to provide useful maps and actionable data immediately after a flight concludes.
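To make the idea concrete, here is a minimal sketch of the geometry involved (not the project's actual code): given the camera's position and attitude in a local NED frame plus a pinhole intrinsic matrix, a pixel's viewing ray can be intersected with the ground. It assumes a flat ground plane and a nadir-mounted camera with the image top facing forward; all function names and parameters here are illustrative.

```python
import numpy as np

def rotation_body_to_ned(yaw, pitch, roll):
    """Body-to-NED rotation from aerospace Z-Y-X Euler angles (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

# Camera axes expressed in the body frame for a nadir camera whose image
# top points forward: image-x -> body-y (right), image-y -> body-(-x),
# optical axis -> body-z (down).
CAM_TO_BODY = np.array([[0.0, -1.0, 0.0],
                        [1.0,  0.0, 0.0],
                        [0.0,  0.0, 1.0]])

def georef_pixel(u, v, K, cam_pos_ned, yaw, pitch, roll, ground_down=0.0):
    """Intersect the ray through pixel (u, v) with a flat ground plane.

    cam_pos_ned is the camera position in local NED meters (down positive,
    so an aircraft 100 m up has down = -100). Returns the NED ground point.
    """
    # Back-project the pixel into a camera-frame ray using the intrinsics.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate the ray into the NED frame.
    ray_ned = rotation_body_to_ned(yaw, pitch, roll) @ CAM_TO_BODY @ ray_cam
    # Solve cam_pos.down + t * ray.down = ground_down for t, then step along.
    t = (ground_down - cam_pos_ned[2]) / ray_ned[2]
    return cam_pos_ned + t * ray_ned
```

For example, with a 1000x1000 image, focal length 1000 px, and the camera 100 m above flat ground pointing straight down, the center pixel maps to the point directly beneath the aircraft, and a pixel 100 px to the right of center lands 10 m to the east. The real pipeline must of course also handle lens distortion, camera mounting offsets, and terrain elevation, which this sketch ignores.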
During the summer of 2014 I began investigating image stitching techniques and technologies for a NOAA-sponsored UAS marine survey project. In the summer of 2015 I was hired by the University of Minnesota Department of Aerospace Engineering and Mechanics to work on a precision agriculture project that also involves UASs and aerial image stitching.
Over the past few months I have developed a functional open-source image stitching pipeline written in Python with OpenCV. My intention with this series of blog posts is to introduce this work and explain our approach to aerial image processing and stitching.