Raster Image Processing Tips and Tricks — Part 1: Georeferencing

By Allison Muise

This is the first in a series of blog posts that will cover some tips and tricks for working with aerial images, and is based on a project I recently completed looking at the structure of a seabird colony off the coast of Nova Scotia, Canada.

Project overview

For this project, I had two high-resolution aerial images (22cm resolution) of the island that I needed to mosaic into a single image. I needed the image to be georeferenced to a known coordinate system, which was very problematic considering the nearest known survey point was 500 kilometers away! As it turned out, the only control source I had for georeferencing was a LiDAR image with 1 meter resolution, more than four times coarser than my aerial imagery. Using the aerial imagery and the LiDAR dataset, I was able to create a detailed vegetation map of the island.

The basic workflow of the project is shown below. Each part is a separate blog post.

Georeferencing imagery is a critical starting point for many projects, as any errors will be passed along to the other project components (such as trail maps or vegetation classifications). In this post, I will show how to increase the accuracy of your georeferencing results, especially when working with less-than-ideal images that present issues such as differing resolutions between the target and reference data, or an uneven distribution of features on which to place links.

Recognize the limitations of your images

Images of street networks or residential areas are quite straightforward to georeference. They contain features with clearly defined corners and edges which are relatively easy to link to their matching features in projected datasets. Unfortunately, most images are not ideal. Georeferencing the imagery for this project, shown below in Figures 1, 2 and 3, was troublesome in several ways:

  • Different resolutions: No existing maps covered the island in sufficient detail to serve as reference data for georeferencing this imagery, so the intensity grid from the LiDAR survey was selected as reference data. The imagery is 0.22m resolution and the LiDAR is 1m resolution, meaning that many of the features which are clearly defined in the imagery are highly pixelated, or too small to see at all, in the LiDAR data.
  • Feature dispersion: The imagery and the intensity grid display different reflectance information about the surface; the RGB imagery shows reflectance of the visible wavelengths, while the intensity grid shows reflectance of the laser wavelength. For this reason, some features are more visible in the LiDAR and others are more visible in the imagery. Of the features which are clearly visible in both the LiDAR and the aerial imagery, most are clumped into two areas. Even when other, less traditional features such as trees, rocks and distinct areas of vegetation are considered, there are wide areas of the island with few discernible features that can act as links between the projected and non-projected data.

Once these obstacles are overcome, the images can be georeferenced and the two images can be mosaicked into a single image of the island for further processing.

Figure 1: Images along flight path

Figure 2: LiDAR intensity raster to which the imagery will be georeferenced

Figure 3: Adjusting the transparency shows the differences in resolution between the 1m LiDAR raster (dark shadow) and the 22cm RGB imagery, and highlights the difficulty of placing accurate links, even on features as clear as foundations.

The difference in resolution was my first challenge to overcome. Many of the very distinct features in the aerial imagery, such as individual rocks, are simply too small to appear in the much lower-resolution LiDAR raster. This means that larger, less distinct objects had to be chosen to act as control points. Less distinct control points make it harder to identify corresponding locations between the non-projected and projected imagery, which can decrease the quality of your links. Tricks such as placing the link at the center of a feature with indistinct edges can be helpful, as the center is likely to remain constant regardless of where the actual edges fall.

With images like these, it’s also important to remember that more links are not always better. It’s generally better to have good links spread as evenly as possible throughout the area of interest than it is to put a link on every identifiable object. In imagery like this, that would result in high concentrations of links in some areas while others have very few. Grouping too many links together will skew the image and make it less accurate in the areas with fewer links. In this case, I wasn’t worried about skewing the ocean, so I concentrated on evenly spacing links across the island.

Maximize the value of your links

As you place your links, keep an eye on your RMS (Root Mean Square) value in the links table (Figure 4).

Figure 4: Placing and evaluating georeferencing links.

 

Ideally, the RMS value should be less than or equal to the coarsest resolution in play, in this case 1 meter. The individual residuals for each link can be telling as well: if a particular link has a residual well above the others, it may have been poorly placed and should probably be deleted.
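The total RMS error is simply the root mean square of the individual link residuals, so it is easy to reproduce outside of ArcMap if you want to see how much a single suspect link contributes before deleting it. The snippet below is a minimal plain-Python sketch; the residual values are hypothetical placeholders, not values from this project.

```python
import math

# Hypothetical link residuals in map units (meters), as read from the links table.
residuals = [0.6, 0.8, 1.1, 0.5, 2.9, 0.7]

def rms(values):
    """Root mean square: the square root of the mean of the squared values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

print("Total RMS error: %.2f" % rms(residuals))

# A single residual well above the rest drags the total RMS up; recomputing
# without the worst link shows how much that one link contributes.
worst = max(residuals)
trimmed = [v for v in residuals if v != worst]
print("RMS without the worst link (%.1f): %.2f" % (worst, rms(trimmed)))
```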

A lot of time can be spent zooming around the imagery looking for good control points on which to place links. You can save yourself some clicking and scrolling by making use of the Magnifier window (Figure 5). You can move the magnifier around your image at whatever magnification you choose, and place links right inside the magnifier.

Figure 5: Using the Magnifier window to zoom in to a building on the island. The RGB imagery is semi-transparent and overlays the LiDAR imagery.

 

The Transparency, Swipe and Flicker tools, found on the Effects toolbar, can also be helpful. Transparency allows you to see through one layer to another and can be a huge help in finding related objects in two images. Some features become hard to see with transparency applied, but the Flicker tool will quickly turn one layer on and off, revealing the layer below with no loss of color or detail. Once you have placed a link or two, use the Swipe tool shown in Figure 6 to assess how well the images are aligning by peeling back the top image to reveal the layers underneath.

Figure 6: Assessing the quality of the links using the swipe tool to peel back the RGB

 

The transformation algorithm determines how the image will be adjusted to conform to the coordinate system of the reference imagery (i.e. the LiDAR raster) and can dramatically impact how well your control points align. Generally, a 1st degree polynomial should be sufficient, but if your image is warped, or if the features of interest and your control points are unevenly distributed, a 2nd degree polynomial transformation may be required to obtain good alignment. If you find it necessary to go to a 2nd degree or higher polynomial, even distribution of links throughout the area of interest becomes very important to prevent warping your image further. The effects of 1st, 2nd and 3rd degree polynomial transformations on this imagery can be seen in Figure 7.

Figure 7: The effects of three polynomial transformations on the same imagery using the same georeferencing links. Note the warping along the edges of the images while the island remains largely unchanged with the 3rd degree polynomial.
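To make the difference between the transformation orders in Figure 7 more concrete: a 1st degree (affine) polynomial maps each source coordinate as x' = a0 + a1·x + a2·y (and similarly for y'), while a 2nd degree polynomial adds x², x·y and y² terms, which is what allows it to bend the image, for better or worse. The sketch below fits a 1st degree transformation to a handful of hypothetical link pairs using NumPy least squares; it illustrates the underlying math only and is not ArcMap's internal implementation.

```python
import numpy as np

# Hypothetical links: (x, y) in image space and the matching (x, y) in the
# LiDAR raster's coordinate system. Values are illustrative only.
source = np.array([[10.0, 12.0], [480.0, 15.0], [500.0, 760.0],
                   [20.0, 740.0], [250.0, 400.0]])
target = np.array([[35210.1, 49880.4], [35315.6, 49878.9], [35319.2, 50045.0],
                   [35212.7, 50041.3], [35264.8, 49960.2]])

# 1st degree (affine) design matrix: columns 1, x, y.
A1 = np.column_stack([np.ones(len(source)), source[:, 0], source[:, 1]])

# Solve x' = a0 + a1*x + a2*y and y' = b0 + b1*x + b2*y by least squares.
coef_x = np.linalg.lstsq(A1, target[:, 0], rcond=None)[0]
coef_y = np.linalg.lstsq(A1, target[:, 1], rcond=None)[0]

# Residual distance for each link, and the overall RMS error.
pred = np.column_stack([A1 @ coef_x, A1 @ coef_y])
residuals = np.linalg.norm(pred - target, axis=1)
print("Per-link residuals:", np.round(residuals, 2))
print("RMS error: %.2f" % np.sqrt(np.mean(residuals ** 2)))

# A 2nd degree polynomial simply adds x*x, x*y and y*y columns to the design
# matrix (6 coefficients per axis), which is why it needs at least 6 links,
# and why those links should be spread evenly across the area of interest.
```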

Georeference the Image

Georeferencing the imagery is as simple as selecting either ‘Rectify’ or ‘Update Georeferencing’ from the menu on the Georeferencing Toolbar. ‘Rectify’ will create a new, georeferenced raster, whereas ‘Update Georeferencing’ simply saves the transformation information in accompanying .AUX.XML and .XML files.
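If you would rather script this step, or need to repeat it for several images, the Warp geoprocessing tool can produce a rectified raster from a list of source and target control points and a chosen polynomial order. The arcpy sketch below is a rough scripted counterpart to Rectify under that assumption; the paths and link coordinates are placeholders, and the two point lists must be in matching order.

```python
import arcpy

# Placeholder paths -- substitute your own aerial image and output raster.
in_raster = r"C:\project\island_photo_1.tif"
out_raster = r"C:\project\island_photo_1_rect.tif"

# Source (image space) and target (LiDAR coordinate system) link coordinates,
# listed in matching order; these values are illustrative only.
source_pts = "'10 12';'480 15';'500 760';'20 740'"
target_pts = "'35210.1 49880.4';'35315.6 49878.9';'35319.2 50045.0';'35212.7 50041.3'"

# POLYORDER1 = 1st degree polynomial; CUBIC resampling suits continuous imagery.
arcpy.Warp_management(in_raster, source_pts, target_pts, out_raster,
                      "POLYORDER1", "CUBIC")
```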

More Information

Learn more about georeferencing a raster dataset

Basic steps to georeference a raster dataset

Watch a georeferencing video

Learn more about the Georeferencing Toolbar

Learn more about the Effects Toolbar

Thank you to the Applied Geomatics Research Group of Middleton, Nova Scotia, Canada, for the imagery.
