Introduction

This manual serves as an important reference for users of the Center for Earth Observation (CEO), and especially students in the course Observing the Earth from Space (OEFS). It also provides a concise general introduction to the field of remote sensing and digital image analysis. Part I provides an introduction to the facilities of the lab and their use. Part II is an introduction to the fundamental characteristics and sources of remotely sensed data. Part III describes the major composite datasets used in the CEO. Basic image processing techniques are outlined in Part IV. Parts V and VI will steer you in the right direction to find more information on these topics.

The OEFS course is taught using the remote sensing software package ENVI from ITT Visual Information Solutions. Most of the specific instructions in this guide, as well as in the more extensive CEO Online Documentation section, focus on how to perform various tasks using either ENVI or ERMapper. Because no one software package will be best for every purpose, the CEO Lab also provides a variety of other leading geospatial software packages, including ERDAS Imagine and ERMapper by Leica Geosystems and ArcGIS by ESRI. Please see a member of the CEO staff if you have specific software needs.

Students taking the OEFS course should be aware that some of the information in this guide might change during the course of the semester. Such changes will be announced in class or lab and will most likely be posted on the Classes V.2 server. The Observing the Earth From Space web page is the source for the most up-to-date class information; check it regularly. Please report to the course instructors any inaccuracies or problems you discover with this guide or the lab.

I. Observing the Earth From Space

I.1 Help

There are several sources of help available to students in this course. Many of the most frequently asked questions are answered in this guide. You should check here when you have a question about the class or lab facilities. Several copies of the documentation for each of the software packages used in the course are located in the CEO lab, but the most up-to-date information will be contained in the individual software packages' help screens. Due to the limited number of copies, these documents must remain in the CEO lab!

In addition to these printed manuals, you should take advantage of several sources of on-line help.

For most topics, especially concerning ENVI or Observing the Earth From Space, use the web!

Several web pages provide very useful information for this course. From the CEO's home page you will be able to access the following pages, among others:

Use the CEO and OEFS email lists to contact other lab users.

Email is a powerful tool for getting help. There are mailing lists for students of the Observing the Earth From Space course ("oefs-list@pantheon") and for all Yale remote sensing users ("remote_sensing@panlists.yale.edu"). If the answer to your question is not in any of the above sources, you can send a question to all the people on the appropriate list via email. In most cases, someone else will have run into your problem before and will offer a solution, a suggestion, or at least moral support.

You may also consult the instructors and TAs for the Observing the Earth From Space course individually. Specific office hours will vary.

I.2 Using the CEO Facilities

The computing facilities of the Center for Earth Observation are located in Room 119 of the Environmental Science Center. Currently the lab houses 12 Dell Precision workstations running Windows XP Pro and connected to our server over a gigabit network. High-resolution color printing is available locally using the HP Color LaserJet 4600 printer. Currently we do not have a plotter.

I.2.1 Access

In general, the CEO lab is open weekdays from 9AM to 6PM. When the Observing the Earth From Space course is in session, students will be assigned lab sections to use the CEO facilities at regular times. Other CEO users should not plan on using the lab during those times unless they are assigned to that lab section as OEFS students. In addition, OEFS students wishing evening and weekend access can request to be added to the special security group set up for this purpose. Finally, you can always find out whether the lab is open by calling the lab at 432-1064.

I.2.2 CD/DVD Burning

All of the PCs in the CEO have DVD/CD burners that can be used to create your own DVDs or CDs. This makes them ideal for "publishing" finished projects, retaining a permanent copy of your work, or sharing data with colleagues at other institutions. Each CD holds approximately 700 megabytes of data, and each DVD holds approximately 4.3 gigabytes. Each PC has the Roxio Creator application installed for creating data DVDs and CDs.

I.2.3 Printing

Color and grayscale images and text files may be printed on the printers in the CEO Lab at no charge. While there are currently no page limits for printing at the CEO lab, users are reminded that color printing is costly. You are requested to limit color printing to final output and lab assignments where possible. Most applications provide a Print Preview option that should be used to check output before actually printing an image or map. The lab printers have duplex printing capabilities: from the Printer Properties screen in Windows you can select the "Print on both sides" option. This will help conserve supplies and reduce the operating costs of the lab.

I.2.4 Portable Spectrometer

A portable spectrometer is a device that measures the spectral radiance, spectral irradiance, and spectral reflectivity of surfaces. The Center for Earth Observation has a portable spectrometer from Analytical Spectral Devices Inc. (ASD) in Boulder, Colorado. This instrument is sensitive to wavelengths from about 350 to 2500 nm and has over 1000 channels. This range includes a small portion of the near ultraviolet as well as all of the visible and near-infrared parts of the electromagnetic spectrum.

The spectrometer uses a fiber optic cable with a foreoptic attachment to limit the sensor's field of view. A detector array in the spectrometer captures photons, which are converted to and stored as electrical charge. The stored charge is read out as a voltage, digitized, and transferred to a PC as "raw" digital numbers.

The spectrometer measures three specific radiation quantities: reflectance, radiance, and irradiance. It uses a specially configured Toshiba laptop computer to perform the numerous calibrations and reference corrections that are required when measuring radiation quantities. These calibrations and corrections remove the "dark current" portion of the signal associated with thermal electrons and produce signal ratios to adjust for varying ambient lighting.
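To make the dark-current and reference corrections concrete, here is a minimal Python sketch of the general approach. This is not the ASD software itself; the arrays are synthetic and the variable names are ours.

    import numpy as np

    channels = 1000                          # the ASD unit has over 1000 channels
    rng = np.random.default_rng(0)

    # Synthetic raw digital numbers (DN); illustrative values only.
    dark_dn = rng.normal(100, 5, channels)   # "dark current" from thermal electrons
    reference_dn = dark_dn + 3000            # measurement of a white reference panel
    target_dn = dark_dn + 3000 * 0.35        # measurement of the surface of interest

    # Subtract the dark current from both spectra, then ratio target to
    # reference so that variations in ambient lighting cancel out.
    reflectance = (target_dn - dark_dn) / (reference_dn - dark_dn)
    print(reflectance.mean())                # ~0.35 for this synthetic surface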

Detailed operating instructions can be found in the Observing the Earth From Space class exercise "Experiments with a Personal Spectrometer". A copy of this document is located in the On Line Documentation section of the CEO website.

I.3 CEO Data Archive

The CEO has an extensive collection of satellite images from around the globe. These include hundreds of Landsat MSS and Thematic Mapper (TM) scenes, many of them covering Southwest Asia, Africa, the United States, Brazil, and Southeast Asia. We also have several dozen high-resolution IKONOS images, various SPOT scenes, individual AVHRR scenes, several composite AVHRR datasets, and archived data from the newer MODIS and ASTER sensors. The CEO Data Archive web pages contain a list of images, organized by continent and then country, with a "quick look" browse image for most datasets. Since it would be impractical to keep all of the full-resolution images permanently loaded on our hard disks, they have been placed on CD-ROM for easy access whenever necessary. The following are the general steps one would perform to load a dataset from the CD-ROM archive.

  1. Find the scene you want on the CEO Data Archive and make a note of the CD #. Retrieve this CD from the cabinet in the back of the lab.
  2. Place the CD in the CD-ROM drive.
  3. Click the "load algorithm" button on the ERMapper main menu bar. Select the CDROM drive from the "Directories" menu of the Load Algorithm window. Select the "algorithm" directory and click "OK". Finally, select the ready-made algorithm for the scene you want to view and click "OK". The algorithm will start displaying.
  4. Zoom around the image, adjust enhancements and so on until you identify the area you wish to cut out of the image. Open the "Geoposition" window under the "View" menu of the main toolbar, and select "Extents" on the Geoposition window. Record the values for "Cell X" and "Cell Y" for the top left and bottom right corners of the current zoom. Round these values down to the nearest integer.
  5. Use the "Save a Subset of an Image" module found in the ERMapper main menu [Utilities | File Maintenance | Datasets]. Use the Cell X and Y values from step 4 to specify a range to cut from the source image. Save this new file in your work area. When you are done return the CD-ROM to the cabinet in the CEO lab where you found it.

Note: The Geoposition window lists cell coordinates using X and Y labels, while the Image Subset Wizard uses Row and Column notation. The X and Y order is NOT the same as the Row and Column order: the Start Row in the wizard is equivalent to the "Top Left Cell Y" in the Geoposition window, as the sketch below illustrates.
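To see why the order matters, recall that most raster software indexes arrays as [row, column], i.e. [Y, X]. A minimal numpy sketch of the equivalence (the array and variable names are ours, not ERMapper's):

    import numpy as np

    image = np.zeros((3000, 4000), dtype=np.uint8)   # rows (Y) by columns (X)

    # Corner cells recorded from the Geoposition window, rounded down:
    top_left_x, top_left_y = 1200, 800               # Cell X, Cell Y
    bottom_right_x, bottom_right_y = 3400, 2600

    # Row = Cell Y, Column = Cell X: note the reversed order when slicing.
    subset = image[top_left_y:bottom_right_y + 1,
                   top_left_x:bottom_right_x + 1]
    print(subset.shape)                              # (1801, 2201)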

I.4 Digital Elevation Models (DEM)

I.4.1 Background

Digital elevation data can be used to great advantage when studying the environment. Remote sensing and GIS software programs can use these data to digitally enhance images, revealing previously hidden topographic relationships. There is a variety of DEM products available today, with resolutions ranging from 30 to 1,000 meters. These data are normally stored as raster datasets using a signed 16-bit data type. Please read the DEM document in the FAQ section of the CEO web site to learn more about the various datasets and how to obtain and import them.
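As an illustration of the signed 16-bit storage convention, the following Python sketch writes and then reads a flat binary DEM tile with numpy. The file name and dimensions are hypothetical; real products differ in header layout and byte order, so always check the product documentation.

    import numpy as np

    rows, cols = 120, 120
    # Create a synthetic tile so the example runs; real DEMs are commonly
    # distributed as flat binary arrays of signed 16-bit integers.
    np.arange(rows * cols, dtype=np.int16).tofile("elevation.dem")

    dem = np.fromfile("elevation.dem", dtype=np.int16).reshape(rows, cols)
    print(dem.min(), dem.max())    # elevations in meters for a real file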

I.4.2 U.S. Datasets

The primary source of elevation data for the U.S. is the USGS National Map Seamless Server. These data are constantly updated as better data become available. The server provides complete coverage of the conterminous U.S. at 1 arc-second (30-meter) resolution. Most of the country is also available at 1/3 arc-second (10-meter) resolution. There are guidelines on how to download and import DEM data in the FAQ section of the CEO web site.

SRTM data are also available at 30-meter resolution for the U.S. and at 90-meter resolution globally. See the CEO FAQ on SRTM DEM data. Version 1 of these data had many voids, so you should always select the filled, finished version 2. The CEO also has a nationwide 100-meter dataset in ERMapper format on CDs; see a member of the CEO staff if you wish to use this dataset.

I.4.3 Global Datasets

Global elevation data are generally available at 1000 m and 90 m resolutions. The GTOPO30 data have 1000 m resolution. This product was developed by the USGS in 1996. The CEO has complete global coverage in ERMapper format online at N:\ERM_files\GTOPO30_DEM. You can learn more about these data at the USGS GTOPO30 site.

The Shuttle Radar Topography Mission (SRTM) mapped 80% of the Earth's surface in 2000. These data have now been released in one-degree tiles with a resolution of 90 meters. Currently the data have some voids and gaps, but they are in the process of being "cleaned". You can learn more about processing the SRTM data on the CEO web site.

The USGS offers the ASTER Global Digital Elevation Model, a global 30-meter resolution elevation dataset created from ASTER sensor data. This is version 1 data, with data voids, irregular coastlines, and other issues. You can search for these data on the NASA WIST site.

I.5 Importing and Exporting Data

I.5.1 General Guidelines:

The field of satellite based remote sensing is constantly expanding, with new sensors being launched by several governments and corporations each year. Many of these sensors capture increasingly complex data and store them in new formats. As a result, the requirements and techniques for importing and exporting images change often. This challenge is mitigated to some degree by maturing data format standards and expanded software capabilities.

If you obtain data in a format you are not familiar with, you should first try to open the file directly with the software you are using. If this does not work, look for an import process that may work with your data. For example, ERMapper can now directly open many image graphics formats such as JPG and TIFF. It can also open the hierarchical data format (HDF) that is used extensively by NASA. Once you open these data, it may be necessary to save them in ERMapper format to perform advanced functions with the software. ERMapper also has many different import options listed under the Utilities section of the main menu. In some cases you may be able to open an image using the ENVI software package and then save it in ERMapper format for subsequent processing.

While many types of imagery can be opened easily with ERMapper and other software, some data files require special processing before you can use them successfully. The CEO staff has documented detailed instructions on how to locate and import various types of data. You should look on the CEO web site in the Online Documentation section for specific instructions. If you are still not able to process the data you have obtained, contact a member of the CEO staff for assistance.

I.5.2 Scanned images to ERMapper:

  1. Use the CEO scanner, or one of the public scanners in other computer labs at Yale, to scan your map/aerial photograph. Save as an uncompressed, 8-bit grayscale or 24-bit color TIFF file in your directory space.
  2. Directly open the image in ERMapper and save this as an ERMapper raster dataset. You may need to georeference the image in order to use it with other datasets.

I.6 Annotations

ERMapper has two built-in utilities for creating annotations on your image. You can add an "Annotation Overlay", a "Map Composition Overlay", or a combination of the two to your algorithm to add elements such as text, vectors, and scale bars. There are too many options to cover in detail here, so see the sections of the ERMapper User Guide that discuss Annotations and Map Composition for instructions on using these features.

Some of the features included in the annotation and map composition overlays are very powerful and useful, for example scale bars, north arrows, lat/long grids, circles, labels, and point features. To take full advantage of these features you should learn the ERMapper vector file format, which is described in the ERMapper Customizing manual.

Some users combine ERMapper and PowerPoint to create effective presentation graphics. You can use the annotation tools in ERMapper to add a scale bar, and perhaps a legend or north arrow, then save the result as a JPG image. The image can be placed in a PowerPoint slide, where you can easily add a title, text boxes, circles, arrows, and other elements to complete the graphic. You can do the same in other graphics programs or in Microsoft Word.

I.7 Frequently Asked Questions

1. What kind of image should I use for my project?

This is a complex question. Some of the basic issues you must consider are: How frequently do you need to "look" at the surface of the earth? Are you interested in changes between two decades or two growing seasons? How much detail do you need to see on the ground? Is a 30 m pixel sufficient? If you are working on a very large area, perhaps you will need to use an image with a 1000 m pixel. What portions of the electromagnetic spectrum are of interest for the work you are doing? What budgetary constraints do you have? Must you stay with free images, or can you afford a high-resolution image that may cost $3,500? Please contact the CEO staff for assistance in evaluating which of the different sensors may provide the information you will need for your research.

2. Now I know what I want, where can I find an image?

The CEO Data Archive should be the first place to look for imagery. If you are not able to find the data you need in the archive, there are many places on the Internet that allow you to search for imagery and download or order it. The CEO maintains a list of good sites to look for imagery. From the main web page follow the CEO Links and navigate to the "Data Search Sites" section.

3. What projection should I use?

In many cases the choice of projection is somewhat arbitrary. One overriding factor in this decision is whether you already have other information in a given projection; if so, certainly use that projection. See the discussion in the Geometric Corrections section of the Guide for more information on map projections in general, or the discussion in the ERMapper Users Forum for more information on how ERMapper deals with map projections.

4. Someone is giving me an image, what format should I ask for?

There are two formatting considerations when obtaining your own imagery. First, you need to decide on the physical media that will transport the image from the source system to the CEO. We can handle, in order of decreasing preference: FTP transfer over the Internet; CD-ROM or DVD-ROM; 100MB or 250MB ZIP disk; or 8mm tape.

Second, you need to decide on the format of the image file. ERMapper has a long list of formats that it can import (see the options under "Utilities | Import Raster" for a complete list). Some common options that work well are GeoTIFF, a flat binary file (you need to know the number of rows/columns and how many bytes per pixel), the Mac PICT format, and an ASCII grid. ERDAS IMG files and Arc/Info export format files can also be used.

5. How do I burn a CD or DVD?

Each PC has the Roxio Creator application installed for creating data CDs and DVDs. This program has good online documentation. If you are having difficulty, please see a member of the CEO staff or one of the course TAs.

II. Major Characteristics of Remotely Sensed Data

In this part we will consider some characteristics, sources and general applications of some of the common types of remotely sensed data in use today. In all cases we will consider only gridded raster data, which may be defined as information organized onto a regular two-dimensional grid. This type of data may be contrasted with vector data, information organized as lines or polygons, and irregular raster data, information stored in an irregular array. See the books by Elachi and Rees for good explanations of the physical principles underlying remotely sensed data.

II.1 Resolution Considerations

When using remotely sensed raster data one must consider four types of resolution to fully understand the meaning of an image.

  1. Spatial: The spatial resolution of a sensor is perhaps the most intuitive or obvious characteristic of an image. Spatial resolution may be defined as the area represented by one pixel in an image, or as the size of the smallest object on the ground which can be distinguished by the sensor. However, this measurement is usually smaller than the smallest object that can clearly be distinguished by someone looking at an image.
    A sensor's instantaneous field of view (IFOV), defined as the ground area sampled (viewed) at one instant, may or may not be the same as its spatial resolution, depending on the method used for sampling.
  2. Spectral: A sensor's spectral resolution is defined by the region (bandwidth) of the electromagnetic spectrum over which it is sensitive.
  3. Radiometric: Radiometric resolution refers to the number of possible values in each band of data. A high radiometric resolution allows finer distinctions between values (see the worked example after this list).
  4. Temporal: The time between successive passes over the same region defines temporal resolution.
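As a concrete illustration of radiometric resolution, the number of distinct values a band can record is two raised to the number of bits used to store each pixel:

    # Number of gray levels for common radiometric resolutions.
    for bits in (6, 7, 8, 10):
        print(f"{bits}-bit data: {2 ** bits} possible values")
    # 6-bit:   64 levels (e.g., Landsat MSS band 4)
    # 7-bit:  128 levels (e.g., Landsat MSS bands 1-3)
    # 8-bit:  256 levels (e.g., Landsat TM)
    # 10-bit: 1024 levels (e.g., AVHRR)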

II.2 Passive Sensors

Passive sensors detect energy which has been emitted or reflected by another source (for example, sunlight reflected off the surface of the earth), as opposed to active sensors, which emit and measure their own energy (for example, RADAR). To make use of the largest signal, passive sensors are generally designed to respond to energy in the visible and infrared regions of the EM spectrum, because reflected sunlight and longwave radiation emitted from the earth's surface and atmosphere have maximum intensities in this range. Figure 1 contains plots of the blackbody curves at the temperatures of the sun (normalized for a radius equal to the Earth's orbit) and the earth, as well as of the absorption of radiation by the earth's atmosphere.

II.2.1 Spectral Signatures

A characterization of the energy reflected by and/or emitted from an object is known as its spectral signature. An object's spectral signature is based on its physical and chemical composition and, depending on the wavelength sensed, can be influenced by factors which mar, alter or otherwise obscure a "clean" signal from an object's surface. Examples of these factors include weather, atmospheric absorption or scattering, and the presence of shadows or water on the object's surface.

In principle, minerals may be uniquely identified based on their spectral signatures. However, in practice, the spectral and spatial resolution of most passive sensors is too poor for the degree of detail necessary to do this. Figure 2 illustrates the spectral signatures of several minerals. Notice that broad variations (over several microns) occur between some minerals, but others have the same general shape and may only be distinguished by subtle variations in the intensities of their signatures, or by the presence or absence of distinctive, narrow absorption bands in the spectra. The spectral signatures of geologic objects are further complicated by variations in weathering and surface cover, as well as by the blending of individual component signatures into one combined signature for each pixel. Although these limitations might prevent identification of a specific mineral or rock from an image alone, areas with similar signatures may be identified and objects may be placed in broad groups based on their spectral signatures. The actual identification of these groups may then be confirmed with ground truth. This approach works best in regions with little obscuring ground cover, such as deserts and mountains above the tree line.

Sensors equipped with thermal bands and with sufficient temporal resolution may be used to estimate the thermal inertia of an area, potentially enabling an analyst to more accurately identify an object or composition of a region.

Applications involving plant life are generally better able to take advantage of the spectral information contained in an image acquired by current sensors, for several reasons. One major reason is that vegetation has a very distinctive spectral signature in the visible and near infrared (NIR) which is detectable even by sensors with low spatial, spectral, or radiometric resolution. Figure 3, figure 4, and figure 5 illustrate typical reflectance spectra from a few types of vegetation. The key features in these plots are vegetation's high NIR reflectance in general, and the relative reflectance of grass, deciduous trees, and coniferous trees. Chlorophyll and water are the substances that dominate the spectral signature of plant life. Different concentrations of these compounds produce marked differences in the spectra of different plants, allowing one to estimate plant health and biomass and even to identify species, as the index sketched below illustrates. Figure 6, figure 7, and figure 8 demonstrate how the spectral signature of vegetation changes with variations in water content, biomass, and plant health.
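One standard way to exploit the red/NIR contrast is a ratio index such as the Normalized Difference Vegetation Index (NDVI). The index itself is widely used; the reflectance values in this sketch are invented but representative:

    import numpy as np

    # Typical reflectances: vegetation absorbs red light strongly (chlorophyll)
    # and reflects strongly in the near infrared.
    red = np.array([0.05, 0.08, 0.20])   # healthy grass, forest, bare soil
    nir = np.array([0.50, 0.40, 0.25])

    ndvi = (nir - red) / (nir + red)
    print(ndvi)   # high values (~0.6-0.8) indicate vigorous vegetation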

The spectral signatures of water bodies themselves are largely dependent on water depth and suspended matter [1]. Radiation received at a sensor will have penetrated the water to some depth, reflected off suspended matter or off the bottom, then traveled back through the water to the surface. Water is a strong absorber of EM radiation, especially at longer wavelengths; blue light will travel through water for a few tens of meters or more, while infrared light is absorbed almost immediately at the surface. The spectral signature of a water body is therefore composed of the spectral signature of the reflecting surface (the bottom or suspended particles) minus whatever is absorbed or scattered as the light travels through the overlying water. This effect is shown in figure 9, which plots the spectral reflectance of water with a sandy bottom at various depths. Notice that the overall shape of the plot remains similar at different depths, but that the intensity drops off as the water gets deeper.
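The depth effect can be approximated with simple exponential (Beer-Lambert) attenuation along the two-way path through the water column. The absorption coefficients below are invented for illustration, but they increase with wavelength as the text describes:

    import numpy as np

    # Illustrative absorption coefficients (per meter of water).
    k = {"blue": 0.05, "green": 0.1, "red": 0.5, "nir": 3.0}
    bottom_reflectance = 0.4           # sandy bottom, assumed constant here

    for depth in (0.5, 2.0, 10.0):     # depth in meters
        for band, coeff in k.items():
            # Two-way path: down to the bottom and back to the surface.
            r = bottom_reflectance * np.exp(-2 * coeff * depth)
            print(f"{depth:5.1f} m  {band:5s}  reflectance ~ {r:.4f}")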

Water will also affect the signature of bare soil or sand. Figure 10 is a plot of wet and dry sand reflectance. Notice that the water decreases the reflectance of the sand more severely at longer wavelengths. This effect is also illustrated in figure 11 which plots three spectra acquired on a wet lawn. The plot demonstrating high reflectance in the NIR is from grass on this lawn. The grass in the middle plot is partially covered with water so that only the tips of the blades of grass stick out of the water. A puddle of water covers the grass in the third plot by about 2-3 inches. Note that the water decreases the reflectance at all wavelengths, but that this effect is stronger at longer wavelengths.

II.2.1.1 Key References:

Knipling, Edward B., 1970, Physical and Physiological Basis for the Reflectance of Visible and Near-Infrared Radiation from Vegetation, Remote Sens. Environ., 1, 155-159.
Wallace, J. M., and Peter V. Hobbs, Atmospheric Science: An Introductory Survey, Academic Press, San Diego, 1977.

II.2.2 Patterns and Textures

Fortunately, humans are quite good at recognizing and classifying spatial patterns, because computers are not! A computer can, however, help enhance or smooth spatial detail at different frequencies, which in turn may help a user interpret a given image.

Spatial analysis can be a powerful tool for identifying and characterizing large-scale features such as folds, faults, and drainage patterns. Remotely sensed data provide the ability to efficiently map extremely large areas.

Spatial analysis is generally less useful than spectral analysis for identifying different groups of plant life. In general, variations in plant texture can be subtler than large geologic features and can occur quite frequently, increasing the effect of (typically) high-frequency noise in the data. However, patterns of plant life, perhaps determined through spectral techniques, can often give clues to the underlying geology of an area. For example, a transition from one species of plant to another might indicate a transition from one soil type to another.

II.3 Active Sensors

Active sensors provide their own source of energy. They are designed to illuminate a target with radiation and measure the reflected energy. Common active remote sensors are RADAR, SONAR, and LIDAR. Because these sensors produce their own energy, they are not dependent on solar reflection and can operate during both day and night. It is also possible to direct the angle of illumination to enhance reflectance of various surfaces. SONAR systems will not be discussed in this document.

II.3.1 RADAR

RADAR is an acronym for RAdio Detection And Ranging. The earliest radar systems operated in the radio band of the electromagnetic spectrum, at wavelengths from approximately 1 to 10 m. Modern radar systems transmit in the shorter-wavelength microwave band, from approximately 0.8 cm to 1 m. A radar system produces frequent, short bursts of microwave energy and measures the strength of the reflected echo, sometimes referred to as backscatter. Longer-wavelength radar systems can penetrate clouds and some surfaces such as sand and snow. This makes radar an ideal tool for imaging tropical regions that have almost constant cloud cover. It has also been used to locate ancient stream beds in desert areas.

Two common forms of radar are not used to image the earth's surface. One is the Doppler radar system, otherwise known as the radar gun, which uses Doppler frequency shifts to measure the relative speed between the radar and a target. Plan Position Indicator (PPI) radar systems feature a rotating antenna with a circular sweeping display; these are commonly used for weather forecasting and air traffic control.

Side Looking Airborne Radar (SLAR) systems are used to image the earth's surface. These systems have an antenna fixed to the bottom of an airplane or spacecraft that is typically pointed to the side of the flight path. The side looking scheme was devised so that airplanes could fly parallel to the border of a hostile nation and "look" into the enemy territory.

Radar systems transmit energy in the microwave portion of the electromagnetic spectrum, using wavelengths from approximately 0.75 cm to 100 cm. This range is divided into eight bands, each identified by an alphabetic code (Table 3). These arbitrary letter designations were assigned during World War II as a security measure.

Table 3. Radar Band Designations

Band Designation    Wavelength Range (cm)
Ka                  0.75 - 1.1
K                   1.1 - 1.67
Ku                  1.67 - 2.4
X                   2.4 - 3.75
C                   3.75 - 7.5
S                   7.5 - 15
L                   15 - 30
P                   30 - 100

POLARIZATION - Radar systems transmit energy in either a horizontally (H) or vertically (V) polarized plane. Systems generally receive reflected energy in the same plane as was transmitted; these are referred to as HH or VV systems. Horizontal systems are usually better at discriminating rectangular features such as buildings and fields, while vertical systems are usually better at discriminating vertical features such as trees. More sophisticated systems have two receiving antennas and also capture reflected energy with the opposite polarization; these are referred to as HV or VH systems. Some advanced radar systems can transmit and receive both polarizations and produce four images of an area: HH, VV, HV, and VH. These multi-polarization sensors offer greater information content, similar to the capabilities of the multi-spectral images produced by passive sensors.

INTERPRETATION - Satellite images produced by passive sensors record the variations in reflectivity and absorption of objects across the electromagnetic spectrum. Interpretation of radar images is significantly different from interpretation of passive-sensor images: radar images record variations in the structure, texture, and electrical properties of the targeted surfaces.

Surface slope has a significant impact on the macro-scale interpretation of radar images. Slopes that face the antenna (foreslopes) are brighter than slopes facing away from it, and as a foreslope approaches perpendicularity to the radar beam its return becomes brighter still. This is known as foreslope brightening. Foreslopes also take less time to image than backslopes. This phenomenon, called foreshortening, results in foreslopes being recorded shorter than they really are. Objects with very steep foreslopes will appear to lean toward the radar source; this is a result of the radar beam intercepting the top of the object before the base, and is known as layover. A simple geometric sketch of these effects follows.
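Under simple flat-earth assumptions, a foreslope of true length L and slope angle a, viewed at a look angle t from nadir, has an apparent slant-range extent of roughly L*sin(t - a): it shrinks as the slope steepens, collapses to a point when the slope equals the look angle, and goes negative (layover) beyond that. This is only an illustrative model, not a full radar geometry treatment:

    import math

    def slant_extent(length_m, look_angle_deg, slope_deg):
        # Apparent slant-range extent of a foreslope of true length L.
        # At slope == look angle the slope collapses to a point; steeper
        # slopes give a negative value, i.e. layover toward the sensor.
        return length_m * math.sin(math.radians(look_angle_deg - slope_deg))

    for slope in (0, 10, 20, 30, 35):
        print(slope, round(slant_extent(1000, 30, slope), 1))
    # The extent shrinks as the slope steepens, reaching 0 at 30 degrees.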

Surface roughness determines the micro-scale tone of radar images. Radar-smooth surfaces cause the transmitted energy to reflect away from the antenna, so these surfaces appear dark on an image; typical radar-smooth surfaces are calm water and paved roads. Radar-rough surfaces produce a diffuse reflection, resulting in a brighter image; examples are cobbles, old-growth forest canopies, and surface waves on water. Apparent roughness on radar images also depends on the radar wavelength and on the relative angle between the radar beam and the target surface.

The dielectric constant is a measure of an object's ability to conduct or reflect microwave energy. For most objects, this property has no significant impact on a radar image. As surface moisture increases, however, the dielectric constant and reflectivity increase; this would make a recently irrigated field appear brighter than a similar field without irrigation. Metallic objects such as bridges and railroad tracks act as strong reflectors and appear very bright on radar images.

Where can you find more information about RADAR at the Center for Earth Observation?

One of the first places to look is the textbook Remote Sensing and Image Interpretation by Lillesand and Kiefer, which can be found in the CEO lab or in one of the Yale libraries. Chapter 8 - Microwave Sensing provides a thorough background on radar systems and their special image processing techniques. Journal articles are also a source of relevant information. For example, the December 1995 issue of Photogrammetric Engineering & Remote Sensing (PE&RS) has three articles related to the use of Synthetic Aperture Radar (SAR) systems and sea ice. Other sources of information about radar are the remote sensing software packages used at the CEO, and websites for the various radar systems.

The ERMapper Applications Manual has a chapter dedicated to SAR imagery in mineral and oil exploration. Two case studies are described, outlining the reasons why radar imagery was appropriate for these projects. The Applications Manual is available in the CEO lab. It can also be found on-line by selecting the Help button from the main ERMapper menu. ERMapper has another on-line manual dedicated to radar, the ER Radar Manual, which provides a detailed description of the specific processing and analysis techniques and algorithms used within ERMapper.

Two other remote sensing software packages available at the CEO are ERDAS Imagine and ENVI. Each application has a set of tools and on-line documentation for processing radar images. In addition, ENVI has several tutorial exercises to learn how to process and interpret radar data. The ENVI tutorial manual can be found in the CEO lab.

The World Wide Web is a vast source of information, some of it even on radar! You should begin by going to the CEO Links section of the CEO Home Page. There is a section for Radar which includes links to several of the more important radar providers.

The CEO also has sample images and browse software for RADARSAT, SIR-C, and JERS-1 data. These image samples should help you to understand the capabilities and challenges involved with using radar data. These samples are stored in the cabinet that contains the CEO Data Archive. See any of the CEO staff if you wish to work with these data sets.

II.3.2 LIDAR

LIDAR is an acronym for LIght Detection And Ranging. This active remote sensing system transmits pulses of laser light from an airborne platform and records the time delay of the reflection to measure the distance between the aircraft and the surface, as sketched below. When global positioning systems are integrated with the lidar, surface maps can be generated with sub-meter accuracy.
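The range computation itself is just the round-trip travel time of light, halved. A minimal sketch:

    C = 299_792_458.0   # speed of light in m/s

    def lidar_range(time_delay_s):
        # The pulse travels to the surface and back, so the one-way
        # distance is half the total path length.
        return C * time_delay_s / 2.0

    print(lidar_range(6.67e-6))   # ~1000 m for a 6.67 microsecond echo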

Lidar systems are able to record multiple returns from each pulse, meaning that multiple surfaces can be measured at the same time. It is possible, for example, to map a forest canopy and the forest floor, or the surface and depth of a water body.

II.4 Sensor Descriptions

This section gives a very brief introduction to the types of satellite imagery most commonly used in the CEO, and some pointers to places to find more information. Refer to Lillesand and Kiefer or other textbooks for more detail on these sensors, as well as descriptions of sensor formats not included here and additional sources of sensor data. Additional key references are included at the end of each section.

Figure 12 is a graphic comparison of the spatial and spectral resolutions of each of the various sensors discussed below. Each band on each sensor is represented by a rectangle on this plot. The x-axis on this plot is calibrated to wavelength (log scale to show detail at short wavelengths), so each band's width in the x-dimension spans the wavelengths to which it is sensitive. The y-axis on this plot is arranged by sensor, and each box's y-dimension is proportional to its spatial resolution. This convention is the same for plots 12a and 12b, the difference being that the spatial scale on the y-axis changes.

II.4.1 Landsat Sensors

II.4.1.1 Historical, Orbital and Resolution Overview:

The Landsat program began in 1967 with the concept of a series of six Earth Resources Technology Satellites (ERTS), which were designed to test the possibility of obtaining data on Earth's resources from unmanned satellites. NASA renamed the ERTS program the Landsat program just prior to the launch of Landsat 2. Most textbooks describe the Landsat program extensively, so only the key details of the sensors are given here.

Table 4: Summary of the Landsat Program

Satellite   Launched   Decommissioned      Orbit             Sensors
Landsat-1   7/23/72    1/6/78              18 days/920 km    MSS-RBV
Landsat-2   1/22/75    2/25/82             18 days/920 km    MSS-RBV
Landsat-3   3/5/78     3/31/83             18 days/920 km    MSS-RBV
Landsat-4   7/16/82    --                  16 days/705 km    MSS-TM
Landsat-5   3/1/84     --                  16 days/705 km    MSS-TM
Landsat-6   10/5/93    Lost immediately    --                --
Landsat-7   4/15/99    --                  16 days/705 km    ETM+

II.4.1.1.1 RBV
The Landsat Return Beam Vidicon was flown on the first three Landsats. On Landsats 1 and 2 the RBV was actually three sensors, essentially television cameras, each of which was sensitive to a different region of the visible spectrum. On Landsat 3 the RBV was changed to two side-by-side panchromatic television cameras, meaning that each sensed radiation over the entire visible region of the spectrum. The RBV on Landsat 3 had an effective spatial resolution of 19 meters.

Unfortunately, these sensors were plagued with technical problems and they were replaced on Landsat 4 with the Thematic Mapper sensors.

II.4.1.1.2 MSS
The Landsat Multi-Spectral Scanner flew on the first five Landsats, providing continuous, comparable data over a period of about 20 years, from 1972 to 1993. This fact makes MSS data appealing to those doing change detection analysis. The MSS has a swath width of 185 km, and an individual scene is approximately 170 km in the along-track direction (so that MSS scenes are approximately square). The MSS has an IFOV of 80m by 80m, but the spatial resolution is actually about 56m by 79m due to scanner overlap; corrected MSS data are resampled to 57m by 57m. Data are recorded in four bands in the visible and near IR, and stored with 7-bit radiometric resolution in bands 1-3 and 6-bit resolution in band 4 [2]. The MSS uses a side-sweeping scanner to sample all four bands, six lines at a time, pixel by pixel (as opposed to the RBV's TV-camera approach). This can lead to a striping effect if the different radiometric responses of the detectors are not properly calibrated; a crude correction is sketched below.
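One simple destriping idea is to equalize the mean of each of the six detector phases. The sketch below simulates a miscalibrated detector and corrects it; real radiometric calibration is more involved, so treat this only as an illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    image = rng.normal(100, 10, (600, 400))
    image[::6] += 15            # simulate one miscalibrated detector

    # Shift each of the six detector phases to the overall image mean.
    overall_mean = image.mean()
    for phase in range(6):
        image[phase::6] += overall_mean - image[phase::6].mean()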

band 1: (green, 0.50-0.60µm) This region corresponds to the green reflectance of healthy vegetation and is useful for mapping detail, such as depth or sediment in water bodies. Cultural features such as roads and buildings also show up well in this band.
band 2: (red, 0.60-0.70µm) Chlorophyll absorbs these wavelengths in healthy vegetation. This band is useful for soil and geologic boundary discrimination.
band 3: (near IR, 0.70-0.80µm) The near IR is responsive to vegetation biomass and health.
band 4: (near IR, 0.80-1.10µm) Band 4 is very similar to band 3. It is used for vegetation discrimination, penetrating haze, and water/land boundaries.

II.4.1.1.3 TM
Like the MSS, the Thematic Mapper has a swath width of 185 km and an along-track distance of 170 km for an individual scene. Its 30m by 30m IFOV [3] is resampled to a 28.5m by 28.5m spatial resolution when geometric corrections are applied. TM data are stored with 8-bit radiometric resolution in seven spectral bands. Bands 1, 2, and 3 are sensitive to visible radiation; bands 4, 5, and 7 are in the reflective IR; and band 6 is in the thermal IR.

band 1: (blue, 0.45-0.52µm) Water increasingly absorbs EM radiation at longer wavelengths, so band 1 provides the best data for mapping depth/detail of water covered areas. It is also used for soil/vegetation discrimination, forest mapping and distinguishing cultural features.
band 2: (green, 0.52-0.60µm) Like MSS band 1, this corresponds to the green reflectance of chlorophyll in healthy vegetation.
band 3: (red, 0.63-0.69µm) This band is useful for distinguishing plant species, soil and geologic boundaries.
band 4: (near IR, 0.76-0.90µm) Band 4 corresponds to the region of the EM spectrum which is especially sensitive to varying vegetation biomass. It also emphasizes soil/crop and land/water boundaries.
band 5: (mid IR, 1.55-1.74µm) This region is sensitive to plant water content which is a useful measure in studies of vegetation health. This band is also used for distinguishing clouds, snow and ice.
band 6: (thermal IR 10.40-12.50µm) This region of the spectrum is dominated completely by radiation emitted by the earth and is useful for crop stress detection, heat intensity, insecticide applications, thermal pollution and geothermal mapping.
band 7: (mid IR, 2.08-2.35µm) This region is used for mapping geologic formations and soil boundaries. It is also responsive to plant and soil moisture content.

II.4.1.1.4 ETM+

The Enhanced Thematic Mapper Plus (ETM+) sensor captures data using the same seven bands as the TM sensors. One major feature of this enhanced sensor is the addition of a panchromatic band with 15m spatial resolution and a bandwidth from 0.52 to 0.90 µm. The second major enhancement is the increase in spatial resolution of the thermal band (band 6) from 120m to 60m.

The Scan Line Corrector on the ETM+ sensor failed in May 2003. As a result, there are gaps between each line of data. The USGS will sell images that have these gaps filled in with data from previous scenes; while this produces a gap-free picture, it should not be used for change detection analysis. Currently Landsat 5 images are being sold again, and NASA is exploring options to replace the sensor.

II.4.1.2 Data Sources and Pricing:

The USGS is the primary source of Landsat data in the United States. It has the largest collection of images for the U.S. and a very large catalogue of images from other parts of the world. As of January 2009, these data are free. Several other countries have established Landsat ground receiving stations that also archive and distribute imagery; these stations are known collectively as the Landsat Ground Station Operations Working Group (LGSOWG).

One can search these archives online through the USGS EarthExplorer website at http://earthexplorer.usgs.gov.

Availability and pricing of satellite images change frequently. You should explore the CEO Links - Image Archives section or the CEO FAQs page, and/or see a member of the CEO staff for current information.

II.4.1.3 Key References:

Short, N. M., The Landsat Tutorial Workbook, NASA Ref. Publ. 1078, U.S. Government Printing Office, Washington, DC, 1982. (Yale Library: Forestry, QB637 +S56 (LC))
U.S. Geological Survey (USGS), Landsat Data Users Handbook Revised, USGS, Sioux Falls, SD, 1979.
U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration, Landsat 4 Data Users Handbook, USGS, Sioux Falls, SD, 1984.

II.4.2 SPOT

II.4.2.1 Historical, Orbital and Resolution Overview:

The SPOT (Système pour l'Observation de la Terre) satellite was launched into an 832 km polar orbit on 2/22/86 by a multi-national cooperation, primarily among France, Sweden, and Belgium. SPOT repeats its orbit every 26 days but has the capability of 'off-nadir' viewing (looking at a scene to either side of the ground track). This capability increases SPOT's potential temporal resolution to 3-4 days, a significant improvement when studying short time-scale phenomena like volcanic eruptions or fires.

The SPOT ground track has a swath width of 60 km at nadir and 80 km off-nadir, and images are stored in 60 km along-track segments. Unlike the MSS and TM instruments, SPOT acquires its data using a 'push broom' method rather than a side-scanning mirror. The platform carries two identical high-resolution-visible (HRV) scanners, which may be used in either of two modes.

SPOT 4 was successfully launched on March 24, 1998. The satellite features the new high-resolution visible/infrared (HRVIR) sensor package and a "vegetation" sensor package. The HRVIR differs from the HRV in that it includes a new band in the short-wave infrared range that is very sensitive to soil and leaf moisture. The "vegetation" sensor package captures data using the same four bands of the electromagnetic spectrum but has a pixel resolution of 1 km and a swath width of 2,250 km, providing near-global coverage daily.

SPOT 5 was launched on May 4, 2002. This satellite features a pair of 5m panchromatic sensors that can be combined to produce a 2.5m black and white image. It also has a 10m multi-spectral sensor. This satellite carries the same "vegetation" sensor package as SPOT 4.

Panchromatic mode

In panchromatic mode the HRV has an IFOV of 10 m by 10 m and stores data with 8-bit resolution in a single band which spans the visible region of the spectrum.

band 1: (0.51-0.73µm) The radiometric information content of this band is very similar to that of a black and white photograph. SPOT panchromatic images are not very useful by themselves for classification of landscapes. However, their very high spatial resolution makes them useful for visual interpretation, for digitally sharpening lower-resolution multi-spectral data (a simple scheme is sketched below), and for generating stereo pairs.
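One simple sharpening scheme is the Brovey transform, which rescales each multi-spectral band by the ratio of the panchromatic intensity to the mean multi-spectral intensity. The sketch below uses random arrays in place of real SPOT data and is only an illustration of the idea:

    import numpy as np

    rng = np.random.default_rng(2)
    pan = rng.random((200, 200))       # 10 m panchromatic band
    xs = rng.random((3, 100, 100))     # three 20 m multi-spectral bands

    # Upsample each XS band to the pan grid by pixel replication.
    xs_up = xs.repeat(2, axis=1).repeat(2, axis=2)

    # Brovey transform: scale each band by the pan-to-intensity ratio.
    sharpened = xs_up * pan / (xs_up.mean(axis=0) + 1e-6)
    print(sharpened.shape)             # (3, 200, 200)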

Multi-Spectral mode

In XS (multi-spectral) mode the HRV has an IFOV of 20 m by 20 m and stores data with 8-bit resolution in three spectral bands which are similar to bands 1, 2, and 4 of the MSS.

band 1: (green, 0.50-0.59µm) Like MSS band 1, this corresponds to the chlorophyll reflectance of healthy vegetation.
band 2: (red, 0.61-0.68µm) Like MSS band 2, this band is useful for distinguishing plant species, soil and geologic boundaries.
band 3: (near IR, 0.79-0.89µm) Like MSS band 4, this band is sensitive to varying vegetation biomass and emphasizes soil/crop and land/water boundaries.
band 4: (short-wave IR, 1.5-1.75µm) (HRVIR only) This band has a high degree of sensitivity to soil and leaf moisture.

II.4.2.2 Data Sources and Pricing:

The U.S. distributor of SPOT imagery is the SPOT Image Corporation. Customer services at SPOT Image may be reached at 1-800-ASK-SPOT. SPOT Image's educational support program is substantial. Level 1 imagery (basic radiometric and geometric corrections) is available at the educational price of $1000 per scene, as opposed to the standard commercial price of $2600.

II.4.2.3 Key References:

CNES, SPOT Image, SPOT User's Handbook, 3 Volumes (Volume 1: Reference Manual, Volume 2: SPOT Handbook, Volume 3: SPOT Handbook Appendices), Centre National d'Études Spatiales and SPOT Image Corporation, Toulouse, France and Reston, VA.

On the web at: http://www.spot.com/

II.4.3 AVHRR

II.4.3.1 Historical, Orbital and Resolution Overview:

The Advanced Very High Resolution Radiometer (AVHRR) was first launched with TIROS-N on 10/19/78 and has flown on each of the subsequent NOAA satellites through NOAA-14. NOAA-15 was successfully launched in May 1998 and is being brought online as of June 1998. The AVHRR flown aboard TIROS-N, NOAA-6, NOAA-8, and NOAA-10 has four spectral channels, while those flown aboard NOAA-7, NOAA-9, NOAA-11, NOAA-12, and NOAA-14 have an additional thermal infrared channel. The nominal orbital altitude of 833 km for each of these satellites gives a repeat orbit every 8-9 days, but the swath width of the AVHRR is approximately 2,400 km, which allows complete global coverage every day. Data are stored as 10 bits per pixel. The IFOV of the AVHRR is 1.1 km by 1.1 km at nadir and 2.4 km by 6.9 km at the edges of the image.

The AVHRR transmits data in three modes. Data are continuously broadcast at full spatial resolution and may be received and stored by any station within line of sight that is capable of capturing the signal; data acquired directly from the satellite in this manner are known as High Resolution Picture Transmission (HRPT) data. Full-resolution data are also stored for selected regions using onboard tape recorders and then dumped to a ground station once per orbit; datasets recorded in this manner are known as Local Area Coverage (LAC) data and have the same resolution characteristics as HRPT. In addition to these high-resolution AVHRR data, lower-spatial-resolution datasets, known as Global Area Coverage (GAC), are maintained for all regions. These lower-resolution datasets are produced by sampling every third scan line and averaging four out of every five pixels along each scan line, resulting in approximately 4 km resolution (sketched below).
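The GAC scheme as described can be expressed in a few lines of numpy. This is only a sketch of the sampling arithmetic on a toy array, not NOAA's actual processing code:

    import numpy as np

    hrpt = np.arange(9 * 10, dtype=float).reshape(9, 10)   # toy full-res scene

    gac_lines = hrpt[::3]                       # keep every third scan line
    width = gac_lines.shape[1] // 5 * 5         # trim to a multiple of five
    groups = gac_lines[:, :width].reshape(gac_lines.shape[0], -1, 5)
    gac = groups[:, :, :4].mean(axis=2)         # average 4 of every 5 pixels
    print(gac.shape)                            # (3, 2)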

The AVHRR's four or five spectral bands are used primarily for mapping large areas, especially when good temporal resolution is required. Applications include snow cover and vegetation mapping; flood, wild fire, dust and sandstorm monitoring; regional soil moisture analysis; and various large-scale geologic applications.

band 1: (visible, 0.58-0.68µm) This red region of the spectrum corresponds to the chlorophyll absorption of healthy vegetation.
band 2: (near IR, 0.725-1.10µm) This region is sensitive to varying vegetation biomass and emphasizes soil/crop and land/water boundaries.
band 3: (IR, 3.55-3.93µm) A thermal band which detects both reflected sunlight and earth-emitted radiation and is useful for snow/ice discrimination and forest fire detection.
band 4: (thermal IR, 10.30-11.30µm) A band useful for crop stress detection and locating/monitoring geothermal activity. This channel is also commonly used for water surface temperature measurements.
band 5: (thermal IR, 11.50-12.50µm) Similar to band 4, this channel is often used in combination with band 4 to better account for the effects of atmospheric absorption, scattering, and emission.

II.4.3.2 Data Sources and Pricing:

Two very good sources of AVHRR data are the USGS EarthExplorer website at http://earthexplorer.usgs.gov and the Satellite Active Archive operated by NOAA/NESDIS at http://www.saa.noaa.gov. These are fully searchable archives, complete with browse images. Full datasets are available on 8mm tapes for $50. The SAA also offers a discount for multiple images ordered at one time, with a charge of $50 for the first image on each tape and $30 for each additional image that will fit on the tape. There is also a provision for obtaining image subsets smaller than 10 megabytes free via FTP.

Many other sites offer AVHRR data in various formats, covering various locales, with varying pricing schemes. Generally these are sites with HRPT stations that post whatever they receive for some short time period. Good places to start looking are the University of Miami, Louisiana State University, the University of Hawaii, the University of Colorado, and Dundee University in Scotland.

II.4.3.3 Key References:

Kidwell, K. B., NOAA Polar Orbiter Data Users Guide, NOAA/NESDIS/NCDC, Satellite Data Services Division, Washington D.C., 1995.

II.4.4 GOES

II.4.4.1 Historical, Orbital and Resolution Overview:

Geostationary satellites have been contributing to meteorological observations and analyses since the first geostationary Applications Technology Satellite (ATS-1) was launched in 1966. The Geostationary Operational Environmental Satellites (GOES) began operating in 1975, providing operational imagery in the visible and infrared at frequent intervals. Unlike all of the satellites discussed thus far, GOES satellites are geostationary, meaning that they remain fixed over a single point on the earth's surface. These images are distributed in almost real time for weather forecasting. They have also been used for some applications involving very large-area mapping when high resolution is not required. Two of these spacecraft are currently in operation: GOES-8 covering the eastern US and the Atlantic Ocean, and GOES-9 covering the western US and the Pacific Ocean. GOES-10 has been launched and is being held in reserve.

GOES 7
Launched in February 1987, GOES-7 is the last spin-stabilized GOES remaining in operation. The Visible and Infrared Spin Scan Radiometer Atmospheric Sounder (VAS) instrument (first flown on GOES-4 in 1980) on GOES-7 has two distinct modes of operation, imaging and sounding. In imaging mode, west-to-east scanning is performed as the satellite spins around its vertical axis, and individual scan lines are acquired by stepping a mirror between scans. In sounding mode, a single scan line is repeatedly acquired to improve the signal-to-noise ratio. The VAS instrument consists of eight visible sensors, which have an IFOV of 1 km, and six thermal sensors. Two of the thermal sensors (used primarily for imaging) have an IFOV of approximately 7 km, and the other four (used primarily for sounding) have an IFOV of approximately 14 km. Ground operators can program which spectral bands the IR sensors will record by manipulating an array of 12 filters mounted on a wheel in front of the detectors. All data are acquired with 6-bit resolution.

In the imaging mode, three bands of information are routinely acquired:
Visible band: The GOES visible band is sensitive to most of the visible region of the spectrum, from about 0.4 to 0.7 µm. Radiation in this region of the spectrum is entirely composed of reflected or backscattered sunlight and is therefore indicative of the earth's albedo, an important quantity for computing the radiation budget of the earth's atmosphere. This band is only activated during the day. Visible images are acquired at 1 km resolution, but are often distributed at 8 or 16 km resolution for practical purposes.

IR band: The GOES IR band that is usually acquired in imaging mode is centered around 11.2 µm, located in the infrared window. Emitted radiation from the earth's surface and atmosphere dominates this region of the spectrum. Images of this band are usually printed as negatives so clouds (low IR emitters, due to their low temperature) appear white. This band is commonly used to estimate a temperature profile for the earth's atmosphere, among other applications. IR images are commonly acquired at 7 km resolution and resampled to 8 or 16 km resolution for distribution.

Water vapor band: The GOES water vapor band most commonly used for imaging is sensitive to radiation around 6.7 µm, though bands centered at 12.7 and 7.3 µm (also sensitive to H2O emission), and 13.3 (CO2) or 3.9 (window) µm are sometimes substituted. Images of this band are also commonly printed as negatives for the same reason as the IR band. Water vapor images give a picture of upper-tropospheric moisture distribution and are commonly acquired at approximately 16 km resolution.

The sounding mode of the VAS is primarily used for research purposes because it cannot operate independently of the imager.

GOES 8 through 12
GOES-8 is the first in a new series of GOES satellites, collectively known as GOES I-M, that are designed to take a step beyond the VAS system. Like GOES-7, this generation of GOES satellites has both imaging and sounding capabilities; however, these functions have been separated into two instruments which can operate simultaneously, enabling operational sounding products for the first time. GOES-8 is three-axis stabilized, which means that the instruments always aim at Earth, enabling longer exposures and improved signal-to-noise ratios. Data are acquired with 10-bit precision, which results in crisper-looking images compared to GOES-7.

Currently GOES-12, referred to as GOES-East, is centered at 75W. GOES-10 is now GOES-West and is centered at 135W. GOES-8 and GOES-11 are in standby orbits. GOES-9 has been redirected to the northern Pacific.

The imager on the GOES I-M series has five spectral bands:
band 1: (visible, 0.52-0.72 µm) The visible band produces images of clouds and the Earth's surface in clear sky with 1 km resolution.
band 2: (mid IR, 3.78-4.03 µm) Similar to the AVHRR band 3, this region of the spectrum is dominated by reflected sunlight during the day, and emitted thermal radiation at night. It is useful for daytime snow/ice/water discrimination, night time measurements of sea surface temperature, and hot spot (volcano, forest fire) monitoring. Band 2's resolution of 4 km is a major improvement over GOES-7's 13.8 km resolution in a similar band.
band 3: (water vapor, 6.47-7.02 µm) GOES-8 images of upper-tropospheric water vapor have a slightly narrower spectral range, and increased radiometric (10-bit vs. 8-bit) and spatial (8 km vs. 13.8 km) resolution relative to the corresponding GOES-7 data.
band 4: (Thermal IR, 10.2-11.2 µm) Similar to the AVHRR band 4, this band provides cloud top and sea surface temperatures day and night at 4 km resolution.
band 5: (Thermal IR, 11.5-12.5 µm) Similar to the AVHRR band 5, this band is slightly more sensitive to water vapor absorption than band 4, and should provide useful sea surface and low-level moisture data, especially when used in conjunction with band 4.

The GOES I-M sounder acquires data in 18 infrared bands and one visible band. See Menzel and Purdom for detailed descriptions of both the imager and sounder instruments.

II.4.4.2 Data Format and Availability

GOES data streams are publicly broadcast, and with the proper receiving equipment anyone can download real-time imagery. High-quality commercial reception equipment costs tens to hundreds of thousands of dollars, depending on the complexity of the system, though amateur radio enthusiasts have built their own stations for far less. Many sites that do have reception systems make their data available to the public over the Internet. Unfortunately, almost all of these sites post reduced-resolution images to cut network traffic and save disk space.

The GOES Project home page, http://rsd.gsfc.nasa.gov/goes/, has many pointers to other servers.

II.4.4.3 Key References

Rao, P.K., S. J. Holmes, R. J. Anderson, J. S. Winston, and P. E. Lehr, Weather Satellites: Systems, Data, and Environmental Applications, Amer. Meteor. Soc., 1990, 503pp. (Yale Library: Geology, QC879.5 +W39 1990 (LC))
Menzel, W. P. and J. F. W. Purdom, 1994, Introducing GOES-I: The First of a New Generation of Geostationary Operational Environmental Satellites, Bul. Amer. Meteor. Soc., 75, 757-781.

II.4.5 IKONOS

II.4.5.1 Historical, Orbital and Resolution Overview:

The IKONOS satellite was launched on 24 September 1999 from Vandenberg Air Force Base, California. It is the world's first commercial high-resolution satellite, with 1-meter ground resolution in the panchromatic band and 4-meter resolution in the multispectral bands (both nominal at less than 26 degrees off nadir). IKONOS has a swath width of 13 km at nadir and an along-track distance of 13 km for an individual scene. The sun-synchronous orbit has an altitude of 681 km (423 miles). Revisit frequency is 2.9 days at 1-meter resolution and 1.5 days at 1.5-meter resolution.

Panchromatic mode: In panchromatic mode, the ground resolution is 1 meter, and data are stored in a single band spanning the visible to near-infrared region of the electromagnetic spectrum.
band 1: (0.45-0.90 µm) The radiometric information content of this band is very similar to that of a black and white photograph. The very high spatial resolution makes these images useful for visual interpretation and for digitally sharpening lower-resolution multi-spectral data.

Multispectral mode: In multispectral mode, the ground resolution of each band is 4 meters, and data are stored in four spectral bands that are the same as bands 1, 2, 3, and 4 of Landsat 4&5.
band 1: (blue, 0.45-0.52 µm) Same band range as Landsat 4&5 TM band 1. Water increasingly absorbs EM radiation at longer wavelengths, so band 1 provides the best data for mapping depth/detail of water covered areas. It is also used for soil/vegetation discrimination, forest mapping and distinguishing cultural features.
band 2: (green, 0.52-0.60 µm) Same band range as Landsat 4&5 TM band 2, this corresponds to the green reflectance of chlorophyll in healthy vegetation.
band 3: (red, 0.63-0.69 µm) Same band range as Landsat 4&5 TM band 3. This band is useful for distinguishing plant species, soil and geologic boundaries.
band 4: (near IR, 0.76-0.90 µm) Same band range as Landsat 4&5 TM band 4. It corresponds to the region of the EM spectrum which is especially sensitive to varying vegetation biomass. It also emphasizes soil/crop and land/water boundaries.

II.4.5.2 Data Format and Availability

Space Imaging offers the ability to fully browse and buy imagery, products and services through their website, http://www.spaceimaging.com/level2/level2buy.htm

II.4.5.3 Key References

On the web at: http://www.spaceimaging.com/aboutus/satellites/IKONOS/ikonos.html

II.4.6 RADARSAT

II.4.6.1 Historical, Orbital and Resolution Overview:

The RADARSAT-1 satellite was launched in November 1995 and has been operating continuously since that time. The RADARSAT system was developed in Canada and launched by NASA in exchange for data access rights. RADARSAT-1 has an orbital altitude of 798 km and an inclination of 98.6 degrees, and circles the earth 14 times a day. Its sun-synchronous orbit allows it to rely on solar, rather than battery, power and provides satellite overpasses at the same local mean time. RADARSAT-2 is planned for launch in 2001.

The Synthetic Aperture Radar (SAR) sensor on RADARSAT-1 can be directed from an incidence angle of 10 to 60 degrees, in swaths of 45 to 500 km in width. This produces image resolutions ranging from 8 to 100 meters. RADARSAT-1 has a repeat cycle of 24 days, but covers the Arctic daily and can reach any part of Canada in three days. Using the 500 km swath width, equatorial coverage can be repeated every six days.

SAR Characteristics: RADARSAT-1 operates in the C-band at a frequency of 5.3GHz and a wavelength of 5.6 cm. The antenna polarization is HH, meaning that the system transmits and receives energy in the horizontal plane.

II.4.6.2 Data Format and Availability

RADARSAT International handles the commercial distribution of RADARSAT images. Images cost several thousand dollars apiece. Specific missions can be planned to acquire images at required locations, times, and resolutions for an additional fee. RADARSAT International can be contacted at: http://www.rsi.ca/pricelist/price.htm

The scientific and educational community can acquire low-cost imagery from the NASA funded Alaska SAR Facility (ASF) housed at the University of Alaska Fairbanks. To learn more about the ASF and data availability, check out their web site at: http://www.asf.alaska.edu/

II.4.6.3 Key References

On the web at: http://radarsat.space.gc.ca/

II.4.7 JERS

II.4.7.1 Historical, Orbital and Resolution Overview:

The Japan Earth Resources Satellite (JERS-1) was launched on 11 February 1992. It acquired images from 24 August 1992 to 31 December 1996. JERS-1 has an orbital altitude of 570 km and an inclination of 98 degrees. It has a repeat cycle of 44 days and operated in a sun-synchronous orbit.

The SAR sensor on JERS-1 has an off-nadir angle of 35 degrees, with a swath width of 75 km. This produces an image resolution of 18 meters. JERS-1 operates in the L-band at a frequency of 1275 MHz.

JERS-1 also carries an optical sensor package, OPS. OPS has a 75 km swath width and a pixel resolution of 18 m x 24 m. The sensor captures reflected energy in seven spectral bands from the visible to mid-infrared wavelengths. It can also produce stereoscopic images.

band 1: (green, 0.52-0.60 µm). This corresponds to the green reflectance of chlorophyll in healthy vegetation.
band 2: (red, 0.63-0.69 µm). This band is useful for distinguishing plant species.
band 3: (near IR, 0.76-0.86 µm). This band is especially sensitive to plant biomass.
band 4: (near IR, 0.76-0.86 µm). This band operates at the same wavelength as band 3 but is aimed 15.3 degrees forward to produce stereoscopic images.
band 5: (mid IR, 1.60-1.71 µm). This band is sensitive to plant water content.
band 6: (mid IR, 2.01-2.12 µm). This band is used to map geologic formations and is responsive to plant and soil moisture.
band 7: (mid IR, 2.13-2.25 µm). This band is used to map geologic formations and is responsive to plant and soil moisture.
band 8: (mid IR, 2.27-2.40 µm). This band is used to map geologic formations and is responsive to plant and soil moisture.

II.4.7.2 Data Format and Availability

Information regarding data availability and cost can be obtained at the Earth Observation Center of the National Space Development Agency of Japan at the following website: http://www.eoc.nasda.go.jp/homepage.html

II.4.7.3 Key References

On the web at: http://www.eoc.nasda.go.jp/guide/satellite/sat_menu_e.html

II.4.8 Terra

II.4.8.1 Historical, Orbital and Resolution Overview:

On 18 December 1999 the Terra spacecraft (formerly known as EOS AM-1) was launched. Terra has an orbital height of 705 km in a sun-synchronous, near-polar orbit. The spacecraft carries 5 new sensor packages to study the earth's surface and atmosphere. For the latest updates, and a great deal more information on the Terra mission, check out the NASA web site at: http://terra.nasa.gov/

The following is a brief synopsis of each of the sensor packages:

II.4.8.2 ASTER - Advanced Spaceborne Thermal Emission and Reflection Radiometer

ASTER is designed to obtain high spatial resolution global, regional, and local images of the Earth. This sensor records 14 spectral bands of data ranging from visible, through short-wave infrared, to thermal. An ASTER scene covers an area of approximately 60 km by 60 km and data is acquired simultaneously at three resolutions. It has a spatial resolution of between 15 and 90 meters and is capable of 3-D stereoscopic viewing. It is anticipated that within 2 years a global 30-meter elevation model will be created using this package.

Bands 1 through 3 have a spatial resolution of 15m and cover the visible and near IR portions of the spectrum. Two receivers operate in the near IR wavelength, one pointing to nadir, one pointing backwards to produce stereoscopic images. Bands 4 through 9 operate in the short-wave IR portion of the spectrum and have a spatial resolution of 30m. This portion of the ASTER sensor failed in May 2008. Bands 10 through 14 operate in the thermal IR portion of the spectrum and have a spatial resolution of 90m. See the Appendix of the CEO online ASTER document for specific information about the wavelengths for each band of data.

For additional information about this sensor visit the ASTER web site at: http://asterweb.jpl.nasa.gov/. For information about locating and importing ASTER data see the FAQ section of the CEO website.

II.4.8.3 CERES - Clouds and the Earth's Radiant Energy System

CERES will be used to measure solar-reflected and Earth-emitted radiation at the Earth's surface and the top of the atmosphere, from which the earth's radiation balance can be computed daily. More information can be found at the CERES web site at: http://asd-www.larc.nasa.gov/ceres/ASDceres.html

II.4.8.4 MISR - Multi-angle Imaging Spectro-Radiometer

This package features nine cameras pointing at various angles through the atmosphere. It will capture four bands of data in the red, green, blue, and near-infrared portions of the spectrum. It is designed to determine the amount, type and height of clouds and to measure atmospheric aerosol particles. For specific information about MISR, visit the web site at: http://www-misr.jpl.nasa.gov/

II.4.8.5 MODIS - MOderate-resolution Imaging Spectro-radiometer

This package captures 36 bands of data, in the visible and IR portions of the spectrum, at a spatial resolution of between 250m and 1km. It is designed to provide coverage of the Earth's land, oceans, and atmosphere. It will provide global coverage every two days. For specific information about this sensor visit the MODIS web site at: http://modis.gsfc.nasa.gov/. For information about locating and importing MODIS data see the Online Documentation section of the CEO website.

II.4.8.6 MOPITT - Measurements Of Pollution In The Troposphere

This package is designed to observe carbon monoxide and methane in the lower atmosphere and their interaction with the land and oceans. It has a spatial resolution of 22 km and a swath width of 640 km. More information can be obtained at the MOPITT web site.

II.4.9 Aqua

II.4.9.1 Historical, Orbital and Resolution Overview:

The Aqua spacecraft (formerly known as EOS PM) was launched on May 24, 2002. Its mission is to study the Earth's water cycle, including evaporation, water vapor, clouds, ice, snow, and soil moisture. Like its sister satellite Terra, Aqua has an orbital height of 705 km in a sun-synchronous, near-polar orbit. It crosses the equator 4 hours later than Terra, at approximately 2:30 PM local time. The spacecraft carries 6 sensor packages. For the latest updates, and a great deal more information on the Aqua mission, check out the Aqua project web site.

Aqua carries the same MODIS and CERES sensors as Terra that are described above. It also carries four new sensors described briefly below:

II.4.9.2 AIRS - Atmospheric InfraRed Sounder

This package is designed to produce accurate atmospheric temperature profiles, even in the presence of clouds. It features 2378 infrared channels and 4 visible/near-infrared channels. You can learn more at the AIRS website.

II.4.9.3 AMSR-E - Advanced Microwave Scanning Radiometer for EOS

This is a 12-channel, all-weather, passive microwave sensor designed to measure precipitation rate, sea surface winds and temperature, water vapor, and ice. The ability to measure these geophysical parameters will contribute to our understanding of the Earth's climate. You can learn more about AMSR-E on the Internet.

II.4.9.4 AMSU - Advanced Microwave Sounding Unit

This 15-channel sensor is designed to create upper-atmosphere temperature profiles. It has a cloud-filtering capability and can measure temperature at five different levels simultaneously. You can learn more about AMSU at the AeroJet site.

II.4.9.5 HSB - Humidity Sounder for Brazil

This is a 4 channel sounder developed by Brazil. It is designed to obtain humidity profiles throughout the atmosphere. You can learn more at the HSB web site.

II.4.10 SeaWiFS

II.4.10.1 Historical, Orbital and Resolution Overview:

SeaWiFS stands for the Sea-viewing Wide Field-of-view Sensor. The objective of this project is to gather data on global ocean bio-optical properties. Various types and quantities of marine phytoplankton can be identified by observing subtle changes in the ocean's color. This ocean color data contributes to the study of ocean primary production and biogeochemistry.

The SeaStar spacecraft carrying the SeaWiFS instrument was launched on 1 August 1997. It has a 705km, sun-synchronous orbit. It features a spatial resolution of 1.1km and a nominal swath width of 2,800 km providing daily coverage of the world's oceans. The SeaWiFS instrument records information in 8 bands of approximately 20nm in width ranging from 400nm to 885nm. Find out more about the SeaWiFS project at the following web site: http://seawifs.gsfc.nasa.gov/SEAWIFS.html

SeaWiFS data is continuously being evaluated, recalibrated, and refined. As of November 1999 the third phase of data reprocessing was being finalized. Details of the reprocessing can be found at http://www.yale.edu/ceo/Documentation/sea_reproc.html

III. Selected Composite Datasets

Various datasets processed and compiled by national data centers are opening up previously unexplored areas of satellite research. With its frequent overpasses and wide swath, the AVHRR can image the entire globe daily. Datasets compiled from these daily images can be used to look at temporal changes in reflectance and are very important for vegetation monitoring and land use classification. Daily ground-based observations can also be obtained for variables such as meteorological conditions and agricultural statistics, which can be combined with satellite data. This section describes composite datasets of satellite imagery that are available in the CEO facilities. Websites for some important sources of ground-based data are listed at the end of the section.

III.1 1 Km Composite AVHRR Datasets:

In 1990, the EROS Data Center (EDC) located in Sioux Falls, South Dakota, began producing maximum NDVI composites for the conterminous United States. The maximum NDVI compositing process examines the NDVI value of each pixel from each daily pass of the compositing period and then selects the maximum NDVI value (Holben, 1986). The result is a composite image representing the maximum vegetation 'greenness' for that period. The length of the compositing period, usually weekly, biweekly, or decadal (10 days), is selected to best suit the seasonal characteristics and the phenology of the region being studied. In addition to selecting the maximum NDVI, compositing reduces the effects of cloud contamination and some atmospheric conditions that unnecessarily lower the signal. Currently, there are two major AVHRR composite datasets with 1 km resolution produced by the EDC: the Conterminous U.S. AVHRR Data Set and the Global 1 km AVHRR Data Set; these datasets are summarized in Table 4.
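Before turning to the table, the compositing rule itself is simple enough to sketch in a few lines of Python. This is an illustration only; the array names and shapes are assumptions, not part of any EDC product, and the real processing chains (Table 4) wrap calibration, registration, and geometry steps around this core:

    import numpy as np

    def ndvi(nir, red):
        # NDVI = (NIR - Red) / (NIR + Red), with a guard against zero sums
        return (nir - red) / np.clip(nir + red, 1e-6, None)

    def max_ndvi_composite(nir_stack, red_stack):
        # nir_stack, red_stack: (days, rows, cols) daily reflectance grids.
        # Returns the per-pixel maximum NDVI for the compositing period
        # and the index of the day each maximum came from.
        ndvi_stack = ndvi(nir_stack, red_stack)
        best_day = np.argmax(ndvi_stack, axis=0)      # winning day per pixel
        rows, cols = np.indices(best_day.shape)
        return ndvi_stack[best_day, rows, cols], best_day

In a real product, the best_day index is used to carry all channels (and the viewing geometry) from the winning day into the composite, which is why the EDC composites include the acquisition date as a layer.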

Table 4: 1 Km Composite AVHRR Datasets

Conterminous US Biweekly
  Who: EDC (Eidenshink, Weinheimer, Madigan)
  Area: Conterminous US
  Spectral Res.: Calibrated AVHRR channels 1-5, NDVI, 3 solar geometry measures, date
  Spatial Res.: 1 km
  Temporal Res. (Composite Period): 14 days
  Radiometric Res.: Raw 1 & 2: 0.5% reflectance (8 bit); Raw 3-5: 0.5 K (8 bit); NDVI: 0.01 (8 bit); geometry: 1 degree (8 bit)
  Map Projection: Lambert Azimuthal Equal Area
  Size of 1 full image: 13 MB
  Composite Technique (1989, 1992-95): 1. Calculate viewing geometry; 2. Raw -> radiance; 3. Calculate NDVI; 4. Geometric registration; 5. Max NDVI composite
  Composite Technique (1990, 1991): 1. Raw -> radiance; 2. Calculate viewing geometry; 3. Geometric registration; 4. Compute NDVI; 5. Max NDVI composite
  Composite Dates: 1989 - 1995 (winter discontinuities)
  Status/Availability: All available on CD-ROM
  Satellite(s): 1989-94: NOAA 11; 1995: NOAA 14

Global 1 Km
  Who: EDC (Eidenshink, Faundeen) and foreign ground stations
  Area: Global
  Spectral Res.: Calibrated AVHRR channels 1-5, NDVI, 3 solar geometry measures, date
  Spatial Res.: 1 km
  Temporal Res. (Composite Period): 10 days, 3 per month (01-10, 11-20, 21-end of month)
  Radiometric Res.: Raw 1 & 2: 0.1% reflectance (16 bit); Raw 3-5: 0.17 K (16 bit); NDVI: 0.01 (8 bit); geometry: 1 degree (8 bit)
  Map Projection: Goode's Interrupted Homolosine
  Size of 1 full image: 694 MB (8 bit); 1388 MB (16 bit)
  Composite Technique: 1. Raw -> radiance; 2. Calculate viewing geometry; 3. Geometric registration; 4. Compute NDVI; 5. Max NDVI composite; 6. Atmospheric corrections: a) Rayleigh scattering (Teillet 90), b) Ozone (Teillet 91)
  Composite Dates: April 1992 - September 1996
  Status/Availability: April 92 - Sep 93 and Feb 95 - Dec 95 available via ftp
  Satellite(s): 1989-94: NOAA 11; 1995: NOAA 14


III.2 1 Km Land Cover Characteristics Data Bases:

The U.S. Geological Survey (USGS) and the University of Nebraska-Lincoln have classified the temporal AVHRR composites described above to derive land cover data bases for the conterminous US and for the world. The classification has been performed in several different ways to produce results comparable to previous studies on a smaller scale. See Table 5 for a comparison of the two datasets and the several products within each.


Table 5: Land Cover Databases Derived from Temporal NDVI Classifications

Conterminous US Land Cover Characteristics
  Who: USGS & U. Nebraska - Lincoln
  Area: Conterminous US
  Raw Data: Conterminous US biweekly AVHRR NDVI composites, further composited to 1 month; 8 bands: March - October 1990
  Spatial Res.: 1 km
  Original Classification: Seasonal land cover, 159 classes
  Derived Classifications: USGS LULC (Anderson level II): 26 classes; Simple Biosphere Model: 20 SiB classes + 7 mosaic classes; Biosphere/Atmosphere Transfer Scheme: 19 BATS classes + 9 mosaic classes
  Derived Summary of Seasonal Characteristics: Onset of greenness, peak of greenness, duration of greenness, vegetation characteristics (?), perennial/annual image, leaf longevity (evergreen vs. deciduous)
  Map Projection: Lambert Azimuthal Equal Area
  Status (9/96): Complete

Global Land Cover Characteristics
  Who: USGS & U. Nebraska - Lincoln
  Area: Global
  Raw Data: Global 1 km 10-day NDVI composites, further composited to 1 month; 12 bands: April 1992 - March 1993
  Spatial Res.: 1 km
  Original Classification: Seasonal land cover, 205 classes
  Derived Classifications: Global Ecosystems: 94 classes; IGBP LCC: 17 classes; USGS LULC (Anderson level II): 27 classes; Simple Biosphere Model: 20 classes; Biosphere/Atmosphere Transfer Scheme: 20 classes
  Derived Summary of Seasonal Characteristics: Onset of greenness, rate of greenup, peak of greenness, senescence, rate of senescence, duration of greenness, time-integrated NDVI
  Map Projection: Goode's Interrupted Homolosine
  Status (9/96): NA complete; SA & Africa "coming soon"


III.3 Pathfinder AVHRR Land Data:

Produced and distributed as part of NASA's Mission to Planet Earth project, this data set includes global 10-day and monthly composites at 8 km and 1 degree resolution from 1982 to 1992. The data was collected by AVHRR sensors on board NOAA series satellites and includes all AVHRR channels as well as additional elevation and georeferencing data.

III.4 USGS 3 Arc Second Digital Elevation Model:

This DEM was assembled by the USGS and Defense Mapping Agency. The data is distributed in latitude longitude format in one degree squares that correspond to USGS 1:250,000 quads. These can be readily converted to ERMapper format with a pixel size of 60 to 100 meters, depending on latitude. The CEO has a 9-CD set of US DEM data that can be read directly by ERMapper. The continental US, Hawaii, and Puerto Rico have 3 arc-second (approx 100 meter) resolution. Alaska has a 6x12 arc-second (approximately 150 to 250 meter) resolution. This set also includes a single scene continental US at 9 arc-seconds (approximately 300 meter) resolution.

III.5 Digital Chart of the World DEM:

Digital Chart of the World DEM data includes 30 by 30 arc-second digital elevation data derived from the Defense Mapping Agency's 1:1,000,000 scale DCW contour and hydrology data. The goal for the project at EROS Data Center is to create DEMs for the entire globe. Currently completed areas include Africa, North America, Japan, Madagascar, and Haiti.

III.6 Key References:

Eidenshink, L.C.,1992, The 1990 conterminous U.S. AVHRR data set. Photogrammetric Engineering & Remote Sensing 58 (6) pp. 809-813.
Eidenshink, L.C., 1994, The 1 km AVHRR global land data set: first stages in implementation. International Journal of Remote Sensing 15 (17) pp. 3443-3462.
Holben, B.N., 1986, Characteristics of maximum-value composite images from temporal AVHRR data. International Journal of Remote Sensing 7 (11) pp. 1417.

USGS Global Land Information Center: http://edcwww.cr.usgs.gov/webglis

Mission to Planet Earth Scientific Data: http://www.hq.nasa.gov/office/mtpe/

* Additional source information can be found via the World Wide Web as well as in the "Climate Image Datasets" black binder available in the CEO lab.

IV. Tools of Image Analysis

This introduction to the basic tools of image analysis outlines the basic image processing operations discussed in class and explored in lab. Most of these topics and many more are covered in much greater detail in your textbook.

IV.1 Geometric Corrections

Often, raw sensor data contains too many distortions to be used as a map, that is, to derive accurate area and/or distance measurements. The process of correcting these errors is known as geometric correction.

The term georeferencing refers to the process of assigning map coordinates to specific pixels in a raster dataset. If the image itself is already in a known map projection, the values of individual pixels need not be altered. Instead, one simply assigns a coordinate to each pixel, which retains its original data value.

In certain cases, however, the image will not already be in the desired map projection. The process of projecting raster data onto a plane and forcing it to conform to a given map projection is known as rectification. During the rectification process, data values from the original raster grid must be interpolated onto the new, rectified grid. The method used to assign these values is known as resampling or warping. Depending on the situation, one might use any of several common resampling methods (nearest neighbor, bilinear interpolation, or cubic convolution) to accomplish a given rectification.

For the most part, rectification by definition involves georeferencing, since all map projections are associated with a coordinate system. However, in some cases one might only care whether the image in question aligns with the grid of another raster dataset, not whether the image is in any given projection. In a case such as this, rather than assigning map coordinates to image pixels, one assigns image coordinates to the pixels and then performs a warp (rectification process). This process, aligning the grids of two raster datasets, is known as registration and is necessary for combining datasets of different types (for example, image and topography), as well as for change detection. Image-to-image registration only involves georeferencing if the reference image (not the one being warped) is already georeferenced.

IV.1.1 Rectification:

Williams describes three basic rectification techniques: the basic orbital model, ground control point techniques, and the advanced geocoding model. The basic orbital model, which is most often used in bulk correction of many satellite images, can produce images with fairly good relative accuracy, but generally with noticeable errors in absolute accuracy. The advanced geocoding model produces the best results, but it requires orbital, sensor geometry, and terrain inputs, which makes it a cumbersome and complex task. A widely used compromise for many applications is the basic ground control point technique. There are four basic steps to the rectification process when performed in this manner; they apply whether the image is being registered to a map projection or to another image. Figure 13 is a graphic representation of the process, and a code sketch follows the list below.

  1. First, ground control points (GCP's) are assigned to certain pixels in the original image. GCP's are pixels which are identifiable as specific points with known coordinates.
  2. These GCP's are located on the desired output grid (located in a given map projection).
  3. The original image and the output grid are 'overlaid' so the GCP's of both line up. This is an idealized view of what happens: in practice, a computer computes how the original image needs to be stretched/rotated to make the GCP's line up, then derives a formula describing how much each part of the image needs to be stretched to conform to the given projection.
  4. Finally, the value of each of the pixels on the new grid is calculated based on the new pixel's proximity to an old pixel.
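As a concrete illustration of steps 1-4, the following Python sketch fits a first-order (affine) polynomial to a set of GCP pairs and resamples by nearest neighbor. It is a minimal sketch under simplifying assumptions (output grid coordinates are treated directly as map coordinates), not ERMapper's or any vendor's algorithm; all names are illustrative, and production software adds higher-order polynomials, residual reporting, and more resampling choices.

    import numpy as np

    def fit_affine(gcp_image_xy, gcp_map_xy):
        # Least-squares affine transform mapping output (map) coordinates
        # back to input (image) pixel coordinates, from matched GCP pairs.
        src = np.asarray(gcp_image_xy, dtype=float)       # (n, 2) image pixels
        dst = np.asarray(gcp_map_xy, dtype=float)         # (n, 2) map coords
        A = np.column_stack([dst, np.ones(len(dst))])     # rows of [x, y, 1]
        coeffs, *_ = np.linalg.lstsq(A, src, rcond=None)  # (3, 2) matrix
        return coeffs

    def warp_nearest(image, coeffs, out_shape):
        # Step 4: for each output pixel, back-project into the original
        # image and take the value of the nearest input pixel.
        rows, cols = np.mgrid[0:out_shape[0], 0:out_shape[1]]
        out_xy1 = np.column_stack([cols.ravel(), rows.ravel(),
                                   np.ones(rows.size)])
        src = out_xy1 @ coeffs                            # input pixel coords
        c = np.clip(np.rint(src[:, 0]).astype(int), 0, image.shape[1] - 1)
        r = np.clip(np.rint(src[:, 1]).astype(int), 0, image.shape[0] - 1)
        return image[r, c].reshape(out_shape)

Nearest-neighbor resampling preserves the original data values (important if the image will later be classified); bilinear and cubic methods produce smoother images but alter the radiometry.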

Data that has been rectified to a particular projection is known as geocoded.

IV.1.2 Map Projections in a Nutshell:

The subject of map projections is quite extensive and only the very basics will be discussed here. For an excellent review of both basic concepts and specifics, see John Snyder's book, Map Projections - A Working Manual.

IV.1.2.1 What and Why?

Snyder defines a map projection as a "systematic representation of all or part of the surface of a round body, especially the Earth, on a plane." In other words, a map projection defines a relationship between coordinates on a flat body (in a plane, on a map) and coordinates on a round body (on a sphere, on the surface of the Earth). Map projections are coordinate transformations that are necessary when using a planar representation (a map) of something that is three dimensional (the Earth).

Unfortunately, it is impossible to create a completely distortion-free planar representation of a round object. Compromises must be made among the accuracy of area, shape, scale, and direction as represented on the map. Different applications require different balances of these factors, and in some cases demand that the map satisfy some additional functional characteristic, plotting great circles as straight lines, or maintaining constant scale along a satellite ground track, for example. The wide variation in uses of maps has led to the development of many different map projections, which are commonly grouped by the property they preserve: equal-area projections preserve area, conformal projections preserve local shape (angles), equidistant projections preserve scale along particular lines, and azimuthal projections preserve true direction from a central point.

IV.1.2.2 Spheroids, Ellipsoids, Datums and Projections:

As described in the last section, a map projection defines a mathematical relationship between points on a plane and those on a round body. For complex round bodies this relationship can get very detailed and difficult to compute, even with current high-speed computers. A simple sphere is easy to define mathematically; a spheroid or ellipsoid is slightly more difficult; but imagine how complicated it would be to define a function that would fit the Earth's topography. Fortunately, the Earth's shape is fairly close to ellipsoidal, so for practicality's sake we approximate the shape of the Earth with a sphere or ellipse when calculating map projection relationships.

Just as one chooses a given map projection for a given application, one must choose a particular sphere or ellipse to approximate the Earth for each application. Differently shaped spheres and ellipses fit the actual shape of the Earth better in some places than in others. Some are very close fits in one place, but not in another, while some are pretty good approximations everywhere, but don't really fit perfectly anywhere. When a particular sphere or ellipse is "pinned down" to a particular point on the surface of the Earth by specifying a tie point, or point of tangency, the pair is known as a datum. A specific datum and map projection pair are required to compute the transformation between latitude/longitude and map coordinates.

IV.1.2.3 Processing Considerations for Ordering Images:

When ordering satellite image products, a user will generally have a choice among several levels of pre-processing (corrections performed to the digital data before delivery to the user). In general, you will have to pay more for higher levels of processing and/or for the option of choosing what processing is actually applied. The user typically has no control over the preprocessing or format of data obtained for free over the Internet, for example, but has much more control of this when purchasing the data from a private company. The following list presents several general levels of processing that are typically available.

  1. The lowest level of processing one could obtain is the direct transmission from a given satellite. In many cases this signal is encrypted to prevent users from stealing data, but in others (GOES or AVHRR HRPT, for example) the data stream is considered public domain. Anyone with the appropriate hardware and software can legally receive the transmission. This method is only practical for users requiring large volumes of data.
  2. The next level of processing that is typically available is the raw signal to which certain systematic corrections have been made - corrections for known errors induced by the sensor's optics, for example. This is what most users (those without their own receiving stations) would consider "raw".
  3. The simplest geometric correction typically applied to imagery is based on knowledge of where the satellite was and where the sensor was pointing when the data were acquired. This type of correction involves warping the image (see the section IV.1.1 Rectification). Usually the data are warped to a given map projection, resulting in a geometrically correct image that has associated coordinate information that is typically accurate to within several pixels. [5]
  4. The next level of processing improves the navigational accuracy of the image by using GCP's to perform the geometric corrections. This correction also requires warping the image, and produces images that are often navigated to within a pixel, but that can be in error up to a few pixels.
  5. Further corrections can be made to the image by removing the effects of topography, sun angle, atmospheric effects, and other random noise. Images processed to this level may be navigated with a high degree of precision.

Most typical CEO users are likely to acquire images processed to levels 2, 3 or 4. [6] When purchasing imagery, consider the following trade-off: geometric fidelity and navigational accuracy are obtained by warping the image, which degrades the radiometric accuracy of the resulting image. Conventional wisdom states that applications requiring very accurate classifications or conversion of digital numbers to physical values (radiance, albedo, temperature, etc.) obtain better results by performing these operations on "raw" data (level 2) and then performing geometric corrections, as opposed to deriving quantitative data from already geometrically correct images. On the other hand, accurate geometric correction is both time consuming and difficult. Those experienced with a particular type of imagery are likely to acquire the closest thing to raw data possible, preferring to perform all the corrections based not only on the published calibration information, but also on vast stores of personal experience. After much hard work, such an approach is likely to yield better results and more accurate answers than one based on corrections applied by the vendor. However, much experience with a particular data type, and with satellite image processing techniques in general, is required to make this approach pay off. Applications that can afford reduced radiometric fidelity (and the higher cost of corrected imagery), or that lack the requisite ability/time/experience to perform all the corrections, may be better served by purchasing imagery with higher levels of pre-processed geometric correction.

IV.1.2.4 Common Satellite Mapping Projections:

A satellite image may be rectified to any map projection, given appropriate software that can warp images and that contains the formulas describing the projection, spheroid, and datum. Fundamentally, the choice of projection depends on the needs of the particular application. In some cases the actual choice of projection is not important; rather, the overriding priority is to place all the data related to a project in the same coordinate system, whatever that may be. If there are no compelling reasons to select one projection over another, the UTM (see below) is frequently a good choice simply because it is quite common. See section IV.1.2.1 What and Why? above for more information on the factors involved in this decision. Most vendors that offer the option of geometric correction will also offer a choice of several different map projections. A few of the most commonly offered projections are described very briefly here; see Snyder for details.

IV.1.2.4.1 The Universal Transverse Mercator (UTM) Projection:
The Mercator projection is formed by projecting features on the surface of the Earth, from its center, onto a cylinder that is tangent at the equator. Scale and area are increasingly distorted from the equator to the poles, but the projection is conformal, so local angles are preserved everywhere. A slight modification of the Mercator is the Transverse Mercator, which uses a meridian as the tangent great circle instead of the equator. The properties are the same as the Mercator, except scale and area are distorted in an east-west direction away from the meridian instead of in a north-south direction away from the equator. By choosing a meridian near the area of interest, the Transverse Mercator may be used to map areas of predominantly north-south extent with low distortion.

The UTM is a series of Transverse Mercator projections established by the U.S. Army in 1947 to provide a standard for worldwide mapping. Under this system, the world is divided into sixty zones, each spanning six degrees of longitude. Distances in the x-direction (Eastings) and in the y-direction (Northings) are measured from the origin of each zone (where the zone's central meridian intersects the equator). Table 6 lists the UTM zones.

The UTM does not have a preferred datum, and is commonly used with whatever datum best approximates the region being mapped. Snyder [7] points out that the U.S.G.S. uses the Clarke 1866 ellipsoid for all land under U.S. jurisdiction except Hawaii where the International ellipsoid is used. In ERMapper, use the "NAD27" datum for the Clarke 1866 ellipsoid. Most of Connecticut is in UTM Zone 18 with a small portion of eastern Connecticut in UTM Zone 19. Connecticut data will typically use the "NAD27" datum with the Clarke 1866 ellipsoid.
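As an aside, for checking coordinates outside of ERMapper, a modern open-source library such as pyproj can perform the geographic-to-UTM transformation. This is an illustration, not part of the CEO software suite; EPSG code 26718 denotes NAD27 / UTM Zone 18N on the Clarke 1866 ellipsoid.

    from pyproj import Transformer

    # Geographic (lon/lat, WGS84) -> NAD27 / UTM Zone 18N (Clarke 1866)
    to_utm = Transformer.from_crs("EPSG:4326", "EPSG:26718", always_xy=True)
    easting, northing = to_utm.transform(-72.93, 41.31)   # near New Haven, CT
    print(round(easting), round(northing))

Note that the transformation includes a datum shift from WGS84 to NAD27, which moves coordinates by tens of meters in Connecticut; choosing the wrong datum is a common source of misregistration.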

Table 6: Universal Transverse Mercator zones, central meridians, and longitude ranges.
All values are listed in full degrees east (E) or west (W) from the Greenwich prime meridian (0). [From ERDAS, Inc., 1991]

Zone Central Meridian Longitude Range Zone Central Meridian Longitude Range
1 177W 180-174W 31 3E 0-6E
2 171W 174-168W 32 9E 6-12E
3 165W 168-162W 33 15E 12-18E
4 159W 162-156W 34 21E 18-24E
5 153W 156-150W 35 27E 24-30E
6 147W 150-144W 36 33E 30-36E
7 141W 144-138W 37 39E 36-42E
8 135W 138-132W 38 45E 42-48E
9 129W 132-126W 39 51E 48-54E
10 123W 126-120W 40 57E 54-60E
11 117W 120-114W 41 63E 60-66E
12 111W 114-108W 42 69E 66-72E
13 105W 108-102W 43 75E 72-78E
14 99W 102-96W 44 81E 78-84E
15 93W 96-90W 45 87E 84-90E
16 87W 90-84W 46 93E 90-96E
17 81W 84-78W 47 99E 96-102E
18 75W 78-72W 48 105E 102-108E
19 69W 72-66W 49 111E 108-114E
20 63W 66-60W 50 117E 114-120E
21 57W 60-54W 51 123E 120-126E
22 51W 54-48W 52 129E 126-132E
23 45W 48-42W 53 135E 132-138E
24 39W 42-36W 54 141E 138-144E
25 33W 36-30W 55 147E 144-150E
26 27W 30-24W 56 153E 150-156E
27 21W 24-18W 57 159E 156-162E
28 15W 18-12W 58 165E 162-168E
29 9W 12-6W 59 171E 168-174E
30 3W 6W-0 60 177E 174-180E

IV.1.2.4.2 The Hotine Oblique Mercator (HOM) Projection:
The Oblique Mercator projection is a modification of the Mercator which allows the projection cylinder to lie tangent to any great circle, not just the equator or a meridian. Hotine approximated the ground track of Landsat satellites with a great circle in order to map a continuous swath of imagery with minimal distortion. In fact, the ground track of a sun-synchronous satellite is sinusoidal, not circular, so Hotine divided the Earth into five latitudinal zones [8], each using a different inclination for its tangent great circle. Within each zone, the inclination of the tangent great circle is fixed, but its central latitude and longitude change with each path. ERMapper supports the Oblique Mercator projection, but does not include parameters for all combinations of latitudinal zones and paths. Currently, only zone 2, path 38 is installed (projection OBZ2P38), but others can be added if necessary. See Snyder's book or appendix D of the Landsat Data Users Handbook for more detail on the Oblique Mercator projection and its uses for satellite mapping.

IV.1.2.4.3 The Space Oblique Mercator (SOM) Projection:
The HOM has two drawbacks. First, the actual ground track of a satellite in an orbit inclined to the equator is not straight; it curves due to the Earth's rotation as the satellite passes overhead. This introduces scale errors because the inclination chosen for the tangent great circle is essentially a best fit, or average, of the inclination of the satellite's ground track. The magnitude of this problem is reduced by using zones and assigning a different inclination to the tangent circle within each zone; the tangent circle's inclination within a zone then better approximates the ground path. However, the use of zones introduces the second limitation of the HOM: the inability to continuously map the satellite's path due to discontinuities at the zone boundaries.

The SOM was designed to overcome these limitations by introducing time as a projection parameter. This approach allows the spacecraft-Earth geometry to change over the course of an orbital period and permits the central line of the projection to be curved rather than straight. The projection is not perfectly conformal, but errors are extremely small within the (~185 km) swath width of the Landsat satellites. ERMapper does not support the SOM because of the projection's complexity. This is a problem because many of the scenes in the CEO's archive have been processed to the SOM projection and are therefore not navigable using ERMapper without rewarping to a different projection using GCPs. See Snyder's book or appendix D of the Landsat Data Users Handbook for formulae and a more complete description of this important projection. It may be possible to import these images using ERDAS Imagine, reproject the image to a more standard projection, and then convert the image to ERMapper format for subsequent processing.

IV.1.3 Key References:

ERDAS, Inc., ERDAS Field Guide, Atlanta, GA, 1991, 394 pp.
Snyder, John P., Map Projections - A Working Manual, U.S. Geological Survey Professional Paper 1395, U.S. Government Printing Office, Washington, D. C., 1987, 383 pp.
U.S. Geological Survey (USGS), Landsat Data Users Handbook Revised, USGS, Sioux Falls, SD, 1979.
Williams, Jonathan, Geographic Information from Space, Processing and Applications of Geocoded Satellite Images, Chichester: John Wiley & Sons, 1995.

IV.2 Image Enhancement

IV.2.1 Contrast Enhancement

Contrast enhancement is the process of stretching out the values of a given dataset to take advantage of all possible values of a display. For example, let's say you have a dataset with values ranging from 1-10 and a monitor that can display 256 colors. If you assign all the data points with values less than 1 color 1, all the points with values between 1-2 color 2 and so forth, you are not making use of the full potential of your display. Values that are similar, 1 and 1.5 for example, will be impossible to distinguish using this scheme. Contrast stretching spreads out the data values on the display to convey more information in the image. In the above example, we might assign data points with values 1-1.1 color 1 and values 1.1-1.2 color 2 and so forth. Therefore we use more (or all) of the available colors and are able to pick out subtle detail in the image.

Contrast stretching is important because regions that are spatially contiguous often have similar spectral signatures. Take for example a desert, ice cap, or ocean. Data points across the image will all have very similar values, and will result in a bleak image unless the contrast between these values is enhanced.

Figure 14, figure 15, and figure 16 graphically illustrate a few contrast enhancement strategies. We use a plot with a double y-axis to illustrate the contrast enhancement. [9] The x-axis is simply all possible values of the original data. In these examples, the original data ranges from 0-10. One y-axis represents the frequency that each x-axis value occurs in the original dataset (the number of pixels with each x-axis value in the dataset). This generates a histogram of dataset values which is plotted as a filled in curve. The second y-axis lists the possible range of output values, in these cases 0-255. The second curve on each of these plots, which does not have the area underneath it filled in, represents the transform from input values to display values. The linear transform stretches a range of given input values equally. The histogram equalization function stretches the input values more in the ranges that have a higher concentration of pixels. The threshold function takes all input pixels below a certain value and assigns them the same output value, and all input pixels above that value to a different output value.
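The three transforms in these figures can be sketched directly in Python. This is a minimal illustration, assuming a floating-point input array and a 0-255 display range; the function names are invented for the example.

    import numpy as np

    def linear_stretch(data, out_max=255):
        # Map [min, max] of the data linearly onto [0, out_max]
        lo, hi = data.min(), data.max()
        return ((data - lo) / (hi - lo) * out_max).astype(np.uint8)

    def histogram_equalize(data, out_max=255, bins=256):
        # Stretch more where pixels are concentrated: output follows the CDF
        hist, edges = np.histogram(data, bins=bins)
        cdf = hist.cumsum() / hist.sum()
        idx = np.digitize(data, edges[1:-1])      # bin index of each pixel
        return (cdf[idx] * out_max).astype(np.uint8)

    def threshold(data, cutoff, low=0, high=255):
        # Two output values: below the cutoff, and at or above it
        return np.where(data < cutoff, low, high).astype(np.uint8)

For the 0-10 example above, linear_stretch maps values of 1.1 and 1.2 to distinguishable display values, which is exactly the subtle detail the raw one-color-per-unit assignment would hide.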

IV.2.2 Spatial Enhancement

Spatial enhancement is a broad term covering techniques used either to emphasize (interesting) features or to de-emphasize (annoying) features in the data. For example, one might want to enhance sudden changes in dataset values to sharpen detail at water/land or soil/crop boundaries, or to smooth out speckled noise.

To perform these enhancements, one operates on the entire dataset with a filter (kernel in ERMapper) using a process called convolution. A filter is simply a small array of numbers carefully chosen to relate a pixel to its neighbors in a certain way. Convolving the filter with the dataset means multiplying each value in the filter by its corresponding value in the dataset, summing the results, and dividing by the sum of the values in the filter. This process of applying a filter to a dataset is often called the "sliding window" method, because you center the filter (window) on a pixel, convolve it with the dataset values falling within the window, then slide the filter to center on the next pixel. Convolution is not the same as matrix algebra, because one computes a new value for only one pixel at a time, regardless of the size of the filter.

An example helps. In this case consider a 3x3 piece of a much larger dataset which contains a locally high value surrounded by smaller values:

Original Pixel and Neighbors
10 12  9
11 20  9
 9 10 10

We'll convolve this windowed out piece of the dataset with a 3x3 average filter (see Figure 17) and a 3x3 sharpen filter (see Figure 18).
The computation for the average filter is (1x10 + 1x12 + 1x9 + 1x11 + 1x20 + 1x9 + 1x9 + 1x10 + 1x10) / 9 = 11.1.
The computation for the sharpen filter is (-1x10 + -1x12 + -1x9 + -1x11 + 14x20 + -1x9 + -1x9 + -1x10 + -1x10) / 6 = 33.3; note that the divisor is 6, the sum of the sharpen filter's values, not 9. The resulting windows are:

Figure 17: Average Filter Formula
 1  1  1
 1  1  1
 1  1  1

Figure 18: Sharpen Filter Formula
-1 -1 -1
-1 14 -1
-1 -1 -1
 
Pixel After Average Filter
10 12  9
11 11  9
 9 10 10

Pixel After Sharpen Filter
10 12  9
11 33  9
 9 10 10

Note that only the center pixel has changed. After convolving with the average filter, the pixel with the locally high value is more like its neighbors, whereas the sharpen filter enhances the difference between the center pixel and its neighbors. In a real dataset, this process would be repeated for each pixel and its neighbors that fall within the window of the filter. These filters can be of any size but must usually have an odd number of values in each direction so that the window is symmetric around the pixel in question.
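You can reproduce the worked example in a few lines of Python; the arrays below are just the 3x3 window and the two filters from the figures, normalized by the kernel sum as described above.

    import numpy as np

    window = np.array([[10, 12,  9],
                       [11, 20,  9],
                       [ 9, 10, 10]], dtype=float)

    average = np.ones((3, 3))            # Figure 17
    sharpen = np.full((3, 3), -1.0)      # Figure 18
    sharpen[1, 1] = 14

    for name, kernel in [("average", average), ("sharpen", sharpen)]:
        value = (window * kernel).sum() / kernel.sum()
        print(name, round(value, 1))     # average 11.1, sharpen 33.3

Applying this computation at every pixel of a full image is what libraries such as scipy.ndimage provide; dividing by the kernel sum keeps the overall image brightness roughly unchanged.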

IV.3 Multi-Spectral Analysis

The enhancements discussed in the previous section work on a single raster layer. However, much of the important information in a remotely sensed dataset is contained in its multi-spectral nature. The following sections outline common techniques of multi-spectral analysis designed to take advantage of the additional data present in multi-band datasets.

IV.3.1 Mathematical Combinations

Perhaps the most straightforward multi-spectral technique is to simply combine several bands of a dataset into a single raster layer for display using a mathematical relationship. For example, taking a ratio of two bands results in an entirely new grid of values related to both of the original bands. This technique is useful because it compresses the information into a smaller dataset in a (hopefully) meaningful way. See Table 7 for some commonly used band combinations; a short code sketch follows the table. Ratioing is an especially useful mathematical technique because it can help reduce the effects of shadows caused by sun angle, clouds, or haze.

Table 7: Some Useful TM Ratios
Ratio Significance
4/3 Vegetation Vigor (brighter is more)
2/1 Water Depth (darker is deeper)
2/3 Variations in Iron Content
5/7 Variations in Clay Content
(4-3)/(4+3) NDVI (common, standard vegetation index)
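The band math in Table 7 is one line per ratio. The sketch below assumes the TM bands are available as floating-point numpy arrays; the small epsilon guard against division by zero is an implementation detail, not part of the ratio definitions.

    import numpy as np

    def ratio(a, b, eps=1e-6):
        # Element-wise band ratio, guarding against division by zero
        return a / np.maximum(b, eps)

    def normalized_difference(a, b, eps=1e-6):
        # (a - b) / (a + b); NDVI = normalized_difference(band4, band3)
        return (a - b) / np.maximum(a + b, eps)

    # vigor = ratio(band4, band3)   # 4/3: brighter means more vigorous vegetation
    # depth = ratio(band2, band1)   # 2/1: darker means deeper water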

IV.3.2 Band Combinations

A color monitor uses three separate colors to produce an image on the screen: red, green, and blue. These three primary colors combine to make all the other colors. One of the most common and useful multi-spectral techniques is to use different bands of a dataset to specify the intensity of each of these three so-called color guns. For example, assigning TM band 3 to the red color gun, band 2 to the green color gun, and band 1 to the blue color gun produces a (more or less) true color image, because band 3 represents reflected light in the visible red, band 2 in the visible green, and band 1 in the visible blue portions of the spectrum. Using this technique, we can view three bands of a dataset at one time, thereby making more effective use of our multi-band dataset. For further visual compression, we could assign a ratioed raster layer to each of the color guns. We can even apply filters to each layer independently. Of course, the important part of these techniques is to have an intuitive understanding of what the data means physically, in order to interpret the resultant image.

Table 8 lists some of the useful TM band combinations. The common convention used for describing the band combinations used in a given image is to list the bands used in Red, Green, and Blue order. In other words, the image created by using the red gun to display band 4, the green gun to display band 2, and the blue gun to display band 1 would be called "4,2,1 - R,G,B" for short.

Table 8: Some Common TM Band Combinations
R,G,B Comments & Applications
3,2,1 True Color. Water depth, smoke plumes visible
4,3,2 Similar to IR photography. Vegetation is red, urban areas appear blue. Land/water boundaries are defined but water depth is visible as well.
4,5,3 Land/water boundaries appear distinct. Wetter soil appears darker.
7,4,2 Algae appears light blue. Conifers are darker than deciduous.
6,2,1 Highlights water temperature.
7,3,1 Helps to discriminate mineral groups. Saline deposits appear white, rivers are dark blue.
4,5,7 Also used for mineral differentiation.
7,2,1 Useful for mapping oil spills. Oil appears red on a dark background.
7,5,4 Identifies flowing lava as red/yellow. Hotter lava is more yellow. Outgassing appears as faint pink.
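A minimal sketch of assembling such a display image, assuming three TM bands as 2-D numpy arrays; the 2%-98% percentile clip is a common display convention, not a requirement.

    import numpy as np

    def stretch(band):
        # Linear stretch to 0-255, clipping the 2% tails of the histogram
        lo, hi = np.percentile(band, (2, 98))
        return np.clip((band - lo) / (hi - lo) * 255, 0, 255).astype(np.uint8)

    def rgb_composite(r_band, g_band, b_band):
        # Stack the stretched bands in Red, Green, Blue order
        return np.dstack([stretch(r_band), stretch(g_band), stretch(b_band)])

    # "4,2,1 - R,G,B":
    # rgb = rgb_composite(band4, band2, band1)

Stretching each band independently maximizes contrast, but it also changes the relative band intensities; for quantitative color comparisons a common stretch across all three bands may be preferable.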

IV.3.3 Classification

One of the most common uses of remotely sensed data is mapping the land use/cover of vast areas; in other words, classifying each pixel in a scene as belonging to some group. Techniques developed to help automate this process, known as classification schemes, are all based on the fundamental assumption that members of the same land cover group will have similar spectral signatures. Classified datasets are often used to efficiently gather spatial statistics for large areas.

IV.3.3.1 Supervised Classification

The concept behind any supervised classification scheme is for the analyst to train the computer to look for all the pixels in an image that are similar to those in defined training regions. Once the user has defined the training regions, the computer examines the spectral signature of each pixel and determines which of the training regions' signatures it most closely resembles. The pixel then is assigned to that group.

As an example, consider a supervised classification performed on bands 3 and 4 of a TM dataset. First the analyst selects homogeneous groups of pixels representing the regions of interest. In our example, we'd like to find all the pixels falling into three groups: vegetation, bare soil, and water. The values of each band for the pixels used in these training groups may be plotted against each other (see figure 19). In this plot each point represents a pixel; its position is determined by its values in bands 3 and 4 of the dataset. Notice that the pixels in a training region do not all have identical values in these two bands; their values fall over some range in each band, forming a cloud in this spectral space.

For each pixel, the computer then computes which of these groups is most likely to contain it. Several common methods are currently used to make this decision. The most straightforward method, called the minimum distance method, simply places the pixel in question into the group with the closest training region mean; a code sketch follows. The parallelepiped method compares the value of each pixel to the maximum and minimum values of the pixels in each training region and assigns the pixel to a group only if it falls within this range. The maximum likelihood algorithm takes the distribution of pixels in a training region into account when deciding how to group a pixel. Think of it as a minimum distance method that also considers the distribution of the other pixels in the group. Suppose all the pixels in one training region have very similar spectral values, like the water region in our example, while another training region contains pixels with a wide spread of values, like the vegetation group. If the pixel in question is located an equal distance from the means of these two groups, it is more likely that it belongs to the group with the larger variance among its training members.
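The minimum distance rule is only a few lines of code. This sketch classifies pixels by their band 3/band 4 values against training-region means; the means here are invented purely for illustration.

    import numpy as np

    def min_distance_classify(pixels, class_means):
        # pixels: (n, bands); class_means: (k, bands).
        # Returns, for each pixel, the index of the nearest class mean.
        d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
        return np.argmin(d, axis=1)

    # Hypothetical (band 3, band 4) means for water, bare soil, vegetation:
    means = np.array([[20.0, 10.0],     # water: dark in both bands
                      [60.0, 55.0],     # bare soil
                      [30.0, 90.0]])    # vegetation: bright in band 4
    labels = min_distance_classify(np.array([[25.0, 80.0]]), means)
    # -> [2], i.e. the pixel is assigned to the vegetation group

Maximum likelihood replaces the plain Euclidean distance with a distance weighted by each group's covariance, which is how it accounts for the spread of the training pixels.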

IV.3.3.2 Unsupervised Classification

Unsupervised classification works in a similar way except that the user does not specify the initial training regions. Instead, the algorithm assumes some number of arbitrarily spaced cluster means and uses the minimum distance method to determine each pixel's initial group. Then new mean values are calculated for each group. Groups with means that are within some specified range are merged to form a new group, and groups with variances larger than some tolerated level are split to form new groups. This entire process is repeated over and over until the groups stabilize and pixels stop changing groups between iterations. Figure 20 illustrates this process.
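Stripped of the merge and split steps, the iteration just described is essentially k-means clustering. The sketch below alternates minimum-distance assignment and mean recomputation until the groups stabilize; names are illustrative, and a real ISODATA-style implementation adds the merging and splitting logic.

    import numpy as np

    def cluster(pixels, k, iters=20, seed=0):
        # pixels: (n, bands). Returns per-pixel labels and final group means.
        rng = np.random.default_rng(seed)
        means = pixels[rng.choice(len(pixels), size=k, replace=False)]
        for _ in range(iters):
            # Minimum-distance assignment of every pixel to a group mean
            d = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
            labels = np.argmin(d, axis=1)
            # Recompute each group's mean (keep the old one if a group empties)
            new_means = np.array([pixels[labels == i].mean(axis=0)
                                  if np.any(labels == i) else means[i]
                                  for i in range(k)])
            if np.allclose(new_means, means):   # pixels stopped changing groups
                break
            means = new_means
        return labels, means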

V. Selected Literature in Remote Sensing

The items selected for this (by no means comprehensive) list are a combination of important references in the field of remote sensing, and related works that are available here at Yale. The reader should note that the Yale libraries continue to make a strong effort to increase their holdings of works relating to remote sensing and GIS, and the collection is growing rapidly.

For references to works on specific topics see the extensive bibliographies following each chapter in your textbook. Also see the "Key References" sections earlier in this Guide. In addition to this list, we have included copies of the annual index from the journals Photogrammetric Engineering and Remote Sensing and Remote Sensing of Environment in Section VI. Abstracts from these and most of the other journals listed below are searchable on CD-ROM or on-line in the Geology library. Ask the librarians for help.

V.1 Journals and Periodicals

  1. Photogrammetric Engineering and Remote Sensing
    v47+ (1981) in Forestry: TR693 P46
    v59+ (1993) and selected 1991/1992 issues in CEO lab
    See 1998's annual index in Section VI.
  2. Remote Sensing of Environment
    v.1+ (1968) in Kline: QE33.2R4 +R45
    See 1993's annual index in Section VI.
  3. International Journal of Remote Sensing
    v1+ (1981) in Geology: G70.4 I56
  4. I.E.E.E. Transactions on Geoscience and Remote Sensing
    v18+ (1980) in Becton: QC801 +I2(LC)
  5. Earth Observation Magazine
    v1.+ (Apr 1992) in CEO lab
  6. ERMapper Forum
    v1+ (1993) in CEO lab
    See ERMapper's web page: http://www.ermapper.com.

V.2 Bibliography

  1. Asrar, Ghassem, ed., Theory and Applications of Optical Remote Sensing, New York: John Wiley and Sons, 1989.
    Becton: G70.4 T47 1989 (LC)
  2. Colwell, Robert N., ed., Manual of Remote Sensing, Falls Church, Virginia: American Society of Photogrammetry, 1983.
    Forestry: G70.4 M36 1983 (LC)
  3. ERDAS Inc., ERDAS Field Guide, Atlanta, Georgia: 1991.
    106 KGL
  4. Elachi, Charles, Introduction to the Physics and Techniques of Remote Sensing, New York: John Wiley and Sons, 1987.
    106 KGL
  5. Gluhosky, Paul A., An Introduction to UNIX, New Haven: 1994.
    Reproduced on the CEO WWW server: www.yale.edu/ceo/Documentation/UnixManual/unix_talk.html
  6. Lillesand, Thomas M. and Ralph W. Kiefer, Remote Sensing and Image Interpretation, New York: John Wiley and Sons, 1994.
    Kline: G70.4 L54 1987
  7. Rees, W. G., Physical Principles of Remote Sensing, Cambridge: Cambridge University Press, 1990.
    Geology: G70.4 R44X 1990 (LC)
  8. Richards, J. A., Remote Sensing Digital Image Analysis: An Introduction, Berlin: Springer-Verlag, 1986.
    Geology: G70.4 R53 1986 (LC)  and  Becton: G70.4 R53 1986 (LC)
  9. Sabins, Floyd F., Remote Sensing Principles and Interpretation, New York: W.H. Freeman & Company, 1996.
    Geology: G70.4 S15X 1996 (LC)
  10. Short, N. M., The Landsat Tutorial Workbook, NASA Ref. Publ. 1078, U.S. Government Printing Office, Washington, DC, 1982.
    Forestry, QB637 +S56 (LC)
  11. Snyder, John P., Map Projections - A Working Manual, U.S. Geological Survey Professional Paper 1395, U.S. Government Printing Office, Washington, D. C., 1987, 383 pp.
    Geology: QE75 +P7 1395 (LC)  and  SML Map collection GA 110 +S577 1987 (LC)
  12. U. S. Geological Survey (USGS), Landsat Data Users Handbook Revised, USGS, Sioux Falls, SD, 1979.
    106 KGL
  13. U. S. Geological Survey (USGS), Landsat 4 Data Users Handbook, USGS, Sioux Falls, SD, 1984.
    106 KGL
  14. Verbyla, David L., Satellite Remote Sensing of Natural Resources, Boca Raton: Lewis, 1995.
    Geology: G70.4 V47X 1995 (LC)
  15. Williams, Jonathan, Geographic Information from Space, Processing and Applications of Geocoded Satellite Images, Chichester: John Wiley & Sons, 1995.
    Geology: G102.4 R44 W55X 1995 (LC)

VI. Sample of a Remote Sensing Journal Annual Index

This section reproduces the 1998 annual index from the journal Photogrammetric Engineering and Remote Sensing. This is included to give a flavor for the sorts of articles found in these journals.

VI.1 Photogrammetric Engineering and Remote Sensing:

PE & RS Annual Index by Subject

PE & RS Annual Index by Author

Footnotes

  1. Water depth dominates the spectral signature of water bodies only in the case of diffuse reflection. Specular reflection will dominate the spectral signature of water bodies if the sensor - water body - light source geometry is correct. In this case, the spectral signature measured will be that of the light source. Specular reflection off water bodies is known as sunglint and can be useful or annoying in satellite images depending on your application.
  2. The MSS bands were renumbered between Landsats 3 and 4, though the actual sensors remained the same. Therefore bands 4, 5, 6, 7 for Landsats 1-3 correspond exactly to bands 1, 2, 3, 4 for Landsats 4 and 5.
  3. The thermal IR band (6) has a larger (120 m) IFOV, but it is subsampled to 30 m by the ground stations to provide consistency between bands.
  4. Snyder, p4.
  5. In many cases, imagery ordered with processing of only basic sensor corrections comes with the information on satellite position and pointing that is necessary for the user to make these basic navigational and geometric corrections. Note however that this sort of warp does not use the ground control point technique discussed in the section on rectification and requires additional software to perform the correction. At this time, the CEO does not have software that can make use of this information.
  6. Note that the numbering scheme for labeling levels of processing varies from sensor to sensor. The numbers presented here do not necessarily represent those actually used by any particular vendor. Rather, they are generic levels of processing which different vendors call different names.
  7. p58.
  8. Actually there are five zones for the descending portion of the orbit and five for the ascending portion; however, the ascending portion of the orbit images the dark side of the Earth, and relatively few images are acquired during this portion of the orbit.
  9. ERMapper uses the same strategy for its contrast stretching; however, the axes are not labeled, so the plot appears more confusing.