
Other Remote Sensing Systems

SPOT and Charge-Coupled Devices (CCDs) | IRS-1, JERS and RESURS Series | Hyperspectral Imaging | Radar and Thermal Systems | Meteorological, Oceanographic, and Earth Systems Satellites | The Systems (Multisource) Approach to Remote Sensing | Military Intelligence Satellites | Medical Applications of Remote Sensing

Concluding Remarks

MOMS, SPOT and Charge-Coupled Devices (CCDs)

The first "competitor" to Landsat in providing high resolution multispectral images of the Earth is the SPOT series of satellites, operated by a French-owned company. SPOT is described briefly here and in detail in Section 3. Its sensors use a different type of detector, the CCD, whose function is discussed in depth on this page. Two examples of SPOT images are displayed.

Scanners, such as those on the Landsats (MSS and TM), were the prime Earth-observing sensors during the 1970s into the 1980s. But these instruments contained moving parts, such as oscillating mirrors, that were subject to wear and failure (although, remarkably, the MSS on Landsat 5 continues to operate in 1999, after launch in March of 1984). Another approach to sensing EM radiation was developed in the interim, namely the pushbroom scanner, which uses charge-coupled devices (CCDs) as the detector. This diagram may help in grasping the description of CCDs in the next paragraph:

Schematic Diagram depicting the general characteristics of a Pushbroom (CCD) Scanner

A CCD is an extremely small, light-sensitive silicon chip. When photons strike a CCD, electronic charges develop whose magnitudes are proportional to the intensity of the impinging radiation during a short time interval (the exposure time). From 3,000 to more than 10,000 detector elements (the CCDs) can occupy a linear space less than 15 cm in length. The number of elements per unit length, along with the optics, determines the spatial resolution of the instrument. Using integrated circuits, each linear array is sampled very rapidly in sequence, producing an electrical signal that varies with the radiation striking the array. This changing signal goes through a processor to a recorder and, finally, is used to drive an electro-optical device to make a black and white image, much as MSS or TM signals are. After the instrument samples the scene, the array discharges electronically fast enough to allow the next incoming radiation to be detected independently. A linear (one-dimensional) array acting as the detecting sensor advances with the spacecraft's orbital motion, producing successive lines of image data (analogous to the forward sweep of a pushbroom). Using filters to select wavelength intervals, each associated with its own CCD array, we get multiband sensing. The one disadvantage of current CCD systems is their limitation to the visible and near IR (VNIR) intervals of the EM spectrum.
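The line-by-line acquisition just described can be sketched in a few lines of Python. This is a toy simulation with made-up reflectance values, not actual instrument software: each forward advance of the platform exposes the linear array once, the accumulated charge is read out as one image line, and successive lines stack into a two-dimensional image with no scan-line striping.

```python
import numpy as np

# Hypothetical scene: reflectance values on a 100 x 64 ground grid (invented data).
rng = np.random.default_rng(0)
scene = rng.random((100, 64))

def pushbroom_acquire(scene, exposure_gain=1.0):
    """Simulate a pushbroom scanner: a single linear CCD array images one
    cross-track line per sampling interval; successive lines accumulate as
    the platform moves forward, with no oscillating mirror involved."""
    lines = []
    for ground_line in scene:                  # platform advances one line per sample
        charge = exposure_gain * ground_line   # charge proportional to incident radiation
        lines.append(charge)                   # array is read out, then discharged
    return np.vstack(lines)                    # stacked lines form the 2-D image

image = pushbroom_acquire(scene)
assert image.shape == scene.shape
```

Multiband sensing, as noted above, would simply repeat this with one filtered linear array per spectral band.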

CCD detectors are now in common use on air- and space-borne sensors (including the Hubble Space Telescope which captures astronomical scenes on a two-dimensional array, i.e., many parallel rows of detectors). The first airborne pushbroom scanner to be used operationally was the Multispectral Electro-optical Imaging Scanner (MEIS) built by Canada's CCRS. It images in 8 bands from 0.39 to 1.1 µm (using optical filters to produce the narrow band intervals) and uses a mirror to collect fore and aft views (along track) suitable as stereo imagery.

The German Aerospace Research Establishment (DFVLR) developed the first pushbroom scanner to be flown in space. The Modular Optoelectronic Multispectral Scanner (MOMS) flew aboard Shuttle Missions STS-7 and STS-11 in 1983 and 1984. It used two bands, at 0.575-0.625 and 0.825-0.975 µm, to produce 20 m resolution images. The MOMS image below covers an area of farmland in Zimbabwe:

MOMS color composite (higher wavelength band in red) of farmland near Harare, Zimbabwe.

MOMS-2 was flown on STS-55 in May of 1993. It has four multispectral channels (13 m resolution) and a panchromatic band (4.3 m), and can operate in stereo mode. Here is a panchromatic image of a city on the western shore of the Persian Gulf (location uncertain; probably in Qatar):

MOMS-2 panchromatic image of a coastal city on the Persian Gulf.

The first use of CCD-based pushbroom scanners on an unmanned Earth-observing spacecraft was on the French SPOT-1, launched in 1986. (Page 3-2 describes the SPOT system, which is operated as a commercial program; four SPOTs have now been launched.) An example of a SPOT image, from its High Resolution Visible (HRV) camera, covering a 60 km section (at 20 m spatial resolution) of the coastal region in southwest Oregon, is the next image we show. Note that scan lines are absent, because each CCD element is, in effect, a tiny area analogous to a pixel.

A SPOT-1 Band 2 (red) HRV image (20 m resolution) covering a 60 x 60 km scene of the Klamath Mountains in southwestern Oregon.

False color composites are a standard product from SPOT. Here is part of a SPOT image, taken in 1994, of developing farmland west of Miami, FL, near the edge of the Everglades, in which a new housing development has been put in since the mid-80s.

SPOT false color image of the edge of the Everglades in Florida.

The third image is a panchromatic image (10 m resolution) showing the edge of Orlando, Florida, including its airport.

A SPOT-1 Panchromatic image (10 m resolution) of the airport and a part of Orlando, Florida and surrounding countryside.


Other countries are now active in the satellite remote sensing "game". India has launched four satellites in its IRS series, each with multispectral sensors. Japan is operating two satellites, the JERS series, with sensors that cover much the same spectral regions as the Landsat TM, but at higher resolution. The Russians, with their RESURS series, now offer imagery comparable to that from Landsat and from some meteorological satellites. Ukraine has launched its own OKEAN satellites. China and Brazil have jointly developed the CBERS series. Links to several major American private corporations that sell some of these satellite products are provided.

India successfully operates several Earth-resources satellites that gather data in the visible and near IR bands, beginning with IRS-1A in March of 1988. The latest in the series, IRS-1D, was launched on September 29, 1997. Its LISS sensor captures radiation in the blue-green, green, red, and near IR bands at 23 m spatial resolution. The spacecraft also produces 5.8 m panchromatic images, as well as 188 m resolution wide-field (large area) WiFS multispectral imagery. Below are three recent images from this system: the one on top (WiFS) showing the Grand Canyon of Arizona; in the middle, a three-band color composite made by the 23 m LISS, showing mountainous terrain and pediments with alluvial fans in southern Iran; and at the bottom, a 5.8 meter panchromatic view of part of a harbor in the state of Tamil Nadu in India.

The Grand Canyon cutting into the Coconino Plateau of northern Arizona, IRS WiFS image.
A three-band color composite made from images (at 23 m resolution) acquired by the IRS LISS, showing mountains, alluvial plains and fans, and scattered oases of vegetation in southern Iran.
IRS-1D panchromatic image (5.8 m resolution) showing a harbor along the coastline in the state of Tamil Nadu in southernmost India.

More information on the Indian remote sensing program is available from its U.S. distributor, Space Imaging, Inc. (http://www.spaceimaging.com/).

The Japanese, beginning in 1990, have flown JERS-1 and JERS-2 which include optical and radar sensors. Here is an artist's conception of JERS-1 in space:

Artist's picture of JERS-1

The optical system is a seven-band scanner similar in coverage to the TM. The satellites are operated by the National Space Development Agency of Japan (NASDA). Here is a false color JERS-1 image of Tokyo and Tokyo Bay:

JERS-1 false color image of Tokyo and Tokyo Bay; note the large rectangles, which are landfill areas in the bay that serve as docking and storage facilities.

Starting in the mid-1980s, the Soviet Union (and now Russia) entered the world arena with an Earth-observing satellite program whose products are available on the open market. The RESURS-01 series (3 so far, a fourth pending) provides a multispectral system (3 Vis-NIR bands; 2 thermal) whose resolution (160 m, and 600 m for thermal) is intermediate between that of Landsat/SPOT and the AVHRR on meteorological satellites. Like Landsat, the RESURS satellites are placed in near-polar, sun-synchronous orbits. Two images from this system appear below: the first is a false color composite of land near Arkhangel'sk, almost due north of Moscow, near the Arctic Circle and the Barents Sea.

RESURS-01 image of a fire burning in the Arkhangel’sk area near the Arctic circle north of Moscow.

The second RESURS (Resources) image is part of a mosaic of Europe, which here includes all of Norway, Sweden, and Denmark, and parts of Finland and several Baltic nations.

This general region has recently been scanned by the SeaWiFS sensor on OrbView-2 (page 14-3) and rendered as an oblique perspective view:

SeaWiFS wide angle view, in perspective, of the Baltic Sea and neighboring countries (north Germany, Denmark, Norway, Sweden, Finland, Russia (around St. Petersburg), Latvia, Lithuania, Estonia, and Poland.

The National Space Agency of Ukraine has its own program of space observations; it cooperates with the Russian Federation in using certain facilities. Its OKEAN series includes multispectral scanners, thermal sensors, and radar. Two MSU-V images (50 m resolution) appear below: a standard false color composite (left) of the southern Crimea (Sevastopol at lower left) and a different color combination (right) of the Dnieper River in the Ukrainian lowlands, with Kiev just below the upper "lake" (formed by river damming).

Left: the southern part of the Crimea; the red area is forested low mountains. Right: the Dnieper River, with Kiev.

The People's Republic of China has joined forces with the Brazilian government to develop a series of Earth-observing satellites launched by Long March rockets from China. Their program goes by the name of CBERS (China-Brazil Earth Resources Satellites); in China these satellites are called the Ziyuan series. CBERS-1 was orbited on October 14, 1999. It includes three sensors: 1) WFI (300 km swath; 260 m resolution; 4 bands); 2) IR-MSS (20 km swath; 80 m resolution; 4 bands including thermal); and 3) CCD (20 m resolution; 4 bands). The more than 280,000 images received are concentrated mainly over Brazil and China and are not generally available to other nations. CBERS-2 (ZY-2) was launched on September 1, 2000; although reputed to be available for earth resources applications, western observers have concluded that its 3-meter resolution sensor is being used primarily for military reconnaissance. Here is a CBERS-1 CCD image of an (unidentified) area in Brazil:

CBERS-1 image of an area in Brazil.

Hyperspectral Imaging

Imaging spectroscopy, also known as hyperspectral remote sensing, allows a sensor on a moving platform to gather reflected radiation from a ground target such that a special detector system consisting of CCD devices can record up to 200+ spectral channels simultaneously over the range from 0.38 to 2.5 µm. With sampling at a 0.01 µm interval, it is possible to plot the data as quasi-continuous narrow bands that approximate a spectral signature, rather than histogram-like broader bands. With such detail, the ability to detect and identify individual materials or classes greatly improves. The AVIRIS instrument developed at JPL is described. Examples are shown that confirm that the hyperspectral approach is now the state-of-the-art cutting edge of remote sensing from air and space.

Another major advance, now coming into its own as a powerful and versatile means for continuous sampling of broad intervals of the spectrum, is hyperspectral imaging (see the second half of Section 13, starting on page 13-5, for more principles and details). Heretofore, because of the high speeds of air and space vehicle motion, insufficient time was available for a spectrometer to dwell on a small area of Earth's surface or an atmospheric target. Thus, data were necessarily acquired for broad bands, in which spectral radiation is integrated over ranges on the order of 0.1 µm (as on Landsat). In hyperspectral sensing, that interval narrows to 10 nanometers (1 micrometer [µm] contains 1000 nanometers; 1 nm = 10^-9 m). Thus, we can subdivide the interval between 0.38 and 2.55 µm into 217 intervals, each approximately 10 nanometers (0.01 µm) in width. These are, in effect, narrow bands. The detectors for VNIR intervals are silicon microchips, while those for the Short Wave InfraRed (SWIR, between 1.0 and 2.5 µm) intervals consist of Indium Antimonide (InSb). If a radiance value is obtained for each such interval, and then plotted as intensity versus wavelength, the result is a sufficient number of points through which we can draw a meaningful spectral curve.
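The channel arithmetic quoted above is easy to verify. A minimal sketch, using only the figures stated in the text:

```python
# Span 0.38-2.55 µm, sampled at 10 nm (0.01 µm) per channel,
# matching the figures quoted in the text.
lower_um, upper_um = 0.38, 2.55
channel_width_um = 0.01                 # 10 nanometers
n_channels = round((upper_um - lower_um) / channel_width_um)
print(n_channels)                       # 217 narrow bands
```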

The Jet Propulsion Lab (JPL) has produced two hyperspectral sensors, one known as AIS (Airborne Imaging Spectrometer), first flown in 1982, and the other known as AVIRIS (Airborne Visible/InfraRed Imaging Spectrometer), which has operated since 1987. AVIRIS consists of four spectrometers with a total of 224 individual CCD detectors (channels), each with a spectral resolution of 10 nanometers and a spatial resolution of 20 meters. Dispersion of the spectrum onto this detector array is accomplished with a diffraction grating. The total interval reaches from 380 to 2500 nanometers (about the same broad interval covered by the Landsat TM with just seven bands). It builds an image, pushbroom-like, by a succession of lines, each containing 664 pixels. From a high altitude aircraft platform such as NASA's ER-2 (a modified U-2), a typical swath width is 11 km.

From the data acquired, we can calculate a spectral curve for any pixel or for a group of pixels that may correspond to an extended ground feature. Depending on the size of the feature or class, the resulting plot will be either a definitive curve for a "pure" feature or a composite curve containing contributions from the several features present (the "mixed pixel" effect discussed in Section 13). In principle, the intensity variations for any 10-nm interval in the array extended along the flight line can be depicted in gray levels to construct an image. In practice, to obtain strong enough signals, data from several adjacent intervals are combined. Some of these ideas are elaborated in the block drawing shown here.
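The "mixed pixel" effect mentioned above can be illustrated with a simple linear mixing model. This sketch uses two invented spectra, not real material signatures; the assumption, standard in hyperspectral analysis, is that a pixel's spectrum is a fraction-weighted sum of the "pure" endmember spectra, so the fractions can be recovered by least squares:

```python
import numpy as np

# Two hypothetical "pure" spectra sampled at 224 channels (illustrative
# shapes only, not real material signatures).
wavelengths = np.linspace(0.38, 2.5, 224)
vegetation = np.clip(np.sin(3 * wavelengths), 0, None)
soil = 0.2 + 0.25 * wavelengths

# A mixed pixel: 60% vegetation, 40% soil within the ground resolution cell.
fractions = np.array([0.6, 0.4])
endmembers = np.vstack([vegetation, soil])
mixed = fractions @ endmembers

# Recover the fractions by least squares ("unmixing").
est, *_ = np.linalg.lstsq(endmembers.T, mixed, rcond=None)
print(np.round(est, 2))   # → [0.6 0.4]
```

With real data the fit is not exact (noise, more than two materials present), but the principle of decomposing a composite curve into contributions from several features is the same.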

Diagram showing the AVIRIS concept, indicating that spectral signatures obtained from ground targets make it possible to recognize and classify the different materials and features present.

Below is a hyperspectral image of some circular fields (see Section 3) in the San Luis Valley of Colorado. The colored fields are identified as to vegetation or crop type, as determined from ground data and from the spectral curves plotted beneath the image for the crops indicated (these curves were not obtained with a field spectrometer but directly from the AVIRIS data).

 Hyperspectral image made from a single narrow band data set acquired by AVIRIS over terrain in San Luis Valley of Colorado, on which some of the agricultural fields have been classified to the level of specific crops.
Spectral signatures constructed from multiple AVIRIS bands from which crops in selected fields in the San Luis image above have been identified.

In Section 13, other AVIRIS images, used for mineral exploration near Cuprite, Nevada and other mining districts, are displayed (see page 13-10), following an extended narrative on principles of spectroscopy and further consideration of the hyperspectral approach. A preview of the remarkable results achievable with this technology is given by this trio of images of the Cuprite district. The left image shows the area mapped in a near natural color version; the center image utilizes narrow bands at wavelengths in which certain minerals reflect energy related to vibrational absorption modes of excitation; the right image uses bands related to electronic absorption modes (see page 13-7). Shown here without the mineral identification key, the reds, yellows, purples, greens, etc. all relate to specific minerals.

AVIRIS images of the Cuprite Mining District, in southern Nevada.

We know hyperspectral data are usually superior to broader-band multispectral data for most analyses, simply because they provide so much more detail about the spectral properties of the features to be identified. In essence, hyperspectral sensing yields quasi-continuous spectral signatures rather than the band histogram plots that result from systems like the Thematic Mapper, which "lump" varying wavelengths into single-value intervals. Plans are to fly hyperspectral sensors on future spacecraft (see Section 21, page 21-1). The U.S. Navy is presently developing a more sophisticated sensor called HRST, and industry is also designing and building hyperspectral instruments, such as ESSI's Probe 1.
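This advantage can be demonstrated numerically. In the sketch below (an invented spectrum with a narrow absorption feature near 2.20 µm, loosely resembling a clay mineral), a single broad band averages the diagnostic dip almost out of existence, while 10 nm channels retain it:

```python
import numpy as np

# Hypothetical reflectance spectrum with a narrow absorption feature at
# 2.20 µm, sampled at 10 nm as a hyperspectral sensor would.
wl = np.arange(2.0, 2.4, 0.01)
spectrum = 0.5 - 0.3 * np.exp(-((wl - 2.20) / 0.02) ** 2)

# One broad band (2.0-2.4 µm) integrates everything into a single value,
# while the narrow channels preserve the dip.
broadband_value = spectrum.mean()    # close to the 0.5 background level
narrow_min = spectrum.min()          # the 0.2 floor of the absorption dip
print(round(broadband_value, 2), round(narrow_min, 2))
```

The broad band reports a value near the background reflectance, so the mineral's diagnostic feature is effectively lost; the narrow channels keep the full depth of the dip.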

I-25: In your own words, using a single sentence, state the major advantage of hyperspectral sensors over broad band sensors. ANSWER

As of 2000, there are plans to put several hyperspectral sensors on space platforms. One such instrument, called Hyperion, is part of EO-1, the first satellite in NASA's New Millennium series, launched in December, 2000. It was inserted into an orbit that places it just about 50 km (30 miles) behind Landsat 7, which allows similar images acquired at almost the same time to be compared for performance evaluation. Operated by Goddard Space Flight Center, this satellite is a test bed for new ideas in instrumentation that can be made smaller and lighter, so that launch costs can be lowered. Here is an artist's illustration of EO-1, with its 3 main sensors:

Artist's painting of EO-1 in orbit.

Hyperion consists of CCD detectors and other components that break the spectral range from 0.4 to 2.5 µm into 220 channels. Each resulting image covers 7.5 by 100 km on the ground, at 30 m resolution. This next image is a color composite made from 3 narrow channels, all in the visible, covering a scene in Maryland and Virginia along the Potomac River:
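From the figures just quoted, the size of a Hyperion scene follows directly. A quick sketch of the arithmetic (scene dimensions as stated in the text):

```python
# Hyperion scene geometry as quoted: 7.5 km swath, 100 km along track,
# 30 m ground resolution, 220 spectral channels.
swath_m, length_m, gsd_m, channels = 7500, 100000, 30, 220

pixels_across = swath_m // gsd_m          # pixels per cross-track line
lines = length_m // gsd_m                 # lines per scene
samples = pixels_across * lines * channels
print(pixels_across, lines, samples)      # → 250 3333 183315000
```

Roughly 180 million spectral samples per scene illustrates why hyperspectral data volumes dwarf those of broad-band systems.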

Fairfax County, VA and Montgomery County, MD separated by the Potomac River, an image made from Hyperion data.

Hyperion images commonly are presented as long strips corresponding to down-track scene acquisition. This image shows sedimentary rocks in a fold belt in the Mount Fitton area of the Flinders Ranges in South Australia:

Hyperion image constructed from 3 channels in the visible, showing folded sedimentary units in the Mount Fitton area of the Flinders Ranges, South Australia.

The Atmospheric Corrector takes measurements that help to remove adverse effects from the atmosphere on image/data quality. A third sensor, the ALI (Advanced Land Imager) has 9 spectral bands and provides both multispectral images (30 m resolution) and panchromatic images (10 m). Here is an ALI image of the central part of Washington, D.C.

Part of Washington, D.C. in a false color rendition made from EO-1's ALI.

Radar and Thermal Systems

Radar (an active microwave system) has been flown on both military and civilian spacecraft because of its ability (at certain wavelengths) to penetrate clouds. Seasat, the SIR series, and Radarsat are among the instruments used so far. Thermal remote sensing, operating primarily in the 8-14 µm region but also in the 3-5 µm region of the spectrum, produces diagnostic data that can aid in identifying materials by their thermal properties. Some meteorological satellites have thermal sensors, as does the Landsat TM. A multispectral thermal scanner, TIMS, is described.

Another class of satellite remote sensors now in space is radar (treated in detail in Section 8). Radar commonly provides a very different view of a landscape compared with a visible image. This is obvious in this pair showing an ancient terrain in Egypt: fractures in a crystalline terrain are evident in the left image (SIR-A radar), while plutons in the same scene stand out in the Landsat image on the right.

Egyptian crystalline terrain imaged by SIR-A radar (left) and Landsat (right).

The first civilian radar system to operate from space flew on the Seasat satellite, launched in 1978. Seasat carried an experimental L-band radar whose primary mission was to measure ocean surfaces. However, it produced very informative images of the land surface, including this scene that includes Death Valley, one of the prime test sites for determining the capabilities of various sensors.

Seasat radar image of Death Valley, California and surrounding mountains.

Among systems now operational are the Canadian Radarsat, ERS-1 and ERS-2 managed by the European Space Agency, and JERS-1 and JERS-2 under the aegis of the National Space Development Agency of Japan, NASDA. As an example, here is the first image acquired by Radarsat, showing part of Cape Breton in Nova Scotia, and the surrounding waters.

Radarsat image of Cape Breton, Nova Scotia.

The European Space Agency, ESA, also has flown radar on its ERS-1 and ERS-2 satellites. Here is an image in black and white showing the San Francisco, California, metropolitan area and the peninsula to its south, as well as Oakland, California, the East Bay, and beyond.

ERS-1 radar image showing San Francisco, the East Bay Cities, cities in the Peninsula, the entire San Francisco Bay and coastal ranges on either side.

I-26: Look at the above two radar images, especially the one showing San Francisco. State two characteristics of the radar images that seem to differ from those of Landsat. ANSWER

The ERS satellites had other sensors, as was indicated in the Overview. One in wide use is ATSR (Along Track Scanning Radiometer). Here is an image of the English Channel made by that instrument:

The English Channel as viewed by ATSR on ERS-2.

NASA, through its Jet Propulsion Laboratory (JPL) in Pasadena, California, has flown three radar missions on the Space Shuttle. The SIR (Shuttle Imaging Radar) series has used different wavebands and look conditions, and many excellent images over much of the globe have been acquired. Appearing below is a SIR-C image obtained on October 3, 1994 during a flight of the Space Shuttle. This is a false color composite made by assigning the L-Band HV, L-Band HH, and C-Band images to red, green, and blue, respectively (see page 8-7). The area shown is the part of Israel containing disputed West Bank territory, including Jerusalem (yellowish patterns on the left) and the top of the Dead Sea.

SIR-C radar color composite of Jerusalem and surroundings.

Remote sensors that cover two thermal intervals - the 3-5 µm and 8-14 µm broad bands (corresponding to two atmospheric windows) allowing sensing of thermal emissions from the land, water, ice and the atmosphere - have been flown on airplanes for several decades. Many of the meteorological satellites (see next page) include at least one thermal channel. A thermal band is included on the Landsat Thematic Mapper.

The principles behind thermal remote sensing are treated in some detail in Section 9. For now, let us look at two representative samples of the types of thermal images that indicate the kinds of information resulting from operation of thermal sensors on moving platforms above the Earth's surface.

This next image was made from a satellite dedicated to sensing one thermal property - thermal inertia (defined on page 9-3). The Heat Capacity Mapping Mission (HCMM) was launched in 1978 and is described on page 9-8. This image covers about 700 km (435 miles) on a side and was taken at night (daytime thermal images were also generated) on July 16, 1978 over southern Europe using a sensor that integrates thermal emissions within the wavelengths from 10.5 to 12.5 µm. The darker area in the upper left portrays lowlands in eastern France and southwestern Germany. The Alps form a broad arc crossing the image. The blackish pattern within the Alps corresponds to the cold higher elevations (with some snow). The lighter-toned land below the Alps is the Piedmont and western plains of Italy's Po Valley. The light tones near the image bottom are the waters of the Mediterranean Sea, which at night are warmer (heat sink) than most land surfaces.
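The thermal inertia that HCMM was designed to map, and the "apparent thermal inertia" commonly estimated from day/night image pairs, can be expressed compactly. The sketch below is simplified and uses illustrative material values, not measured HCMM data; the ATI formula shown is the common first-order approximation, and its results are relative rather than absolute:

```python
import math

def thermal_inertia(k, rho, c):
    """True thermal inertia P = sqrt(k * rho * c):
    k = thermal conductivity (W m^-1 K^-1), rho = density (kg m^-3),
    c = specific heat (J kg^-1 K^-1)."""
    return math.sqrt(k * rho * c)

def apparent_thermal_inertia(albedo, t_day, t_night):
    """First-order apparent thermal inertia from a day/night image pair:
    ATI = (1 - albedo) / (T_day - T_night). A high ATI (small day-night
    temperature swing) suggests materials like standing water; a low ATI,
    dry soil or rock."""
    return (1 - albedo) / (t_day - t_night)

# Illustrative (assumed) values for a dry soil and one image-pair pixel:
p_soil = thermal_inertia(k=0.3, rho=1600, c=800)
ati = apparent_thermal_inertia(albedo=0.15, t_day=305, t_night=285)
print(p_soil, ati)
```

This day-night contrast is exactly why the Mediterranean (a heat sink, warm at night) appears light-toned in the nighttime image while the high Alps appear dark.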

HCMM Nighttime image of part of southern Europe centered along the broad arc comprising the Alps from the Mediterranean Sea (bottom) to southwestern Germany and southeastern France.

Thermal data, especially from the 8-14 µm region, become more valuable in singling out (classifying) different materials when this spectral interval is subdivided into bands, giving multispectral capability. NASA's JPL has developed an airborne multiband instrument called TIMS (Thermal IR Multispectral Scanner) that is a prototype for a system eventually to be placed in space. The images it produces are notably striking in their color richness, as evident in this scene that includes a desert landscape around Lunar Lake:

TIMS image of the Lunar Lake area, made using Bands 5, 3, 1 rendered in red, green, and blue, respectively.

The image pair below covers a part of the White Tank Mountains of Arizona. The left image is made from 3 TIMS bands; the right is a false color composite formed from visible band data from another multispectral scanner onboard the aircraft that gathered the TIMS emitted radiation.

While these color patterns make some sense when interpreted through geologic maps, aerial photos, field visits (ground truth), etc., it is hard to envision what they mean just from this image pair. Perhaps a better insight and context will result from the view below, which is an oblique or perspective "photo" showing the full extent of the White Tank Mountains (a fault block range) and surrounding desert and agricultural farms, about 40 km (25 miles) west of Phoenix. But, this is really a "trick picture", in that it is made from Landsat TM bands (1,2,3) that have been registered to Digital Elevation Model (DEM) data (see page 11-5) that contain heights above sea level, from which a 3-dimensional representation is constructed. Other such examples will appear in several Sections of this Tutorial.

A Landsat TM color composite registered to a DEM (digitized topographic base) data set to yield an oblique perspective view of the White Tank Mountains and adjacent valleys and farmlands.

Meteorological, Oceanographic and Earth System Satellites

Satellite- and Shuttle-based remote sensors are especially adept at providing image and physical property data regarding atmospheric and ocean surface conditions, usually scanned from meteorological/oceanographic platforms. Examples from two of these (GOES and NOAA) are described on this page, but the main treatment is reserved for Section 14. Metsats look mainly at cloud cover, water vapor, wind patterns, advancing fronts, and certain atmospheric properties. SeaWiFS is a dedicated oceanographic satellite whose sensor measures in multiple bands (visible and near IR) that specify ocean color and chlorophyll content (in algae and plankton). To these specialized satellites should now be added a third group: a large array of satellites just launched or to be launched in the next 5 years that gather integrated information on the Earth System, covering the land, marine, atmospheric, and biological aspects of the terrestrial environment. Although not the first directed toward this endeavor, Terra is now operating and returning data that prove the value of this approach.

As suggested earlier in this Introduction, among the first satellites in the U.S.'s entry into space were those designed to demonstrate that weather systems and climate variations could be monitored at regional or even continental scales, thus greatly improving the realtime surveillance of clouds, temperature variations, water vapor, and moving fronts (especially tornadic vortices and hurricanes). The Nimbus series has already been mentioned (page I-7), but the images there emphasize land features rather than meteorological conditions. In Section 14, we will review the entire history of the "Meteorology from Space" programs, which include satellites operated by several other countries. Suffice it here to show two typical examples.

The first is a January 21, 2000 scene covering part of the western U.S. and adjoining Pacific Ocean as imaged many times each day by the GOES 10 (geostationary) satellite. The land tones are darkened so that the cloud patterns, in white to gray tones, stand out.

GOES 10 image made at 1:00 AM PST on January 21, 2000 from geostationary orbit, looking at western North America from northwestern Mexico, western Texas, northward into Canada and up to the Gulf of Alaska.

The second scene was made by the NOAA 15 satellite, the principal monitoring system that followed the progress of the powerful Hurricane Floyd, which struck the U.S. mainland in mid-September of 1999. This color composite shows Floyd in the early morning of September 15, after it had passed over the northern Bahamas and was bearing down on the north Florida coast. The size of this hurricane can be grasped by noting that the bottom left of the image covers the Everglades (in green), whereas the top left includes all of the Georgia coast into South Carolina.

NOAA-15 image made during the morning of September 15, 1999 showing the full extent of the huge Hurricane Floyd as it passed the Bahamas and headed northwest toward the upper Florida to South Carolina coast; this storm caused more damage by flooding than by winds; Florida Everglades in green.

Many meteorological satellites are adept at picking out characteristics of the oceans such as silt/sediment patterns, temperatures, wave trains, and current circulation. But several satellites have been flown primarily to sense these and other properties of the ocean surface (again, see Section 14). Among these are Seasat, Radarsat, the Coastal Zone Color Scanner (CZCS) on Nimbus 7, and SeaWiFS (now operating).

On SeaWiFS, several bands cover the blue, green, and red parts of the visible spectrum, and extend into the near infrared, yielding data that can be used to display variations in ocean color or, for particular bands, indications of the distribution and intensity of chlorophyll that resides mainly in surficial plankton. This SeaWiFS image maps the generalized ocean colors as well as chlorophyll concentrations (in red, yellow, and orange colors) on a near-global scale during September, 1997.

A SeaWiFS composite covering much of the globe as displayed in planimetric format, indicating primarily variations in chlorophyll (from high, shown in reds, oranges, yellows) to low (in blues and purples), integrated over part of the month of September in 1997.

Terra is the "flagship" satellite in the Earth Science Enterprise (ESE), the United States' contribution to a continuing scientific effort often referred to as the International Geosphere-Biosphere Program. Mentioned at the end of the Overview, the IGBP and its spin-offs are of sufficient scope and merit to deserve their own Section (16) in this Tutorial. The five sensors on Terra are MODIS, MOPITT, MISR, ASTER, and CERES. For the moment, we show here just a single image (others will appear in Sections prior to Section 16) made by the ASTER instrument on Terra. This scene is of volcanoes in the Andes mountain chain of South America. Volcanoes are important components of the Earth System being studied by the ESE in that they affect the environment on regional to worldwide scales by expelling into the atmosphere gases and dust that can alter weather patterns.

ASTER false color image of volcanoes in the Andes Mountains.

The last image in this Introduction is also constructed from a multispectral satellite sensor that produces color imagery. It is a natural color "portrait" of the entire globe, in which vegetation-rich areas are in green, vegetation-poor (including desert) areas are in various shades of yellow and brown, and ice is in white. One thing brought out in this world view is that a large part of the total land surface does not have extensive vegetation cover. This helps to visualize the possibly precarious state of those biomes that contain most of the living species, recycle oxygen to the atmosphere, and provide organic raw materials and foodstuffs for the health and survival of the human race and many members of the animal kingdom.

A SeaWiFS image similar to the one above in which data collected over the Earth’s continents have been classified to show active vegetation (greens), semi-arid to desert surfaces (buffs and yellows), and ice (white).

The Systems (Multisource) Approach to Remote Sensing

As may already be evident from earlier pages in this Section, remote sensing data are of such a nature and volume as to require processing and output by computers. Computers are the easiest, fastest, and most efficient way to produce images, extract data sets, and assist in decision making. One special function is to assist in manipulating other kinds of data about the spatial or locational aspects of areas of the world that are the subjects of interpretation and decision making. Today, the approach to analyzing a problem, or determining a plan for monitoring or managing these areas for specific uses or development, is embodied in the concept of a Geographic Information System (GIS). This page previews this tool; Section 15 is devoted to understanding its capabilities and applications.

Since the early days of monitoring the Earth by orbiting spacecraft, the development of computer-aided techniques for reliably identifying many categories of surface features within a remotely sensed scene, whether by photointerpretation of enhanced images or by classification, ranks as an outstanding achievement. Numerous practical uses of such self-contained information are being made without strong dependence on other sources of complementary or supporting data. Thus, automated data processing assists, as an example, in recognizing and mapping major crop types, estimating their yields, and spotting early warning indicators of potential disease or loss of vigor. However, many applications, particularly those involving control of dynamic growth or change systems, decision making in the management of natural resources, or exploration for nonrenewable energy or mineral deposits, require a wide variety of input data (from multisources) not obtainable by spaceborne sensors such as those on Landsat, SPOT, and others of similar purpose.

Some data are essentially fixed or time-independent - slope aspect, rock types, drainage patterns, archaeological sites, etc. - in the normal span of human events. Other data come from measurements or inventories conducted by people on the ground or in the air - weather information, population censuses, traffic flow patterns, soil erodibility, etc. However, many vital data are transient or ephemeral - crop growth, flood water extent, insect infestation, limits of snow cover, etc. - and must be collected in a timely manner. Pertinent remote sensing data play a key role in this last instance, and in fact satellite monitoring is often the only practical and cost-effective way to acquire data frequently over large regions.

A given scene imaged at different times of the year can show great variety. Changing Sun angles, atmospheric variations, seasonal differences in vegetation cover, presence of snow, and other variables will often produce pronounced contrasts in the spectral responses that determine "how an image looks". This is evident in this montage of 6 Landsat MSS images of an area in the desert of Utah.

Six Landsat images covering the same area in Utah, taken at different times of the year.

One should always keep in mind that remote sensing is an integral part of a larger Information Management System. In fact, in many applications, the user community employs remote sensing inputs as a key component of a continuing cycle of decision making. Consider this diagram:

A typical Remote Sensing-driven Information System.

This chart shows a simple closed-loop cyclic process, unencumbered by the various feedback loops that no doubt exist. The starting point, and end point as well, is the set of panels labeled Information Requirements. This focuses on the ultimate driver of any information management system: the user and his/her recurring needs. Various disciplines concerned with Earth observations and resources are represented (one not shown is Meteorology). The terrestrial globe in the background reminds us that the system should be worldwide in scope. Information requirements logically lead to user/customer demands. The best remote sensing system approach is the one most responsive to these demands.

There has by now been full realization that the best current and future uses of most Earth-observing data from satellites (or astronauts) stem from correlating and interleaving this type of data with various other types that together are essential inputs to decision making and applications models. This is embodied in the "Multi" concept, described in Section 13 (page 13-4a ff), but summarized here by these terms: Multistage; Multilevel; Multisensor; Multispectral; Multitemporal; Multisource. Remote sensing data constitute an integral element of a more general Earth Survey Information System, as exemplified here:

A typical Earth Resources survey information system

The bulk of the data in such systems have in common a geographical significance, that is, they are tied to definite locations on the Earth. In this sense, they are similar to or actually make up what has become a powerful tool in decision making and management: the Geographic Information System (GIS; also known as geobased or geocoded systems). Because vast amounts of spatial or geographically referenced data must be gathered, stored, analyzed in terms of their interrelations, and rapidly retrieved when required for day to day decisions, a GIS that accepts these data must itself be automated (computerized) to be efficiently utilized.

The importance of GIS as a unifying means of handling geospatial data, including often mandatory inputs from remote sensing, warrants an extended explanation of how it works and what it does. This is the subject of Section 15. In Section 1 you will learn how computers with appropriate software are an essential part of processing, manipulating, and integrating data such as the output of Landsat and other systems. It is safe to say that today, without computers, remote sensing from space would be next to impossible.

Military Intelligence Satellites

This page is devoted entirely to one of the prime uses of remote sensing that has driven the entire technology since the early days of the entry of satellites into space. Depending almost exclusively on imaging capabilities, "spy satellites" have been orbited by the hundreds (by several countries) to gather military intelligence or information about terrorist activities. Visible, Near-Infrared, Thermal Infrared, and Radar sensors are applied to gathering information about ground targets and activities of national security significance. Until recently, many of the military or intelligence satellites have had superior resolutions when compared with Space Agency systems.

Looking down and out (as from a mountain) to survey the battlefield for information useful to military leaders goes back to ancient times. In Napoleonic times, the French used observation balloons to scan their foes before and during battles. This technique was often a factor in the U.S. Civil War. By the First World War, airplanes and dirigibles were employed over enemy lines and their staging areas and cities as platforms from which aerial photography provided reconnaissance and intelligence pertinent to the conduct of battle. This approach was much expanded during the Second World War, as for example the follow-ups to a bombing raid to assess damage to the target. With the advent of rockets and then satellites, observations of both military and political activities on the ground became possible, ushering in the so-called Age of Spy Satellites. Besides surveillance of a wide variety of targets of interest to military intelligence units (in the United States, these include the Department of Defense, the CIA, the National Security Agency, and Homeland Defense), satellites can now assist in areas other than simply observing features on the ground - these include communications, meteorology, oceanography, location (Global Positioning Systems [GPS]), and Early Warning Systems (none of these latter applications will be discussed on this page). In addition to satellites, manned aircraft continue to be platforms, and in recent years UAVs (Unmanned Aerial Vehicles) such as drones have assumed some of the intelligence-gathering tasks.

As one would suspect, there is extensive coverage of "spy satellites" on the Internet, although much information remains classified and thus not released to the public. On this page, only the broadest outline of the history of military intelligence operations from the air and space in the last 50 years will be treated, along with some representative image examples that are now declassified into the public domain. The reader is offered these five Internet sites as among the best found by the writer. Two top the list: 1) An overview (click on that link) prepared by Federation of American Scientists, with an offshoot or link from the same organization that shows specific imagery; and 2) this site, with an information history supported by good imagery, produced by a group at George Washington University. Three others, simply enumerated here are also worth a visit: (1); (2); and (3). It's your choice, either visit these first or read further on this page (most images come from these Web pages just cited) and then check out the above sites.

Before moving through this page, we suggest going to one more Web site that specifically illustrates the effects of spatial resolution in military satellite surveillance. These are again two of the links on the FAS Imint site: (1) and (2). The single idea to draw from these illustrations is that military observations work best and reveal the desired intelligence when resolution is a few meters or better. Note what can be found and, more importantly, identified at each resolution level. For nearly four decades the military had high resolution systems that could not be matched by non-military Earth observation systems; that is, limits were imposed below which no civilian space agency or private group was permitted to improve the resolution (the ability to pick out smaller objects) of the sensors on the satellites they operated. All this changed in the 1990s when the Russians began to sell high resolution (~2 meter) imagery from their SPIN-2 on the open world market. After that the U.S. placed more than 800,000 of its earlier military space photos into the public domain.
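A rough check on why centimeter-to-meter resolutions are physically plausible comes from the diffraction limit of any telescope: the best achievable ground sample distance is approximately GSD = 1.22 λH/D, for wavelength λ, orbital altitude H, and aperture diameter D. The altitude and mirror size below are assumed round numbers for illustration, not published specifications of any actual satellite.

```python
def diffraction_limited_gsd(wavelength_m, altitude_m, aperture_m):
    """Rayleigh-criterion ground sample distance (meters) for a
    diffraction-limited telescope looking straight down."""
    return 1.22 * wavelength_m * altitude_m / aperture_m

# Assumed values: green light (550 nm), 250 km orbit, 2.4 m mirror
gsd = diffraction_limited_gsd(550e-9, 250e3, 2.4)
print(f"{gsd * 100:.1f} cm")
```

With these assumptions the limit works out to a few centimeters, which is why large-aperture, low-orbit systems can plausibly resolve individual vehicles and aircraft.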

Now to some specifics: When one thinks of postwar spy-from-the-sky incidents, the first and most famous case that many recall is the U-2 high altitude airplane shot down over the Soviet Union during President Eisenhower's second term, in which the pilot, Gary Powers, was captured and held for many months. Here is an example of a U-2 photo (from another mission) over a military air base.

U-2 photograph of an unidentified military air base; note the aircraft line up.

The U-2 achieved even more fame during the October 1962 Cuban Missile Crisis, when Pres. John F. Kennedy went to the brink with Soviet Chairman Nikita Khrushchev over the installation of Medium and Long Range nuclear rockets in parts of Cuba. The next two U-2 images show the facilities that were in place before the Soviet/Cuban bloc agreed to remove these weapons (probably avoiding World War III or at least partial annihilation):

The San Cristobal, Cuba missile site on October 14, 1962; U-2 photo.
Another San Cristobal site on October 27, 1962, as the facilities were being dismantled; U-2 photo.

Added to this fleet of spy planes was the SR-71, known as the Blackbird. This photo shows a military storage base in Nicaragua (in 1987):

The first military satellites carried photographic cameras that had high resolution optics. The problem, of course, was retrieving the film. It could have been developed onboard, as was done with the Lunar Orbiter pictures. Instead, the operators of this type of satellite chose to eject the (undeveloped) film towards Earth, with its container deploying a parachute when it entered the upper atmosphere. The film package was then "snatched" in mid-air by an airplane sent to its expected point of entry. This seemingly difficult feat was remarkably successful, accomplished most of the time. This diagram shows the sequence:

Diagram showing sequence of stages in parachute drops from a satellite.

The primary group of military satellites used in reconnaissance and surveillance is known as the KeyHole series. The first of these was placed in orbit in 1960, and the parachute retrieval system was used until 1972. Each numbered KeyHole mission series consisted of multiple satellites. Different identifiers denoted specific satellite types that varied in launch vehicle, orbital characteristics, and satellite instrumentation. For example, the KH-1 through KH-4 series also carried the appellation Corona, KH-5 is called the Argon series, and KH-6 the Lanyard. Each involved various numbers of satellites. There were 105 successful Corona missions, which were operated by the U.S. Air Force, with CIA involvement. Spatial resolution, initially about 2 meters for the first KH-1 to work properly (launched in 1960), steadily improved with the higher KH numbers. To illustrate the types of images obtained (as photos) from this early group (which constitute the now declassified images released during the Clinton Administration), first examine this view of Moscow made by a KH-4 pass:

KH-4 Corona Photo of central Moscow in the Soviet Union, with inset showing an enlargement of Red Square.

Next, examine this KH image (specific group unspecified) of a Soviet Airfield.

Early KH photo of a Soviet airfield.

All of the KH satellites, of which more than 150 have been launched, carried film cameras or electro-optical cameras that view the ground through telescopes. KH-7 and KH-8, the Gambit series, had resolutions of about 7 and 2.5 centimeters respectively. The KH-9 Hexagon satellites had 5 to 10 meter resolutions. It is not clear from any of the Internet sources consulted 1) when parachute drops ceased and 2) when electro-optical scanners replaced film. But KH-11 is designated as an ELINT (Electronic Intelligence) type and did relay its imagery to receiving stations. The KH-11 Crystal satellites produce both SWIR and Thermal Infrared imagery, suggesting a non-photographic component. The KH-12 series, the last for which some specifications can be found, is reputed to achieve a resolution of 2+ cm, although images of this sharpness haven't been released. Below are several illustrations of declassified (but less than optimum resolution) KH-11 and KH-12 images:

One of the first KH-11 Crystal images, showing facilities at a Soviet harbor; the streaks to the right suggest scan lines, implying that an electro-optical scanner made this image.

During the 1991 Gulf War, this KH image located vehicles in the Iraqi Hammurabi Tank Division.

Tanks deployed in southern Iraq

Here is one of Saddam Hussein's palaces, a potential target in any renewed Iraqi conflict.

One of several palaces near Baghdad used by Saddam Hussein.

The next pair shows a military barracks in Serbia before and after an airstrike during the Bosnian conflict.

Belgrade barracks before an air raid.
The same barracks after the raid, showing damage inflicted.

This next image was made by a KH-12 satellite using an improved Crystal sensor to image the Zawahr Kili Al-Bahr terrorist camp in western Afghanistan in 1998.

KH-12 image of Afghan terrorist camp.

Thermal infrared imagery proved scarce during the writer's Internet search. No photo was identified as of this type. But here is one which resembles a thermal IR image. It was taken during the Contra conflict in Nicaragua.

Tanks and armored vehicles in this image (possibly thermal IR) of a facility in Nicaragua.

Radar has special military value because, using the right wavelengths, this active system can "see through" clouds and can operate at night. The U.S.'s Lacrosse series consists of a SAR sensor mounted on a very large (reputed to be schoolbus-sized) platform tied to extended solar arrays. No Lacrosse/Vega images were found in an Internet search through more than 200 sites; this confirms the highly secretive and classified nature of this system. The first Lacrosse was orbited in 1988; Lacrosse-4 launched in 2000. The speculation is that this radar can achieve 1 meter or better resolution. The next image was made by an airborne SAR (TIER program) which simulates the Lacrosse products:

TIER SAR imagery of Army barracks.

The TESAR (Tactical Enhanced SAR) program has produced these high resolution images: first, of shipping crates at a Baltimore, MD harbor; second, the most famed military building in the world - the Pentagon outside Washington, D.C.

TESAR image of Baltimore Harbor dock and adjacent ship
TESAR image of the Pentagon.

Over these past 42 years, and well before, aerial reconnaissance and intelligence surveillance have relied heavily on aircraft (ranging from low flying, slow, single engine propeller planes to high flying, fast jets). A U-2 and an SR-71 example were shown above. In recent years, the military has turned to another aerial method: the use of UAVs (Unmanned Aerial Vehicles) to conduct pre-programmed surveys of targets and personnel. Some vehicles are called "drones". One class of UAVs is the Predator series. Below are two examples of images obtained from so-called gun-mounts, where the camera or sensor is placed at or near the front of the vehicle.

UAV-acquired image of a bridge being destroyed in Kosovo.
Serbian fighters surrendering in Bosnia; UAV image.

Finally, intelligence imagery has come full circle. Among the first satellites launched were those used for military purposes. The experience with this technology was invaluable to NASA and other space agencies in developing their (until recently, lower resolution) earth-observers. But, with the Russian declassification followed by orbiting of civilian commercial satellites, e.g., IKONOS and Quickbird, high resolution imagery (1 to 4 meter range) has proved to have its military applications and is being purchased by many nations. The U.S. military and the CIA/NSA complex have contracted with these companies to obtain imagery, especially since the 9/11 terrorist attacks on the World Trade Center and the Pentagon. We close with this one example of an Afghanistan Taliban Air Force base at Baghram, north of Kabul. This is an IKONOS product:

IKONOS image of the airfield at Baghram, Afghanistan, showing deployment of MIG fighters.

This availability of high resolution satellite images from unrestricted civilian sources has another important ramification: Images of possible military value can now be purchased by any country. Most nations cannot afford their own spy satellite programs but IKONOS, Quickbird and other private company products are affordable.

We leave this page with a set of three examples of such imagery that have been gathered by Quickbird 2 (DigitalGlobe) in its now routine coverage of Baghdad, Iraq during 2002 and 2003 in anticipation of a possible invasion of that country by the U.S. military (and, probably other cooperating nations) as an initiative to topple the government of Saddam Hussein. Read the captions of each image for details.

Most of Baghdad, Iraq, in a moderate resolution Quickbird image
An enlargement of Quickbird imagery (4 meter resolution) showing palaces and other buildings along or near the Tigris River.
The Al-Sijood Palace, one of the principal residences of Saddam Hussein, along the Tigris River (see its location in the image above).

So, the question of the day: Can anything really be private anymore?

Finally, for still more information on security-based surveillance from space, we suggest you check out the Global Security Organization web site.

Medical Applications of Remote Sensing

The use of various instruments/machines as diagnostic tools in medical examinations falls within the broad definition of remote sensing, although the target or surface being analyzed is close to the sensor, which may be exterior to the body or can be inserted inside the body to examine internal organs. Electromagnetic radiation is the sensing medium in most analyses. Both active and passive sensors are used. The usual end product is an image. Most medical remote sensing is designed to "see into" the body without having to be invasive (cutting it open). Some techniques produce only static images; others can actually display the features being examined in dynamic, real time images which show the functional movements of the organ(s) within the body. On this page, one of the earliest medical imaging devices - x-ray units that use radiography or fluorescence to produce the pictures - is described.

The writer (NMS) gratefully thanks his physician, Dr. Harry Rose of the Geisinger Medical Group, Bloomsburg Hospital, for his thorough review of the content of this 3-page subsection in terms of its medical accuracy and relevance.

The very presence of the material on these next three pages comes as a surprise to the writer: the idea of including applications of remote sensing to the medical profession's need to examine humans and animals with sophisticated imaging instruments never once occurred to him in the first five years in which the Tutorial was being developed. (No one else ever called this omission to his attention.) But then, in May of 2002, he was given an echocardiogram (which uses ultrasonic waves to image the body's interior) and a light in his brain flashed on: many of the techniques of using high-powered instruments to send electromagnetic and sonic waves into the human target (we shall assume that "animal" is implicitly included, since at least some of these instruments have been used to examine dogs, cats, horses, etc.) fall within the broader definition of remote sensing. Electromagnetic radiation photons (at different wavelengths) or sonic wave trains are generated and coupled to the body and then detected as transmitted, absorbed, or reflected signals by an external detector a short distance away. Most medical remote sensing is of the active mode, i.e., EM radiation or acoustical waves generated by the instrument are sent into or through the body. Some examinations of bodily functions depend on implanting (by injection or swallowing) a source of radiation, such as radioactive element(s) used as tracers, which can be sensed by appropriate detectors as they move about (as in the blood) or concentrate within organs; this approach is analogous to passive remote sensing.

The subject of medical remote sensing (more commonly referred to as medical imaging) is now a major topic covered on the Internet. One quick overview is found at this site put together at the Lawrence Berkeley Laboratory. Another of general interest is at this site. A third broad review site is the oft-consulted How Stuff Works site. Once there, click first on "Body and Health" and when that comes up choose "Health Care" and look for the box category of interest (one of the imaging methods); or, type in the specific method in the Search box. Finally, consult a very informative survey of principal imaging methods, including the physics involved, prepared by Dr. D. Rampolo.

In the Tutorial, we will cover these methods: X-ray Radiography; X-ray Fluoroscopy; Computer Assisted Tomography (CAT scans); Magnetic Resonance Imaging (MRI); Single Photon Emission Computed Tomography (SPECT); Positron Emission Tomography (PET); CAT scan-SPECT combined; Infrared Imaging Thermography; Ultrasound; and Endoscopy. PET and SPECT also fall in the realm of nuclear medicine. These various methods can produce "static" images or can be viewed in real time to examine "movements" within the body. Also, some methods concentrate on skeletal parts (bone), others on internal organs (e.g., brain; heart; kidneys), and others on circulation and other functions. Most methods are used to detect abnormalities such as malignant growths, bone breaks, and disease effects.

Modern medical imaging began with an almost accidental discovery in the lab of Professor Wilhelm Roentgen in Germany on a November day in 1895.

Wilhelm Roentgen's Laboratory.

Roentgen was experimenting with a Crookes Tube he had recently obtained from its inventor. This is a glass vessel from which air is withdrawn, creating a near vacuum; at one end is an anode (positively charged) and at the other a cathode (negatively charged source of electrons); the tube is wired into an electrical circuit. When a current is passed between these electrodes, the few particles within the tube are excited and fluoresce or glow (commonly blue or green); this results from the flow of high speed electrons (cathode rays) across the (voltage) potential difference imposed in the circuit. Roentgen had placed the Tube in a black box but to his amazement noted that a fluorescent screen nearby was glowing, its phosphors excited, he deduced, by radiation escaping the box. This unknown (X) radiation he simply labelled X-rays (they are also called Roentgen rays). As he studied their properties, he experimented by putting a hand on a fluorescent screen directly in the path of this radiation, getting this famous picture:

Colorized version of the first image of an x-rayed hand.

Soon others were experimenting with x-rays. The first medical uses of x-ray machines occurred within a year. Roentgen's achievement was recognized in 1901 when he received the first Nobel Prize in Physics. A fascinating account of his discovery is given at this Internet site.

X-rays are produced when electrons are impelled against an anode metal target (tungsten; copper; molybdenum; platinum; others) as they pass through a vacuum tube at high speeds, driven by voltages from 10 to 1000 kilovolts (kV). When incoming electrons interact with inner electrons in the metal, the latter are driven momentarily to higher energy levels (these orbital electrons are pushed into outer orbitals); when these excited electrons drop back to their initial orbits (a transition from a higher to a lower energy level), the energy they acquired is given off as radiation, including x-rays. Some of the emitted x-rays are collimated into beams (typically at conical angles up to 35°) that are directed towards targets (such as the human body). Soft body tissue absorbs fewer x-rays, i.e., passes more of the radiation, whereas bone and other solids prevent most of the x-rays from transmitting through the body mass. (X-rays have other uses, such as examining metals for flaws or determining crystal structure.) Here is a diagram of a typical x-ray machine setup:

Schematic diagram of an x-ray generating tube assembly and of its use in examining the human body.
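The tube voltage sets a hard upper limit on the photon energies produced: an electron accelerated through V volts carries energy eV, so the shortest emitted wavelength (the Duane-Hunt limit) is λ_min = hc/eV. A quick numerical check of this relation:

```python
# Duane-Hunt limit: shortest x-ray wavelength from a tube at a given voltage
PLANCK = 6.626e-34           # Planck constant, J*s
LIGHT_SPEED = 2.998e8        # speed of light, m/s
ELECTRON_CHARGE = 1.602e-19  # elementary charge, C

def min_wavelength_m(tube_kilovolts):
    """Wavelength emitted if all of an electron's kinetic energy e*V
    is converted into a single photon."""
    return PLANCK * LIGHT_SPEED / (ELECTRON_CHARGE * tube_kilovolts * 1e3)

# A 100 kV tube: about 0.012 nm, well within the x-ray region
print(min_wavelength_m(100) * 1e9, "nm")
```

Raising the voltage shortens this minimum wavelength (harder, more penetrating x-rays), which is why tube voltage is the operator's main control over beam penetration.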

Two classes of detectors record the x-ray-generated image: 1) Photographic film, in which the difference in gray levels or tones relates to varying absorption of the radiation in the beam impinging on the target. The x-rays act on the silver halide (see page 12 of this Introduction) to reduce it to metallic silver grains. The convention is to use the exposed film in its negative form, so that bone appears nearly white: because bone absorbs efficiently, few x-rays strike the corresponding part of the film, leaving it largely unexposed, while soft tissue passes much more radiation and darkens the film. 2) Fluorescent screens, in which phosphors (elements or compounds that fluoresce or phosphoresce) coat a substrate. In a process similar to x-ray production, electrons in the phosphors jump to higher level orbitals, with visible light given off either instantly when the electrons transition back to the lower state, or with a delay of fractions of a second or seconds (afterglow). Typical phosphors include calcium tungstate or barium lead sulphate; many other compounds are available, such as lead oxide or those containing gadolinium or lanthanum. In certain configurations these screens allow real-time movements of the medical patient to be observed, and the screen images can be photographed or digitized.
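The tonal contrast on the film follows the exponential attenuation law I = I0·e^(-μx), where μ is a material's linear attenuation coefficient and x the thickness traversed. The coefficients below are rough, assumed values for diagnostic-energy x-rays, chosen only to illustrate the bone/tissue contrast:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert attenuation: fraction of beam intensity surviving
    passage through the given thickness of material."""
    return math.exp(-mu_per_cm * thickness_cm)

# Assumed illustrative attenuation coefficients (not tabulated data)
MU_BONE = 0.5    # per cm
MU_TISSUE = 0.2  # per cm

# 2 cm of bone blocks far more of the beam than 2 cm of soft tissue,
# so the film behind bone stays largely unexposed and prints near-white
print(transmitted_fraction(MU_BONE, 2.0))
print(transmitted_fraction(MU_TISSUE, 2.0))
```

The exponential form also explains why small thickness differences in dense material produce large brightness differences on the film.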

Here is a typical hospital examining room that contains the setup used in x-ray radiology; the table on which the patient lies can, in some instruments, be raised to a vertical position.

X-ray radiological instrument.

X-ray radiology is still the most commonly used medical imaging technique. Here is a sequence of images that illustrate typical uses and results. The first is a chest x-ray (the skeletal bones are whitish, since they absorb the radiation and thus the negative is not darkened, and the lungs dark, because more of the radiation has passed through them):

Chest x-ray of a patient with healthy lungs

This next is a front and side view of the upper torso; the arrow points to a tuberculosis patch in the left lung:

Two chest x-rays of a patient with tuberculosis.

Here is a negative x-ray film image of the pelvic area:

X-ray of the pelvic-lower backbone-abdominal area of a patient.

Compare this recent image of the human hand with that shown above as the first ever taken:

X-ray picture of a hand and wrist.

This next picture is a mammogram showing a growth in the female breast:

X-ray mammogram of a woman's breast showing veining and an abnormal growth.

The human skull is x-rayed mainly to spot signs of fracture. But, sometimes indications of tumors are present, as shown by the darker gray patch in the cranium of this individual's skull:

The upper part of the skull, with a dark patch due to a tumor.

The jaw and teeth are evident in this lateral view of the lower human skull:

The jaw and teeth structure in a human skull

Most of us gain our first experience and insight into x-rays when we have a small film inserted into our mouth and then the x-ray machine is placed against that part of our jaw. Here is a typical x-ray image of teeth, in which the whitest part of the negative corresponds to metal fillings:

Dental x-ray image of several teeth.

An important variation on x-ray radiography is Fluoroscopy. In this method, chemicals that react with x-rays are either swallowed, inserted as an enema, or injected as chemicals/dyes into the blood stream. These tend to increase the contrast between soft tissue response in the parts of the body receiving these fluids and surrounding bone and tissue. This pictorially highlights abnormalities.

Barium sulphate is a good example. When swallowed (either at once or, commonly, in gulps), the "Barium Cocktail" is especially useful in examining the digestive tract. In this image, an obstruction in the esophagus carrying food and liquids into the stomach is made evident:

X-ray fluoroscopic image of part of the esophagus and trachial areas in the neck and upper chest; normal conditions. X-ray fluoroscopic image of constricted esophagus.

The large intestine or colon is strikingly emphasized in a patient who has just received a Barium enema:

X-ray fluoroscopic image of the large intestine.

Still another variant is the Angiogram. This involves inserting a catheter into an artery, accompanied by a dye that is opaque to x-rays. It is commonly used to explore the areas in and around the heart. Here is a pair of views of the left ventricle of the heart as it contracts to pump blood out (systolic phase) and then expands as blood returns (diastolic phase):

Systolic phase of heart's beating; angiogram centered on the left ventricle. The corresponding diastolic phase.

This next image is an angiogram that has been colored to show blood vessels including the great trunk artery or aorta around the heart:

Angiogram of the heart.

Using special methods, angiogram-like images can be made for the blood vessels in the human head:

Angiogram of the head showing certain arteries and veins.

We move on now to a powerful newer approach to medical imaging: tomography, in which computers assist in obtaining three-dimensional images or image slices from radiation produced by either x-rays or radioactive elements (nuclear medicine).
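To make the computational idea behind tomography concrete, here is a minimal sketch of the backprojection principle used in computed reconstruction. It is an illustrative toy, not any scanner's actual algorithm: a tiny "phantom" slice is projected at four angles (forming a sinogram), and each 1-D projection is then smeared back across the image plane and summed. The function names and the 8×8 phantom are invented for this example.

```python
import numpy as np

def project(image, k):
    """Parallel-beam projection of `image` rotated by k*90 degrees:
    sum the pixel values along each row of the rotated image."""
    return np.rot90(image, k).sum(axis=1)

def backproject(sinogram, shape):
    """Smear each 1-D projection back across the image plane at its
    own angle and accumulate. This simple (unfiltered) backprojection
    yields a blurred but recognizable reconstruction."""
    recon = np.zeros(shape)
    for k, profile in enumerate(sinogram):
        # Broadcast the profile across all columns, then undo the rotation.
        smear = np.tile(profile[:, None], (1, shape[1]))
        recon += np.rot90(smear, -k)
    return recon

# Toy "body section": a single dense spot in an otherwise empty slice.
phantom = np.zeros((8, 8))
phantom[2, 5] = 1.0

sinogram = [project(phantom, k) for k in range(4)]  # 0, 90, 180, 270 degrees
recon = backproject(sinogram, phantom.shape)

# The brightest reconstructed pixel coincides with the dense spot.
peak = tuple(map(int, np.unravel_index(recon.argmax(), recon.shape)))
print(peak)
```

Real CT systems use many more viewing angles and apply a filter to each projection before backprojection to remove the blur, but the smear-and-sum step shown here is the conceptual core.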

Concluding Remarks

This Introduction closes by citing sources of information such as the EROS Data Center (EDC) and calling attention to the commercialization of Earth-monitoring satellites; reference is made to the IDRISI image processing program.

Most of the Landsat images appearing in many of the 21 sections of this Tutorial are individual TM bands or color composites made from diverse combinations of three TM bands, along with a considerable number of MSS images. Also appearing are selected images acquired by SPOT, IRS, JERS, Terra, and others, as well as images made from radar and thermal sensors flown on satellites and Space Shuttle missions. A principal source for Landsat imagery and much of the astronaut space photography is the EROS Data Center (EDC) in Sioux Falls, S.D., operated by the U.S. Geological Survey (USGS). Where appropriate or necessary, the principles underlying the operation of those sensors are explained in the text accompanying these sections.

The USGS's EROS Data Center has assembled an Internet on-line collection of satellite images (mainly Landsat) designed to introduce the general viewer to scenes worldwide that focus on several environmental themes (e.g., cities, deserts, forests). We have linked their three-page Index for this Earthshot collection, which you can access here. To further stimulate interest in the practicality of this imagery, in addition to its beauty, we suggest taking time to sample scenes of interest and read the accompanying descriptions. We also recommend returning to this collection for repeat looks or to examine new examples. A recent Web site dedicated to showing a wide variety of Earth Science-related images from various satellites is at NASA's View Earth.

Throughout the Tutorial, we will introduce some of the computer-based data processing techniques that are employed to extract information from space imagery. We present the main elements of image processing in the extended analysis of a single image used throughout Section 1. There we consider a Landsat image subscene centered on the oceanside town of Morro Bay, California. For that analysis, and for other images in Sections 2 and 5 and the Final Exam, we applied the IDRISI Program modules, developed in part as a training tool for image processing and Geographic Information Systems (GIS) by the Geography Department at Clark University. To learn more about their system, contact them by e-mail at: Idrisi@clarku.edu . However, the capstone of this instruction in image processing techniques is found in Appendix B, where we describe the Photo Interpretation Toolkit (PIT), an interactive processing software package. Working with a training image, we walk you through many of the processing routines, ending with an opportunity to classify images into thematic maps. But, before trying these procedures, we suggest you complete Section 1, and perhaps several sections that follow it.

References to foreign (non-U.S.) remote sensing systems now operating and to U.S. distribution centers, such as EROS and SpaceImaging, indicate that remote sensing is now truly a worldwide activity. The big trend of the 1990s, and into the next century, is the commercialization of space. Remote sensing is becoming a multi-billion dollar industry, and new national organizations and private companies emerge each year to take advantage of its income-producing aspects, as they identify many applications (some covered in this Tutorial) of great practical value. There is even a monthly magazine, EOM, dedicated to these increasing uses. We shall touch more fully upon trends and issues in the commercialization of remote sensing when we review the outlook for the future of remote sensing in Section 21.

With this introduction to basic principles and to the characteristics of Landsat and other systems completed, and, we hope, digested and understood, you should move on to Section 1, with its protracted tutorial development of the "whys" and "hows" of image processing. There you should gain real insight into, and practice with, the efficacies of remote sensing from satellites and other types of air and space platforms.