Remote sensing

Remote sensing can be defined as the art and science of obtaining information about Earth's features (or, for that matter, those of other planets) from measurements made at a distance. It is an essential part of modern geography, and it also has extensive military applications, especially in imagery intelligence and measurement and signature intelligence, as well as in warfare itself, for such tasks as battle damage assessment.

General geography

Remotely sensed data comes in many forms, such as satellite imagery, aerial photography, and data obtained from hand-held sensors. Geographers increasingly use remotely sensed data to obtain information about the Earth's land surface, ocean and atmosphere because it:

  • supplies objective information at a variety of spatial scales (local to global)
  • provides a synoptic view of the area of interest
  • allows access to distant and/or inaccessible sites
  • provides spectral information outside the visible portion of the electromagnetic spectrum
  • facilitates studies of how features/areas change over time.

The information is often stored in Geographic Information Systems (GIS), which, among other things, map observations to a common system of location references.
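
As a minimal sketch of that mapping, the example below applies the six-parameter affine geotransform convention used by many raster GIS packages (e.g., GDAL-style) to convert a pixel's row and column into map coordinates. The corner coordinate and 30 m pixel size are illustrative assumptions, not values from any particular dataset.

```python
# Minimal sketch: mapping a raster pixel (row, col) to map coordinates
# using a six-parameter affine geotransform, the convention used by
# many GIS packages. The numbers below are illustrative only.

def pixel_to_map(geotransform, row, col):
    """Return (x, y) map coordinates of the upper-left corner of a pixel."""
    x0, dx, rx, y0, ry, dy = geotransform
    x = x0 + col * dx + row * rx
    y = y0 + col * ry + row * dy
    return x, y

# 30 m pixels, north-up image whose upper-left corner is at (500000, 4600000)
gt = (500000.0, 30.0, 0.0, 4600000.0, 0.0, -30.0)
print(pixel_to_map(gt, row=100, col=200))  # -> (506000.0, 4597000.0)
```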

Processing

Because much of the spatial data used in GIS is based on remotely sensed images, the software systems designed for GIS and those designed for processing remotely sensed images are intimately related: many of the major GIS packages include at least basic image processing capabilities, and image processing applications often include basic GIS functions. The image processing capabilities of most GIS platforms consist primarily of geometric and spectral correction, rectification, and classification utilities.
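
As one example of the classification utilities mentioned above, the sketch below computes the widely used Normalized Difference Vegetation Index (NDVI) from red and near-infrared reflectances and thresholds it into rough land-cover classes. The reflectance values and class thresholds are illustrative assumptions, not a standard scheme.

```python
# Minimal sketch of a spectral classification step of the kind a GIS
# image-processing module provides: compute NDVI per pixel and bin it
# into land-cover classes. Thresholds here are illustrative assumptions.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def classify(nir, red):
    v = ndvi(nir, red)
    if v > 0.4:
        return "dense vegetation"
    if v > 0.2:
        return "sparse vegetation"
    if v > 0.0:
        return "bare soil"
    return "water/cloud"

print(classify(nir=0.45, red=0.08))  # -> dense vegetation
```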

Applications

Remotely sensed data may be analyzed either independently of, or in conjunction with, other digital data layers (e.g., in a GIS). In the United States intelligence community, geospatial intelligence is a term of art for the combination of images with precise locations of the places being imaged; the term also can be applied to non-imaging intelligence collection technologies for which the exact location is important.

Basic interaction of sensors with targets

Remote sensing depends on the interaction of a source of energy with a target, and on the measurement of the energy coming from the target.[1] In the "Remote Sensing" diagram, Source 1a is an independent natural source such as the Sun. Source 1b is a source, perhaps manmade, that illuminates the target, such as a searchlight or ground radar transmitter. Source 1c is a natural source, such as the heat of the Earth, with which the Target interferes.

[Figure: Remote Sensing: relationships between radiation source, target, and sensor]

The target itself may produce emitted radiation, such as the glow of a red-hot object, which Sensor 2 measures. Alternatively, Sensor 1 might measure, as reflected radiation, the interaction of the Target with Source 1a, as in conventional sunlit photography. If the energy comes from Source 1b, Sensor 1 is doing the equivalent of photography by flash.

Source 3a is under the observer's control, such as a radar transmitter, and Sensor 3b can be tightly coupled to Source 3a. An example of coupling is that Sensor 3b looks for backscattered radiation only after the speed-of-light delay from Source 3a to the target and back to the position of Sensor 3b. With radar, waiting for a signal at a known time in this way is an example of electronic counter-countermeasures (ECCM): a signal from a jamming aircraft closer to Sensor 3b than the target would be ignored.
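
The sketch below illustrates that timing discipline, often called range gating, under simplified assumptions: the receiver accepts an echo only if it arrives within a narrow window around the expected round-trip delay to the target, so energy arriving from a platform at a different range falls outside the gate. The ranges and gate width are illustrative.

```python
# Minimal sketch of range gating: Sensor 3b accepts a return only near
# the expected round-trip speed-of-light delay to the target, so a
# signal consistent with a closer range is rejected. Values illustrative.

C = 299_792_458.0  # speed of light, m/s

def round_trip_delay(range_m):
    return 2.0 * range_m / C

def gate_accepts(echo_time_s, expected_range_m, gate_width_s=1e-6):
    """Accept an echo only if it arrives within the gate around the
    expected round-trip delay."""
    return abs(echo_time_s - round_trip_delay(expected_range_m)) <= gate_width_s / 2

target_range = 30_000.0   # 30 km target
jammer_range = 10_000.0   # jammer closer to the sensor

# Model the jammer's energy as arriving with the delay of its closer range.
print(gate_accepts(round_trip_delay(target_range), target_range))  # True
print(gate_accepts(round_trip_delay(jammer_range), target_range))  # False: gated out
```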

A bistatic remote sensing system separates source 3a from sensor 3b; a multistatic system can have multiple pairs of coupled sources and sensors, or an uneven ratio of sources and sensors, as long as all are correlated. Bistatic and multistatic sensing has significant value in shallow-water operations.[2]
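
A minimal sketch of the bistatic geometry, with illustrative coordinates in a flat local frame: the quantity an echo's delay measures in such a system is the bistatic range, the sum of the source-to-target and target-to-receiver distances.

```python
# Minimal sketch: with source 3a and sensor 3b separated, an echo's
# delay measures the bistatic range (source-to-target plus
# target-to-receiver distance). Coordinates are illustrative metres.

import math

def bistatic_range(source, target, receiver):
    return math.dist(source, target) + math.dist(target, receiver)

src = (0.0, 0.0)        # transmitter position
rcv = (20_000.0, 0.0)   # receiver 20 km away
tgt = (10_000.0, 5_000.0)

print(bistatic_range(src, tgt, rcv))  # ~22360.7 m
```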

Techniques such as synthetic aperture radar have source 3a and sensor 3b colocated, but the source-sensor pair takes multiple measurements over time, along the flight path of a moving aircraft or satellite, giving the effect of a much longer physical aperture. Inverse synthetic aperture radar works in a similar way, but with a different spatial relationship between the flight path and the target.
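
The sketch below runs the standard back-of-the-envelope numbers for why the synthetic aperture matters: a real antenna's azimuth resolution degrades with range, while synthesizing an aperture along the flight path yields a resolution of roughly half the real antenna length, independent of range. The wavelength, antenna length, and slant range are illustrative assumptions.

```python
# Minimal sketch of the synthetic aperture effect: a real antenna of
# length D at wavelength lam and range R resolves ~ lam*R/D in azimuth;
# synthesizing an aperture of length L_sa = lam*R/D improves this to
# ~ D/2, independent of range. Numbers are illustrative.

lam = 0.03        # 3 cm wavelength (X-band), m
D = 2.0           # real antenna length, m
R = 800_000.0     # slant range from satellite, m

real_res = lam * R / D          # real-aperture azimuth resolution
L_sa = lam * R / D              # synthetic aperture length
sar_res = lam * R / (2 * L_sa)  # = D / 2

print(f"real aperture: {real_res:.0f} m, synthetic aperture: {sar_res:.1f} m")
# -> real aperture: 12000 m, synthetic aperture: 1.0 m
```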

There are passive sensing techniques that take measurements over time, using multiple synchronized sensors. Astronomical techniques such as very long baseline interferometry, in which multiple sensors move with the rotation of the Earth, also inspire Earth-oriented techniques; after all, everything an astronomer observes is remote. Microsatellite constellations use multiple synchronized sensors in Earth orbit.

Any of the illuminations of the target (i.e., Source 1a, 1b, or 3a), and the returning radiation, can be affected by the atmosphere, or other natural phenomena such as the ocean, between source and target, or between target and sensor.

Depending on the type of radiation and sensor in use, the atmosphere can have little interfering effect, or a tremendous effect requiring extensive engineering to overcome. The atmosphere may absorb part of the energy passing through it. This is troublesome enough for sensing when all wavelengths are affected evenly, but it becomes much more complex when the radiation spans multiple wavelengths and the attenuation differs among them. In addition, the atmosphere may cause an otherwise tightly collimated energy beam to spread.
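
A minimal sketch of that wavelength dependence, assuming simple Beer-Lambert extinction; the per-band extinction coefficients below are invented for illustration, not measured values.

```python
# Minimal sketch of wavelength-dependent atmospheric attenuation
# (Beer-Lambert law): transmitted fraction = exp(-k * path_length),
# where the extinction coefficient k differs per wavelength band.
# Coefficients are illustrative assumptions.

import math

def transmittance(k_per_km, path_km):
    return math.exp(-k_per_km * path_km)

# hypothetical extinction coefficients (per km) for three bands
bands = {"blue": 0.25, "red": 0.10, "near-IR": 0.05}

for name, k in bands.items():
    print(f"{name:8s} over 10 km: {transmittance(k, 10.0):.1%} transmitted")
# blue transmits ~8%, red ~37%, near-IR ~61%: uneven attenuation
```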

Classes of sensor

Sensing systems have five major subcomponents:

  • Signal collectors, which concentrate the energy, as with a telescope lens, or a radar antenna that focuses the energy at a detector
  • Signal detectors, such as charge-coupled devices for light or a radar receiver
  • Signal processing, which may remove artifacts from single images, or compute a synthetic image from multiple views
  • Recording mechanism
  • Recording return mechanisms, such as digital telemetry from satellites or aircraft, ejection systems for recorded media, or physical return of a sensor carrier with the recordings aboard.

Sensors may be framing, scanning, or synthetic. A framing sensor, such as a conventional camera, records the received radiation as a single object. Scanning systems use a detector that moves across the field of radiation to create a raster or more complex object. Synthetic systems combine multiple objects into a single one.

Sensors may be passive or coupled to an active source (i.e., "active sensor"). Passive sensors receive radiation from the target, either from the energy the target emits, or from other sources not synchronized with the sensor.

Most sensors will create digital recordings or transmissions, but specific cases might use film recording, analog recording or transmissions, or even more specialized means of capturing information.

Passive sensing

Figure "Remote Sensing Geometry" illustrates several key aspects of a scanning sensor.

[Figure: Remote Sensing Geometry: relationships between scanning sensor and target]

The instantaneous field of view (IFOV) is the area from which radiation currently impinges on the detector. The swath width is the distance, centered on the sensor path, from which signal will be captured in a single scan. Swath width is a function of the angular field of view (AFOV) of the scanning system.
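
A small worked example of this geometry, under a flat-earth approximation: the ground footprint of a single detector is roughly the platform altitude times the IFOV in radians, and the swath width follows from the AFOV. The altitude and angles are illustrative assumptions.

```python
# Minimal sketch relating scanner geometry to ground coverage:
# footprint ~ altitude * IFOV (radians); swath from the AFOV.
# Altitude and angles are illustrative.

import math

H = 700_000.0    # platform altitude, m
ifov = 0.05e-3   # instantaneous field of view, rad
afov_deg = 15.0  # angular field of view, degrees

footprint = H * ifov                                  # ground pixel size
swath = 2 * H * math.tan(math.radians(afov_deg) / 2)  # flat-earth approx.

print(f"ground footprint: {footprint:.0f} m, swath width: {swath/1000:.0f} km")
# -> ground footprint: 35 m, swath width: 184 km
```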

Push broom sensors either have a sufficiently large IFOV, or the scan moves fast enough with respect to the forward speed of the sensor platform, that an entire swath width is recorded without movement artifacts. These sensors are also known as survey or wide field devices, comparable to wide angle lenses on conventional cameras.

Whisk broom or spotlight sensors have the effect of stopping the scan, and focusing the detector on one part of the swath, typically capturing greater detail in that area. This is also called a close look scanner, comparable to a telephoto lens on a camera.

Passive sensors can capture information for which there is no way to generate man-made radiation, such as gravity. Geodetic passive sensors can provide detailed information on the geology or hydrology of the Earth.

Active sensors

Active sensors are conceptually of two types, imaging and non-imaging. Especially when combining classes of sensor, such as MASINT and IMINT, it can be hard to determine whether a given MASINT sensor is imaging or not. In general, however, MASINT measurements are mapped to pixels of a clearly imaging system, or to geospatial coordinates known precisely to the MASINT sensor-carrying platform.

In MASINT, the active signal source can be anywhere in the electromagnetic spectrum, from radio waves to X-rays, limited only by the propagation of the signal from the source. X-ray sources, for example, must be in very close proximity to the target, while lasers can illuminate a target from a high satellite orbit. While this discussion has emphasized the electromagnetic spectrum, there are also both active (e.g., sonar) and passive (e.g., hydrophone and microbarograph) acoustic sensors.

Quality of sensing

Several factors make up the quality of a given sensor's information acquisition, but assessing quality can become quite complex when the end product combines the data from multiple sensors. Several factors, however, are commonly used to characterize the basic quality of a single sensing system; a small worked example follows the list.

  • Spatial resolution defines the correspondence between each recorded pixel and the square real-world area that the pixel covers.
  • Spectral resolution is the number of discrete frequency (or equivalent) bands recorded in an individual pixel. Note that a sensor with coarse spatial but fine spectral resolution, such as a spectroscopic analyzer that reveals a "bush" to be painted plaster, may greatly enhance the ultimate value of data from a different sensor with finer spatial resolution.
  • Radiometric resolution is the number of levels of energy recorded, per pixel, in each spectral band.
  • Temporal resolution describes the intervals at which the target is sensed. This is meaningful only in synthetic imaging, in comparisons over a longer time base, or in producing full-motion imagery.
  • Geospatial resolution is the quality of mapping pixels, especially in multiple passes, to known geographic or other stable references.
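
As a small worked example of how the first three figures interact, the sketch below converts radiometric bits into levels and combines spatial, spectral, and radiometric resolution into the raw data volume of one scene; all parameters are illustrative assumptions.

```python
# Small worked example: how spatial, spectral, and radiometric
# resolution combine into raw data volume for one scene.
# All parameters are illustrative assumptions.

pixels_x = 8000  # scene width, pixels
pixels_y = 8000  # scene height, pixels
bands = 7        # spectral resolution: number of bands
bits = 8         # radiometric resolution: 2**8 = 256 levels per band

levels = 2 ** bits
scene_bits = pixels_x * pixels_y * bands * bits

print(f"{levels} levels per band, {scene_bits / 8 / 2**20:.0f} MiB per scene")
# -> 256 levels per band, 427 MiB per scene
```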

References

  1. Meaden, Geoffrey J. & James M. Kapetsky (1991), Geographical information systems and remote sensing in inland fisheries and aquaculture. Chapter 4: Remote Sensing as a Data Source.
  2. National Academy of Sciences Commission on Geosciences, Environment and Resources (April 29-May 2, 1991). Symposium on Naval Warfare and Coastal Oceanography.
