Learn Earth Science & Geology
Make your knowledge perfect
Sunday, 24 March 2019
What is Remote Sensing?
So, what exactly is remote sensing? For the purposes of this
tutorial, we will use the following definition:
"Remote sensing is the science (and to some extent, art) of
acquiring information about the Earth's surface without actually being in
contact with it. This is done by sensing and recording reflected or emitted
energy and processing, analyzing, and applying that information."
In much of remote sensing, the process involves an
interaction between incident radiation and the targets of interest. This is
exemplified by the use of imaging systems where the following seven elements
are involved. Note, however, that remote sensing also involves the sensing of
emitted energy and the use of non-imaging sensors.
1. Energy Source or Illumination (A) - the first
requirement for remote sensing is to have an energy source which illuminates or
provides electromagnetic energy to the target of interest.
2. Radiation and the Atmosphere (B) - as the energy
travels from its source to the target, it will come in contact with and
interact with the atmosphere it passes through. This interaction may take place
a second time as the energy travels from the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from, the target, a sensor (remote - not in contact with the target) is required to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image.
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally, to extract information about the target.
7. Application (G) - the final element is achieved when we apply the information extracted from the imagery to better understand the target, reveal new information, or assist in solving a particular problem.
Electromagnetic Radiation
As was noted in the previous section, the first requirement for remote
sensing is to have an energy source to illuminate
the target (unless the sensed energy is being emitted by the target). This
energy is in the form of electromagnetic radiation.
All electromagnetic radiation has fundamental properties and behaves in
predictable ways according to the basics of wave theory. Electromagnetic
radiation consists of an electrical field (E) which varies in magnitude
in a direction perpendicular to the direction in which the radiation is
traveling, and a magnetic field (M) oriented at right angles to the electrical
field. Both these fields travel at the speed of light (c).
Two characteristics of electromagnetic radiation are particularly
important for understanding remote sensing. These are the wavelength
and frequency.
The wavelength is the length of one wave cycle, which can be measured as
the distance between successive wave crests. Wavelength is usually represented
by the Greek letter lambda (λ). Wavelength is measured in metres (m)
or some factor of metres such as nanometres (nm, 10⁻⁹ metres), micrometres
(µm, 10⁻⁶ metres) or centimetres (cm, 10⁻² metres).
Frequency refers to the number of cycles of a wave passing a fixed point per
unit of time. Frequency is normally measured in hertz (Hz),
equivalent to one cycle per second, and various multiples of hertz.
Wavelength and frequency are related by c = λν (the speed of light equals
wavelength multiplied by frequency); therefore, the two are inversely related
to each other. The shorter the
wavelength, the higher the frequency. The longer the wavelength, the lower the
frequency. Understanding the characteristics of electromagnetic radiation in
terms of their wavelength and frequency is crucial to understanding the
information to be extracted from remote sensing data. Next we will be examining
the way in which we categorize electromagnetic radiation for just that purpose.
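The inverse relationship between wavelength and frequency can be sketched with a short calculation (a minimal illustration; the function name and the rounded speed of light are our own):

```python
# Wavelength and frequency are linked by c = wavelength * frequency,
# so the two are inversely related.
C = 3.0e8  # speed of light in m/s (rounded)

def frequency_hz(wavelength_m: float) -> float:
    """Frequency (Hz) of electromagnetic radiation of the given wavelength (m)."""
    return C / wavelength_m

# Green light (~0.55 micrometres) has a far higher frequency than a 3 m radio wave:
print(frequency_hz(0.55e-6))  # ~5.5e14 Hz
print(frequency_hz(3.0))      # 1.0e8 Hz (100 MHz)
```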
The Electromagnetic Spectrum
The electromagnetic spectrum ranges from the shorter
wavelengths (including gamma and x-rays) to the longer wavelengths (including
microwaves and broadcast radio waves). There are several regions of the
electromagnetic spectrum which are useful for remote sensing.
For most purposes,
the ultraviolet or UV portion of the spectrum has the shortest
wavelengths which are practical for remote sensing. This radiation is just
beyond the violet portion of the visible wavelengths, hence its name. Some
Earth surface materials, primarily rocks and minerals, fluoresce or emit
visible light when illuminated by UV radiation.
The longest
visible wavelength is red and the shortest is violet. Common wavelengths of
what we perceive as particular colours from the visible portion of the spectrum
are listed below. It is important to note that this is the only portion of the
spectrum we can associate with the concept of colours.
Blue, green, and red are the primary
colours or wavelengths of the visible spectrum. They are defined as
such because no single primary colour can be created from the other two, but
all other colours can be formed by combining blue, green, and red in various
proportions. Although we see sunlight as a uniform or homogeneous colour, it is
actually composed of various wavelengths of radiation in primarily the ultraviolet,
visible and infrared portions of the spectrum. The visible portion of this
radiation can be shown in its component colours when sunlight is passed through
a prism, which bends the light in differing amounts according to
wavelength.
Mie scattering occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke and water vapour are common
causes of Mie scattering which tends to affect longer wavelengths than those
affected by Rayleigh scattering. Mie scattering occurs mostly in the lower
portions of the atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast.
- Violet: 0.4 - 0.446 µm
- Blue: 0.446 - 0.500 µm
- Green: 0.500 - 0.578 µm
- Yellow: 0.578 - 0.592 µm
- Orange: 0.592 - 0.620 µm
- Red: 0.620 - 0.7 µm
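As a rough illustration, these band boundaries can drive a simple lookup (the function and band table below are our own, built from the values listed above):

```python
# Map a wavelength (in micrometres) to the visible colour bands listed above.
BANDS = [
    ("violet", 0.400, 0.446),
    ("blue",   0.446, 0.500),
    ("green",  0.500, 0.578),
    ("yellow", 0.578, 0.592),
    ("orange", 0.592, 0.620),
    ("red",    0.620, 0.700),
]

def colour_of(wavelength_um: float) -> str:
    """Name of the visible colour band containing the given wavelength."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_um < hi:
            return name
    return "outside the visible range"

print(colour_of(0.55))  # green
print(colour_of(1.0))   # outside the visible range (near-infrared)
```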
The next portion of the spectrum of interest is the infrared (IR) region,
which covers the wavelength range from approximately 0.7 µm to 100 µm - more than 100
times as wide as the visible portion! The infrared region can be divided into
two categories based on their radiation properties - the reflected IR,
and the emitted or thermal IR. Radiation in the reflected IR region
is used for remote sensing purposes in ways very similar to radiation in the
visible portion. The reflected IR covers wavelengths from approximately
0.7 µm to 3.0 µm. The thermal IR region is quite
different from the visible and reflected IR portions, as this energy is
essentially the radiation that is emitted from the Earth's surface in the form
of heat. The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm.
The portion of the spectrum of more recent interest to remote sensing is
the microwave region, from about 1 mm to 1 m. This covers the longest
wavelengths used for remote sensing. The shorter wavelengths have properties
similar to the thermal infrared region while the longer wavelengths approach
the wavelengths used for radio broadcasts. Because of the special nature of
this region and its importance to remote sensing in Canada, an entire chapter
(Chapter 3) of the tutorial is dedicated to microwave sensing.
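The broad regions described so far can be summarised in a small classifier (a hypothetical helper of our own; the boundaries are the approximate values from the text):

```python
# Broad spectral regions used in remote sensing, keyed by wavelength in metres.
# Gaps (e.g. the far IR between 100 micrometres and 1 mm) fall through to "other".
REGIONS = [
    ("visible",      0.4e-6, 0.7e-6),
    ("reflected IR", 0.7e-6, 3.0e-6),
    ("thermal IR",   3.0e-6, 100e-6),
    ("microwave",    1e-3,   1.0),
]

def spectral_region(wavelength_m: float) -> str:
    """Name of the broad spectral region containing the given wavelength."""
    for name, lo, hi in REGIONS:
        if lo <= wavelength_m < hi:
            return name
    return "other"

print(spectral_region(0.55e-6))  # visible
print(spectral_region(10e-6))    # thermal IR
print(spectral_region(0.056))    # microwave (e.g. a ~5.6 cm radar wavelength)
```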
Before radiation used
for remote sensing reaches the Earth's surface it has to travel through some
distance of the Earth's atmosphere. Particles and gases in the atmosphere can
affect the incoming light and radiation. These effects are caused by the
mechanisms of scattering and absorption.
Scattering occurs when
particles or large gas molecules present in the atmosphere interact with and
cause the electromagnetic radiation to be redirected from its original path.
How much scattering takes place depends on several factors including the
wavelength of the radiation, the abundance of particles or gases, and the
distance the radiation travels through the atmosphere.
There are three (3)
types of scattering which take place.
Rayleigh scattering occurs when
particles are very small compared to the wavelength of the radiation. These
could be particles such as small specks of dust or nitrogen and
oxygen molecules. Rayleigh scattering causes shorter wavelengths of energy to be
scattered much more than longer wavelengths. Rayleigh scattering is the
dominant scattering mechanism in the upper atmosphere. The fact that the sky
appears "blue" during the day is because of this phenomenon. As
sunlight passes through the atmosphere, the shorter wavelengths (i.e. blue) of
the visible spectrum are scattered more than the other (longer) visible
wavelengths. At sunrise and sunset the light has to travel
farther through the atmosphere than at midday and the scattering of the shorter
wavelengths is more complete; this leaves a greater proportion of the longer
wavelengths to penetrate the
atmosphere.
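The strong wavelength dependence of Rayleigh scattering (intensity varies roughly as 1/λ⁴, a standard result not spelled out above) explains why blue dominates; a quick sketch:

```python
# Rayleigh scattering strength varies approximately as 1 / wavelength**4,
# so shorter (blue) wavelengths are scattered far more than longer (red) ones.
def rayleigh_relative(lam1_um: float, lam2_um: float) -> float:
    """How much more strongly lam1 is scattered than lam2 in the Rayleigh regime."""
    return (lam2_um / lam1_um) ** 4

# Blue (~0.45 um) vs red (~0.65 um): blue is scattered roughly 4x more.
print(rayleigh_relative(0.45, 0.65))
```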
The final scattering mechanism of importance is called non-selective
scattering. This occurs when the particles are much larger than the
wavelength of the radiation. Water droplets and large dust particles can cause
this type of scattering. Non-selective scattering gets its name from the fact
that all wavelengths are scattered about equally. This type of scattering
causes fog and clouds to appear white to our eyes because blue, green, and red
light are all scattered in approximately equal quantities (blue+green+red light
= white light).
Absorption is the other main mechanism at work when electromagnetic radiation
interacts with the atmosphere. In contrast to scattering, this phenomenon
causes molecules in the atmosphere to absorb energy at
various wavelengths. Ozone, carbon dioxide, and water vapour are the three main
atmospheric constituents which absorb radiation.
Ozone serves to absorb the harmful (to most living things) ultraviolet
radiation from the sun. Without this protective layer in the atmosphere our
skin would burn when exposed to sunlight.
You may have heard carbon dioxide referred to as a
greenhouse gas. This is because it tends to absorb radiation strongly in the
far infrared portion of the spectrum - that area associated with thermal
heating - which serves to trap this heat inside the atmosphere. Water vapour in
the atmosphere absorbs much of the incoming longwave infrared and shortwave
microwave radiation (between 22 µm and 1 m). The presence of water vapour in the
lower atmosphere varies greatly from location to location and at different
times of the year. For example, the air mass above a desert would have very
little water vapour to absorb energy, while the tropics would have high concentrations
of water vapour (i.e. high humidity).
Because these gases absorb electromagnetic energy in very specific
regions of the spectrum, they influence where (in the spectrum) we can
"look" for remote sensing purposes. Those areas of the spectrum which
are not severely influenced by atmospheric absorption and thus, are useful to
remote sensors, are called atmospheric windows. By comparing the
characteristics of the two most common energy/radiation sources (the sun and
the earth) with the atmospheric windows available to us, we can define those
wavelengths that we can use most effectively for remote
sensing. The visible portion of the spectrum, to which our eyes are most
sensitive, corresponds to both an atmospheric window and the peak energy level
of the sun.
Note also that heat energy emitted by the Earth corresponds to a
window around 10 µm in the thermal IR portion of the
spectrum, while the large window at wavelengths beyond 1 mm is associated with
the microwave region.
Now that we understand how electromagnetic energy makes its journey from
its source to the surface (and it is a difficult journey, as you can see) we
will next examine what happens to that radiation when it does arrive at the
Earth's surface.
Radiation - Target Interactions
Radiation that is not absorbed or scattered in the atmosphere can reach
and interact with the Earth's surface. There are three (3) forms of interaction
that can take place when energy strikes, or is incident (I) upon the
surface. These are: absorption (A); transmission (T);
and reflection (R). The total incident energy will interact with
the surface in one or more of these three ways. The proportions of each
will depend on the wavelength of the energy and the material and condition of
the feature.
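Because the three interactions partition the incident energy, their fractions must sum to one; a minimal sketch of that bookkeeping (function name and values are illustrative, not from the text):

```python
# At a given wavelength, incident energy I is split among absorption (A),
# transmission (T) and reflection (R): A + T + R = I, so the fractions sum to 1.
def reflected_fraction(absorbed: float, transmitted: float) -> float:
    """Fraction of incident energy reflected, given the absorbed and
    transmitted fractions."""
    r = 1.0 - absorbed - transmitted
    if r < 0:
        raise ValueError("fractions exceed the incident energy")
    return r

# Illustrative values only: a target absorbing strongly at this wavelength.
print(reflected_fraction(absorbed=0.85, transmitted=0.05))  # ~0.10
```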
Absorption (A) occurs when radiation (energy) is absorbed into the
target while transmission (T) occurs when radiation passes through a target.
Reflection (R) occurs when radiation "bounces" off the target and is
redirected. In remote sensing, we are most interested in measuring the
radiation reflected from targets. We refer to two types of reflection, which
represent the two extreme ends of the way in which energy is reflected from a
target: specular reflection and diffuse reflection.
When a surface is smooth we get specular or mirror-like
reflection where all (or almost all) of the energy is directed away from the
surface in a single direction. Diffuse reflection occurs when the
surface is rough and the energy is reflected almost uniformly in all directions.
Most earth surface features lie somewhere between perfectly specular or
perfectly diffuse reflectors. Whether a particular target reflects specularly
or diffusely, or somewhere in between, depends on the surface roughness of the
feature in comparison to the wavelength of the incoming radiation. If the
wavelengths are much smaller than the surface variations or the particle sizes
that make up the surface, diffuse reflection will dominate. For example,
fine-grained sand would appear fairly smooth to long wavelength microwaves but
would appear quite rough to the visible wavelengths.
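One common rule of thumb for this wavelength-relative roughness is the Rayleigh criterion (not stated in the text above; we use it here as an illustrative sketch, and the sand roughness value is assumed):

```python
import math

# Rayleigh criterion (rule of thumb): a surface appears "smooth" at a given
# wavelength if its height variation h < wavelength / (8 * cos(theta)),
# where theta is the incidence angle measured from vertical.
def appears_smooth(h_m: float, wavelength_m: float, incidence_deg: float = 0.0) -> bool:
    return h_m < wavelength_m / (8.0 * math.cos(math.radians(incidence_deg)))

sand_roughness = 0.5e-3  # ~0.5 mm height variation (illustrative value)
print(appears_smooth(sand_roughness, 0.24))     # True: smooth to a 24 cm microwave
print(appears_smooth(sand_roughness, 0.55e-6))  # False: rough to visible light
```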
Let's take a look at a couple of examples of targets at the Earth's
surface and how energy at the visible and infrared wavelengths interacts with
them.
Leaves: A chemical compound in leaves called chlorophyll strongly absorbs
radiation in the red and blue wavelengths but reflects green wavelengths.
Leaves appear "greenest" to us in the summer, when chlorophyll
content is at its maximum. In autumn, there is less chlorophyll in the leaves,
so there is less absorption and proportionately more reflection of the red
wavelengths, making the leaves appear red or yellow (yellow is a combination of
red and green wavelengths). The internal structure of healthy leaves acts as
excellent diffuse reflectors of near-infrared wavelengths. If our eyes were
sensitive to near-infrared, trees would appear extremely bright to us at these
wavelengths. In fact, measuring and monitoring the near-IR reflectance is one
way that scientists can determine how healthy (or unhealthy) vegetation may be.
Water: Longer wavelength visible and near infrared radiation is absorbed more by
water than shorter visible wavelengths. Thus water typically looks blue or
blue-green due to stronger reflectance at these shorter wavelengths, and darker
if viewed at red or near infrared wavelengths. If there is suspended sediment
present in the upper layers of the water body, then this will allow better
reflectivity and a brighter appearance of the water. The apparent colour of the
water will show a slight shift to longer wavelengths. Suspended sediment
can be easily confused with shallow (but clear) water, since these two
phenomena appear very similar. Chlorophyll in algae absorbs more of the blue
wavelengths and reflects the green, making the water appear more green in
colour when algae is present. The topography of the water surface (rough,
smooth, floating materials, etc.) can also lead to complications for
water-related interpretation due to potential problems of specular reflection
and other influences on colour and brightness.
We can see from these
examples that, depending on the complex make-up of the target that is being looked
at, and the wavelengths of radiation involved, we can observe very different
responses to the mechanisms of absorption, transmission, and reflection. By
measuring the energy that is reflected (or emitted) by targets on the Earth's
surface over a variety of different wavelengths, we can build up a spectral
response for that object. By comparing the response patterns of
different features we may be able to distinguish between them, where we might
not be able to, if we only compared them at one wavelength. For example, water
and vegetation may reflect somewhat similarly in the visible wavelengths but
are almost always separable in the infrared. Spectral response can be quite
variable, even for the same target type, and can also vary with time (e.g.
"green-ness" of leaves) and location. Knowing where to
"look" spectrally and understanding the factors which influence the
spectral response of the features of interest are critical to correctly
interpreting the interaction of electromagnetic radiation with the surface.
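The idea of comparing responses at two wavelengths can be sketched with the widely used normalized difference vegetation index (NDVI), which exploits exactly the red/near-IR contrast described above; the reflectance values below are illustrative, not measured data:

```python
# NDVI contrasts near-IR and red reflectance: vegetation absorbs red
# (chlorophyll) but reflects near-IR strongly, while water absorbs near-IR.
def ndvi(red: float, nir: float) -> float:
    """Normalized difference of near-IR and red reflectance."""
    return (nir - red) / (nir + red)

# Vegetation: low red, high near-IR -> strongly positive index.
print(ndvi(red=0.05, nir=0.50))  # ~0.82
# Water: near-IR absorbed -> index near zero or negative.
print(ndvi(red=0.05, nir=0.02))  # ~-0.43
```

Two targets that look alike in a single visible band can thus be separated by looking at a second, infrared band.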
Passive vs. Active Sensing
So
far, throughout this chapter, we have made various references to the sun as a
source of energy or radiation. The sun provides a very convenient source of
energy for remote sensing. The sun's energy is either reflected, as it is
for visible wavelengths, or absorbed and then re-emitted,
as it is for thermal infrared wavelengths. Remote sensing systems which measure
energy that is naturally available are called passive
sensors. Passive sensors can only be used to detect energy when the
naturally occurring energy is available. For all reflected energy, this can only
take place during the time when the sun is illuminating the Earth. There is no
reflected energy available from the sun at night. Energy that is naturally
emitted (such as thermal infrared) can be detected day or night, as long as the
amount of energy is large enough to be recorded.
Active
sensors, on
the other hand, provide their own energy source for illumination. The sensor
emits radiation which is directed toward the target to be investigated. The
radiation reflected from that target is detected and measured by the sensor.
Advantages for active sensors include the ability to obtain measurements
anytime, regardless of the time of day or season. Active sensors can be used
for examining wavelengths that are not sufficiently provided by the sun, such
as microwaves, or to better control the way a target is illuminated. However,
active systems require the generation of a fairly large amount of energy to
adequately illuminate targets. Some examples of active sensors are a laser
fluorosensor and a synthetic aperture radar (SAR).
Characteristics of Images
Before we go on to the next chapter, which looks in more detail at
sensors and their characteristics, we need to define and understand a few
fundamental terms and concepts associated with remote sensing images.
Electromagnetic energy may be detected either photographically or
electronically. The photographic process uses chemical reactions on the surface
of light-sensitive film to detect and record energy variations. It is important
to distinguish between the terms images and photographs in
remote sensing. An image refers to any pictorial representation, regardless of
what wavelengths or remote sensing device has been used to detect and record
the electromagnetic energy. A photograph refers specifically to images
that have been detected as well as recorded on photographic film. The black and
white photo to the left, of part of the city of Ottawa, Canada was taken in the
visible part of the spectrum. Photos are normally recorded over the wavelength
range from 0.3 µm to 0.9 µm - the visible and
reflected infrared. Based on these definitions, we can say that all photographs
are images, but not all images are photographs. Therefore, unless we are
talking specifically about an image recorded photographically, we use the term
image.
A photograph could
also be represented and displayed in a digital format by subdividing
the image into small equal-sized and shaped areas, called picture elements
or pixels, and representing the brightness of each area with a
numeric value or digital number. Indeed, that is exactly what has
been done to the photo to the left. In fact, using the definitions we have just
discussed, this is actually a digital image of the original
photograph! The photograph was scanned and subdivided into pixels with each
pixel assigned a digital number representing its relative brightness. The
computer displays each digital value as different brightness levels. Sensors
that record electromagnetic energy electronically record the energy as an
array of numbers in digital format right from the start. These two different
ways of representing and displaying remote sensing data, either pictorially or
digitally, are interchangeable as they convey the same information (although
some detail may be lost when converting back and forth).
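The scanning step described above can be sketched as a simple quantization of brightness into digital numbers (a minimal illustration; the function name and bit depth are our own choices):

```python
# Convert a continuous brightness value (0.0 to 1.0) into an n-bit digital
# number, as happens when a photograph is scanned into pixels.
def to_digital_number(brightness: float, bits: int = 8) -> int:
    levels = 2 ** bits
    dn = int(brightness * (levels - 1))
    return max(0, min(levels - 1, dn))  # clamp to the valid range

print(to_digital_number(0.0))  # 0   (black)
print(to_digital_number(1.0))  # 255 (white)
print(to_digital_number(0.5))  # 127 (mid-gray)
```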
In previous sections we described the visible portion of the spectrum and the concept of colours. We see colour because our eyes detect the entire visible range of wavelengths and our brains process the information into separate colours. Can you imagine what the world would look like if we could only see very narrow ranges of wavelengths or colours? That is how many sensors work. The information from a narrow wavelength range is gathered and stored in a channel, also sometimes referred to as a band. We can combine and display channels of information digitally using the three primary colours (blue, green, and red). The data from each channel is represented as one of the primary colours and, depending on the relative brightness (i.e. the digital value) of each pixel in each channel, the primary colours combine in different proportions to represent different colours.
When we use this method to display a single channel or range of
wavelengths, we are actually displaying that channel through all three primary
colours. Because the brightness level of each pixel is the same for each
primary colour, they combine to form a black and white image,
showing various shades of gray from black to white. When we display more than
one channel each as a different primary colour, then the brightness levels may
be different for each channel/primary colour combination and they will combine
to form a colour image.
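The channel-combination logic described above can be sketched in a few lines (pixel lists and values are illustrative; real imagery would use arrays of digital numbers):

```python
# Each channel drives one primary colour. Sending a single channel to all
# three primaries gives equal R, G and B per pixel -> shades of gray; sending
# different channels to each primary gives a colour composite.
def composite(red_ch, green_ch, blue_ch):
    """Combine three single-channel pixel lists into (R, G, B) tuples."""
    return list(zip(red_ch, green_ch, blue_ch))

band = [0, 127, 255]  # one channel, three pixels (8-bit digital numbers)
print(composite(band, band, band))  # [(0, 0, 0), (127, 127, 127), (255, 255, 255)]

# Three different channels -> the primaries combine in different proportions.
print(composite([200, 10], [30, 10], [40, 10]))
```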
Saturday, 23 March 2019
Classification of Brachiopods
- PHYLUM Brachiopoda (Lower Cambrian to Recent)
- CLASS-1 Inarticulata (Lower Cambrian to Recent)
- ORDER-I Atremata (Lower Cambrian to Recent)
- SUBORDER-1 Lingulacea (Lower Cambrian to Recent)
- SUBORDER-2 Trimerellacea (Middle Ordovician to Upper Silurian)
- ORDER-II Neotremata (Lower Cambrian to Recent)
- SUBORDER-1 Paterinacea (Lower and Middle Cambrian)
- SUBORDER-2 Siphonotretacea (Lower Cambrian to Middle Ordovician)
- SUBORDER-3 Acrotretacea (Lower Cambrian to Upper Ordovician)
- SUBORDER-4 Discinacea (Middle Ordovician to Recent)
- SUBORDER-5 Craniacea (Lower Cambrian to Middle Ordovician)
- CLASS-2 Articulata (Lower Cambrian to Recent)
- ORDER-I Palaeotremata (Lower Cambrian)
- ORDER-II Orthida (Lower Cambrian to Upper Permian)
- SUBORDER-1 Orthacea (Lower Cambrian to Lower Devonian)
- SUBORDER-2 Dalmanellacea (Middle Ordovician to Upper Permian)
- ORDER-III Terebratulida (Upper Silurian to Recent)
- ORDER-IV Pentamerida (Middle Cambrian to Upper Devonian)
- SUBORDER-1 Syntrophiacea (Middle Cambrian to Lower Devonian)
- SUBORDER-2 Pentameracea (Upper Ordovician to Upper Devonian)
- ORDER-V Triplesiida (Middle Ordovician to Upper Devonian)
- ORDER-VI Rhynchonellida (Middle Ordovician to Recent)
- SUBORDER-1 Rhynchonellacea (Middle Ordovician to Recent)
- SUBORDER-2 Rhynchoporacea (Mississippian to Recent)
- ORDER-VII Strophomenida (Lower Ordovician to Recent)
- SUBORDER-1 Strophomenacea (Lower Ordovician to Recent)
- SUBORDER-2 Productacea (Upper Ordovician to Upper Permian)
- ORDER-VIII Spiriferida (Middle Ordovician to Jurassic)
- SUBORDER-1 Atrypacea (Lower Ordovician to Recent)
- SUBORDER-2 Spiriferacea (Middle Silurian to Jurassic)
- SUBORDER-3 Rostrospiracea (Middle Silurian to Jurassic)
- SUBORDER-4 Punctospiracea (Upper Silurian to Jurassic)