[Image pair: acquired October 6, 1972 (top); acquired November 18, 2016 (bottom)]

A Clearer View of Silicon Valley

By the middle of the 20th century, Silicon Valley was already “on the map.” This part of California’s Santa Clara Valley drew its nickname from the raw material being used in the region’s growing semiconductor industry. The area at the south end of San Francisco Bay became a magnet for scientists and for technology companies, so by the time the new Landsat 1 satellite caught a glimpse in 1972, urban sprawl had already replaced many of the valley’s orchards.

While the two images above don’t show much change in the development of the landscape, they clearly show the development of the technology behind Landsat’s satellite sensors. The first, false-color image was acquired on October 6, 1972, with the Multispectral Scanner System (MSS) on Landsat 1; the second, natural-color image was acquired on November 18, 2016, by the Operational Land Imager (OLI) on Landsat 8.

The most obvious improvement is the spatial resolution. Over the past 45 years, you have certainly noticed similar improvements in your electronics and imaging products. Better spatial resolution is the reason you can now see blades of grass in a televised football game and the fine lines on your face in a smartphone photo. In short, there is a lot more detail visible in the 2016 image than in the 1972 image. Both are displayed at a resolution of 45 meters per pixel. The MSS image is relatively blurry, however, because the sensor’s spatial resolution was just 68 by 83 meters. The OLI image appears crisper because the instrument can resolve, or “see,” objects down to about 30 meters (15 meters in some cases).
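To make the pixel-size comparison concrete, here is a minimal NumPy sketch, not Landsat's actual resampling chain, that averages a fine-grained synthetic scene into coarser blocks; the factors of 30 and 80 loosely stand in for OLI's roughly 30-meter and MSS's roughly 80-meter footprints, and every array value is made up.

# A minimal sketch of why coarser pixels blur detail: average a fine-grained
# synthetic "scene" into larger blocks, roughly mimicking a sensor whose ground
# footprint is bigger than the features it is looking at.
import numpy as np

def block_average(scene, factor):
    """Average `factor` x `factor` blocks of pixels into one coarse pixel."""
    h, w = scene.shape
    h_crop, w_crop = h - h % factor, w - w % factor   # trim to a multiple of `factor`
    trimmed = scene[:h_crop, :w_crop]
    return trimmed.reshape(h_crop // factor, factor,
                           w_crop // factor, factor).mean(axis=(1, 3))

# Hypothetical 900 x 900 scene sampled at 1 "meter" per pixel.
rng = np.random.default_rng(0)
scene = rng.random((900, 900))

coarse_30m = block_average(scene, 30)   # OLI-like ~30 m pixels -> 30 x 30 grid
coarse_80m = block_average(scene, 80)   # MSS-like ~80 m pixels -> 11 x 11 grid, much blurrier
print(coarse_30m.shape, coarse_80m.shape)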

The 2016 image also has better radiometric resolution, which means the newer instrument is more sensitive to differences in brightness and color. OLI uses 4,096 data values to describe each pixel on a scale from dark to bright; MSS used just 64. Those finer gradations in brightness make the features in the image appear smoother.
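The difference between 64 and 4,096 levels is easy to see in a short quantization sketch; the 6-bit and 12-bit depths follow from the counts above, while the smooth test signal is purely illustrative and not real Landsat data.

# Quantize the same brightness signal with 6 bits (64 levels, MSS-like)
# and 12 bits (4,096 levels, OLI-like) to compare the step sizes.
import numpy as np

def quantize(signal, bits):
    """Map values in [0, 1] onto 2**bits integer levels, then back to [0, 1]."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

signal = np.linspace(0.0, 1.0, 10_000)        # a smooth ramp from dark to bright
coarse = quantize(signal, 6)                  # 64 possible values -> visible steps
fine = quantize(signal, 12)                   # 4,096 possible values -> nearly smooth

print("6-bit step size: ", 1 / (2**6 - 1))    # ~0.0159
print("12-bit step size:", 1 / (2**12 - 1))   # ~0.000244
print("distinct 6-bit values: ", np.unique(coarse).size)   # 64
print("distinct 12-bit values:", np.unique(fine).size)     # 4096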

Finally, the images look very different because the wavelengths (colors) of light used to compose them come from different parts of the spectrum. Both images were composed using red and green wavelengths. The top image, however, also uses near-infrared. False-color images like this one (MSS bands 6, 5, 4) are still produced with modern instruments because they are useful for distinguishing features such as vegetation, which appears red in the top image.

In contrast, the OLI image does not show near-infrared (although the instrument does have the capability). Instead, it includes blue, a color that MSS was not designed to sense. This combination (OLI bands 4, 3, 2) produces a natural-color image similar to what your eyes would see.
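For readers curious how such composites are assembled, the sketch below stacks three bands into the red, green, and blue display channels the way the MSS 6-5-4 and OLI 4-3-2 combinations do; the arrays are random placeholders rather than real Landsat bands, which would normally be read from GeoTIFF files.

# Assemble false-color and natural-color composites from stand-in band arrays.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)

# Hypothetical single-band images scaled to [0, 1].
nir, red, green, blue = (rng.random(shape) for _ in range(4))

# MSS-style false color: near-infrared -> red channel, red -> green, green -> blue.
# Healthy vegetation reflects strongly in the near-infrared, so it shows up red.
false_color = np.dstack([nir, red, green])

# OLI-style natural color (bands 4, 3, 2): red, green, blue in their usual channels.
natural_color = np.dstack([red, green, blue])

print(false_color.shape, natural_color.shape)   # (100, 100, 3) each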

If you have read this far and are still wondering about the colorful ponds at the top left of the images, read this Image of the Day.

NASA Earth Observatory image by Jesse Allen, using Landsat data from the U.S. Geological Survey. Caption by Kathryn Hansen.