LEDs: The Light of the Future

Light Emitting Diodes (LEDs), “semiconductors that emit light when zapped with [positive polarity] electricity,”[1] are on the verge of taking over the commercial and consumer sectors of the lighting industry. With greater efficiency, longer useful lives, and their “clean” nature, LEDs are the future of light, pushing traditional incandescent and fluorescent bulbs toward extinction. Only the higher production cost of LEDs has prolonged the life of traditional bulbs.
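The efficiency claim above is easy to put in rough numbers. The sketch below uses illustrative figures that are not from this article: a typical ~60 W incandescent bulb and a ~9 W LED bulb both produce roughly 800 lumens, so for equal light output the LED draws a fraction of the power.

```python
# Rough annual energy comparison for equivalent light output (~800 lumens).
# Wattages are illustrative assumptions, not figures from the article:
# ~60 W for an incandescent bulb, ~9 W for a comparable LED bulb.

def annual_energy_kwh(watts, hours_per_day=3, days=365):
    """Energy consumed in a year, in kilowatt-hours."""
    return watts * hours_per_day * days / 1000

incandescent = annual_energy_kwh(60)   # 65.7 kWh/year
led = annual_energy_kwh(9)             # ~9.9 kWh/year

print(f"Incandescent: {incandescent:.1f} kWh/yr")
print(f"LED:          {led:.1f} kWh/yr")
print(f"Savings:      {incandescent - led:.1f} kWh/yr")
```

Under these assumptions a single LED bulb saves on the order of 55 kWh per year, which is the kind of arithmetic that offsets its higher purchase price over the bulb's life.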

Viewed against the history of traditional bulbs, the higher costs associated with producing LEDs are not an insurmountable hurdle. The incandescent bulb took about 70 years to supplant “candles, oil lanterns, and gas lamps” as the main source of lighting.[2] The first crude incandescent lamp, created in 1809 by the English chemist Humphry Davy using two charcoal strips to produce light, remained impractical. The first true incandescent bulb, created by Warren De la Rue around 1840 using a platinum filament, was too expensive for commercial use. Only when Thomas Edison created an incandescent bulb with a carbonized filament inside a vacuum in 1879 did the incandescent bulb become practical and affordable for consumers.

Although considered relatively novel, the concept for LEDs first arose in 1907, when Henry Joseph Round used a piece of Silicon Carbide (SiC) to emit a dim, yellow light. This was followed by experiments conducted by Bernhard Gudden and Robert Wichard Pohl in Germany during the late 1920s, in which they used “phosphor materials made from Zinc Sulphide (ZnS) [treated] with Copper (Cu)” to produce dim light.[3] A major obstacle remained, however: these early LEDs could not function efficiently at room temperature and instead had to be submerged in liquid nitrogen for optimal performance.

This led to British and American experiments in the 1950s that substituted Gallium Arsenide (GaAs) for Zinc Sulphide (ZnS), producing an LED that emitted invisible, infrared light at room temperature. These LEDs immediately found use in photoelectric sensing applications. The first “visible spectrum” LED, producing red light, was created in 1962 by Nick Holonyak, Jr. (b. 1928) of the General Electric Company, who used Gallium Arsenide Phosphide (GaAsP) in place of Gallium Arsenide (GaAs). Once available, these LEDs were quickly adopted as indicator lights.

Before long, these red LEDs were producing brighter light, and even orange electroluminescence, when Gallium Phosphide (GaP) substrates were used. By the mid-1970s, Gallium Phosphide (GaP) itself, along with dual GaP substrates, was being used to produce red, green, and yellow light. This ushered in the trend “towards [LED use in] more practical applications” such as calculators, digital watches, and test equipment, since the expanded range of colors addressed the fact that “the human eye is most responsive to yellow-green light.”[4]

However, rapid growth in the LED industry did not begin until the 1980s, when Gallium Aluminium Arsenide (GaAlAs) was developed, yielding “superbright” LEDs (ten times brighter than the LEDs then in use) – “first in red, then yellow and… green” – which also required less voltage, providing energy savings.[5] This development led to the first LED flashlight, in 1984.

