Chips Fueled Armstrong’s Giant Leap

By John Neuffer, EETimes

Semiconductors were invented in America and came of age through NASA’s Apollo program and the moon landing. Working together, government and industry can advance policies that ensure U.S. leadership in semiconductors and other advanced technologies for the next 50 years.

Fifty years ago, America accomplished the impossible, landing a person on the moon and fulfilling the promise of President Kennedy’s moonshot challenge from eight years earlier. Neil Armstrong’s “giant leap” – one of our country’s finest hours and most awe-inspiring technological feats – was driven in part by robust government investment in scientific research and enabled by early semiconductor technology.

Semiconductors, now ubiquitous in modern electronics, were newly invented and largely untested in the early 1960s, when the Apollo program took shape. NASA’s decision to build its Apollo Guidance Computer and other systems around these chips helped launch the semiconductor industry and give rise to Silicon Valley.

NASA, in fact, was the first and largest consumer of semiconductors in the 1960s, buying more than one million chips between 1962 and 1967 – roughly 60 percent of all the chips produced in the U.S. during that period. The first chips tested for NASA’s use cost $1,000 each; by the time astronauts landed on the moon, the price had dropped to $15 apiece. This trajectory of dramatically greater functionality at ever-lower cost, which Gordon Moore described in 1965 in what became known as Moore’s Law, has guided semiconductor innovation ever since.