It was the feldspar, which is used in glassmaking, that first attracted engineers from the Corning Glass Works to the area. At the time, the leftover quartz grains were still seen as just unwanted by‑products. But the Corning engineers, always on the lookout for quality material to put to work in the glass factories, noticed the purity of the quartz and started buying it as well, hauling it north by rail to the company’s plant in Corning, New York, where it was turned into everything from windows to bottles.
One of Spruce Pine quartz’s greatest achievements in the glass world came in the 1930s, when Corning won a contract to manufacture the mirror for what was to be the world’s biggest telescope, ordered by the Palomar Observatory in Southern California. Making the 200‑inch, 20‑ton mirror involved melting mountains of quartz in a giant furnace heated to 2,700 degrees Fahrenheit, writes David O. Woodbury in The Glass Giant of Palomar.
Once the furnace was hot enough, “three crews of men, working day and night around the clock, began ramming in the sand and chemicals through a door at one end. So slowly did the ingredients melt that only four tons a day could be added. Little by little the fiery pool spread over the bottom of the furnace and rose gradually to an incandescent lake 50 feet long and 15 wide.” The telescope was installed in the observatory in 1947. Its unprecedented power led to important discoveries about the composition of stars and the size of the universe itself. It is still in use today.
Significant as that telescope was, Spruce Pine quartz was soon to take on a far more important role as the digital age began to dawn.
In the mid‑1950s, thousands of miles from North Carolina, a group of engineers in California began working on an invention that would become the foundation of the computer industry. William Shockley, a pathbreaking physicist at Bell Labs who had helped invent the transistor, had left to set up his own company in Mountain View, California, a sleepy town about an hour south of San Francisco, near where he had grown up. Stanford University was nearby, and General Electric and IBM had facilities in the area, as well as a new company called Hewlett‑Packard. But the area known at the time as the Santa Clara Valley was still mostly filled with apricot, pear, and plum orchards. It would soon become much better known by a new nickname: Silicon Valley.
At the time, the transistor market was heating up fast. Texas Instruments, Motorola, and other companies were all competing to come up with smaller, more efficient transistors to use in, among other products, computers. The first American computer, dubbed ENIAC, was built for the army during World War II; it was 100 feet long and 10 feet high, and it ran on 18,000 vacuum tubes.
Transistors, which are tiny electronic switches that control the flow of electricity, offered a way to replace those tubes and make these new machines even more powerful while shrinking their enormous footprint. Semiconductors—a small class of materials, including the elements germanium and silicon, that conduct electricity under some conditions while blocking it under others—looked like promising materials for making those transistors.
At Shockley’s startup, a flock of young PhDs began each morning by firing up kilns to thousands of degrees and melting down germanium and silicon. Tom Wolfe once described the scene in Esquire magazine: “They wore white lab coats, goggles, and work gloves. When they opened the kiln doors weird streaks of orange and white light went across their faces . . . they lowered a small mechanical column into the goo so that crystals formed on the bottom of the column, and they pulled the crystal out and tried to get a grip on it with tweezers, and put it under microscopes and cut it with diamond cutters, among other things, into minute slices, wafers, chips; there were no names in electronics for these tiny forms.”
Shockley became convinced that silicon was the more promising material and shifted his focus accordingly. “Since he already had the first and most famous semiconductor research and manufacturing company, everyone who had been working with germanium stopped and switched to silicon,” writes Joel Shurkin in his biography of Shockley, Broken Genius. “Indeed, without his decision, we would speak of Germanium Valley.”
Shockley was a genius, but by all accounts he was also a lousy boss. Within a couple of years, eight of his most talented engineers had jumped ship to start their own company, which they dubbed Fairchild Semiconductor. One of them was Robert Noyce, a laid‑back but brilliant engineer, not yet 30 but already famous for his expertise with transistors.
The breakthrough came in 1959, when Noyce and his colleagues figured out a way to cram several transistors onto a single fingernail‑sized sliver of high‑purity silicon. At almost the same time, Texas Instruments developed a similar gadget made from germanium. Noyce’s, though, was more efficient, and it soon dominated the market. NASA selected Fairchild’s microchip for use in the space program, and sales soon shot from almost nothing to $130 million a year. In 1968, Noyce left to found his own company. He called it Intel, and it soon dominated the nascent industry of programmable computer chips.
Intel’s first commercial chip, released in 1971, contained 2,250 transistors. Today’s computer chips are often packed with transistors numbering in the billions. Those tiny electronic squares and rectangles are the brains that run our computers, the Internet, and the entire digital world. Google, Amazon, Apple, Microsoft, the computer systems that underpin the work of everything from the Pentagon to your local bank—all of this and much more is based on sand, remade as silicon chips.
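To get a sense of how 2,250 transistors become billions, here is a rough back‑of‑the‑envelope sketch in Python. It assumes the classic Moore’s‑law rhythm of transistor counts doubling roughly every two years; the doubling period and the sample years are illustrative assumptions, not figures from the text.

```python
# Back-of-the-envelope sketch: steady doubling of transistor counts,
# starting from Intel's 1971 chip with 2,250 transistors.
# The two-year doubling period is an assumed Moore's-law rate.

def transistors(start_count: int, start_year: int, year: int,
                doubling_period: float = 2.0) -> float:
    """Estimate transistor count in a given year, assuming steady doubling."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(2250, 1971, year):,.0f} transistors")

# 1971: ~2,250 transistors
# 1991: ~2,304,000 transistors
# 2011: ~2,359,296,000 transistors
# 2021: ~75,497,472,000 transistors
```

At that steady rate, the billions arrive around the 2010s, which is roughly what happened in practice.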
Making those chips is a fiendishly complicated process. They require essentially pure silicon. The slightest impurity can throw their tiny systems out of whack.
Finding silicon is easy. It’s one of the most abundant elements on Earth. It shows up practically everywhere, bound together with oxygen to form SiO2, aka quartz. The problem is that it never occurs naturally in pure, elemental form. Separating out the silicon takes considerable doing.
Step one is to take high‑purity silica sand, the kind used for glass. (Lump quartz is also sometimes used.) That quartz is then blasted in a powerful electric furnace, creating a chemical reaction that separates out much of the oxygen. That leaves you with what is called silicon metal, which is about 99 percent pure silicon. But that’s not nearly good enough for high‑tech uses. Silicon for solar panels has to be 99.999999 percent pure—six 9s after the decimal. Computer chips are even more demanding. Their silicon needs to be 99.99999999999 percent pure—eleven 9s. “We are talking of one lonely atom of something that is not silicon among billions of silicon companions,” writes geologist Michael Welland in Sand: The Never-Ending Story.
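Those strings of 9s are easier to grasp as ratios. A quick bit of arithmetic, sketched here in Python, converts each purity figure from the text into the average number of silicon atoms per single impurity atom (the function name and the ratio framing are mine, not the book’s):

```python
# Convert a purity of 99.<n nines> percent into atoms per impurity atom.
# The purity figures come from the text; the ratio framing is just arithmetic.

def atoms_per_impurity(nines_after_decimal: int) -> float:
    """Average number of atoms per impurity atom at 99.<n nines>% purity."""
    impurity_fraction = 10 ** -(nines_after_decimal + 2)  # e.g. six 9s -> 1e-8
    return 1 / impurity_fraction

print(f"solar grade (six 9s):   1 impurity in ~{atoms_per_impurity(6):,.0f} atoms")
print(f"chip grade (eleven 9s): 1 impurity in ~{atoms_per_impurity(11):,.0f} atoms")

# solar grade (six 9s):   1 impurity in ~100,000,000 atoms
# chip grade (eleven 9s): 1 impurity in ~10,000,000,000,000 atoms
```

At eleven 9s, that works out to about one foreign atom for every ten trillion atoms of silicon.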
Getting there requires treating the silicon metal with a series of complex chemical processes. The first round of these converts the silicon metal into two compounds. One is silicon tetrachloride, which is the primary ingredient used to make the glass cores of optical fibers. The other is trichlorosilane, which is treated further to become polysilicon, an extremely pure form of silicon that will go on to become the key ingredient in solar cells and computer chips.
Each of these steps might be carried out by more than one company, and the price of the material rises sharply at each step. That first‑step, 99 percent pure silicon metal goes for about $1 a pound; polysilicon can cost 10 times as much.
The next step is to melt down the polysilicon. But you can’t just throw this exquisitely refined material into any old pot. If the molten silicon comes into contact with even the tiniest amount of the wrong substance, it sets off a ruinous chemical reaction. You need crucibles made from the one substance that has both the strength to withstand the heat required to melt polysilicon and a molecular composition that won’t contaminate it. That substance is pure quartz.