February 2003 issue > Cover Feature
Chips Ahoy!
Venkat Ramana
Thursday, January 30, 2003
IN 1947, A TECHNOLOGICAL INNOVATION WAS achieved at the Murray Hill, NJ facilities of Bell Labs: the first transistor, made of germanium, a weak or “semi” conductor of electricity. It could amplify an electric current and switch it on and off. Then in September 1958, Texas Instruments’ Jack Kilby introduced the world to the integrated circuit (IC), a.k.a. the semiconductor chip. While the original had only a single transistor and was about the size of a small finger, today’s most powerful processors contain 42 million transistors that work together to store and manipulate data, allowing the microprocessor to perform a wide variety of useful functions.

Since 1958, the semiconductor, or microprocessor, or simply the “chip,” has gone through the inevitable “shrink and stretch” evolution dictated by technology: shrinking the size of the product while stretching the possibilities of its functions. As the size of the chip shrank, the number of transistors embedded in it increased exponentially, so much so that Gordon Moore predicted that this transistor squeeze would double the count every two years (since revised to a more realistic 18 months).
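Moore’s observation can be checked with quick back-of-the-envelope arithmetic. The sketch below takes the article’s 42-million-transistor figure and derives the implied doubling period; the 1971 baseline of roughly 2,300 transistors (the Intel 4004) is an assumption added for illustration and does not appear in the article:

```python
import math

# Endpoints for the illustration: an assumed 1971 baseline of ~2,300
# transistors (Intel 4004) versus the article's "42 million transistors"
# in a top processor circa 2000.
t0, year0 = 2_300, 1971
t1, year1 = 42_000_000, 2000

# Number of doublings needed to grow from t0 to t1.
doublings = math.log2(t1 / t0)

# Implied doubling period in years.
period = (year1 - year0) / doublings
print(f"{doublings:.1f} doublings -> one doubling every {period:.2f} years")
```

The implied period of roughly two years sits between Moore’s classic two-year figure and the popular 18-month restatement the article mentions.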

As Vinod Dham says, chips will pervade our lives in ways not yet seen or imagined. But the shape of the future is not hard to perceive: a world where even inanimate objects are embedded with chips that we won’t even notice. It is not unreasonable to claim that microchips will be the foundation of the 21st century. We can expect the semiconductor chip to power most advancements in transportation, biotechnology and medicine. Automobiles are becoming computers on wheels. According to Alex Trotman, Ford Motors’ CEO, “the Ford Taurus has more computing power than the original Apollo that went to the moon in 1969.” Soon, steering, throttle, and braking systems will be controlled electronically on the automated highway system.

Chips come in many shapes and sizes, and perform a variety of functions. As their use has spread, the cost of chips has come down. The classic example is the chips inside a computer. From a single, mammoth processing machine at IBM, semiconductors have been the catalysts in the evolution of the computer: from a few corporate computer centers to thousands of workstations to millions of home computers and sleek laptops. Industry analysts note that the price of semiconductors has dropped by 30 percent in the last 30 years, and predict another such drop in the next 30.

All of this sounds impressive, but what goes into creating these complex devices is equally impressive. Putting 42 million transistors in a space the size of a postage stamp with zero contamination is an arduous accomplishment. The diminutive engines that power the global village begin as pure silicon, which is melted and then re-solidified into single 250-pound crystals. These crystals are sliced into thin eight-inch (200mm) wafers (300mm wafers are currently in development). The wafers must be flat, varying by no more than 1/100th of the thickness of a human hair. As many as 1,000 chips are built upon each finished wafer, with 20 layers of different materials placed carefully on each thumbnail-sized space. A standard semiconductor fab today produces around 30,000 8” wafers per month.
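The throughput figures above carry some simple arithmetic, sketched here in Python. All numbers come from the paragraph itself; the wafer-area comparison assumes ideal circular wafers and ignores edge losses:

```python
import math

# Figures from the article: up to 1,000 chips per wafer, and a fab
# output of about 30,000 wafers per month.
chips_per_wafer = 1_000
wafers_per_month = 30_000
chips_per_month = chips_per_wafer * wafers_per_month

# Moving from 200 mm to 300 mm wafers multiplies the usable area,
# one reason the industry was investing in the larger size.
area_200 = math.pi * (200 / 2) ** 2
area_300 = math.pi * (300 / 2) ** 2
print(chips_per_month, round(area_300 / area_200, 2))
```

At those rates a single fab turns out on the order of 30 million dice a month, and the 300mm transition alone offers 2.25 times the wafer area.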

Despite the excruciating logistics, the most critical objective in chip manufacturing is to maintain a zero-contaminant environment. This is difficult considering that a cubic foot of the air you breathe contains a million specks that are 1/50,000th of an inch or larger. A single speck landing on a wafer would be akin to dropping a boulder on a greenhouse; in other words, complete destruction. Fab workers don “bunny suits” to ensure that nothing, such as threads from their clothes or flakes of skin, contaminates the delicate chips.

One fab engineer compares viewing a chip under a microscope to looking at a city, in which a single foreign particle on the chip is as large as an asteroid landing in the middle of that city. Today, the cost of running a semiconductor fab is $100,000 an hour, making it critical that equipment and materials run around the clock.
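To see why around-the-clock operation matters, the article’s $100,000-an-hour figure annualizes as follows (a trivial sketch; the 365-day year is the only assumption added):

```python
# The article's $100,000/hour operating cost, annualized, shows why
# fabs run 24/7. Figure taken from the paragraph above; illustrative.
cost_per_hour = 100_000
hours_per_year = 24 * 365
annual_cost = cost_per_hour * hours_per_year
print(f"${annual_cost:,}")  # $876,000,000 per year
```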

Labor intensity has historically separated the technology from the process. While developed countries prefer to keep the ideas and the technology, the labor of turning the technology into a product has gone to countries where labor is plentiful and cheap. This has been the case in the semiconductor industry as well. China, Taiwan and other Far East countries have set up massive “foundries” where the chips are made. Thus came the birth of “fabless” companies, which develop the designs and use the foundries to embed them in silicon.

This contract structure has helped drive another evolution in the market. Second-tier companies have found it easier to stay afloat longer and to focus on developing specific products: in power, memory, logic, video, and so on. The design community is also pushing the limits of process geometry. From 0.18 micron, the feature sizes now being experimented with are below 0.1 micron, popularly known as deep sub-micron. This, of course, leads to higher manufacturing costs, and the debate on cost versus application utility rages on.

As the push to get more out of each chip increased, a new support industry was born: Electronic Design Automation, or EDA. As engineers rushed to put more complexity onto the wafer, they needed tools to validate their concepts, model and analyze their designs, and identify and eliminate problems before making production commitments. As the semiconductor industry grew into a multi-billion-dollar business, EDA has grown into a $6 billion industry.

Design engineers are squeezing more into a wafer: the chip industry has managed to combine onto a single chip discrete components for functions like graphics, processing, and communications. In their drive to shrink circuits into micro-sized pieces of silicon, engineers are bumping up against the physical laws of electronics and energy use, all the while improving performance. Some companies have also experimented with 3D technology, in which stacking breaks linear flatness and scales vertically. While success has been limited, the approach has surely shown promise.

Many veterans predict that the future of the design itself will be distinct from the wafer: the hardware will be like Lego blocks and the software layers will dictate functionality. The System-on-Chip genre is gaining momentum, and Stanley Bruederle at Gartner predicts that it will be the driver for next-generation wireless growth. “System on chip (SOC) technology is beginning to provide single-chip wireless products. The first products are Bluetooth devices, which developers promise to make available in volume in late 2002 or early 2003. Single-chip Bluetooth devices will appear in many different products. Some of these products will have other radio technologies in them—for example, mobile phones, notebook computers and personal digital assistants embedded with wireless LANs (WLANs)—but others, such as handheld games and music devices, will be upgraded to wireless capability by the addition of Bluetooth radio technology,” he says.

Meanwhile, engineers are realizing that new computing systems are going to change the way design itself is conducted. Companies are experimenting with computing systems that can think: self-analyze and self-heal. Cognitive computers that can reconfigure themselves as necessary, generate their own code, respond to naturally expressed human directives, and be configured and maintained by non-experts, and that therefore last much longer than current systems, seem likely to emerge in the not-too-distant future.

In his keynote address to the Intel Developer Forum, Intel Chief Technology Officer Pat Gelsinger asked us to “imagine the world that’s possible—chips being integrated into every room and device of the home. You can imagine products that are able to identify and maintain themselves and notify when different devices need to be upgraded or looked at.” Despite the market fluctuations, In-Stat predicts that “the recovery of the semiconductor industry has slowed, but it has not stalled. We currently expect that worldwide revenue in 2002 will be essentially flat at $139 billion. Slow growth in the first half of 2003 will be followed by stronger growth in the second half, resulting in 18.1% revenue growth for the year. Growth is expected to continue until mid-2005 when a capacity driven downturn is expected to begin.”

The cover feature that follows explores this exciting and technology-intensive space, some new technologies, and some future roadmaps.
