The triumph of modern lithography: Lessons in managing high-tech innovation
Date: Friday, February 01, 2008
The fortunes of modern lithography illustrate some of the best operational practices for managing innovation in a high-risk environment. Back in the 1980s, the semiconductor industry faced a crisis: the alleged ‘sub-micron barrier’ threatened to disrupt the steady doubling of device density every two years predicted by Intel’s Gordon Moore. The fast-approaching diffraction limit of classical optical theory (the Rayleigh resolution criterion) led the industry to contemplate the demise of optical lithography—hitherto one of the cornerstones of its growth. The plan of record was a transition to shorter-wavelength x-ray lithography, which entailed significantly more expensive experimental synchrotron radiation sources, fragile membrane reticles, and fabs built like radiation-hardened nuclear bunkers. These technical and financial challenges threatened the economic viability of the industry.
So far, this scenario has not materialized. What actually happened is one of the wonders of modern engineering. Working in concert, technologists unleashed a wave of innovation that has extended optical lithography more than ten times beyond the supposed limit, to today’s 45nm node and beyond (albeit with ultraviolet light). X-ray lithography, now re-branded as extreme ultraviolet lithography, is expected to take only tentative steps into production in the next few years at the 22nm node, provided nothing changes in the meantime.
Below, we review these extraordinarily expensive and unpredictable developments and their business implications. They offer valuable lessons and point to best practices for managing high-technology innovation in a future that is likely to be just as exciting and risky as the past.
Saving optical lithography
In a highly complex global endeavor, technologists pulled on just about every conceivable innovation lever to save optical lithography. The obvious approach, according to the classical theory, was to shorten the wavelength of the exposure light from visible through i-line to today’s 193nm and 157nm laser sources, while nearly doubling the numerical aperture of the lens to increase its resolving power. Image fidelity had to be maintained precisely uniform over a field of view of some 30mm, both to cover the most powerful chips and to print smaller chips en masse economically at each exposure. To say that development of these lenses was challenging and risky severely understates the magnitude of the task. Maintaining a 2m-tall stack of 25-plus precision-ground elements in sub-micron alignment over the long term, against thermal and vibrational perturbations, was unprecedented. Mitigating the long-term UV degradation of 50-plus lens coatings was no small feat either. Along the way, stepped-field exposure gave way to step-and-scan exposure, and more recently to double-patterning approaches. The partial coherence and polarization properties of light were exploited to enhance image contrast and thus refine resolution. Most recently, the use of high-refractive-index immersion liquids between the lens and the wafer has effectively shortened the operating wavelength from the same light source. This development has raised several issues, including dynamic pattern-dependent photo-thermal heating of the immersion liquid, which causes mirage-like refractive distortion of the image, and particles and micro-bubbles dragged around in the meniscus and deposited on the wafer, any of which can generate device-killing pattern defects.
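The scaling levers described above—shorter wavelength, higher numerical aperture, and immersion—are tied together by the classical Rayleigh criterion, R = k1·λ/NA. The sketch below uses illustrative numbers only (the function name, the k1 factor, and the NA values are assumptions for demonstration, not figures for any specific tool):

```python
# Sketch of the classical Rayleigh resolution criterion, R = k1 * lambda / NA.
# Values below are illustrative, not vendor-specific.

def rayleigh_resolution(wavelength_nm: float, na: float, k1: float = 0.25) -> float:
    """Smallest resolvable half-pitch (nm) for a given wavelength and numerical aperture."""
    return k1 * wavelength_nm / na

# A hypothetical "dry" 193nm tool with NA = 0.93:
dry = rayleigh_resolution(193, 0.93)

# Water immersion allows an effective NA above 1.0, shrinking the
# resolvable feature for the same 193nm source:
immersion = rayleigh_resolution(193, 1.35)

print(f"dry: {dry:.1f}nm, immersion: {immersion:.1f}nm")
```

Note how the same λ = 193nm source resolves finer features once the immersion liquid raises the effective numerical aperture; this is the sense in which immersion "effectively shortens" the wavelength.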
In parallel with changes in the exposure systems, reticle technology evolved from simple binary chrome images of lines and spaces to include transparent phase-shifting elements that enhance contrast. As lithography was pushed toward its ultimate resolution, the divergence between the circuit designers’ intent (as embodied on the reticle) and the final etched pattern on the wafer became unacceptably large. The solution was to pre-distort the reticle features to compensate for the distortion, so that the final result came closer to the desired shape. This technique has been further extended with computer-designed sub-resolution assist features, to the point that the reticle pattern is barely recognizable yet achieves the desired result. More recently, the backside of the reticle plate, traditionally an unpatterned surface, is being exploited to provide local modification of exposure properties across the device, further optimizing resolution in critical regions.
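The pre-distortion idea can be illustrated with a toy model. Everything here is invented for demonstration—the bias formula, the 20nm coefficient, and the function names bear no relation to a real optical proximity correction engine—but it captures the principle: model how the process distorts the drawn shape, then apply the inverse correction on the reticle.

```python
# Toy illustration of reticle pre-distortion (not a production OPC model).

def printed_width(mask_width_nm: float, pitch_nm: float) -> float:
    """Hypothetical process model: lines print narrower than drawn,
    losing more width at tighter pitch."""
    bias = 20.0 * (100.0 / pitch_nm)   # invented pitch-dependent line-width loss
    return mask_width_nm - bias

def opc_mask_width(target_nm: float, pitch_nm: float) -> float:
    """Invert the model: widen the drawn feature so the print hits target."""
    bias = 20.0 * (100.0 / pitch_nm)
    return target_nm + bias

# Target a 65nm printed line at a 130nm pitch: draw it wider on the reticle.
pitch = 130.0
mask = opc_mask_width(65.0, pitch)
print(f"drawn: {mask:.1f}nm -> printed: {printed_width(mask, pitch):.1f}nm")
```

Real correction engines work the same way in spirit—simulate the image, compare with intent, adjust the reticle—but over full chip layouts with physically rigorous diffraction models rather than a one-line bias formula.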
Chemistry has also played a leading role in this revolution. Chemically amplified resists of extreme sensitivity deliver the necessary resolution, pattern uniformity, and production throughput. Innovative scatterometry metrology, which relies on exquisitely accurate computer modeling of optical diffraction, enables sub-nanometer-precision pre-etch trimming of resist features to half the line width imaged by the wafer stepper. As a result, 35nm features can be created from 70nm resist images, effectively doubling the resolution. Today, the whole process is so complicated, and process windows so small, that simulation is a key tool—in some cases the only practical tool—for exploring process optimization and integration, a far cry from the starting point. Of course, device materials and process equipment were other key components of the innovation ecosystem.
What can managers of high-technology innovation learn from this experience? First and foremost, each process generation required a complete solution drawn from many candidate innovations. Potential solutions came from multiple sources around the globe, including academia, diverse industries, competitors, capital equipment manufacturers, and the military Stealth program. The technical and financial risks were extremely high, and the stakes even higher. Chip manufacturers had to take a portfolio approach to process technology in order to hedge their bets. Moreover, no single company could have funded such progress on its own. As a result, co-opetition and research consortia flourished, and wise companies have embraced the concept of open innovation.
Chip manufacturers have had no alternative but to move in lockstep with equipment suppliers—an uneasy relationship of mutual interdependency and some conflicting business interests. All players had to coordinate their technology road maps frequently, as the pace of change was too rapid for an annual planning cycle to be effective; nonetheless, intellectual property and commercial advantage had to be protected.
Moreover, this experience shows that innovation must be sustained despite the legendary technology-driven and market-driven cyclicality in the semiconductor industry. To that end, best performers use disciplined R&D portfolio management practices and focus efforts on the most value-adding prospects. These practices also help to accelerate innovation, which is imperative in view of shrinking technology and product lifecycles and the corresponding front-end loading of profit.
Some of the key operational management capabilities necessary for success include:
* A vigorous strategic planning and review process to provide unbiased insight and timely commitment of resources against only the highest-value objectives.
* Processes to dynamically drive alignment (and manage risks) between research, process platform, and product development road maps. The objective is to ensure a seamless flow of technology culminating in a rapid production ramp.
* Vendor and customer-management strategies to constantly align technology investments across the supply chain. This alignment ensures that compatible technologies are available at the right time, and at the right cost.
* Open innovation practices, which include structured methods for prospecting, selecting, initiating, managing, and retiring technology partnerships.
* Management of intellectual property assets to generate strategic advantage and protect investments, while leveraging the full benefits of open innovation.
Going forward, these proven management practices will help industry leaders tackle operational challenges as daunting as the underlying technological hurdles. Lessons learned from the past will allow them to face future uncertainties with confidence.