Power-saving optical chip tech will need to wait for wider use: Nvidia CEO
A new chip technology that aims to cut energy usage is not yet reliable enough for use in Nvidia's flagship graphics processing units (GPUs), Nvidia CEO Jensen Huang said.
Co-packaged optics, as the emerging technology is called, uses beams of laser light to send information over fiber optic cables between chips, making connections faster and more energy efficient than those made with traditional copper cables.
Huang said his company would use the co-packaged optical technology in two new networking chips that sit in switches on top of its servers, adding that the technology would make the chips three and a half times more energy efficient than their predecessors.
The switch chips will come out later this year and into 2026 in a small but significant step toward advancing the technology.
Huang also told a group of journalists after his speech that while Nvidia had examined using the technology more widely in its flagship GPU chips, it had no current plans to do so, because traditional copper connections were "orders of magnitude" more reliable than today's co-packaged optical connections.
"That's not worth it," Huang said of using optical connections directly between GPUs. "We keep playing with that equation. Copper is far better."
Silicon Valley entrepreneurs and investors have pinned their hopes on the optics technology, which they believe will be central to building ever-larger computers for AI systems. Huang said such machines would still be necessary even after advances by companies like DeepSeek, because AI systems would need more computing power to think through their answers.
Startups such as Ayar Labs, Lightmatter and Celestial AI have raised hundreds of millions of dollars in venture capital - some of it from Nvidia itself - to try to put co-packaged optical connections directly onto AI chips. Lightmatter and Celestial AI are both targeting public offerings.
Huang said that he was focused on providing a reliable product roadmap that Nvidia's customers, such as OpenAI and Oracle, could prepare for.
"In a couple years, several hundred billion dollars of AI infrastructure is going to get laid down, and so you've got the budget approved. You got the power approved. You got the land built," Huang said. "What are you willing to scale up to several hundred billion dollars right now?"
Copper connections are cheap and fast, but can only carry data a few meters at most. While that might seem trivial, it has had a huge impact on Nvidia's product lineup over the past half decade.
Nvidia's current flagship product contains 72 of its chips in a single server, consuming 120 kilowatts of electricity and generating so much heat that it requires a liquid cooling system similar to that of a car engine. The flagship server unveiled on Tuesday for release in 2027 will pack hundreds of its Vera Rubin Ultra chips into a single rack and will consume 600 kilowatts of power.
