Moore’s Law, the principle that chips are supposed to get smaller and faster every few years, is faltering. But one company, Xilinx, thinks that’s actually good news for a new type of flexible processor it expects to sell next year.
You’ve heard of Intel, Apple and Samsung, some of the biggest chipmakers around. But you probably don’t know Xilinx unless you’re building things like high-end network equipment or self-driving cars. Xilinx’s new chief executive, Victor Peng, hopes to change that.
Xilinx’s new chip design, code-named Everest, won’t power your next phone or PC. But it could bring the company to a broader audience of programmers frustrated with the slowing progress of more traditional chips. If all goes according to plan, Everest will come to the attention of the programmers who rely on cloud-computing services running in data centers packed with thousands of servers. Ultimately, that means new services you actually do use, like artificial intelligence tools that recognize your voice or scan your X-ray for tumors, will run faster.
Why? Because Xilinx chips can accelerate specific jobs as general-purpose processors run out of steam, Peng argues. Fast hardware takes over for slower software running on a central processing unit, the traditional kind of computer brain.
“CPUs always will be around, but they can’t do the heavy lifting,” Peng said. “You’ll need other forms of accelerators.”
Peng joined Xilinx in 2008 and became CEO in January, so he’s got a lot riding on Everest’s success. Over the last four years, the company has employed 1,500 engineers and spent more than $1 billion on research and development to create Everest.
Three decades of flexible FPGAs
More than 30 years ago, Xilinx helped pioneer a chip technology called field programmable gate arrays (FPGAs), which unlike conventional chips can be programmed to perform specific tasks, then reprogrammed when needs change or bugs are found. It’s a modestly large market and growing, with sales expected to increase 9 percent a year to about $13 billion in 2023, according to Energias Market Research. Intel bought Xilinx’s FPGA rival Altera for $16.7 billion in 2015.
Everest packages Xilinx’s traditional FPGA hardware with other modules, including a conventional CPU core, memory and a very high-speed connection to the outside world. One of its most interesting properties is that it can be reprogrammed very fast, in thousandths of a second. That means a data center using it for one job at one moment could give it a personality transplant nearly instantly as new work crops up.
It also means data centers can squeeze more use out of existing hardware rather than let it sit idle during moments when demand for some type of machine wanes. Amazon, the 800-pound gorilla of cloud-computing services, has added FPGAs to its array of Amazon Web Services options, and second-place Microsoft is also relying on FPGAs.
Some customers will like Everest, especially those working on AI software, said Linley Gwennap, an analyst with the Linley Group.
“We’re seeing more computing — particularly in the AI space — moving away from the CPU onto more specialized architectures,” Gwennap said.
Custom chips compete
But Gwennap also predicts FPGAs won’t escape a longstanding challenge: competition from special-purpose processors that, while not as flexible as FPGAs, are cheaper to make when you need lots of them. AI is new and fast-changing now, but special-purpose chips will look better as the field settles down, he said.
“You’ll still see custom architectures, but they will be burned into silicon rather than use programmable gates,” Gwennap said.
FPGAs have traditionally been the province of hardware engineers building them into specific devices. But Peng’s ambition with Everest is to bring FPGAs to the attention of software programmers, too — a vastly larger community and, potentially, a bigger business for Xilinx.
Luring new programmers
Programming FPGAs is complicated, but to broaden its market, Xilinx is counting on new tools that make it easier to use FPGAs and to integrate them with existing technology. For example, Xilinx will provide libraries of pre-written software that make Everest slot right into existing AI software like Google’s TensorFlow.
“We want to make it more of a software development experience as opposed to a chip development experience,” Peng said.
Today, computers using FPGAs are harder to program than those with traditional chips, but Xilinx wants to erase that difference. “In a five-year timeframe, our goal is to get there,” he said.
And with Moore’s Law no longer delivering steady progress, he could find an audience for his message.
“Faster, better, cheaper just doesn’t happen anymore,” he said. “The intelligent, connected world needs to be adaptable and needs to have acceleration built into it.”