Intel Launches Next Generation of Microprocessors


Image courtesy of AnandTech
Although much is made of the “need” for Intel to capture mobile computing market share, it is more likely that the company will continue to concentrate significant effort on more traditional computing platforms.

Last week at Computex in Taiwan, Intel officially launched its next generation of microprocessors, Haswell.

The new microarchitecture is manufactured with the existing 22 nm TriGate process technology (see Intel's tick-tock product cycle). The Haswell devices replace the Sandy Bridge designs and are claimed to offer a 20× improvement in power efficiency.

What really interests me is how these devices are manufactured and assembled. But since Haswell sits on the tock half of the product cycle (a new architecture on an existing process), it's fair to ask, “Why do I care?”

It's all about Intel's return to DRAM production. A few weeks ago, I looked at what details could be found about a new embedded DRAM (eDRAM) based on a VLSI Technology Symposium abstract.

To find out more, I contacted Kevin Zhang, co-author of the VLSI Technology paper and eDRAM program lead at Intel, who helped fill in some details. One particular segment of the Haswell line will use the eDRAM. The Iris Pro is a single-chip (please read on for the picky details) microprocessor with embedded graphics, designed to deliver the performance of a mid-range discrete graphics card. Intel sees an opportunity to offer builders and users of all but the highest-end graphics workstations a simpler system design, a lower component count and (most significantly) lower cost.

The Iris Pro is a system-in-package (SiP) or multi-chip module (MCM) design, with the micro/graphics processor die placed next to the eDRAM chip on the pin grid array package substrate. Although manufactured as a separate integrated circuit, the eDRAM earns that moniker because its DRAM cell is built using the low-power SoC version of Intel's 22 nm TriGate transistor. The cell consists of one access transistor and one storage capacitor. Although no specific material details were forthcoming, the capacitor is a typical metal-insulator-metal (MIM) crown device using a high-k dielectric, located above the bitlines.
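For readers unfamiliar with the one-transistor, one-capacitor (1T1C) cell just described, the toy model below sketches its behavior: the capacitor's charge leaks away over time (which is why DRAM needs periodic refresh), and a read senses the charge and must write the bit back. All values are illustrative, not Intel's actual device parameters.

```python
# Toy behavioral model of a 1T1C DRAM cell: one access transistor
# gating one storage capacitor. Illustrative only.

class DramCell:
    """One-transistor, one-capacitor (1T1C) storage cell."""

    def __init__(self):
        self.charge = 0.0  # normalized capacitor charge, 1.0 = full

    def write(self, bit):
        # Access transistor on: the bitline drives the capacitor.
        self.charge = 1.0 if bit else 0.0

    def leak(self, fraction=0.1):
        # Stored charge decays over time; this is why DRAM is
        # "dynamic" and needs refresh.
        self.charge *= (1.0 - fraction)

    def read(self, threshold=0.5):
        # Charge sharing with the bitline is destructive: the sense
        # amplifier resolves the bit, then writes it back.
        bit = self.charge > threshold
        self.write(bit)  # restore (refresh-on-read)
        return bit

cell = DramCell()
cell.write(1)
for _ in range(5):
    cell.leak()          # 0.9**5 ≈ 0.59, still above threshold
print(cell.read())       # True, and the charge is restored to 1.0
```

Wait too long between refreshes, however, and the charge drops below the sense threshold and the bit is lost, which is exactly the retention-versus-power trade-off a graphics eDRAM must manage.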


Image courtesy of AnandTech

Those who follow the field will recognize both the cell type and the use of a separate eDRAM chip in the system. The Xbox 360 employed an eDRAM chip manufactured by NEC as did the Nintendo Wii. The other main game console of that era was Sony's PlayStation. Sony used eDRAM as well, but it was in a fully integrated SoC with IBM trench cells.

The Intel eDRAM is a significant industry milestone: Intel has been out of the DRAM business since the 1980s, and the technology has implications well beyond graphics. Although eDRAM is currently offered by TSMC, IBM is really the only other player pursuing a DRAM option at the most advanced logic process nodes. IBM has also published a 22 nm eDRAM (IBM is well known for its trench-capacitor cell, which is quite different from the Intel design). Casual observers should note that a DRAM cell built into a logic process cannot compete on size with cells from a commodity DRAM process optimized purely for cell-size reduction and performance. Intel's 22 nm logic eDRAM cell is 0.029 μm².
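To put that 0.029 μm² figure in perspective, a quick back-of-envelope calculation gives the raw, cell-limited density. Real arrays add sense amplifiers, decoders and redundancy, so usable density is considerably lower; this is only an upper bound derived from the published cell size.

```python
# Raw density implied by the published 22 nm eDRAM cell size.
# Array overhead (sense amps, decoders, redundancy) is ignored.

cell_area_um2 = 0.029                  # Intel 22 nm logic eDRAM cell
cells_per_mm2 = 1e6 / cell_area_um2    # 1 mm² = 1e6 μm²
mbit_per_mm2 = cells_per_mm2 / 2**20   # raw megabits per mm²

print(f"{cells_per_mm2:,.0f} cells/mm²")   # ≈ 34.5 million
print(f"{mbit_per_mm2:.1f} Mb/mm² (raw)")  # ≈ 32.9 Mb/mm²
```

Roughly 33 Mb of raw cell area per square millimeter: enough that a useful graphics cache fits on a modest die, even after array overhead is paid.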

The Iris Pro chips are currently in manufacturing, and products will arrive on the usual “shipping soon” timeline.

According to Zhang, the decision to adopt a proprietary chip and I/O technology was driven by the limitations of current commodity DRAM products. Bandwidth, power consumption and cost combined to push Intel toward a new approach to graphics RAM for the Iris Pro line. I/O bandwidth appears to be the main concern, as today's HD and UHD graphics require the exchange of vast amounts of data among the various processing elements. Intel will use a proprietary I/O scheme it calls OPIO (On-Package I/O).

Dan Snyder of Intel has a blog post with details of the product, especially performance and links to some independent graphics benchmarks. AnandTech takes a more detailed look at Iris Pro.

Of course, Intel never discusses roadmaps, future technologies or products, but I always ask. What I didn't get were answers to the critical roadmap questions:

  • Will eDRAM eventually be used in mainstream components as well?
  • Will the eDRAM be integrated onto processor chips or always be a separate die in a system-in-package configuration?

These answers await the next round of product introduction announcements and analyst briefings, so stay tuned.