Definitely Not 1980s Intel DRAM

It seems that Intel has a lot more on the go than the custom foundry operation discussed in a recent post. Thanks to Paul Boldt of the Ned Matters column for the heads-up that Intel is back in the DRAM game, of all things.

That's not to say that Intel is about to start making commodity memory chips to go head-to-head with Samsung, Micron and Hynix.

Before the tip-off of the abstract for the upcoming 2013 Symposia on VLSI Technology and Circuits, analyzed in detail by David Kanter at Real World Technologies, I would have been inclined to agree with AnandTech's assessment that the "embedded DRAM" meant a separate standard commodity DRAM chip from one of the big three suppliers mounted alongside the Intel processor in the same package. I can't fault AnandTech; they were trying to provide clarity in the usual situation in the PR-fuelled tech media, where catchy terms are often misinterpreted and then amplified around the blogosphere.

But the title of the upcoming Intel paper leaves no doubt:

"A 22nm High Performance Embedded DRAM SoC Technology Featuring Tri-Gate Transistors and MIMCAP COB"

The title is enough to clear up any lingering confusion. For one thing, Intel would obviously not refer to a multi-chip package with a separate DRAM die as "embedded DRAM." For another, we see the reference to a MIMCAP, or metal-insulator-metal capacitor, built as a COB, or capacitor over bitline.

For more details, you can read the full abstract and see the media tip sheet images (for other technology session papers as well) here.

There are at least two more papers in this year's VLSI Technology Symposium that continue to highlight Intel's attention to its custom foundry business and to attracting potential clients.

IBM has long been a proponent of embedded DRAM (or eDRAM) as a way to provide very large on-chip cache memories, rather than relying on the all-transistor latch circuit, the 6T SRAM, that is widely used for on-chip volatile storage. But until now, IBM was increasingly alone in that approach.

There are other reasons to move to the DRAM approach, but I wonder whether the difficulty of scaling the SRAM cell is making a switch to DRAM a much easier decision than it once was.
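To put that density argument in rough numbers, here is a minimal back-of-the-envelope sketch. The cell areas and the array-efficiency factor below are illustrative assumptions, not figures taken from the Intel or IBM papers; they are only meant to show why a 1T-1C eDRAM cell can pack several times more cache bits into a given silicon area than a 6T SRAM cell.

```python
# Back-of-the-envelope cache density comparison: 6T SRAM vs 1T-1C eDRAM.
# Cell areas and array efficiency are illustrative assumptions only.

SRAM_6T_CELL_UM2 = 0.092   # assumed 6T SRAM cell area at a 22nm-class node (um^2)
EDRAM_CELL_UM2 = 0.029     # assumed 1T-1C eDRAM cell area at the same node (um^2)

def mbit_per_mm2(cell_area_um2: float, array_efficiency: float = 0.7) -> float:
    """Rough bit density in Mbit/mm^2, derated by an assumed array efficiency
    to account for sense amps, decoders and other periphery."""
    cells_per_mm2 = 1e6 / cell_area_um2   # 1 mm^2 = 1e6 um^2
    return cells_per_mm2 * array_efficiency / 1e6

if __name__ == "__main__":
    sram = mbit_per_mm2(SRAM_6T_CELL_UM2)
    edram = mbit_per_mm2(EDRAM_CELL_UM2)
    print(f"6T SRAM : ~{sram:.1f} Mbit/mm^2")
    print(f"eDRAM   : ~{edram:.1f} Mbit/mm^2 (~{edram / sram:.1f}x denser)")
```

Under these assumed numbers the eDRAM array comes out roughly three times denser, which is the kind of margin that makes a large last-level cache in DRAM look attractive despite the added refresh and process complexity.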


Images reprinted courtesy of the Secretariat for VLSI Symposia
