Intel Pushes for More On-Die Integration; Memory Bandwidth a Challenge to Keeping Cores Busy

March 9, 2012, By Sanjeev Ramachandran

Intel keeps turning out improved chips, and it has to, given the competition it now faces in the chip-making business. But do you know what its biggest challenge is?

If you guessed TI or NVIDIA, you are wrong. According to recent statements from Intel's top executives, the company sees memory bandwidth as the major challenge ahead.

When Intel recently launched the new Xeon E5, it made much of the chip's I/O improvements. One of the highlighted features was a 10 Gb/sec Ethernet controller.

The memory bandwidth challenge grows, however, as the chip maker adds more cores to make its processors more powerful. The question is how to deliver enough bandwidth to keep those additional cores busy.
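To see why this worries Intel, here is a rough, hypothetical sketch in Python. The numbers are illustrative assumptions, not Intel's own figures: if a socket's memory bandwidth stays roughly fixed while core counts climb, each core's share of that bandwidth shrinks.

```python
# Illustrative sketch (assumed figures, not Intel's): per-core memory
# bandwidth shrinks when core counts grow faster than socket bandwidth.

def per_core_bandwidth(socket_bw_gbs, core_count):
    """Naively split a socket's total memory bandwidth across its cores."""
    return socket_bw_gbs / core_count

# Assume a fixed ~51.2 GB/s per socket (e.g., four channels of DDR3-1600)
# and watch the share per core fall as cores are added.
socket_bw = 51.2
for cores in (4, 8, 16, 32):
    print(f"{cores:2d} cores -> {per_core_bandwidth(socket_bw, cores):5.2f} GB/s per core")
```

Under these assumptions, doubling the core count halves the bandwidth available to each core unless memory and I/O bandwidth scale up alongside it.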

Intel is focusing on increasing I/O bandwidth and moving components directly onto the processor die so that I/O sits closer to the cores.

When it comes to spending die area, integrating a memory controller is sometimes a better choice than adding yet another core.

Intel acknowledges that integration is an expensive business and that it can go only as far as the die-area budget allows. Even so, an integrated memory controller is often cheaper than a separate chip.

Adding more functionality to the die will yield higher-performance chips in the future, but Intel has to be cautious.

For one thing, the company has to take extra care to confirm that any functionality it puts on-die works correctly before the chip goes out the door. That added validation may also slow the pace at which it churns out new chips.
