Intel’s solution to the memory bottleneck has been Rambus Technologies’ RDRAM (Rambus DRAM). Rambus is an entirely new memory design, centered around extremely high operating frequencies--RDRAM operates at an astonishing 400 MHz. To attain such high clock speeds, however, RDRAM uses a narrower bus than current SDRAM: as opposed to SDRAM’s 64-bit bus, RDRAM operates on a slim 16-bit bus. This difference in bus width means that an RDRAM-supporting chipset must be designed differently and will not be compatible with SDRAM without some form of translation, such as Intel’s MTH (Memory Translator Hub). Further adding to RDRAM’s impressive numbers is its ability to transfer data on both the rising and falling edges of the clock cycle. Given all that, the theoretical maximum bandwidth provided by RDRAM is as follows:
(400 MHz Operating Speed) x (16-bit Bus) x (2x Rising & Falling) / (8 bits per byte) = 1600 MB/s available bandwidth.
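The arithmetic above can be sketched as a tiny calculation (the helper name is mine, not from the article; the 400 MHz clock, 16-bit bus, and dual-edge transfer figures are the ones cited above):

```python
# Peak bandwidth = clock (MHz) x bus width (bits) x transfers per cycle,
# divided by 8 bits per byte, giving MB/s.
def peak_bandwidth_mb_s(clock_mhz, bus_bits, transfers_per_cycle):
    return clock_mhz * bus_bits * transfers_per_cycle / 8

# PC800 RDRAM: 400 MHz, 16-bit bus, dual-edge ("double data rate") transfers
rdram = peak_bandwidth_mb_s(400, 16, 2)
print(f"PC800 RDRAM peak: {rdram:.0f} MB/s")  # 1600 MB/s
```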
The numbers speak for themselves--RDRAM is capable of providing twice as much bandwidth as PC100 SDRAM, and 50% more than even PC133 SDRAM. Adding to these impressive numbers is RDRAM’s extremely high efficiency. Simply put, RDRAM is a very streamlined, effective memory architecture. Hyundai has estimated RDRAM’s bus effectiveness to be as high as 85%, compared to SDRAM’s 75%. So, RDRAM is the perfect solution, right? Wrong. The primary concern with RDRAM at present is simply cost. Yields thus far have been less than impressive, which drives up the cost of RDRAM. In addition, RDRAM is a proprietary technology, unlike SDRAM, which means that any third-party memory manufacturer who wishes to produce RDRAM is forced to pay royalties to Rambus. Furthermore, due to RDRAM’s completely new design, production would require retooling of current memory manufacturers’ fab plants, an expensive process. Those factors, coupled with some early problems with RDRAM (read: i820 fiasco), have left most major memory manufacturers somewhat reluctant to hop onto the RDRAM bandwagon.
RDRAM’s smaller 16-bit bus, while allowing for higher throughput speeds, forces a degree of serialization of commands. Because of this, the handful of applications that require very rapid access to small bits of data from different locations will actually suffer a slight decrease in performance with RDRAM. Fortunately, the majority of software in existence at the moment will benefit from the higher sustained speeds RDRAM brings to the table. But adding to the latency concerns is another of RDRAM’s ‘design flaws’: RDRAM operates at extremely high temperatures. So high, in fact, that the initial design for RDRAM called for active cooling (a fan on your RAM!). As this was not a viable option, a method of reducing heat production was required. The only viable way to lower heat production, unfortunately, was to power down the chips--and that’s what had to be done. With that in mind, only one chip on a RIMM will be actively sending data at any one time. While one chip transmits, the others enter a ‘standby’ mode in which they run much cooler. When the active chip powers down, another powers back up and begins transmitting. The heatsinks commonly seen on RIMMs are not actually heatsinks but heat spreaders, which help to distribute the heat from the one ‘active’ chip across the others in ‘standby’ mode. An unfortunate drawback to this design is that each chip, upon leaving ‘standby’ mode, must go through a power-up cycle that can take upwards of 100 ns! This imposes a slight latency penalty as well.
AMD and VIA, amongst others, have recognized the immediate shortcomings of RDRAM and agreed that, at present, it is not a viable alternative to current memory technology. Simply put, in their eyes the cost does not justify the benefits. But they also agree that something must be done, and for AMD and VIA that something is Double Data Rate SDRAM. Double Data Rate SDRAM is little more than a small evolution of current SDRAM technology. DDR SDRAM is capable, like RDRAM, of transferring data on both the rising and falling edges of the clock cycle. As such, its effective bandwidth is doubled. Consider standard 100 MHz DDR SDRAM:
(100 MHz Operating Speed) x (2x Rising & Falling) x (64-bit Bus) / (8 bits per byte) = 1600 MB/s available bandwidth.
As the numbers illustrate, DDR SDRAM, in conjunction with a 100 MHz FSB can provide bandwidth equivalent to that of RDRAM. Of course, like standard SDRAM, DDR SDRAM can be made to operate at a 133 MHz FSB as well.
(133 MHz Operating Speed) x (2x Rising & Falling) x (64-bit Bus) / (8 bits per byte) = 2133 MB/s available bandwidth.
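The same arithmetic covers both DDR speed grades (a sketch; the helper name is mine, and the 133.33 MHz value reflects the 133⅓ MHz clock behind the article’s 2133 MB/s figure):

```python
# DDR SDRAM peak bandwidth at the two FSB speeds discussed above.
# Peak = clock (MHz) x bus width (bits) x transfers per cycle / 8 bits per byte.
def peak_bandwidth_mb_s(clock_mhz, bus_bits, transfers_per_cycle):
    return clock_mhz * bus_bits * transfers_per_cycle / 8

pc1600 = peak_bandwidth_mb_s(100, 64, 2)     # 1600 MB/s, matching PC800 RDRAM
pc2100 = peak_bandwidth_mb_s(133.33, 64, 2)  # ~2133 MB/s (133 1/3 MHz, rounded)
print(f"PC1600: {pc1600:.0f} MB/s, PC2100: {pc2100:.0f} MB/s")
```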
When utilized with a 133 MHz FSB, DDR SDRAM can provide greater bandwidth than RDRAM. Thus, DDR SDRAM is able to achieve equal, or even higher, bandwidth levels while maintaining SDRAM’s lower latency.
In terms of production, DDR SDRAM seems to be the preferred option among memory manufacturers such as Micron, Hitachi, Hyundai, Samsung, Siemens and others. DDR yields have been acceptable thus far, and the chips are already in production at moderate levels, primarily due to demand for GeForce DDR-based graphics cards. Furthermore, DDR SDRAM requires very little retooling from standard SDRAM, so bringing production up to an acceptable level would not incur high initial costs.
The only immediate drawback to DDR SDRAM is its poor memory bus efficiency. Early estimates suggest that DDR SDRAM’s bus effectiveness may be less than 65%, compared to 75% for standard SDRAM and 85% for RDRAM, which seriously cuts into the available bandwidth. Despite this, PC266/PC2100 DDR SDRAM is capable of roughly the same effective bandwidth as PC800 RDRAM.
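Plugging in the efficiency estimates quoted above makes the comparison concrete (a sketch using the article’s figures; real-world efficiency varies by workload):

```python
# Effective bandwidth = peak bandwidth x estimated bus efficiency.
# The peak figures and efficiency percentages are the ones cited in the text.
def effective_bandwidth_mb_s(peak_mb_s, efficiency):
    return peak_mb_s * efficiency

ddr   = effective_bandwidth_mb_s(2133, 0.65)  # PC2100 DDR: ~1386 MB/s
rdram = effective_bandwidth_mb_s(1600, 0.85)  # PC800 RDRAM: 1360 MB/s
# Roughly the same usable bandwidth, despite DDR's lower efficiency.
print(f"DDR: {ddr:.0f} MB/s, RDRAM: {rdram:.0f} MB/s")
```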
The deciding factor in DDR’s success may prove to be not the technology itself, but support for it. While AMD and VIA are gaining ground, the fact remains that most people still use Intel systems, and most Intel systems run on Intel chipsets. While its recent Apollo Pro 133A and KX133 chipset releases have served to better its reputation, VIA has historically been regarded as a second-rate chipset manufacturer. Since DDR DIMMs will require a new chipset and a new 184-pin memory slot, a large part of DDR’s success will depend on VIA’s ability to create and implement a competitive chipset--at an affordable price.
It is unlikely that the two solutions presented can coexist for very long--one will likely become the standard. Some months ago, Hyundai released a comparison matrix that does an excellent job of summarizing the concerns associated with each type of RAM.
Intel & Rambus’ RDRAM is a very promising technology. Its high clock speed, extremely fast burst rates and high bus efficiency will make it beneficial for most applications. As Rambus and Intel have insisted, the performance advantage offered by RDRAM will likely grow as processor clock speeds increase. All things considered, though, RDRAM’s success depends more upon its cost to the consumer than on any other factor.
DDR SDRAM appears to be the more viable of the two solutions. DDR SDRAM won’t require a large investment on the part of manufacturers, and has thus far experienced decent yields. DDR simply seems, at present, to be the easier of the two options, while still offering performance on par with RDRAM. The success of DDR SDRAM, however, will be very dependent upon VIA’s ability to come through with competitive chipset releases.
In the meantime, try to stay on top of the issues, and don’t let advertising fool you. Expect to see quite a few confusing numbers floating around, such as PC133, PC266, PC600, PC700, PC800, PC1600, and PC2100. So when you see Intel advertising the new PC800 RDRAM memory, remember that, no, it’s not eight times as fast as PC100 memory. Likewise, when you see PC1600 and PC2100 DDR memory, remember that they’re not two to three times the speed of PC800 memory, regardless of what a salesperson tries to tell you. The plethora of new memory options that will present themselves in the next few months will be enough to overwhelm many potential buyers, so remember to sit back and do your homework before making that important purchase. The memory wars are just beginning, and the next few months promise to be very interesting indeed.