Memory interfaces – past, present and future

Abstract

DRAM standards have evolved tremendously over the last two-and-a-half decades, leading to diversification not only in the architecture of the memory array but also in that of the off-chip interface. Application-specific signaling channels have influenced transceiver design nearly as much as system power and bandwidth requirements have. The influence of the multidrop server channel, along with a broad range of target environments, has led the DDR branch of JEDEC DRAMs to incorporate multi-tap Decision Feedback Equalization to maximize flexibility, while shrinking supply voltages in pursuit of energy reduction have led Low-Power DDR (LPDDR) to completely rethink the output driver structure. In parallel, Graphics DDR (GDDR) has reached speeds that demand nearly equal care in the design of the external channel and of the chip itself. The adoption of multi-level signaling in GDDR6X and GDDR7 to relax on-chip frequency requirements has only heightened the need for more rigorous co-design of transceiver, package, and system characteristics. And, of course, the integration of silicon interposers to support High Bandwidth Memory (HBM) has driven a paradigm shift in memory interface design. With all of these adaptations, and many others not captured here, the splintering DRAM family continues to push the boundaries of single-ended signaling into the future.
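
As a rough back-of-the-envelope illustration of that frequency relief (a sketch added for clarity, using a hypothetical 24 Gb/s per-pin target rather than any figure from the presentation): PAM4, used by GDDR6X, encodes 2 bits per symbol, and GDDR7's PAM3 roughly 1.5 (3 bits across 2 symbols), versus 1 bit per symbol for NRZ, so the symbol rate the interface must toggle at falls accordingly.

    # Symbol (baud) rate required on the pin for a given per-pin data
    # rate and line code. Bits per symbol: NRZ = 1, PAM3 ~ 1.5, PAM4 = 2.
    def symbol_rate_gbaud(data_rate_gbps, bits_per_symbol):
        return data_rate_gbps / bits_per_symbol

    target_gbps = 24.0  # hypothetical per-pin data rate, not from the talk
    for name, bps in [("NRZ", 1.0), ("PAM3 (GDDR7)", 1.5), ("PAM4 (GDDR6X)", 2.0)]:
        # e.g., 24 Gb/s needs 24 Gbaud with NRZ but only 12 Gbaud with PAM4
        print(f"{name}: {symbol_rate_gbaud(target_gbps, bps):.1f} Gbaud")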

This presentation briefly explores what has driven the diversification in DRAM signaling schemes over the decades, discusses the motivation behind present embodiments, and projects where the DRAM interface is likely headed in the future (e.g., the features and functions necessary for continued energy-efficient bandwidth scaling).