In its latest earnings call, Micron revealed that later this year the company will finally introduce its first HBM DRAM for bandwidth-hungry applications. The move will enable the company to address the market for high-bandwidth devices such as flagship GPUs and network processors, which over the last five years have turned to HBM to meet their ever-growing bandwidth needs. And with Micron being the third and final of the "big three" memory manufacturers to enter the HBM market, HBM2 memory will finally be available from all three companies, introducing a new wrinkle of competition into that market.

Overall, while Micron has remained on the cutting edge of memory technologies, the company has been noticeably absent from HBM thus far. Its previous efforts instead focused on GDDR5X, as well as a different take on fast, stacked memory with the Hybrid Memory Cube (HMC). First announced back in 2011 as a joint effort with Samsung and IBM, HMC was a similar stacked DRAM type for bandwidth-hungry applications, which paired a narrow bus with extremely high data rates to offer memory bandwidth far exceeding that of then-standard DDR3. As a competing solution to HBM, HMC did see some usage in the market, particularly in products like accelerators and supercomputers. Ultimately, however, HMC lost the battle against the more widespread HBM/HBM2, and Micron folded the project in 2018 in favor of GDDR6 and HBM.

In the end, it has taken Micron around two years to develop its first HBM2 memory devices, which will finally become available in 2020. Given the broad, financial nature of the call, Micron isn't disclosing the specifications of its first HBM2 devices at this time, though it is a safe bet that the underlying DRAM cells will be produced using the company’s 2nd or 3rd generation 10 nm-class process technologies (1Y or 1Z). Meanwhile, Micron will no doubt do its best to be competitive with Samsung and SK Hynix in terms of both performance and capacity.
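As a rough guide to what's at stake, peak DRAM bandwidth is simply interface width multiplied by per-pin data rate. A minimal sketch using typical JEDEC-class figures for illustration, not Micron's (unannounced) specifications:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (interface width in bits * per-pin Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# One HBM2 stack: a 1024-bit interface at a typical 2.4 Gbps/pin
print(peak_bandwidth_gbs(1024, 2.4))  # ~307 GB/s per stack
# One GDDR6 device: a 32-bit interface at 14 Gbps/pin
print(peak_bandwidth_gbs(32, 14))     # 56 GB/s per chip
```

This is the core trade-off between the two approaches: HBM2 stacks a few very wide, relatively slow interfaces on an interposer next to the processor, while GDDR6 spreads many narrow, fast chips around the GPU. A 384-bit GDDR6 bus at 14 Gbps, for instance, works out to the 672 GB/s figure seen on high-end graphics cards.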

Sanjay Mehrotra, president and chief executive officer, had the following to say:

“In FQ2, we began sampling 1Z-based DDR5 modules and are on track to introduce high-bandwidth memory in calendar 2020. We are also making good progress on our 1-alpha node.”

Source: Micron


  • TheJian - Sunday, March 29, 2020 - link

    Except for certain use cases (servers, some pro stuff), this crap is pointless and expensive. Stick to easy-to-produce GDDR5X/GDDR6-type stuff. Not sure why they didn't just do GDDR6X vs. HBM2. All HBM/HBM2 has done for AMD is kill NET INCOME on flagship cards (gaming, that is; surely they make some on pro stuff). It was the cause of shortages multiple times, cost issues multiple times, delays... Jeez, why bother with this crap for gaming stuff? Can you prove it is NEEDED? No? Then why the heck are we still having to deal with this crap memory for gaming?

    We almost got screwed by Rambus/Intel ages ago, so why are they shoving this crap down our throats now? PROVE you need the bandwidth on the gaming side or drop this crap, AMD/NV (NV seems to be joining in for gaming too, stupid if so). Go cheaper unless you can prove bandwidth is an issue. So far I have seen NOTHING to prove we need this memory on a gaming card.

    Is the Titan RTX bandwidth starved with 24GB GDDR6? Nope. 672GB/s seems enough. I doubt the next release from NV this xmas (or whatever) will be massively faster than TitanRTX so prove you need it or forget raising the costs for NOTHING (talking to BOTH sides). Drop this expensive buzzword memory until you can PROVE you need it. Don't say 4k, nobody plays there and still prove it. Yeah, I call under 5% NOBODY (that is pretty much the total of both 1440+4k!). Wasting time on 4k testing in every card review is dumb. More 1080p where 65% play. I mean, should you write for 98% of your readers or 2%? Only a fool wastes time on nobody in publishing. I haven't looked at a 4k result in your reviews (anyone's LOL) since the first hit your site :) I still won't be on 4k for my next monitor...ROFL. Still no 4k TV either and I can easily afford ~800 for a great one at costco, but my 1080p samsung/LG's works fine still (61/65in both look great even with a good 720p rip - not how anandtech tests mind you...trash settings here) ;)
  • lilkwarrior - Sunday, March 29, 2020 - link

    This stuff is first & foremost for pro usage. Gamers are a very broad audience that hardly wants to pay for the kind of high-end GPUs that warrant using HBM memory.

    On top of that, games have been stuck catering to Windows 7 users; as a result, game developers have had to code games that are extremely inefficient at leveraging modern hardware like an HBM 2.0 GPU.

    That has only changed recently, with Windows 7 dying and next-gen consoles now releasing, which enables game developers to enforce far higher bars moving forward (WDDM 2.0 FINALLY being the baseline for DX12 & Vulkan; Windows 10-exclusive DX12 features now labeled DX12 Ultimate to make it easier to communicate, once next-gen consoles drop, which games leave the Windows 7 catering behind, so that gamers accommodate by upgrading)
