HBM

TrendForce projects a remarkable 105% increase in annual bit shipments of high-bandwidth memory (HBM) this year. This boost comes in response to soaring demand from AI and high-performance computing processor developers, most notably Nvidia, and from cloud service providers (CSPs). To fulfill that demand, Micron, Samsung, and SK Hynix are reportedly increasing their HBM capacities, but new production lines will likely start operations only in Q2 2024.

More HBM Is Needed

Memory makers managed to more or less match the supply and demand of HBM in 2022, a rare occurrence in the DRAM market. However, an unprecedented demand spike for AI servers in 2023 forced developers of the appropriate processors (most notably Nvidia) and CSPs to place additional orders for HBM2E and HBM3 memory. This made DRAM makers use...

Micron Publishes Updated DRAM Roadmap: 32 Gb DDR5 DRAMs, GDDR7, HBMNext

In addition to unveiling its first HBM3 memory products yesterday, Micron also published a fresh DRAM roadmap for its AI customers for the coming years. Being one of the...

4 by Anton Shilov on 7/27/2023

Micron Unveils HBM3 Gen2 Memory: 1.2 TB/sec Memory Stacks For HPC and AI Processors

Micron today is introducing its first HBM3 memory products, becoming the latest of the major memory manufacturers to start building the high bandwidth memory that's widely used in server-grade...

7 by Anton Shilov on 7/26/2023
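
The 1.2 TB/sec figure above follows directly from HBM's wide-interface design: peak per-stack bandwidth is simply the per-pin data rate multiplied by the 1,024-bit interface width. The short Python sketch below is a back-of-the-envelope check rather than Micron's own math; the ~9.2 Gb/s per-pin rate is an assumption chosen to line up with the quoted number, with baseline HBM3 (6.4 Gb/s) shown for comparison.

```python
# Rough sanity check of per-stack HBM bandwidth (illustrative only).
# Assumes the 1024-bit interface used by HBM generations to date; the 9.2 Gb/s
# per-pin rate is an assumption picked to match the 1.2 TB/s headline figure.

def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in GB/s: per-pin rate (Gb/s) * bus width (bits) / 8."""
    return pin_rate_gbps * bus_width_bits / 8

print(stack_bandwidth_gbs(9.2))  # ~1177.6 GB/s, i.e. roughly 1.2 TB/s per stack
print(stack_bandwidth_gbs(6.4))  # ~819.2 GB/s for baseline HBM3, for comparison
```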

As The Demand for HBM Explodes, SK Hynix is Expected to Benefit

The demand for high bandwidth memory is set to explode in the coming quarters and years due to the broader adoption of artificial intelligence in general and generative AI...

9 by Anton Shilov on 4/18/2023

Intel Showcases Sapphire Rapids Plus HBM Xeon Performance at ISC 2022

Alongside today’s disclosure of the Rialto Bridge accelerator, Intel is also using this week’s ISC event to deliver a brief update on Sapphire Rapids, the company’s next-generation Xeon CPU...

22 by Ryan Smith on 5/31/2022

Intel to Launch Next-Gen Sapphire Rapids Xeon with High Bandwidth Memory

As part of today’s International Supercomputing 2021 (ISC) announcements, Intel is showcasing that it will be launching a version of its upcoming Sapphire Rapids (SPR) Xeon Scalable processor with...

150 by Dr. Ian Cutress on 6/28/2021

2023 Interposers: TSMC Hints at 3400mm2 + 12x HBM in one Package

High-performance computing chip designs have been pushing ultra-high-end packaging technologies to their limits in recent years. A solution to these extreme bandwidth requirements in the...

35 by Andrei Frumusanu on 8/25/2020
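
To put the "12x HBM in one package" hint into perspective, the sketch below estimates what such a configuration could deliver in aggregate. The per-stack assumptions (HBM2E-class memory at 3.2 Gb/s per pin over a 1,024-bit bus) are illustrative choices contemporary with this news item, not figures from TSMC's disclosure.

```python
# Illustrative aggregate-bandwidth estimate for a 12-stack HBM interposer.
# Per-stack parameters are assumptions (HBM2E-class), not TSMC-provided numbers.

PIN_RATE_GBPS = 3.2     # assumed HBM2E per-pin data rate
BUS_WIDTH_BITS = 1024   # interface width per HBM stack
STACKS = 12             # stack count hinted at for the large interposer

per_stack_gbs = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8   # ~409.6 GB/s per stack
package_tbs = per_stack_gbs * STACKS / 1000          # ~4.9 TB/s for the whole package

print(f"{per_stack_gbs:.1f} GB/s per stack, {package_tbs:.2f} TB/s aggregate")
```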

Micron to Launch HBM2 DRAM This Year: Finally

In its latest earnings call, Micron revealed that later this year the company will finally introduce its first HBM DRAM for bandwidth-hungry applications. The move will enable...

14 by Anton Shilov on 3/27/2020

AMD Discusses ‘X3D’ Die Stacking and Packaging for Future Products: Hybrid 2.5D and 3D

One of AMD’s key messages at its Financial Analyst Day 2020 is that the company wants to remain on the leading edge when it comes to process node technology...

12 by Dr. Ian Cutress on 3/5/2020

SK Hynix Licenses DBI Ultra Interconnect for Next-Gen 3DS and HBM DRAM

SK Hynix has inked a new broad patent and technology licensing agreement with Xperi Corp. Among other things, the company licensed the DBI Ultra 2.5D/3D interconnect technology developed by...

9 by Anton Shilov on 2/11/2020

Samsung Develops 12-Layer 3D TSV DRAM: Up to 24 GB HBM2

Samsung on Monday said that it had developed the industry’s first 12-layer 3D packaging for DRAM products. The technology uses through silicon vias (TSVs) to create high-capacity HBM memory...

11 by Anton Shilov on 10/7/2019

Samsung HBM2E ‘Flashbolt’ Memory for GPUs: 16 GB Per Stack, 3.2 Gbps

Samsung has introduced the industry’s first memory that corresponds to the HBM2E specification. The company’s new Flashbolt memory stacks increase performance by 33% and offer double per-die as well...

25 by Anton Shilov on 3/20/2019

JEDEC Updates HBM Spec to Boost Capacity & Performance: 24 GB, 307 GB/s Per Stack

JEDEC this week published an updated version of its JESD235 specification, which describes HBM and HBM2 DRAM. The new version of the standard allows memory manufacturers to increase capacities...

15 by Anton Shilov on 12/19/2018
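
For those keeping score, the headline per-stack numbers can be reproduced with simple arithmetic. The sketch below assumes a 12-high stack of 16 Gb DRAM dies and a 2.4 Gb/s per-pin data rate over the standard 1,024-bit interface; these breakdowns are consistent with the figures above but are stated here as assumptions rather than quoted from JESD235 itself.

```python
# Back-of-the-envelope check of the updated spec's headline per-stack figures.
# The 12-high stack of 16 Gb dies and 2.4 Gb/s pin rate are assumptions consistent
# with the 24 GB / 307 GB/s numbers above, not values copied from JESD235.

DIES_PER_STACK = 12     # 12-high TSV stack
DIE_DENSITY_GBIT = 16   # 16 Gb per DRAM die
PIN_RATE_GBPS = 2.4     # per-pin data rate
BUS_WIDTH_BITS = 1024   # interface width per stack

capacity_gb = DIES_PER_STACK * DIE_DENSITY_GBIT / 8   # 24 GB per stack
bandwidth_gbs = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8    # 307.2 GB/s per stack

print(f"{capacity_gb:.0f} GB and {bandwidth_gbs:.1f} GB/s per stack")
```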

Xilinx Announces Project Everest: The 7nm FPGA SoC Hybrid

This week Xilinx is making public its latest internal project for the next era of specialized computing. The new product line, called Project Everest in the interim, is based...

16 by Ian Cutress on 3/19/2018

Samsung Starts Production of HBM2 “Aquabolt” Memory: 8 GB, 2.4 Gbps

Samsung this week announced that it had started mass production of its second-generation HBM2 memory code-named “Aquabolt”. The new memory devices have 8 GB capacity and operate at 2.4...

17 by Anton Shilov on 1/11/2018

SK Hynix: Customers Willing to Pay 2.5 Times More for HBM2 Memory

SK Hynix was the first DRAM manufacturer to start producing HBM Gen 1 memory in high volume back in 2015. However, the company is somewhat behind its rival Samsung...

23 by Anton Shilov on 8/4/2017

Hot Chips 2016: Memory Vendors Discuss Ideas for Future Memory Tech - DDR5, Cheap HBM, & More

Continuing our Hot Chips 2016 coverage for the evening, along with the requisite presentations on processors, several of the major players in the memory industry are also at the...

11 by Ryan Smith on 8/23/2016

SK Hynix Adds HBM2 to Catalog: 4 GB Stacks Set to Be Available in Q3

SK Hynix quietly added its HBM Gen 2 memory stacks to its public product catalog earlier this month, which means that the start of mass production should be...

43 by Anton Shilov on 8/1/2016

GDDR5X Standard Finalized by JEDEC: New Graphics Memory up to 14 Gbps

In Q4 2015, JEDEC (a major semiconductor engineering trade organization that sets standards for dynamic random access memory, or DRAM) finalized the GDDR5X specification, with accompanying white papers. This...

70 by Anton Shilov on 1/22/2016
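
The 14 Gbps headline also highlights the contrast between the two design philosophies covered on this page: GDDR5X pushes a narrow bus (32 bits per chip) very fast, while HBM runs a very wide bus (1,024 bits per stack) comparatively slowly. The comparison below is a rough illustration using top-of-spec per-pin rates (14 Gb/s for GDDR5X, 2.0 Gb/s for the original HBM2 spec); shipping products span a range of speeds.

```python
# Rough per-device comparison of the narrow-and-fast (GDDR5X) versus
# wide-and-slow (HBM2) approaches; per-pin rates are top-of-spec assumptions.

def device_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth of a single memory device in GB/s."""
    return pin_rate_gbps * bus_width_bits / 8

gddr5x_chip = device_bandwidth_gbs(14.0, 32)    # ~56 GB/s per GDDR5X chip
hbm2_stack = device_bandwidth_gbs(2.0, 1024)    # ~256 GB/s per HBM2 stack

print(f"GDDR5X chip: {gddr5x_chip:.0f} GB/s, HBM2 stack: {hbm2_stack:.0f} GB/s")
```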

JEDEC Publishes HBM2 Specification as Samsung Begins Mass Production of Chips

High-bandwidth memory (HBM) technology solves two key problems related to modern DRAM: it substantially increases the bandwidth available to computing devices (e.g., GPUs) and reduces power consumption. The first-generation...

42 by Anton Shilov on 1/20/2016
