{"id":120938,"date":"2026-03-12T23:56:37","date_gmt":"2026-03-13T02:56:37","guid":{"rendered":"https:\/\/ces.einnews.com\/article\/899073737"},"modified":"2026-03-12T23:56:37","modified_gmt":"2026-03-13T02:56:37","slug":"disciplined-supply-strategy-keeps-dram-and-hbm-markets-tight","status":"publish","type":"post","link":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/2026\/03\/12\/disciplined-supply-strategy-keeps-dram-and-hbm-markets-tight\/","title":{"rendered":"Disciplined Supply Strategy Keeps DRAM and HBM Markets Tight"},"content":{"rendered":"<p><b>Thursday, March 12, 2026<\/b> <\/p>\n<p>During a recent virtual roundtable of TechInsights analysts, the consensus was that memory makers have learned from previous boom-and-bust cycles and are showing greater discipline in ramping up to meet AI-driven demand. Forecasted HBM and DRAM shortages stem not from supply chain disruptions, but from unprecedented and largely unanticipated adoption. <\/p>\n<p>AI growth will continue to be driven primarily by the cloud, with the focus gradually shifting from training to inference. <\/p>\n<p>Cameron McKnight-MacNeil, process analyst at TechInsights, said Nvidia\u2019s Rubin platform, announced in September 2025 and touted as a new class of GPU, appears to be positioned as a platform for inference in the cloud. <\/p>\n<p>He said the massive scale of HBM required for AI platforms will continue in 2026, noting that each of Nvidia\u2019s Blackwell accelerator packages contains eight HBM modules, with each module comprising eight DRAM dies plus a controller die. With hyperscalers deploying super pods of racks full of GPUs, \u201cHBM4 is going to be the memory flavor as it were for AI in \u201926.\u201d <\/p>\n<p>Cameron McKnight-MacNeil (Source: TechInsights)<br \/>\nMcKnight-MacNeil said the hunger for HBM and the corresponding packaging raises yield concerns. 
\u201cTechInsights has highlighted the potential concerns around the poor yields from stacking [DRAM dies] and the impact on sustainability,\u201d he said. <\/p>\n<p>JEDEC\u2019s recent update increased the allowable package height for HBM, enabling stack heights up to 16 dies, made possible in part by thinning DRAM dies and hybrid bonding. <\/p>\n<p>There is extensive R&amp;D aimed at improving system efficiency, alongside increased adoption of optics in data centers, McKnight-MacNeil added. \u201cThey\u2019re going to make the transistors more efficient.\u201d <\/p>\n<p>But any improvements are immediately snapped up by the demand for more compute, he said. \u201cIt\u2019s a growth situation.\u201d <\/p>\n<p>Dan Kim, chief strategy officer at TechInsights, said that despite discussions about a potential AI bubble, the current memory market is not speculative. \u201cThis is purely a function of demand and supply.\u201d <\/p>\n<p>He said the market began recovering in 2023 after a cyclical downturn, with growth expected to remain healthy through 2026. \u201cWhat we\u2019re seeing is just a robust demand for accelerated compute that is being manifested in the growth of GPUs and ASICs.\u201d <\/p>\n<p>This \u201cextraordinary\u201d growth is happening primarily in the data center and the cloud, Kim added, contributing to rising prices for both DRAM and NAND. <\/p>\n<p>\u201cThere is also an insatiable demand to lower the power consumption of AI compute,\u201d he added. <\/p>\n<p>That means innovation in more efficient power systems is becoming just as critical as scaling GPU and memory bandwidth. <\/p>\n<p>Growth outside the data center space is not as aggressive, Kim said, but all segments are competing for the same foundry space. 
\u201cThis will shape up to be a very interesting growth market for 2026.\u201d <\/p>\n<p>But if CES is any indication, edge AI is seeing remarkable and even transformative growth, Jack Narcotta, head of consumer electronics at TechInsights, said. \u201cIn the consumer market, there\u2019s always that phrase \u2018bigger, better, faster, more\u2019. A handful of things were just outright vaporware.\u201d <\/p>\n<p>He said the smart home industry is rapidly approaching a point where brands must consider some fundamental questions about their relationship with AI and where it will be located. <\/p>\n<p>Putting AI directly on the device raises power and thermal issues, Narcotta said, but this would provide a smoother, faster, and significantly more reliable experience by reducing dependence on cloud communication. <\/p>\n<p>He said the consumer segment for AI is the \u201cwild west\u201d right now and that the Nvidia Rubin announcement seemed out of place at CES given the platform is for the data center. But it will have a ripple effect, as consumer brands will need to understand what its capabilities mean for their products, Narcotta said. \u201cConsumer electronics tends to lag some of the macro trends that are happening.\u201d <\/p>\n<p class=\"nText\"> <b>By: DocMemory<\/b><br \/><span class=\"txtNormal\">Copyright \u00a9 2023 CST, Inc. All Rights Reserved<\/span> <\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8230; for 2026.\u201d But if <span class=\"match\">CES<\/span> is any indication, edge &#8230; , head of <span class=\"match\">consumer electronics<\/span> at TechInsights, said. \u201cIn the <span class=\"match\">consumer<\/span> market, there &#8230; seemed out of place at <span class=\"match\">CES<\/span> given the platform is for &#8230; their products, Narcotta said. 
\u201c<span class=\"match\">Consumer electronics<\/span> tends to lag some &#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"fifu_image_url":"","fifu_image_alt":"","footnotes":""},"categories":[1],"tags":[],"class_list":["post-120938","post","type-post","status-publish","format-standard","hentry","category-news","wpcat-1-id"],"_links":{"self":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/120938","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/comments?post=120938"}],"version-history":[{"count":0,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/120938\/revisions"}],"wp:attachment":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/media?parent=120938"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/categories?post=120938"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/tags?post=120938"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}