Solder-It-Yourself DDR5: Russian modders pitch the idea of making their own RAM
Drop dead, memory makers: netizens are starting to solder their own RAM
Moderator: Jack12345
#8 Re: Drop dead, memory makers: netizens are starting to solder their own RAM
The main reason RAM prices are surging is exploding demand for high-end memory (HBM, DDR5) from AI and servers. The big chipmakers have tilted capacity toward these higher-margin products, squeezing the supply of conventional consumer memory (DDR4); add market stockpiling and supply-chain shortages, and prices have been pushed up across the board. It is, in short, an AI-driven supply-demand imbalance.
A breakdown of the causes:
Exploding AI demand:
Training and inference for large models require enormous compute, and demand for high-bandwidth, low-latency memory (HBM) has grown explosively.
A single AI server needs eight times the memory of an ordinary server, consuming a large share of global monthly DRAM capacity.
Capacity shifts on the supply side:
Samsung, SK Hynix, Micron and the other giants are concentrating wafer capacity on higher-margin HBM and DDR5, reducing DDR4 supply.
DDR4 is also near the end of its life cycle; vendors are winding down or cutting production, leaving the market undersupplied.
Stockpiling and panic buying:
Expectations of rising prices have led distributors, agents and even end-product makers to hoard inventory, worsening the shortage of stock in circulation.
This seller's market pushes prices higher still, creating a self-reinforcing cycle of increases.
Industry cycles and history:
The memory industry is inherently cyclical, swinging between oversupply and shortage.
The 2023 trough (losses, production cuts) left supply with little elasticity when this recovery arrived.
Supply-chain strain:
Global semiconductor supply faces challenges, and certain export restrictions have also limited capacity expansion, indirectly raising costs.
Impact on consumers:
Upgrading or buying PCs, phones and other electronics costs more.
Gamers and other users who need high-performance hardware face larger budgets.
In short, unlike the previous price cycle (driven by growing phone storage demand), the core driver this time is AI compute infrastructure build-out rather than personal consumption, which makes the surge faster and its impact broader.
#10 Re: Drop dead, memory makers: netizens are starting to solder their own RAM
Memory chip prices are out of control! Wall Street raises forecasts again: DRAM could surge 88% in 2026, NAND 74%
Xu Chao
01-07 04:30
Citing the spread of AI agents and surging memory demand from AI CPUs, Citi has raised its forecast for server DRAM average selling price (ASP) growth to 144%, and expects enterprise SSD ASPs to rise 87% year over year. In the analysts' view, the market is entering an extreme seller's market, with pricing power resting entirely with memory giants such as Samsung.
Citi warns that the world will face a "severe supply shortage" of memory chips in 2026.
According to 追风交易台, Citi's latest outlook is even more aggressively bullish than Nomura's. Analysts believe that, driven by the spread of AI agents and surging memory demand from AI CPUs, memory chip prices will rise uncontrollably in 2026. They raised their forecast for 2026 DRAM average selling price (ASP) growth from 53% all the way to 88%, and their NAND forecast from 44% to 74%.
Pricing power tilts entirely toward sellers
In its latest report, Citi's research team states plainly that it expects a "severe supply shortage" in the commodity memory market in 2026. This shortage is not a temporary supply-chain disruption; it is driven by structural data growth. Citi raised its forecast for 2026 DRAM (dynamic random-access memory) ASP growth from +53% to +88% year over year.
The server DRAM numbers are even more striking.
Citi expects server DRAM ASPs to surge 144% year over year in 2026 (previous forecast: +91%), driven by both AI training and inference demand. Taking the mainstream 64GB DDR5 RDIMM as an example, Citi forecasts its price will reach $620 in the first quarter of 2026, up 38% quarter over quarter and well above its earlier forecast of $518.
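Citi's RDIMM figures allow a quick back-of-the-envelope check: if the 64GB DDR5 RDIMM reaches $620 in Q1 2026 on a 38% quarter-over-quarter rise, the implied Q4 2025 price is about $449. A minimal sketch (the $620 and 38% come from the report; the rest is plain arithmetic):

```python
# Back out the implied Q4 2025 price for a 64GB DDR5 RDIMM
# from Citi's Q1 2026 forecast of $620 at +38% quarter over quarter.
q1_2026_price = 620.0
qoq_growth = 0.38

implied_q4_2025 = q1_2026_price / (1 + qoq_growth)
print(round(implied_q4_2025))  # 449
```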
Citi is no less aggressive on NAND (flash), raising its 2026 ASP growth forecast from +44% to +74%, with enterprise SSD ASPs expected to rise 87% year over year. In the analysts' view, the market will enter an extreme seller's market, with pricing power resting entirely with memory giants such as Samsung.
Based on these aggressive price forecasts, Citi has sharply revised its earnings outlook for Samsung Electronics. Benefiting from an extremely favorable pricing environment, Citi expects Samsung's 2026 operating profit (OP) to soar to 155 trillion won, up 253% year over year.
That figure is far above Citi's previous forecast of 115 trillion won. Citi believes that as DRAM and NAND prices soar, Samsung's profitability will show strong leverage, and it has raised its price target for Samsung Electronics from 170,000 won to 200,000 won.
Nomura's "super cycle" versus Citi's "extreme shortage"
Nomura earlier proposed the concept of a "triple super cycle" (DRAM, NAND, HBM), forecasting that the global memory market will grow 98% to $445 billion in 2026.
On the specific magnitude of the price increases, however, the two firms diverge sharply.
Nomura forecasts 2026 DRAM prices rising 46% and NAND prices rising 65%. Those are already substantial increases, but Citi's DRAM forecast (88%) is nearly double Nomura's.
The core of the disagreement is how deeply each firm reads the demand picture.
Nomura emphasizes the "double resonance" of AI servers and general-purpose servers plus the ramp-up of HBM4 capacity, while Citi goes further, stressing the incremental data generated by AI agents: it argues data volumes will explode, pulling general-purpose server memory prices up more steeply than expected.
Cleanroom shortage becomes a long-term bottleneck
Why are prices so out of control? Besides exploding demand, physical constraints on the supply side are the other key factor.
Nomura's report astutely points out that supply expansion across the global memory industry is severely constrained by limited cleanroom availability.
Nomura stresses that even if vendors decided to expand today, the cleanroom shortage means meaningful supply growth before mid-2027 will be very limited. In addition, technology migration (such as the transition to the 1c-nanometer process) actually reduces wafer capacity by 10% to 15%, with low initial yields. Faced with the AI data explosion Citi anticipates, the supply side has almost no ability to respond quickly. This supply-demand mismatch is the fundamental logic behind Citi's willingness to forecast a near-doubling of DRAM prices.
#13 Re: Drop dead, memory makers: netizens are starting to solder their own RAM
This idea could actually work. I looked on Taobao: the cheapest 4GB DDR4 chips go for 33 RMB each, so 32GB is 264 RMB. Add a 20 RMB PCB plus the cost of solder balls, and you can get all the materials for under 300 RMB. A 32GB DDR4 stick runs about 1,000 RMB, and Newegg or Amazon prices are even higher. But reballing takes real skill; most people can't pull it off.
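The cost comparison in that post is simple enough to check in a few lines. A rough bill-of-materials sketch (the chip, PCB and retail-stick prices are the Taobao figures quoted above; the solder-ball cost is an assumed placeholder, since the post gives no number):

```python
# Rough bill of materials for the DIY 32GB DDR4 stick described above.
CHIP_PRICE_RMB = 33        # cheapest 4GB DDR4 chip on Taobao (quoted)
CHIPS_NEEDED = 32 // 4     # 8 chips to reach 32GB
PCB_RMB = 20               # quoted PCB price
SOLDER_BALLS_RMB = 10      # assumption; the post only says "plus solder balls"

diy_total = CHIP_PRICE_RMB * CHIPS_NEEDED + PCB_RMB + SOLDER_BALLS_RMB
retail_stick_rmb = 1000    # approximate retail price of a 32GB DDR4 stick

print(diy_total)                     # 294, i.e. under 300 RMB as claimed
print(retail_stick_rmb - diy_total)  # 706 RMB of nominal savings
```

The savings figure ignores tools, flux, stencils and failed attempts, which is exactly why the post's caveat about reballing skill matters.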
tingtingliu (Noted Commentator)
Post interactions: 166 | Posts: 4106 | Joined: Aug 29, 2022 17:52
#20 Re: Drop dead, memory makers: netizens are starting to solder their own RAM
Tech
AI memory is sold out, causing an unprecedented surge in prices
Published Sat, Jan 10 2026 7:00 AM EST
Kif Leswing
@kifleswing
Key Points
This year, there won’t be enough memory to meet worldwide demand, because powerful AI chips made by companies like Nvidia, AMD and Google need so much of it.
Prices for computer memory, or RAM, are expected to rise over 50% this quarter compared to the last quarter of 2025.
Wall Street has been asking consumer electronics companies, like Apple and Dell Technologies, how they will handle the memory shortage and if they might be forced to raise prices or cut margins.
All computing devices require a part called memory, or RAM, for short-term data storage, but this year, there won’t be enough of these essential components to meet worldwide demand.
That’s because companies like Nvidia, Advanced Micro Devices and Google need so much RAM for their artificial intelligence chips, and those companies are the first ones in line for the components.
Three primary memory vendors — Micron, SK Hynix and Samsung Electronics — make up nearly the entire RAM market, and their businesses are benefiting from the surge in demand.
“We have seen a very sharp, significant surge in demand for memory, and it has far outpaced our ability to supply that memory and, in our estimation, the supply capability of the whole memory industry,” Micron business chief Sumit Sadana told CNBC this week at the CES trade show in Las Vegas.
Micron’s stock is up 247% over the past year, and the company reported that net income nearly tripled in the most recent quarter. Samsung this week said that it expects its December quarter operating profit to nearly triple as well. Meanwhile, SK Hynix is considering a U.S. listing as its stock price in South Korea surges, and in October, the company said it had secured demand for its entire 2026 RAM production capacity.
Now, prices for memory are rising.
TrendForce, a Taipei-based researcher that closely covers the memory market, this week said it expects average DRAM memory prices to rise between 50% and 55% this quarter versus the fourth quarter of 2025. TrendForce analyst Tom Hsu told CNBC that type of increase for memory prices was “unprecedented.”
Three-to-one basis
Chipmakers like Nvidia surround the part of the chip that does the computation — the graphics processing unit, or GPU — with several blocks of a fast, specialized component called high-bandwidth memory, or HBM, Sadana said. HBM is often visible when chipmakers hold up their new chips. Micron supplies memory to both Nvidia and AMD, the two leading GPU makers.
Nvidia’s Rubin GPU, which recently entered production, comes with up to 288 gigabytes of next-generation HBM4 memory per chip. HBM is installed in eight visible blocks above and below the processor, and that GPU will be sold as part of a single server rack called the NVL72, which fittingly combines 72 of those GPUs into a single system. By comparison, smartphones typically come with 8 or 12GB of lower-power DDR memory.
Nvidia founder and CEO Jensen Huang introduces the Rubin GPU and the Vera CPU as he speaks during Nvidia Live at CES 2026 ahead of the annual Consumer Electronics Show in Las Vegas, Nevada, on January 5, 2026. (Photo by Patrick T. Fallon / AFP via Getty Images)
But the HBM memory that AI chips need is much more demanding than the RAM used for consumers’ laptops and smartphones. HBM is designed for high-bandwidth specifications required by AI chips, and it’s produced in a complicated process where Micron stacks 12 to 16 layers of memory on a single chip, turning it into a “cube.”
When Micron makes one bit of HBM memory, it has to forgo making three bits of more conventional memory for other devices.
“As we increase HBM supply, it leaves less memory left over for the non-HBM portion of the market, because of this three-to-one basis,” Sadana said.
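Sadana’s three-to-one trade-off reduces to simple arithmetic: every bit of HBM produced displaces three bits of conventional DRAM from the same capacity. A toy model (the numbers are illustrative, not Micron’s actual figures):

```python
def conventional_output(total_bits: float, hbm_bits: float) -> float:
    """Bits of conventional DRAM left over after carving out HBM,
    under the three-to-one capacity trade-off Sadana describes."""
    return total_bits - 3 * hbm_bits

# Illustrative: if a fab's capacity is 100 units and 10 go to HBM,
# only 70 units of conventional DRAM remain, not 90.
print(conventional_output(100, 10))  # 70.0
```

The nonlinearity is the point: a modest shift of output toward HBM removes triple that amount from the consumer market, which is why consumer RAM supply tightens so quickly.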
Hsu, the TrendForce analyst, said that memory makers are favoring server and HBM applications over other clients because there’s higher potential for growth in demand in that business and cloud service providers are less price-sensitive.
In December, Micron said it would discontinue a part of its business that aimed to provide memory for consumer PC builders so the company could save supply for AI chips and servers.
Some inside the tech industry are marveling at how much and how quickly the price of RAM for consumers has increased.
Dean Beeler, co-founder and tech chief at Juice Labs, said that a few months ago, he loaded up his computer with 256GB of RAM, the maximum amount that current consumer motherboards support. That cost him about $300 at the time.
“Who knew that would end up being ~$3,000 of RAM just a few months later,” he posted on Facebook on Monday.
VIDEO (17:42): Micron is building the biggest-ever U.S. chip fab, despite China ban
‘Memory wall’
AI researchers started to see memory as a bottleneck just before OpenAI’s ChatGPT hit the market in late 2022, said Majestic Labs co-founder Sha Rabii, an entrepreneur who previously worked on silicon at Google and Meta.
Prior AI systems were designed for models like convolutional neural networks, which require less memory than large language models, or LLMs, that are popular today, Rabii said.
While AI chips themselves have been getting much faster, memory has not, he said, which leads to powerful GPUs waiting around to get the data needed to run LLMs.
“Your performance is limited by the amount of memory and the speed of the memory that you have, and if you keep adding more GPUs, it’s not a win,” Rabii said.
The AI industry refers to this as the “memory wall.”
“The processor spends more time just twiddling its thumbs, waiting for data,” Micron’s Sadana said.
More and faster memory means that AI systems can run bigger models, serve more customers simultaneously and add “context windows” that allow chatbots and other LLMs to remember previous conversations with users, which adds a touch of personalization to the experience.
Majestic Labs is designing an AI system for inference with 128 terabytes of memory, or about 100 times more memory than some current AI systems, Rabii said, adding that the company plans to eschew HBM memory for lower-cost options. Rabii said the additional RAM and architecture support in the design will enable its computers to support significantly more users at the same time than other AI servers while using less power.
Sold out for 2026
Wall Street has been asking companies in the consumer electronics business, like Apple and Dell Technologies, how they will handle the memory shortage and if they might be forced to raise prices or cut margins. These days, memory accounts for about 20% of the hardware costs of a laptop, Hsu said. That’s up from between 10% and 18% in the first half of 2025.
In October, Apple finance chief Kevan Parekh told analysts that his company was seeing a “slight tailwind” on memory prices but he downplayed it as “nothing really to note there.”
But in November, Dell said it expected its cost basis for all of its products to go up as a result of the memory shortage. COO Jefferey Clarke told analysts that Dell planned to change its mix of configurations to minimize the price impacts, but he said the shortage will likely affect retail prices for devices.
“I don’t see how this will not make its way into the customer base,” Clarke said. “We’ll do everything we can to mitigate that.”
Even Nvidia, which has emerged as the biggest customer in the HBM market, is facing questions about its ravenous memory needs — in particular, about its consumer products.
At a press conference Tuesday at CES, Nvidia CEO Jensen Huang was asked if he was concerned that the company’s gaming customers might be resentful of AI technology because of rising game console and graphics cards prices that are being driven by the memory shortage.
Huang said Nvidia is a very large customer of memory and has long relationships with the companies in the space but that, ultimately, there would need to be more memory factories because the needs of AI are so high.
“Because our demand is so high, every factory, every HBM supplier, is gearing up, and they’re all doing great,” Huang said.
At most, Micron can meet only two-thirds of the medium-term memory requirements for some customers, Sadana said. But the company is currently building two big factories, called fabs, in Boise, Idaho, that will start producing memory in 2027 and 2028, he said. Micron is also going to break ground on a fab in the town of Clay, New York, that he said is expected to come online in 2030.
But for now, “we’re sold out for 2026,” Sadana said.