Phison’s CEO predicts that growing interest in running AI models, such as OpenClaw, on PCs threatens to extend the memory shortage. But it could also help solve the crunch.

The memory shortage has been blamed on AI data center demand. But might the growing interest in running AI from your home make things worse?
The CEO of Phison, a Taiwanese memory controller and NAND flash vendor, predicts the shift toward running AI programs on local hardware risks extending the supply gap for several years, possibly even a decade.
“The AI demand is not going to slow down,” Khein Seng Pua told PCMag in an interview at Nvidia’s GTC event in San Jose, California.
Pua spoke to us as news emerged that one of the leading memory suppliers, SK Hynix, is already projecting the memory shortage will last four or five years, rather than merely two. SK Hynix is pointing to a scarcity of wafers, the foundational base that chips are built on.
However, Pua says growing interest in OpenClaw, the open-source autonomous AI agent that users can run on a PC, is another reason the memory industry will struggle to meet demand in the coming years.
“In these few weeks, in China, OpenClaw is getting popular. It’s getting crazy, right? Don’t you believe that users will eventually install OpenClaw on-premise?” he asked. Pua noted that a recent Phison memory shipment to a Chinese PC vendor quickly sold out because Chinese customers were buying Intel’s “Panther Lake” laptops to run OpenClaw.
At GTC, Nvidia’s CEO Jensen Huang even said of OpenClaw, “This is as big of a deal as HTML. This is as big of a deal as Linux. We have now a world-class open agentic framework.”
However, Pua sees a near future in which a growing number of users will want to run more advanced AI models, including for video generation, that need more memory for the best performance, far beyond merely 16GB of RAM or 512GB of storage. “In two years, the PC can definitely run AI inference, maybe OpenClaw or something,” he said. But this will create more consumer demand for memory at a time when AI data centers continue to gobble up supply.
In the short term, Pua said the memory shortage is forcing smartphone and PC vendors to settle for lower storage configurations. He predicts that by Q3, more PCs will arrive with only 256GB of storage, down from 1TB. Around the same time, Nvidia will likely begin shipping its new “Vera Rubin” chips for data centers, which will absorb even more SSD supply, further worsening the memory crunch.