For the first time since its founding, Nvidia will not release a new gaming GPU in 2026, ending a streak of annual GeForce launches dating back to the 1990s. According to Stacy Rasgon of Bernstein Research, as reported by CNBC, “The gaming segment is no longer the driving force of the company.” The shift reflects the dramatically higher profitability of AI chips: Nvidia’s compute and networking division averaged a 69% profit margin over the past three years, compared with just 40% for its gaming-focused graphics segment. A single Blackwell AI chip sells for up to $40,000, while gaming cards range from $299 to $1,999.
The competitive opening seemed obvious. Nvidia built its business on gaming GPUs, nearly going bankrupt to launch the original GeForce 256 in 1999, a gamble that gamers rewarded by buying the cards. Yet neither AMD nor Intel can capitalize on Nvidia’s retreat from gaming, because both face the same constraint crippling Nvidia’s gaming business: a severe shortage of computer memory chips.
AMD’s Radeon RX 9000 series saw significant price increases across its lineup. The flagship Radeon RX 9070 XT jumped 17%, while the Radeon RX 9060 XT 8GB rose 10% and the Radeon RX 9060 XT 16GB increased 14% due to its doubled memory capacity. David McAfee, who oversees AMD’s Radeon division, told Gizmodo during CES 2026 that the company works closely with memory suppliers to keep prices reasonable for everyday buyers. However, he admitted that “sustaining these efforts remains unrealistic amid the ongoing shortage.”
Intel’s situation is more severe. The company planned to launch an Arc B770 gaming card built on its BMG-31 chip with 32 Xe Cores and 16GB of memory, with reports pointing to a potential first-quarter 2026 release. That launch is now cancelled. Instead, Intel will release the Arc Pro B70 workstation card with 32GB of memory, aimed at AI work rather than gaming. Intel scrapped the gaming version citing a “lack of financial viability”: memory shortages and price hikes made it impossible to build an affordable gaming card profitably.
The root cause is a brutal shortage of computer memory chips affecting the entire industry. Nvidia plans to cut gaming GPU production by up to 40% because it cannot obtain sufficient memory chips. According to Cryptopolitan, Micron has warned of a near-permanent memory shortage affecting the industry.
Research firm Gartner predicts the shortage will drive computer prices up 17% this year, causing PC shipments to drop 10.4%. The firm expects entry-level consumer PCs to disappear entirely by 2028.
Stacy Rasgon explained the dynamic: “If there is push-outs or delays on the gaming roadmap, it’s probably in large part that they probably can’t make the cards anyways because it’s hard to get the memory. Every bit of memory that’s out there, I think is really getting prioritized to AI compute.”
High-performance AI processors require High Bandwidth Memory (HBM), which consumes roughly four times as many silicon wafers to produce as conventional memory chips. Because HBM production soaks up so much wafer capacity, output of the conventional memory that gaming cards depend on is squeezed, and the resulting scarcity hits every GPU maker. Rasgon concluded: “That dynamic is starving the overall industry of the type of memory that is traditionally used for more consumer type applications. It’s just not available. If Nvidia can’t get the memory, AMD ain’t going to get the memory.”
Tim Gettys, co-host of the Kinda Funny Games podcast, acknowledged that AMD and Intel could have filled the competitive gap if memory were available. But he pointed to the structural reality: “If they’re making three times the money and the stockholders are three times happier, then yeah, I do think that they will abandon gaming despite it being what got them there. There’s a clear favorite. If you’re playing on PC, you’re going to want an Nvidia card.”