SpaceX Will Start Making GPUs Soon, And They Could Potentially Compete With Nvidia
The ongoing memory shortage has thrown a wrench into the computing market, for consumers and companies alike. The shortage is generally attributed to the massive increase in hardware demand from advancing AI systems, and there's not much to do but wait it out and hope things improve, though one YouTuber has decided to make his own RAM to combat the challenges. SpaceX is now proposing something similar, albeit with significantly more resources.
As part of a $1.75 trillion IPO listing, the company revealed in a Form S-1 registration reviewed by Reuters that it plans to make its own GPUs rather than relying entirely on traditional manufacturers like Nvidia. Don't get your hopes up just yet: it first plans to invest billions in the production infrastructure and materials the processors will need. The resulting silicon will go to in-house assemblies, however. There has been some loose conjecture about exactly what the chips are for, but they will likely serve Tesla's AI and automation goals.
Elon Musk did make one thing clear: SpaceX, xAI, and Tesla will jointly develop Terafab, an advanced chip manufacturing complex for AI applications, set to operate in Austin, Texas. Musk also announced that the new facility will use Intel's 14A process to make chipsets for cars, robots, and space-based data centers, though GPUs were not directly mentioned as a focus.
$1.75 trillion is obviously a substantial amount of money to raise, and this S-1 disclosure is intended to caution investors and indicate where some of the money might go. As SpaceX reportedly put it: "There can be no assurance that we will be able to achieve our objectives with respect to Terafab within the expected timeframes, or at all."
Why does this matter and what does it mean for the market as a whole?
It's too soon to say with certainty how this will affect the market, or whether it will alleviate any shortages. If the Terafab plan comes to pass and xAI and Tesla develop AI GPUs in house, it may have some impact. But it's important to remember that those chips still require raw materials before assembly, and the materials have to come from somewhere.
The current situation with RAM availability has widely affected computing and consumer electronics, and a ton of gadgets will jump in price during the shortage. Many have already seen increases, like the numerous Samsung phones and tablets getting a price hike. The SpaceX facility could help reduce demand for conventional GPUs, but it will still have growing needs of its own for manufacturing supplies and raw materials. And while GPU prices have increased in recent years, RAM availability seems to be the bigger driver of instability lately, and Tesla's AI ambitions are unlikely to have any impact on that.
There's also the point that while these reports do mention "GPUs" as the focus of in-house development, there's no clarification on what that actually means. Tesla doesn't follow consistent naming conventions, and Musk has previously called Tesla's AI5 processor a "GPU," even though it isn't meant for conventional graphics processing. That doesn't change the demand per se, but it also doesn't help identify which chipsets Terafab will be making: general-purpose processors or traditional graphics chips. Both are silicon-based but use different architectures, and GPUs typically have larger dies, so they require more silicon per chip and tend to cost more to manufacture than CPUs.
As SpaceX notes in the S-1 registration, the company doesn't have long-term contracts with suppliers and will "continue sourcing a significant portion" of its hardware from third parties.