Chamath Palihapitiya · 31.7.2025
Energy is the big bottleneck for AI. In both physical and software AI, it's not just that the ingredients need to change but the entire recipe does as well:

1) We need infinite and marginally costless energy. This will mean an ensemble of energy sources working together that can produce energy TODAY (i.e. nuclear isn't really an option before 2032, and building a natural gas or coal plant with a multi-year backlog for parts isn't a near-term option until 2030+), which means we will need solar + storage because it can go online 12-17 months from being greenlit. No way around it.

2) But for energy storage to scale economically in light of the Foreign Entity of Concern (FEOC)/Prohibited Foreign Entity (PFE) rules, you will need to find domestic LFP cathode active material (CAM) providers for the ESS supply chain. There are very few.

3) You will need to step down the overall power footprint of data centers, which means HVAC needs to be rethought: an entirely new kind of heat pump must be invented. This new device, while having a superior profile, will also need to eliminate the forever chemicals that are now outlawed and must be sunset.

4) The chips themselves need to be re-architected for performant, power-efficient inference. Memory design, chip-to-chip (c2c) interconnect, cabling, and all manner of other design decisions that work well for training won't likely scale for inference if inference is 100x+ bigger than training.

5) In physical AI, after storage (see above), abundant rare earths (REs) are essential for any form of motion/actuation. But getting REs out of the ground, into an oxide, then into an alloy that can be made into permanent magnets is a huge exercise in energy.

And the list goes on and on.

My point is that if you are focused on AI, you should start paying attention to energy, as it will be the gatekeeper of progress/change over the next few years in AI.
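The "100x+ bigger than training" point can be made concrete with a back-of-envelope calculation: if you run 100x the compute on today's chips, the power draw scales 100x too, so any realistic grid budget forces a large per-inference efficiency gain. The numbers below (a 500 MW training fleet, a grid budget of 10x today's draw) are illustrative assumptions, not figures from the post.

```python
# Back-of-envelope sketch: how much more energy-efficient must chips get
# so that 100x the compute fits inside a fixed power budget?
# All inputs are illustrative assumptions.

TRAINING_FLEET_MW = 500      # assumed: today's training fleet power draw
INFERENCE_MULTIPLIER = 100   # from the post: inference 100x+ training

def required_efficiency_gain(training_mw: float,
                             multiplier: float,
                             power_budget_mw: float) -> float:
    """Factor by which per-chip energy efficiency must improve so that
    `multiplier` times more work fits inside `power_budget_mw`."""
    naive_mw = training_mw * multiplier  # same chips, 100x the work
    return naive_mw / power_budget_mw

# Suppose the grid can realistically supply 10x today's training draw:
gain = required_efficiency_gain(TRAINING_FLEET_MW, INFERENCE_MULTIPLIER,
                                power_budget_mw=10 * TRAINING_FLEET_MW)
print(f"Chips must be ~{gain:.0f}x more energy-efficient per inference")
# → Chips must be ~10x more energy-efficient per inference
```

Even under a generous assumption that the grid grows 10x, the chips still have to close the remaining 10x gap, which is the re-architecture argument in point 4.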