Been having a fun chat today with @Shaughnessy119 on the energy requirements and constraints of AI and potential impacts on the timeline for AGI/ASI
My eyes were opened earlier in the week when I met with a friend who builds large-scale data centers; he told me power delivery for new builds is slated for 2028-2030 - that's a crazy long time in the world of AI
So it makes you really wonder, how tf do we continue the pace of AI innovation or even just keep up with China given the energy constraints?
Tommy did some good research and the numbers are mind-boggling:
GPT-3 used an estimated 1.3 GWh of energy to train
GPT-4 used an estimated 50-60 GWh to train
To train an AGI model, it may take 600,000+ GWh!
To put that in perspective, that's roughly 14% of the entire annual electricity generation of the U.S. (about 4,200 TWh)
Of course, these are just estimates and don't factor in any major innovations in energy production, but they offer a huge reality check on 1) what it could take, and 2) the implications for the timeline to reach AGI, given you can't just provision 600,000 GWh of new energy anytime soon
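As a rough sanity check on the numbers above, here's a minimal sketch. The U.S. annual generation baseline (~4,200 TWh, i.e. ~4,200,000 GWh) is my assumption from recent public totals, not from the thread; the training-energy figures are the estimates quoted above:

```python
# Back-of-envelope check on the quoted training-energy estimates.
gpt3_gwh = 1.3            # estimated GPT-3 training energy
gpt4_gwh = 55             # midpoint of the 50-60 GWh estimate
agi_gwh = 600_000         # speculative AGI-scale estimate quoted above
us_annual_gwh = 4_200_000  # assumed U.S. annual generation (~4,200 TWh)

# Generation-to-generation scale-up and grid share
scale_up = gpt4_gwh / gpt3_gwh          # ~42x from GPT-3 to GPT-4
grid_share = agi_gwh / us_annual_gwh    # ~0.14, i.e. ~14%

print(f"GPT-3 -> GPT-4 energy scale-up: ~{scale_up:.0f}x")
print(f"AGI estimate as share of U.S. annual generation: {grid_share:.0%}")
```

With that baseline the AGI estimate works out to about 14% of annual U.S. generation; the exact share obviously shifts with whichever generation figure you assume.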
This seems to be a very under-appreciated and under-discussed dimension of the AI race
Going to keep digging into this; it's probably worthy of a more in-depth report
BTW if you want to see the details of what ChatGPT had to say on this topic, here you go:
Also, this doesn't even factor in the exponential demand for inference
Those workloads can be distributed across smaller data centers where power requirements are lower, but it's still a pull from the grid
Add it all up and there's a huge bottleneck looming