Humble brag… They won’t believe a little girl from Kenya built the orchestration engine for this, but I did!
NIK · 25 Jul, 12:47
Google right now has ~1 million Ironwood v7 TPUs (in H100 equivalents) and 500K H100 GPUs online, with 400-600K NVIDIA Blackwell GPUs (GB200 NVL72 racks) incoming. That puts total compute capacity at roughly 2 million H100 equivalents by end of year.

They will spend $85 billion in 2025 (raised from a $75 billion forecast).

DeepMind CEO @demishassabis said yesterday: “We are scaling to the MAX our current capabilities. We are still seeing fantastic progress on each different version of Gemini.” Gemini is training on these TPUs right now.
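The “2 million H100 equivalents” figure is a back-of-envelope sum of the numbers in the post. A minimal sketch of that arithmetic is below; the GB200-to-H100 equivalence factor is an assumption added here for illustration (the post does not state one), chosen so the stated total comes out.

```python
# Back-of-envelope check of the "~2 million H100-equivalents" claim.
# All capacity numbers come from the post; the equivalence factor is assumed.

tpu_h100_equiv = 1_000_000        # Ironwood v7 TPUs, stated as ~1M H100-equivalents
h100_gpus = 500_000               # H100 GPUs already online
blackwell_low, blackwell_high = 400_000, 600_000  # incoming GB200 GPUs (range from the post)

# Assumption: count each incoming Blackwell GPU as ~1 H100-equivalent.
gb200_equiv_factor = 1.0

total_low = tpu_h100_equiv + h100_gpus + blackwell_low * gb200_equiv_factor
total_high = tpu_h100_equiv + h100_gpus + blackwell_high * gb200_equiv_factor

print(f"End-of-year estimate: {total_low/1e6:.1f}M to {total_high/1e6:.1f}M H100-equivalents")
# -> roughly 1.9M to 2.1M, consistent with the post's "2 million" round number
```

A higher per-GPU factor for GB200 would push the total above 2 million; the round figure in the post only holds under roughly one-to-one counting.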