SOTA, Western open models ✅
Very large models ✅
New dimension to scale up intelligence ✅
Built for under $3.5M 🤯

Been awesome to work with the team since @benchmark's investment last November. Special thanks to @adityaag @southpkcommons for introducing us!
Drishan Arora · 1 Aug, 01:02
Today, we are releasing four hybrid reasoning models (70B, 109B MoE, 405B, and 671B MoE) under an open license. These are among the strongest LLMs in the world, and they serve as a proof of concept for a novel AI paradigm: iterative self-improvement, in which AI systems improve themselves. The largest model, the 671B MoE, is among the strongest open models in the world. It matches or exceeds the performance of both the latest DeepSeek v3 and DeepSeek R1 models, and it approaches closed frontier models like o3 and Claude 4 Opus.
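The tweet names the paradigm but not its mechanics. A common framing of iterative self-improvement is iterated distillation and amplification: spend extra inference-time compute to get better answers, then train the model to produce those answers directly, and repeat. Below is a minimal toy sketch of that loop; every name and number here is a hypothetical illustration, not Deep Cogito's actual code, training recipe, or API.

```python
# Toy sketch of an iterative self-improvement loop (amplify -> distill).
# All classes, functions, and constants are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class Model:
    """Stand-in for an LLM; 'skill' is a toy proxy for capability."""
    skill: float = 1.0

    def answer(self, prompt: str, compute: int = 1) -> str:
        # Amplification idea: spending more inference-time compute
        # (longer reasoning chains, search, self-critique) yields
        # better answers than a single direct forward pass.
        quality = self.skill * compute
        return f"answer(quality={quality:.1f})"


def amplify(model: Model, prompts: list[str], compute: int = 4) -> list[tuple[str, str]]:
    """Generate improved answers by spending extra inference compute."""
    return [(p, model.answer(p, compute=compute)) for p in prompts]


def distill(model: Model, dataset: list[tuple[str, str]]) -> Model:
    """Train the model to reproduce the amplified answers directly,
    internalizing the gains into its parameters (toy update here)."""
    return Model(skill=model.skill * 1.5)  # placeholder for real training


model = Model()
prompts = ["prompt-1", "prompt-2"]
for step in range(3):
    dataset = amplify(model, prompts)  # think harder at inference time
    model = distill(model, dataset)    # bake the improvement into weights
    print(f"iteration {step}: skill={model.skill:.2f}")
```

In this framing, each iteration raises the quality of the model's direct answers, so the next round of amplification starts from a stronger base; the loop, not any single training run, is the new dimension of scaling.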