xAI to run three Grok Build models in training simultaneously by this weekend
Elon Musk posted today that xAI will have three Grok Build models in training simultaneously by this weekend. Grok Build is xAI's coding-focused model line, a detail confirmed by the official Grok account, which added that the parallel runs are designed to "test different approaches, iterate quicker, and deliver bigger leaps sooner." No further specifications were shared: no parameter counts, architecture details, or release timelines.
Training a frontier model is not a background process. A single training run at this scale ties up tens of thousands of GPUs for weeks to months and costs hundreds of millions of dollars in compute. Frontier labs — OpenAI, Anthropic, Google — typically run one major training job at a time, occasionally two when a smaller research model overlaps with a flagship run. Three simultaneous runs is an aggressive signal about compute capacity and capital deployment.