Sarvam releases open-weight models debuted at AI Summit: How they compare with DeepSeek
Published on: March 8, 2026, 3:02 p.m. | Source: The Indian Express
The 30-billion- and 105-billion-parameter LLMs are reasoning models built from scratch on large-scale, high-quality datasets curated in-house, the Indian AI startup said in a blog post. Both models were trained on GPU (Graphics Processing Unit) compute made available under the Indian government-backed Rs 10,372-crore IndiaAI Mission.
