Rapid Fire
Sarvam AI Launches 'Vikram' LLMs
- 19 Feb 2026
- 3 min read
Bengaluru-based Sarvam AI has unveiled two new Large Language Models (LLMs) named 'Vikram' at the India-AI Impact Summit 2026.
- The launch comes amid OpenAI’s introduction of IndQA, a benchmark to assess how well AI models understand Indian languages and cultural contexts, reflecting OpenAI’s growing focus on India.
- Indigenous Development: The Vikram family comprises two models: a 30-billion-parameter (30B) model and a 105-billion-parameter (105B) model.
- Parameters are the internal variables or "brain cells" of an AI model learned during training; a higher parameter count generally indicates a model with greater complexity, reasoning ability, and capacity to handle nuanced tasks (a short illustrative sketch of counting parameters follows this list).
- Key Features & Capabilities
- Indian Language Mastery: Unlike global models such as GPT-4, which are trained primarily on English data, Vikram is built to excel across all 22 scheduled Indian languages with voice-first optimisation, making AI more accessible to the masses. It achieves this at a comparatively small 105B-parameter scale, roughly one-sixth the size of DeepSeek’s 600B R1 model.
- It addresses the "data scarcity" problem in Indic languages, allowing for accurate translation and content generation in local dialects.
- Open Source: The models will be released as open source, meaning developers and researchers can access the code/weights to build their own applications on top of Vikram.
- Training Infrastructure: Training LLMs requires immense computing power. The Vikram models were trained using GPUs (Graphics Processing Units) accessed through the IndiaAI Mission’s common compute programme, highlighting the success of public-private partnership.
- Under the IndiaAI Mission, Sarvam AI has been selected to build India’s first sovereign LLM ecosystem with an open-source 120B-parameter model for governance and public services.
- Besides Sarvam, Soket will develop a similar India-focused model for sectors like defence, healthcare, and education, while Gnani has launched its own model and Gan AI is building a 70B-parameter multilingual text-to-speech foundation model.
- Tribute to Science: The chatbot and models are named "Vikram" in honour of Vikram Sarabhai, the father of the Indian space programme.
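To make the idea of a parameter count concrete, here is a minimal PyTorch sketch that builds a toy network and counts its learned weights. The architecture is purely illustrative and unrelated to Vikram; models such as the 30B and 105B Vikram variants reach those counts by stacking many far larger transformer layers.

```python
# Minimal sketch (assumes PyTorch is installed) of what a "parameter count" means:
# the total number of learned weights and biases in a network.
# The toy architecture below is hypothetical and NOT Sarvam's model.
import torch.nn as nn

tiny_model = nn.Sequential(
    nn.Embedding(num_embeddings=32_000, embedding_dim=256),  # vocabulary lookup table
    nn.Linear(256, 1024),    # hidden projection
    nn.ReLU(),
    nn.Linear(1024, 32_000)  # output scores over the vocabulary
)

total = sum(p.numel() for p in tiny_model.parameters())
print(f"{total:,} learned parameters")  # roughly 41 million for this toy network
```

Scaling the same counting idea up, a 105B-parameter model simply has about 105 billion such learned values, which is why it demands far more memory and compute to train and serve.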
Read more: Sarvam AI and the Sovereign AI in India