Relevance: GS Paper III (Science & Technology: IT & Artificial Intelligence) | Source: IndiaAI Mission / MeitY / February 2026 Reports
1. The News: A Big Win for “Made in India” AI
In a major step for India’s tech sector, a Bengaluru-based startup called Sarvam AI has released two powerful Large Language Models (LLMs): one with 35 billion “brain connections” (parameters) and the other with 105 billion.
While most well-known AI systems, such as ChatGPT, are made in the USA, these models are special because they are designed specifically to understand Indian languages and our local culture.
2. Simple Concepts: How it Works
To understand this for your exam, think of these three simple terms:
- Large Language Models (LLMs): These are AI systems (like Gemini or ChatGPT) that have “read” massive amounts of text to talk and think like humans.
- Parameters: Think of these as the “memory cells” or “connections” in an AI’s brain. Usually, more parameters mean the AI is smarter.
- Mixture of Experts (MoE): This is a clever design. Instead of using the whole brain for every small question, the AI only wakes up the “expert” part needed for that specific task.
- Benefit: This saves a lot of electricity and makes the AI run much faster on cheaper computers.
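The MoE routing idea above can be sketched in a few lines of Python. This is a toy illustration only: the expert count, the top-k value, and the byte-based scoring “gate” below are all invented for the demo, whereas a real MoE model learns its gating network from data.

```python
# Toy sketch of Mixture-of-Experts (MoE) routing.
# Illustrative only: expert count, top-k, and the byte-based gate
# are made up for this demo; real LLMs learn the gate from data.

NUM_EXPERTS = 8   # the model's "brain" is split into 8 expert blocks
TOP_K = 2         # only 2 experts "wake up" for any given token

def gate_scores(token: str) -> list[int]:
    # Deterministic stand-in for a learned gating network:
    # score each expert from the token's UTF-8 bytes.
    data = token.encode("utf-8")
    return [sum(data[i::NUM_EXPERTS]) % 97 for i in range(NUM_EXPERTS)]

def route(token: str) -> list[int]:
    # Pick the TOP_K highest-scoring experts; the others stay idle,
    # which is where the electricity savings come from.
    scores = gate_scores(token)
    return sorted(range(NUM_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]

active = route("नमस्ते")
print(f"Experts activated: {sorted(active)} out of {NUM_EXPERTS}")
print(f"Fraction of the model doing work: {TOP_K / NUM_EXPERTS:.0%}")
```

Because only 2 of the 8 experts run per token in this hypothetical setup, the model does roughly a quarter of the computation a “dense” model of the same size would, which is the efficiency benefit described above.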
3. Why does India need its own AI?
- Language Accuracy: Global AI often struggles with the “feel” and grammar of Indian languages. Sarvam’s AI is built to be accurate in Hindi, Tamil, Bengali, and more.
- Data Safety (Sovereignty): By training AI in India with Indian data, we ensure our private information stays under Indian laws and doesn’t sit on foreign servers.
- Lower Costs: Global AI models are expensive for Indian businesses to use. Local models are designed to be “light” and affordable for our economy.
4. How the Government is Helping
The IndiaAI Mission is providing the “fuel” for these startups:
- Supercomputers (GPUs): The government allowed Sarvam AI to use its massive cluster of 4,096 GPUs (powerful chips) to train these models.
- Public Infrastructure: Through the National Compute Bank, the government is making sure that Indian startups have the “computing power” to compete with global giants.
UPSC Value Box
| Feature | Details for Your Notes |
| --- | --- |
| The Goal | Aatmanirbharta (Self-Reliance) in Artificial Intelligence. |
| The Tech | Using MoE (Mixture of Experts) to make AI “Green” and efficient. |
| Sovereign AI | Technology that understands the Indian context and keeps data local. |
The “Token” Problem: Currently, typing in Indian languages “costs” more on global AI because those models break our words into too many small pieces (tokens), and usage is billed per token. Sarvam’s models address this, making AI much cheaper for the average Indian user.
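The extra “cost” is easy to see even at the raw encoding level. Each Devanagari character takes 3 bytes in UTF-8 while each English letter takes 1, so tokenisers built mostly around English text tend to split Hindi words into many more units. A minimal Python check (this compares character and byte counts only; how a real tokeniser such as BPE splits text varies by model):

```python
# Compare the raw encoding size of an English word and a Hindi word.
# Each Devanagari character is 3 bytes in UTF-8 vs 1 byte per ASCII letter,
# so byte-oriented tokenisers see far more units per Indic word.
english = "Hello"
hindi = "नमस्ते"  # "Namaste": 6 Unicode characters

print(english, "->", len(english), "characters,", len(english.encode("utf-8")), "bytes")  # 5 characters, 5 bytes
print(hindi, "->", len(hindi), "characters,", len(hindi.encode("utf-8")), "bytes")        # 6 characters, 18 bytes
```

A six-character Hindi word occupies more than three times the bytes of a five-letter English word, which is why models tuned for Indian scripts can serve the same text with far fewer tokens.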
Summary
Sarvam AI’s new 105B parameter model is a breakthrough for India’s Digital Sovereignty. By using a “smart” architecture that saves power and focuses on Indian languages, it shows that India is moving from being a “user” of technology to a “creator” of world-class AI.
One Line Wrap: India is building its own digital mind to ensure our future is multilingual, secure, and self-reliant.
UPSC Mains Question
“For a diverse country like India, developing indigenous AI models is not just a technological achievement but a necessity for cultural and data sovereignty.” Discuss. (10 Marks, 150 Words)
Model Hints
- Intro: Mention Sarvam AI’s new models as a step toward Sovereign AI under the IndiaAI Mission.
- Body: Explain why global models often fail in the Indic language context.
- Discuss the importance of keeping Indian data on Indian servers (Data Sovereignty).
- Highlight how MoE architecture makes AI affordable and sustainable for a developing economy.
- Conclusion: Conclude that owning the “brain” of the digital age is essential for Viksit Bharat @2047.


