As competition in artificial intelligence (AI) heats up, Sam Altman, chief executive of Microsoft-backed OpenAI, is making an important trip to India. The visit comes on the heels of China’s budget-friendly DeepSeek surpassing ChatGPT in popularity on the US App Store just a week ago. OpenAI has also unveiled o3-mini, its most cost-efficient reasoning model yet, which pushes the limits of what smaller models can accomplish, alongside a ‘Deep Research’ tool that lets ChatGPT carry out intricate, multi-step research tasks by sifting through vast amounts of online information. This is Altman’s first visit to India since 2023, when he met Prime Minister Narendra Modi. His itinerary also includes South Korea, Japan, the UAE, Germany, and France, a sign of OpenAI’s intent to partner with fast-growing markets where AI has yet to take firm root.
Altman’s arrival coincides with India’s intensified focus on AI through the launch of the IndiaAI Mission. Altman is in India to advocate for OpenAI to play a role in the country’s national AI agenda. Since the emergence of China’s DeepSeek, which disrupted the industry with its low-cost offering, OpenAI has found itself up against a significantly smaller competitor. OpenAI is also facing a copyright lawsuit in India filed by a coalition of publishing houses and media organizations.
In remarks made in New Delhi on Wednesday, Altman pointed out that India is OpenAI’s second-largest market, with a user base that has tripled in the past year.
Union Minister for Electronics and IT Ashwini Vaishnaw also held discussions with Altman on India’s strategy for building a comprehensive AI ecosystem. After their meeting, Vaishnaw shared on X: “Had an exciting conversation with Sam Altman regarding our strategy for developing the entire AI stack – GPUs, models, and applications. We are eager to collaborate with India on all three aspects.”
During their talks, Vaishnaw highlighted India’s capability to create cost-effective AI models, drawing parallels to the nation’s cost-efficient space missions. “Our country successfully sent a mission to the moon at a fraction of the cost compared to many other nations. So why can’t we build a model that costs significantly less than those produced by other countries? Innovation will lower costs across various applications, including healthcare, education, agriculture, weather forecasting, disaster management, and transportation—there are multiple areas we are exploring,” he explained.
Altman supports India’s ambition for foundational AI models
During his conversation with Vaishnaw, Altman sought to clarify remarks he made at The Economic Times Conversations in 2023, where he had suggested that Indian startups would struggle to build foundational models because of high computing costs and that trying to compete with OpenAI was “futile,” a remark he maintains was taken out of context.
Altman elaborated: “That statement was made at a specific moment when I believed, and still believe, that staying at the forefront of foundational models is extremely expensive. However, one of the most exciting developments is our improved understanding of small models and reasoning models; they can be challenging to train and still pricey, but it is achievable. I believe this will catalyze an explosion of innovative creativity. India ought to lead in this area.
“Costs of models can be viewed in two distinct ways. We anticipate that while the cost of staying at the frontier will keep rising along an exponential curve, the returns in intelligence, in terms of the economic and scientific value generated, will also grow exponentially.
“We’re engaged in a major project called Stargate that is set to grow rapidly. At the same time, the cost of achieving a given level of intelligence appears to fall roughly tenfold each year. Historically, chip advances delivered about a 2X improvement every 18 months, which, compounded over a few decades, transformed the world.
“What’s fascinating about AI model costs is their extraordinary decline. Yet, it is uncertain if this will lessen the global demand for AI hardware, as the reduced costs could lead to increased usage across various applications. The overall financial investments in AI will surely escalate, which presents a thrilling prospect.”
Can India replicate China’s AI advancements?
Despite India’s deep pool of tech talent, China’s recent AI advances, particularly DeepSeek, have raised questions about India’s ability to keep pace in a field where innovation is moving rapidly. Developing its own cutting-edge AI technology is crucial for India, not just socially and economically, but also strategically.
Over the past year, India’s AI debate has split into two camps. One faction argues for building indigenous foundational models from the ground up, while the other favors smaller language models with fewer parameters, tailored to specific applications. India is now more optimistic about its ability to create its own foundational models.
Through the government’s IndiaAI Mission, launched last week, India will offer the world’s cheapest AI compute, at under $1 per hour for the high-end chips that power generative AI.
The government plans to spur the development of local language models by academia and industry through funding and other support, Vaishnaw said, an effort aimed at strengthening India’s capability in building foundational models. Proposals for model development will be invited soon, and at least six startups and developers are expected to be selected to build such models within the next ten months, as first reported by ET on January 23, underscoring India’s commitment to indigenous foundational models.
“The true value will be derived from two components—algorithmic efficiency and the caliber of training datasets,” Vaishnaw commented, noting that DeepSeek has demonstrated to the world that cost-effective models are achievable.
“DeepSeek utilized 2,000 GPUs in its training,” Vaishnaw stated. “We currently have 15,000 high-end GPUs. ChatGPT version 1 from OpenAI was trained using around 25,000 GPUs. This gives us a significant computational resource that will undoubtedly strengthen our ecosystem.” GPUs are the specialized chips that handle the heavy computation involved in developing AI systems. Vaishnaw expressed confidence that India is not late to the AI race and will play an influential role in global innovation in the field. With the government subsidizing computational resources, he asserted, the necessary models will follow.
As India sheds its hesitance about building the complete AI stack, not just GPUs and applications but also foundational models, collaboration with Altman’s OpenAI, a pioneer in the field, could accelerate the country’s ambitions.
(With inputs from TOI)