Will infra bottlenecks sour LLM dreams?
By Ms Srividya Kannan, Founder and Director of Avaali Solutions Pvt Ltd; article published on the Financial Express website.
As India accelerates its efforts to build a homegrown large language model (LLM), a critical question emerges: is the country's computing infrastructure geared up to support such AI workloads? After all, training large AI models requires massive processing capabilities, an area where the US and China are currently leading with their advanced supercomputing infrastructure.
What about data quality? While India generates humongous volumes of data, most of it is unstructured and requires extensive cleaning and organisation before it can be useful for AI training. Research and funding also remain critical roadblocks. Industry experts point out that while there is a rapidly growing talent base, India needs sustained investment, better industry-academia collaboration, and ongoing AI research initiatives to help push the boundaries of innovation.
To be sure, the ₹10,370-crore IndiaAI Mission’s multi-pronged approach — spanning initiatives such as IndiaAI Compute Capacity, IndiaAI Innovation Centre, IndiaAI Datasets Platform, and AI-led skill development — will provide the foundation for scalable, responsible, and inclusive AI growth. “Prioritising sovereign AI infrastructure, including data centres and semiconductor growth will enhance India’s self-reliance,” said Sunil Gupta, co-founder, CEO and MD, Yotta Data Services. He feels that AI-optimised infrastructure is no longer optional – it is a necessity. To meet growing demands, data centres must invest in high-performance computing (HPC), specialised GPU (graphics processing unit) clusters and efficient cooling technologies to support large-scale AI training and inferencing.
“AI workloads demand high uptime, scalable datacentre and cloud infrastructure capable of supporting hyperscale storage and high-performance computing. As AI adoption accelerates across industries, the need for AI-ready infrastructure is growing exponentially,” Gupta said. The IndiaAI Mission, in his opinion, is playing a key role in building India’s AI infrastructure by taking a strategic, market-driven approach. Instead of investing directly in data centres and managing GPU operations, the initiative empowers the service provider industry to take on the risks and invest in high-performance GPUs. These providers then offer GPU-based AI services on a cloud model, ensuring scalability and operational efficiency.
Srividya Kannan, founder and CEO, Avaali, said AI training, especially for large-scale models, demands extensive computing power, high-capacity GPUs, and energy efficiency. While some Indian data centres are equipped with modern infrastructure, they are not yet widely optimised for AI workloads, which require specialised hardware such as Nvidia A100 GPUs or TPUs (tensor processing units).
AI hardware is not only compute-intensive but also energy-intensive. Data centres must improve energy efficiency and adopt sustainable practices, especially in a climate-conscious era. “Large-scale AI training often involves handling massive datasets. The data centres need to enhance both their storage capabilities and interconnectivity to meet these requirements,” Kannan stressed.
Sridhar Pinnapureddy, founder & CEO, CtrlS Datacenters, said that to catch up in the global GPU race, India must prioritise the development of a robust GPU ecosystem. This includes expanding access to GPU resources, scaling up data centres with cutting-edge GPUs, and fostering collaborations between global GPU manufacturers and local tech companies. Additionally, India needs to improve its research into GPU architectures tailored to AI workloads. This will require investments in hardware design and AI applications.
“To truly handle the demands of AI, data centres need to evolve by integrating AI-optimised hardware. This includes specialised GPUs, FPGAs (field-programmable gate arrays), and other accelerators designed to speed up model training and inference. Moreover, data centres must address issues like power efficiency, cooling and bandwidth, which are crucial for running complex AI models at scale,” said Pinnapureddy.
In addition, data centres need robust frameworks to protect sensitive data and proprietary AI models, requiring advanced physical security measures and sophisticated cybersecurity protocols that align with global standards. Pinnapureddy felt that this is particularly crucial given India’s data localisation policies, which have created new dynamics in the market.
Sharad Agarwal, CEO, Sify Infinit Spaces, highlighted that indigenous AI development demands a strong talent pipeline across areas such as AI-ready infrastructure, semiconductor design, AI hardware, and chip fabrication. “India is the talent capital. But disruptive innovation requires better talent management and research-oriented education. The academia and industry will need to collaborate to develop opportunities,” he said.