Why Nagarro CTO Believes DeepSeek Might be Good for Indian IT
https://analyticsindiamag.com/it-services/why-nagarro-cto-believes-deepseek-might-be-good-for-indian-it/ | 30 Jan 2025

If India can overcome the security issues with DeepSeek, the model’s low cost will play a big role in its adoption, Kanchan Ray said.

DeepSeek, the open-source AI model from China, has captured global attention, and the AI world hasn’t been the same since. While ChatGPT served as a wake-up call for India to build its own foundational models, DeepSeek seems to be the one that could actually turn that vision into reality – after more than two years of anticipation.

While Indian IT remains hesitant about building foundational models, Nagarro’s CTO Kanchan Ray acknowledged that DeepSeek brings capabilities that could be particularly relevant for Indian use cases.

Like most Indian companies, Ray too expressed scepticism about DeepSeek for Indian use cases, given its low trust factor. “I don’t have any well-formed opinion yet…but the interesting spin to this is [that] they have made it open source,” Ray told AIM.

He added that DeepSeek’s open-source nature is the first promising sign from the company. According to him, the world will use it, review it, form opinions, and then possibly build some use cases around it. 

“The accuracy is so good that it is on par with ChatGPT, Gemini, and Llama. This is very encouraging,” Ray said, adding that if India can overcome the security issues with DeepSeek, the model’s low cost will play a big role in its adoption.

Indian IT companies are not making LLMs, and the most important reason is the cost. “Not so many Indian companies could afford spending triple-digit million dollars into making LLMs, but now this (DeepSeek) is interesting,” Ray said. According to him, since DeepSeek was reportedly trained on a budget of around $5 million, Indian companies might finally have an opportunity to enter the field of building foundational AI models.

Pick a Model and Build a Service

Prompted by the announcement of the US’ $500 billion AI initiative, The Stargate Project, and China’s open-source “side project” DeepSeek, the Indian IT sector might finally build an AI model. Despite unwavering confidence in Infosys co-founder Nandan Nilekani’s vision of making India the AI use case capital of the world, the pressure and scrutiny from Indian observers might push service companies to build an LLM.

Honestly, it is not as if they are not building them. For example, Infosys said it has built four small language models and is in the process of building around 100 AI agents for its clients. This suggests Infosys has the funds and capabilities to step in and possibly spend close to $10 million on building a foundational model.

The same is true of TCS, whose CEO recently said that building LLMs from scratch does not offer any real benefit. K Krithivasan believes that building a foundation model for regional languages makes sense for democratising technology. 

CP Gurnani, former CEO of Tech Mahindra, told AIM that building a foundational model is important, and that is what led him to start the Project Indus initiative during his tenure. Meanwhile, Gurnani’s AI startup AIonOS has already said that it will work with DeepSeek’s model and provide it to its clients.

Since DeepSeek is also available as an API, with inference costs almost a ninth of OpenAI’s, Indian IT services might finally adopt the model to build services for their clients.
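
For a services firm, the integration surface is small: DeepSeek’s hosted API follows the same chat-completion format as OpenAI’s, so existing client code largely carries over. The sketch below illustrates the pattern with the openai Python client; the base URL, model name, and environment variable are assumptions to be checked against DeepSeek’s current documentation, not a vetted production setup.

```python
# Minimal sketch: calling a DeepSeek model through an OpenAI-compatible API.
# Assumptions: the `openai` package is installed, DEEPSEEK_API_KEY is set, and
# "deepseek-chat" / https://api.deepseek.com are the endpoint details in use.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],   # assumed environment variable name
    base_url="https://api.deepseek.com",      # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",                    # assumed model identifier
    messages=[
        {"role": "system", "content": "You are a support assistant for an enterprise client."},
        {"role": "user", "content": "Summarise this ticket in two lines: ..."},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)
```

Because the wire format matches what most teams already use, swapping a cheaper model into an existing chatbot or copilot backend is largely a configuration change, which is what makes the cost argument compelling for services work.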

India has the Skills

Ray said that along with cost, skills are also an essential factor when it comes to building LLMs in India. “We might not have created an LLM from scratch, but we have taken LLMs, fine-tuned them to our client’s context,” Ray said, adding that the same is true of the Indian IT firms.

If Indian IT services can build models for their clients, they definitely have the skills and workforce to build one from scratch.

Ray added that IT services firms are more focused on providing AI infrastructure as a service. That is why Nagarro also has its own platform, NIA, which hosts different frameworks and LLMs for its clients.

Comparing the DeepSeek wave with the metaverse and ChatGPT waves, Ray said he is concerned about how it will play out, given that no one talks about the metaverse anymore.

Nagarro strongly believes that 2025 is the year of agentic AI and has developed its own generative AI playbook. The firm has implemented this AI framework across various industries, including beauty, healthcare, and automotive. While the core algorithms for forecasting and decision-making remain similar, the fine-tuning is highly context-sensitive. 

So while the IT giants from India discuss building a foundational model, Ray is optimistic about what DeepSeek has to offer Indian IT firms: if not the motivation to build from scratch, then at least adoption of the model for their clients.

Telangana and Andhra Pradesh, the New Hotspot for IT and GCCs
https://analyticsindiamag.com/it-services/telangana-and-andhra-pradesh-the-new-hotspot-for-it-and-gccs/ | 28 Jan 2025

The Andhra Pradesh government has decided to provide land to the first 500 IT companies at very low rates compared to any other state in the country.

While Bengaluru continues to solidify its position as India’s IT and GCC hub, a status it is likely to maintain for years to come, neighbouring states are catching up fast. For decades, Bengaluru has benefitted from its abundant talent pool, which makes it an ideal choice for companies looking to set up offices and expand their presence. 

Karnataka is now being challenged by Andhra Pradesh and Telangana, both of which are trying hard to attract GCCs and Indian IT operations, making them the next-best options for firms.

Andhra Pradesh IT minister Nara Lokesh recently announced that TCS will establish its office in Visakhapatnam within three months. However, operations will initially commence from a temporary location, likely the Millennium Towers in Rushikonda.

The minister also shared updates on discussions with another IT giant, Cognizant, during his recent visit to the World Economic Forum (WEF) 2025 in Davos, hinting at positive developments to be announced soon. Lokesh stated that the government aims to create five lakh IT jobs in Visakhapatnam over the next five years. 

“TCS is looking for an ideal location to set up its permanent campus in the city. Once it is finalised, it may take two to three years for construction,” Lokesh said. “We are keen on exploring opportunities in deep tech, big data and AI. Our government has decided to provide land to the first 500 IT companies at very low rates compared to any other state in the country.”

HCLTech was also given a similar facility in Vijayawada and has been in expansion mode since. 

Hyderabad Takes it Forward

This is in line with HCLTech’s announcement of its global delivery operations in Hyderabad, which includes the launch of a new technology centre expected to generate 5,000 additional jobs. The announcement was made following a meeting between Telangana CM A Revanth Reddy, IT minister D Sridhar Babu, and HCLTech’s global CEO and MD C Vijayakumar at the WEF.

The new centre, which spans 3,20,000 sqft, will focus on delivering advanced cloud, AI, and digital transformation solutions to global clients in industries such as high-tech, life sciences, and financial services. 

Speaking on HCLTech’s expansion, Vijayakumar said, “Hyderabad, with its world-class infrastructure and high-quality talent pool, has been a key location on HCLTech’s global network. The new centre will bring cutting-edge capabilities to our global client base and contribute to the local technology ecosystem.”

Furthermore, Jayesh Sanghrajka, group CFO of Infosys, met with Telangana IT and industries minister Duddilla Sridhar Babu at the WEF. They announced that the firm plans to expand its campus in Pocharam on the outskirts of Hyderabad. 

This will add around 17,000 jobs to the campus. The new IT towers, to be constructed in the first phase at a cost of INR 750 crore, will be completed in the next two to three years. Infosys currently employs around 35,000 people at its Hyderabad campus.

Krishna Vij, VP of IT Staffing at TeamLease Digital, said that both Telangana and Andhra Pradesh, especially Hyderabad, have a solid pool of tech talent. “Telangana, in particular, stands out because of its large supply of fresh IT graduates and its position as a major tech hub, which draws in both domestic and global companies looking for skilled professionals,” Vij told AIM.

“These states have some of the top institutions like IIITs, NITs, and a wide network of engineering colleges, so there is a steady flow of STEM graduates entering the market,” Vij said, adding that proactive government policies, infrastructure development, and skilling programmes in these states make them attractive destinations for IT and GCC investments, further enhancing talent availability.

When it comes to GCCs, according to the Q3 report by ANSR, over 450 Forbes Global 2000 companies operate over 825 GCCs across the country, employing more than 1.3 million professionals. Hyderabad (110 GCCs and 1.90 lakh employees) follows Bengaluru (285 companies), attracting interest due to its infrastructure, talent availability, and business-friendly policies.

The Telangana government has signed a memorandum of understanding (MoU) with the US Chamber of Commerce’s US India Business Council, aiming to focus on collaboration in IT, AI, electronics and GCCs.

Andhra Leads with Tier-II & III Cities

Meanwhile, the Andhra government has introduced the Andhra Pradesh IT & GCC Policy 2021–24, aiming to incentivise and facilitate the establishment of prominent Fortune 500 companies within its borders. This policy focuses on creating a world-class electronics manufacturing infrastructure to transform the state into a hub for IT and electronics. 

Cities like Visakhapatnam have seen significant growth in the IT sector, with the establishment of IT Special Economic Zones and incubation centres such as the Sunrise Startup Village and Fintech Valley Vizag. These initiatives promote the city as a global fintech capital. The inauguration of Millennium IT Towers 1 and the planned Millennium IT Towers 2 further bolster the state’s IT infrastructure.

Telangana, similarly, has become a preferred destination for GCCs. Notably, pharmaceutical giant Eli Lilly announced plans to establish a new global capability centre in Hyderabad and hire over 1,000 professionals. SEI Investments is considering Hyderabad for a new GCC and intends to create high-skill engineering and financial jobs over the next three years. 

The Telangana government has been proactive in fostering IT growth beyond Hyderabad. 

Collaborations with organisations like ITServe Alliance aim to create 30,000 IT jobs in Tier-2 and Tier-3 towns, promoting balanced regional development. Additionally, the state has set up IT hubs in cities such as Warangal, Khammam, Karimnagar, and others, further decentralising IT growth.

“Tier-2 and Tier-3 cities stand to benefit significantly from the expansion of IT companies,” Vij said, adding that while salaries in these regions may be lower than in Tier-1 cities due to the lower cost of living, firms are expected to offer competitive pay for specialised roles like AI, data analytics, and cloud computing.

Both state governments are already working towards that goal. For instance, Lokesh emphasised efforts to provide IT companies in Vizag with better bus connectivity, street lighting, and police patrolling. He mentioned upcoming discussions with the finance minister on providing incentives to IT units, reaffirming the government’s commitment to creating a robust IT and GCC ecosystem.

Why Indian IT is Not Keen on Building AI Foundational Models
https://analyticsindiamag.com/it-services/why-indian-it-is-not-keen-on-building-ai-foundational-models/ | 26 Jan 2025

It makes sense for the industry not to spend on foundational models and to work with the ones already available instead.

While Indian startups and founders are catching up to the AI wave, prompted by the announcement of the $500 billion Stargate Project from the US and China’s open-source “side project” DeepSeek, the Indian IT sector remains unfazed. With unwavering confidence in Nandan Nilekani’s vision, they continue to focus on building India as the AI use case capital of the world.

Though tech giants like TCS, Infosys, Wipro, and HCLTech have started developing agentic AI frameworks and small language models, and are even working on drug discovery, their efforts remain focused on clients alone. These initiatives do not prioritise building foundational technologies for the country.

For example, Tech Mahindra built Project Indus, the only foundational model to emerge from an IT firm in India. However, its expected impact is largely on the company’s clients rather than the creation of transformative products like ChatGPT, Claude, or Gemini. The model, uploaded to Hugging Face by Tech Mahindra chief innovation officer Nikhil Malhotra, is open-source, but it has garnered very few downloads and is not available through Hugging Face’s Serverless Inference API.
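
Since the weights are public but no hosted serverless endpoint is offered, anyone evaluating the model has to load it locally. The snippet below is a rough sketch of that workflow with Hugging Face transformers; the repository ID is a placeholder, and the actual model card name, licence, and hardware requirements should be checked on Hugging Face first.

```python
# Rough sketch: running an open-weight model from Hugging Face locally, since no
# hosted Serverless Inference endpoint is available for it.
# "org-name/project-indus" is a placeholder repo ID, not the verified model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "org-name/project-indus"  # placeholder; replace with the real repository

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

prompt = "भारत में मानसून कब आता है?"  # example Hindi prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

That extra setup is modest but real, and it is exactly the kind of friction a hosted, pay-per-call endpoint would remove.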

Unfortunately, Indian IT is not very interested in building a foundational model. One can argue that even though the firms have enough funds to build one, they won’t do so unless their clients ask them to or a clear requirement emerges on their side.

What Needs to be Done?

When AIM asked industry experts if India should also build its own Stargate Project, the responses were mostly positive. Ajai Chowdhry, HCL co-founder and chairman of the Mission Governing Board of India’s National Quantum Mission, expressed concern over the growing global push to take control of AI.

“We seem to be getting to the weaponisation of tech. For strategic autonomy, we must create our own AI doctrine and have strong control over our data,” Chowdhry told AIM.

Several others expressed similar thoughts as soon as Aravind Srinivas, CEO of Perplexity, said that India should definitely focus on building its own LLM and that Nilekani was “wrong about India not requiring it to build one.”

The co-founders of Infosys are still debating whether they need one. One of them, Kris Gopalakrishnan, wrote on X that India needs to build its own foundation model for cultural and strategic economic reasons.

Meanwhile, Mohandas Pai, the former CFO of Infosys, earlier told AIM that Indian IT companies are services companies and are not focused on building AI products. According to him, the funding required to build something foundational is much higher than what is available in the country.

Pai called for a government-backed innovation fund similar to France’s $36 billion innovation fund, which supports startups like Mistral. Such funding, he argued, could enable India to produce foundational models and compete on a global scale.

This is similar to what K Krithivasan, CEO and MD of TCS, highlighted when he said that building LLMs has no huge advantage, as the cost outweighs the benefit. He added that since most organisations in India are system integrators, companies need to use products as software and ensure that clients receive the benefits.

At the same time, he also agreed that building it for regional languages makes sense for the democratisation of the technology.

CP Gurnani, former CEO of Tech Mahindra, told AIM that building a foundational model is important, and that is what led him to start the Project Indus initiative during his tenure. He said that India should build something like NVIDIA. 

How is the Work Progressing?

While AI startups like Sarvam and Krutrim are working with IT giants to provide language models for their clients, the firms are also interested in helping their clients build the models. However, instead of building one themselves, the ideal approach for IT firms may well be to offer a model like DeepSeek-R1 to their clients.

“The Indian path in AI is different. We are not in the arms race to build the next LLM; let people with capital, let people who want to pedal chips do all that stuff…We are here to make a difference, and our aim is to put this technology in the hands of people,” Nilekani said last year, back when the debate around building AI had started.

The upskilling of the Indian IT workforce in generative AI is a good sign of how firms are adopting the technology, but it says nothing about building a foundational model. It seems Indian IT has missed the generative AI bus, and no amount of funding can bring it back.

As Narayana Murthy said last year, India is only good at copying ideas from the West and applying them here. India’s tech sector continues to prioritise short-term gains from outsourced IT services rather than investing in creating globally competitive products. For Indian IT, it actually makes sense. 

The Death of Hard Drives at Data Centres
https://analyticsindiamag.com/it-services/10162180/ | 25 Jan 2025

‘Today, 80% of storage purchases are concentrated in a few companies, but that’s about to change.’

At the start of 2024, companies such as AdaniConnex, Reliance, Sify, Atlassian, Yotta, and AWS ramped up investments in India’s rapidly expanding data centre industry. AWS alone had announced a significant commitment of $12.7 billion to bolster its presence in the country.

According to a report, India’s colocation (colo) data centre capacity across its top seven cities reached 977 MW by the second half of 2023. This marks a substantial addition of 258 MW during 2023, a 105% year-on-year increase in capacity addition over 2022.

The country is poised for even greater expansion, with 1.03 GW of under-construction colo capacity slated for completion between 2024 and 2028. 

Against the backdrop of such exciting developments, Pure Storage India, an advanced data storage platform, makes a bold prediction: Hard drives will likely be phased out in three to five years. 

In an exclusive interview with AIM, Shawn Hansen, general manager/VP at Pure Storage, compared this transition to DVDs, which were once everywhere, but were quickly replaced by streaming media. 

“It felt like an overnight shift,” Hansen said, anticipating a similar trajectory for hard drives. He believes that flash storage is increasingly seen as the future, offering efficiency, speed, and sustainability that hard drives simply can’t match.

What is so Special About Pure Storage?

Pure Storage’s core mission is to create the most efficient and dense storage solutions. Hansen mentioned that the company started with a 2-terabyte Flash module and has now developed a 75-terabyte one, with plans for a 150-terabyte module soon. 

However, these upgrades come with no increase in power or cooling requirements. He explains that hard drives, by contrast, are limited by their mechanical nature and consume much more power. 

“AI is incredibly data-hungry, and traditional hard drives just are not viable anymore,” Hansen explains. Flash storage, which uses one-tenth the power of hard drives, is the clear winner.

This transition is especially crucial as data centres face the challenge of growing demand while managing costs and energy consumption.

Pure Storage’s flash technology allows companies to reduce power use, free up space, and lower cooling requirements. This efficiency translates to more capacity for AI workloads without the need to expand data centres.

Flash storage is not only efficient but also becoming more affordable. “We’re reaching price parity with hard drives, and that’s driving a dramatic shift in the market,” Hansen noted. Pure Storage is confident this shift will accelerate as vendors exit the hard drive market, making way for denser storage technologies like flash.
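
As a rough illustration of why the power argument matters at data centre scale, the back-of-envelope calculation below compares an all-HDD and an all-flash build-out for the same usable capacity. Every figure in it (drive sizes, watts per device, tariff) is an illustrative assumption, not a vendor specification.

```python
# Back-of-envelope comparison of continuous power draw for the same usable capacity.
# All numbers below are illustrative assumptions, not measured vendor figures.
usable_pb = 10                       # target usable capacity in petabytes

hdd_tb, hdd_watts = 20, 8            # assumed: 20 TB hard drive drawing ~8 W
flash_tb, flash_watts = 75, 6        # assumed: 75 TB flash module drawing ~6 W

hdd_count = usable_pb * 1000 / hdd_tb
flash_count = usable_pb * 1000 / flash_tb

hdd_kw = hdd_count * hdd_watts / 1000
flash_kw = flash_count * flash_watts / 1000

tariff = 0.10                        # assumed electricity cost in $ per kWh
hours_per_year = 24 * 365

for name, kw in [("HDD", hdd_kw), ("Flash", flash_kw)]:
    annual_cost = kw * hours_per_year * tariff
    print(f"{name}: {kw:.1f} kW continuous, ~${annual_cost:,.0f} per year in energy")
```

Even with these conservative assumptions the gap is large, and it widens further once cooling overhead, rack space, and drive replacement cycles are counted in.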

Why is Pure Storage in Bengaluru?

Hansen claimed that India, owing to its incredible pool of technical talent, is at the heart of this shift. Pure Storage credits Bengaluru, a city known for its startups and innovation, for having found some of its brightest engineers. “The talent density here is incredibly high,” he remarked. 

Over the past two years, the company has doubled its team size annually in Bengaluru, focusing on projects that require deep expertise in storage and AI. “The talent here is not just benefiting us but the global tech ecosystem. We are incredibly impressed with the leaders we have met here and look forward to continuing our partnership with this region,” he said.

Bengaluru is not just another location for Pure Storage. It is one of their three main innovation hubs, alongside Santa Clara in Silicon Valley and Prague in Europe. Each hub has its speciality, and Bengaluru plays to its strengths by tackling challenging storage and AI projects. 

“It’s not just about outsourcing,” Hansen shared proudly.

The company has developed technologies like DirectFlash, which makes storage solutions more efficient and reliable. “All our firmware for DirectFlash is developed in Bengaluru. The team here gives us a multiyear lead over competitors,” he said. 

Another key innovation is Fusion, which lets companies manage storage like a public cloud—simple, flexible, and scalable.

Pure Storage also works closely with major players like NVIDIA, collaborating on solutions like the DGX SuperPOD, which combines GPUs and storage for seamless AI infrastructure.

Besides, the company has secured a partnership with one of the top four hyperscalers (name undisclosed) to integrate flash storage into their data centres, hinting at a major milestone in the industry’s shift from hard drives.

Meta’s AI Research SuperCluster (RSC), one of the fastest AI supercomputers globally, is built to train next-generation AI models on massive datasets spanning petabytes. Powered by Pure Storage’s FlashArray and FlashBlade, RSC efficiently meets the GPU and storage demands while keeping operational costs low.

“We contacted a number of storage vendors of both disk and flash to evaluate their highest performance and highest density offerings. From a combination of performance and power and cost, we ended up selecting Pure Storage,” Vivek Pai, AIRSC storage lead at Meta, remarked.

“Today, 80% of storage purchases are concentrated in a few companies, but that’s about to change,” Hansen explained. Pure Storage is ready to provide the storage solutions needed to power this next phase of AI.

India IT Focuses on AI While Headcount Declines
https://analyticsindiamag.com/it-services/india-it-focuses-on-ai-while-headcount-declines/ | 22 Jan 2025

The IT industry is expected to add less than a fourth of the 60,000 people it added in the previous fiscal year by the end of FY2025. 

With attrition hitting 13%, India’s largest IT firm, TCS, reported a net decrease of 5,370 employees in Q3 FY25, bringing its total workforce to 607,354. This decline follows two consecutive quarters of headcount growth, marking a significant shift in the company’s employment trends. 

Despite this, Milind Lakkad, chief HR officer, shared that the firm has had over 25,000 promotions in Q3, bringing the total for the year to over 110,000. Lakkad said the company wants to get on track with its campus hiring goal of 40,000 people this year.

Meanwhile, TCS reported that its clients are actively investing in generative AI and agentic AI while building robust data foundations. The firm has started working on AI agents and drug discovery for its clients. This stands in contrast to the firm’s upskilling initiatives, since many employees have yet to be trained in AI.

No AI Washing

This headcount reduction is not limited to TCS. India’s top five IT firms—Infosys, Wipro, HCLTech, Tech Mahindra and TCS—collectively experienced a decline for the seventh straight quarter. In the first quarter of FY25 alone, these companies saw a combined reduction of 2,034 employees compared to the previous quarter.

Tech Mahindra, which reported the highest YoY profit growth of 93%, also saw a headcount decline of 3,785. Meanwhile, the firm has actively invested in generative AI and is the only big Indian IT company that has built its own sovereign LLM and framework.

The company plans to hire 6,000 freshers this fiscal year, with 3,000 already onboarded in the last two quarters.

Wipro, which has also decided to shift its focus to agentic AI with $1 billion worth of deals, saw a decline of 1,157 employees. Chief Srini Pallia said that the company wants to experiment in emerging areas like customer service and supply chain management. 

Though it sounds ideal, this is also a reason for the decline, as AI agents can reduce reliance on the human workforce.

Wipro also plans to hire around 12,000 freshers this fiscal year.

Despite the hiring, the IT industry is expected to add less than a fourth of the 60,000 people it added in the previous fiscal year by the end of FY2025. 

What’s to Learn?

Though the individual reductions in headcount were high, the overall decline this quarter was limited because Infosys and HCLTech increased their employee counts by 5,591 and 2,134, respectively. Both firms did this even as they made several announcements about their AI plans this quarter.

Salil Parekh, CEO and MD of Infosys, revealed that the firm had built four small language models for its clients and was also building over 100 AI agents. “We are clear about what we are doing in generative AI,” Parekh said. He added that Infosys was not ‘AI washing’ like some others in the industry, but instead, it was doing real generative AI work. 

Similarly, HCLTech said that its AI Force platform had already been adopted by over 20 clients, with plans to scale to over 100 clients by FY26. HCLTech is also exploring agentic AI for employee productivity. 

Infosys is also planning to increase its fresher intake to 15,000 in FY25. Additionally, it announced rolling out a 6-8% hike in February. HCLTech also plans to hire around 7,000 freshers. The two companies show that a focus on AI can be achieved without reducing headcount.

About 20-25% of fresher hiring in the IT sector is now targeted towards AI skills. This is an uptick from a dismal 5-10% over the last three years. But this has also resulted in the reduction of team size in the firms, affecting the overall headcount.

All of this comes amid the debate around working 90 hours a week, sparked by L&T chairman SN Subrahmanyan, which, according to many, is a recipe for attrition.

Infosys Builds 4 Small Language Models, To Develop 100 Client-Focused AI Agents
https://analyticsindiamag.com/it-services/infosys-builds-4-small-language-models-to-develop-100-client-focused-ai-agents/ | 16 Jan 2025

Salil Parekh said that Infosys is not doing ‘AI washing’ as others in the industry might be and, instead, doing real generative AI work. 

Infosys posted strong financial results for Q3 FY25, highlighting growth in revenue, operating margin, and free cash flow. CEO and MD Salil Parekh attributed the performance to the company’s strong positioning in digital services and its growing enterprise AI capabilities, particularly in generative AI and AI agents.

“In generative AI, we have built four small language models for banking, IT operations, cyber security and broadly for enterprises in generative AI,” Parekh said. He added that the company is developing over 100 new agents for its clients, many of which are already in use. 

“We are very clear about what we are doing with generative AI,” Parekh said. He added that Infosys is not doing ‘AI washing’ as others in the industry might be and, instead, doing real generative AI work. 

“We have several discussions with clients where they would like to use the small language models that we have built…They are built by using the proprietary data that we have. Some clients are asking us to build a small language model of their own.”

For AI agents, Parekh explained that Infosys has built a research agent for its clients which is not just a PoC. “They are now using that in their product area to support how queries are looked at, and where their own people and their own customers can use this agent,” Parekh said, adding that these agents are able to reduce the work of 18 days to eight days.

The company reported revenue of $4.94 billion, reflecting growth of 6.1% year-on-year (YoY) and 1.7% quarter-on-quarter (QoQ) in constant currency terms.

The global IT giant reported a net profit of $804 million for the quarter, which demonstrates robust financial performance. 

“We continue to strengthen our enterprise AI capabilities, particularly focusing on generative AI, which is witnessing increasing client traction,” Parekh further said. “This has led to another quarter of strong large deal wins and an improved deal pipeline, giving us greater confidence as we look ahead.” 

The company secured $2.5 billion in large deal total contract value (TCV) with 63% net new deals, growing 57% sequentially.

Headcount also increased for the second consecutive quarter, currently standing at 3,23,379. “We have had a strong hiring in Q3 with the addition of over 5,000 employees,” Parekh added. Infosys is on track to onboard 15,000 to 20,000 freshers at the group level in FY25.

Indian IT Reporting Strong Generative AI Pipeline

When Accenture reported its earnings for the first quarter of fiscal 2025 (ending November 30, 2024), the company set a record of $1.2 billion in generative AI bookings, reflecting growing client investments in this space.

In FY24, the company reported $4.2 billion in GenAI bookings overall. It was expected that Indian IT firms would follow suit. However, even TCS, L&T, HCLTech, and now Infosys, in all of their recent earnings calls, shied away from revealing the revenue from generative AI, citing several reasons like “AI is now part of every deal”. While all of them said that they are working on GenAI and AI agents, the exact details remain unknown.

For example, in its latest Q3 FY25 earnings call, TCS reported that its clients are actively investing in generative AI and agentic AI while building robust data foundations. K Krithivasan, CEO and MD, said that TCS has been working on a drug discovery project with a client who was able to identify 1,300 molecules and further filter them to 12 molecules. 

Moreover, HCLTech announced during its Q3 FY25 earnings call that it is advancing its GenAI strategy with an aim to integrate AI services into 100 clients by FY26.

Similarly, in the last quarter, Infosys finally revealed that it is working on small language models, building multi-agent frameworks, and doing incredible work with generative AI. The exact numbers gained from this work, however, remained undisclosed. 

Parekh added that the company believes small language models will provide clients with a powerful tool, allow them to build business logic on top of it and unlock new potential. “This combination will form the foundation of the small language model, which is being tailored for different industry applications,” he added.

Infosys is Doing a Lot of Generative AI

Just a few hours before the result, Infosys had launched a suite of AI-driven features for the Australian Open (AO) 2025 in partnership with Tennis Australia. As per the company’s press release, this marks a major milestone in their seven-year collaboration, which aims to enhance the tennis experience through technology. 

Last month, Infosys also launched a Google Cloud Centre of Excellence at its Bengaluru campus to drive enterprise AI innovations. The initiative, powered by Infosys Topaz, aims to help businesses use generative AI for transformative growth.

Furthermore, as promised earlier, Infosys is finally becoming an AI-first company. For instance, at the Microsoft Building AI Companions for India event in Bengaluru, Infosys CTO Rafee Tarafdar said that developers have been using GitHub Copilot for over a year, and that it now has about 20,000 users generating nearly a million lines of code every few weeks.

Beyond internal work, at Meta’s Build with AI Summit held in Bengaluru last year, Infosys announced a partnership with Meta to utilise the Llama stack, a collection of open-source large language models and tools, to build AI solutions across industries.

As an early adopter of Llama 3.1 and 3.2 models, Infosys is integrating these models with the in-house AI platform Infosys Topaz to create tools that deliver business value. One example of such a tool is a document assistant powered by Llama that improves the efficiency of contract reviews.
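
As an illustration of what such a document assistant looks like in code, the sketch below prompts an open-weight Llama-family instruct model to flag risky language in a contract clause. It is a minimal pattern sketch rather than Infosys Topaz itself; the model ID, prompt, and output handling are assumptions (Llama weights on Hugging Face are gated and require accepting Meta’s licence), and a production review tool would add retrieval over the full contract and human sign-off.

```python
# Minimal sketch of a contract-review assistant on an open-weight Llama model.
# Assumptions: access to the gated "meta-llama/Llama-3.1-8B-Instruct" repo has been
# granted and a suitable GPU is available; this is a pattern, not Infosys Topaz.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # assumed model ID; any instruct model works
    device_map="auto",
)

clause = (
    "The Supplier may revise pricing at any time with 7 days' notice, "
    "and the Customer waives all rights to terminate for convenience."
)

messages = [
    {"role": "system", "content": "You are a contract-review assistant. "
     "List risky clauses and explain each risk in one sentence."},
    {"role": "user", "content": f"Review this clause:\n{clause}"},
]

# Recent transformers versions return the full chat, with the model's reply last.
result = generator(messages, max_new_tokens=200)
print(result[0]["generated_text"][-1]["content"])
```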

AI Agents are Basically RPA with LLMs
https://analyticsindiamag.com/it-services/ai-agents-are-basically-rpa-with-llms/ | 13 Jan 2025

The rebranding of advanced RPA as agentic AI is often a marketing move to capitalise on AI's hype.

Every company, big tech and startups alike, is betting on agentic AI becoming the biggest trend in the near future. As attention moves on from small language models, everyone is likely to talk increasingly about implementing AI agents in their workflows.

Recently, NVIDIA CEO Jensen Huang suggested that all IT departments would evolve into HR for AI agents. Microsoft CEO Satya Nadella similarly highlighted that there would be a swarm of AI agents in the workforce, likening the shift to the rise of Robotic Process Automation (RPA) in recent years. However, it didn’t turn out exactly as predicted.

It was predicted that RPA would automate most of the mundane jobs, which would allow teams to focus on larger tasks. Several industry experts now predict that AI agents are going through the same phase as RPA. 

Nikhil Malhotra, chief innovation officer at Tech Mahindra, on an episode of What’s the Point with AIM, pointed out that while a lot of startups would be talking about agentic AI this year, most of the tech would just be RPA. “But the good thing about this would be that these startups will start thinking about agentic loops.”

For instance, when Anthropic released its computer use feature with Claude 3.5 Sonnet, it could move the cursor, click buttons, and type text, as well as fill out forms, navigate websites, and interact with software programmes. This agentic approach has left many wondering about the potential implications for RPA companies and the future of agentic AI and whether it will meet the same fate.

“Wasn’t RPA the exact same thing without an LLM, and it failed miserably?”

Given the hype around AI agents, the question of their capabilities in the workforce needs to be examined deeply. The current frameworks seem very similar to RPA but with an LLM in the loop. Though that makes a huge difference, adapting them to the workflow still looks like adopting a 10-year-old technology.

It is predicted that the $250 billion SaaS market will be replaced by the $300 billion AI agents market as companies adopt AI agents in their workflows. However, given the huge price difference, people are still not convinced that moving away from current systems to agentic ones is worth it.

Moreover, all RPA companies are also entering the AI agent race. Apart from Salesforce, companies like UiPath and Automation Anywhere have started leveraging AI agents because they believe that both offerings are different. This means that RPA is now actually being upgraded to agentic AI, and not much has changed.

While speaking with AIM, Param Kahlon, EVP and GM of automation and integration at Salesforce, earlier said that autonomous agents also do not mean the end of RPA technology. 

“RPA agents were designed to automate repetitive, tedious tasks, such as transferring data between systems when APIs aren’t involved. In contrast, autonomous agents process information more like humans, adapting to situations and making decisions based on changing conditions, enhancing efficiency and effectiveness in workflows.”

Ramprakash Ramamoorthy, director of AI research at ManageEngine and Zoho, told AIM that the discussion around agentic AI systems in enterprise IT is becoming increasingly polarised. He said that for enterprises, the shift from RPA to agentic AI opens a new era of self-directed operations, which enables faster scaling and better responses to evolving business needs.

“Agentic AI is more than just RPA with LLMs; it’s a transformative evolution that combines automation with intelligent decision-making. While traditional RPA executes predefined tasks, agentic AI learns, reasons, and adapts in real-time, elevating process automation with cognitive flexibility,” Ramamoorthy said. 

Agentic AI is RPA 2.0

“Majority of agentic applications are basically workflow automation with some minimal amount of people interactions,” tech YouTuber Shailesh, who runs channel SV Techie, wrote on X. He explained that there might be a reduction in headcount amongst companies, but nothing like the autonomous hype that is being sold.

Anil Kumar, CTO at Exotel, told AIM that calling agentic AI just RPA with LLMs is as unconvincing as saying C++ classes are just C structures with methods. While RPA deals with structured data, agentic AI aims to achieve automation by using LLMs to interpret decision trees.

Taking the example of complex human conversations such as loan negotiations, Kumar said that they cannot be expressed as a decision tree. “Agentic AI like the one used in our bots will work backwards from the goal given to them (which is to negotiate and disburse loans) and navigate the nuances of human conversation,” Kumar said. 

“They achieve this by carrying the context of the current conversation, learnings from previous conversations, information from a knowledge base, contractual constraints from a legal document, etc. and make decisions towards achieving the given objective.” He added that if this is implemented as RPA along with LLM, it will be like “giving a script to a child actor who will fluster on stage if others go off script”.
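
The structural difference the two camps are arguing about is easy to see in code: a classic RPA bot is a fixed script of steps, while an agentic system runs a loop in which a model picks the next action against a goal, working memory, and the available tools. The sketch below is purely illustrative; fake_llm and the tool functions are hypothetical stand-ins for a real model call and real integrations, not any vendor’s framework.

```python
# Illustrative contrast: a fixed RPA-style script versus an LLM-driven agent loop.
# fake_llm and the tools are hypothetical stand-ins so the sketch runs end to end.

def rpa_style_bot(record):
    """Fixed sequence of steps; it breaks the moment the process deviates."""
    steps = ["open_crm", "copy_fields_to_erp", "send_confirmation_email"]
    return [f"{step}({record['customer_id']})" for step in steps]

def agent_loop(goal, tools, call_llm, max_steps=5):
    """Goal-driven loop: the model picks each action from the goal, memory and tools."""
    memory = []                                   # working memory of past observations
    for _ in range(max_steps):
        decision = call_llm(goal, memory, list(tools))
        if decision["action"] == "finish":        # the model judges the goal is met
            return decision["result"]
        observation = tools[decision["action"]](**decision.get("args", {}))
        memory.append((decision["action"], observation))
    return {"status": "stopped", "memory": memory}

# Stand-ins so the example executes without external services.
def fetch_history(customer_id):
    return {"customer_id": customer_id, "on_time_payments": 34}

def fake_llm(goal, memory, tool_names):
    # A real system would prompt an LLM here; this stub decides in two steps.
    if not memory:
        return {"action": "fetch_history", "args": {"customer_id": "C-104"}}
    return {"action": "finish", "result": f"Offer drafted for goal: {goal}"}

print(rpa_style_bot({"customer_id": "C-104"}))
print(agent_loop("Negotiate a loan top-up within the approved rate band",
                 {"fetch_history": fetch_history}, fake_llm))
```

The RPA function can only replay its script, while the loop can consult a knowledge base, respect contractual constraints, and stop when the objective is met, which is the distinction Kumar describes above.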

Between 2018 and 2023, AI integration into RPA solutions steadily evolved, enhancing RPA’s functionality with sophisticated AI capabilities. The true breakthrough, however, came with the emergence of agentic AI in 2024.

Andreessen Horowitz, in its thesis posted in November last year, pointed out that AI will automate operations and eat the world of RPA. The end of traditional RPA is widely discussed in the industry.

Deepak Dastrala, CTO and partner at IntellectAI, told AIM that RPA focuses on automating repetitive, rule-based tasks, which makes it a tactical solution. AI agents, on the other hand, take a more goal-based approach and act more like digital twins of humans, powered by LLMs, equipped with memory, and able to adapt and learn in real time. 

“That’s why RPA’s relevance has faded, while AI agents are poised to reshape our work at a level no other automation technology has achieved,” he said.

“The era of cargo cult programming to churn out generic, modular software is dead and buried. In 10 years, RPA and agent studios will be relics of the past. Instead, we’ll see specialised agents, each uniquely designed for specific industries to solve problems end-to-end,” said Arnav Bathla, CEO of Layerup.

Agentic AI can be viewed as RPA 2.0. The rebranding of advanced RPA as agentic AI is often a marketing move to capitalise on AI’s hype. Vendors position their products as “intelligent agents” to differentiate them from traditional RPA, despite the underlying functionality being a continuation of process automation.

The fundamental objective – automating repetitive tasks for efficiency – remains unchanged. However, the fate of agentic AI might end up the same as that of RPA if it fails to evolve and address its limitations.

Dear L&T, This is a Recipe for Attrition
https://analyticsindiamag.com/ai-features/dear-lt-this-is-a-recipe-for-attrition/ | 10 Jan 2025

L&T chairman SN Subrahmanyan faces backlash for suggesting employees should work 90 hours a week, including Sundays.

In June last year, Larsen & Toubro (L&T) made headlines for grappling with an acute manpower shortage across its businesses. Chairman SN Subrahmanyan, popularly known as SNS, said that the company needed around 45,000 engineers and techies. An attrition rate of 10% was said to be a contributing factor. 

L&T has made headlines once again, and this time, SNS broke the internet with his viral video. In it, he was seen asking employees to work 90 hours a week, including Sundays—a move that could only compound the company’s attrition and staff shortage issues.

During an employee interaction, Subrahmanyan said he would be happier if he could make them work on Sundays as well. “What do you do sitting at home? How long can you keep looking at your wife? Come on, get to the office and start working,” he added. 

Drawing comparisons with China’s intensive work culture, he said, “If you want to be on top of the world, you have to work 90 hours a week.”

Facing Brickbats

Netizens have reacted sharply to his extreme work expectations, erupting in a flurry of memes, jokes and posts online.

A Reddit user commented, “So unfortunate, we have such business leaders! I think we must call them “leaders in baby diapers” :) I had a few close friends who worked at L&T Madras. About 10 years ago. Going by what they said about the work culture, I felt it was like an adults’ kindergarten.”

Another added: “L&T came to my college for placements, offering a CTC of 6 LPA, and they expect us to work 90 hours a week for that? This really highlights the sad state of labour laws in India and the mindset of some Indian chairmen and CEOs. It’s honestly ridiculous.”

A former L&T employee, Karthik Madhavapeddi, deputy editor at IndiaSpend, had this to say: “I just saw a news report in which L&T chief SN Subrahmanyan (SNS to employees) is quoted saying he wants employees to work 90 hours a week.”

He added that having worked at L&T Construction from 2010 to 2013, “I can say this reflects the typical mindset of someone with a background in construction. On-site, we had 6.5-day workweeks, with hours stretching from 8:30 am to 8:30 pm Monday through Saturday, and up to 1 pm on Sundays. The only exceptions were projects where the client’s operations didn’t permit such extended hours.”

Further, he said that labourers and workmen were compensated with overtime pay for the extra three hours. Employees, however, were not.

“When I attended an internal interview for the management trainee programme at the corporate office in Mumbai, the interviewers didn’t seem to grasp why such long hours were necessary on-site,” Madhavapeddi added.

He said that SNS may have brought the same “construction culture” into the corporate.

Narayana Murthy in the Mix

Last year, Infosys co-founder Narayana Murthy kicked up a storm with his “70-hour week” remark. Commenting on development and nation-building, Murthy said, “India’s work productivity is one of the lowest in the world… my request is that our youngsters must say, ‘This is my country. I’d like to work 70 hours a week’.”

Bollywood actress Deepika Padukone also took to social media, connecting SNS’s remarks to mental health.

In a LinkedIn post, Sanjay Sehgal, chairman & CEO at MSys Technologies, explained that Indian workers, on average, worked significantly longer hours than their global counterparts. 

According to the International Labour Organisation, the average Indian worker aged 15 and above clocks 47.7 hours per week. This is higher than in countries like the US (36.4), the UK (35.9), Germany (34.4), and even Asian countries like China (46.1), Singapore (42.6), and Japan (36.6).

He further claimed that gig industry workers, such as those working for UrbanCompany, Swiggy, Zomato, Ola, and Uber, put in 11-12 hours a day, often totalling over 70 hours a week. This includes labourers, electricians, and plumbers, who spend long hours but often lack growth opportunities or fair pay. 

Young workers, aged 16 to 25, are increasingly involved in gig work like driving taxis, delivering food, or renting bikes, which provide limited benefits or career progression.

Additionally, employees in IT and corporate sectors often face expectations of being available round-the-clock for calls and emails to ease collaboration with global teams. “But now, despite knowing the effects of long working hours, pushing employees to work for 70 hours a week sounds unjust and brutal,” said Sehgal.

L&T, however, rushed in to defend its chairman. A company spokesperson said, “Nation-building lies at the heart of our mission. For over eight decades, we have been shaping India’s infrastructure, industries, and technological capabilities. We believe this is India’s decade – a period calling for collective commitment and effort to drive growth and realise our shared vision of becoming a developed nation.”

The Dark Side of AI Upskilling in Indian IT
https://analyticsindiamag.com/it-services/the-dark-side-of-ai-upskilling-in-indian-it/ | 9 Jan 2025

“Effective upskilling isn’t about ticking boxes; it’s about formats that truly engage like interactive modules, case studies, hackathons, and real-world applications."

Indian companies are looking to upskill around 85% of their workforce with generative AI in FY25. This marks one of their largest investments in measuring the ROI of AI, yet the quality of the training offered remains questionable. 

A former employee of Infosys, who wanted to maintain anonymity, told AIM that when their company partners with the likes of NVIDIA, Google, or Microsoft for training their employees in generative AI, the outcome is not that great since the actual training is done in partnership with smaller firms offering entry-level courses at cheaper rates.

He explained that most of the training programmes involve presentations and slides where employees simply need to press ‘next’ a number of times, answer a few multiple-choice questions whose answers can easily be found on Google, and certify themselves as ‘GenAI Trained’.

Therefore, these programmes can sometimes be more attractive for firms, as they can “train” their employees in generative AI while paying a lower price and coming across as better service providers to clients.

When it comes to the big-tech partnerships for training, the employee said the target is usually to make the workforce “GenAI ready” over two to three years, which is a long time, and most firms do not even have concrete plans ready yet.

He explained that this is because most Indian IT companies are focused on providing services rather than building products for their clients. The employees’ tasks mostly involve working with chatbots or copilots and helping their clients, which does not require much knowledge of generative AI.

The employee added that there are always smarter ways to train engineers in generative AI, like letting them experiment with tools like Cursor or GitHub Copilot. However, companies are sceptical of such tools and hesitant to test them as their code bases are proprietary. 

But this is slowly changing, as some companies have started partnering with GitHub and other AI tool providers to enable their employees to experiment with the tools.

Emails sent to Infosys and TCS did not elicit any response. 

Clicking ‘Next’ is All You Need

This was also revealed earlier by a user on X, who said that even though around 900,000 employees have been trained in generative AI, the depth and quality of that training remain questionable.

“A friend of mine works at one of the largest IT companies in India, and she just completed a GenAI course in an hour by clicking the next button 100s of times. She is now part of a GenAI-ready workforce! Proud of her :).”

Even though many companies offer in-depth courses for their employees, the trained workforce remains underutilised. “Such certifications and credentials are bogus, rather a waste of time and effort,” said a user on X. “There is a huge gap between the demand and the quality of candidates that exist in the AI job market, especially for GenAI.”

The number of employees trained in India is close to 2.5 lakhs as of this month. Most of these are from Indian IT companies where generative AI training is mostly compulsory to sit through, even though it does not offer any benefits for them. There’s an upskilling mania of sorts currently sweeping across the tech industry.

Today, it is generative AI skills. In the future, it might be something else. “IT changes so rapidly that you have to learn until the end of your career,” said a user on Reddit. With the advent of every new technology, be it Python, cloud, or AI, long-standing Indian IT employees need to keep upgrading to keep up.

Just Another Trend?

“Just like with the metaverse c**p the other year, we have to let companies know that we have some level of competency in generative AI,” said a user on Reddit. 

Mrinal Rai, assistant director and principal analyst at ISG, told AIM that Indian IT clients will prioritise the extent to which service providers need a workforce trained in key AI technologies. This suggests that having a well-trained team is seen as a competitive advantage and a deciding factor for clients when selecting a service provider. 

Rai added that smaller AI firms currently don’t have much influence or recognition among clients regarding their ability to meet training needs. As a result, clients would prefer larger or more established firms for AI training and implementation. 

That is probably why all the bigger IT firms are rushing to call their workforce ‘GenAI ready’, even if that requires minimal training, to appeal to clients. 

What Needs to Change

Krishna Vij, VP of IT hiring at TeamLease Digital, told AIM that upskilling programs in the Indian IT industry have evolved in recent years, focusing on emerging technologies like AI, cloud, and data science. 

“The best programs prioritise hands-on learning and problem-solving. While large companies are investing in structured platforms, there’s still room to tailor programs to specific roles and ensure they go beyond basic knowledge for competence building and driving efficiency,” Vij said.

Vij said that training programs facilitated as ‘click-through’ sessions clearly reveal a design gap, prioritising compliance over actual skill-building. 

“Effective upskilling isn’t about ticking boxes; it’s about formats that truly engage like interactive modules, case studies, hackathons, and real-world applications,” she said. Vij added that many companies are making significant investments in this direction by providing employees with the opportunities to work on live AI projects, which ensures practical learning that translates into meaningful competence and career relevance.

Satya Nadella, on his recent visit to Bengaluru, highlighted that Microsoft was committed to upskilling 10 million people in AI by 2030. Now, this is definitely something to keep an eye out for.

How Indian IT Partnered with AI Startups in 2024
https://analyticsindiamag.com/it-services/how-indian-it-partnered-with-ai-startups-in-2024/ | 7 Jan 2025

Infosys, TCS, Tech Mahindra, Wipro, and HCLTech have all partnered with several AI startups over the last year.

While some companies build AI solutions in-house, others rely on partnerships with startups and other tech companies to provide AI services for their clients. Indian IT firms, in particular, predominantly rely on startups and big tech as they believe that investing too much capital in building products is not their niche. 

In 2024, leading Indian IT firms like Infosys, TCS, Tech Mahindra, Wipro, and HCLTech significantly advanced the integration of generative AI into their offerings, providing small language models and building multi-agent systems for their clients. Much of this is being achieved by partnering with AI startups based in India.

Infosys Leads the Way

Infosys has been at the forefront of this AI integration. Chairman Nandan Nilekani has predicted that while companies will increasingly develop their own specialised AI models to enhance efficiency and productivity, service companies will adopt them for their clients.

On the sidelines of the Meta Build with AI Summit, an Infosys representative told AIM, “We are building industry-wide solutions. For market research, we have built a proof-of-concept, and internally, we are leveraging many Llama models for our AI-first journey.”

In addition to these partnerships with Meta, Microsoft, and even NVIDIA to build models for its clients, Infosys announced a partnership with Sarvam AI in October to build small language models for banking and IT operations: Infosys Topaz BankingSLM and Infosys Topaz ITOpsSLM.

Infosys also completed the acquisition of a 100% stake in in-tech for €450 million, a deal it announced in its Q4 FY24 report in April to expand its R&D capabilities.

The Infosys Innovation Fund also recently placed a bet on 4baseCare, a Bengaluru-based precision oncology startup that is pushing to beat cancer with genomics and AI.

TCS and Wipro Invest in Technology Acquisitions

TCS has been involved in significant AI projects, which have contributed to its revenue growth. In June, along with Infosys, Tech Mahindra, and HCLTech, the IT giant partnered with Yellow.ai to build AI solutions. These collaborations aim to use Yellow.ai’s platform to enhance HR and customer service automation solutions.

In March, TCS announced a strategic partnership with SymphonyAI to create predictive and generative AI business applications. SymphonyAI specialises in providing enterprise AI SaaS for financial crime detection and related verticals. Come June, the IT firm also partnered with Xerox to improve business outcomes when migrating from legacy data centres to the Azure public cloud and integrating generative AI services.

In November, Wipro partnered with RELEX Solutions, a prominent provider of integrated supply chain and retail planning solutions. The aim is to use Wipro’s deep expertise in the retail and consumer packaged goods (CPG) industries alongside RELEX’s AI platform to deliver improved capabilities for demand forecasting, supply chain efficiency, and operational planning.

This came after Wipro announced its alliance with Cyble for AI-driven cybersecurity risk management solutions in August while working closely with its GCC in India.

Moreover, Wipro is investing to strengthen its capabilities across the organisation and making bold moves in mergers and acquisitions (M&A). Acquisitions such as Capco and Rizing have boosted its consulting capabilities and simplified the company’s operating model, Wipro chief Srini Pallia said during the Q4 2024 earnings call. 

Tech Mahindra’s Love for Startups

Under the leadership of CEO Mohit Joshi, Tech Mahindra is aiming to grow its banking and financial services revenue share to 25% by 2027. Joshi plans to leverage his BFSI expertise to drive this expansion and sees generative AI as an opportunity for the IT sector and the company’s workforce. 

In 2024, Tech Mahindra announced several partnerships with AI startups. In April, the company announced a collaboration with Spain-based Atento to deliver generative AI-powered solutions and services to enterprises worldwide. 

In May, Tech Mahindra partnered with Zapata Computing Holdings Inc. to enhance Zapata AI’s quantum-based generative AI solutions. The collaboration is meant to scale Tech Mahindra’s capabilities, offering ‘Scale at Speed’ solutions to drive operational efficiency and improve responsiveness for global Communication Service Providers (CSPs).

In August, Tech Mahindra joined forces with LivePerson, a leader in digital customer conversations, to revolutionise customer engagement in the financial services and healthcare and life sciences (HLS) sectors. 

In the same month, Tech Mahindra also collaborated with Horizon3.ai to develop cybersecurity solutions. The partnership integrates Horizon3.ai’s NodeZero platform, which offers autonomous threat detection, AI-powered penetration testing, and Governance, Risk, and Compliance (GRC) insights into Tech Mahindra’s cybersecurity services.

In September, Tech Mahindra also partnered with Discai, a subsidiary of the KBC Group, to provide an AI-powered anti-money laundering (AML) solution. 

HCLTech Provides Solutions to Others

HCLTech is establishing a generative AI technology education and training centre built on IBM’s platform. In August, the company, along with TCS, announced a partnership with Xerox for AI and digital engineering services. 

Most recently, in November, HCLTech partnered with Inspeq AI, an Irish-Indian technology firm, to help enterprises worldwide responsibly develop and integrate AI applications. The alliance aims to embed a responsible AI (RAI) layer into application development, seamlessly integrating with existing toolchains like Microsoft Copilot, AWS Bedrock, and other business applications. 

In June, the company collaborated with Tecnotree to develop 5G-led generative AI solutions for telcos. The partnership aims to combine HCLTech’s AI-based communications technology with Tecnotree’s 5G- and AI-based BSS platform. 

Apart from investing in and partnering with AI startups, HCLTech also offers clients services it has built in partnership with big firms like Intel, SAP, IBM, AWS, Microsoft, and GitHub.

Other Mid-Sized Firms

Mid-sized IT firms are rapidly adopting AI through strategic investments and acquisitions.

LTIMindtree has invested $6 million in Voicing.AI, focusing on enhancing conversational, contextual, and emotional intelligence across over 20 languages. The company also partnered with GitHub to train its workforce. 

Meanwhile, Mphasis has launched NeoCrux to improve developer productivity and acquired Silverline to enhance customer experience with conversational AI. Persistent Systems, addressing AI-related privacy concerns, acquired Arrka to strengthen its privacy management platform.

Smaller firms like Happiest Minds and Hexaware have also made AI acquisitions, positioning themselves for the agentic AI revolution. Happiest Minds partnered with MindSculpt in April and with Soroco in February to enhance its AI solutions.

The post How Indian IT Partnered with AI Startups in 2024 appeared first on Analytics India Magazine.

]]>
The H-1B Visa Policy Change Might be Good News for Indian IT https://analyticsindiamag.com/it-services/the-h-1b-visa-policy-change-might-be-good-news-for-indian-it/ Fri, 03 Jan 2025 14:00:00 +0000 https://analyticsindiamag.com/?p=10160721

Apple, Amazon, Google and Microsoft have been more significantly impacted by the tightening of H-1B policy than the top H-1B petitioners, including Cognizant, Infosys, TCS and Wipro.

The post The H-1B Visa Policy Change Might be Good News for Indian IT appeared first on Analytics India Magazine.

]]>

The debate over the H-1B visa has intensified ahead of US President-elect Donald Trump’s inauguration ceremony on January 20. For Indians working at home and planning to move abroad, changes to the policy have become a topic of utmost importance, particularly in the Indian IT sector.

According to a report by Macquarie Research, the ongoing challenges in hiring local talent in the US highlight the critical role of H-1B visas in addressing the talent gap. The US has a persistent shortage of qualified technical workers, a gap made worse by a tight labour market, with unemployment at just 2.9% in November last year.

This suggests that proposed reforms to H-1B visa policies could benefit Indian IT firms. Despite their relatively low share of the total visa positions available – capped at around 65,000 every year, plus 20,000 for those who receive higher education from US universities – Indian IT firms would still be able to fill those positions. 

Indian IT companies rely heavily on the H-1B visa program to deploy skilled workers in the US. 

Macquarie has decided to maintain an “outperform” rating for TCS, Infosys, and HCLTech, which suggests that the firm believes Indian IT companies are well-positioned for this change. 

Meanwhile, Mrinal Rai, assistant director and principal analyst at ISG, told AIM that the majority of the Indian IT providers were already preparing and reducing their dependency on the H-1B by investing in hiring more local talent. 

“Given India’s vast talent pool, service providers have made significant strides in closing the skill gaps. Besides, many providers have offered upskilling programs to support local talent,” Rai said. 

Rai added that global big-tech companies are impacted more than Indian IT companies. “Technology providers such as Apple, Amazon, Google and Microsoft have been more significantly impacted by the tightening of H-1B policy than the top H-1B applicants, including Cognizant, Infosys, TCS and Wipro,” he said.

Flat Wage Floor Concerns

Macquarie has raised concerns about a proposal to introduce a flat wage floor for H-1B visa holders. A flat wage is a fixed salary that doesn’t adjust for differences in job roles, locations, or cost of living. The report highlighted that such a measure might not be feasible, given the wide disparities in the cost of living across the US.

To address these issues, Macquarie suggested transforming H-1B visas into general work permits, similar to Norway’s skilled work permit model. This shift would allow greater job mobility for visa holders, promote competition among employers, and enhance worker protections.

If such changes are implemented, they would largely benefit the Indian IT workforce applying for visas, as the new rules would account for regional economic disparities and the complexities of the global talent market.

Krishna Vij, VP of IT hiring at TeamLease Digital, told AIM that larger IT firms are likely to adapt more effectively to the policy change due to their advanced training programs and operational flexibility. “However, smaller and mid-tier firms could benefit if the policy promotes a more equitable distribution of visas, as these companies typically submit fewer applications.”

She added that the firms that have invested in localised operations and US-based hiring will gain a competitive edge, which will reduce their dependence on H-1B visas. This includes firms like Infosys and TCS, which might navigate the changes more effectively. Their experience in managing visa allocations and compliance makes them adaptable to new regulations with minimal disruption.

On the other hand, smaller IT firms or those heavily reliant on H-1B visas may encounter greater challenges. “Mid-tier and smaller IT companies may face heightened challenges due to increased competition for visas, especially if wage-based prioritisation becomes a deciding factor,” Vij said. 

She said that this will result in many smaller firms accelerating their localisation strategies, focusing on hiring US-based talent to mitigate visa risks. “The quota limit will intensify competition among Indian IT companies and force them to submit higher numbers of applications to secure the required slots.”

This might also result in companies prioritising hiring in other countries, like Canada or Mexico, for nearshore delivery centres. “These strategies will help mitigate risks, but they might also increase short-term costs,” Vij added.

Furthermore, past compliance issues and fines have prompted Indian IT firms to enhance their adherence to regulations. For example, Infosys had earlier agreed to pay millions of dollars in fines for US visa violation cases.

Already a Decline

Data from the National Foundation for American Policy (NFAP), a US-based non-partisan think tank, revealed a decline in H-1B visa approvals for Indian IT companies for FY2024. The seven largest Indian IT firms secured approval for just 7,299 new H-1B petitions – a significant drop from 14,792 approvals in 2015.

This is a roughly 50% decline in approvals. Furthermore, the 7,299 approvals represented only 5.2% of all H-1B visa approvals for FY24, which equates to just 0.004% of the US civilian workforce. Denial rates for H-1B applications were low at 2.5% – slightly down from 3.5% in FY23, as per NFAP’s findings.
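
As a quick sanity check, the roughly 50% decline and the implied pool of total approvals can be reproduced with back-of-the-envelope arithmetic (an illustrative calculation, not part of the NFAP report):

```python
# Back-of-the-envelope check of the NFAP figures cited above (illustrative only).
new_petitions_fy24 = 7_299       # approvals for the seven largest Indian IT firms
approvals_2015 = 14_792

decline = 1 - new_petitions_fy24 / approvals_2015
print(f"Decline since 2015: {decline:.1%}")            # ~50.7%

share_of_all_approvals = 0.052   # the firms' 5.2% share of all FY24 approvals
total_fy24_approvals = new_petitions_fy24 / share_of_all_approvals
print(f"Implied total FY24 approvals: {total_fy24_approvals:,.0f}")  # ~140,000
```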

For example, Amazon topped the list of H-1B approvals for initial employment in FY24, with 3,871 approvals – down from 4,052 in FY23 and 6,396 in FY22. Cognizant followed with 2,837 approvals, Infosys with 2,504, and TCS with 1,452. For smaller firms, approvals account for less than 0.1% of the total.

Meanwhile, a separate report also revealed that Indian IT giants TCS, Wipro, Infosys, and HCL have reduced their reliance on the H-1B visa by 56%. These figures reflect a continued downward trend in H-1B visa approvals for Indian IT firms, but the trend is expected to reverse under the proposed policies, or at least not worsen.

The post The H-1B Visa Policy Change Might be Good News for Indian IT appeared first on Analytics India Magazine.

]]>
This Indian Startup is Making ‘Black-Box’ AI Models Spill the Beans https://analyticsindiamag.com/it-services/this-indian-startup-is-making-black-box-ai-models-spill-the-beans/ Thu, 26 Dec 2024 03:30:00 +0000 https://analyticsindiamag.com/?p=10147775

This effort aligns with emerging regulations like the EU’s AI Act, which mandates explainability for high-risk AI applications.

The post This Indian Startup is Making ‘Black-Box’ AI Models Spill the Beans appeared first on Analytics India Magazine.

]]>

As models grow in size, one crucial element remains elusive—explainable AI. The larger they grow, the harder it becomes to understand their inner workings. And when we consider models of 405B parameters and beyond, it becomes extremely difficult to understand how they arrive at specific outcomes. 

Vinay Kumar Sankarapu, CEO of Arya.AI, perfectly encapsulates the challenge: “Capabilities are fine—you can say your model is 99.99% accurate, but I want to know why it is 0.01% inaccurate. Without transparency, it becomes a black box, and no one will trust it enough to put billions of dollars into it.” 

His statement cuts to the core of the black box dilemma: trust demands understanding.

The Problem is Beyond Hallucinations

A recent study by the University of Washington revealed significant racial, gender, and disability biases in how LLMs ranked job applicants’ names. The research found that these models favoured names associated with White individuals 85% of the time, and names perceived as Black male were never preferred over White male names. 

This study highlights the complex interplay of race and gender in AI systems and underscores the importance of considering intersectional identities when evaluating AI fairness.

When we spoke to Mukundha Madhavan, tech lead at DataStax, about model hallucination and model size, he said, “Foundation models, their training, and architecture—it feels like they are external entities, almost victims of their own complexity. We are still scratching the surface when it comes to understanding how these models work, and this applies to both small and large language models.” 

He added that size doesn’t matter. Whether a model boasts 40 billion or 4 billion parameters, these are just numbers. The real challenge is to make sense of these numbers and understand what they represent.

Sankarapu pointed out a paradox in AI development: “We are creating more complicated models that are harder to understand while saying alignment is required for them to become mainstream.” He noted that while AI systems have scaled through brute force—using more data and layers—this approach has plateaued. Now, efficiency and explainability need to scale alongside model complexity.

Arya.AI has been working on solving this issue through its proprietary ‘Backtrace’ technique, which provides accurate explanations for deep learning models across various data types. Sankarapu explains, “We want to create explainability that is very accurate and can work for any kind of model so that LLMs are no longer black boxes but white boxes.” 

This effort aligns with emerging regulations like the EU’s AI Act, which mandates explainability for high-risk AI applications.

Sankarapu adds, “Once you start understanding these models, you can do tons of things around them—like improving efficiency or unlocking new research areas.” This positions explainability as a catalyst for both innovation and operational efficiency.

In response to these challenges, Arya.AI has developed AryaXAI, which provides precise, granular explanations for AI decisions across various model architectures, making it particularly valuable for enterprise applications. 

AryaXAI stands out by offering feature-level explanations that help users understand exactly which inputs influenced a model’s decision and to what extent. The platform can analyse both structured and unstructured data, providing explanations for decisions made by complex neural networks, including those processing images, text, and tabular data. 

A key differentiator is AryaXAI’s ability to provide explanations in real time, making it practical for production environments where quick decision-making is crucial. The platform generates natural language explanations that are easily understood by both technical and non-technical stakeholders, bridging the gap between AI capabilities and business requirements. 

For financial institutions, specifically, AryaXAI offers detailed audit trails and compliance documentation, addressing regulatory requirements while maintaining model performance. The platform’s ability to explain decisions in human-readable terms has made it particularly valuable in sectors where transparency is non-negotiable, such as banking, insurance, and healthcare.
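
Arya.AI has not published the internals of Backtrace or AryaXAI, but the general idea of feature-level attribution can be sketched with an off-the-shelf technique like permutation importance; the dataset, feature names, and model below are hypothetical stand-ins, not AryaXAI’s API:

```python
# A generic sketch of feature-level attribution via permutation importance.
# This illustrates the concept, not Arya.AI's proprietary Backtrace method.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Hypothetical loan-approval-style tabular data with named features.
X, y = make_classification(n_samples=1_000, n_features=4, random_state=0)
feature_names = ["income", "credit_score", "loan_amount", "tenure"]

model = RandomForestClassifier(random_state=0).fit(X, y)

# Shuffle each feature and measure how much accuracy drops: the bigger the drop,
# the more that input influenced the model's decisions.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name:>12}: {score:.3f}")
```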

What Should Be the Vision for Autonomous Systems?

On the future of autonomous systems in banking, Sankarapu shared Arya.AI’s focus on aligning models with end goals through feedback loops and explainability. “To build truly autonomous agents capable of handling complex tasks like banking transactions, they must be explainable and aligned with user expectations,” he said. 

Madhavan proposed a three-pronged approach to reducing hallucinations. The first is model explainability research, which focuses on uncovering how AI models make decisions by analysing their embeddings, attention mechanisms, and other internal processes, which are often opaque. This research is essential for building trust and transparency in AI. 

The second is model alignment, which ensures that AI models behave as intended by aligning their outputs with human values and reducing issues like hallucinations or unintended biases. 

Finally, practical implementation prioritises creating reliable systems for real-world applications by incorporating safeguards and guardrails that allow models to operate effectively within specific business contexts, even if complete transparency is unattainable. 

Together, these approaches aim to balance the growing complexity of AI systems with operational reliability and ethical considerations.
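
A minimal sketch of the third prong, guardrails around model output, might look like the following; the grounding check, threshold, and generate() hook are placeholder assumptions, not any vendor’s implementation:

```python
# A minimal guardrail sketch: reject answers that are not grounded in the
# supplied source documents. Threshold and scoring are illustrative assumptions.
def is_grounded(answer: str, sources: list[str], threshold: float = 0.6) -> bool:
    """Crude check: what fraction of the answer's words appear in the sources?"""
    answer_words = set(answer.lower().split())
    source_words = set(" ".join(sources).lower().split())
    if not answer_words:
        return False
    return len(answer_words & source_words) / len(answer_words) >= threshold

def guarded_answer(generate, question: str, sources: list[str]) -> str:
    answer = generate(question)          # plug in any LLM call here
    if not is_grounded(answer, sources):
        return "I cannot verify that from the available documents."
    return answer
```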

Arya.AI is also exploring advanced techniques like contextual deep Q-learning to enable agents to handle tasks requiring memory and planning. However, Sankarapu cautions against over-focusing on futuristic visions at the expense of current market needs. “Sometimes you get caught up with too much future vision and lose sight of current realities,” he concluded.

The post This Indian Startup is Making ‘Black-Box’ AI Models Spill the Beans appeared first on Analytics India Magazine.

]]>
Meet Shivaay AI, the Indian AI Model Built on Yann LeCun’s Vision of AI https://analyticsindiamag.com/it-services/meet-shivaay-the-indian-ai-model-built-on-yann-lecuns-vision-of-ai/ Mon, 23 Dec 2024 08:26:41 +0000 https://analyticsindiamag.com/?p=10144158

The team’s approach addresses the widespread notion in India that GPT models and similar architectures aren’t accessible or effective for local needs.

The post Meet Shivaay AI, the Indian AI Model Built on Yann LeCun’s Vision of AI appeared first on Analytics India Magazine.

]]>

They say creating a foundational model in India is incredibly challenging and blame it on the constraints on computational resources and the unavailability of high-quality data. While we do have plenty of open-source data, the computationally expensive part is pre-training—essentially training the model to predict the next token.

Two Indian engineering students, Rudransh Agnihotri and Manasvi Kapoor, recently launched an AI startup called FuturixAI. At first, the team released Mayakriti, an image-generation platform that created lifelike images. Later, the duo decided to build an AI model that competes with OpenAI’s GPT—and built it from scratch.

“Joint embedding and parameter sharing is not a widely discussed architecture,” Agnihotri told AIM. “We explored various approaches and identified models like Llama 2, Qwen, and Gemma. These models form part of a joint embedding architecture,” he said, adding that they drew inspiration from Meta AI chief Yann LeCun’s vision of autonomous machine intelligence.

This is how the team developed Shivaay, a 4-billion-parameter AI model built on this joint embedding architecture, which leverages the three models for data. The approach gives Shivaay a knowledge base drawn from all three models. 
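
FuturixAI has not published Shivaay’s architecture in detail, so the sketch below only illustrates the general joint-embedding idea of projecting several backbones into one shared space; the model names, hidden sizes, and fusion-by-averaging choice are all assumptions for illustration:

```python
# A generic joint-embedding illustration; NOT Shivaay's actual architecture.
import torch
import torch.nn as nn

class JointEmbedding(nn.Module):
    def __init__(self, backbone_dims: dict[str, int], shared_dim: int = 1024):
        super().__init__()
        # One learned projection per backbone into a common embedding space.
        self.projections = nn.ModuleDict(
            {name: nn.Linear(dim, shared_dim) for name, dim in backbone_dims.items()}
        )

    def forward(self, features: dict[str, torch.Tensor]) -> torch.Tensor:
        # Average the projected views; a real system might use gating or attention.
        projected = [self.projections[name](x) for name, x in features.items()]
        return torch.stack(projected).mean(dim=0)

# Hypothetical hidden sizes for three open-weight backbones.
dims = {"llama2": 4096, "qwen": 4096, "gemma": 3072}
joint = JointEmbedding(dims)
fused = joint({name: torch.randn(2, dim) for name, dim in dims.items()})
print(fused.shape)  # torch.Size([2, 1024])
```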

For inference, the team leverages NVIDIA A100 80GB GPUs via Google Cloud, which explains the fast response time AIM observed when trying the model. Since the startup is part of the NVIDIA Inception program, the model is currently offered free of cost. 

The model’s API is also available on Futurix’s website.

According to the model details Agnihotri shared with AIM, Shivaay already outperforms larger state-of-the-art models on benchmarks like MMLU and MMLU-Pro by a margin of 10-15 points. 

The benchmarks also suggest that Shivaay is strong at reasoning and mathematical calculations.

Agnihotri claimed that when it comes to Indic use cases, the model was, in fact, better than Krutrim, Sarvam, and others coming up in the market. “The goal is to empower Indian developers and businesses to build their own AI agents and applications without relying on foreign models like GPT,” Agnihotri said.

Scaling the Right Way

The team’s approach addresses the widespread notion in India that GPT models and similar architectures aren’t accessible or effective for local needs. “We aim to change that by offering our service for free, much like how OpenAI initially provided GPT-3.5 at no cost. This allows users to explore and trust the model’s capabilities,” Agnihotri added.

Regarding scaling, the current priority is user acquisition. The team recently improved the chatbot’s user interface and, in just 15 days, saw signups cross 1,500 users, most of them coming via Reddit. “Our goal is to convince users that models like Llama 2, Qwen, and Gemma can be just as good, if not better, than existing solutions [for Indian use cases],” Agnihotri said. 

For training beyond these base models, Futurix used datasets such as GATE and IIT exam questions and answers. Agnihotri claimed that this supports chain-of-thought reasoning, in line with the current paradigm of OpenAI’s o1 and GPT-4o, and enables users to test step-by-step responses and validate logical reasoning.
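
As an illustration of how exam-style Q&A can be cast into chain-of-thought training data, a record might be formatted as below; the question and steps are invented for the example, not drawn from FuturixAI’s dataset:

```python
# Illustrative chain-of-thought formatting for exam-style Q&A (invented example).
def to_cot_example(question: str, steps: list[str], answer: str) -> dict:
    reasoning = "\n".join(f"Step {i}: {s}" for i, s in enumerate(steps, start=1))
    return {
        "prompt": f"Question: {question}\nLet's think step by step.",
        "completion": f"{reasoning}\nFinal answer: {answer}",
    }

example = to_cot_example(
    question="A train travels 180 km in 2 hours. What is its average speed?",
    steps=["Average speed = distance / time", "180 km / 2 h = 90 km/h"],
    answer="90 km/h",
)
print(example["completion"])
```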

Agnihotri also said the company aims to allow developers to build AI agents for different vertical tasks. 

The company plans to raise more funds soon and release the technical paper for the Indian developer ecosystem. Agnihotri also highlighted how LeCun’s philosophy and open-source approach inspired him. 

Agnihotri is a third-year mechatronics engineering student from Delhi Skill and Entrepreneurship University. Kapoor is currently in his second year of pursuing electronics and communication engineering, specialising in AI and ML, at Netaji Subhas University of Technology.

Agnihotri was once a JEE aspirant who couldn’t make it to an IIT, but that setback didn’t lessen his love for math. Now, with FuturixAI and Quantum Works, the young founder aims to push forward research in the AI field in India, using research in math and physics. 

“Google has the capability, dataset and compute, but at the same time, we have our own methods that are evolving with time,” he said.

The post Meet Shivaay AI, the Indian AI Model Built on Yann LeCun’s Vision of AI appeared first on Analytics India Magazine.

]]>
‘AI Research at Indian Universities Feels like a Solo Journey’ https://analyticsindiamag.com/it-services/ai-research-at-indian-universities-feels-like-a-solo-journey/ Sat, 21 Dec 2024 06:20:29 +0000 https://analyticsindiamag.com/?p=10144071

The biggest factor is the massive funding gap between Indian and Western universities.

The post ‘AI Research at Indian Universities Feels like a Solo Journey’ appeared first on Analytics India Magazine.

]]>

Funding remains one of the most significant challenges in accelerating AI research in India. Compounding this challenge is a lack of motivation among researchers to pursue revolutionary innovations. However, university researchers alone can’t be held responsible for this, especially when compared to the West – where strong ecosystems enable the development of great solutions. 


To give an example, Dhabaleswar K (DK) Panda, professor and university distinguished scholar of CS and engineering at Ohio State University, presented his newly established $20 million National Science Foundation (NSF)-funded AI institute, Intelligent Cyberinfrastructure with Computational Learning in the Environment (ICICLE), at the IEEE HiPC 2024 event in Bengaluru this week. 

Panda, who is also the founder of X-ScaleSolutions, highlighted how his initiative is focused on building the next generation of high-performance compute for big data, machine learning, and the future of deep learning. The focus is on democratising AI solutions rather than leaving them in the control of a few big players like Microsoft, Google, and others. 

Panda is focused on solving many use-case problems, such as those in agriculture and manufacturing. To scale this solution beyond the US, Panda is looking to partner with several Indian institutes, such as IIT Bhubaneswar and TCS Research.

Against the backdrop of the HiPC event, AIM spoke with Panda to better understand the project and asked why such initiatives are rarely seen in the Indian research ecosystem. Panda agreed with much of this, saying that things are slowly changing for Indian universities. 

Too Much Focus on Research Alone

The biggest factor is the massive funding gap between Indian and Western universities. Research in India is often constrained by limited resources, which makes it hard to take on long-term or foundational projects, especially in fields like AI.

The ICICLE project focuses on researching open-source projects that can motivate researchers from across the globe to solve problems. This is similar to something AI4Bharat or other such initiatives are building in India, but this has only been the case for the last few years. 

“The ideas are there [in India]; it depends on resources, manpower,” Panda said. He highlighted that since ICICLE got this funding from the National Science Foundation, it was able to put together the right team with the right expertise.

“Thirty years back, the gap between US universities and Indian universities was huge, but now that gap is shrinking. Most of it is about funding. As long as the university is well funded, it will scale since the talent is here.”

Panda, however, added that the researchers need to be trained so that they can move to the right field at the right time. “US universities are still focusing on fundamental research…If a student or faculty gets funding, they focus on short-term development.” Expressing optimism, Panda said he believes the most interesting research in AI could soon come from an Indian university.

A Common Sentiment

Agreeing with Panda’s sentiment, Amit Sheth, chair and founding director of the AI Institute at the University of South Carolina (AIISC), told AIM that a few institutions – such as IIT-D, IIIT-D, IIIT-H, IIT-M, IIT-P and IIT-Mumbai – are increasingly putting students’ research first. Only a handful of universities are able to publish research at top conferences. 

“In the USA, all the projects they get to work on involve advancing the state-of-the-art (research),” Sheth said. He also highlighted the issue of a publication racket prevalent in India and several other developing countries, with only a handful of researchers from select universities standing out as exceptions.

Commenting on the disparity in research quality, Sheth noted that the gap is non-existent or very small in the universities mentioned above.

“But if you go to your run-of-the-mill NIT, IIITs, technical colleges, etc., the gap is not shrinking and is huge,” Sheth said. According to him, there are some enterprising students at those universities who, in their 3rd and 4th year of pursuing a bachelor’s degree, reach out to US groups to do online internships. 

“This allows them to work on the latest topics and often co-author a paper or two, which then significantly improves their chances of getting accepted at top research programs and professors during their graduate studies.”

Moreover, Renjith Prasad, a teaching assistant at the AIISC, told AIM that the core reason lies in the culture. “In India, the focus is so heavily on placements that research often feels like an afterthought. A lot of students I saw preparing to pursue a Masters or PhD through GATE did not do it because they’re passionate about research, but because they didn’t land a good job right after undergrad.”

Prasad added that this results in a system where professors are often hesitant to invest deeply in fostering long-term research. “I felt this personally during my Master’s thesis, which was mostly guided by a senior PhD student at AIISC, instead of my own professors in India, and I don’t think I would have completed it without her.”

‘Research Feels like a Solo Journey in India’

Giving his personal example, Prasad said that when he joined IIT Jammu, the first thing he was told was to start preparing for placements through platforms like LeetCode. “The biggest club on campus wasn’t a research or innovation group – it was the placement club,” he said. “Even professors understand that most students are here to chase the best packages, and the few who are genuinely interested in research either head abroad or try to collaborate with someone there.” 

AIM earlier wrote that Indian researchers should move beyond PhDs. We pointed out that Indian colleges generally lag behind the state of the art, with some professors still researching CNNs while the SOTA has moved far ahead.

Pratik Desai, founder of KissanAI, had shared similar thoughts earlier when he said that this requires a fundamental shift across coaching and academia, along with a change in mindset among parents, founders, and investors. 

Ritvik G, a graduate student at AIISC, told AIM that the education system was set up in a fashion to churn out developers and self-minded individuals rather than adapting to new times. “From the very moment we joined the university, we were taught that grades mattered more than learning.”

Ritvik further said that the biggest factors for meaningful research in his university in India were an outdated learning environment, limited exposure to AI innovations, limited collaborative support, and a monotonous research environment. 

This is often the case with Indian Institutes. “There is heavy focus and overall improvement on the core AI frontier such as NLP and Computer Vision, where most of the recent top conference and journal publications from India exist, but the very lack of diversity amongst teams leads to the development of systems that often are very targeted/outdated in a few years,” Ritvik added.

The post ‘AI Research at Indian Universities Feels like a Solo Journey’ appeared first on Analytics India Magazine.

]]>
2025 will be the Year of AI ROI for Indian IT and Enterprise https://analyticsindiamag.com/it-services/2025-will-be-the-year-of-ai-roi-for-indian-it-and-enterprise/ Fri, 20 Dec 2024 03:30:00 +0000 https://analyticsindiamag.com/?p=10144017

While only 26% of workers worldwide use AI daily, the figure is nearly double in India, with 45% reporting daily usage.

The post 2025 will be the Year of AI ROI for Indian IT and Enterprise appeared first on Analytics India Magazine.

]]>

As we enter 2025, we see companies anxious about AI’s return on investment (ROI). Throughout 2023 and 2024, the biggest focus for Indian IT and enterprises was moving from ideas to proof of concept (POC) to products. However, according to recent reports, Indian companies are not shying away from investing more in AI.

This is also in line with Accenture’s Q1 FY25 earnings call. The company reported record generative AI bookings of $1.2 billion, bringing the total to $4.2 billion since September 2023. This marks the highest quarterly bookings in the segment, reflecting growing client investments in generative AI. This is also a positive hint for generative AI growth for Indian IT for the coming year.

According to a new Freshworks report on AI in the workplace, 90% of Indian companies believe AI solutions are essential for achieving business success by 2025. India is setting global benchmarks in AI integration and innovation. Notably, 79% of organisations plan to augment their AI budgets, with spending expected to increase by an average of 41% from 2024.

India’s position as the world leader in AI investment is fueled by its strong confidence in ROI. Shelton Rego, vice president of Freshworks India, said, “The combination of mandatory AI integration and workforce upskilling initiatives is driving India’s productivity and innovation to new heights.”

The Freshworks report includes perspectives from around 4,000 professionals, including C-suite executives from HR, IT, customer service and support, and legal departments.

Indians Use More AI than the Others

Interestingly, the report also highlights the optimistic approach of the Indian workforce when it comes to using AI solutions. While only 26% of workers worldwide use AI daily, the figure is nearly double in India, with 45% reporting daily usage.

A staggering 74% of Indian professionals consider themselves knowledgeable or experts in AI, which is significantly higher than the global average of 55%. This confidence translates to comfort and trust, as 91% of employees in India feel at ease using AI tools (compared to 67% globally), and 90% trust AI to enhance work processes (versus 66% globally).

India also leads in implementing mandatory AI-use policies, with 65% of employees required to incorporate AI into all or part of their work – the highest globally. This trend is particularly evident in IT and marketing, where employees highlight AI’s speed, effectiveness, and necessity for achieving results.

Weekly AI usage among Indian workers is unparalleled, with 88% utilising AI-powered software, outpacing regions like the US, UK, and Europe. As a result, 85% of organisational leaders in India report improved ROI from AI investments, with IT and marketing departments seeing the greatest benefits.

This was also confirmed by the recent IBM report on APAC AI Outlook 2025. It reveals that Indian enterprises are now almost done with experimentation and are looking for returns on their AI investments. 

According to the report, many Indian respondents anticipate long-term benefits from AI in areas such as innovation (26%), revenue generation (21%), cost savings (12%), and improved employee productivity (12%). 

“In 2025, AI is set to be the game-changer for Indian enterprises, revolutionising productivity and enabling unprecedented scalability,” said Sandip Patel, MD, IBM India & South Asia. He added that the focus this year would be on leveraging AI responsibly to drive real business value, “moving beyond low-risk experiments to strategic initiatives that provide a competitive edge and improved ROI”. 

Krishna Vij, VP – IT staffing at TeamLease Digital, told AIM that achieving ROI by 2025 depends on several key strategies. “The primary consideration should be focusing AI applications on high-impact areas like customer experience, strategy optimisation, and operational efficiency to realise quicker, measurable results,” she said. 

Vij added that the second area of investment should be in employee training and upskilling, which is crucial to maximise the potential of AI tools and ensure smooth integration into existing workflows. 

“The third area of interest should be to embed AI into core business functions such as IT, marketing, and finance which can help drive innovation and productivity,” Vij added. This is how businesses expect AI to contribute 10-20% to revenue growth by 2025.

Adoption has Begun, Finally

Interestingly, as the conversation around generative AI shifts towards agentic AI and small models that require less investment, Indian enterprises are showing a readiness to achieve ROI. However, with Indian IT companies embedding generative AI into everything, measuring the exact ROI driven by AI remains a challenge. 

“I predict that every single dollar of revenue will have a generative AI component embedded in it,” Nachiket Deshpande, COO at LTIMindtree, told AIM. Considering that the impact of AI is much bigger than any revenue component any of the service companies will give, Deshpande added, “Counting AI revenue is not a measure that we would like to go after because I think it misrepresents the impact of AI.”

Measuring the ROI of GenAI investments is not as straightforward as calculating the savings from a new software tool. It involves a blend of quantitative and qualitative factors, from time saved to human value unlocked. Vijay Raaghavan, the head of enterprise innovation at Fractal, told AIM that unlocking human potential and counting the hours saved would be the ideal way to measure ROI.
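
As a worked illustration of the hours-saved approach (all figures below are hypothetical, not numbers from Fractal or any company quoted here):

```python
# Hypothetical hours-saved ROI calculation; every number here is an assumption.
developers = 1_000
hours_saved_per_week = 2
blended_cost_per_hour = 25        # USD, fully loaded
working_weeks = 48

annual_value = developers * hours_saved_per_week * blended_cost_per_hour * working_weeks
annual_investment = 1_500_000     # tooling, licences, training

roi = (annual_value - annual_investment) / annual_investment
print(f"Recovered capacity: ${annual_value:,.0f}")   # $2,400,000
print(f"Simple ROI: {roi:.0%}")                      # 60%
```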

The initial focus on short-term gains in generative AI projects is evolving into a deeper understanding of AI’s broader potential. Organisations are moving from low-risk, non-core applications to integrating generative AI into core business functions to gain a competitive edge and enhance ROI. 

Similarly, Mrinal Rai, assistant director and principal analyst at ISG, told AIM that most generative AI use cases are not yet fully in production and are mostly at the PoC stage. “2025 will see a shift from smaller use cases into larger transformation programs,” he said. “The most important ‘return’ identified by clients with AI is the percentage of improvement in efficiency that reduces person-hours/time taken or improves internal processes.”

Nearly 60% of organisations in the Asia-Pacific region expect to see returns on their AI investments within two to five years, while only 11% anticipate benefits within the next two years.

Along similar lines, Viswanathan K S, former NASSCOM VP (industry initiatives), told AIM that what we are currently witnessing is merely a lull before the storm when it comes to Indian IT building AI products as it requires a large amount of data to build something tangible. “Let’s not get obsessed with services as of now, but build productionisation of services, as the data is only going to grow several times five years from now,” he said. 

“We are just waiting for the adoption to take place. The next big thing in AI for the global market would come from Indian IT as we wait for the productionisation of the existing services,” added Viswanathan, and said that Indian IT should currently focus on collecting data necessary for the models.

The post 2025 will be the Year of AI ROI for Indian IT and Enterprise appeared first on Analytics India Magazine.

]]>
Agentic AI is Now On Mid-Sized Indian IT’s Mind https://analyticsindiamag.com/it-services/agentic-ai-is-now-on-mid-sized-indian-its-mind/ Tue, 17 Dec 2024 07:00:00 +0000 https://analyticsindiamag.com/?p=10143698

The biggest reason for not building in-house models for Indian IT firms is that generative AI is too expensive to experiment with.

The post Agentic AI is Now On Mid-Sized Indian IT’s Mind appeared first on Analytics India Magazine.

]]>

The era of agentic AI is upon us, and surprisingly, Indian IT is also ready for this revolution. What is even more surprising is the fact that mid-sized and small IT firms are also ready for adoption. As usual, they are taking the acquisition route to achieve these capabilities.

The latest example is LTIMindtree. Last week, the IT firm announced that it has committed $6 million to Voicing AI, a US-based startup specialising in human-like AI voice agents. This investment aims to bring human-like voice capability across more than 20 languages with conversational, contextual, and emotional intelligence.

Moreover, LTIMindtree has formed an alliance with GitHub for its Copilot coding tool to train its workforce on the GitHub ecosystem.

Similarly, Mphasis has focused on conversational AI to boost user experiences. In July, the IT solutions provider launched NeoCrux, a tool for improving developer productivity by streamlining software developer cycles with AI agents and orchestrators. 

In October 2023, the company acquired Silverline, a New York-based Salesforce partner, for $132.5 million. During the announcement, Nitin Rakesh, CEO of Mphasis, said, “The acquisition aligns with our strategy to drive customer experience transformation, modernise contact centres, and enable conversational AI automation to meet evolving client needs.” 

It seems evident that Mphasis has decided to embrace AI and agentic AI for customer interactions as well. Though it is not clear whether the acquisition of Silverline increased Mphasis’s current capabilities, the impact is definitely visible. 

Persistent Systems, on the other hand, is addressing AI-related privacy challenges. In September, the company acquired Pune-based data privacy consultancy Arrka for ₹14.4 crore ($1.7 million). “Arrka’s mature frameworks and privacy management platform ensure a scalable and governance-driven approach, critical for successful AI implementations,” Persistent Systems CEO Sandeep Kalra said in a press release.

The Approach Differs from the Big IT

Meanwhile, bigger IT giants like TCS, Infosys, and Wipro have decided to develop a different strategy for building AI agents for their customers. In the latest quarterly results, TCS posted $1.5 billion in bookings with AI and Infosys has started building small language models and multi-agent frameworks for its clients.

A similar situation was observed at Wipro, HCLTech, and Cognizant.

TCS, India’s largest IT services exporter, launched a new AI-focused business unit, AI.Cloud, in May, by merging its cloud and AI divisions. Infosys, the second-largest IT firm, unveiled Topaz, an AI suite offering solutions and services. Notably, it has several clients in its portfolio.

The mid-sized IT companies are, however, positioning themselves differently here. Speaking with AIM, Nachiket Deshpande, COO at LTIMindtree, said Indian IT firms often do not want to build these capabilities in-house. Commenting on the recent deal with Voicing AI, Deshpande added, “There are so many startups that are coming all around the world. We would want to leverage those startups for those technologies that are coming up.”

Deshpande’s biggest reason for not building in-house models is that generative AI is so expensive that experimenting with it at that level of investment does not make sense.

“I don’t think the technology in generative AI would be the real differentiator because of the level and the speed of innovation that is happening, whatever technology you may develop will become obsolete within a few months, and you have to keep doing that,” Deshpande said, adding that it requires a lot of capital.

Mohandas Pai, the founder of Aarin Capital and former CFO of Infosys, also told AIM that IT services companies are not built to invest disproportionate capital in R&D. “Indian IT services companies are not product companies.”

“Creating an LLM or a big AI model requires large capital, time, a huge computing facility, and a market. All of which India does not have,” Pai pointed out, adding that even though Infosys, TCS, and others might have the funds, their focus is to provide vertical solutions and not horizontal ones like ChatGPT.

Why Is Everyone Bullish on Agentic AI?

Similarly, Deshpande also said that the services companies’ P&L is structured differently than that of product companies. “They operate with 80% gross margin, and hence they have the ability to continue to do R&D, and we operate at 30-35% gross margin,” he explained.

He believes that if a company invests and builds a particular technology, tomorrow it might become irrelevant. “I need to get 86,000 people to reimagine their work with AI, but I only need 2,000 people to build AI solutions.”

“The differentiation of LTIMindtree will lie in terms of how we adopt AI, rather than saying I have another shiny toy which is better than somebody else. Because that differentiation is short-lived,” he added.

Apart from the technological capabilities, Deshpande said it’s important that IT companies give startups the space they need to grow on their own. “Hence, acquisition was not the best way; [an investment] at a meaningful scale was a better way where we get to leverage the technology, we get to take them to the customer, but also provide those entrepreneurs space to continue their innovation,” he further said.

With generative AI systems going autonomous, and several firms announcing autonomous agents like Devin, Indian IT is also ready to adopt that change quickly. Deshpande said that one of the biggest reasons for this is the outlook on productivity. “The idea of productivity outlook is persona-centric, and you have to look at each persona becoming more and more productive,” Deshpande said. 

Hence, he said that agentic systems – which do not look at a particular business process or task but at the entire persona, and try to automate large parts of it – are the way to capture that productivity. That is why agentic systems are drawing increasing focus.
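
A persona-level agent, in the loosest sense, is just a loop that plans a persona’s recurring tasks, routes each to a tool, and feeds outcomes back in. The sketch below is a generic illustration under that assumption, not LTIMindtree’s or any vendor’s product:

```python
# Generic persona-level agent loop (illustrative; plan, route, and tools are
# placeholders to be supplied by the caller, not a real vendor API).
from typing import Callable

def run_persona_agent(goal: str,
                      plan: Callable[[str], list[str]],
                      route: Callable[[str], str],
                      tools: dict[str, Callable[[str], str]]) -> list[tuple[str, str]]:
    history = []
    for task in plan(goal):            # e.g. "triage tickets", "draft a reply"
        tool_name = route(task)        # an LLM or rules decide which tool fits
        result = tools[tool_name](task)
        history.append((task, result)) # observations can inform later planning
    return history
```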

Earlier this year, small IT firms were taking the acquisition route for building AI capabilities. Happiest Minds, Hexaware, Quest Global, Coforge, Sonata, and GlobalLogic have all announced acquisitions in this space. Now, the conversation has shifted to agentic AI.

The post Agentic AI is Now On Mid-Sized Indian IT’s Mind appeared first on Analytics India Magazine.

]]>
BITS Pilani’s ‘Zero Attendance Policy’ Has a Lesson for All Indian Universities https://analyticsindiamag.com/it-services/bits-pilanis-zero-attendance-policy-has-a-lesson-for-all-indian-universities/ Mon, 16 Dec 2024 05:54:33 +0000 https://analyticsindiamag.com/?p=10143595

The evaluation of students is done entirely on the basis of examinations and assessments at the end of the semester.

The post BITS Pilani’s ‘Zero Attendance Policy’ Has a Lesson for All Indian Universities appeared first on Analytics India Magazine.

]]>

Several Indian universities have been giving a year off to students interested in building startups. Leading institutes, including BITS Pilani, IIT Madras, IIT Hyderabad, DIT University, IIT Bombay and IIT Kharagpur, have incorporated this initiative. These ‘temporary withdrawal programmes’ allow students to take a break from their academic studies and pursue their entrepreneurial calling. 

BITS Pilani takes it to another level. The ‘zero attendance policy’, which means exactly what it sounds like, doesn’t require students to attend lectures. Their evaluation is done entirely on the basis of examinations and continuous assessments done during the semester. This, according to the university, pushes a lot of students to pursue entrepreneurial paths and build startups for the country.

For example, companies like Swiggy, redBus, Groww, bigbasket, SanDisk, Postman, Maser Group, Eruditus, OfBusiness, FalconX, and MPL were all born out of BITS Pilani. 

This begs the question – should all Indian universities adopt a similar policy to some extent? If the BITS experiment has validated the zero attendance policy since the 1970s, why can’t such policies be adopted at IIT Delhi or any other IITs or NITs?

Though the policy has existed at BITS Pilani for almost five decades, this time it has sparked a debate on social media platforms where professors, students, and policymakers alike are discussing whether or not such an approach would work for other institutes. 

How Does the Policy Work?

BITS Pilani campus group vice-chancellor, professor V Ramgopal Rao, joined the debate. He referenced the profiles of BITS Pilani alumni and compared them to the top three IITs in the country.

“[There are] over 6,400 founders/co-founders of companies, 15 unicorn founders, over 7,500 CEOs/VPs in large corporates, over 3,500 faculty members in top institutions the world over,” Rao said, explaining that BITS Pilani has supported its students in being on top of their game and that the attendance policy has definitely played its part.

This is also one of the reasons why the acceptance rate at BITS Pilani is around 2%, which is similar to that of any of the old IITs. 

While speaking with AIM, Rao clarified that ‘zero attendance’ actually means that attendance is not an eligibility criterion for appearing for the exams. “The whole semester and throughout the year, there are several quizzes and mid-sem exams, where the weightage is distributed so students are always preparing for the courses.”

He further added that since BITS has no reservations, students get selected with a very narrow window of merit.

He added that it is also the responsibility of the faculty to be innovative and attract students to class. “Students can also select their teachers in BITS Pilani and make their own timetable,” he said, adding that while this works for BITS Pilani, it would not be ideal for every other institute.

Rao said that BITS Pilani has a Teaching Learning Centre where every faculty member who joins the institute undergoes at least three months of training on pedagogy and ways to attract students. None of the IITs offer anything similar. 

This ensures that the faculty at BITS remains the very best. “Every cycle, we receive about 7,000 applications, and we appoint 50-60 people,” he added.

Most importantly, Rao pointed out that if any of these metrics and criteria is taken out of the equation and the zero attendance policy is implemented blindly, it can cause havoc.

Not For Everyone

Having spent 25 years at IIT Bombay and IIT Delhi, Rao said that this policy was implemented there as well, but the choice was left to the faculty. “I told my students that attendance is not compulsory; you come, and you clear the exams. I used to set a limit, and if students scored above it, they passed.”

Citing the two recent student suicides at IIT Guwahati, where students were not allowed to appear for exams for falling short of the compulsory 75% attendance, Rao said that such policies are harmful. He believes that, in the absence of any institute-wide policy, faculty should at least be given the freedom to decide whether students are required to attend their classes, because in certain cases students are forced to attend classes even when the teacher is not there.

Initiatives like the zero attendance policy also enable students to pursue entrepreneurial journeys. Rao said that he has been speaking to several alumni, and all of them appreciated the attendance policy of BITS Pilani as it allowed them to optimise their time to gain hands-on experience for building startups. 

“If we tied our students to classes, they couldn’t have done all that they were able to do outside the classroom.”

It is clear that the ‘zero attendance policy’ of BITS Pilani is not for every institute as it requires several bits and pieces to fall together to work properly. At the same time, the question of the zero attendance policy as a rule for every university is also still up for debate.

Manoj BS, professor and associate dean at IIST, agreed that the policy is not for every institute. 

He said that the policy can be effective only if the assessment is done appropriately. “Mandating a certain percentage of attendance is very, very important. Otherwise, there can be some instances where students can be completely away, [impacting] the quality of education.”

He added that creating one special program on entrepreneurial activities where attendance can be made non-essential is a better approach.

It should be noted that classes are always conducted; it is the students’ choice whether or not to attend. Amit Sheth, the chair and founding director of AIISC, told AIM that BITSians are exposed to many opportunities through this policy, which makes them ideal for an entrepreneurial journey. 

“Since BITSians have more opportunities to do things such as internships and practice schools, they have more exposure to what it takes to be an entrepreneur,” Sheth said. He added that this helps find partners, network, and reach potential investors. “This could be a reason they feel ready to try their hands at entrepreneurship earlier than the norm.”

Similar to Rao, Sheth also pointed out that this freedom may not work as well for most institutions, but for BITSians, it has demonstrably worked. “BITSians also have many learning and extracurricular options outside of course material and classrooms, which can lead to more personalised preparation for the student’s next step after BITS,” Sheth added.

The post BITS Pilani’s ‘Zero Attendance Policy’ Has a Lesson for All Indian Universities appeared first on Analytics India Magazine.

]]>
Indian IT Salaries Might Finally Increase Next Year https://analyticsindiamag.com/it-services/indian-it-salaries-might-finally-increase-next-year/ Tue, 10 Dec 2024 13:00:00 +0000 https://analyticsindiamag.com/?p=10143159

Since both IT firms and GCCs are competing for the same talent pool, with the GCCs offering significantly higher packages and better designations to freshers, it gets challenging for IT.

The post Indian IT Salaries Might Finally Increase Next Year appeared first on Analytics India Magazine.

]]>

All eyes are on Indian IT, with sky-high expectations for the years ahead. While billions of dollars of investments and bookings paint just one part of the picture, the other side hinges on generative AI, which is expected to change the hiring strategy and influence employee salaries. 

Based on an analysis by Pareekh Jain, CEO of Pareekh Consulting, IT companies are significantly increasing their hiring of candidates with AI-related skills. Candidates with these skills made up about 10% of total hiring last year, but their share is projected to rise to 20-25% this year.

If companies expand their entry-level workforce by 10% in the future, nearly half of the new hires are expected to be in AI-focused roles. This indirectly translates to the fact that the new hires with the latest AI skills would demand higher salaries as well.

Krishna Vij, VP at TeamLease Digital, told AIM that AI tools will enhance productivity by automating repetitive tasks. They will also create demand for specialised roles to manage and optimise AI systems effectively. “While team sizes may not drastically reduce, organisations are likely to focus on upskilling talent to balance costs,” said Vij. 

About 20-25% of fresher hiring in the IT sector is now targeted towards AI skills. This is an uptick from a dismal 5-10% over the last three years. This might result in an upswing of hiring for talented freshers but a downswing for others without the skills. 

Therefore, though salaries for those with AI skills might be higher, those without them might witness a significant drop.

GCCs Driving the Salary Hike

As GCCs establish their presence in India, there’s a growing opportunity for the young talent in the country to leverage their skills in generative AI for a lucrative career with them. According to consulting firm ANSR, about 90% of GCCs in India plan to harness the potential of AI, ML, and cognitive computing in the next 2-3 years. 

A recent report reveals that GCCs offer salaries 12-20% higher than those in IT services and other non-tech industries for comparable tech roles. Vij agreed that the expansion of GCCs in India is driving up salaries, especially for niche roles in AI, data science, and cybersecurity. This also creates competition for skilled talent and sets new benchmarks. 

“This has prompted IT companies to increase salaries for specialised roles to retain and attract talent. However, to manage costs, many IT firms focus on hiring freshers and upskilling them, building a strong talent pipeline. These companies may need to refine their strategies to remain competitive in attracting and retaining skilled professionals,” Vij explained.

For instance, GCCs provide salaries ranging from INR 9.7 lakh to INR 43 lakh per annum for software developers, depending on experience. In contrast, the IT products and services sector offers around INR 5.7 lakh per annum for entry-level roles, with salaries going up to INR 17.9 lakh per annum for professionals with over eight years of experience.

Moreover, entry-level talent at GCCs in India is attracting pay packages that are, on average, up to 30% higher than entry-level salaries across all sectors. A similar trend should play out for the Indian IT workforce that has been collaborating with GCCs over the past few years. For companies like TCS, Infosys, and others, the hike largely depends on their performance. Still, freshers might witness a small bump in salaries, given the larger ecosystem in India. 

Mrinal Rai, assistant director and principal analyst at ISG, told AIM that since AI skills are in demand, people with these skills would come at a higher cost. So far, there hasn’t been any substantial change in the salary packages of service providers for freshers. 

“However, since both service providers and GCCs are competing for the same talent pool, with the GCCs offering significantly higher packages and better designations to freshers, it gets challenging for service providers,” Rai explained.

The fact that Indian IT is reluctant to hire freshers is not new; it continued throughout 2024. However, things might change next year as it remains one of the biggest concerns for many. According to experts in the HR industry, hiring has become a focal point for several IT services companies.

Finally Better Salaries

Pinkesh Kotecha, chairman and managing director of Ishan Technologies, told AIM that he predicts a 10-15% higher salary for freshers while a salary hike of up to 15% for experienced professionals in the coming year. 

“The focus while hiring will be on how techies have upskilled and prepared themselves to be valuable to the company. There is a need for collaboration between academia, technology companies, industry bodies, and professionals to upskill regularly to meet the evolving demands of the modern tech world,” Kotecha said.

In their latest earnings calls, most Indian IT giants have promised to continue reducing their bench size and hiring freshers to increase the headcount. 

TCS aims to onboard 40,000 freshers this fiscal year, while Infosys plans to hire 15,000-20,000. Wipro and HCLTech are targeting 10,000-12,000 and approximately 10,000 fresh graduates, respectively. Including mid-tier IT and engineering services firms, around 1 lakh freshers have received offer letters in the ongoing placement season.

This was also echoed by Mohandas Pai, founder of Aarin Capital and former CFO of Infosys, when he told AIM that hiring will also be affected because around 200,000 people will start retiring early next year in India and companies will have to fill those positions quickly. 

According to a report by ServiceNow, AI is going to create around 2.73 million tech roles by 2028, many of them in software development, web development, software testing, and data analysis. With AI assisting these professionals, the report says, the productivity gains would help them transition into advanced roles as well. 

With the average package in Indian IT at INR 3.5 LPA, as the demand for AI skills increases, so will the salaries that engineers possessing these skills can command. 

What to Expect from Indian IT in 2025 https://analyticsindiamag.com/it-services/what-to-expect-from-indian-it-in-2025/ Fri, 06 Dec 2024 13:17:36 +0000 https://analyticsindiamag.com/?p=10142637

Hiring of freshers will finally increase as the companies are already working at only 85% of their capacity.

The post What to Expect from Indian IT in 2025 appeared first on Analytics India Magazine.

Indian IT has had a fruitful 2024. When it comes to generative AI, practically all the major players announced billions of dollars of investment in the field, with bookings throughout the year, even though revenues grew mostly in single digits. 

In the latest quarterly results, TCS posted $1.5 billion in bookings with AI and Infosys has started building small language models and multi-agent frameworks for its clients. The case was similar for Wipro, HCLTech, and Cognizant.

The same upward momentum is expected to continue for the next year as well. Speaking with AIM, Mohandas Pai, founder of Aarin Capital and former CFO of Infosys, said that there would be an increase in vertical AI projects, focused on specific industries. “Instead of generic models, Indian IT companies are interested in building specific models for their clients that can be used in their own environments for protecting their data,” Pai said.

He added that Indian IT’s generative AI spending will also increase, mirroring the momentum in the US since both are correlated. 

Mrinal Rai, assistant director and principal analyst at ISG, told AIM that there will be an increased focus on translating the generative AI PoCs into actual projects and scaling them. This will involve transforming smaller use cases into larger transformation projects. “There will also be a focus on enhancing professional services to ensure rapid technology adoption,” Rai said.

What’s in the Pipeline?

Last year, analysts predicted that Indian IT would see a surge in deals and partnerships in 2024. They also said that much of these IT giants’ revenue would come from generative AI. Though many partnerships were announced, none of the IT giants has actually reported its revenue from generative AI.

On the bright side, TCS has reported 600 AI and generative AI engagements, deployed in production or in various phases of development. The company noted that while discretionary spending was affected, clients continued to invest in AI and generative AI, yielding better outcomes.

Similarly, in the latest quarter, Infosys yet again emphasised its dedication to generative AI but shied away from revealing the revenue from it. However, the company has finally revealed that it’s working on small language models for its clients for various applications. 

In the same quarter, Wipro chief Srini Pallia revealed that the company had won two large deals. One of them was with a software technology company that chose Wipro to support its end-to-end product development and IT operations.

Speaking of Wipro’s AI strategy, Pallia said, “There are three categories of AI projects that we do: AI-led projects, AI-infused projects, and AI-powered solutions,” adding that these solutions target industry-specific challenges. AI-powered solutions, for instance, are cross-industry and prepackaged, leveraging generative AI as a core element.

He noted that Wipro’s large deal pipeline is also benefiting from its GenAI initiatives, contributing to an impressive $1.5 billion in large deals—the highest in 10 quarters. Last year, Wipro announced that it would invest $1 billion over the next three years. 

HCLTech is also on a partnership spree for AI. Over the past few months, it partnered with Google Cloud, AWS, GitHub, and IBM to train its workforce with generative AI and also has use cases and POCs for its clients. The company said in its last earnings call that it would focus more on specialised skills for recruitment for FY26.

In the coming year, such investments will continue to increase, and according to estimates, they might grow multifold. Indian IT’s partnerships with GCCs are also increasing substantially, which will help it expand its capabilities as well. 

Service providers will try to leverage the growth in GCCs for more partnerships while co-creating and collaborating with them. “There will be specialised GCC-focused services carved out of their portfolio. This will include strategic advisory on setting up GCCs with AI-related talent and expertise,” Rai told AIM.

Rai added that as enterprises reduce the total number of providers they work with, India will see an increasing number of deals with multi-tower scope. Many providers specialising in traditional business process services will also enter the domain of traditional IT services. Besides, providers with stronger portfolios in both domains will see more internal integration.

What About Hiring?

According to previous estimates, AI coding tools will have a huge impact on the Indian IT workforce as the size of the teams will decrease. The job market is already seeing a decline. It is probably one of the most difficult years for tech jobs in a very long time. The same is the case with Indian IT.

Vaibhav Domkundwar, founder and CEO of Better Capital, predicted that the size of the Indian IT companies will shrink in the coming years. “IT services companies in India will shrink every year for the next many years as they adopt AI to improve their engineers’ productivity… IT services might start with a 2:1 reduction and get better over time,” he said, adding that it is inevitable. 

The fact that Indian IT is reluctant to hire freshers is not new; it continued throughout 2024. But things might change next year as it remains one of the biggest concerns for many. All this while, Indian IT has been manically upskilling its employees with generative AI to prepare for this change. 

Agreeing with Domkundwar, Meghana Jagadeesh, founder and CEO of GoCodeo, said that this is a pivotal moment for IT services in India. “Companies that proactively upskill their workforce and embrace AI as a complement rather than a replacement will lead the transformation,” she said.

According to a recent report, about 20-25% of fresher hiring in the IT sector is now targeted towards AI skills. This is an uptick from a dismal 5-10% over the last three years. This might result in an upswing of hiring for talented freshers, but a downswing for others without the skills.

TCS aims to onboard 40,000 freshers this fiscal year, while Infosys plans to hire 15,000-20,000. Wipro and HCLTech are targeting 10,000-12,000 and approximately 10,000 fresh graduates, respectively. Including mid-tier IT and engineering services firms, around 1 lakh freshers have received offer letters in the ongoing placement season.

This was also echoed by Pai when he told AIM that the hiring will also be affected because around 200,000 people will start retiring early next year in India. “If Indian IT is going to fill all these positions and they haven’t hired freshers, they will have to start poaching from the same lateral from different companies, be it smaller or bigger,” Pai said.

He added that the hiring of freshers will also finally increase as the companies are already working at only 85% of their capacity. “More fresher hiring will happen because companies will have to build up system capacity,” he added.

A Breakthrough?

Possibly, Indian IT might also finally make a breakthrough after a decade. Currently, the IT industry is focused on maintaining existing projects, rather than building something revolutionary, which is slowly changing. 

“Creating an LLM or a big AI model requires large capital, time, a huge computing facility, and a market – all of which India does not have,” Pai told AIM earlier. He added that even though Infosys, TCS, and others might have the funds, their focus is on providing vertical solutions, and not horizontal ones like ChatGPT.

Along similar lines, Viswanathan K S, former NASSCOM VP (industry initiatives), told AIM that what we are currently witnessing is merely a lull before the storm when it comes to Indian IT building AI products as it requires a large amount of data to build something tangible. “Let’s not get obsessed with services as of now, but build productionisation of services, as the data is only going to grow several times five years from now,” he said. 

“We are just waiting for the adoption to take place. The next big thing in AI for the global market would come from Indian IT as we wait for the productionisation of the existing services,” added Viswanathan, saying that Indian IT should currently focus on collecting data necessary for the models.

Why India is Even Building Speech Models https://analyticsindiamag.com/it-services/why-india-is-even-building-speech-models/ Thu, 05 Dec 2024 08:20:13 +0000 https://analyticsindiamag.com/?p=10142488

If the future is all about communication in voice and giving it to the millions of UPI users in the country, speech models are necessary.

The post Why India is Even Building Speech Models appeared first on Analytics India Magazine.

Voice remains one of the most crucial modalities when it comes to AI-driven solutions for urban India and rural Bharat. The issue, though, pertains to the unavailability of sufficient data for all the Indian languages and their hundreds of dialects. This is often referred to as the missing link for the Indian language chatbots.

However, there are organisations and companies that are looking to solve this problem. 

AI4Bharat, the research group based at IIT Madras, recently introduced Indic Parler-TTS, an open-source text-to-speech (TTS) model built for over a billion Indic speakers. The model was released in collaboration with Hugging Face and aims to bring accessibility and high-quality speech to diverse linguistic communities.

It’s trained on 1,806 hours of multilingual and English datasets and currently supports 20 of the 22 scheduled Indian languages. It also supports English with US, British, and Indian accents, which is ideal for developers, researchers, and companies. This model has a permissive license with unrestricted usage.

The model includes 69 unique voices and can render emotions in 10 languages. An interesting feature of this model is its customisable output, which includes background noise, pitch, reverberation, speaking rate, and expressivity. The model can also automatically detect languages through prompts. 
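For developers who want to try such a model, the hedged sketch below follows the Parler-TTS usage pattern published on Hugging Face, where a plain-text description steers the voice, pitch, and expressivity; the `parler_tts` package, class name, and checkpoint ID here are assumptions based on that project and may differ from AI4Bharat’s actual release.

```python
# Hedged sketch of prompting a Parler-TTS-style model; package, class,
# and checkpoint names are assumptions and may differ from the real release.
import torch
import soundfile as sf
from transformers import AutoTokenizer
from parler_tts import ParlerTTSForConditionalGeneration  # assumed package

device = "cuda" if torch.cuda.is_available() else "cpu"
checkpoint = "ai4bharat/indic-parler-tts"  # assumed checkpoint ID

model = ParlerTTSForConditionalGeneration.from_pretrained(checkpoint).to(device)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# The description prompt controls speaker style, pitch, speaking rate, etc.
description = "A female speaker delivers slightly expressive speech at a moderate pace."
prompt = "नमस्ते, आप कैसे हैं?"  # Hindi text to synthesise

input_ids = tokenizer(description, return_tensors="pt").input_ids.to(device)
prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

audio = model.generate(input_ids=input_ids, prompt_input_ids=prompt_ids)
sf.write("out.wav", audio.cpu().numpy().squeeze(), model.config.sampling_rate)
```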

AI4Bharat, IISc, and EkStep Lead the Way

AI4Bharat has introduced several datasets and models, particularly aimed at speech translation. These include the BhasaAnuvaad dataset covering 13 Indian languages and the IndicConformer ASR model for 22 scheduled languages in India.

In September, the organisation launched a series of innovations looking to enhance Indian language technology, which includes IndicASR for 22 Indian languages and Rasa, which is India’s first multilingual expressive TTS dataset for Indian languages. 

Now, speech data is extremely important. Bhashini, the government-led service for Indic language technology, launched a crowdsourcing initiative called Bhasha Daan in July to collect voice and text data in multiple Indian languages, where anyone can contribute. In collaboration with Nasscom, it also launched the ‘Be our Sahayogi’ program on National Technology Day to crowdsource multilingual AI problem statements.

In September, IISc AI and Robotics Technology Park (ARTPARK) also announced that it is set to open-source 16,000 hours of spontaneous speech data from 80 districts as part of Project Vaani, under the Ministry of Electronics and Information Technology’s flagship AI initiative, Bhashini.

The ambitious project, created in collaboration with Google, aims to curate datasets of 150,000 hours of natural speech and text from approximately one million people across 773 districts in India. The first phase of the project, launched at the end of 2022, is nearing completion.

In its second phase, Project Vaani will target 160 districts, collecting 200 hours of speech data from about 1,000 people per district. So far, voice data in 58 different language variants or dialects has been gathered from 80 districts and will soon be made publicly available.

In a past conversation with AIM, Prasanta Ghosh, assistant professor in the department of electrical engineering at IISc, and leader of Project Vaani, said, “We record people with impairments and are building technology that can understand them. Maybe humans can’t, but AI would be able to.” He added that while collecting the data, he realised that there is no corpus even to build technologies for healthy people. 

“And then the first thing that came to my mind was the idea to build a good foundational model for them.” He added that the primary goal is to use this dataset as the training data for speech-to-text AI models, particularly benefiting conversational AI platforms and chatbots requiring diverse voice datasets.

Furthermore, three years ago, EkStep Foundation also open-sourced the wav2vec2 model after training it on 10,000 hours of speech data in 23 Indic languages. The Vakyansh team at the foundation was one of the first in the country to build Automatic Speech Recognition (ASR) and TTS models.

Startups Have Realised the Need

When it comes to startups, Sarvam AI and CoRover.ai have been focused heavily on building speech models. Speaking at Cypher 2024, Sarvam AI chief Vivek Raghavan demoed the speech capabilities of its AI models, leaving everyone at Cypher speechless.

The company also launched a range of products, including voice-based agents that are accessible via telephone and WhatsApp. The voice-based agents can also be integrated into an app, allowing users to communicate verbally whenever they choose.

Sarvam also released Bulbul, a multilingual TTS model in six different voices. According to the company’s website, its mission is to enable speech-first applications for India. 

Similarly, Ankush Sabharwal, CEO of CoRover.ai and the creator of BharatGPT, told AIM that the company is actively working on building voice models which can work on WhatsApp for translation and Q&A. The goal is to make it voice-to-voice.

In June, the Indian startup smallest.ai launched its TTS model, AWAAZ. With state-of-the-art Mean Opinion Scores (MOS) in Hindi and Indian English, AWAAZ can fluently converse in over ten accents, reflecting the diverse linguistic landscape of India. Most recently, the company also introduced Lightning, a TTS model capable of generating up to 10 seconds of audio within 100 milliseconds.

“When we started building, we realised that the models required for a voice bot were not mature for Indian languages. Existing models for non-English languages were nowhere close to production,” CEO Sudarshan Kamath told AIM.

Meanwhile, the demand for voice-based models is only increasing. Pragya Misra, lead (public policy and partnerships) for OpenAI in India, told AIM that the company will focus on multimodal AI going forward and that solving for Indic AI is also something it wants to prioritise. 

Since the focus is on building voice models for the Bhartiya market, Indian companies are doubling down on Indic language models. And naturally, if the future of communication is voice and the goal is to bring it to the millions of UPI users in the country, speech models are necessary.

IIT Bombay Duo Builds Platform to Create Devin-Like AI Agents Effortlessly https://analyticsindiamag.com/it-services/iit-bombay-duo-builds-platform-to-create-devin-like-ai-agents-effortlessly/ Tue, 03 Dec 2024 12:28:46 +0000 https://analyticsindiamag.com/?p=10142357

Composio can be integrated with multiple AI agentic frameworks and the company has partnered with Langchain, Llama Index, and Crew AI.

The post IIT Bombay Duo Builds Platform to Create Devin-Like AI Agents Effortlessly appeared first on Analytics India Magazine.

AI startup Composio made an early move into AI agents, offering tools and platforms to create production-ready AI applications. It recently launched AgentAuth, a product that integrates AI agents with third-party tools and APIs more efficiently. 

It supports a variety of authentication protocols, including OAuth 2.0, OAuth 1.0, API keys, JWT, and Basic Authentication. The platform also integrates with over 250 widely used apps and services, catering to diverse needs such as customer relationship management (CRM) systems and ticketing platforms.

“The biggest problem that people face while building agents is connecting them to reliable tools. For example, if someone is building a sales agent, they would need to connect it with CRMs like Salesforce, HubSpot, etc,” said Karan Vaidya, Composio chief, in an exclusive interview with AIM. 

Vaidya and his 2017 IIT Bombay batchmate Soham Ganatra cofounded Composio in July 2023 and have since raised $4 million, including from OpenAI executive Shyamal Hitesh Anadkat. Vaidya has worked at several startups, including Rubrik and Nirvana Insurance, before starting Composio. 

He believes that agents are the future, as LLMs are not improving dramatically. Vaidya explained that while LLMs like GPT-3.5 and GPT-4 have seen notable improvements, they are still limited when it comes to performing complex tasks. “The agents are the future because LLMs are getting better, but they need systems built on top of them to make them accurate and effective,” he said.

Composio can be integrated with multiple AI agentic frameworks and the company has partnered with Langchain, Llama Index, and Crew AI. “We have native integrations with all of them, so you can use Composio’s tool in conjunction with any of these frameworks,” said Vaidya. 
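To give a feel for what such an integration typically looks like, here is a hypothetical sketch that wires a Composio-style tool set into a LangChain tool-calling agent. The `composio_langchain` import, `ComposioToolSet` class, and `App.GMAIL` identifier are assumptions for illustration rather than a confirmed API.

```python
# Hypothetical sketch of plugging a Composio-style tool set into a LangChain
# agent; Composio class names, import paths, and app identifiers are assumptions.
from composio_langchain import ComposioToolSet, App      # assumed package/API
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate

toolset = ComposioToolSet()                    # assumed to read API keys from env
tools = toolset.get_tools(apps=[App.GMAIL])    # fetch authenticated Gmail actions

llm = ChatOpenAI(model="gpt-4o-mini")
prompt = ChatPromptTemplate.from_messages(
    [("system", "You are a helpful scheduling assistant."),
     ("human", "{input}"),
     ("placeholder", "{agent_scratchpad}")]
)
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

# The agent can now call Gmail actions (search, draft, send) on the user's behalf.
executor.invoke({"input": "Find yesterday's email from HR and draft a reply."})
```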

He further revealed that several startups today are using Composio to build AI agents that automate functions like sales. “AI agents are essentially trying to act as employees,” Vaidya said.

Companies are using their tool to integrate with platforms like Gmail, Calendar, Google Meet, Slack, and LinkedIn, automating tasks previously handled by humans, such as scheduling events. Composio charges customers based on the number of API calls made by LLMs. “We have a pretty good premium package where we allow, like, 10,000 API calls per month,” Vaidya added. 

It recently launched Composio SWE-kit, an open-source framework to help developers build coding agents. It helps you create PR agents that review code, suggest improvements, enforce coding standards, catch issues, automate merge approvals, and give feedback on best practices, making code reviews faster and improving quality.

Customers

Vaidya explained that one of the ways people are using Composio is to build sales agents. “Before, humans had to record calls, listen to them, and then update CRMs and create tickets for product updates or client needs. With our integrations, including Salesforce, HubSpot, Klaviyo, Pipedrive, and more, all of that is automated,” he said.

He added that early-stage startups and developers, including companies like Assista, Fabrile, and Ingram, are key customers of Composio. Recently, Microsoft, Oracle, and Salesforce have also launched their own AI agents. When asked about Composio’s unique advantage, Vaidya explained that their platform offers the flexibility to integrate multiple AI agents, freeing customers from being locked into any single ecosystem.

“Most companies don’t use just Microsoft products. They also use Salesforce and HubSpot for different needs, depending on what each company does best,” said Vaidya. He added that the company will soon launch a new product where people can build their own software engineer, like Devin.

Composio is planning to roll out a new feature that will allow two AI agents to interact with one another. Vaidya explained that the company also intends to train its own model using the structural data it has collected. “We plan to improve function-calling accuracy across the domain and introduce agent-to-agent interactions,” said Vaidya. He said that their long-term goal is to serve as the communication layer for AI agents. He mentioned that their platform is SOC-compliant.

He added that the rise of enterprises building AI agents is good news for Composio, as the company sees an opportunity to empower these businesses. “The fact that enterprises are creating their own agents is something we’re excited about because we can help them improve and scale those agents,” he concluded.

Meet the Indian AI Startup Quietly Taking Over the Enterprise World https://analyticsindiamag.com/it-services/meet-the-indian-ai-startup-quietly-taking-over-the-enterprise-world/ Thu, 28 Nov 2024 02:23:41 +0000 https://analyticsindiamag.com/?p=10141804

SUTRA yet again emerges as #1 in Indian language AI, the research report claims. 

The post Meet the Indian AI Startup Quietly Taking Over the Enterprise World appeared first on Analytics India Magazine.

Indian AI startup TWO is quietly making a mark in the fast-emerging artificial intelligence market. In an interview with AIM, founder Pranav Mistry revealed that the company generated $4 million in revenue this quarter and expects $20 million for next year.

This development comes as TWO AI’s SUTRA, a series of multilingual online GenAI models, added a new feather to its cap. The company claims it outperformed GPT-4o, Llama 3.1, and Indic LLMs, including G42’s Nanda, Sarvam’s OpenHathi and AI4Bharat’s Airavata, leading in over 14 Indian languages. 

Earlier this year, the company launched ChatSUTRA, a ChatGPT-like chatbot. Mistry shared that the platform currently has over 600,000 unique users. 

Unlike other startups, TWO AI targets only big enterprise customers instead of pursuing the consumer market. “Jio is one of our major enterprise customers, and we also work with clients like Shinhan Bank and Samsung SDS in Korea,” Mistry said. He further revealed that the company has started partnering with companies like NVIDIA and Microsoft on the technology side.

“We are targeting India, Korea, Japan, and some parts of Southeast Asia, like Vietnam, specifically the central region. APAC (Asia-Pacific) is one of the key markets that we are always going to focus on,” Mistry added. 

Recently, TWO hosted Mukesh Ambani, chairman of Reliance Industries, and Akash Ambani, chairman of Reliance Jio Infocomm, at their US office. Over a cup of tea, they discussed the evolving role of AI in India and beyond.

Without naming names, Mistry revealed that one of India’s largest banking and financial services players is among the customers the company will onboard. “Our solutions are in high demand, particularly in industries like finance, services, and retail,” he said. 

SUTRA’s business model focuses on providing high-touch, customised solutions for a select group of large enterprises. “We don’t need 100 customers,” said Mistry. “We need 10 good customers.” He explained that by using this method, they are reaching billions of customers, as these enterprises already have millions of customers. 

“The focus is to become not the OpenAI of the world, not just an application layer company, but an AI solutions company, going after large enterprises and helping them solve problems with AI,” he added. He shared his goal of following a path similar to Palantir’s.

What’s Next?

Mistry revealed that the company’s next project is predictive AI. “Predictive AI is game-changing for these data-dependent industries. From manufacturing to finance, governance, and energy sectors, everyone can really leverage the decision-making capability of forecasting,” he explained.

The model is called Sutra Predict. Mistry pointed out that it is a small model trained on trillions of data points of time-series data. “The model is small because the architecture of this one is much easier than the text-based ones, and it is already showing great results in some particular domains that our customers are already trying.”

He explained that time series predictive models are a specific type of statistical model used to analyse and forecast data points that are collected over time. They are built to identify patterns, trends, and seasonal variations within the data to make predictions about future values. 
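As a concrete, if simplistic, illustration of that idea (and not of Sutra Predict itself), the sketch below fits a classical trend-plus-seasonality model to a synthetic monthly series and forecasts the next six points using statsmodels.

```python
# Minimal illustration of time-series forecasting with trend and seasonality;
# a stand-in for the general idea, not the Sutra Predict model itself.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly series: upward trend plus a 12-month seasonal cycle.
rng = np.random.default_rng(0)
months = np.arange(48)
series = 100 + 0.8 * months + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 48)

model = ExponentialSmoothing(
    series, trend="add", seasonal="add", seasonal_periods=12
).fit()

forecast = model.forecast(6)  # predict the next six months
print(np.round(forecast, 1))
```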

Mistry explained that with the advent of transformers, models can now process and integrate any kind of data, as seen in predictive models like Google TimesFM and Amazon Chronos. Highlighting a real-world application, he shared that an EV battery diagnostic company in India is using Sutra Predict to identify fire risks by monitoring temperature and voltage fluctuations.

Tackling the GPU Challenge 

“In India, no one had access to the level of GPU clusters we needed,” Mistry said. The SUTRA team overcame this limitation by porting their models to run on CPU clusters. Despite the challenges, he said the team was able to scale and serve up to 100 billion customers. 

Moreover, Mistry shared that they were the first ones to catch on to the trend of 1-bit LLMs. 

Notably, Microsoft recently introduced BitNet.cpp, an inference framework for 1-bit LLMs, enabling fast and efficient inference for models like BitNet b1.58.

Mistry said they have successfully adapted the SUTRA model to work with 1-bit weights, allowing it to run as a lightweight model on CPUs. 
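The core trick behind so-called 1-bit (more precisely, 1.58-bit) weights is simple to sketch: each weight tensor is scaled by its mean absolute value and rounded to {-1, 0, +1}, so matrix multiplications reduce largely to additions and subtractions. The toy NumPy example below illustrates that quantisation step in the spirit of BitNet b1.58; it is a conceptual illustration, not SUTRA’s or Microsoft’s actual implementation.

```python
# Toy illustration of 1.58-bit ("1-bit") weight quantisation: weights are
# scaled by their mean absolute value and rounded to {-1, 0, +1}.
import numpy as np

def quantise_ternary(w):
    scale = np.mean(np.abs(w)) + 1e-8          # per-tensor scale
    q = np.clip(np.round(w / scale), -1, 1)    # ternary weights
    return q.astype(np.int8), scale

def dequantise(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=(4, 4)).astype(np.float32)
q, s = quantise_ternary(w)
print(q)                                   # values in {-1, 0, 1}
print(np.abs(w - dequantise(q, s)).max())  # quantisation error
```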

Moreover, in partnership with NVIDIA, the company launched SUTRA-OP, offering systems like the NVIDIA DGX (Deep GPU Xceleration) box equipped with powerful GPUs for demanding AI tasks. 

For customers requiring lighter and more cost-effective solutions, SUTRA also provides its hardware options, including SUTRA OP2, OP4, and OP8, which are available for lease. “Customers are not purchasing this, they are leasing it from us. It’s a monthly lease for both the OP and the SUTRA solutions,” said Mistry. 

The company recently launched a voice-to-voice AI model called Sutra HiFi. Using a dual diffusion transformer architecture, the model effectively separates distinct voice tones from language-specific accents, promising a better voice interaction quality.

“Sutra HiFi brings the ability to interpret conversations seamlessly in languages that we care about. Currently, it supports 12 languages that we have tested properly,” Mistry said. He suggested that Sutra HiFi can easily empower applications in India or any multilingual market while keeping the cost low and accuracy high.

Discussing Infosys co-founder Nandan Nilekani’s point of view that India should be the use case capital of AI, Mistry said that he has a slightly different perspective. “India must focus on building the fundamental AI capabilities because we don’t want to become dependent on someone else in the future, as data is one of the gold mines in AI,” he concluded.

Why the Indian Open Source Community Loves Meesho Now More than Ever  https://analyticsindiamag.com/it-services/why-the-indian-open-source-community-loves-meesho-now-more-than-ever/ Tue, 26 Nov 2024 02:30:00 +0000 https://analyticsindiamag.com/?p=10141693

“Why should we sit on top of this capability? Let others build on it,” said Debdoot Mukherjee, referring to Meesho’s intent to open-source its ML platform. 

The post Why the Indian Open Source Community Loves Meesho Now More than Ever  appeared first on Analytics India Magazine.

A few weeks ago, Meesho announced open-sourcing its machine-learning platform. The news—first covered by AIM—quickly received praise from all over the world, especially the open source community, which urged the so-called unicorns to follow suit. 

Recently, AIM caught up with Debdoot Mukherjee, the head of AI and data science at Meesho, to learn about the e-commerce company’s open source initiatives, why the company chose to give back to the community, its significance, and more.

At the NVIDIA AI summit in October, Mukherjee provided a deep dive into Meesho’s machine learning infrastructure. The ML architecture, which handles Meesho’s platform used by over 2 million sellers and 160 million plus consumers, was open sourced. 

Meesho currently uses AI and ML for catalogue listing, pricing recommendations, ranking systems, fraud detection and user engagement. 

“It is built to deliver unparalleled cost-efficiency,” said Mukherjee, referring to Meesho’s ML platform at the NVIDIA AI Summit last month. Interestingly, he also said that Meesho operates at the lowest server costs per order. 

The M in Meesho Stands for Machine Learning

Mukherjee told AIM that Meesho plans to open source its feature store first, followed by more components in the future. A feature store is a centralised storage system for all data inputs the model requires to make predictions.

The store handles millions of requests per second, and 99% of these requests are handled in under 10 milliseconds. It is also said to serve two trillion features per day. 

Further, Meesho said that it is looking to introduce feature groups on its ML platform. This essentially creates a class of similar features that can be generated, stored, and reused together. By handling these features as a single unit, significant compression is achieved due to shared redundancies. 
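A toy sketch helps make the feature store and feature group ideas concrete: features are registered in groups and fetched together per entity from a key-value store, so a ranking model gets everything it needs in one low-latency lookup. This is a conceptual stand-in, not Meesho’s implementation.

```python
# Toy feature store: features are organised into groups and fetched together
# per entity. A conceptual stand-in, not Meesho's actual implementation.
import json


class FeatureStore:
    def __init__(self):
        self._kv = {}  # stand-in for a low-latency key-value store

    def put_group(self, group, entity_id, features):
        # Storing a whole group under one key lets shared redundancies compress well.
        self._kv[f"{group}:{entity_id}"] = json.dumps(features)

    def get_group(self, group, entity_id):
        return json.loads(self._kv.get(f"{group}:{entity_id}", "{}"))


store = FeatureStore()
store.put_group("user_engagement", "user_42",
                {"clicks_7d": 18, "orders_30d": 3, "last_active_days": 1})

# At inference time, the ranking model reads the whole group in one lookup.
print(store.get_group("user_engagement", "user_42"))
```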

Moreover, its model inference serves more than 500k requests per second in prediction. Meesho has also leveraged NVIDIA’s Triton Inference server, along with Kubernetes to scale its ML models, for cost optimisation and performance efficiency. 

Mukherjee said they were able to drastically reduce the pod boot-up time from 6-7 minutes to just 10 seconds. 

Meesho uses TensorRT as the backend of the Triton Inference Server, which has reduced costs by 40%. It now takes $1.5 per 100M inferences, at a latency of 25ms. The platform used post-training quantisation techniques, which helped it achieve an eightfold reduction in costs and a fivefold improvement in latency. 
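For readers unfamiliar with the technique, post-training quantisation can be as simple as the PyTorch sketch below, which converts linear-layer weights to int8 after training; Meesho’s TensorRT pipeline is considerably more involved, so treat this only as an illustration of the general idea.

```python
# Minimal post-training (dynamic) quantisation example with PyTorch:
# linear-layer weights are converted to int8, cutting memory and latency.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 8)).eval()

quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 256)
with torch.no_grad():
    print(quantised(x).shape)  # same interface, smaller and faster weights
```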

Moreover, instead of upgrading to a more expensive GPU, Meesho increased the batch size on its existing NVIDIA L4 GPUs. All of these techniques led to a ninefold decrease in costs and a tenfold improvement in latency. 

Mee-Sho Altruistic

Mukherjee believes that Meesho has built a platform with ‘fairly niche’ capabilities that aren’t available on open source markets yet. He said that their alternatives are mostly proprietary, and startups would have to pay a hefty price if they were to use them.

“If startups start building on top of our projects, hopefully, life will be simpler for them. They will be able to take large-scale machine learning workloads to production much faster and for much cheaper as well,” said Mukherjee. 

By open-sourcing its ML platform, Meesho will benefit from community feedback while allowing open source developers to build upon its work. Mukherjee said that the feedback regarding bugs and issues will help them make the architecture more robust. 

“Whenever an open source project becomes successful, the community builds on top of it. They build capabilities and features, which can be potentially useful for us too,” said Mukherjee, who believes further development on the open source parts of Meesho’s ML infrastructure will be driven by contributions from the community. 

Not a Zero-Sum Game for Indian Startups 

When discussing open source in the Indian context, Zerodha deserves a mention. The company recently announced a $1M FOSS fund to invest in open source projects. In an interview with AIM a few months ago, Zerodha CTO Kailash Nadh said that almost every startup in India leverages the best of open source tech, but very little is being done to give back to the community. 

Expressing a similar sentiment, Mukherjee acknowledged that Meesho has used several open source technologies in its tech stack, and now wants to give back. “We benefit quite a lot from open source. A lot of it [Meesho’s tech stack] is built on top of open source projects. So, it’s time to give back to the community,” he said. 

While there’s room for more, Indian startups are certainly doing their bit to give back to the open source circle. With this move, Meesho has joined the group of Indian startups open-sourcing a part of their tech stack. The list includes CRED, Flipkart, Swiggy, PhonePe, Zomato, and Zerodha. Besides, many Indian startups have public repositories on their GitHub page. 

Zomato recently open sourced its weather monitoring system and design system Sushi. Similarly, CRED, which sells itself on design and user experience, has open sourced its neumorphic design system, NeoPop.

A public repository on Flipkart’s GitHub called RecyclerListView, with over 5,000 stars, provides a high-performance list view component for React Native and web applications. One of PhonePe’s public repositories, Mantis, is a security framework designed to automate the workflow of asset discovery and vulnerability scanning. 

Indian Developers Aren’t Using AI Tools Enough  https://analyticsindiamag.com/it-services/indian-developers-arent-using-ai-tools-enough/ Fri, 15 Nov 2024 10:49:05 +0000 https://analyticsindiamag.com/?p=10141021

Several Indian companies advise against, or sometimes even ban, the use of AI coding tools.

The post Indian Developers Aren’t Using AI Tools Enough  appeared first on Analytics India Magazine.

For 24-year-old Anurag Parekh, a senior front-end developer at Groovy Web hailing from Gujarat’s Nadiad, the promise of AI-powered coding tools often feels just out of reach. Like many developers from tier-2 cities, Parekh relies on the free version of ChatGPT to code, as premium coding tools are beyond his budget. 

“When the free version of GPT-4o mini runs out, I lose the context length I need to code effectively. That’s when I switch to the free version of Gemini,” he told AIM. Parekh added that the issue is common among developers who have graduated from tier-2 cities, as they are paid lower salaries and receive limited support from their companies to access expensive tools.

The affordability of such tools has emerged as one of the key reasons behind their low adoption in India. 

The Cost Barrier

During a discussion with AIM, Jarvislabs.ai founder Vishnu Subramanian pointed out that many university students are still using the free version of ChatGPT because of the “frugal mentality” of Indians.

For developers in the US, $20 a month seems like a decent price. For those in India, however, it comes across as a pricey deal, even if it could boost their careers 10 times faster. 

“Once I see that my mindset is pulling me backwards, I started changing it,” Subramanian said. While revealing that computer scientist Andrej Karpathy’s posts on ChatGPT, Cursor, and Claude got him interested in using these AI tools, he stressed that other Indian developers should do the same.

Why Some Developers Can’t Use AI Tools at Work

During a Reddit discussion among the Indian developer community, it was revealed that most of them still code manually despite the availability of AI coding tools for the job. While several professionals found these tools useful, a notable segment of the community was critical of them, raising concerns about over-reliance, reduced skill development, and privacy issues. 

According to a recent survey by GitHub, 56% of Indian developers said using AI tools helps them boost their chances for employment owing to the skills they develop. Moreover, around 80% of them said that the code quality has improved because of AI tools.

Despite this, several Indian companies advise against, or sometimes even ban, the use of AI coding tools, discouraging developers from adopting them. According to the survey, only 40% of companies have been promoting the use of AI coding tools.

A major reason behind Indian companies’ cautious approach to using AI coding tools is privacy. Many fear that these tools, especially those developed by large corporations, might expose sensitive or proprietary code to third parties. 

Even though AI tool providers like GitHub Copilot assert that enterprise versions don’t store proprietary code, the mistrust remains. This concern is not unfounded, as developers in industries handling confidential data may prefer to avoid tools that could potentially compromise their security.

This, coupled with the narrative claiming “companies would replace Indian software developers with AI coding tools to cut costs” has contributed to a growing sense of dislike among the developer community towards such tools.

When it comes to adoption, however, GitHub CEO Thomas Dohmke said Indian developers are one of the largest user bases of Copilot and contributors on GitHub. This is why many companies are also building coding tools that allow developers to code in natural languages, including several Indian languages.

For instance, GitHub Copilot charges $10 per month after a one-month free trial. Cursor AI is free for most use cases, but the Pro version costs $20 per month with unlimited completions and 500 fast premium requests using GPT-4 or Claude 3.5 Sonnet. Meanwhile, Claude Pro also costs $20 per month.

ChatGPT Plus, which is the paid version of OpenAI’s model, is $20 per month and Gemini Code Assist’s Standard subscription is $22.80 per user per month.

While Meta’s Code Llama is open source and free to use, it has to be implemented within the workflow after fine-tuning, which makes it a little harder for developers to adopt.

What Industry Leaders Think

Siddharth Sharma, former CTO of Shaadi.com, revealed in a post on X that he cancelled his Cursor subscription when he realised that “AI tools were generating verbose, inelegant code that was negatively influencing his thought process”. 

“In general, I found the LLM never pointed to a refactoring that would make the code less verbose. And this was painful as it happened. And it is going to be 10 times more painful in teams and large codebases and so on,” he added.

In contrast, Sergey Brin, co-founder and former president of Alphabet, believes developers are not using AI coding tools enough. 

Meanwhile, Adarsh Shirawalmath, the founder of Tensoic AI, told AIM that while his team is using such tools internally, he feels discussions about them are useless. “It’s better to stick to one tool that we know is good, such as Cursor. Though there are some better IDEs out there, I personally don’t have the bandwidth to experiment with them,” Shirawalmath added. He believes that such tools can be expensive for university students.

Adithya S Kolavi, AI researcher and founder of CognitiveLab, informed AIM that many of his peers have started using Cursor and highly recommend it. “I personally haven’t switched because I’m too comfortable with my current development setup, and I generally use Claude to generate code,” he added. 

This seems to be the case for several Indian developers who, once they start using a specific AI tool, do not experiment so quickly with others, as the cost, along with the time spent on learning them, might seem too high.

Amartya Jha, co-founder & CEO of CodeAnt AI, told AIM that tools like GitHub Copilot have seen some uptake in code generation, especially in tech-first B2B enterprises and mid-market companies. However, in India, many businesses are tech-enabled rather than tech-driven. Even within B2B or B2C, code generation tools have limited penetration.

“A common concern is measuring developer productivity gains. Since these tools often work within IDEs, management struggles to get a holistic view of productivity improvements. Plus, as team sizes scale to over 200 developers, costs skyrocket—often averaging $40,000. This price point is tough to justify given typical budgets for developer productivity tools,” Jha further said.

Risk of Over-Reliance

“Earlier I used to read so much on Stack Overflow and try out multiple methods myself…Now I just tell AI to write something for me,” said a developer. 

Several developers have expressed concerns over how over-reliance on AI tools could potentially lead to erosion of foundational programming skills, causing them to reconsider using such tools for the sake of maintaining their competency. 

Another developer highlighted the effect of extensively using AI coding assistants. “I became a terrible programmer. I suck at conditions now, I used to be super good at them, but now I make mistakes and take a lot of time to figure them out,” he added. 

This user noticed a sharp decline in his problem-solving abilities. He was heavily relying on AI to think of solutions instead of building a deeper understanding of code structure. 

Another developer highlighted the loss of “collateral knowledge”. In the past, solving one problem would lead to exposure to other related issues, which enriched developers’ coding knowledge. Now, the immediate, perfect solutions provided by AI skip that learning process, potentially stunting skill development in the long run.

In contrast, there has been rapid adoption of AI-driven code quality and security tools for two main reasons. First, these tools quantify the number of critical software bugs and vulnerabilities they prevent, which directly impacts development and auditing costs. Second, they often support on-premises deployment, making them attractive to large companies prioritising security.

A hybrid approach seems like a logical solution, combining AI assistance with traditional problem-solving methods like reading documentation or consulting Stack Overflow.

As a developer on Reddit said, “Use only these tools, and you miss much. Use none, and you miss the way. Balance, my friend, is the key to all.”

Most Indian CS Graduates Can’t Code https://analyticsindiamag.com/it-services/most-indian-cs-graduates-cant-code/ Tue, 12 Nov 2024 05:29:33 +0000 https://analyticsindiamag.com/?p=10140814

Notably, instead of developing industry-relevant skill sets, students are often forced to do research work in universities.

The post Most Indian CS Graduates Can’t Code appeared first on Analytics India Magazine.

The growing emphasis on earning a computer science degree has saturated the market, and there are not enough jobs available for graduates. While this seems like a plausible explanation, many computer science graduates also struggle with the foundational skills required to write code. This appears to be another key reason why graduates in India are not able to land jobs despite having the necessary qualifications.

This topic was further highlighted in a recent Reddit discussion, where the original poster asked if the market is actually saturated or if it is simply that CS majors can’t code. Though the answer is nuanced, job seekers have been sending out hundreds of applications and not receiving any responses. At the same time, according to them, the resumes are also not up to par.

“I have a cousin who’s in his third year of CS with no internship and always complains at family gatherings about how bad the market is. I then asked him to show me his resume… Let’s just say bro has a calculator as his project,” said the Redditor, explaining that it is indeed true that a lot of graduates are just not competent enough in coding to land CS jobs.

Curious, they gave their cousin a simple task using an API from a game they both enjoyed. Instead of implementing a few lines of code, the cousin painstakingly copied and pasted thousands of lines of JSON into ChatGPT, then attempted to break it into smaller chunks for processing. 

The actual solution shared by the Reddit user was just a few lines of JavaScript, calling attention to the inadequate practical skills among CS students.
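For context, the point of the anecdote is that a few lines of code calling the API and filtering the JSON beat pasting the payload into a chatbot. A comparable sketch in Python, with a hypothetical endpoint and field names, would look something like this:

```python
# Comparable sketch in Python: call a (hypothetical) game API and filter
# the JSON response in code instead of pasting it into a chatbot.
import requests

resp = requests.get("https://api.example-game.com/v1/items", timeout=10)  # hypothetical endpoint
resp.raise_for_status()
items = resp.json()

# Keep only legendary items and sort them by price (hypothetical fields).
legendaries = sorted(
    (i for i in items if i.get("rarity") == "legendary"),
    key=lambda i: i.get("price", 0),
)
print([i["name"] for i in legendaries[:10]])
```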

Inadequate Training in Universities

This highlights the huge gap between what universities offer CS graduates and what the market actually requires. According to a report released by Teamlease Digital, only 5.5% of Indian engineers are equipped with basic programming skills.

Moreover, according to the ManpowerGroup Talent Shortage Survey 2024, 75% of employers worldwide are struggling to fill critical positions due to a lack of qualified candidates.

According to employers, the vast majority of graduates are highly “unemployable”. Several discussions on X narrate similar stories underscoring how many recent college graduates do not have the basic coding skills for entry-level positions, and therefore, employers prefer hiring engineers holding at least two to three years of experience. 

In a post, Ratnakar Sadasyula, an IT professional and author, narrated the story of candidates demanding extremely high salaries. “Now that would not be an issue, if these people were extraordinarily brilliant, or IIT, NIT passouts,” said Sadasyula. “Most of them are from ordinary engineering colleges and, forget about being extraordinary, they are not even of decent ability (sic),” he said, adding that most did not even possess proper communication skills.

A developer told AIM that though the market is saturated, he was not able to land a job because he also lacked the relevant skill set. He explained that he had to opt for unpaid internships for more than a year, as the production-level skills the job market demands weren’t taught at university. Left with no other option, freshers have to upskill themselves and learn from courses and videos available online.

Notably, instead of developing industry-relevant skill sets, students are often forced to do research work in universities, which is usually redundant for the jobs they apply for. Most of them start learning coding only after graduating. 

The same is the case with Indian professors at universities. AIM had earlier reported that many Indian engineers believe their college professors lack the necessary expertise required to teach programming. “Most people learn what college teaches them and nothing more,” said a Reddit user during the discussion, which resonated highly among college students.

The tech world is buzzing with the idea that one doesn’t need a college degree in computer science to land a job. AI tools like ChatGPT, Claude, and GitHub Copilot can instantly answer technical questions, raising the question of whether CS majors need to memorise basic concepts. 

One developer asserted, “Memorising all this stuff [code] is just insane; you can Google it or use ChatGPT in one or two searches.” 

While one can argue that there is no need to learn programming since most of the coding in the near future will be done by AI tools like ChatGPT and others, the reality of the job market shows that these AI tools are still not advanced enough to replace the need for developers.

Nothing’s New

A simple Google search shows that this has been the case for the longest time in the Indian tech industry. In 2017, it was reported that 95% of Indian engineers couldn’t code, even as demand for these skills kept rising.

To this day, a record number of CS graduates coming out of college are not adequately trained with even the basic skills to secure a low-paying job in any of the Indian IT companies. 

Despite producing around five to 10 million STEM graduates annually, India, a populous and developing country, is running out of skilled software engineers. Instead, the country faces an oversupply of underskilled graduates, as the most talented ones move to developed nations. 

The truth is, though a qualification from a premier institute is not the only benchmark, technical knowledge is definitely necessary, and hiring without verifying it is a mistake that several Indian companies make. “A couple of freshers were recruited from CS streams of Tier 3 colleges… Nine out of 10 didn’t know how to code at all,” a developer told AIM.

Though India is seeing an increase in talent retention, there seems to be a surplus of underskilled CS graduates, who do not even know how to code.

How Microsoft’s Copilot and Meta’s Llama Turned Infosys Into an AI-first Company https://analyticsindiamag.com/it-services/how-microsofts-copilot-and-metas-llama-turned-infosys-into-an-ai-first-company/ Fri, 08 Nov 2024 13:30:00 +0000 https://analyticsindiamag.com/?p=10140699

Around 18,000 developers at Infosys have written 7 million lines of code using GitHub Copilot.

The post How Microsoft’s Copilot and Meta’s Llama Turned Infosys Into an AI-first Company appeared first on Analytics India Magazine.

Infosys, the Indian IT giant that claims to be doing incredibly well on generative AI, has begun swiftly integrating AI into its offerings. For the past two years, the company has been pushing not just its clients but also its employees to use AI tools for their work. 

For instance, Infosys developers have been using GitHub Copilot for over a year, which now has about 20,000 users generating nearly a million lines of code every few weeks.

Puneet Chandok, the president of Microsoft India & South Asia, affirmed this. “Around 18,000 developers at Infosys have written 7 million lines of code using GitHub Copilot,” he said, adding that Copilot has boosted productivity, streamlined workflows, and transformed how Infosys engages with its customers.

Addressing concerns about AI’s impact on jobs, Infosys has reassured its employees that AI would serve purely to amplify their potential. To support this vision, Infosys has provided each employee in different departments with an AI assistant tailored to their specific roles.

Meanwhile, at the Building AI Companions for India event in Bengaluru, Infosys CTO Rafee Tarafdar spoke about how Infosys was using Microsoft Copilot internally. “In 2022, we began our journey to become an AI-first company, aiming to integrate AI into every aspect of our business. This meant embedding AI into our workflows and ensuring that all employees are AI-aware, skilled, and empowered to use AI tools effectively,” he said.

Rafee Tarafdar at Microsoft Building AI Companions for India

Copilot is All You Need for Employees

“In my role, I interact with clients across various fields—blockchain, quantum computing, AI—and finding relevant information can be challenging,” said Tarafdar. This is what, he said, led the team to launch InfyMe, a personal assistant that helps employees access information quickly, improving efficiency in client interactions, powered by Microsoft Copilot. 

In addition, the leadership also prioritised continuous learning, recognising the need for its 370,000-strong workforce to stay updated. Infosys introduced a personalised learning platform called Springboard to help employees acquire new technical skills and adapt to complex scenarios, making this approach mainstream across the organisation.

“One key lesson we learned is that providing AI tools alone doesn’t drive adoption.” Hence, to fully realise the impact, Infosys trained all its employees to be AI-ready. This meant that whether they worked in finance, HR, sales, or operations, everyone was trained to effectively use AI tools and become skilled “prompt engineers”. 

Beyond this, Infosys has developed roles for AI “builders”, who create applications for its clients, and AI “masters”, who have created around 15 specialised models, including a powerful 2.5-billion-parameter model called Infosys Topaz BankingSLM, a pre-trained model for banking and IT operations that outperforms other language models on finance benchmarks.

As the demand for generative AI applications grew, Infosys built an AI infrastructure platform using Azure and its own AI cloud called Infosys Topaz. “This platform has enabled our employees to innovate at the edge, developing apps in weeks rather than months,” said Tarafdar. These applications are used not only by Infosys’ clients but also by institutions like museums, enhancing their impact.

Llama for the Clients

At Meta’s Build with AI Summit held in Bengaluru last fortnight, Infosys announced a partnership with Meta to utilise the Llama stack, a collection of open-source large language models and tools, to build AI solutions across industries.

As an early adopter of Llama 3.1 and 3.2 models, Infosys is integrating these models with Infosys Topaz, the in-house AI platform, to create tools that deliver business value. One example of such a tool is a document assistant powered by Llama that improves the efficiency of contract reviews.
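While Infosys has not shared implementation details, a bare-bones version of such a Llama-backed contract review helper could look like the hedged sketch below, which runs an open Llama 3.1 Instruct checkpoint through the Hugging Face transformers pipeline; the checkpoint choice, prompt, and overall setup are illustrative assumptions, not Infosys’ tool.

```python
# Hedged sketch of a Llama-backed contract-review helper; the checkpoint,
# prompt, and overall setup are illustrative assumptions, not Infosys' tool.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # gated checkpoint; access and GPU required
    device_map="auto",
)

clause = "The Supplier shall indemnify the Buyer against all losses ..."
messages = [
    {"role": "system", "content": "You are a contract review assistant. "
                                  "Flag risky clauses and summarise obligations."},
    {"role": "user", "content": f"Review this clause:\n{clause}"},
]

out = generator(messages, max_new_tokens=200)
print(out[0]["generated_text"][-1]["content"])  # the assistant's review
```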

“We are building industry-wide solutions. For market research, we have built a proof-of-concept and, internally, we are leveraging many Llama models for our AI-first journey,” an Infosys representative told AIM. The person further added that they have several other use cases, such as production use cases and document summarisation.

A Lot More to Come

In its latest Q2FY25 earnings call, Infosys yet again emphasised its dedication to generative AI but shied away from spilling the revenue details. On the brighter side, the company has finally revealed that it’s working on small language models for its clients for various applications. 

“It’s an incredible approach that leverages various open-source components, along with a narrow set of industry data and Infosys’ proprietary dataset,” said Salil Parekh, CEO and MD. 

“It’s a differentiated strategy, and although we haven’t shared much yet, we are already having promising discussions with clients. The work has begun, and we’re integrating generative AI deeply across key areas,” he added.

Parekh also shared a notable case involving building a multi-agent framework for a client, where the agents handle specific business processes almost entirely on their own. “We are building enterprise generative AI platforms and multi-agent frameworks for clients,” he said.

NVIDIA Coming in

Besides Microsoft and Meta, Infosys has also partnered with NVIDIA to incorporate NVIDIA AI Enterprise into its Infosys Topaz suite to enable businesses to rapidly implement and integrate generative AI into their workflows. 

Meanwhile, TCS is creating AI solutions using NVIDIA’s NIM Agents Blueprints for sectors including telecommunications, retail, manufacturing, automotive, and financial services. Its offerings feature NeMo-powered, domain-specific language models that can handle customer inquiries and respond to company-specific questions across all enterprise functions, such as IT, HR, and field operations.

Wipro, on the other hand, with its AI-powered consulting and extensive employee reskilling efforts, is looking to build an “AI-powered Wipro” that drives efficiency and transformation. “Net-net, I think GenAI will be positive for us and for the industry,” said Srini Pallia, CEO and MD at Wipro, adding that they are investing big into GenAI. 

“We have now trained and certified over 44,000 employees on advanced AI, and we also have a significant number of employees actively using AI developer tools across the company for all our clients,” said Pallia.

Moreover, Tech Mahindra recently launched Indus 2, a Hindi-centric AI model powered by Nemotron-4-Hindi 4B and targeting local language engagement. The company has reskilled 45,000 employees, supporting its AI roadmap through an internal proficiency framework.

Looking ahead, Infosys aims to foster a collaborative working model where AI re-engineers processes and automates tasks, creating a future where humans and AI work seamlessly together. “Our next steps focus on building agentic systems and AI ‘workers’ to drive productivity and allow us to reimagine how we work,” concluded Tarafdar at the Microsoft summit.

The post How Microsoft’s Copilot and Meta’s Llama Turned Infosys Into an AI-first Company appeared first on Analytics India Magazine.

]]>
When Mustafa Suleyman was in Bengaluru https://analyticsindiamag.com/it-services/when-mustafa-suleyman-was-in-bengaluru/ Thu, 07 Nov 2024 11:48:47 +0000 https://analyticsindiamag.com/?p=10140563

Suleyman referred to Microsoft’s AI teams in India as the strength of the company.

The post When Mustafa Suleyman was in Bengaluru appeared first on Analytics India Magazine.

]]>

After Meta’s chief AI scientist Yann LeCun and NVIDIA CEO Jensen Huang visited India last month, Mustafa Suleyman, CEO of Microsoft AI, decided it was high time for Microsoft AI to make its mark on the country’s wider audience. Suleyman showcased Copilot’s prowess.

Speaking of AI companions, Suleyman said that before visiting India for the first time, he asked Copilot about the weather and GDP of Bengaluru and was quite surprised by its response.

The Building AI Companions for India event featured a Cafe Copilot, with an entire food menu designed by Copilot, and the band RaagaTrippin’, which used Copilot as a member to generate lyrics on stage. Against this backdrop, Suleyman and the Microsoft team demonstrated why it is important for India to build its own AI models in the era of agentic AI while also finding the right use cases.

Puneet Chandok, president of Microsoft India and South Asia, highlighted how Microsoft is helping Indian companies build AI products. “Around 18,000 developers at Infosys have written seven million lines of code using GitHub Copilot,” said Chandok, while adding that Cognizant is a massive user of Copilot.

Notably, HCLTech’s AI Force platform and Genpact’s AI Guru have been infusing Microsoft AI into their services, streamlining software and code development while also teaching people about AI.

This is when Chandok invited Suleyman on stage to talk more about the upcoming Copilot products that are going to change how Indians communicate with AI companions. 

Building AI in India is Crucial

Suleyman referred to Microsoft’s AI teams in India, especially those in Bengaluru and Hyderabad, as the strength of the company. According to him, these are where talented engineers and developers work on every layer of the company’s tech stack.

“AI is going to put knowledge at everyone’s fingertips, synthesised, distilled and personally tuned to the way you want to learn and use information,” he said, highlighting AI’s potential to democratise knowledge, making it accessible across work environments and enabling informed decision-making. 

In a fireside chat with S Krishnan, secretary of the Ministry of Electronics and Information Technology (MeitY), Suleyman discussed Microsoft’s collaborative efforts with specialists from diverse fields to build AI systems that resonate with human values. Talking about the IndiaAI Mission and Bhashini, Krishnan said that, originally, they had envisioned building an LLM on their own. “Now that we are actually having a meeting, it may not be worth the effort of building an entire LLM on our own,” he added. 

It might be better to adapt AI systems so that they are practical in the Indian context and useful for specific sectors, grounded in realities on the ground. This resonates with the idea of Adbhut India, or making India the AI use case capital of the world.

“We try to converse in 22 different Indian languages… Voice really is the ultimate way to make these tools accessible,” Krishnan pointed out. 

Suleyman’s response to this was slightly different from what was expected. Reflecting on DeepMind’s inception in 2010, he said, “Timing is everything… It’s critical that you get the timing right.” He added that India is one of Microsoft’s fastest-growing markets and has one of its strongest R&D teams globally. 

“I really feel that now is the right time to create these new models. All of the resources are now widely available. The APIs are brilliant. There are open-source models surfacing everywhere. And so it just feels like a very creative moment. And I’m excited to see so many startups and new businesses really experiment with this stuff,” Suleyman further said.

From Pessimist to Optimist

During the event, Suleyman also showcased the redesigned Microsoft Copilot, which now includes a conversational human-like voice, screen-viewing capabilities, and advanced reasoning functions. These innovations aim to transform daily user interactions with technology. 

He described the recent strides in AI as “the first step towards distilling intelligence into algorithmic constructs”.

After co-founding GenAI startup Inflection in 2022, Suleyman and co-founder Karén Simonyan joined Microsoft in March 2024 following the acquisition of key Inflection AI team members. Surprisingly, he has always taken a critical approach towards AI, highlighting the dangers of the technology. 

But, lately, that has changed. “For too long, software has principally been utilitarian. My personal vision for AI has always been about how it can be a companion that can make each and every one of us feel more supported and smarter and more capable,” he said. 

Addressing the fear of job losses and regulations around AI globally, Suleyman explained that it is as important to be attentive and thorough in observing the potential upside as it is to meditate on and think deeply about the potential downsides. 

“It won’t just be able to talk to you. It’s going to be able to use APIs. It is going to be able to search through databases. It will generate new programs from scratch. It is going to be able to get things done in the digital world on a large scale. That’s actually what is coming in the next few years, more than anything else,” he said.

Suleyman said it’s time to be thoughtful and deliberate and not treat AI regulation as a taboo. He believes that nations need to start having an important dialogue around this.

Giving the example of drones, he said that regulation can be effective. “We don’t have drones flying around randomly all over the world with autonomous capabilities. We were deliberate and proactive and careful, and I think that’s just the approach that we have to take,” he reiterated.

The post When Mustafa Suleyman was in Bengaluru appeared first on Analytics India Magazine.

]]>
India’s Digital Infrastructure is Changing Super Fast with AI https://analyticsindiamag.com/it-services/indias-digital-infrastructure-is-changing-super-fast-with-ai/ Thu, 31 Oct 2024 07:36:38 +0000 https://analyticsindiamag.com/?p=10139939

AI is transforming India’s digital infrastructure and reaching the farthest of underserved communities.

The post India’s Digital Infrastructure is Changing Super Fast with AI appeared first on Analytics India Magazine.

]]>

AI is transforming India’s e-governance landscape and reaching the farthest of underserved communities. From processing numerous queries round the clock to enabling bilingual support, advancements in AI are making government initiatives more accessible and efficient than ever.

“Our AI systems now process between 5 to 7 lakh queries every month, operating 24/7, which is crucial in ensuring millions of citizens get timely assistance,” said Sharmishtha Dasgupta, deputy director general of the National Informatics Centre, during the Nvidia Summit held in Mumbai last week.

She went on to mention that these queries range from enrollment and eligibility checks to updates and troubleshooting. 

The system’s capacity to manage such high volumes highlights its scalability and efficiency, as well as its alignment with the Digital India campaign’s goal of making government services accessible to every citizen.

Dasgupta also noted that AI-powered bilingual support systems have been functional in bridging linguistic and digital divides.

“Bilingual AI-powered support systems are making interactions with government schemes like PM Kisan Samman Nidhi Yojana straightforward, reducing complexities and ensuring citizens can engage in their preferred language,” she added.

For instance, IRCTC, an arm of Indian Railways, is also using AskDISHA 2.0, a conversational AI chatbot, to help customers book railway tickets through voice, chat, and click-based operations.

“In our constant pursuit to enhance the user experience, leveraging new age technologies, today we are taking a giant leap. Now passengers can book train tickets in a conversational manner, leveraging our AI Virtual Assistant, AskDISHA 2.0, powered by CoRover Conversational AI platform,” Rajini Hasija, CMD, IRCTC said.

During the Nvidia Summit, Tanusree Barma, deputy director general of the Unique Identification Authority of India (UIDAI), highlighted the transformative AI developments within UIDAI, as the organisation moves towards adopting and using indigenous AI capabilities. 

Barma stated, “Within UIDAI, we have been silently having an AI revolution to enable indigenous implementation of LLMs as well as AI models to apply for biometric duplication and detection.”

Barma also talked about the critical role that AI now plays in the nation’s digital identity framework, emphasising how these AI models help strengthen security, improve accuracy and drive innovation for the UIDAI.

She further said that by focusing on homegrown AI solutions, UIDAI not only aims to reduce dependency on foreign AI technologies but also ensures data sovereignty and control, a major step in India’s broader AI ambitions.

Innovative Case Studies

Manohar Paluri, VP of AI at Meta, recently told AIM, “India is possibly among the top three in terms of Llama downloads and variants, and it’s also among the top two in terms of how many developers there are. The appetite for technology and the appetite of people adapting to new technology in this part of the world is amazing.”

Paluri talked about one such case study: Pratham, a nonprofit focused on education.

“Pratham is an example of how this technology is being used to teach kids in an affordable way. So, you are scaling yourself where these chatbots actually can help you learn about a particular subject very quickly,” he added.

He explained how farmers can now use this technology in their native language and get the best information on agriculture and financial assistance, which was previously not feasible or accessible.

“So when you think about learning, it is going to be redefined now with technologies like Llama as well as MovieGen,” Paluri added.

He also discussed the benefits of an open ecosystem, where people can take the model and fine-tune it as per their needs.

“I was actually in Japan two weeks ago. There were Japanese versions of Llama, where they were able to tune it,” he said. Paluri also mentioned a Korean version and a Korean high school math version of Llama. 

“We are trying to bring high-quality Indian tokens into Llama, so that it works for Indian languages,” said Paluri.
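
The exact recipes these teams use are not public. As a rough sketch of the usual approach, an open-weight Llama checkpoint can be adapted to a local language with parameter-efficient (LoRA) fine-tuning via the Hugging Face trl and peft libraries; the corpus file, model ID and hyperparameters below are placeholder assumptions.

# Rough sketch of parameter-efficient (LoRA) fine-tuning of an open-weight
# Llama checkpoint on a local-language corpus. The corpus file, model ID and
# hyperparameters are placeholder assumptions, not Meta's or AI4Bharat's recipe.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Assumed local corpus: a JSONL file with one {"text": "..."} record per line.
dataset = load_dataset("json", data_files="hindi_corpus.jsonl", split="train")

# Train small low-rank adapter matrices instead of all base-model weights.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model="meta-llama/Llama-3.1-8B",  # gated open-weight base checkpoint
    train_dataset=dataset,
    peft_config=peft_config,
    args=SFTConfig(
        output_dir="llama-hindi-lora",
        per_device_train_batch_size=1,
        num_train_epochs=1,
    ),
)
trainer.train()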

He also mentioned that Llama, being the engine for Meta AI, will work for Indian languages. This will help the vast number of people in India who use Meta AI on WhatsApp and other Meta products.

Future Prospects

Meanwhile, BharatGen, India’s first government-backed multimodal AI initiative, has released e-vikrAI, an advanced solution powered by vision language models and tailored for product images in Indic e-commerce.

Spearheaded by IIT Bombay under the National Mission on Interdisciplinary Cyber-Physical Systems (NM-ICPS) of the Department of Science and Technology (DST), BharatGen focuses on creating AI models tailored to India’s diverse languages and cultural contexts.
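
The models behind e-vikrAI are not described here, so as a generic illustration of what a vision language model does with a product image, the snippet below captions a photo using an openly available captioning model; the model choice and image path are placeholder assumptions and not part of BharatGen’s stack.

# Generic illustration of a vision-language task on a product image, using an
# openly available captioning model. The model and image path are placeholder
# assumptions, not BharatGen's e-vikrAI stack.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

# Any local product photo (or an image URL) will do; this path is a placeholder.
caption = captioner("saree_product_photo.jpg")[0]["generated_text"]
print(caption)  # a short English description that a seller could refine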

The post India’s Digital Infrastructure is Changing Super Fast with AI appeared first on Analytics India Magazine.

]]>
NVIDIA, Meta Push Indian IT to Aim High on GenAI, Finally!   https://analyticsindiamag.com/it-services/nvidia-meta-push-indian-it-to-aim-high-on-genai-finally/ Fri, 25 Oct 2024 14:05:51 +0000 https://analyticsindiamag.com/?p=10139449

As an early adopter of Llama 3.1 and 3.2 models, Infosys is integrating these models with Infosys Topaz, the in-house AI platform, to create tools that deliver business value.

The post NVIDIA, Meta Push Indian IT to Aim High on GenAI, Finally!   appeared first on Analytics India Magazine.

]]>

Indian IT is finally stepping up its generative AI game. At Meta’s Build with AI Summit in Bengaluru held on October 23, Infosys announced a partnership with Meta to utilise the Llama stack, a collection of open-source large language models and tools, to build AI solutions across industries.

As an early adopter of Llama 3.1 and 3.2 models, Infosys is integrating these models with Infosys Topaz, the in-house AI platform, to create tools that deliver business value. One example is a document assistant powered by Llama that improves the efficiency of contract reviews.

“We are building industry-wide solutions. For market research, we have built one proof-of-concept (POC), and internally we are leveraging many Llama models for our AI-first journey,” an Infosys representative told AIM. The person added that the company has several other use cases, including document summarisation, some of which are already in production.

This comes after Infosys revealed that it is working on small language models for its clients’ various applications. During the recent earnings call, Infosys CEO and MD Salil Parekh said, “It’s an incredible approach that leverages various open-source components along with a narrow set of industry data and Infosys’ proprietary dataset.”

Interestingly, in March, Yann LeCun, Meta’s chief AI scientist, disclosed that he had met an Infosys co-founder who was funding a project based on Llama 2, Meta’s open-source model, so that it could recognise all 22 official Indian languages. 

Moreover, Infosys has strengthened its partnership with Meta and established a Meta Center of Excellence (COE) to accelerate the integration of enterprise AI and promote contributions to open-source communities. The COE will cultivate expertise in the Llama stack, develop industry-specific applications, and facilitate customer adoption of generative AI. 

Beyond Meta, Infosys has also partnered with NVIDIA. The company is incorporating NVIDIA AI Enterprise into its Infosys Topaz suite to enable businesses to rapidly implement and integrate generative AI into their workflows. 

Infosys has also launched an NVIDIA Center of Excellence dedicated to employee reskilling, solution development, and the widespread adoption of NVIDIA technology across organisations.

How Competitors are Leveraging AI

Besides Infosys, NVIDIA is also supporting other Indian IT companies. Tata Consultancy Services is creating AI solutions using NVIDIA’s NIM Agents Blueprints for sectors including telecommunications, retail, manufacturing, automotive, and financial services. 

Its offerings feature NeMo-powered, domain-specific language models that can handle customer inquiries and respond to company-specific questions across all enterprise functions, such as IT, HR, and field operations.
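
NIM microservices expose an OpenAI-compatible API, so once such a domain model is deployed it can be queried with the standard openai Python client. The sketch below is illustrative only; the endpoint, served model name and sample HR question are assumptions, not TCS’ actual deployment.

# Illustrative sketch: querying a locally deployed NIM microservice through its
# OpenAI-compatible API. Endpoint, model name and the HR question are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local NIM container
    api_key="not-used-for-local-deployments",
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # whichever model the NIM container serves
    messages=[
        {"role": "system", "content": "You answer employee HR policy questions."},
        {"role": "user", "content": "How many days of parental leave am I entitled to?"},
    ],
    max_tokens=200,
)
print(response.choices[0].message.content)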

In Q2 FY25, TCS reported over 600 GenAI engagements, a significant increase from nearly 270 last quarter. “Last quarter, we had eight engagements that went into production. This quarter, we have almost 86 engagements going into production,” shared TCS chief K. Krithivasan, noting that it’s a sign of maturity.

Wipro, on the other hand, with its AI-powered consulting and extensive employee reskilling efforts, is looking to build an “AI-powered Wipro” that drives efficiency and transformation. “Net-net, I think GenAI will be positive for us and for the industry,” said Srini Pallia, CEO and MD at Wipro, adding that they are investing big into GenAI. 

“We have now trained and certified over 44,000 employees on advanced AI, and we also have a significant number of employees actively using AI developer tools across the company for all our clients,” said Pallia.

Wipro is applying NVIDIA AI Enterprise software, which includes NIM Agent Blueprints and NeMo, to assist businesses in crafting custom conversational AI solutions, such as digital humans, for customer service engagement.

Meanwhile, Tech Mahindra recently launched Indus 2, a Hindi-centric AI model powered by Nemotron-4-Hindi 4B and targeting local language engagement. The company has reskilled 45,000 employees, supporting its AI roadmap through an internal proficiency framework.

As Indian IT ramps up its GenAI capabilities, companies are eyeing new possibilities to integrate these advanced tools across verticals. With strong partnerships and investments, the industry is set for a rapid shift in enterprise AI adoption across the country.

The post NVIDIA, Meta Push Indian IT to Aim High on GenAI, Finally!   appeared first on Analytics India Magazine.

]]>
Yann LeCun Wants India to Make Local and Global AI Products https://analyticsindiamag.com/it-services/yann-lecun-wants-india-to-make-local-and-global-ai-products/ Thu, 24 Oct 2024 12:06:33 +0000 https://analyticsindiamag.com/?p=10139326

“Going forward, India has an important role to play, not just in technology and product development, not just for local products, but for international products and also for research,” said Yann LeCun.

The post Yann LeCun Wants India to Make Local and Global AI Products appeared first on Analytics India Magazine.

]]>

“You guys are building the future here,” Yann LeCun, Meta’s chief AI scientist, remarked as he photographed the crowd with his Meta Ray-Ban smart glasses at the Meta Build with AI Summit held in Bengaluru on Wednesday. He was referring to the country’s talent pool working on open-source initiatives. 

LeCun highlighted how diverse contributions enhance the development of AI, noting the significant impact of open-source initiatives in India driven by the likes of Sarvam AI and AI4Bharat. 

He said that while promoting open source serves Meta’s interest, he was fascinated by how people were developing products that even Meta hadn’t envisioned. “I see how people try to use the technology with good open-source data.” He acknowledged the work of AI4Bharat, a research lab at IIT Madras which works on developing open-source datasets to use AI for Indic languages. 

“Going forward, India has an important role to play, not just in technology and product development, not just for local products, but for international products and also for research,” he said, adding that India is brimming with talent.

He was quick to draw parallels to the establishment of FAIR Paris 10 years ago. “I think we can sort of use this blueprint to accelerate the progress of AI in India even further.” 

The day before, LeCun visited IIT Madras and AI4Bharat, and was impressed by projects that focus on language translation and cultural preservation, such as IndicTrans2 and others. “AI is going to help inspire,” he remarked, particularly in regions with many languages and cultural complexities.

Speaking at IIT Madras, LeCun highlighted that the current language models might seem like they are reasoning, but are, in fact, only carrying out intelligent retrieval, and are not the pathway to human-level intelligence.

LLMs are Not All We Need

To reach the next level of AI, what LeCun calls autonomous machine intelligence, or AMI (which is also the French word for “friend”), we need systems that can truly help people in their daily lives. This involves developing systems that can understand cause and effect and model the physical world. 

LeCun used the example of a child who can figure out how to load a dishwasher on their first attempt—something no AI has yet been able to achieve. We need systems that can learn and adapt to new tasks with common sense. This has been LeCun’s vision for autonomous machine intelligence for the longest time.

Following up on his long debate about LLMs, one of the key challenges that LeCun highlighted about current AI models is that even the most powerful ones have seen less data than a human child absorbs in the first few years of their life. 

“We need to train AI to understand the world by observing it, just like humans do. This requires new architectures that move beyond today’s models.” LeCun emphasised focusing on objective-driven AI instead of prompt-based systems, as the former would be able to solve problems that current AI models can’t.

“These systems will be able to plan actions and predict outcomes, and they’ll do so while adhering to built-in safety measures, or ‘guardrails’. This approach not only makes AI more powerful but also ensures it operates within safe boundaries,” said LeCun.

Local to Global and Decentralised

LeCun envisions AI as a shared infrastructure that democratises access to knowledge. He articulated a future where “the big frontier AI systems will not be produced by smaller companies… but will be trained in a distributed fashion”. This approach would allow for localised data processing while maintaining a global consensus.

“Open-source technology is crucial today and will become even more so in the future. This is because AI is increasingly becoming a shared infrastructure that can be used globally,” LeCun said and added that for AI to reach its full potential, it needs to serve as a repository of human knowledge. 

While we’re attempting to achieve this now, we’re limited by a lack of diverse data in terms of languages, cultures, values, and interests. 

Left to right: Tanuj Bhojwani, Yann LeCun, and Nandan Nilekani

While speaking on the panel with Infosys co-founder Nandan Nilekani and People+ai head Tanuj Bhojwani, LeCun shared the idea of building India as the AI use case capital of the world. Nilekani said, “We think that we can build on top of digital public infrastructure.” He said this provides the foundation to move to AI much more rapidly, a concept he previously referred to as “DPI to the power of AI”.

Centralising all this data isn’t feasible, nor would it fully represent the vast range of human knowledge. LeCun underscored the importance of collaboration among governments, industries, and researchers to establish the necessary infrastructure for this vision. “There’s a lot of work to do,” he acknowledged, stressing the need to empower individuals, especially in underserved areas.

Even if we could access the necessary data, fine-tuning these AI systems would require input from people with diverse cultural and linguistic backgrounds. No single organisation or country can do this alone; it must be a collective global effort. In this sense, AI development needs to be transparent and decentralised.

India will need to invest in infrastructure, expertise, and policies that support such a decentralised approach to AI. Rather than seeing it as a threat to sovereignty, this would enable true global sovereignty over knowledge and technology.

“I see a future where we’ll all have smart glasses or devices that allow us to interact with AI assistants,” said LeCun. These systems could eventually be smarter than us, but LeCun said that we shouldn’t see that as threatening. Instead, it’s empowering—like having a team of experts at our disposal. 

Everyone, not just people in tech or academia, should have access to these tools. “Imagine rural communities in India being able to ask questions in their own languages, improving sectors like agriculture and healthcare. This kind of future would bring significant positive changes,” added LeCun.

The post Yann LeCun Wants India to Make Local and Global AI Products appeared first on Analytics India Magazine.

]]>