We consider three layers when we talk about generative AI: AWS’ Anupam Mishra
Amazon's cloud unit, AWS, recently made a slew of announcements, predominantly centred around artificial intelligence, aligning with the prevailing tech trends. A notable highlight was the introduction of Q, AWS' proprietary chatbot, trained on the company's 17-year knowledge base, which goes beyond just conventional question-answering capabilities. TechCircle spoke to Anupam Mishra, the Head of Technology and Solution Architecture for Commercial Sales at AWS India and South Asia, about use cases of generative AI, cloud adoption, India's market dynamics, and AWS' escalating investments in the region. Edited excerpt:
AWS announced a mega-investment of $12.7 billion in India by 2030, following an earlier investment of $3.7 billion (between 2016 and 2022). How is the bet on India panning out?
We invest in data centres, people on the ground, partnership programs, and startups. Our philosophy is to think in terms of long-term commitment to the country and its potential. Companies ranging from startups to large enterprises are moving to the cloud and adopting newer technologies like generative AI, and we want to be the platform of choice for these customers, driving the next wave of innovation. Currently, we have several customers benefitting from some of the investments we have made, including brands like Axis Bank, Ashok Leyland, Broadridge, and Titan.
As per data collated by VCCEdge (TechCircle’s sister unit), AWS’ net sales grew 43% in India in FY23. What are the factors driving such growth?
I cannot comment on the numbers, but I can tell you what fuels our growth in India. The country has a thriving startup ecosystem and one of the world's largest small to medium business (SMB) ecosystems. Enterprises within the country are swiftly embracing innovation, recognising the importance of leveraging data for strategic advantages. Furthermore, there exists a cohort of Indian companies that are catering to not just local but global demands. Their modus operandi revolves around leveraging the cloud to deliver high-efficiency customer-centric solutions, which puts us in a beneficial position.
What are the dominant trends in enterprises’ adoption of cloud?
Enterprises have realised that moving fast is the only way to keep innovating and delivering what their customers need. To this end, they are doubling down on faster time to market and on better leveraging their data and analytics. Enterprises are also focusing on being ready for the next wave of innovation. All this is facilitated by the cloud, which allows quicker innovation along with the capability to scale up and down.
Moving from cloud to the hottest tech of the year — how are you approaching generative AI offerings to your customers?
We consider three layers when we talk about generative AI.
The bottom layer is making generative AI very cost-efficient. To do so, we are investing in creating our own hardware, including Inferentia and Trainium. These chips bring down costs considerably: Inferentia, for instance, offers up to 4x higher throughput and up to 10x lower latency, all at 40% better price performance.
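(For illustration only, a minimal sketch of how a customer might target these chips from PyTorch using the AWS Neuron SDK's torch_neuronx API; the model, input shape, and file name below are placeholders, and the code assumes a Neuron-enabled Inferentia2/Trainium instance with the SDK installed.)

```python
# Hypothetical sketch: compiling a PyTorch model for AWS Inferentia/Trainium
# with the AWS Neuron SDK (torch_neuronx). Model and input shapes are
# placeholders; real workloads would substitute their own models.
import torch
import torch_neuronx
from torchvision.models import resnet50

model = resnet50(weights=None).eval()      # any torch.nn.Module can be traced
example = torch.rand(1, 3, 224, 224)       # example input used for tracing

# Compile the model ahead of time for the Neuron accelerator.
neuron_model = torch_neuronx.trace(model, example)

# Inference then runs on the accelerator like a normal module.
output = neuron_model(example)

# Save the compiled artifact for reuse without recompiling.
torch.jit.save(neuron_model, "resnet50_neuron.pt")
```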
Next is the middle layer, where we are working towards making generative AI application development easier. To this end, we have made Amazon Bedrock, a fully managed and serverless service, generally available. It offers leading foundation models from providers like Hugging Face and Anthropic, along with our homegrown Amazon Titan foundation models. Customers can choose the model best suited to their business.
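(For illustration only, a minimal sketch of what invoking a Bedrock-hosted model looks like through the AWS SDK for Python; the region, model ID, request body, and response parsing below assume Amazon's Titan Text model, and other providers on Bedrock use their own schemas.)

```python
# Hypothetical sketch: calling a foundation model on Amazon Bedrock via boto3.
import json
import boto3

# Bedrock inference goes through the "bedrock-runtime" client.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in the Titan Text format; other model families differ.
body = json.dumps({
    "inputText": "Summarise the benefits of a managed foundation-model service.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
})

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # swap in another provider's model ID as needed
    body=body,
)

# The response body is a stream of JSON; Titan returns a list of results.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```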
As for the third and top layer, we believe in making generative AI useful to customers without requiring an expert level of knowledge about the technology. We are building tools that add to the customer experience; one example is our AI coding assistant, CodeWhisperer, which supports developers in up to ten programming languages.
We believe all three of these layers add big differentiators for businesses. They can now offer services to their customers at a lower cost and with simpler operations and a smaller upfront investment, while also giving their developers tools that increase efficiency.
In your experience of working with a plethora of customers, what are the pain points they face in generative AI adoption?
For companies looking at adopting generative AI solutions, the most important task is working backward from the customer. You don’t want to build an application without knowing what challenge you are solving and how it will ultimately benefit the organisation. Getting the goal in order is one of the basic challenges.
Next comes getting your data house in order. A lot of companies still have fragmented or siloed data. While generative AI technology is very powerful, it rests on a foundation of clean, good data. Hence, companies need to get their data foundation right, improve its accessibility and availability, and form a data strategy that connects these siloed data sources together.
Lastly, this is a fast-evolving ecosystem; every week there are new developments and updates in this space. So, investing in upskilling people is core to being successful with generative AI. From AWS’ end, we have trained up to 4 million people in AI and other related technologies since 2017.