Generative AI - The Current Scenario & Future Prospects


Ganesh holds a B.Tech degree in Computer Science from Jawaharlal Nehru Technological University and has over 16 years of experience in DevOps, Information Security, and Application Performance Management. Prior to joining New Relic in 2021, he handled key responsibilities at Wipro Technologies, SapientNitro, SapientRazorfish, and Zest Money.

Siliconindia recently got a chance to interact with Ganesh Narasimhadevara, Principal Technologist, APJ, New Relic, wherein he shared his insights on the current user adoption of Generative AI and its future use cases. Below are the excerpts from the exclusive interview.

What are your thoughts on the current Generative AI scenario?

Businesses across many industries around the world are wholeheartedly embracing Generative AI to streamline operations and enhance productivity. The scenario is no different in India; every Indian IT organization is striving to meet its end users’ expectations and find new ways to serve them better each day. Generative AI presents a very compelling opportunity to support these goals if applied correctly. Thus, companies that envision faster and more efficient implementations are choosing to partner with hyperscalers instead of building their own solutions. In terms of availability, Microsoft’s Jugalbandi has been able to combine ChatGPT and Indian language translation models to democratize Generative AI by making it accessible to every Indian, no matter what language they speak. This has the potential to make Generative AI a part of the daily lives of the Indian population by breaking down the language barriers that currently exist.

Briefly explain a few use cases of Generative AI for Indian businesses.

Jugalbandi is a groundbreaking solution for government initiatives and public programs alike, as it creates greater accessibility across the country, but that’s just the first use case. Imagine how powerful this could be in India if extended to healthcare or banking. With a population of 1.4 billion people and a high availability of skilled workers, access to a solution like ChatGPT will no doubt make people even more productive in building products and solutions that cater directly to the needs of the community. This is an incredible opportunity to build meaningful and innovative products that improve lives.

By integrating Generative AI into existing processes, businesses can unlock significant bandwidth and enhance their operations. This frees up their workforce to experiment with new ideas and technologies, and ultimately find innovative solutions to business problems. Additionally, Generative AI-powered observability assistants can make it easy for engineers to find the root cause of issues and fix errors in code without needing to sift through mountains of data. The conversational nature of such a solution also enables engineering teams to place queries with ease and gain valuable insights into the state of a system or app simply by asking the right questions. This improves the quality of work and helps engineers work more efficiently. However, while integrating Generative AI into their processes, businesses need to be mindful of not becoming completely dependent on it. Generative AI should be complementary to the core business, as a certain level of human reasoning will always need to be applied.

Enlighten us about the impact Generative AI & observability can have on data security.

Similar to Generative AI, observability platforms become more powerful when they are fed quality data. Without quality inputs or robust training, you’re likely to get substandard results. Although they require a lot of data to be effective, observability platforms by design do not require sensitive information to operate. As a result, there’s really no incentive or requirement to feed sensitive data into a platform’s integrated Generative AI assistant. However, due to strict rules and regulations around sensitive data storage, it is critical that the necessary steps are taken to secure and protect business and customer data.

Large Language Models (LLMs) learn from the information that they are fed. So by feeding sensitive data into an AI assistant that utilizes LLMs, you’re essentially offering that sensitive data as learning material. The key takeaway here is not to feed any data into the system that you wouldn’t want the system to learn from – and that is the case for all types of sensitive data.

How do you expect the Generative AI segment to evolve in the near future?

The ability to further enhance customer or end-user experience by leveraging Generative AI that is integrated into an all-in-one observability platform is going to be game-changing. With a Generative AI assistant as part of the user interface, interactions with such platforms are going to be more accurate because all data will reside in one place rather than being siloed across multiple tools, resulting in context-driven AI assistance. This also means that the assistant is continually learning from all the data it is fed, making it a much more accurate system that continues to evolve. We have a high level of expertise in software development in India, but the industry is still suffering from a shortage of skilled workers. Supplying quality data to integrated AI assistants is going to free up a lot of time for IT professionals and give them the ability to focus on strategic and creative work.

Another powerful aspect is that of tribal knowledge. Traditionally, this is something that is lost once a tenured employee chooses to leave a business. In the near future, a Generative AI assistant that is integrated into an observability platform, like New Relic Grok, will be able to store tribal knowledge as it learns and make it available to everyone in the business. The assistant will have a complete understanding of all data in the context of where the user is at that point in time. Engineers new to an organization, or even those new to a specific observability platform, can reap massive advantages from this, as they can simply type in their question and receive a reasonably accurate answer – providing a good starting point and an incentive to continue exploring the platform and, in time, master it.