Data integration is becoming a strategic issue for senior IT leaders with the emergence of the Cloud and Big Data. Our top priorities continue to be helping companies take advantage of the cloud, mobility, social data and Big Data, and helping them manage data as a true asset. This has meant a broadening of traditional data integration techniques, with enterprises adopting Integration as a Service, agile business intelligence, Data Virtualization, Data Governance and Big Data Integration.
I am looking at how Informatica’s technology, based on its unique Virtual Data Machine, can add value to and accelerate these trends. These have been our priorities for a few years now: we released our cloud offering in 2005, and Informatica 9, which we released in 2009, is the foundation that lets us design in Informatica and run our platform directly on a Hadoop cluster. My role has evolved over time; with the number and velocity of the industry trends that affect data accelerating, my focus is now much more on emerging customer needs.
Trends developing in the industry
Social media, mobile and cloud are the big trends in IT right now; together with virtual data integration and big data, they are driving a large amount of change and opportunity within the data integration market.
The IT industry is going through several major shifts at once: from on-premises applications to the cloud, from desktops to mobile devices, and from a focus on transactions to social and machine interactions. Not only are these trends driving Big Data, they continue to create data fragmentation and fuel growth in the data integration market.
How data integration improves operations
Companies have been using data to become more efficient and to understand their customers better. Take a trucking company like US Xpress. Believe it or not, trucking companies have been some of the big innovators, even though they have some of the lowest IT budgets as a percentage of revenue. US Xpress used data from its truck maintenance systems and GPS devices to make its supply chain more efficient, cutting costs, shortening delivery times and meeting SLAs, which makes customers happier. This data showed them where they were wrong, and where their drivers were right: the drivers know the best roads, the fastest truck stops, even the shortest bathroom lines. This saved US Xpress millions in fuel costs and kept more trucks on the road longer. Other companies have been using technologies like MDM to understand their customers better. We have seen 10 percent revenue gains from these initiatives as companies have sold more of the right products to customers at the right time and eliminated the inefficiencies of bad or duplicate data.
Future of the data industry
There are companies that are truly starting to treat data as a first-class citizen. They have competency centers that manage data integration across different projects. Beyond the 2-4x productivity gains, the lower costs, and the much faster business agility they have achieved, they have helped businesses truly innovate. We have seen companies achieve 10 percent revenue growth and 30 percent business cost reductions from better use of data and data integration. These gains are bigger than the cost of most IT organizations.
Potholes to avoid for data integration projects
Companies tend to overestimate how much they know about their existing data and how clean it is. Questions like “Where is the data I need even located? How complete is it? How consistent is it?” are often ignored. The result is that people move data from the old system to the new system, or into a data warehouse, and it is so dirty that it is not useful for making intelligent business decisions; then they wonder why they moved the data in the first place. As my high school Latin teacher used to say, “Garbage in, garbage out.” She was talking about the quality of our education, but the same is true for data integration. It is not just about integration; it is about the quality of that data as well.
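Those completeness and consistency questions can be answered with a quick profiling pass before any data is moved. A minimal sketch in plain Python, using hypothetical customer records rather than Informatica's tooling, might look like this:

```python
from collections import Counter

# Hypothetical customer records pulled from a legacy system.
# Note the missing email, the duplicate email, and the inconsistent
# country codes ("US" vs "us") -- typical of data assumed to be clean.
records = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": None,            "country": "us"},
    {"id": 3, "email": "a@example.com", "country": "US"},
]

def profile(rows, fields):
    """Report completeness (share of non-null values) and distinct-value
    counts for each field."""
    report = {}
    for f in fields:
        non_null = [r.get(f) for r in rows if r.get(f) is not None]
        report[f] = {
            "completeness": len(non_null) / len(rows),
            "distinct": len(set(non_null)),
        }
    return report

def duplicates(rows, key):
    """Values of `key` that appear more than once (likely duplicate records)."""
    counts = Counter(r.get(key) for r in rows if r.get(key) is not None)
    return [v for v, n in counts.items() if n > 1]

report = profile(records, ["email", "country"])
print(report["email"]["completeness"])   # 2 of 3 emails present
print(report["country"]["distinct"])     # 2 distinct codes for one country
print(duplicates(records, "email"))      # the repeated email surfaces here
```

Even a rough report like this answers the questions above before migration: the incomplete email field, the two spellings of one country, and the duplicate record would all be invisible if you simply loaded the data and moved on.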
First, they do not ask tough questions about cutting costs. Many companies are spending millions on hardware upgrades to support data volume growth of 50-60 percent a year when they could change their architecture and avoid those upgrades. Second, they assume the data is clean. It is not, and that becomes most obvious when you merge data together in a warehouse. It does not matter how good the integration is: if you start with bad data, you will end up with bad data. Third, when they evaluate new technologies, they handcode. Some handcoding at the initial stages is not bad, but it is much more costly to keep repeating it.
Staying ahead of the times
We have one of the highest R&D budgets in the world for data integration, and are constantly investing in new innovations. That’s exciting for developers. We are using a lot of new technologies to solve hard problems in the cloud, in mobility, with social data, and with Hadoop. The people you get to work with at Informatica have been at the forefront of all of this innovation for close to 20 years now. There are not too many R&D organizations that can say that.
EVP & Chief Product Officer, Informatica Corp
Informatica's product portfolio is focused on data integration: Application Information Lifecycle Management, B2B Data Exchange, Cloud Data Integration, Complex Event Processing, Data Masking, Data Quality, Data Replication, Data Virtualization, Enterprise Data Integration, Master Data Management and Informatica Ultra Messaging; the platform is currently at version 9.5.