Micro Focus’ John Delk on AI’s growing role and enterprise technologies to watch out for
John Delk is chief marketing officer and general manager (security and information management and governance product groups) at Micro Focus International Plc, a global enterprise software company. Delk has 34 years of experience in the IT industry, has worked for various companies, and studied artificial intelligence during his PhD. In an interview with TechCircle, he talks about the growing role of AI, the top enterprise technologies that will make a difference in the future and how big data will be processed. Edited excerpts:
How would you compare India’s growth to other countries where you have a base?
In the Asia Pacific region, our focus is to go deep into smart city initiatives. In North America, we have our own political and economic challenges, and Brexit might affect the European market. There is a very strong undercurrent of taking a conservative approach to future planning in Europe because things could go south. But I do not think there will be any stepping back on growth in the APAC region, which includes the likes of Singapore, Malaysia and India.
The focus on digital transformation plays well with the complex environments needed in a smart city infrastructure. The question is how we move at the pace of software development and make sure that the infrastructure matches users' expectations. The growth also needs to happen at a pace where it can be monitored and managed, which is very different from the traditional waterfall model.
We need to monitor hybrid IT environments and datacenters, and make sure that all of the sensors communicate properly to deliver results and insights. From that perspective, APAC has a strong monitoring presence.
Compared with cities that are developed and aren't cash-sensitive, what challenges do you face in India?
In established cities you don't get to start from scratch; the infrastructure isn't built from the ground up, so upgrading it requires invasive construction. The thinking is to approach projects in a logical and stepwise manner. Singapore, Malaysia or Dubai, however, might have the capacity to spread their wings a little quicker and solve multiple problems at the same time.
How important is AI in the future of enterprises?
Having taken PhD courses in AI about 34 years ago, I chuckle when somebody says that the technology is new; the underlying algorithms and the math have been the same for a long time. The biggest change, however, is the computing power at our disposal on which the math can be run. Data is now at a scale where those algorithms can be deployed in predictive analytics to derive good insights.
The earlier pain point was how to combine the efforts put into enterprise data lakes and tap into the previous BI wave. With the aggregation of computing and data, we now have the ability to tap into huge pools of data. At the pace at which technology is changing, software can make certain decisions faster and deploy automated results quickly. This can help avoid frequent retraining of employees and keep manpower requirements low.
How does machine learning help make your client’s businesses better?
Machine learning is a great help for testing tools in reducing the number of false positives. If a developer gets a list of 72 possible security violations and 71 of them are false positives, they lose trust in the tool. Machine learning learns the characteristics of the environment and helps reduce those false positives.
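To make the idea concrete, here is a minimal, hypothetical sketch (not Micro Focus's tooling) of how a classifier trained on previously triaged findings could down-rank likely false positives before a developer ever sees them; the feature names and training data are purely illustrative.

```python
# A hypothetical sketch of ML-based triage of static-analysis findings.
# Labels and features are made up; 1 = confirmed issue, 0 = false positive.
from sklearn.ensemble import RandomForestClassifier

# Each finding is described by simple numeric features, e.g.
# [rule_id, lines_of_code_changed, taint_path_length].
X_train = [
    [101, 12, 3], [101, 2, 1], [205, 40, 7], [205, 1, 1],
    [310, 25, 5], [310, 3, 2], [101, 30, 6], [205, 4, 1],
]
y_train = [1, 0, 1, 0, 1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# Findings from a new scan: surface only those likely to be real, so the
# developer sees a short, trustworthy list instead of 72 raw hits.
new_findings = [[101, 28, 5], [205, 2, 1], [310, 1, 1]]
likely_real = [
    finding
    for finding, p in zip(new_findings, model.predict_proba(new_findings)[:, 1])
    if p >= 0.5
]
print(likely_real)
```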
What are the top three enterprise technologies to watch out for?
Serverless computing is of high interest, and we are working on how it can be deployed in hybrid IT environments. We are keeping a watch on the next big things in public and private clouds.
In today's world, everything is connected and works simultaneously to take care of the infrastructure, which is a better approach than the traditional way of installing and deploying solutions.
5G and connectivity change the way we deploy software. We aren't focused on building big monolithic programs but on creating discrete components. The world is much more microservices-oriented and AI-driven than before, and computing power will increase in a 5G world.
What is the future of how big data is processed?
Currently, all the data collected is brought to a central machine learning platform, where it is computed upon. There are concerns about piping data in and out of these systems.
The future, however, is the opposite of this approach. Inline machine learning will be deployed directly at the source of the data and will be capable of providing real-time decision support. For example, a nuclear reactor has thousands of sensors, and it isn't feasible to wait for the data to be collected centrally before figuring out a problem.
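As an illustration of that inline idea (a hypothetical sketch, not a Micro Focus product), a lightweight check can run right next to the sensor stream and raise an alert locally, rather than waiting for readings to reach a central platform:

```python
# A stdlib-only sketch of analytics at the data source: a rolling z-score
# check runs beside the sensor stream and flags anomalies immediately.
# The sensor readings below are simulated.
from collections import deque
from statistics import mean, stdev

window = deque(maxlen=50)  # recent readings kept in memory at the edge

def ingest(reading: float, threshold: float = 3.0) -> bool:
    """Return True if the reading should trigger a local alert."""
    is_anomaly = False
    if len(window) >= 10:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(reading - mu) / sigma > threshold:
            is_anomaly = True
    window.append(reading)
    return is_anomaly

# Simulated stream: steady temperatures, then a spike the edge node catches.
for value in [70.1, 70.3, 69.8] * 10 + [95.0]:
    if ingest(value):
        print(f"local alert: anomalous reading {value}")
```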
Taking a database, breaking the compute from the storage and turning it into a service-based architecture: that is the future of big data.
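A rough sketch of that separation, with DuckDB used purely as an example of an embeddable query engine and local Parquet files standing in for object storage: the data lives in a shared store, while a stateless compute service attaches to it on demand.

```python
# A hypothetical sketch of compute/storage separation; paths and data are
# illustrative, and the Parquet files could equally sit on an object store.
import duckdb

# "Storage": write a tiny dataset out as Parquet files.
setup = duckdb.connect()
setup.execute("""
    COPY (SELECT * FROM (VALUES
        ('sensor-1', 70.2), ('sensor-1', 70.4), ('sensor-2', 65.1)
    ) AS t(sensor_id, temperature))
    TO 'readings.parquet' (FORMAT PARQUET)
""")
setup.close()

# "Compute": a separate, stateless service attaches and queries the files.
compute = duckdb.connect()  # in-memory engine, holds no data of its own
rows = compute.execute(
    "SELECT sensor_id, avg(temperature) AS avg_temp "
    "FROM read_parquet('readings.parquet') GROUP BY sensor_id"
).fetchall()
print(rows)
```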
How important is COBOL today? How did you bring it back from the dead?
COBOL is still mission critical to many large BFSI companies. Probably the biggest innovation in COBOL was to allow customers to choose more than just the mainframe deployment model. Micro Focus allows customers to redeploy those applications in just about any form factor.
There are many customers running 20- to 30-year-old banking or insurance platforms on COBOL who have chosen to migrate to a new deployment model. This has kept COBOL relevant today.
COBOL is still percolating and strong with customers, and we continue to invest in the technology by facilitating new redeployment models. COBOL hasn't changed, but customers are more willing to talk about it now.