Just like any emerging technology, AI is still, to some extent, an enigma. However, the consumer market gets constant glimpses of how AI can have a positive impact on people’s lifestyles. Amazon, Samsung, Microsoft, Apple – to name a few – have all introduced smart AI solutions into their consumer products. From smart voice assistants that handle our voice searches to digital assistants that control our homes, users have been impressed with their early interactions with AI. Many think that, just as AI has percolated into their personal lives, it will one day be pervasive in enterprises as well. The requirements of an enterprise AI solution, however, are completely different and far more complex. For example, wearables and wellness mobile apps can help you take control of your health, but for them to become part of the healthcare system, they require FDA approval, a well-documented workflow and policy implementations. But wearables get people curious and create a buzz about the role of AI in healthcare.
Several governments are getting serious about AI and are investing in AI R&D. Many have created AI roadmaps, including governance frameworks, to promote the adoption of AI. AI.gov was launched by the US Government to centralise AI efforts, share knowledge on AI and drive adoption across government agencies and departments. Some departments have already adopted AI. The US National Oceanic and Atmospheric Administration (NOAA) has been using AI to improve its forecasts, enabling better prediction of high-impact weather events. Smart city applications are also seeing increased adoption of AI, including in citizen engagement. Cities and government departments are investing in AI-based call centres to answer repeated or routine queries. For example, the United States Army uses an interactive virtual assistant to check qualifications and answer questions with greater accuracy. When governments back a technology area, it piques citizens’ interest.
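To make the routine-query use case concrete, here is a minimal sketch of how such a virtual assistant might field repeated questions. The FAQ entries and the naive keyword-overlap matching below are purely illustrative assumptions, not any agency’s actual system – production assistants use far richer natural language processing:

```python
# Minimal sketch of an FAQ-style virtual assistant using keyword overlap.
# Questions and answers below are hypothetical placeholders, not real policy.
FAQ = {
    "how do i reset my password": "Use the self-service portal to reset it.",
    "what documents do i need to apply": "A valid ID and proof of address.",
    "where can i check my application status": "Log in and open 'My Applications'.",
}

def tokenize(text: str) -> set:
    """Split text into a set of lowercase words."""
    return set(text.lower().split())

def answer(query: str) -> str:
    """Return the stored answer whose question shares the most words with the query."""
    q_tokens = tokenize(query)
    best, best_overlap = None, 0
    for question, reply in FAQ.items():
        overlap = len(q_tokens & tokenize(question))
        if overlap > best_overlap:
            best, best_overlap = reply, overlap
    # Fall back to a human agent when nothing matches - a common design choice
    return best or "Let me connect you with a human agent."

print(answer("How can I check the status of my application?"))
```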
Every day we read about an AI implementation that has positively impacted an organisation or its customers. Twitter’s use of AI-driven text and image analytics to detect hate speech and terrorist activity has been well-publicised. Gaming companies are actively using Mixed Reality and AI technologies to improve user experience. The recent coronavirus outbreak was first flagged by BlueDot, a Canadian company using AI technologies. Such success stories encourage other enterprises to evaluate the technology.
Beyond the Buzz
While we are adopting AI/automation as part of our consumer goods (such as phones and smart home systems) and services (such as search engines and online maps), enterprise adoption of AI does not really live up to the hype around it.
Ecosystm research shows which AI solutions are popular, their current adoption globally, and their likely adoption by the end of the year (Figure 1). While some solutions have become popular, especially in industries such as Manufacturing, Mining & Resources and Construction, the reality is that we have not yet reached mass adoption. Of the organisations planning to implement AI solutions, 44% consider the investment strategic to their organisational goals. The rest are mostly looking at ad-hoc implementations to test the waters.
What is hampering more widespread adoption of AI? For both organisations that have embarked on their AI journeys and those who plan to in 2020, the challenges are the same (Figure 2).
AI integration is a complex process. The more organisations want to integrate AI investments into their transformation journeys, the more complicated the integration becomes. It requires identifying the expected outcomes, mapping the data that will feed the algorithms, securing real-time or near real-time data sources, and ensuring consistency across data infrastructure and sources. Many organisations have legacy systems that run in silos. Integration therefore requires a clear roadmap and dedicated resources, often including a third party.
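As a simple illustration of the mapping exercise involved, an integration roadmap might begin by cataloguing each data source that will feed the AI models. All source names, owners and latency classifications in this sketch are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    """One entry in a hypothetical data-source inventory for an AI rollout."""
    name: str          # system of record
    owner: str         # accountable team
    latency: str       # "real-time", "near real-time" or "batch"
    format: str        # how the data is exposed
    legacy_silo: bool  # flags systems needing extra integration work

inventory = [
    DataSource("CRM", "Sales Ops", "near real-time", "REST API", False),
    DataSource("ERP", "Finance", "batch", "nightly CSV export", True),
    DataSource("IoT telemetry", "Plant Engineering", "real-time", "MQTT stream", False),
]

# Silos surfaced here become candidates for dedicated (often third-party)
# integration effort on the roadmap
for src in inventory:
    if src.legacy_silo:
        print(f"{src.name}: legacy silo - plan dedicated integration resources")
```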
Even in industries with access to huge organisational data repositories, data access can be a challenge for technological or compliance reasons. AI requires enormous amounts of data to train and run algorithms, and data scientists often struggle to access quality training data at the scale required to train AI systems.
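To make the quality-and-scale challenge concrete, a data scientist might run a quick audit of a candidate dataset before committing to model training. This is a minimal sketch – the file name, row threshold and missing-value tolerance are illustrative assumptions:

```python
import pandas as pd

# Hypothetical thresholds for a pre-training data audit
MIN_ROWS = 100_000          # minimum scale assumed for meaningful training
MAX_MISSING_RATIO = 0.05    # tolerate at most 5% missing values per column

def audit_training_data(path: str) -> bool:
    df = pd.read_csv(path)

    # Check the dataset is large enough to train on
    if len(df) < MIN_ROWS:
        print(f"Insufficient scale: {len(df)} rows (< {MIN_ROWS})")
        return False

    # Flag columns with too many missing values - a common quality issue
    missing = df.isna().mean()
    bad_cols = missing[missing > MAX_MISSING_RATIO]
    if not bad_cols.empty:
        print(f"Columns exceeding missing-value threshold:\n{bad_cols}")
        return False

    # Flag duplicate records, which can bias the trained model
    dupes = df.duplicated().sum()
    if dupes:
        print(f"{dupes} duplicate rows found - deduplicate before training")
        return False

    return True

if __name__ == "__main__":
    # "claims_history.csv" is a hypothetical example file
    print("Dataset ready:", audit_training_data("claims_history.csv"))
```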
Cybersecurity concerns are natural for any emerging technology area. AI systems have access to enormous amounts of organisational data. With threats ranging from ransomware and data breaches to hackers tampering with physical and industrial systems, it is dangerous if an AI system falls into the hands of cybercriminals. Instances such as criminals using AI-based software to impersonate the voice of a company CEO to commit a €220,000 fraud add to the concerns around cybersecurity and AI.
Another reason organisations find deploying AI solutions difficult is that they do not involve the right organisational stakeholders in the AI decision-making process (Figure 3). While IT is likely to know where the data resides and have a better understanding of the systems implemented, the true success of an AI deployment is measured by user acceptance. An AI solution will inevitably impact existing workflows in an organisation, and if the stakeholders are not convinced, or are unsure of the benefits, it will be difficult for AI to have an organisation-wide impact.
Moreover, internal IT may not have the right skills to implement and maintain an AI solution. It is therefore important for organisations to involve strategic partners who can guide the implementation, at least in the initial stages. While 51% of organisations that have an AI solution engage an external strategic partner, only 33% of organisations planning to adopt AI have planned for a strategic partner to guide them. A strategic partner – with the right technical expertise and business experience – can help address some of the integration challenges and provide guidance on cybersecurity best practices.
AI clearly has immense possibilities and is indeed a revolutionary technology that will bring value to almost all industries. What is required for a successful AI implementation, however, is a roadmap – including a cross-departmental Centre of Excellence (CoE), a clear timeframe, and KPIs to measure both the business and technological success of the AI models. Unless organisations plan their AI investments, the technology will not translate from hype to reality.
For more insights from our ongoing AI research, please create your account on the Ecosystm platform.
A new high-speed CPU-to-device interconnect standard, Compute Express Link (CXL) 1.0, was announced by Intel and a consortium of leading technology companies (Huawei and Cisco in network infrastructure, HPE and Dell EMC in server hardware, and Alibaba, Facebook, Google and Microsoft among cloud service providers). CXL joins a crowded field of standards already in the server interconnect market, including CAPI, NVLink, Gen-Z and CCIX. CXL is being positioned to improve the performance of the links between the CPU and accelerators such as FPGAs and GPUs, the devices most commonly involved in ML workloads.
Of course, some names were absent from the launch – Arm, AMD, Nvidia, IBM, Amazon and Baidu. Each of them is a member of one of the other standards bodies and is probably playing a waiting game.
Now let’s pause for a moment and look at the other announcement that happened at the same time. Nvidia and Mellanox announced that the two companies had reached a definitive agreement under which Nvidia will acquire Mellanox for $6.9 billion. Nvidia framed the rationale for the acquisition as follows: “The data and compute intensity of modern workloads in AI, scientific computing and data analytics is growing exponentially and has put enormous performance demands on hyperscale and enterprise datacenters. While computing demand is surging, CPU performance advances are slowing as Moore’s law has ended. This has led to the adoption of accelerated computing with Nvidia GPUs and Mellanox’s intelligent networking solutions.”
So it seems to me that, despite working on CXL for four years, Intel might have been outbid by Nvidia for Mellanox. Mellanox has been around for 20 years and was the major supplier of InfiniBand, a high-speed interconnect that is common in high-performance workloads and very well accepted by the HPC industry. (Note: Intel was also one of the founders of the InfiniBand Trade Association, IBTA, before it opted to refocus on the PCI bus.) With the growing need for fast links between accelerators and microprocessors, it would seem that Mellanox’s persistence has paid off and the market is now coming to it. One can’t help but think that as soon as Intel knew Nvidia was getting Mellanox, it pushed forward with the CXL announcement – rumors that have drawn no response from any of the parties.
Advice for Tech Suppliers:
The two announcements are great news for any vendor entering the AI and compute-intensive world that relies on graphics and floating-point arithmetic. We know that more digitally-oriented solutions are demanding analytics-based outcomes, so there will be growing demand for broader commoditized server platforms to support them. Tech suppliers should avoid backing either CXL or InfiniBand for the moment, until we see how the CXL standard evolves and how Nvidia integrates Mellanox.
Advice for Tech Users:
These two announcements reflect innovation that is generally so far removed from the end user that it can go unnoticed. However, think about how USB (Universal Serial Bus) has changed the way we connect devices to our laptops, servers and mobile devices. The same will be true for these interconnects as the ‘accelerators’ read ever more data and generate the outcomes that shape the way we drive our cars, digitize our factories, run our hospitals and search the Internet. Innovation in this space just got a shot in the arm from these two announcements.
In a recent announcement, HCL Technologies and IBM stated that they are collaborating to assist organisations on their hybrid cloud journeys. The joint services will offer seamless integration of customers’ clouds across private, public and on-premises environments. The hybrid cloud approach aims to eliminate the concerns of mixing different cloud environments while maintaining scalability and security. HCL will offer new refactoring and re-platforming services to allow organisations to migrate, integrate and manage apps and workloads on IBM’s cloud.
Speaking on the subject, Phil Hassey, Principal Advisor, Ecosystm, thinks that “the most attractive aspect of this collaboration is that it will bring together the best of both vendors. IBM has such a long history in Infrastructure Management, whereas HCL has – most particularly in the past 5 years or so – built up capability in the space.”
The good news for tech buyers is that they will derive benefits from HCL’s customer knowledge and hands-on approach, combined with IBM’s technology, cloud and – of course – Watson. HCL has acquired significant IBM assets from which it could extract more value. The infusion of AI into hybrid cloud should lead to increased automation, improved outcomes and more effective investments by clients.
When we look at the individual benefits to IBM and HCL from this deal, Hassey says, “IBM needs new customer logos and HCL can provide that. Conversely, IBM can provide larger enterprise clients to HCL. However, it is not a straight customer swap, they will work to maintain relationships independently.”
IBM is pushing hard to capture more of the market, with many big deals announced recently – and undoubtedly many more to come. It is well-documented that IBM has gone through many reinventions but remains deeply entrenched in many of the world’s largest organisations. “IBM is such a large machine that different aspects and offerings are always operating at different speeds”, says Hassey. “It needs to accelerate the uptake of Watson, and Watson being available on any cloud will help this.”
The hybrid cloud market is desperately looking for large-scale integration and transformation capabilities, and this agreement between HCL and IBM should help kickstart that.
Recently, IBM and Vodafone announced a new strategic commercial agreement, structured as a joint venture, to provide their clients with the ability to integrate multiple clouds and access emerging technologies such as 5G, AI, Edge Computing and Software Defined Networking. Under an eight-year engagement valued at $550 million (€480 million), IBM will provide managed services to Vodafone Business’ cloud and hosting unit.
Businesses find it increasingly challenging to run their operations and business processes seamlessly as data is distributed and managed across more and more clouds. Together, Vodafone Business and IBM aim to remove these complexities, supporting the basis of any digital transformation and enabling a company to share data freely and securely across its organization.
On the surface, this announcement makes sense if you are a Vodafone Business customer wanting to take the next step in a digital transformation journey. The convergence of multiple clouds allows companies to enrich their own data management systems with external sources. With the purchase of Red Hat late in 2018, IBM now has the ability and credibility to offer that capability. However, while many IoT-based solutions create the data that fuels these cloud processes, IBM has not had a clear Edge Computing or network connectivity strategy. This is where Vodafone can help IBM connect the edge of the network to enterprise systems. The announcement therefore looks like a complementary win-win for both sets of IBM and Vodafone customers.
However, this market is still shaking itself out and there are many competitive offerings to Red Hat. Startups such as RightScale and Morpheus offer multi-cloud management. Alternatively, as a mature company, VMware competes head-to-head with Red Hat and has had a long-standing partnership with Vodafone. In particular, VMware and Vodafone have partnered on telco-specific functions such as NFV and 5G.
To understand the importance of VMware in the midst of this announcement is to appreciate the end-to-end customer experience that VMware can bring to telco customers such as Vodafone. As 5G rolls out and NFV-based network slicing becomes a valuable onboarding differentiator, VMware could offer its vCloud NFV solution to Vodafone’s customers. Those customers could then access the same multi-cloud services from VMware rather than IBM, while obtaining the AI, cognitive and ML services available from the major public cloud providers (such as AWS, Google and Microsoft). VMware’s position at the edge of the network would therefore appear to leapfrog IBM’s. Vodafone Business’ customers could bypass IBM and its cloud services strategy, leaving IBM, at the end of the day, with only the managed services contract while missing out on analytics and cognitive business services.
To negate this scenario, IBM will have to lead more and more with Red Hat and be willing to downplay its cognitive and machine learning services. Business solutions in vertical markets such as agriculture are extremely price sensitive, and customers will look closely at the cost of connectivity followed by the cost of data acquisition to enrich their business outcomes. We believe that if data science and cognitive services are too expensive to run, Vodafone customers will seek the same tools and services from other cloud service providers rather than IBM.
Our advice to tech buyers in the midst of business transformation is to consider how they fuel their decision-making engines for analytics, machine learning and cognitive computing. Real-time processing and dissemination of business outcomes is table stakes for a successful digital company, so seamless end-to-end processing across a complex and distributed enterprise infrastructure is a challenge that needs to be overcome. Tech buyers should ask whether IBM’s Edge Computing strategy and Vodafone’s connectivity are mature enough to funnel IoT-generated smart data into a broad inter-cloud infrastructure.