In the course of my facilitation, I realised that the key focus of the discussions was on how organisations today handle their biggest asset – data – and manage all the moving parts in large and diverse settings and in very traditional enterprises that are transitioning into data-driven “New Age” businesses.
The Impossible Tech Triangle
Data has never been as prolific or strategic as it is today. Multiple sources of data generation and technologically savvy customers have driven a data explosion. Simultaneously, the demand for personalised services and data insights has pushed most industries toward using data for increased intelligence. The biggest risk organisations face – the one CSOs and business leaders spend sleepless nights on – is significant disruption due to compromise. Moreover, the complexity of the technology platforms means that no organisation is 100% certain that it has it right and is managing the risk effectively.
Does this give rise to a situation similar to that in Marketing and Advertising, where the triangle of price, quality and speed appears unattainable for many organisations?
Key takeaways from the sessions
#1 Drivers & Challenges of Cloud adoption
The fact that discussions around agility and innovation can happen with the intensity they do today is because organisations have embraced Cloud infrastructure, application development platforms and SaaS solutions. Every organisation’s Cloud journey is unique, driven by its discrete set of requirements. Organisations choosing Cloud may not have the resources to build in-house systems – or may migrate to the Cloud for various reasons such as cost, productivity, cross-border collaboration or compliance.
When embarking on a Cloud journey it is important to have a clear roadmap that involves instilling a Cloud-First culture and training the IT organisation in the right skills for the environment. Concerns around costs, security and data ownership are still synonymous with Cloud; organisations should therefore evaluate each workload from a cost angle before jumping on Cloud. It is important for organisations to appreciate when a Cloud option will not work out financially and to have the right cost controls in place, because organisations that do a straight resource swap to Cloud are likely to end up paying more.
Data ownership and data residency can also be challenging, especially from a compliance standpoint. For some, the biggest challenge is to know the status of their data residency. The challenges are not just around legacy systems but also in terms of defining a data strategy that can deliver the desired outcomes and managing risk effectively without ruining the opportunities and rewards that data utilisation can bring. Cloud transformation projects bring in data from multiple and disparate sources. A clear data strategy should manage the data through its entire lifecycle and consider aspects such as how the data is captured, stored, shared, and governed.
#2 Perception on Public Cloud Security
While security remains a key concern when it comes to Cloud adoption, Cloud is often regarded as a more secure option than on-premise. Cloud providers have a dedicated security focus, constantly upgrade their security capabilities in response to new threats, and evolve their partner ecosystems. There is also better traceability with Cloud, as every virtual activity can be tracked, monitored and logged.
However, the Cloud is only as secure as an organisation makes it. The Cloud infrastructure may be secure, but the responsibility for securing applications lies with the organisation. The perception that there is no need to supplement public Cloud security features can have disastrous outcomes. It is important to supplement the Cloud provider’s security with event-driven security measures within an organisation’s applications and cloud infrastructure. As developers increasingly leverage APIs, this focus on security – alongside functionality and agility – should be emphasised. Organisations should be aware that security is a shared responsibility between the Cloud provider and the organisation.
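To make the idea of event-driven security concrete, here is a minimal sketch of a check that could run whenever an audit-log event fires. The event shape loosely mimics a CloudTrail-style `PutBucketAcl` record; the field names and the function itself are illustrative assumptions, not an exact AWS schema.

```python
# Illustrative event-driven security check: flag storage-bucket ACL changes
# that grant access to public groups. The event structure below is an
# assumption modelled on CloudTrail-style records, not a real AWS schema.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def check_bucket_acl_event(event):
    """Return a list of alert strings for grants that open a bucket publicly."""
    alerts = []
    if event.get("eventName") != "PutBucketAcl":
        return alerts  # only ACL-change events are of interest here
    bucket = event.get("bucketName", "<unknown>")
    for grant in event.get("grants", []):
        if grant.get("grantee") in PUBLIC_GRANTEES:
            alerts.append(
                f"Bucket '{bucket}' granted '{grant.get('permission')}' to a public group"
            )
    return alerts
```

In practice a check like this would run as a serverless function subscribed to the provider’s audit-log stream, with alerts feeding an automated remediation step rather than a manual review queue.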
#3 Viewing Security as a Business Risk – not IT Risk
The Executive Management and the Board may be involved in the Security strategy and GRC policies of an organisation. But a consistent challenge Security teams face is convincing the Board and Senior Management of the need for ongoing focus and investment in cybersecurity measures. Often, these investments are isolated from the organisation’s KPIs and are harder to quantify. But Security breaches do have a financial and reputational impact on organisations. Mature organisations are beginning to view Security as a business risk and not a matter of IT risk alone. One of the reasons why Senior Management and Boards do not understand the full impact of data breaches is that CISOs do not translate the implications into business terms. It is their responsibility to secure senior management buy-in, so that Security becomes part of the Strategy and the associated costs get written into the cost of doing business.
Training sessions that educate stakeholders on the basics of the risks associated with using knowledge systems can help. Simulation of actual cybersecurity events and scenario testing can bring home the operational issues around recovery, assessment and containment, and it is important to involve senior stakeholders in these exercises. Eventually, the role of the CSO will evolve into a business role that spans Security across the entire organisation – physical as well as cyber. This is when organisations will truly recognise investment in Security as a business requirement.
#4 Moving away from Compliance-driven Security Practices
Several organisations look at Security as part of their compliance exercise, and compliance is built into their organisational risk management programmes. Often, security practices are portrayed as a product differentiator and used as a marketing tool. An organisation’s Security strategy should be more robust than that and should not only be focused on ticking the right compliance boxes.
A focus on compliance often means that Security teams continually create policies and call out non-compliance rather than proactively contribute to a secure environment. Applications teams do not always have the right skills to manage Security. The focus of the Security team should not be on telling Applications teams what they are doing wrong and writing copious policies, procedures and standards, expecting others to execute on the recommendations. Instead, there should be a focus on automated policy-driven remediation that does not restrict the Applications team per se but acts on unsafe practices when they are detected. The Security team’s role is to work on the implementation and set up Security practices that help the Applications team do what they do best.
#5 Formulating the Right Incident Response Policy
In the Ecosystm Cybersecurity study, 73% of global organisations think that a data breach is inevitable – organisations largely believe that “it is not about if, but when”. About 50% of global organisations have a cyber insurance policy or are evaluating one, and this trend will only rise. Policy-driven incident response measures are an absolute requirement in all enterprises today. However, to a large extent even these incident response policies are compliance driven. While 65% of organisations appear to be satisfied with their current breach handling processes, it is important to keep evolving the process in the face of new threats.
Organisations should also be aware of the need for people management during an incident. Policies might be clear and adhered to, but it is substantially harder to train the stakeholders involved on how they will handle the breach emotionally. It extends to how an organisation manages their welfare both during an incident and long after the incident response has been closed off.
Over the two sessions, we explored how to achieve the ‘unattainable triangle’ of Cloud, Agility and Security. What I found interesting – yet unsurprising – is that discussions were heavily focussed on the role of Security. On the one hand, there is the challenge of the current threat landscape. On the other hand, Security teams are required to deliver a Cloud and an agile development strategy in tandem. This disconnect ultimately highlights the need for Security and data management to be embedded and managed from the very start, and not as an afterthought.
I was recently invited to an AWS Connect, Zendesk and Voice Foundry event in Sydney, and it was great to hear from Carsales about how they re-invented CX. Prior to making the leap to deploying the solution from AWS Connect and Zendesk, they had been running their contact centre for years on a traditional contact centre platform. The issues they had faced over the years included the following:
Difficult and costly to customise
Expensive support costs
Expensive and difficult integrations
Difficult to extract reporting
Downtime for upgrades
Difficult to use
These issues are common challenges posed by traditional contact centre platforms. High maintenance costs and expensive integrations are some of the challenges I hear about when speaking to end-users. The contact centre and CX industry is at an inflection point where organisations are evaluating how best to drive great CX while considering which vendors can help drive innovation in CX. Carsales eventually shortlisted four players before deciding which cloud provider to work with, and ended up working with Zendesk and AWS Connect.
Carsales recognised the need for a CX solution that could use the data they already hold on their customers in Zendesk and Salesforce CRM systems to create a unique experience for each interaction. It was important for them to have a solution that would simplify data warehousing and analytics to make it easier to get a full view of the customer. Integrating the CRM applications with AWS Connect as the CX orchestration engine brought the contact centre and CRM applications together and helped Carsales deliver a personalised CX for their customers.
Why AWS Connect?
Carsales cited the following reasons for selecting AWS Connect:
Cloud-Based (accessible anywhere)
No downtime for upgrades
Access to Data (Lambda and APIs) via ZenDesk and Salesforce
Support from implementation partner Voice Foundry
Access to great technology such as Text to Speech (Polly), Speech Recognition (Lex) and Analytics (Transcribe and Comprehend)
Scalable and customisable call flows
In the global Ecosystm Cloud study, as depicted by the chart below, about 53% of organisations state that increased work processes and efficiency are a key benefit of the cloud. Nearly half the organisations rate flexibility and scalability and improved service levels and agility as the main benefits of a cloud deployment.
What Carsales found about the AWS Connect solution, is how changes can be made easily. Most configurations can be made by the contact centre staff and there is no need to go to IT. Their primary aim was to deliver a personalised CX by accessing data from other internal systems (CRM, proprietary databases, etc) and the solution addressed this need.
The advice that Carsales gives to others implementing a Cloud Contact Centre is:
Ensure that you have invested in the network to support voice over IP.
Make sure that your headsets are compatible to ensure full functionality.
Engage with a partner rather than implementing the platform on your own. Although you can implement the AWS Connect solution yourself, it can be difficult. Voice Foundry was a great implementation partner.
The Importance of Data-Driven CX
The market is witnessing a shift where organisations are looking for new and more agile platforms for CX. The challenges highlighted by Carsales – such as existing solutions being difficult and costly to customise – are among the common limitations of traditional telephony and contact centre solutions that we hear about from organisations. Whilst the traditional vendors still have a majority share of the market, that is changing. Some of the new cloud contact centre vendors are offering new and dynamic ways of driving a better experience for the users of the technology – from agents to those that manage the contact centre solution. Adding agents when needed has become easier (without intervention from IT), and cloud provides better security through the multiple back-ups and redundancies it offers. The ability to reduce maintenance and customise applications with new agile methodologies and APIs is driving a new era in the contact centre market. The single most important area is deep analytics: the ability to understand the customer better before a call, during a call and after the call is critical. Artificial intelligence can be used to better understand customer sentiment and detect trends in customer data.
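As a concrete illustration of sentiment-driven CX, the sketch below routes an interaction based on detected customer sentiment. The input dictionary mirrors the shape of an AWS Comprehend `detect_sentiment` response (`Sentiment` plus a `SentimentScore` map); the routing thresholds and queue names are my own illustrative assumptions, not part of any vendor’s API.

```python
# Illustrative sketch: route a contact-centre interaction on customer
# sentiment. The input mirrors an AWS Comprehend detect_sentiment
# response; thresholds and queue names below are assumptions.

def route_interaction(sentiment_response, priority_queue="senior_agents",
                      default_queue="general"):
    """Pick a queue based on detected sentiment and its confidence score."""
    sentiment = sentiment_response.get("Sentiment", "NEUTRAL")
    scores = sentiment_response.get("SentimentScore", {})
    # Escalate clearly negative interactions to experienced agents.
    if sentiment == "NEGATIVE" and scores.get("Negative", 0.0) >= 0.7:
        return priority_queue
    return default_queue
```

In a real deployment the sentiment call would run on a transcript produced by a speech-to-text service, and the chosen queue would be handed back to the contact-centre platform’s routing engine.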
The shift from traditional contact centres to cloud contact centres is happening and no longer just with mid-market companies. Large organisations are making the shift to the cloud as the benefits are apparent. Implementing a data-driven culture is key to driving a personalised CX. The tight integration between CRM databases and the applications in the contact centre is becoming more important than ever.
Today, many businesses use Tableau (over 86,000), including a lot of Salesforce customers. They have chosen Tableau because it is easy to deploy and use, and like Salesforce’s own applications, it targets the ultimate decision maker – the business user – and sometimes even the consumer. Recent research into the BI systems integrators in Asia Pacific shows that Tableau is one of the leading analytics platforms for the partner community in the region – the big SIs have many people focused on Tableau. But that dominance is being challenged by a re-energised Microsoft, whose Power BI is also witnessing strong growth – and which is typically the price leader in the market.
For Salesforce customers, there is some overlap between products – their own Einstein Analytics tools do much of what Tableau can do – although Tableau helps customers see insights from data stored both on the cloud and inside their own data centres. It also moves Salesforce closer to the Customer 360 vision – the ability to get a view of customers across the Commerce, Marketing and Service Clouds. Salesforce customers not using Tableau today will get a better user experience by using Tableau as the visualisation platform.
History has shown that it is hard to make such acquisitions successful. Tableau was a huge success because it was independent; the same was true of Business Objects and Cognos before their acquisitions. History has also shown that when the large BI and analytics vendors are acquired, others move into that space. While Salesforce has announced it will run Tableau as a separate business, Tableau will no longer be independent. Partners will need to be maintained and provided a growth path – and partners are the cornerstone of Tableau’s success. Some of these partners might have strong ties to other software or cloud platforms such as SAP, Oracle, AWS or Google. Customers of Tableau might feel sales pressure to move to a Salesforce environment – and will likely see Salesforce integration happen at a deeper level than on other platforms.
Tableau’s independence will disappear. However, keeping Tableau as a separate business may not be the long-term goal for Salesforce. The goal might be to offer the best application and analytics solution in the market – to make the entire suite more attractive to more potential buyers and users. Or it may be to take Salesforce beyond the current users within its customer base to many other users who may not need the full application but need the analytics and visualisations that the data can provide. If this is the case, then the company is onto a winner with the Tableau acquisition.
The long term goal is not analytics reports delivered to employees. It is not visualisation. It is automation. It is applications doing smart, AI-driven analysis, and deciding for employees. It is about taking the human out of the process. In a factory you don’t need a report to tell you a machine is down – you need to book a repair person automatically – or a service technician to visit before the machine has even broken down. And you don’t need a visualised report to show that a machine is beyond its life expectancy. You need the machine replaced before it fails catastrophically.
Too often, we are putting humans in processes where they are not required. We are making visualisations more attractive and easier to consume when, in reality, we just needed the task automated. While we employ humans, there will be a need to make decisions more effectively, and we will still require tools like Tableau. But don’t let the pretty pictures distract you from the main prize – intelligent automation.
If you would like to speak to Tim Sheedy or another analyst at Ecosystm about what the acquisition of Tableau by Salesforce might mean to your business or industry, please feel free to schedule an inquiry call on the profile page.
For example, Florida Virtual School is a full-time online school providing virtual K-12 education to students all over the world. It is a recognised eLearning school and provides custom solutions to meet students’ requirements. This model is being replicated globally, especially in remote areas where a physical school premises may not be feasible or is too expensive.
Research & Experimentation
The remote handling of projects and experiments is enabling education institutions to overcome the challenge of carrying them out in a controlled and safe environment. ChemCollective, a project of the National Science Digital Library in the US, offers a flexible learning environment in which students can access online chemistry labs to apply formulas, perform experiments and learn in realistic and engaging ways, like working scientists.
Open Education Resources
Cloud is enabling the development of open source content for schools and colleges. The challenge with existing books and lectures is that they become dated. Cloud enables a wealth of content through open repositories and legal protocols that allow a community to collaborate on and update the information. Open educational resources (OERs) are developed and can be modified by their creators and administrators. The community can contribute maps, slides, worksheets, podcasts, syllabi or even textbooks. Copyright is managed via legal tools such as Creative Commons licenses, so others can freely access, reuse, translate, and modify them.
As textbooks and course material can now be updated in real-time and offered through a cloud-based subscription model, new streams of revenue are opening up for publishers. However, this raises the concern that textbook prices are increasing while students lose the option to buy second-hand books or to sell books once they are done with them.
Massive Open Online Course (MOOC) platforms provide both content to students in areas of personal interest and additional sources of revenue to renowned global institutions. A quick look at Coursera’s website shows online courses from reputed institutions such as MIT and Johns Hopkins University. There are still providers such as the Khan Academy that do not actively monetise the material they provide, but increasingly institutions look at MOOCs to generate more revenue – by offering remote learning options to individuals, as well as by collaborating with local universities to make their courses available to overseas students, a previously untapped market.
Cloud computing is transforming the classroom and learning experiences in the ways educators, curriculum leads, and specialists recommend. The technology has a huge role to play in enabling transformation in Education – for national education systems, for educational institutions, and ultimately for the students.
How else do you think Cloud can transform the education industry? Let us know in your comments below.
Following these announcements, this month Microsoft unveiled its fully managed Azure Blockchain Service, a package designed to simplify the processes and eliminate the pain points of blockchain networks. Azure Blockchain Service will provide the required infrastructure and connected services to develop, run and take advantage of blockchain applications on Microsoft’s cloud-based platform.
To leverage blockchain, Microsoft and J.P. Morgan announced a partnership to accelerate the adoption of enterprise blockchain. Quorum, an Ethereum-based distributed ledger protocol developed by J.P. Morgan, will be the first ledger available through Azure Blockchain Service on the cloud.
Joining the bandwagon, Starbucks will use Azure and the Ethereum blockchain to track coffee from farm to cup. Similarly, with a forward-thinking approach, Microsoft and GE Aviation have collaborated to bring blockchain into aviation. GE Aviation has built a supply chain track-and-trace blockchain with the help of Microsoft Azure to monitor and collate data on aircraft engine parts – their life cycles and when they need repair. The solution is termed ‘TRUEngine’.
Unfolding blockchain for “regular” businesses and SMEs
Blockchain technology, by its very nature, lends itself to the digital transformation journey of an enterprise. Blockchain can address some of the pitfalls of digital transformation such as identity, security, and trust. From digital identity to tokenisation to using smart contracts to automate businesses, blockchain technology is swiftly establishing itself as a key enabler of the emerging digitised enterprise.
Speaking on the subject, Ecosystm’s Principal Advisor, Amit Sharma, thinks that “For Small and Mid-Size Enterprises (SMEs), blockchain can simplify and automate processes related to Trade Finance – which would mean less paperwork and automation in supply chains – and it also opens up a huge alternative finance channel to deal with their cash flow challenges.”
Overall the blockchain network should facilitate the interworking between IT systems, financial systems and ledgers that are today primarily managed in silos and require heavy manual processes.
Are we already there?
“All disruptive technology has a ‘tipping point’ – the exact moment when it moves from early adopters to widespread acceptance. We are now approaching the tipping point for blockchain. Even though the development of blockchain for business is still in its early stages, business leaders have swiftly moved from understanding blockchain and its potential uses to running pilots,” says Sharma.
Blockchain has attracted attention across industries such as financial services, transportation and shipping, healthcare, energy and utilities, and supply chain management.
These industries share some common themes. Blockchain is a natural fit for use cases that are transactional but involve a high degree of process complexity or volume. Blockchain will become the default technology wherever there is a need to ensure the integrity of data.
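The integrity guarantee mentioned above comes from the chain structure itself: each record embeds the hash of its predecessor, so tampering with any earlier record invalidates everything after it. The sketch below is a deliberately simplified, single-machine illustration of that property; real ledgers such as Quorum add consensus, distribution and smart contracts on top.

```python
import hashlib
import json

# Simplified sketch of why a blockchain guarantees data integrity: each
# block embeds the hash of its predecessor, so altering any earlier
# block breaks every link that follows. This is an illustration of the
# principle only, not a distributed ledger.

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append a new block that records the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})
    return chain

def verify_chain(chain):
    """Return True only if every block's prev_hash matches its predecessor."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True
```

Running `verify_chain` after silently editing an early block shows the tamper-evidence: the recomputed hash of the edited block no longer matches the `prev_hash` stored in its successor.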
Blockchain Adoption by Organisations
Despite the flurry of activity and promising initial developments, blockchain faces a number of obstacles that will need to be overcome before companies choose to adopt it on a broader scale. Its decentralised network runs counter to the current business emphasis on centralising data or functions to support security efforts. Users and operators alike must shift their mindset to embrace and trust the system. “Among blockchain’s selling points is its security: high encryption and protocols. Since the general public largely doesn’t understand how the technology works, many still have concerns about data privacy and cyber security,” says Sharma. “As with all new technology, when it comes to blockchain, business leaders should view any initial use cases as part of their enterprise risk management. Executives are attuned to the business and risk implications of blockchain. And in many cases, blockchain, like other technology platforms and systems, can be covered under existing insurance programs.”
Implementation by the large technology providers
“With the large technology providers such as Microsoft and AWS now offering BaaS (Blockchain-as-a-Service) over multiple frameworks supported by a ‘Pay as you use’ model, this technology is much more accessible. Pre-built integrations to the network and infrastructure services that are being offered by some of these players will significantly reduce the development time and cost for enterprise customers” says Sharma.
The next several years could see blockchain move from testbed to becoming an essential business tool, so staying abreast of the latest developments and how it is being used will be critical.
However, let’s assume that the 5G hype is in the rear-view mirror and look ahead at what could be coming in the mobile and telecom industry. At the end of the first day of this year’s MWC, I may have seen the future opportunity – and it is awesome! Pat Gelsinger asked the question: “why can’t we build the telco networks the way the clouds have been built – with scalability, flexibility, efficiency, and agility?” It’s a very fair question. After all, we do have Network Function Virtualization (NFV), and we will have new 5G services, so why not a new telco cloud?
I spent time with two companies that may show us a glimpse of the future network and cloud infrastructure. The first is the Israeli software startup DriveNets, with its solution “Network Cloud”. DriveNets is focused on helping service providers disaggregate proprietary routers from their networks as they move to 5G. DriveNets’ Network Cloud solution aims to disrupt the current network business model by decoupling network functions from proprietary hardware, delivering them instead through its software stack and two ‘white label’ hardware building blocks. The entire network infrastructure is software-centric, allowing for agility, scalability, and costs that normalise with business growth.
However, DriveNets is an unusual startup in that it came out of stealth mode with $110 million in its first round of funding. The company was founded in 2015 by Ido Susan, who should be familiar to Cisco watchers as he sold his first startup, Intucell (self-optimising network technology), to them for $475 million in 2013. DriveNets’ other co-founder, Hillel Kobrinsky, founded Interwise (web conferencing), which was snapped up by AT&T for $121 million. To that end, the company is well funded and has the ability to sustain itself long enough to potentially disrupt the $50 billion network hardware business.
The second is Rakuten Mobile, a well-known name in Japan, but the first mobile virtual network operator (MVNO) to launch there in over 10 years. MVNOs are not new, so what makes Rakuten different? The company’s CTO, Tareq Amin, explained to me that they are building the world’s first end-to-end fully virtualized cloud-native network, running all of its workloads in the cloud. A fully virtualized network enables Network Function Virtualization to take advantage of the basic assets of cloud computing, where a service delivery platform can be implemented, customised and scaled at speed. Finally, all of Rakuten’s core technology, including its Radio Access Network (RAN – a topic that has been highly discussed at this year’s MWC), runs on 5G, thus delivering immediate and actual 5G services. This compares to most of the rest of the industry, which will have to navigate an uncomfortable transformation roadmap from 2/3G and LTE to 5G. While Amin’s strategy is compelling, there are a few technical hurdles to overcome. For example, enterprise-grade 5G indoor coverage isn’t fully there yet, so Amin will have to rely on the operators he is competing against, who have that real estate.
So why highlight DriveNets and Rakuten in the same blog? In Rakuten’s case, the CAPEX and OPEX business models for operators may be turned on their heads by the fact that its network takes all of the competitive advantages of 5G while offering customers disruptive pricing models and services. In a country such as Japan, where traditional operators have struggled to modernize their networks, this could be a competitive threat. Equally, there could be a global rise in copy-cat pure 5G/cloud-based MVNOs springing up to fiercely compete against the incumbent local operators, as well as give other MVNOs a tough time. As for DriveNets – it’s simple: it’s software and virtualization of the switch and router market, which is very appealing to the service providers. It will commoditize the hardware, lowering their costs while allowing them to continue to focus on new 5G services.
In the recent announcement, HCL Technologies and IBM stated that they were collaborating to assist organisations on their hybrid cloud journey. The joint services will offer seamless integration of their customers’ cloud across any private, public and on-premise environment. The hybrid cloud approach is aimed at eliminating the concerns of mixing the different cloud environments yet maintaining scalability and security. HCL will feature new refactoring and re-platforming services to allow organisations to migrate, integrate and manage apps and workloads on IBM’s cloud.
Speaking on the subject, Phil Hassey, Principal Advisor, Ecosystm, thinks that “the most attractive aspect of this collaboration is that it will bring together the best of both vendors. IBM has such a long history in Infrastructure Management, whereas HCL has – most particularly in the past 5 years or so – built up capability in the space.”
The good news for tech buyers is that they will derive benefits from HCL’s customer knowledge and hands-on approach with the combination of IBM technology, cloud and – of course – Watson. HCL has acquired significant IBM assets that they could extract more value from. The infusion of AI into hybrid cloud will lead to increased automation, improved outcomes and effective investments by clients.
When we look at the individual benefits to IBM and HCL from this deal, Hassey says, “IBM needs new customer logos and HCL can provide that. Conversely, IBM can provide larger enterprise clients to HCL. However, it is not a straight customer swap, they will work to maintain relationships independently.”
IBM is pushing hard towards their plans to capture more market with many big deals announced recently – and undoubtedly many more yet to happen. It is well-documented that IBM has gone through many reinventions but is still deeply entrenched in many of the world’s largest organisations. “IBM is such a large machine that different aspects and offerings are always operating at different speeds”, says Phil. “It needs to accelerate the uptake of Watson, and Watson being available on any cloud will help this.”
The hybrid cloud market is desperately looking for big-scale integration and transformation capabilities, so this agreement between HCL and IBM will hopefully help kickstart that.