
When non-organic (man-made) fabric was introduced into fashion, there were a number of harsh warnings about using polyester and other synthetic fibres, including warnings about their flammability.
In creating non-organic data sets, should we also be issuing warnings about their use and flammability? Let’s look at why synthetic data is used in industries such as Financial Services and Automotive, as well as for new product development in Manufacturing.
Synthetic Data Defined
Synthetic data can be defined as data that is artificially generated rather than produced by actual interactions. It is often created with the help of algorithms and is used for a wide range of activities, including as test data for new products and tools, for model validation, and in AI model training. Synthetic data generation is a form of data augmentation – the creation of new, representative data from existing data.
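To make the idea concrete, here is a minimal sketch (not any vendor’s actual tooling) of fitting a simple generative model to a small “real” dataset and sampling synthetic records from it. The column meanings, sample sizes and the choice of a Gaussian mixture are illustrative assumptions; production synthetic-data platforms use far richer generators.

```python
# Minimal sketch: fit a simple generative model to "real" tabular data
# and sample new synthetic records from it. The columns are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)

# Hypothetical "real" data: customer age and monthly spend
real_data = np.column_stack([
    rng.normal(40, 12, size=1000),     # age
    rng.lognormal(5, 0.6, size=1000),  # monthly spend
])

# Fit a generative model to the real data
model = GaussianMixture(n_components=5, random_state=0).fit(real_data)

# Once fitted, sampling additional synthetic records is cheap and repeatable
synthetic_data, _ = model.sample(n_samples=5000)

print("real means:     ", real_data.mean(axis=0).round(1))
print("synthetic means:", synthetic_data.mean(axis=0).round(1))
```

Once such a model is fitted, drawing additional records is essentially free, which is a large part of what makes synthetic data attractive for testing and training.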
Why is it used?
The main reasons why synthetic data is used instead of real data are cost, privacy, and testing. Let’s look at more specifics on this:
- Data privacy. When privacy requirements limit data availability or how it can be used. For example, in Financial Services where restrictions around data usage and customer privacy are particularly limiting, companies are starting to use synthetic data to help them identify and eliminate bias in how they treat customers – without contravening data privacy regulations.
- Data availability. When the data needed for testing a product does not exist or is not available to the testers. This is often the case for new releases.
- Data for training. When training data is needed for machine learning algorithms but, as in the case of autonomous vehicles, the data is expensive to generate in real life.
- Training across third parties using cloud. When moving private data to cloud infrastructures involves security and compliance risks. Moving synthetic versions of sensitive data to the cloud can enable organisations to share data sets with third parties for training across cloud infrastructures.
- Data cost. Producing synthetic data through a generative model is significantly more cost-effective and efficient than collecting real-world data. With synthetic data, it becomes cheaper and faster to produce new data once the generative model is set up.

Why should it cause concern?
If the real dataset contains biases, data augmented from it will contain those biases too. Identifying an optimal data augmentation strategy is therefore important.
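One hedged way to check an augmentation strategy is to compare how key attributes are distributed in the real versus the synthetic set; a large divergence suggests the synthetic data is distorting, amplifying or masking bias. The sketch below assumes a hypothetical customer “segment” attribute and uses a simple chi-square test as the comparison.

```python
# Sketch: compare the mix of a hypothetical customer segment in real vs
# synthetic data to flag bias introduced (or amplified) by augmentation.
import numpy as np
from scipy.stats import chisquare

real_counts = np.array([520, 310, 170])         # customers per segment, real data
synthetic_counts = np.array([2900, 1400, 700])  # customers per segment, synthetic data

# Scale the real proportions to the synthetic sample size before testing
expected = real_counts / real_counts.sum() * synthetic_counts.sum()
stat, p_value = chisquare(f_obs=synthetic_counts, f_exp=expected)

if p_value < 0.05:
    print("Warning: synthetic segment mix deviates from the real data (possible bias)")
else:
    print("Synthetic segment mix is consistent with the real data")
```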
If the synthetic set doesn’t truly represent the original customer data set, it might contain the wrong buying signals regarding what customers are interested in or are inclined to buy.
Synthetic data also requires some form of output/quality control and internal regulation, specifically in highly regulated industries such as Financial Services.
Creating incorrect synthetic data also can get a company in hot water with external regulators. For example, if a company created a product that harmed someone or didn’t work as advertised, it could lead to substantial financial penalties and, possibly, closer scrutiny in the future.
Conclusion
Synthetic data allows us to continue developing new and innovative products and solutions when the data necessary to do so would not otherwise be present or available due to volume, data sensitivity or user privacy challenges. Generating synthetic data also brings the flexibility to adjust its nature and environment as and when required, both to improve the performance of the model and to create opportunities to check for outliers and extreme conditions.

Why do we use AI? The goal of a business in adding intelligence is to enhance decision-making and to grow revenue and profit within the framework of its business model.
The problem many organisations face is that they understand their own core competence in their own industry, but they do not understand how to tweak and enhance business processes to make the business run better. For example, AI can help transform the way companies run their production lines, enabling greater efficiency by enhancing human capabilities, providing real-time insights, and facilitating design and product innovation. But first, one has to be able to understand and digest the data within the organisation that would allow that to happen.
Ecosystm research shows that AI adoption crosses the gamut of business processes (Figure 1), but not all firms are process-optimised to achieve those goals internally.

The initial landscape for AI services primarily focused on tech companies building AI products into their own solutions to power their own services. So, the likes of Amazon, Google and Apple were investing in people and processes for their own enhancements.
As the benefits of AI become more relevant in a post-pandemic world of staff and resource shortages, non-tech firms are becoming interested in applying those advantages to their own business processes.
AI for Decisions
Recent start-up ventures in AI are focusing on non-tech companies and offering services to get them to use AI within their own business models. Peak AI says that their technology can help enterprises that work with physical products to make better, AI-based evaluations and decisions, and has recently closed a funding round of USD 21 million.
The relevance of this is around the terminology that Peak AI has introduced. They call what they offer “Decision Intelligence” and are crafting a market space around it. Peak’s basic premise is to offer AI not as a business goal in itself but as a business service, delivered through a solution and focused on particular types of added value. The goal of Peak AI is to identify where Decision Intelligence can add value, and to help the company build a business case that is both achievable and commercially viable.
For example, UK hard landscaping manufacturer Marshalls worked with Peak AI to streamline their bid process with contractors. This allows customers to get the answers they need in terms of bid decisions and quotes quickly and efficiently, significantly speeding up the sales cycle.

AI-as-a-Service is not a new concept. Canadian start-up Element AI tried to create an AI services business for non-tech companies to use as they might these days use consulting services. It never quite got there, though, and was acquired by ServiceNow last year. Peak AI is looking at specific elements such as sales, planning and supply chain for physical products – how decisions are made and where adding some level of automation to the decision is beneficial. The Peak AI solution, CODI (Connected Decision Intelligence), sits as a layer of intelligence between the other systems, ingesting the data and aiding in its utilisation.
Adding a tool that creates a data-ingestion layer for business decision-making is quite a trend right now. For example, IBM’s Causal Inference 360 Toolkit offers access to multiple tools that can move decision-making processes from “best guess” to concrete answers based on data, helping data scientists apply and understand causal inference in their models.
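The underlying shift these tools support can be illustrated with a small, generic example (this is a plain stratified-adjustment sketch, not the API of IBM’s toolkit or Peak’s CODI): instead of comparing raw outcomes between customers who did and did not receive a promotion, we adjust for a hypothetical confounder so the decision rests on a less biased effect estimate.

```python
# Sketch: naive vs confounder-adjusted estimate of a promotion's effect on spend.
# The data and the "high_value" confounder are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "high_value": [1, 1, 1, 1, 0, 0, 0, 0, 1, 0],
    "promo":      [1, 1, 1, 0, 1, 0, 0, 0, 1, 1],
    "spend":      [120, 130, 125, 118, 60, 45, 50, 48, 128, 62],
})

# Naive estimate: raw difference in mean spend between promo and no-promo customers
naive = df[df.promo == 1].spend.mean() - df[df.promo == 0].spend.mean()

# Adjusted estimate: measure the effect within each stratum of the confounder,
# then weight by how common each stratum is (a simple back-door adjustment)
effects = df.groupby("high_value").apply(
    lambda g: g[g.promo == 1].spend.mean() - g[g.promo == 0].spend.mean()
)
weights = df["high_value"].value_counts(normalize=True)
adjusted = (effects * weights).sum()

print(f"naive effect: {naive:.1f}, adjusted effect: {adjusted:.1f}")
```

In this toy data the naive comparison overstates the promotion’s effect because high-value customers are both more likely to receive it and more likely to spend more anyway.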
Implications for Business Processes
The bigger problem is not the volume of data, but the interpretation of it.
Data warehouses and other ways of gathering data into a central or cloud-based location to digest are also not new. The real challenge lies in interpreting what the data means and which decisions can be fine-tuned with this data. This implies that data modelling and process engineers need to be involved. Not every company has thought through the possible options for their processes, nor are they necessarily ready to implement these new processes in terms of resources and priorities. This also requires data harmonisation rules, consistent data quality and managed data operations.
Given the increasing flow of data in most organisations, external service providers offering AI solution layers embedded in the infrastructure as data filters could be helpful in making sense of what exists. They can perhaps also suggest how the processes themselves can be readjusted to match the growth possibilities of the business. This is likely a great footprint for the likes of Accenture, KPMG and others as process wranglers.


The process of developing advertising campaigns is evolving with the increasing use of artificial intelligence (AI). Advertisers want to make the most of the data at their disposal to craft better campaigns and drive more impact. Since early 2020, there has been a real push to integrate AI to help measure the effectiveness of campaigns and decide where to allocate ad spend. This now goes beyond media targeting and includes planning, analytics and creative. AI can assist in pattern matching, tailor messages through hyper-personalisation, and analyse traffic to identify the best times and means of communication. AI is being used to create ad copy; and social media and online advertising platforms are starting to roll out tools that help advertisers create better ads.
Ecosystm research shows that Media companies report that optimisation, targeting and administrative functions such as billing are aided by AI use (Figure 1). However, the trend of Media companies leveraging AI for content design and media analysis is growing.

WPP Strengthening Tech Capabilities
This week, WPP announced the acquisition of Satalia, a UK-based company, which will consult with all WPP agencies globally to promote AI capabilities across the company and help shape the company’s AI strategy, including research and development, AI ethics, partnerships, talent and products.
It was announced that Satalia, whose clients include BT, DFS, DS Smith, PwC, Gigaclear, Tesco and Unilever, will join Wunderman Thompson Commerce to work on the technology division of their global eCommerce consultancy. Prior to the acquisition, Satalia had launched tools such as Satalia Workforce to automate work assignments; and Satalia Delivery, for automated delivery routes and schedules. The tools have been adopted by companies including PwC, DFS, Selecta and Australian supermarket chain Woolworths.
Like other global advertising organisations, WPP has been focused on expanding the experience, commerce and technology parts of the business, most recently acquiring Brazilian software engineering company DTI Digital in February. WPP also launched their own global data consultancy, Choreograph, in April. Choreograph is WPP’s newly formed global data products and technology company focused on helping brands activate new customer experiences by turning data into intelligence. This article from last year from the WPP CTO is an interesting read on their technology strategy, especially their move to cloud to enable their strategy.

Ethics & AI – The Right Focus
The acquisition of Satalia will give WPP an opportunity to evaluate important areas such as AI ethics, partnerships and talent, which will be increasingly important in the medium term. AI ethics in advertising is also a longer-term discussion. With AI and machine learning, the system learns patterns that help steer targeting towards audiences that are more likely to convert and identify the best places to get your message in front of these buyers. If done responsibly, it should provide consumers with the ability to learn about and purchase relevant products and services. However, as we have recently discussed, AI has two main forms of bias – underrepresented data and developer bias – that also need to be looked into.
Summary
The role of AI in the orchestration of the advertising process is developing rapidly. Media firms are adopting cloud platforms, making IP investments, and developing partnerships to build the support they can offer with their advertising services. The use of AI in advertising will help the process mature, becoming even more tailored to customer preferences.


As we return to the office, there is a growing reliance on devices to tell us how safe and secure the environment is for our return. And in specific application areas, such as Healthcare and Manufacturing, IoT data is critical for decision-making. In some sectors such as Health and Wellness, IoT devices collect personally identifiable information (PII). IoT technology is so critical to our current infrastructures that, when it is compromised, the physical wellbeing of both individuals and organisations can be at risk.
Trust & Data
IoT devices are also vulnerable to breaches if not properly secured. And with a significant increase in cybersecurity events over the last year, the reliance on data from IoT is driving the need for better data integrity. Security features such as data integrity and device authentication can be accomplished through the use of digital certificates, and these features need to be designed into the device prior to manufacturing. If you cannot trust the IoT devices or their data, there is no point in collecting, running analytics, and executing decisions based on the information collected.
We discuss the role of embedding digital certificates into the IoT device at manufacture to enable better security and ongoing management of the device.
Securing IoT Data from the Edge
So much of what is happening on networks in terms of real-time data collection happens at the Edge. But because of the vast array of IoT devices connecting at the Edge, there has not been a way of baking trust into the manufacture of the devices. With a push to get the devices to market, many manufacturers historically have bypassed efforts on security. Devices have been added on the network at different times from different sources.
There is a need to verify the IoT devices and secure them, making sure to have an audit trail on what you are connecting to and communicating with.
So from a product design perspective, this leads us to several questions:
- How do we ensure the integrity of data from devices if we cannot authenticate them?
- How do we ensure that the operational systems being automated are controlled as intended?
- How do we authenticate the device on the network making the data request?
Using a Public Key Infrastructure (PKI) approach maintains the assurance, integrity and confidentiality of data streams. PKI has become an important way to secure IoT device applications, and it needs to be built into the design of the device. Device authentication is also an important component, in addition to securing data streams. With good design and PKI management that is up to the task, you should be able to proceed with confidence in the data created at the Edge.
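As a minimal sketch of what certificate-based device authentication buys you, the snippet below uses the Python cryptography library: the device holds a private key provisioned at manufacture, the matching public key travels in its certificate (CA issuance and chain validation are omitted here), and the gateway verifies that a signed payload really came from that device and was not altered in transit.

```python
# Sketch: the device signs its payload; the gateway verifies the signature using
# the public key from the device certificate (CA issuance and chains omitted).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# At manufacture: the device is provisioned with a private key
device_private_key = ec.generate_private_key(ec.SECP256R1())
device_public_key = device_private_key.public_key()  # carried in the device certificate

# On the device: sign each outgoing payload
payload = b'{"sensor_id": "door-17", "temp_c": 21.4}'
signature = device_private_key.sign(payload, ec.ECDSA(hashes.SHA256()))

# On the gateway: verify origin and integrity before trusting the data
try:
    device_public_key.verify(signature, payload, ec.ECDSA(hashes.SHA256()))
    print("Payload authenticated - safe to ingest")
except InvalidSignature:
    print("Rejecting payload - signature check failed")
```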
Johnson Controls and DigiCert have designed a new way of managing PKI certification for IoT devices through their partnership and the integration of the DigiCert ONE™ PKI management platform with the Johnson Controls OpenBlue IoT device platform. Based on an advanced, container-based design, DigiCert ONE allows organisations to implement robust PKI deployment and management in any environment, roll out new services, and manage users and devices across the organisation at any scale, no matter the stage of their lifecycle. This creates an operational synergy within the Operational Technology (OT) and IoT spaces to ensure that hardware, software and communication remain trusted throughout the lifecycle.

Rationale on the Role of Certification in IoT Management
Digital certificates ensure the integrity of data and device communications through encryption and authentication, ensuring that transmitted data are genuine and have not been altered or tampered with. With government regulations worldwide mandating secure transit (and storage) of PII data, PKI can help ensure compliance with the regulations by securing the communication channel between the device and the gateway.
Connected IoT devices interact with each other through machine to machine (M2M) communication. Each of these billions of interactions will require authentication of device credentials for the endpoints to prove the device’s digital identity. In such scenarios, an identity management approach based on passwords or passcodes is not practical, and PKI digital certificates are by far the best option for IoT credential management today.
Creating lifecycle management for connected devices, including revocation of expired certificates, is another example where PKI can help to secure IoT devices. Having a robust management platform that enables device management, revocation and renewal of certificates is a critical component of a successful PKI. IoT devices will also need regular patches and upgrades to their firmware, with code signing being critical to ensure the integrity of the downloaded firmware – another example of the close linkage between the IoT world and the PKI world.
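The code-signing side of this can be reduced to a simple rule: the device only installs firmware whose digest matches the value published in a signed release manifest. The sketch below shows just the digest comparison; the signature over the manifest itself, and the secure boot chain, are assumed to be handled elsewhere.

```python
# Sketch: verify a downloaded firmware image against the digest published in a
# signed release manifest before installing it (manifest signature check omitted).
import hashlib

def firmware_is_intact(firmware: bytes, expected_sha256: str) -> bool:
    return hashlib.sha256(firmware).hexdigest() == expected_sha256

firmware_image = b"\x7fELF...hypothetical firmware bytes..."
manifest_digest = hashlib.sha256(firmware_image).hexdigest()  # stand-in for the signed value

if firmware_is_intact(firmware_image, manifest_digest):
    print("Firmware digest matches - proceed with the update")
else:
    print("Digest mismatch - abort the update")
```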
Summary
PKI certification benefits both people and processes. PKI enables identity assurance, while digital certificates validate the identity of the connected device. The use of PKI for IoT is a necessary trend for establishing trust in the network and for quality control of device management.
Identifying the IoT device is critical to managing its lifespan and recognising its legitimacy on the network. Building PKI capability into the device at manufacture is critical to enabling it for its lifetime. By recognising a device, information on it can be maintained in an inventory, and its lifecycle and replacement can be better managed. Once a certificate has been distributed and certified, control of the PKI systems provides lifecycle management.


Organisations have found that it is not always desirable to send data to the cloud due to concerns about latency, connectivity, energy, privacy and security. So why not create learning processes at the Edge?
What challenges does IoT bring?
Sensors are now generating such an increasing volume of data that it is not practical that all of it be sent to the cloud for processing. From a data privacy perspective, some sensor data is sensitive and sending data and images to the cloud will be subject to privacy and security constraints.
Regardless of the speed of communications, there will always be a demand for more data from more sensors – along with more security checks and higher levels of encryption – causing the potential for communication bottlenecks.
As the network hardware itself consumes power, sending a constant stream of data to the cloud can be taxing for sensor devices. The lag caused by the roundtrip to the cloud can be prohibitive in applications that require real-time response inputs.
Machine learning (ML) at the Edge should be prioritised to leverage that constant flow of data and address the requirement for real-time responses based on that data. This should be aided by both new types of ML algorithms and by visual processing units (VPUs) being added to the network.
By leveraging ML on Edge networks in production facilities, for example, companies can look out for potential warning signs and schedule maintenance to avoid any nasty surprises. Remember that many sensors are linked intrinsically to public safety concerns such as water processing, the supply of gas or oil, and public transportation such as metros or trains.
Ecosystm research shows that deploying IoT has its set of challenges (Figure 1) – many of these challenges can be mitigated by processing data at the Edge.

Predictive analytics is a fundamental value proposition for IoT, where responding faster to issues – or taking action before issues occur – is key to a high return on investment. So, using edge computing for machine learning located within or close to the point of data gathering can in some cases be a more practical or socially beneficial approach.
In IoT, the role of an edge computer is to pre-process data and act before the data is passed on to the main server. This allows a faster, lower-latency response and minimal traffic between cloud server processing and the Edge. However, a better understanding of the benefits of edge computing is required if it is to be beneficial across a number of outcomes.
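A minimal sketch of that edge pattern is shown below: the edge node keeps a rolling window of readings, acts locally the moment a reading breaches a threshold, and forwards only a compact summary upstream instead of every raw sample. The threshold, window size and the send_to_cloud stub are illustrative assumptions.

```python
# Sketch: an edge node pre-processes readings, acts locally on anomalies,
# and forwards only periodic summaries upstream. All names are hypothetical.
from collections import deque
from statistics import mean

window = deque(maxlen=60)   # last 60 readings held at the edge
ALERT_THRESHOLD = 85.0      # e.g. degrees Celsius

def send_to_cloud(summary: dict) -> None:
    print("uplink:", summary)               # stand-in for an MQTT/HTTPS publish

def trigger_local_shutdown() -> None:
    print("local actuator: closing valve")  # immediate, low-latency action

def on_reading(value: float) -> None:
    window.append(value)
    if value > ALERT_THRESHOLD:
        trigger_local_shutdown()            # act before any cloud round trip
    if len(window) == window.maxlen:
        send_to_cloud({"mean": round(mean(window), 1), "max": max(window), "n": len(window)})
        window.clear()

for reading in [70.1, 72.3, 90.5, 71.0]:    # simulated sensor stream
    on_reading(reading)
```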


If we can get machine learning happening in the field, at the Edge, then we reduce the time lag and also create an extra trusted layer in unmanned production or automated utilities situations. This can create more trusted environments in terms of possible threats to public services.
What kind of examples of machine learning in the field can we see?
Healthcare
Health systems can improve hospital patient flow through machine learning (ML) at the Edge. ML offers predictive models to assist decision-makers with complex hospital patient flow information based on near real-time data.
For example, an academic medical centre created an ML pipeline that leveraged all its data – patient administration, EHR, and clinical and claims data – to create learnings that could predict length of stay, emergency department (ED) arrival models, ED admissions, aggregate discharges, and total bed census. These predictive models proved effective, as the medical centre reduced patient wait times and staff overtime and was able to demonstrate improved patient outcomes. And for a medical centre that uses sensors to monitor patients and gather requests for medicine or assistance, Edge processing means keeping private healthcare data in-house rather than sending it off to cloud servers.
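The kind of predictive model described can be sketched very simply as a regression over admission features; the features, coefficients and simulated data below are purely illustrative stand-ins for the much richer EHR, administration and claims data a real pipeline would use.

```python
# Sketch: a toy length-of-stay model trained on hypothetical admission features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
age = rng.integers(18, 95, n)
via_ed = rng.integers(0, 2, n)         # admitted through the emergency department
comorbidities = rng.poisson(1.5, n)

# Simulated outcome: length of stay in days
los_days = 2 + 0.03 * age + 1.5 * via_ed + 0.8 * comorbidities + rng.normal(0, 1, n)

X = np.column_stack([age, via_ed, comorbidities])
X_train, X_test, y_train, y_test = train_test_split(X, los_days, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("predicted stay (days):", model.predict(X_test[:3]).round(1))
```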
Retail
A retail store could use numerous cameras for self-checkout and inventory management and to monitor foot traffic. Such specific interaction details could slow down a network and can be replaced by an on-site Edge server with lower latency and a lower total cost. This is useful for standalone grocery pop-up sites such as in Sweden and Germany.
In Retail, k-nearest neighbours is often used for abnormal activity analysis – this learning algorithm can also be applied to visual pattern recognition as part of retailers’ loss prevention tactics.
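A minimal sketch of the k-nearest-neighbours idea for abnormal activity analysis: score each new checkout event by its average distance to its nearest neighbours among historical “normal” events, and flag events that sit unusually far away. The features and the cut-off are hypothetical.

```python
# Sketch: distance to nearest neighbours as an abnormality score for checkout events.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
# Hypothetical normal events: (items scanned, seconds spent at checkout)
normal_events = np.column_stack([rng.poisson(12, 500), rng.normal(90, 20, 500)])

knn = NearestNeighbors(n_neighbors=5).fit(normal_events)

new_events = np.array([[11, 85.0],     # looks like a typical checkout
                       [40, 15.0]])    # many items scanned very fast: suspicious
distances, _ = knn.kneighbors(new_events)
scores = distances.mean(axis=1)

THRESHOLD = 25.0                       # hypothetical cut-off tuned on historical data
for event, score in zip(new_events, scores):
    label = "ABNORMAL" if score > THRESHOLD else "ok"
    print(event, round(float(score), 1), label)
```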
Summary
Working with the data locally at the Edge reduces latency, cloud usage and costs, creates independence from a network connection, and delivers more secure data and increased data privacy.
Cloud and Edge computing that uses machine learning can together provide the best of both worlds: decentralised local storage, processing and reaction, and then uploading to the cloud, enabling additional insights, data backups (redundancy), and remote access.


The last year has really pushed the Education sector into transforming both its teaching and learning practices. The urgency of the situation accelerated the use of networking to extend the reach and range of educational opportunities for remote learning.
Education technology has rushed to embrace opportunities to facilitate a new normal for Education. This new normal must enable and support education access, experiences, and outcomes as well as aid in developing strong relationships within Education ecosystems.
Education technology, commonly known as EdTech, focuses on leveraging emerging technologies like cloud and AI to deliver interactive and multimedia coursework over online platforms. This also requires a state-of-the-art network to support it. 5G provides near-instantaneous access to cloud services. Use of 5G – as well as network function virtualisation (NFV), network slicing, and multi-access edge computing (MEC) – has the capability to deliver significant performance benefits across these emerging educational applications and use cases.
At present, many educational institutions are aware of the possibilities, but are not active users of 5G network infrastructure (Figure 1).

Educational institutions plan some near-term investments but are not clear on which areas to apply the enhanced capabilities to (Figure 2).

Role of the Network in Adaptive Learning
In their recent whitepaper, network provider Ciena talks about “the concept of an adaptive learning strategy – a technology-based teaching method that replaces the traditional one-size-fits-all teaching style with one that is more personalised to individual students. This approach leverages next-generation learning technologies to analyse a student’s performance and reactions to digital content in real-time, and modifies the lesson based on that data.”
To create an adaptive learning strategy that can be individualised, these learners need to be enabled by technology to be immersed in a learning experience, complete with multimedia and access to a knowledge base for information. And this is where a solid 5G network implementation can create access and bandwidth to the resources required.
Example of 5G and Immersive Learning
An example of adaptive learning where the technology not only supports but challenges the learner can be found in a new BT-led immersive classroom developed within the Muirfield Centre in Cumbernauld, North Lanarkshire, which uses innovative technology to transform a classroom into an engaging, digital learning environment.
Pupils at Carbrain Primary School, Cumbernauld, were the first to dive into the new experience with an underwater lesson about the ocean. The 360-degree room creates a digital projection that uses all four classroom walls and the ceiling to bring the real world into an immersive experience for students. The concept aims to push beyond traditional methods of teaching to create an inclusive digital experience that helps explain abstract and challenging concepts through a 3D model. It also has the potential to support students with learning difficulties in developing imagination, creative and critical thinking, and problem-solving skills. BT has deployed its 5G Rapid Site solution to support 5G innovation and the digital transformation of the UK’s Education sector. The solution is made possible through the EE 5G network, which brings ultrafast speeds and enhanced reliability to classrooms.
Conclusion
5G is expected to provide network improvement in the areas of latency, energy efficiency, the accuracy of terminal location, reliability, and availability – therefore creating the ability to better leverage cloud capacity.
With the greater bandwidth that 5G provides, learners and instructors can connect virtually from any location with minimal disruption, and with more devices than on previous networks. This allows students to enjoy a rich learning experience and not be disadvantaged by their location for remote learning, or by the uncertainty of educational access. It also opens more possibilities for exploration and discovery beyond the physical confines of the classroom and puts those resources in the hands of eager learners.
As educational institutions reopen, they are looking at ways to redesign the education experience. Connected devices are helping schools and universities expand the boundaries of education. Explore what the IoT-enabled future of education could look like.


Many opinions exist on how automation and machine learning will help our return to the office environment. Removing physical touchpoints and leveraging machine learning to trace employee behaviour can help with the transition back to the workplace. But will people trust the office’s automated suggestions on where to work in the building, or help themselves to alternative workspaces?
Processes & Trust for People Engagement
Organisations such as Disney and Amazon understand what kinds of processes and trust it takes to engage people. These organisations took their time to create a vision of the contactless trusted experience before developing an implementation plan. The RFID wristbands at Disney that open hotel doors and get you on to rides involve many elements of trust and privacy. Amazon’s automated order and delivery tracking, along with suggestions and buying patterns, requires the person to opt in and share information to make it happen.
So once employees re-enter the workplace, how will your company create those processes, that level of trust and faith, that would allow movements and health status to be tracked by office automation? For example, how often should employees overtly be aware of their temperature being scanned?
Abilities of Buildings to Manage
Facilities management is trending towards intelligent building management systems (iBMS) which know about room occupancy and room hygiene, and track who has been where and with whom. Elevators will limit occupancy and direct users to the correct lift for their destination. I have already seen this in our city hospital, where you are directed to the correct lift once you have entered your destination. This combines user interface devices such as touchless pads, system hardware, and access control management software.
The building could also direct you, via a building app, to request a place to work. You could swipe your personnel card and then be shown several options based on your personal profile and job role, including private quiet rooms, communal areas, and outside meeting tables. Previous occupants can be noted to support hygiene tracing if necessary. Intelligent buildings already offer direct support to the employees who interact with them for HVAC, lighting control, and occupancy sensing. They have the ability to reduce user friction while raising workplace experience metrics to create a measured environment.
User Trust & Participation
Users should be willing to participate to get access. To create the trust that is required for employees to be willing to participate in the process, companies need to share policies and demonstrate stewardship of the data accessed. Who is holding my locational data, for how long, and for what purpose?
Trust facilitates successful data sharing, which in turn reinforces trust. Trust is built when the purpose of data sharing is made clear, and when those involved in the process know each other, understand each other’s expectations, and carry out their commitments as agreed. Trust increases the likelihood of further collaboration and improves core surveillance capacity by supporting surveillance networks.
Conclusion
Will we put our trust in buildings and facilities management on our return to the office? If communication is clear and policy well articulated, the building can play a role in engaging users to return to some standards of in-office participation. But if communication is muddy and policy not made clear, people will make their own way to safety – potentially impacting the environment of others.
Transform and be better prepared for future disruption, and the ever-changing competitive environment and customer, employee or partner demands in 2021. Download Ecosystm Predicts: The top 5 Future of Work Trends For 2021.


We are heading into the one-year anniversary of global COVID confinements. This confinement period has seen the Hospitality industry hit hard by the lack of mobility of populations and by government regulations. Hotels had previously relied on a consistent flow of booking and revenue information, using historical and current pricing data from distribution and revenue management tools. They have adapted to the “new normal”, and the evolution of hotel infrastructure during this period – forced by necessity – has led them to try to create contactless, more automated interactions, both for efficiency and because of the work-from-home status of many employees.
Ecosystm research shows the digital technology focus of the industry to address the necessary shifts, in 2021 (Figure 1).

Distribution Data in the New Normal
Hotels are still struggling to get a clear overview of demand forecasting. Their data infrastructure is evolving and will continue to evolve to tackle this problem. The reliance on distribution information had to shift as fluidity in bookings could not rely on historical norms.
Hotels use a complex structure of promotion via distribution channels. This includes direct booking via websites or central call centres, and the use of online travel agents (OTAs), bed banks and wholesalers. That mix of channels was monitored and managed by the properties to optimise room occupancy across them. Over the past decades there has been an increased reliance on OTAs. But in more recent years, many hotel players have pushed back, promoting direct bookings made through their own website booking engines or other direct means.
The pandemic has disrupted this complex orchestration of data. Moving from 65-75% occupancy to 10-15% was not financially viable for hotels. Because the pandemic reduced demand, both direct bookings and OTA bookings have grown their share at the expense of other channels such as bed banks and global distribution systems (GDS). Guests wanted confirmation of the status of the hotel and what services were available, so data enriched with content from the hotel itself, or from frequently updated OTA services, became the reliable sources.
Building Better Bundles and Contact Points
The goal for many hotels was to create a frictionless digital customer journey (preferably by brand), leveraging existing infrastructures and integrating them with mobile apps, a more robust CRM, and a more flexible set of property management tools. Part of that integration was launching new hygiene initiatives and branding them as part of the offering.
New bundles and packages were created to deal with the hygiene constraints and the new form of guest stays (daycation, staycation, remote learning) that have developed from the pandemic conditions.
Workcations using the hotel facilities as a workplace became attractive for those stuck at home with many interruptions. InterContinental Hotels Group, Marriott and Accor are among the major names that have launched or are considering monthly payment plans, as the hotel industry tries to attract restless remote workers ready for a change of scene.
The disconnect in guest information is being addressed by rebuilding the infrastructure of the guest journey – tracking the pre-stay investigation and booking interaction, the on-property engagement guests have with the hotel and its staff, their in-room experience, and their sharing of feedback on social media post-stay, all of which are part of the guest experience.
Multiple business priorities will guide the industry in 2021 (Figure 2).

For the hotels serving different customer segments, specific actions were initiated.
- For the economy hotel chains, the flow of customers was not that significantly different, but how they booked and how many rooms they needed changed. This was handled more at the individual hotel property level as different COVID constraints applied to different regions.
- Larger chains already had their property management systems (PMS) tied to a centralised structure, but a chunk of their business (leisure, corporate and business events) was directly tied to the restrictions on the domestic population and the inability to access international guests.
- For luxury brands, it was a bit of a challenge as the hygiene aspect impacted the use of several extras that luxury brands rely on, such as spas, one-to-one interaction and facilities.
- Independent hotels needed some guidance that they were not getting from historical norms. Many went to external infrastructure providers to try to create workflow processes that would help them stay afloat.
Technology investments: Some Examples
One of the first concerns of regional travellers was the operational status of the hotel. One example of a digital investment comes from Louvre Hotels Group, Europe’s second-largest hotel group, which used its ‘Résa Pro’ dedicated reservation platform for working professionals. It showed the listing of available accommodation per city and region for business travellers, to meet the accommodation and catering needs of retail and sales professionals. Using this digital platform, companies could locate the Group’s open hotels in the city or region of their choice and see which guest offering best suited their requirements.
This webcast with Radisson’s Remy Merckx and Managing Director Sally Richards from RaspberrySky is a great example of building a digital platform to restructure the guest experience. Radisson outsourced the building of a digital platform that linked their eight hotel brands under one platform for a consistent digital experience, leveraging mobile, social and cloud technologies. The higher engagement rate with the mobile app and the chatbot helped create the contactless experience guests are now looking for in their accommodation journeys.
Many brands are now focusing on app-centric approaches for the guests, adding the value of human engagement for the more complex tasks. The emphasis is on the brand and digitising the guest journey to make it more customer-centric. This has been a time of reflection for some of the more organised hotel chains to make the time investment into the digital journey, upskill and upscale their operations to be in line with customer engagement.
New Normal for Hotel Stays
But not every independent hotel or small hotel chain was able to make that financial investment during this period. According to Ecosystm data, approximately 41% of hospitality firms put their digital transformation on hold in 2020 – higher than any other industry that we cover. Technologies that will see increased investment in 2021 include cloud collaboration (44%) and cloud enterprise solutions (23%).
What does cloud have to do with this? Cloud is part of the infrastructural investment that allows the Hospitality industry to connect and enable its participants throughout the ecosystem, enabling mobile and social as well. This enables service providers to engage with intermediary partners, travel agents, consolidators and consumers, hyperconnecting in ways that provide convenience, ease of use and seamless information retrieval – from bed banks and timetables to business rules and collaborative mapping of codes.
This use of technology transforms the elements of inventory and availability into experiences and destinations.
- Messaging tools help harmonise communication across the network.
- Monitoring apps manage factors that impact distribution health, including rate integrity, availability, and visibility.
- AI – for example in the form of voice assistants – helps guide consumers and partners to timely information and decision making.
But it will still be a blend of digital solutions and human interaction, where humans add the core competency and collective knowledge, and technology provides the seamless data exchange and network connectivity.
Acknowledgements
- Sally Richards at RaspberrySky
- Anders Johansson at Hospitality Visions
- Mark Haywood and Ankit Chaturvedi at RateGain
New Normal for The Hospitality Industry
Get more insights on the priorities and the road to recovery of the Hospitality industry. Create your free Ecosystm account to take part in this study and gain access to a benchmark of how you compare to your peers.


Authored by Alea Fairchild and Audrey William
There is a lot of hope pinned on AI and automation to create intellectual wealth, efficiency, and support for some level of process stability. After all, can’t we just ask Siri or Alexa and get answers so we can make a decision and carry on?
Automation has been touted as the wonder formula for workplace process optimisation. In reality it’s not the quick fix that many business leaders desire. But we keep raising the bar on expectations from automation. Investments in voice technologies, intelligent assistants, augmented reality and touchscreens are changing customer experience (Figure 1). Chatbots are ubiquitous, and everything has the potential to be personalised. But will they solve our problems?

100 percent automation is not effective
Let’s first consider using automation to replace face-to-face interactions. There was a time when people were raving about the check-in experience at some of the hotels in Japan where robots and automated systems would take care of the check-in, in-stay and check-out processes. Sounds simple and good? Till 2019, if you checked into the Henn-na Hotel in Japan, you would be served and taken care of by 243 robots. It was viewed by many as a template for what a fully automated hotel could look like in the future.
The hotel had an in-room voice assistant called Churi. It could cope with basic commands, such as turning the lights on and off, but it was found to be deficient when guests started asking questions about places to visit or other more sophisticated queries. It was not surprising that the hotel decided to retire their robots. In the end it created more work for the hotel staff on-site.
People love the personal touch when they are in a hotel; talking to someone at the front desk, requesting assistance from hotel staff, or even just a short chat over breakfast are some of the small nuances of why the emotional connection matters. Many quarantine hotels today use robots for food delivery, but the hotel staff is still widely available for questions. That automation is good, but you still need human intervention. So, getting the balance right is key.
Empathy plays a big role in delivering great Customer Experience
Similarly, there was a time when many industry observers and technology providers said that the contact centre would be fully automated, reducing the number of agents. While technologies such as Conversational AI now allow common or repetitive questions to be automated with higher accuracy, the human agent still plays a critical role in answering the more complex queries. When the customer has a complicated question or request, they will WANT to speak to an agent.
When the conversation with the chatbot starts getting complicated and the customer needs more help, there should be an option – within the app, website or any other channel – to escalate the call seamlessly to a human agent. Sometimes, a chat is where the good experience happens – the emotional side of the conversation, the laughter, the detailed explanation. This human touch cannot be replaced by machines. Disgruntled customers are happier when an agent shows empathy. Front-line staff and human agents act as the face of a company’s brand. Complete automation will not allow the individual to understand the culture of the company; that can only be attained through conversations.
Humans as supervisors for AI – The New Workplace
Empathy, intuitiveness, and creativity are all human elements in the intelligence equation. Workers in the future will need to make their niche in a fluid and unpredictable environment; and translating data into action in a non-replicable way is one of the values of human input. The essence of engineering is the capacity to design around human limitations. This requires an understanding of how humans behave and what they want. We call that empathy. It is the difference between the engineer who designs a product, and the engineer who delivers a solution. We don’t teach our computer scientists and engineering students a formula for empathy. But we do try to teach them respect for both the people and the process.
For efficiency, we turn to automation of processes, such as RPA. This is designed to try to eradicate human error and assist us in doing our job better, faster and at a lower cost by automating routine processes. If we design it right, humans take the role of monitoring or supervisory controlling, rather than active participation.
At present, AI is not seen as a replacement for our ingenuity and knowledge, but as a support tool. The value in AI is in understanding and translating human preferences. Human-in-the-loop AI system building puts humans in the decision loop. It also shifts pressure away from building “perfect” algorithms. Having humans involved in the ethical norms of the decision provides a backstop against overly orchestrated algorithms.
That being said, the astute use of AI can deepen insights into what truly makes us human and can humanise experiences by setting a better tone and a more trusted engagement. Techniques like sentiment analysis can help de-escalate customer service encounters and regain customer loyalty.
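As a minimal sketch of that idea, the snippet below scores incoming messages with a simple word-list heuristic and escalates strongly negative conversations to a human agent; a production system would use a trained sentiment model and richer routing rules, so the word lists and threshold here are just stand-ins.

```python
# Sketch: escalate strongly negative messages to a human agent. The word lists
# and threshold are hypothetical stand-ins for a trained sentiment model.
import re

NEGATIVE = {"angry", "terrible", "useless", "cancel", "frustrated", "worst"}
POSITIVE = {"thanks", "great", "happy", "resolved", "perfect"}

def sentiment_score(message: str) -> int:
    words = set(re.findall(r"[a-z']+", message.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

def route(message: str) -> str:
    if sentiment_score(message) <= -2:
        return "escalate to human agent"
    return "continue with chatbot"

print(route("This is the worst service, I am frustrated and want to cancel"))
print(route("Thanks, that resolved my issue"))
```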
The next transformational activity for renovating work is to advance interactions with customers by interpreting what they are asking for and humanising the experience of acquiring it – which may include actually dealing with a human contact centre agent. These are decisions supported at the edge by automation, but at the core by a human being.
Implications
Ecosystm research shows that process automation will be a key priority for technology investments in 2021 (Figure 2).

With AI and automation a priority in 2021, it will be important to keep these considerations in mind:
- Making empathy and the human connection the core of customer experiences will bring success.
- Rigorous, outcome-based testing will be required when process automation solutions are being evaluated. In areas where there are unsatisfactory results, human interactions cannot – and should not – be replaced.
- It may be easy to achieve 90% automation for dealing with common, repetitive questions and processes. But there should always be room for human intervention in the event of an issue – and it should be immediate and not 24 hours later!
- Employees can drive greater value by working alongside the chatbot, robot or machine.
Ecosystm Predicts: The Top 5 Customer Experience Trends for 2021
Download Ecosystm’s complimentary report detailing the top 5 customer experience trends for 2021 that your company should pay attention to along with tips on how to stay ahead of the curve.
