The Confidential Computing Consortium (CCC) aims to define and accelerate the adoption of Confidential Computing by bringing together a community of open-source technologies and experts.
New Cross-Industry Effort
Many government agencies, consortia, and software and hardware vendors are already working on data security, so a key question here is: who needs this new effort?
Commenting on CCC, Claus Mortensen, Principal Advisor at Ecosystm said, “whether this is really ‘needed’ is a matter of perspective though. Many would argue that a project driven by technology giants is contrary to the grassroots approach of earlier open source projects. However, Confidential Computing is a complex area that involves both hardware and software. Also, the stakes have become high for businesses and consumers alike, as data privacy has become the focal point. Arguably, the big tech companies need this Open Source initiative more than anyone at the moment.”
How will the CCC benefit enterprises and business users?
With the increasing adoption of cloud environments, on-premise servers, the edge, IoT and other technologies, it is crucial for enterprises to strengthen the security of their data. While there are established approaches to encrypting data at rest (storage) and in transit (network), encrypting data in use, while it is being processed, remains challenging; closing that gap across the full data life cycle is what Confidential Computing aims to solve.
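To see why data in use is the weak link, consider this toy sketch (the XOR cipher is purely illustrative, not real cryptography): data can sit encrypted on disk and travel encrypted over the network, but before the CPU can compute on it, it must be decrypted into plain memory — exactly the exposure that Confidential Computing's hardware-isolated enclaves aim to remove.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte with a repeating key.
    # Applying it twice with the same key recovers the plaintext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
record = b"salary=120000"

at_rest = xor_cipher(record, key)     # encrypted on disk
in_transit = at_rest                  # still ciphertext over the network

# To actually compute on the record, we must first decrypt it,
# leaving plaintext in ordinary memory:
in_use = xor_cipher(in_transit, key)
salary = int(in_use.split(b"=")[1])   # computation happens on plaintext
```

In a TEE such as an SGX enclave, that final decryption and computation would happen inside hardware-encrypted memory, isolated from the host OS and other software.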
Mortensen said, “when it comes to data security, the actual computing part of the data life-cycle is arguably the weakest link and with the recent increased focus on privacy among the public, this ‘weakness’ could possibly be a stumbling block in the further development and uptake of cloud computing.” Mortensen added, “doing it as part of an open-source initiative makes sense, not only because the open-source approach is a good development and collaboration environment – but, crucially, it also gives the initiative an air of openness and trustworthiness.”
To drive the initiative, members have planned to make a series of open source project contributions to the Confidential Computing Consortium.
Intel will contribute Software Guard Extensions (SGX) SDK to the project. This is hardware-based memory level data protection, where data and operations are encrypted in the memory and isolated from the rest of the system resources and software components.
Microsoft will be sharing the Open Enclave SDK to develop broader industry collaboration and ensure a truly open development approach. The open-source framework allows developers to build Trusted Execution Environment (TEE) applications.
Red Hat will contribute Enarx to the consortium – an open-source project aimed at reducing the number of layers (application, kernel, container engine, bootloader) in a running workload environment.
Mortensen said, “like other open-source initiatives, this would allow businesses to contribute and further develop Confidential Computing. This, in turn, can ensure further uptake and the development of new use cases for the technology.”
How Data Security Efforts Will Shape Up
Confidential Computing is part of a bigger approach to privacy and data protection, and we may see related developments around AI, distributed computing, and Big Data analysis.
“Initiatives that can be seen as part of the same bucket include Google’s ‘Federated Learning’, where AI learning on data is distributed privately. This allows Google to apply AI to data on the users’ devices without Google actually seeing the data. The data remains on the user’s device and all that is sent back to Google is the input or learning that the data has provided,” said Mortensen.
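The idea Mortensen describes can be sketched in a few lines. This is a minimal illustration of federated averaging, not Google's actual protocol: each device fits a tiny one-parameter "model" (here, estimating the mean of its local data) and only the resulting weight leaves the device; the server averages the weights. All names and the model are assumptions for illustration.

```python
def local_update(weight: float, data: list[float], lr: float = 0.1) -> float:
    # One local training pass: a gradient step toward each data point,
    # i.e. minimising 0.5 * (weight - x)^2 per sample.
    for x in data:
        weight -= lr * (weight - x)
    return weight

def federated_average(global_w: float, device_data: list[list[float]]) -> float:
    # Each device trains locally; only the trained weights (not the raw
    # data) are sent back and averaged by the server.
    local_weights = [local_update(global_w, d) for d in device_data]
    return sum(local_weights) / len(local_weights)

devices = [[1.0, 1.2], [0.8, 1.1], [1.3, 0.9]]   # raw data stays on-device
w = 0.0
for _ in range(50):                               # 50 federated rounds
    w = federated_average(w, devices)
# w converges near the mean of all device data (~1.05),
# yet the server never saw a single raw data point.
```

The key privacy property is visible in the code: `federated_average` only ever receives weights, never the contents of `devices`.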
For now, Confidential Computing promises to ease data security concerns, and the consortium expects its work to give users greater control over, and transparency into, their data.
Let us know your opinion on the Confidential Computing Consortium in the comments.
In the course of my facilitation, I realised that the key focus of the discussions was on how organisations today handle their biggest asset – data – and manage all the moving parts in large and diverse settings and in very traditional enterprises that are transitioning into data-driven “New Age” businesses.
The Impossible Tech Triangle
Data has never been so prolific or strategic as it is today. Multiple sources of data generation and technologically savvy customers have seen a data explosion. Simultaneously, personalised services and data insights have seen a drive toward using the data for increased intelligence in most industries. The biggest risk organisations face and what CSOs and business leaders spend sleepless nights on, is significant disruption due to compromise. Moreover, the complexity of the technology platforms means that no organisation is 100% certain that they have it right and that they are managing the risk effectively.
Does this give rise to a similar situation as in Marketing and Advertising where the triangle of price, quality and speed appear to be unattainable by many organisations?
Key takeaways from the sessions
#1 Drivers & Challenges of Cloud adoption
The fact that discussions around agility and innovation can happen with the intensity they do today is because organisations have embraced Cloud infrastructure and application development platforms and SaaS solutions. Every organisation’s Cloud journey is unique, driven by its discrete set of requirements. Organisations may not have the resources to build in-house systems – or may choose to migrate to the Cloud for reasons such as cost, productivity, cross-border collaboration or compliance.
When embarking on a Cloud journey it is important to have a clear roadmap that involves instilling a Cloud-First culture and training the IT organisation in the right skills for the environment. Concerns around costs, security, and data ownership are still synonymous with Cloud; organisations should therefore evaluate each workload from a cost angle before moving it to the Cloud. It is important for organisations to appreciate when a Cloud option will not work out financially and to have the right cost considerations in place, because organisations that do a straight resource swapover to the Cloud are likely to end up paying more.
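The "straight swapover" point can be illustrated with back-of-envelope arithmetic. All rates and hours below are hypothetical, not real cloud pricing: a like-for-like, always-on cloud instance can cost more than the amortised on-premise server it replaces, while right-sizing the same workload to business hours can cost less.

```python
HOURS_PER_MONTH = 730          # average hours in a month

# Assumed figures, purely for illustration:
on_prem_monthly = 400.0        # amortised hardware + power + ops per month
cloud_rate_per_hour = 0.70     # equivalently sized cloud instance

# Straight swapover: run the cloud instance 24x7, like the old server.
swapover = cloud_rate_per_hour * HOURS_PER_MONTH          # ~511/month

# Right-sized: run only 12 hours/day, 22 business days/month.
right_sized = cloud_rate_per_hour * 12 * 22               # ~185/month

print(f"on-prem:     {on_prem_monthly:.0f}/month")
print(f"swapover:    {swapover:.0f}/month")
print(f"right-sized: {right_sized:.0f}/month")
```

The same instance rate lands on either side of the on-premise cost depending on how the workload is shaped, which is why evaluating each workload before migrating matters.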
Data ownership and data residency can also be challenging, especially from a compliance standpoint. For some, the biggest challenge is to know the status of their data residency. The challenges are not just around legacy systems but also in terms of defining a data strategy that can deliver the desired outcomes and managing risk effectively without ruining the opportunities and rewards that data utilisation can bring. Cloud transformation projects bring in data from multiple and disparate sources. A clear data strategy should manage the data through its entire lifecycle and consider aspects such as how the data is captured, stored, shared, and governed.
#2 Perception on Public Cloud Security
While security remains a key concern when it comes to Cloud adoption, Cloud is often regarded as a more secure option than on-premise. Cloud providers have dedicated security focus, constantly upgrade their security capabilities in response to newer threats and evolve their partner ecosystem. There is also better traceability with Cloud as every virtual activity can be tracked, monitored, and is loggable.
However, the Cloud is only as secure as an organisation makes it. The Cloud infrastructure may be secure, but the responsibility of securing applications lies with the organisation. The perception that there is no need to supplement public Cloud security features can have disastrous outcomes. It is important to supplement the Cloud provider’s security with event-driven security measures within an organisation’s applications and cloud infrastructure. As developers increasingly leverage APIs, security should be emphasised alongside functionality and agility. Organisations should be aware that security is a shared responsibility between the Cloud provider and the organisation.
#3 Viewing Security as a Business Risk – not IT Risk
The Executive Management and the Board may be involved in the Security strategy and GRC policies of an organisation. But a consistent challenge Security teams face is convincing the Board and Senior Management of the need for ongoing focus and investments in cybersecurity measures. Often, these investments are isolated from the organisation’s KPIs and are harder to quantify. But Security breaches do have financial and reputational impact on organisations. Mature organisations are beginning to view Security as a business risk requirement and not a matter of IT risk alone. One of the reasons why Senior Management and Boards do not understand the full implications of data breaches is that CISOs do not translate those implications into business terms. It is their responsibility to find ways to secure senior management buy-in, so that Security becomes part of the Strategy and the associated costs get written into the cost of doing business.
Training sessions that educate the stakeholders on the basics of the risks associated with using knowledge systems can help. Simulation of actual cybersecurity events and scenario testing can bring home the operational issues around recovery, assessment and containment and it is important to involve senior stakeholders in these exercises. However, eventually the role of the CSO will evolve. It will become a business role and traverse Security across the entire organisation – physical as well as cybersecurity. This is when organisations will truly recognise investment in Security as a business requirement.
#4 Moving away from Compliance-driven Security Practices
Several organisations look at Security as part of their compliance exercise, and compliance is built into their organisational risk management programmes. Often, security practices are portrayed as a product differentiator and used as a marketing tool. An organisation’s Security strategy should be more robust than that and should not only be focused on ticking the right compliance boxes.
A focus on compliance often means that Security teams continually create policies and call out non-compliance rather than proactively contribute to a secure environment. Applications teams do not always have the right skills to manage Security. The focus of the Security team should not be on telling Applications teams what they are doing wrong and writing copious policies, procedures and standards, expecting others to execute on the recommendations. There should be a focus on automated policy-driven remediation that does not restrict the Applications team per se but addresses unsafe practices when they are detected. The Security team’s role is to work on the implementation and set up Security practices that help the Applications team do what they do best.
#5 Formulating the Right Incident Response Policy
In the Ecosystm Cybersecurity study, 73% of global organisations think that a data breach is inevitable – so organisations largely believe that “it is not about if, but when”. About 50% of global organisations have a cyber insurance policy or are evaluating one, and this trend will only rise. Policy-driven incident response measures are an absolute requirement in all enterprises today. However, to a large extent even these incident response policies are compliance driven. 65% of organisations appear to be satisfied with their current breach handling processes, but it is important to keep evolving these processes in the face of new threats.
Organisations should also be aware of the need for people management during an incident. Policies might be clear and adhered to, but it is substantially harder to train the stakeholders involved on how they will handle the breach emotionally. It extends to how an organisation manages their welfare both during an incident and long after the incident response has been closed off.
Over the two sessions, we explored how to achieve the ‘unattainable triangle’ of Cloud, Agility and Security. What I found interesting – yet unsurprising – is that discussions were heavily focussed on the role of Security. On the one hand, there is the challenge of the current threat landscape. On the other hand, Security teams are required to deliver a Cloud and an agile development strategy in tandem. This disconnect ultimately highlights the need for Security and data management to be embedded and managed from the very start, and not as an afterthought.
The Personal Data Protection Commission (PDPC), which oversees the country’s Personal Data Protection Act (PDPA), has developed a new framework to better support organisations in the hiring and training requirements of Data Protection Officers.
Why the Need?
The PDPA has been in force for some time, and the new framework is being introduced to sharpen the focus of both government and organisations on data privacy. According to Ecosystm’s expert on GDPR and Data Privacy, Claus Mortensen, “the initiative reflects the difference between ‘theory’ and ‘practice’ when it comes to data security. The PDPC is not making changes to the present regulatory framework, but they are putting together a program and guidelines for how companies can apply and abide by the present regulatory framework.”
To support secure cross-border data flows, the Infocomm Media Development Authority (IMDA) has been appointed as Singapore’s Accountability Agent (AA) for the Asia-Pacific Economic Cooperation (APEC) Cross Border Privacy Rules (CBPR) and Privacy Recognition for Processors (PRP) Systems certifications. IMDA will enable Singaporean organisations to be certified under the APEC CBPR and PRP Systems for accountable data transfers.
“The PDPA requires a data protection officer to be appointed in every organisation. This framework is focused on educating and certifying these officers. However, it mostly makes it easier for slightly larger companies who can afford to send employees on longer training programs or who are able to hire people, who have taken the certificates. Smaller organisations – such as start-ups – would benefit more from detailed guidelines and from on-premises guidance. Establishing a framework for such services could be the next area of focus for the PDPC.” said Mortensen.
Legislation for Data Handling and Exchange Practices
Private data has become both a prime target for hackers and an international commodity. Attackers constantly mine cyberspace for leaks or financial information that they can exploit to their advantage.
“Managing sensitive data is notoriously complicated – especially for ‘legacy’ companies that still have or rely on non-digitised data. Even when all data is digital, employees may have copies on their PCs, they may have partial backups on removable media, some data may need to be shared with sub-contractors, moved around between cloud providers, etc. This can make it very difficult to map out PDPA relevant data in the organisation. Even when the data has been mapped, it can be difficult to ensure that all business and data processes are compliant. This is where on-premises guidance can make a difference,” said Mortensen. “While the government clearly aims the new framework initiatives at helping SMEs, it will also help further protect consumers’ sensitive data.”
Importance of Cyber Security
A business harnessing digital technology can’t afford to gamble with sensitive data amid rising cyber-attacks. The government is taking the initiative by forming guidelines and regulations to prevent cyber-attacks, but it is the responsibility of businesses to have a cybersecurity strategy in place to prevent a breach. If a business becomes a victim of hacking, it is perceived as a failure of the company.
The government passed the PDPA and mandates compliance to ensure that businesses understand the importance of cybersecurity. Therefore, every business, organisation or academic institution must ensure compliance with the data protection act and must follow security best practices to safeguard the data of its customers.
The biggest fines issued by the ICO in the UK were to Facebook and Equifax – both of whom were fined GBP 500,000.
Facebook’s fine was for the notorious Cambridge Analytica data scandal, where the information of 87 million Facebook users was shared with the political consultancy through a quiz app that collected data from participants as well as their friends without their consent.
Equifax Ltd.’s fine was for an incident more similar to the British Airways case: in May 2017, hackers stole personal data including names, dates of birth, addresses, passwords, driving licences and financial details of 15 million UK customers. In its ruling, the ICO said that Equifax had failed to take appropriate steps to ensure the protection of this sensitive data despite warnings from the US government.
But these fines were all from before the General Data Protection Regulation (GDPR) came into effect. Now, under the new rules, fines can be as high as EUR 10,000,000 or 2% of total global annual turnover for the previous year (whichever is higher) for lesser data breach incidents. For significant data breaches and non-compliance, the fines can be double that: EUR 20,000,000 or 4% of total global annual turnover (whichever is higher).
British Airways’ GBP 183 million fine is the equivalent of 1.5% of its turnover in 2017. Had the ICO gone for the maximum limit, the fine could have been as much as GBP 489 million.
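The article's figures can be checked with simple arithmetic. If GBP 183 million is 1.5% of BA's 2017 turnover, the implied turnover and the 4%-of-turnover GDPR maximum follow directly (the small gap to the quoted GBP 489 million comes from rounding of the announced fine and turnover figures):

```python
fine = 183_000_000     # GBP, the fine as announced
fine_pct = 0.015       # stated as 1.5% of 2017 turnover

# Back out the turnover the fine implies, then apply the 4% GDPR cap
# for the most serious breaches.
implied_turnover = fine / fine_pct      # GBP 12.2 billion
max_fine = implied_turnover * 0.04      # ~GBP 488 million

print(f"implied 2017 turnover: GBP {implied_turnover / 1e9:.1f}bn")
print(f"maximum possible fine: GBP {max_fine / 1e6:.0f}m")
```

Under the pre-GDPR regime the same breach would have been capped at GBP 500,000, so the new cap raised BA's maximum exposure by nearly three orders of magnitude.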
A lot can still happen before the fine is finally issued, and BA is likely to dispute the decision in court (Willie Walsh, the CEO of BA’s parent company, IAG, has said they will). But even if the fine ends up being significantly lower, there are obvious lessons to be learned from this case:
“People’s personal data is just that—personal.” These were the words spoken by Elizabeth Denham, the ICO Information Commissioner, in response to media enquiries on the fine. In other words: companies will need to take data privacy extremely seriously from now on or expect very hefty fines.
Attitude matters. British Airways chairman and chief executive, Alex Cruz, said in a statement that BA was “…surprised and disappointed in this initial finding from the ICO. British Airways responded quickly to a criminal act to steal customers’ data. We have found no evidence of fraud/fraudulent activity on accounts linked to the theft. We apologize to our customers for any inconvenience this event caused.”
Although we can’t know this for certain, the response may reflect what could be described as an “attitude problem” in how BA has been dealing with the ICO: a whiff of arrogance, blaming the breach on criminal hackers and failing to accept any blame or real responsibility for the incident.
We know from other GDPR cases in other countries that any failure to cooperate with the authorities may result in larger fines. Full transparency, full cooperation and accepting responsibility are the way to go. If it’s your data, then the buck stops with you.
The risks associated with IT cutbacks just went through the roof. The operating losses following the financial crisis of 2008 made the carrier slash its IT budgets (as well as other “expenditures”). Airlines, in general, are notorious for under-spending on IT, and when that is combined with further cutbacks on IT expenditure, disaster may ensue. BA’s recent IT-related woes may or may not be a direct result of under-spending on IT, but in the court of public opinion, this connection has been made.
In any case, with the new fine regime under the GDPR, the risks associated with under-spending on IT – and on IT security in particular – have now gotten substantially bigger.
More than ever, the notion that IT is an expenditure that can be cut back on is a false economy.