The UK Cabinet Office's recent publication of the Phase 2 reports on the government's G-Cloud Programme, drawing on the work of some seven working groups, begins to chart a clearer path for the future procurement of ICT services that support the delivery of services to the public.
The reports reconfirm the programme's Vision and build on earlier work, including Digital Britain, the Data Handling Review and Smarter Government. They spell out a roadmap for defining the benefits, and set out the processes for migrating from current datacentre and application architectures, and from third-party services, to more agile and efficient ways of sourcing and delivering ICT services, while at the same time providing assurance that the information concerned is handled correctly.
The business case
The now broadly accepted definition of cloud computing (with reference to work done by NIST in the US, the Cloud Security Alliance, the UK Government and others) describes a range of Infrastructure, Platform and Software services that can be sourced on an as-needed basis, with more flexibility, scalability and choice than traditional in-house or managed-service approaches, and with correspondingly significant cost savings – increasingly a key driver in these cost-constrained times.
The standard NIST definition2 goes on to define five essential characteristics of a cloud service, namely:
On-demand self-service: A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
Broad network access: Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
Resource pooling: The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or data centre). Examples of resources include storage, processing, memory, network bandwidth, and virtual machines.
Rapid elasticity: Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
Measured Service: Cloud systems automatically control and optimise resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported providing transparency for both the provider and consumer of the utilised service.
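Several of these characteristics can be seen working together in a toy model. The sketch below is purely illustrative, not any real provider's API: the `CloudPool` class and its methods are hypothetical names, used here only to show how on-demand self-service, resource pooling, rapid elasticity and measured service interact.

```python
import time


class CloudPool:
    """Toy model of a pooled, metered, self-service cloud resource."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocations = {}  # tenant -> GB currently provisioned (resource pooling)
        self.usage_log = []    # (tenant, gb_delta, timestamp) metering records

    def provision(self, tenant, gb):
        """On-demand self-service: a tenant allocates capacity with no operator involved."""
        used = sum(self.allocations.values())
        if used + gb > self.capacity_gb:
            raise RuntimeError("pool exhausted")
        self.allocations[tenant] = self.allocations.get(tenant, 0) + gb
        self.usage_log.append((tenant, gb, time.time()))  # measured service
        return self.allocations[tenant]

    def release(self, tenant, gb):
        """Rapid elasticity: scaling in is as quick and routine as scaling out."""
        self.allocations[tenant] = max(0, self.allocations.get(tenant, 0) - gb)
        self.usage_log.append((tenant, -gb, time.time()))

    def bill(self, tenant):
        """Measured service: usage is transparent to both provider and consumer."""
        return sum(gb for t, gb, _ in self.usage_log if t == tenant)
```

A real provider would of course meter many resource types (storage, processing, bandwidth, active accounts) and enforce multi-tenant isolation, but the essential pattern is the same: pooled capacity, self-service allocation, and a per-tenant audit of consumption.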
The Cabinet Office Phase 2 G-Cloud report builds upon these principles in each of the working group reports, which consider Commercial aspects, Service Specification, Technical Architecture, Service Management, Information Assurance and Implementation Strategy, including transition arrangements.
It is not just the UK Government that is moving ahead with such plans. The US Administration has been conducting work to help guide those in government organisations who wish to consider moving to cloud-based services for a part of their service delivery. This guidance covers standardised definitions for service levels, implementation advice and particularly guidance around Security Assessment and Authorisation for cloud Services – see FedRAMP, below.
The EU has also been progressing similar initiatives. At the World Economic Summit in Davos, January 2011, Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda stated3: “The European Commission has done preliminary work over the last few years, such as funding cloud research or analysing the security implications of cloud computing. For example, our European Network and Information Security Agency (ENISA) has just published a report4 on this.
Now is the time to bring it all together. As foreseen in the Digital Agenda for Europe, I have started work on an EU-wide cloud computing strategy. This goes beyond a policy framework. I want to make Europe not just ‘cloud-friendly’ but ‘cloud-active’.
We can deliver cloud computing by using research and innovation to bring about better clouds. Along the way we can modernise our computing infrastructure and give our SMEs a new platform for innovation.”
Risks and concerns – as well as benefits
This brings us to the oft-cited question of risk in the cloud, still referred to in some surveys as the greatest area of concern, and one which needs to be addressed if those driving the procurement and management of ICT-supported services are going to have the confidence to make change decisions in this space.
In some of the early debate around cloud, there was a perception that information processing responsibilities could not be handed over to Cloud Service Providers (CSPs) unless information security could be guaranteed 100 per cent. And of course this is nonsense – as everything ought to be determined by risk, and risk appetite.
“Everything we do in life carries a risk. What do we do about that?” says Susan Stoker, management consultant at Stoker Watts McLeod. “Risk management, in any situation, is just that – the management of risk, rather than the elimination of all risks. The only way to truly eliminate risks is to end the activity associated with that risk. This is not always possible or practical.”
Clearly, though, the procuring organisation will need to satisfy itself that it is still meeting its legal, regulatory and internal compliance obligations, even when moving information and services over to be processed by third parties and cloud-based services. Much of the work done by the previously mentioned working groups and by broader industry collaboration efforts (see Assurance Frameworks, below) attempts to define baseline controls for cloud services, or to assess the control maturity of cloud service providers, in some cases with reference to certification against existing or new standards – thus enabling a more informed and confident sourcing decision on the part of information owners.
Susan adds: “Good risk management, which is there to support the achievement of objectives, is about taking opportunities as much as ending risks. For it to be effective, however, the level of exposure that is deemed acceptable (risk appetite) needs to be defined. This concept applies as much to the use of cloud computing as it does with any business process.”
John Morrison, managing director of Sapphire, makes a different point: “One of the major concerns for organisations today is security with the cloud. But, it is more of a nebulous concern derived from a lack of control over the environment where the data will reside. The truth is, most of the major cloud providers have much stronger security on both a physical and logical level than most datacentres currently have. But, does this mean that your data is safe and secure? NO. This is where you must identify ways to control access to the Cloud and your valuable business applications and data.”
So this raises a specific requirement: an access control and identity management system that is portable and can transcend organisational boundaries. Many of the control frameworks and definitions give guidance on this point.
“The cloud does not have your authentication infrastructure,” continues Morrison. “It has its own. And, you’re not in control of it, the cloud provider is. That means that your users have another set of passwords and logins to remember for every Cloud application they need to access. When you give your users too many passwords and user names to remember, they create security risks… they write them down.”
So here’s a significant source of risk – you don’t control access to your own data and applications; your cloud provider does. And when your users have too many different passwords, usually complex, non-reusable and expiring on different schedules, they will tend to write them down. Since you don’t have control over the authentication process, you rely on the cloud provider – yet many do not offer multi-factor authentication, which could add an extra layer of protection.
Morrison concludes: “The cloud providers don’t provide centralised access control for your cloud data and applications. Every provider is different. You need to get access to the data and applications back into your control. Then, you can determine who can access what and when. And, you can have a full audit trail over what happened, by whom, when and from where. With today’s audit requirements, this will quickly become a strong requirement for organisations in the future.”
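One common way to keep authentication under the organisation's own control is federated sign-on: the organisation's identity provider issues short-lived signed tokens, and the cloud application merely verifies the signature. The sketch below is a minimal, hypothetical illustration of that pattern using an HMAC-signed token; real deployments would use an established standard such as SAML or OAuth, and the shared key, function names and claim layout here are all assumptions for illustration only.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical key agreed between the organisation and its cloud provider.
SECRET = b"shared-between-org-and-provider"


def issue_token(user, permissions, ttl=3600):
    """Issued by the organisation's own identity provider, not the CSP,
    so the organisation keeps control (and an audit trail) of issuance."""
    claims = {"sub": user, "perms": permissions, "exp": time.time() + ttl}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig


def verify_token(token):
    """Run by the cloud application: it trusts the organisation's signature,
    so the organisation decides who can access what, and when."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        raise PermissionError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(body))
    if claims["exp"] < time.time():
        raise PermissionError("token expired")
    return claims
```

Because every token traces back to the organisation's own issuer, users keep a single credential, and the audit trail of who was granted what lives with the organisation rather than scattered across each provider.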
Brian Honan, CSA UK & Ireland Board member, adds: “Cloud computing can bring many advantages to an organisation in terms of better efficiencies, ability to scale to meet demand and better use of resources. However, as well as offering many advantages, cloud computing also presents a number of risks and challenges to organisations, not least of which is assuring the security of the data within the cloud and complying with specific regulations such as data privacy.”
And so a cornerstone of the efforts to fill some of this confidence gap, and to provide specific assurance relating to cloud-provided services, is to be found in several different initiatives aiming to develop controls guidance or assessment frameworks.
These relate to both public and private sectors, and have origins in audit, security and business requirements on both sides of the Atlantic. There is already some movement toward consolidation and mutual recognition or mapping.
Obviously the standards, specifications and guidance in the UK G-Cloud reports can be seen as one such framework. So too can the Federal Risk and Authorization Management Program (FedRAMP5), published last November by the US Government’s General Services Administration and CIO Council, which offers a clear guidance framework for government organisations considering cloud services and wondering how to address the risk and assurance aspects. FedRAMP does not itself embody a control framework, but refers to the US Government’s NIST SP 800-53 control standard (Recommended Security Controls for Federal Information Systems and Organizations) – in much the same way as the G-Cloud reports make reference to the ISO 27000 control set.
A number of initiatives emanating from industry and professional groups in the private sector are also worthy of mention. CloudAudit is a volunteer cross-industry effort drawing on people with backgrounds in cloud, networking, security, audit, assurance and architecture. Its charter is to provide a common interface that allows cloud providers to automate the Audit, Assertion, Assessment and Assurance (referred to as A6) of their environments, and allows authorised consumers of their services to do likewise, via an open, extensible and secure set of interfaces.
The Cloud Security Alliance6 is a not-for-profit organisation with a mission to promote the use of best practices for providing security assurance within cloud computing, and to provide education on the uses of cloud computing to help secure all other forms of computing. It is led by a broad coalition of industry practitioners, corporations, associations and other key stakeholders, and CloudAudit has recently become a joint project of the CSA with a shared mission. The CSA has published a control framework for the cloud, as well as a personal certification, the Certificate of Cloud Security Knowledge (CCSK), and runs a broad programme of research and communication through a global network of country-specific chapters.
The Common Assurance Maturity Model7 (CAMM) is an initiative similarly driven by an international collective of professional thought leaders and industry representatives. Its aim is to provide a framework that gives the transparency needed to attest the information assurance maturity of a third party (e.g. a cloud provider). This will allow results to be published in an open and transparent manner, without mandating a third-party audit, and allow data processors to demonstrably publicise their attention to information assurance over other suppliers that may not take it as seriously. CAMM is working in partnership with the CSA, with the aim of mutual mapping and recognition, and is also supported by ENISA, the European Network and Information Security Agency. The CAMM team has developed a draft control framework which it is piloting with industry organisations and aims to publish in mid-2011, with certification an objective in the near future.
So there you have it – hopefully, upon reading this article, you will realise that much work is already under way to bring consistency, clarity and substance to the definition of cloud services, their objectives and benefits, implementation guidance and assurance standards – and that it will continue throughout this year, and probably next.
The references at the foot of this article will allow you to keep in touch with these developments as they unfold, and hopefully allow you to take part in better-informed discussion and decision-making in respect of the cloud. It’s the future – it is just a different way of sourcing services and delivering information processing – and it’s here to stay.