The Complete Magazine on Open Source

The Pros and Cons of Cloud Computing



Cloud computing is often touted as the future of business and enterprise technology. Like everything else, cloud computing has its own pros and cons. This article discusses the advantages and disadvantages of using the cloud, and gives a brief overview of various open source cloud computing technologies.

Cloud computing is a term used to describe a class of network-based computing that takes place over the Internet. Cloud platforms hide the complexity and details of the underlying infrastructure from users and applications behind a simple interface, typically a Web console or an API.
Clouds are transparent to users and applications, and they can be built in multiple ways. In general, they are built on clusters of PC servers combined with in-house applications and systems software.
Cloud computing lets companies that depend on computing infrastructure avoid owning it; instead, they consume the cloud’s infrastructure on a ‘pay as you use’ basis, saving on both capital and operational expenditure.
Clients can put their data and applications on the cloud instead of on their own desktop PCs or their own servers. Also, they can use the servers within the cloud to do processing, data manipulation, etc.

Advantages of cloud computing

  • Lower computer costs: Since applications run in the cloud, one’s desktop PC does not need the processing power or hard disk space demanded by traditional desktop software.
  • Improved performance: Computers in a cloud computing system boot up and run faster because they have fewer programs and processes loaded into the memory.
  • Reduced software cost: Instead of purchasing expensive software applications, you can get most of what you need free or at a very low cost.
  • Instant software updates: When the application is Web-based, updates happen automatically. That is, when you access a Web-based application, you always get the latest version.
  • Improved document format compatibility: You do not have to worry about the documents you create on your machine being compatible with other users’ applications or operating systems.
  • Unlimited storage capacity: Cloud computing offers virtually limitless storage.
  • Increased data reliability: If your personal computer crashes, all your data is safe out there in the cloud, still accessible.
  • Easier group collaboration: Multiple users can collaborate easily on documents and projects.
  • Device independence: You are no longer tethered to a single computer or network.

In spite of all the advantages of the cloud, stored data might not be secure. Since all your data is stored on the cloud, the important question is: How secure is the cloud? Can unauthorised users gain access to your confidential data?
When an organisation elects to store data or host applications on the public cloud, it loses the ability to have physical access to the servers hosting its information. As a result, sensitive and confidential data is at risk from outsider and insider attacks.

Measures to be taken to improve the security of the cloud

  • Cloud service providers must ensure proper data isolation
    In order to conserve resources, cloud service providers often store more than one client’s data on the same server. As a result, there is a chance that one user’s private data can be viewed by other users (possibly even competitors). To handle such sensitive situations, cloud service providers should ensure proper data isolation and logical storage segregation.
  • Encryption of data
    Enterprises must select a cloud storage provider that supports encryption of data in-flight and data at-rest. Amazon Web Services (AWS), for example, moves data to and from the Simple Storage Service (S3) over an SSL connection and protects data at rest using AES-256 encryption. Enterprises can also choose third-party encryption tools such as Viivo, Sookasa or Cloudfogger.
  • Data centres must be frequently monitored
    According to a recent report, insider attacks are the third biggest threat in cloud computing. Therefore, cloud service providers must ensure that thorough background checks are conducted for employees who have physical access to the servers in the data centre. Additionally, data centres must be frequently monitored for suspicious activity.
  • Use of virtualisation should be reduced
    Virtualisation alters the relationship between the OS and the underlying hardware – be it computing, storage or even networking. The extensive use of virtualisation in cloud infrastructure brings unique security concerns for customers of a public cloud service. Even a breach of the administrator’s workstation that runs the virtualisation management software can bring the whole data centre down, or allow it to be reconfigured to an attacker’s liking.
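The ‘encrypt before it leaves the client’ advice above can be sketched in a few lines of Python. This is a toy illustration only – the XOR stream cipher built from SHA-256 blocks and the fixed salt are stand-ins, and the passphrase and payload are invented names; a real deployment should use AES-256 through a vetted cryptography library or the provider’s own encryption.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Expand key + nonce into a pseudo-random byte stream using SHA-256 blocks.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes) -> bytes:
    # Derive a key from the passphrase (fixed salt is for demo purposes only).
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), b"demo-salt", 100_000)
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    # Prepend the nonce so the holder of the passphrase can decrypt later.
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(passphrase: str, blob: bytes) -> bytes:
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), b"demo-salt", 100_000)
    nonce, ciphertext = blob[:16], blob[16:]
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

# Only the encrypted blob would ever be uploaded to the cloud provider.
blob = encrypt("correct horse", b"confidential records")
assert decrypt("correct horse", blob) == b"confidential records"
```

The point of the sketch is the workflow, not the cipher: the key never leaves the client, so even a provider-side breach exposes only ciphertext.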

Open source cloud computing technologies
A few open source cloud computing technologies are briefly discussed in this section.

OpenStack
OpenStack comprises a set of software tools for building and managing cloud computing platforms for public and private clouds. It is one of the most important open source technologies for enterprises and developers. OpenStack is classed as Infrastructure as a Service (IaaS): it provides the infrastructure, making it easy for users to quickly add new instances upon which other cloud components can run. That infrastructure, in turn, runs a platform upon which developers can create software applications that are delivered to end users. OpenStack serves a continuously increasing number of IT environments as a foundation for public, private and managed infrastructure. Organisations, in particular, have used OpenStack to build their own private clouds; Deutsche Telekom, for instance, uses OpenStack to build cloud platforms such as its Business Marketplace.
If you want to use OpenStack, first try TryStack, which lets you test your applications in a sandbox environment. This will help you understand how OpenStack works and whether it is the right solution for you.

Cloud Foundry
In the growing Platform-as-a-Service (PaaS) market, Cloud Foundry holds a leading position. The project was initiated by Pivotal, a spin-off of EMC/VMware, and is primarily written in Ruby and Go. Applications deployed to Cloud Foundry access external resources via services; in a PaaS environment, all external dependencies such as databases, messaging systems and file systems are services. When an application is pushed to Cloud Foundry, the services it should use can also be specified. Depending on the application language, auto-configuration of services is possible; for example, a Java application requiring a MySQL database picks up the MySQL service on Cloud Foundry if it is the only one defined in the current space.
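The push-with-services workflow described above is usually expressed in an application manifest. The fragment below is a minimal sketch; the application name, JAR path and service instance name are invented for the example.

```yaml
---
applications:
- name: my-java-app            # hypothetical application name
  memory: 512M
  path: target/my-java-app.jar
  services:
  - my-mysql                   # a MySQL service instance created beforehand
```

With the service instance created (cf create-service) and this manifest in place, a plain ‘cf push’ binds the application to the MySQL service, and – as noted above – a Java application can then pick it up automatically if it is the only database service in the space.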

KVM
KVM (Kernel-based Virtual Machine) is the preferred hypervisor of infrastructure solutions like OpenStack or openQRM, and enjoys a good reputation within the open source community. It is a full virtualisation solution for Linux on x86 hardware that contains virtualisation extensions (Intel VT or AMD-V). It consists of a loadable kernel module, kvm.ko, which provides the core virtualisation infrastructure, and a processor-specific module, kvm-intel.ko or kvm-amd.ko. Using KVM, one can run multiple virtual machines with unmodified Linux or Windows images. Each virtual machine has its own private virtualised hardware – a network card, disk, graphics adapter, etc.

Docker
Docker is an open platform for building, shipping and running distributed applications. It gives programmers, development teams and operations engineers the common toolbox they need to take advantage of the distributed and networked nature of modern applications. The container technology, created as a by-product during the development of the dotCloud Platform-as-a-Service, currently enjoys strong momentum and has the support of large players like Google, Amazon Web Services and Microsoft. Docker enables applications bundled in containers to move, loosely coupled, across several Linux servers, thus improving application portability. At first glance, Docker looks like a pure developer tool; from the point of view of an IT decision-maker, however, it is a strategic tool for optimising modern application deployments. Docker helps to ensure the portability of an application, to increase availability and to decrease overall risk.
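The build-ship-run cycle above starts from a Dockerfile. The fragment below is a minimal sketch for packaging a hypothetical Python web application – the file names, port and entry point are assumptions for the example, not part of any particular project.

```dockerfile
# Hypothetical example: package a small Python web app into a portable image.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]
```

Once built (docker build -t my-app .), the same image runs unchanged on any Linux server with Docker installed (docker run -p 8000:8000 my-app), which is exactly the portability benefit described above.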

Apache Mesos
Mesos rose to be a top-level project of the Apache Software Foundation. The Mesos kernel runs on every machine and provides applications (e.g., Hadoop, Spark, Kafka, Elasticsearch) with APIs for resource management and scheduling across entire data centre and cloud environments. It was conceived at the University of California, Berkeley, and helps run applications in isolation from one another, while dynamically distributing them across several nodes within a cluster. Mesos can be used with OpenStack and Docker. Well-known users include Twitter and Airbnb.

Deltacloud
Deltacloud is an open source project started by Red Hat and now incubated at the Apache Software Foundation. Deltacloud abstracts the differences between clouds, mapping a cloud client’s application programming interface (API) onto the APIs of a number of popular clouds. As a result, Deltacloud is a way of enabling and managing a heterogeneous cloud virtualisation infrastructure: any certified virtualised environment can be managed from one common management interface, and different virtual machines can be transferred or migrated in real time from one virtualisation platform to another. If an enterprise is already using IBM Tivoli or HP OpenView, Deltacloud can be integrated with them.

OpenNebula
OpenNebula is an open source toolkit for cloud computing. It allows you to build and manage private clouds with Xen, KVM and VMware ESX, as well as hybrid clouds with Amazon EC2 and other providers through Deltacloud adaptors. The remote public cloud provider could be a commercial cloud service provider such as Amazon, or a partner’s private cloud running a different OpenNebula instance.