
Cloud Computing - History and the Future

Karol Przystalski

21/07/2021 | 10 min read

More and more organizations are moving their workloads to the cloud to reap the benefits of its scalability and flexibility. 

Flexera’s State of the Cloud 2021 report showed that 92% of enterprises have a multi-cloud strategy - meaning that they use more than one public cloud platform at the same time - and 82% have a hybrid cloud strategy, combining public cloud services with local data centers. 

Cloud spending is increasing as well - today, 36% of organizations spend more than $12 million per year on public cloud, and 90% of them expect their cloud usage to grow further because of the COVID-19 pandemic.

But how has cloud computing become such a critical resource? Why did it become so popular among startups and enterprises alike? What’s the future of the cloud? 

To answer these questions, let’s take a look at the past. In this article, we trace the evolution of the cloud - from its beginnings to the present times and beyond - and analyze the key concepts that led to disrupting the industry.


Here's an outline of what you will find in the article:

  1. When did cloud computing start?
  2. Private cloud, cloud security, and the future of cloud computing
  3. Cloud history - what does it mean for business?

 

When did cloud computing start?

Let’s start with the basics. What exactly is cloud computing all about?

Cloud computing offers on-demand access to computing resources via the internet. The resources in question can be servers (both physical and virtual), data storage, development tools, applications, networking capabilities, and more. They’re all hosted at a data center owned and managed by a cloud services provider like Amazon Web Services, Microsoft Azure, or Google Cloud Platform, which makes the resources available for a monthly subscription fee or bills for them according to usage.

Why is cloud computing so disruptive relative to traditional on-premises IT?

  • It reduces the costs of the IT infrastructure - purchasing, installing, configuring, and managing an on-premises infrastructure is an expensive endeavor. The public cloud offloads most or all of this effort, making it a far more cost-effective option.
  • It enhances your agility and time-to-value - thanks to the cloud, teams can start using enterprise applications in minutes instead of waiting weeks or even months for the IT department to respond to their requests.
  • It’s easier to scale - instead of buying excess capacity that ends up sitting unused during periods of low usage, you can easily scale capacity up and down in response to spikes in traffic.

A brief history of cloud computing

Milestone 1: Time-sharing on mainframes

In its essence, the cloud is someone else’s computer that we can use remotely. The origins of this idea date back to the 1950s when the first concepts of time-sharing emerged in the IT field.

Back then, computers were very expensive, and buying one for each individual user in a company wasn’t feasible. Instead, several people would connect to a single shared machine and use it at the same time through so-called dumb terminals, so that no processor cycles went to waste. This idea was pioneered by John Backus, who described it during the 1954 summer session at MIT. Bob Bemer developed it further in a 1957 article in Automatic Control Magazine, and W. F. Bauer took it up in a scientific paper published in 1958.

The first implementation took shape at MIT as CTSS (the Compatible Time-Sharing System), following John McCarthy’s 1959 proposal. In 1961, Donald Bitzer demonstrated something similar - the PLATO II system.

But the first commercially successful product was the Dartmouth Time-Sharing System, released a few years later, in 1964. Computing power started to be treated as a commodity, and computer bureaus emerged where customers could purchase processing power. This model worked really well until the 1980s, when cheap personal computers entered the scene.

Milestone 2: Emergence of a global network, aka the internet

An essential ingredient of the cloud’s success is connectivity. The idea is that users can access cloud-based resources from any location in the world.

The first mainframes used by several people at the same time were usually located in the same building. Local networks were in operation by the end of the 1950s. But in 1960, a new idea emerged - J.C.R. Licklider proposed a global network that would connect all of the existing computing centers. In 1962, ARPA (today’s DARPA) hired him to direct its newly created Information Processing Techniques Office. The idea was to connect the United States Department of Defense mainframes located at the Pentagon, the Cheyenne Mountain complex, and Strategic Air Command.

The ARPANET project, based on packet switching, began in 1966 and became operational in 1969. It grew from 4 nodes to over 200 by 1981, forming the core of the network that developed into the internet during the 1990s. 

When the internet became widespread, companies started building web applications like crazy. This, in turn, created an increased demand for servers and data centers for hosting them. So, the idea of selling computer power as a commodity reemerged.

Milestone 3: The rise of Virtual Machines (VMs)

Virtualization is another key aspect of the cloud: it provides users with virtual computers that are independent of the underlying hardware. Moving between on-premises and cloud - as well as between different cloud platforms - should be easy. 

Virtualization is nothing new under the sun. The idea of a virtual machine - simulated hardware complete enough for a guest operating system to run in isolation - was introduced in 1966 with the IBM CP-40 and CP-67 operating systems.

IBM released the first hardware-assisted virtualization - where the machine itself provides architectural support for running virtual machines - in 1972 with the IBM System/370. Intel added the modern x86 virtualization features we know so well in 2005 (VT-x), and AMD followed in 2006 (AMD-V). 

Virtualization at the level of the operating system paved the way for containers like Docker, which emerged in 2013 and opened the door to microservices.
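
To make this concrete, here’s a minimal sketch of OS-level virtualization in practice, using the Docker SDK for Python (the docker package). It’s an illustration rather than a definitive recipe: the image name is just an example, and a local Docker daemon is assumed to be running.

    import docker

    # Connect to the local Docker daemon (assumed to be installed and running).
    client = docker.from_env()

    # Run a throwaway container: an isolated process sharing the host's kernel,
    # not a full virtual machine with emulated hardware.
    output = client.containers.run(
        "alpine:3.19",  # example image - any small image would do
        ["echo", "hello from an isolated container"],
        remove=True,    # delete the container once it exits
    )
    print(output.decode())

Unlike a hypervisor-based virtual machine, such a container starts in milliseconds because no guest operating system has to boot - which is exactly what made microservices practical.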

Milestone 4: The early cloud

The term “cloud” was already in use in the 1990s - in network diagrams, a cloud symbol stood for the internet or any external network. The rise of web applications that let users perform tasks inside web browsers created the need to differentiate these apps from desktop apps that users had to install on their computers.

This is when the term Software as a Service (SaaS) emerged - an approach that Salesforce encapsulated perfectly in the late 1990s. When web development boomed, the industry was looking for a simpler way of hosting new applications. This led to Platform as a Service (PaaS), with the precursor Zimki launched in 2006. In 2008, Google released its App Engine - a service that would later become part of Google Cloud Platform.

Milestone 5: The rise of the cloud

As internet-based companies grew, they needed more computing power to handle traffic peaks like Black Friday sales. This was when the idea of renting computing power in a flexible way was born, leading to Infrastructure as a Service (IaaS).

Amazon Web Services pioneered Infrastructure as a Service, which lies at the core of the modern cloud. AWS started in 2004 with SQS queues but made its mark on the industry in 2006 with the release of Elastic Compute Cloud (EC2). The service allows users to rent virtual machines billed according to usage instead of investing in classic servers and building a traditional on-premises data center. Microsoft followed with Windows Azure in 2010 (adding Azure Virtual Machines in 2012), and Google with Google Compute Engine in 2012.
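
As an illustration, here’s roughly what “renting a virtual machine” looks like today with the AWS SDK for Python (boto3). This is a sketch, not production code: the AMI ID is a placeholder, and valid AWS credentials are assumed to be configured locally.

    import boto3

    # Connect to the EC2 service in a chosen region.
    ec2 = boto3.resource("ec2", region_name="us-east-1")

    # Launch one small virtual machine. There is no hardware to buy and no
    # data center to build - billing stops once the instance is terminated.
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    print("Launched instance:", instances[0].id)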

Today, these three companies are considered the big three cloud providers, with Oracle, IBM, and Alibaba trailing behind. Modern cloud platforms use virtual machines to offer compute capacity, but the cloud is about much more: networks, container engines, various forms of storage, applications, and beyond.

Private cloud, cloud security, and the future of cloud computing

Cloud security

Security is the primary obstacle to embracing the cloud, especially public cloud services. However, cloud service providers are investing heavily in security solutions that can outperform on-premises security.

According to McAfee, 52% of companies enjoy better security in the cloud than on-premises. Gartner predicted that by 2020, Infrastructure as a Service (IaaS) cloud workloads would experience 60% fewer security incidents than workloads run in traditional data centers.

Still, keeping the cloud secure requires companies to implement new procedures and develop new employee skill sets, such as:

  • Shared responsibility for infrastructure security with the cloud provider
  • Data encryption at rest, in transit, and in use (see the sketch below)
  • Smart user identity and access management
  • Thorough security and compliance monitoring
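
To illustrate the second point, here’s a minimal sketch of encryption at rest, assuming AWS S3 and boto3; the bucket name and object key are placeholders. Encryption in transit is covered here too, since boto3 talks to S3 over HTTPS by default.

    import boto3

    s3 = boto3.client("s3")

    # Ask S3 to encrypt the object server-side before writing it to disk,
    # so the data is protected at rest.
    s3.put_object(
        Bucket="example-bucket",         # placeholder bucket name
        Key="reports/q1.csv",            # placeholder object key
        Body=b"sensitive,data\n",
        ServerSideEncryption="aws:kms",  # or "AES256" for S3-managed keys
    )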

Private clouds

Organizations that need to comply with strict regulations - for example, in the financial services industry - often turn to private cloud systems that are designed to meet these requirements. 

Private cloud platforms are used mainly for storing and sharing sensitive data, but also for services like email. A private cloud offers the advantages of the public cloud, like scalability and elasticity, while letting the organization control security and address privacy concerns, much as in an on-premises environment.

A private cloud is usually hosted on-premises in the company’s data center but can also be hosted on a cloud provider’s infrastructure or developed on rented infrastructure located in an offsite data center.

The future of cloud computing

Here are the most interesting trends that will be shaping the industry’s direction in the years to come. 

1. Edge computing

Edge computing technologies and the Internet of Things (IoT) are going to power one key future development of the cloud. By some estimates, there will be more than 75 billion IoT-connected devices around the world by 2025. 

Edge computing brings decentralization to the table by allowing companies to store and process data near the source of data collection - at the “edge” of the network, close to the device. The idea is to locate the application closer to the data source or the end user. This is how applications can work in real time with near-zero latency.

2. Quantum computing

Processing massive data volumes fast is still a challenge. Quantum computing promises to solve this problem by handling complex calculations on large data sets in a matter of minutes. Another potential use is encrypting communication and enhancing cybersecurity.

Quantum computers promise to speed up the development of innovative medical solutions and machine learning techniques for more accurate diagnoses or financial strategies. The idea is to save time and cost through this type of optimization.

3. Cloud automation 

Cloud automation solutions help teams eliminate manual tasks and human errors, and optimize processes related to the provisioning, management, and monitoring of cloud-based environments.

Companies looking to optimize their cloud resource usage will turn to AI-based tools that make quick decisions about resource provisioning and work towards cost-optimizing the infrastructure.
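
As a simple, rule-based example of such automation, here’s a sketch of a target-tracking scaling policy in AWS, set up with boto3. The Auto Scaling group name is a placeholder, and the group is assumed to already exist.

    import boto3

    autoscaling = boto3.client("autoscaling")

    # Target tracking: AWS adds or removes instances automatically to keep
    # the group's average CPU utilization near 50% - no human in the loop.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="web-asg",  # placeholder group name
        PolicyName="keep-cpu-near-50",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization",
            },
            "TargetValue": 50.0,
        },
    )

AI-based optimization tools take this further by predicting demand instead of merely reacting to a metric threshold.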

4. The Internet of Everything (IoE)

The Internet of Things (IoT) is a strong trend, but the Internet of Everything is going to take it to the next level. The idea here is to connect all the devices that were previously unconnected. 

The Internet of Everything aims to connect not just physical devices and objects but also people, data, and processes. The technology is set to revolutionize the public sector by improving labor productivity and cost efficiency.

Cloud history - what does it mean for business?

Today’s cloud offerings are the result of years of development in research areas that led to the birth of some pretty sophisticated projects. It’s hard to believe that the origins of modern cloud solutions can be traced back to the 1950s!

The general trend in the IT industry is to automate as many aspects of system development and maintenance as possible, slashing the amount of required code and the opportunities for human error. This allows developers to focus their effort on mission-critical tasks. The cloud is now at the forefront of this trend and is likely to cause some more disruption soon.

Are you using public cloud services or planning to migrate? We have many years of experience in this area and are happy to provide you with expert advice to help you make the most of the cloud for your business.

Karol Przystalski

CTO at Codete. In 2015, he received his Ph.D. from the Institute of Fundamental Technological Research of the Polish Academy of Sciences. His area of expertise is artificial intelligence.
