Not long ago, I was planning a trip to San Francisco to take the biometrics test for my U.K. visa application. I had all my materials prepared, and I was ready to hop on the train. But then I ran into a big problem: according to the rules of the U.S. Citizenship and Immigration Services, I could not bring my phone with me. OK, but wait, how was I going to navigate the city without Uber? And how was I going to check into my Airbnb with ease? It turned out I could. I could simply run around hoping to spot a taxi, and book a hotel a few weeks in advance, just as we did 10 to 20 years ago. But that is certainly not how most of us with a smartphone prefer to do things nowadays.
In less than 20 years since the turn of the century, the world has witnessed exponential growth in the cloud computing industry and the dramatic impact it has had on how we live our lives and run our businesses. Cloud computing is the practice of storing and accessing data and software over the Internet instead of on your computer's hard drive. In short, the cloud is a data center (many server computers) somewhere else, and you use it over an Internet connection. While most of us assume that the cloud computing industry was born alongside Generation Z, the earliest cloud concepts date back to the 1960s.
With the development of mainframe computers, companies began to explore options other than buying and maintaining computers for each employee, and one economical approach was shared access to a mainframe. Around the same time, J.C.R. Licklider, the director for the development of ARPANET, and computer scientist John McCarthy both introduced the idea of computation delivered as a public utility, a vision very similar to the cloud computing we have today.
In the 1970s, the concept of virtual machines (VMs) was created. Virtualization software, later popularized by products such as VMware, made it possible to run multiple operating systems simultaneously on a single physical machine, taking shared mainframe access to a new level.
Later, in the early 1990s, telecommunication companies started to provide shared data connections to users over the same physical infrastructure. Yet software was still packaged and sold to customers at a high price, including installation, customization, and maintenance charges. Everything changed after 1999, when Salesforce.com launched the first Software-as-a-Service (SaaS) application delivered over the Internet. Rather than buying the software as a package, users only needed to pay for the right to use it online.
The 2000s was the decade when the cloud computing industry really took off. In 2002, Amazon.com introduced Amazon Web Services, a series of cloud-based Infrastructure-as-a-Service (IaaS) offerings for data storage, computation, and even human intelligence. In 2006, Amazon Elastic Compute Cloud (EC2) came onto the market, allowing companies to rent computers on which to run their own applications. Google also launched its Google Docs service, bringing the power of cloud computing and document sharing directly to end users. In 2008, Google launched Google App Engine, one of the first Platform-as-a-Service offerings, allowing software developers to build apps on the platform without worrying about system management or platform maintenance.
The 2010s welcomed the blossoming of the entire cloud computing industry. With a market size of $67 billion in 2015 and about $82 billion in 2016, the cloud computing market was projected to reach $99 billion by 2017 and $162 billion by 2020, a compound annual growth rate (CAGR) of about 19%. Compared with a 3% CAGR for overall IT spending, cloud computing has been dubbed one of the fastest-growing IT segments for the next five to ten years.
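The 19% growth figure can be sanity-checked from the 2015 and 2020 numbers above. A quick sketch in Python, using the standard CAGR formula (the dollar figures are the ones quoted in this article):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# $67B in 2015 growing to a projected $162B in 2020: five years of growth.
rate = cagr(67, 162, 5)
print(f"Implied CAGR: {rate:.1%}")  # roughly 19%, matching the figure cited
```

Running this yields an implied rate just over 19%, so the projections quoted are internally consistent.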