Has the Cloud Tipping Point Been Reached?


Cloud computing used to be a fringe concept. Its predecessors - such as time-sharing systems and thin clients - go back decades, but it wasn't until the launches of Amazon Web Services (AWS) in 2006 and Microsoft Azure in 2010 that it really took off. It's now so thoroughly mainstream that a core consumer-facing Apple service - iCloud, which is backed by AWS and Azure assets - has "cloud" right in its name.

On the business side, growing adoption of cloud solutions following the Infrastructure-, Platform- and Software-as-a-Service models has rapidly eaten into the share of IT spend going toward traditional hardware and software.

If current trends hold, cloud will reach what Cringely called a tipping point. But what does that mean for IT professionals and their departments?

How the cloud computing tipping point compares to its predecessors

We can think of corporate IT as having evolved through seven key eras, much as life on Earth has changed over time:


1. Punch cards

From the late 19th to the mid-20th century, what we would now call computing was performed with punch cards and, later, programming languages such as FORTRAN. This type of computing was noninteractive but had advantages in processing large amounts of data stored on cards or tape.


2. Timesharing

Introduced and refined in the 1950s and 1960s, timesharing was a sort of proto-cloud. It allowed multiple users to concurrently access a centralized CPU that had sufficient resources to serve all of them simultaneously.


3. Command-line PCs

PCs were originally driven by command-line interfaces that could efficiently execute many tasks with the right commands, but weren't especially user-friendly. Their evolution eventually initiated a fourth major era, characterized by...

4. ...GUIs

A graphical user interface (GUI) presents an easily navigable layer on top of the operating system. When you click or tap on icons to open apps or perform actions, you're using a GUI. All major operating systems (OSes), including Microsoft Windows and Apple macOS, have GUIs.


5. The World Wide Web

The rise of the World Wide Web in the 1990s did for the internet what GUIs did for PC OSes: made it simpler to use. The web and the internet protocols underpinning it initiated a shift in power away from native on-machine applications and toward websites.


6. Mobile computing

Beginning with the original iPhone in 2007, PC-grade OSes were downsized to run on internet-connected mobile hardware such as smartphones and tablets. These devices are now the primary way many people access the internet, and their raw computing power is often comparable to that of a laptop.



7. Cloud

Cloud has become deeply interwoven with today's OSes, applications and websites, to the extent that many of them are essentially interfaces for accessing cloud resources behind the scenes. Popular applications from Instagram to Office 365 have at one time relied on AWS or Azure.

Cloud turned out to be a "next big thing" that actually panned out, unlike duds such as early virtual reality headsets or 1980s-era artificial intelligence. For IT professionals, cloud expertise will be a central skill in almost any context.

New Horizons Computer Learning Centers offers many cloud-focused courses, including a track in Azure. Learn more by viewing its course listings and finding a location near you today.

Nov 2017

By: Morgan Landry