By Debra Littlejohn Shinder.

Circles and cycles – it's the way of the world, and everyone from Willie Nelson to Walt Disney has paid homage to it. The computer industry hasn't been around as long as the lion kings or the relationship dilemmas posed in the Red Headed Stranger's album, but the tech sector is now mature enough that it has seen trends come and go – and come again. Even those trends that seem "new and revolutionary" often turn out to be simply a repackaged form of what came before, just as one generation's "short shorts" emulate a previous generation's "hot pants."

Consider the cloud. What a radical and innovative idea it was: instead of saving our data and running our applications from a local computer or a server housed in an on-premises data center, we could take advantage of economies of scale by using lower-powered devices to access resources that actually reside on larger, more powerful systems in a remote location. To those of us who have been around for a while, it sounds a lot like mainframe computing. Of course, there are differences; technology has changed and grown far more sophisticated in the intervening years. However, the basic premise – "timesharing" of compute power on large systems by many different customers – is more or less the same.

Let's take a look at how we got from mainframe to cloud and what that progression might portend for future trends. With any common practice, technological or otherwise, we tend to notice its limitations and disadvantages as time goes on. Centralized computing in the mainframe era had obvious benefits – cost was a major one, since early systems not only carried price tags in the hundreds of thousands of dollars (or more) but also took up entire rooms or buildings, so that the space to house them added to their total cost of ownership (TCO). Personnel who knew how to program, operate, maintain and troubleshoot them were in short supply, as well. For all those reasons, timesharing made sense.

However, it also had its drawbacks. Big businesses (and later small ones) wanted more control over the systems, so minicomputers – "mini-mainframes," in effect – came into being. An individual company could own and control one of these instead of renting time on a system shared among many companies and individuals. Then, as technology advanced, along came the PC revolution. Computers got personal – and a lot cheaper. Bill Gates' dream of a computer on every desktop became a reality. Many businesses provided PCs to dozens or hundreds of employees in the form of standalone systems. Transferring information from one of these computers to another was done via "sneakernet" – copying it to a floppy disk and physically carrying that disk to the second computer, where it was inserted into that machine's drive.

That wasn't very efficient, so networking those PCs was the natural next step. Token Ring, ARCnet, Ethernet – we devised ways to connect all those computers so information could be shared between them over cables. Something else that wasn't very efficient was having everyone store copies of that information on their own machines. Now that the systems were networked, it made more sense to put data on one system that everybody could access: a file server.

That worked so well that we looked for ways to share other resources. As security concerns entered the picture and companies started requiring authentication to log onto computers and access files, it became unwieldy to manage logon credentials for all the different PCs that were sharing things with one another. A centralized security setup overseen by network professionals could be made more secure, so the client-server model morphed into the server-based computing model, as exemplified by NetWare Directory Services, Windows Active Directory domains, UNIX Kerberos realms, and so forth. Centralized security and administration within the confines of the local area network became the name of the game and dominated the IT world for many years.

Then business went mobile. Hardware trends took the miniaturization of computers to a new level. Computers weighing only a couple of pounds now have capabilities that exceed those of the original room-sized mainframes by many orders of magnitude. Wireless networking made mobility even easier. We can access company resources from anywhere – home, a hotel room, an airplane – using our laptops, tablets, or smartphones.

Since we're doing our work on smaller devices that necessarily have less storage space, and since we're switching between multiple devices, storing the resources we need in a centralized location makes life easier for users. Using a public cloud provider to accomplish that takes much of the administrative burden off IT personnel, and it can also increase security, since the major providers can invest far more in securing their datacenters than the typical company can.

And that's where we are today, with many companies having already moved to the cloud and many more contemplating such a move. This client-cloud computing model currently offers cost savings, convenience, and other benefits. However, when we look at it in light of the history of computing, we can see that this trend is mostly part of a cyclic progression rather than a linear one. What does that mean in terms of predicting future trends?

Some believe that cloud computing, like other trends, will run its course and culminate in a second PC revolution (or its equivalent, which may be based not on the discrete, general-purpose computing devices we have today but on an Internet of Things that performs various computing tasks while integrating more fully into our everyday lives). They base this on the likelihood that the cloud's drawbacks – costs that will probably rise as alternatives fade away or grow even more expensive, the inability to do anything when disconnected from the network, speed and reliability that will always lag behind what's possible with local connections, and loss of control over one's own data and applications – will eventually lead at least some savvy users to seek "new" ways of computing that solve those problems.

No one can say with certainty what the future will bring, but a study of both recent and ancient history (are emoji just the modern-day equivalent of hieroglyphics?) might lead one to give some credence to the theory that circles and cycles will keep leading us back into the scenes that we've all seen before.
