When cloud computing is just computing
According to Midsize Insider (an IBM publication), “The term ‘cloud’ has been an IT catchphrase for a short period although the types of services that define this technology have been in use for some time. This kind of computing may already be mainstream and no longer have the need for a catchphrase.”
This is something I’ve been saying for a while: the term “cloud” will eventually bake out of cloud computing, and it will become just another model for computing. Much like other technologies that came along in the past, it will simply become part of our architectural arsenal.
The term “cloud” always seemed buzzwordy to me. Indeed, we called this model other things before “cloud” stuck, including on-demand, Web 2.0, and Internet-hosted, but I’m not sure any of those were better.
Cloud computing is about how IT resources are consumed, not about any new computing patterns. Storage, compute, databases, and the like all predate cloud computing, but they are now part of most clouds. What’s changed is not the technology patterns on offer, but the on-demand, self- and auto-provisioned consumption model, along with the ability to pay for these resources on a metered basis.
Will I miss the term “cloud” when it finally becomes just a part of IT lore? I suspect we’ll keep using it for several more years; we simply like the term too much. But when we stop talking about cloud computing, that’s typically when we’re finally getting value out of the model. That’s a good thing.